Next.js File Uploads: Direct-to-S3 Presigned URLs, Multipart, Progress Tracking, and Virus Scanning
Handle file uploads in Next.js with direct-to-S3 presigned URLs, multipart upload for large files, client-side progress tracking with XMLHttpRequest, server-side validation, and ClamAV virus scanning via Lambda.
Uploading files through your Next.js server wastes bandwidth, increases server load, and runs into serverless request-body limits (Vercel caps function payloads at 4.5 MB). The right pattern is presigned URLs: your server generates a short-lived S3 PUT URL, the client uploads directly to S3, and your server is only involved in authorization and post-upload processing.
This guide covers the complete pattern — single-file presigned PUT, multipart upload for files over the 50MB single-request cap, client-side progress tracking, and virus scanning.
S3 Bucket Configuration
# terraform/s3-uploads.tf
resource "aws_s3_bucket" "uploads" {
bucket = "${var.app_name}-uploads-${var.environment}"
tags = local.tags
}
# Uploads must NOT be public — access via presigned URLs only
resource "aws_s3_bucket_public_access_block" "uploads" {
bucket = aws_s3_bucket.uploads.id
block_public_acls = true
block_public_policy = true
ignore_public_acls = true
restrict_public_buckets = true
}
# CORS: allow PUT from your app domain
resource "aws_s3_bucket_cors_configuration" "uploads" {
bucket = aws_s3_bucket.uploads.id
cors_rule {
allowed_headers = ["Content-Type", "Content-Length", "Content-MD5"]
allowed_methods = ["PUT", "POST"]
# compact() drops the empty-string entry outside development (a raw null in the list is invalid)
allowed_origins = compact([
"https://${var.app_domain}",
var.environment == "development" ? "http://localhost:3000" : "",
])
expose_headers = ["ETag"]
max_age_seconds = 3600
}
}
# Quarantine bucket for virus scanning (files move here pre-scan)
resource "aws_s3_bucket" "quarantine" {
bucket = "${var.app_name}-quarantine-${var.environment}"
}
# Lifecycle: delete quarantine files after 7 days (failed scans)
resource "aws_s3_bucket_lifecycle_configuration" "quarantine" {
bucket = aws_s3_bucket.quarantine.id
rule {
id = "expire-quarantine"
status = "Enabled"
expiration { days = 7 }
}
}
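The scan flow itself lives outside Terraform: an S3 ObjectCreated notification on the quarantine bucket triggers a Lambda running ClamAV, which then promotes, deletes, or retains the object. The decision logic is worth keeping pure so it can be unit-tested without AWS — the sketch below is illustrative, and the `ScanResult` shape and status strings are assumptions, not ClamAV or SDK types:

```typescript
// Hypothetical disposition logic for a ClamAV scan Lambda.
// ScanResult and its status strings are illustrative assumptions.
type ScanResult = { status: "clean" | "infected" | "error"; signature?: string };

type Disposition =
  | { action: "promote"; uploadStatus: "ready" }    // copy to uploads bucket, delete from quarantine
  | { action: "delete"; uploadStatus: "rejected" }  // infected: remove immediately
  | { action: "retain"; uploadStatus: "pending" };  // scan error: leave for retry (lifecycle expires it in 7d)

function disposition(result: ScanResult): Disposition {
  switch (result.status) {
    case "clean":
      return { action: "promote", uploadStatus: "ready" };
    case "infected":
      return { action: "delete", uploadStatus: "rejected" };
    case "error":
      return { action: "retain", uploadStatus: "pending" };
  }
}
```

The Lambda would apply `action` with `CopyObjectCommand`/`DeleteObjectCommand` and write `uploadStatus` back to the upload record.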
Presigned URL API Route
// app/api/uploads/presigned/route.ts
import { NextRequest, NextResponse } from "next/server";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
import { auth } from "@/auth";
import { nanoid } from "nanoid";
import { z } from "zod";
import { prisma } from "@/lib/prisma"; // assumed location of the shared Prisma client
const s3 = new S3Client({ region: process.env.AWS_REGION });
const ALLOWED_TYPES = [
"image/jpeg", "image/png", "image/webp", "image/gif",
"application/pdf",
"text/csv",
"application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
] as const;
const MAX_FILE_SIZE = 50 * 1024 * 1024; // 50MB for presigned PUT
const RequestSchema = z.object({
filename: z.string().min(1).max(255),
contentType: z.enum(ALLOWED_TYPES),
size: z.number().positive().max(MAX_FILE_SIZE),
// Optional: scope uploads to a resource
resourceId: z.string().uuid().optional(),
});
export async function POST(req: NextRequest) {
const session = await auth();
if (!session?.user) {
return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
}
const body = await req.json();
const parsed = RequestSchema.safeParse(body);
if (!parsed.success) {
return NextResponse.json(
{ error: "Invalid request", issues: parsed.error.issues },
{ status: 400 }
);
}
const { filename, contentType, size, resourceId } = parsed.data;
// Sanitize filename: discard the client-supplied name entirely; keep only a validated extension
const ext = filename.split(".").pop()?.toLowerCase().replace(/[^a-z0-9]/g, "") ?? "";
const safeFilename = ext ? `${nanoid()}.${ext}` : nanoid();
const key = `uploads/${session.user.workspaceId}/${safeFilename}`;
// Generate presigned PUT URL — expires in 15 minutes
const command = new PutObjectCommand({
Bucket: process.env.AWS_S3_UPLOAD_BUCKET!,
Key: key,
ContentType: contentType,
ContentLength: size,
// Metadata stored on the S3 object
Metadata: {
"x-uploader-id": session.user.id,
"x-workspace-id": session.user.workspaceId,
"x-original-name": encodeURIComponent(filename),
...(resourceId && { "x-resource-id": resourceId }),
},
});
const uploadUrl = await getSignedUrl(s3, command, { expiresIn: 900 }); // 15 min
// Pre-create a database record with status=pending
const upload = await prisma.upload.create({
data: {
workspaceId: session.user.workspaceId,
uploaderId: session.user.id,
key,
filename,
contentType,
size,
status: "pending", // pending → scanning → ready | rejected
resourceId,
},
select: { id: true },
});
return NextResponse.json({ uploadUrl, key, uploadId: upload.id });
}
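One hardening worth considering for the route above: the stored extension still comes from the client filename, while only the content type is validated. Deriving the extension from the validated MIME type closes that gap. A minimal sketch — the mapping table is an assumption mirroring `ALLOWED_TYPES`, not a library:

```typescript
// Derive the stored extension from the already-validated MIME type,
// so a mismatched client extension (evil.exe renamed to .pdf) never reaches the key.
const EXT_BY_TYPE: Record<string, string> = {
  "image/jpeg": "jpg",
  "image/png": "png",
  "image/webp": "webp",
  "image/gif": "gif",
  "application/pdf": "pdf",
  "text/csv": "csv",
  "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet": "xlsx",
};

function extensionFor(contentType: string): string {
  const ext = EXT_BY_TYPE[contentType];
  if (!ext) throw new Error(`Unsupported content type: ${contentType}`);
  return ext;
}
```

In the route this replaces the `filename.split(".").pop()` derivation: `const safeFilename = `${nanoid()}.${extensionFor(contentType)}`;`.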
🌐 Looking for a Dev Team That Actually Delivers?
Most agencies sell you a project manager and assign juniors. Viprasol is different — senior engineers only, direct Slack access, and a 5.0★ Upwork record across 100+ projects.
- React, Next.js, Node.js, TypeScript — production-grade stack
- Fixed-price contracts — no surprise invoices
- Full source code ownership from day one
- 90-day post-launch support included
Client: Upload with Progress
// hooks/use-file-upload.ts
"use client";
import { useState, useCallback } from "react";
interface UploadState {
progress: number; // 0–100
status: "idle" | "getting-url" | "uploading" | "processing" | "done" | "error";
error?: string;
uploadId?: string;
key?: string;
}
export function useFileUpload() {
const [state, setState] = useState<UploadState>({ progress: 0, status: "idle" });
const upload = useCallback(async (file: File): Promise<string | null> => {
setState({ progress: 0, status: "getting-url" });
try {
// 1. Get presigned URL
const res = await fetch("/api/uploads/presigned", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
filename: file.name,
contentType: file.type,
size: file.size,
}),
});
if (!res.ok) {
const err = await res.json();
throw new Error(err.error ?? "Failed to get upload URL");
}
const { uploadUrl, key, uploadId } = await res.json();
// 2. Upload directly to S3 with XHR (fetch doesn't support upload progress)
setState({ progress: 0, status: "uploading", uploadId, key });
await uploadWithProgress(uploadUrl, file, (progress) => {
setState((s) => ({ ...s, progress }));
});
// 3. Confirm upload to backend
setState((s) => ({ ...s, progress: 100, status: "processing" }));
await fetch("/api/uploads/confirm", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ uploadId, key }),
});
setState({ progress: 100, status: "done", uploadId, key });
return key;
} catch (err) {
setState({ progress: 0, status: "error", error: (err as Error).message });
return null;
}
}, []);
const reset = useCallback(() => {
setState({ progress: 0, status: "idle" });
}, []);
return { ...state, upload, reset };
}
// XHR is required for upload progress — fetch() has no upload progress API
function uploadWithProgress(
url: string,
file: File,
onProgress: (progress: number) => void
): Promise<void> {
return new Promise((resolve, reject) => {
const xhr = new XMLHttpRequest();
xhr.upload.addEventListener("progress", (e) => {
if (e.lengthComputable) {
onProgress(Math.round((e.loaded / e.total) * 100));
}
});
xhr.addEventListener("load", () => {
if (xhr.status >= 200 && xhr.status < 300) resolve();
else reject(new Error(`Upload failed: ${xhr.status}`));
});
xhr.addEventListener("error", () => reject(new Error("Network error")));
xhr.addEventListener("abort", () => reject(new Error("Upload aborted")));
xhr.open("PUT", url);
xhr.setRequestHeader("Content-Type", file.type);
xhr.send(file);
});
}
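Direct-to-S3 PUTs can fail transiently (network blips, S3 503 SlowDown), so it is common to wrap `uploadWithProgress` in a small retry with exponential backoff. A hedged sketch — the attempt count, base delay, and cap are arbitrary choices, not requirements:

```typescript
// Backoff schedule: 1s, 2s, 4s, ... capped at maxDelayMs.
function backoffDelays(attempts: number, baseMs = 1000, maxDelayMs = 8000): number[] {
  return Array.from({ length: attempts }, (_, i) => Math.min(baseMs * 2 ** i, maxDelayMs));
}

// Generic retry wrapper; pass () => uploadWithProgress(url, file, onProgress).
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  const delays = backoffDelays(attempts);
  let lastErr: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (i < attempts - 1) await new Promise((r) => setTimeout(r, delays[i]));
    }
  }
  throw lastErr;
}
```

One caveat: if a retry starts after the 15-minute presigned URL expires, request a fresh URL rather than retrying the stale one.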
Upload Dropzone Component
// components/upload/dropzone.tsx
"use client";
import { useCallback, useRef } from "react";
import { useFileUpload } from "@/hooks/use-file-upload";
import { Upload, CheckCircle, AlertCircle, Loader2 } from "lucide-react";
interface DropzoneProps {
accept?: string; // MIME types: "image/*,application/pdf"
maxSizeMB?: number;
onUpload: (key: string) => void;
}
export function Dropzone({ accept = "*/*", maxSizeMB = 50, onUpload }: DropzoneProps) {
const { progress, status, error, upload, reset } = useFileUpload();
const inputRef = useRef<HTMLInputElement>(null);
const handleFile = useCallback(
async (file: File) => {
if (file.size > maxSizeMB * 1024 * 1024) {
alert(`File must be under ${maxSizeMB}MB`);
return;
}
const key = await upload(file);
if (key) onUpload(key);
},
[upload, maxSizeMB, onUpload]
);
function handleDrop(e: React.DragEvent) {
e.preventDefault();
const file = e.dataTransfer.files[0];
if (file) handleFile(file);
}
return (
<div
onDrop={handleDrop}
onDragOver={(e) => e.preventDefault()}
onClick={() => inputRef.current?.click()}
className={`
border-2 border-dashed rounded-xl p-8 text-center cursor-pointer transition-colors
${status === "uploading" ? "border-blue-300 bg-blue-50" :
status === "done" ? "border-green-300 bg-green-50" :
status === "error" ? "border-red-300 bg-red-50" :
"border-gray-200 hover:border-gray-300 bg-gray-50"}
`}
>
<input
ref={inputRef}
type="file"
accept={accept}
className="sr-only"
onChange={(e) => {
const file = e.target.files?.[0];
if (file) handleFile(file);
}}
/>
{status === "idle" && (
<>
<Upload className="w-8 h-8 text-gray-400 mx-auto mb-3" />
<p className="text-sm font-medium text-gray-700">Drop file here or click to browse</p>
<p className="text-xs text-gray-400 mt-1">Max {maxSizeMB}MB</p>
</>
)}
{(status === "getting-url" || status === "uploading" || status === "processing") && (
<>
<Loader2 className="w-8 h-8 text-blue-500 mx-auto mb-3 animate-spin" />
<p className="text-sm font-medium text-blue-700">
{status === "getting-url" ? "Preparing upload…" :
status === "processing" ? "Processing…" :
`Uploading… ${progress}%`}
</p>
{status === "uploading" && (
<div className="w-full bg-blue-100 rounded-full h-1.5 mt-3">
<div
className="h-1.5 bg-blue-500 rounded-full transition-all"
style={{ width: `${progress}%` }}
/>
</div>
)}
</>
)}
{status === "done" && (
<>
<CheckCircle className="w-8 h-8 text-green-500 mx-auto mb-3" />
<p className="text-sm font-medium text-green-700">Upload complete</p>
<button
onClick={(e) => { e.stopPropagation(); reset(); }}
className="text-xs text-green-600 underline mt-1"
>
Upload another
</button>
</>
)}
{status === "error" && (
<>
<AlertCircle className="w-8 h-8 text-red-500 mx-auto mb-3" />
<p className="text-sm font-medium text-red-700">{error}</p>
<button
onClick={(e) => { e.stopPropagation(); reset(); }}
className="text-xs text-red-600 underline mt-1"
>
Try again
</button>
</>
)}
</div>
);
}
🚀 Senior Engineers. No Junior Handoffs. Ever.
You get the senior developer, not a project manager who relays your requirements to someone you never meet. Every Viprasol project has a senior lead from kickoff to launch.
- MVPs in 4–8 weeks, full platforms in 3–5 months
- Lighthouse 90+ performance scores standard
- Works across US, UK, AU timezones
- Free 30-min architecture review, no commitment
Multipart Upload for Large Files (over the 50MB presigned-PUT cap)
// lib/uploads/multipart.ts
import {
S3Client,
CreateMultipartUploadCommand,
UploadPartCommand,
CompleteMultipartUploadCommand,
AbortMultipartUploadCommand,
} from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
const s3 = new S3Client({ region: process.env.AWS_REGION });
// Returns presigned URLs for each part
export async function createMultipartUpload(
key: string,
contentType: string,
totalParts: number, // Math.ceil(fileSize / partSize); every part except the last must be ≥5 MiB
): Promise<{ uploadId: string; partUrls: string[] }> {
// Initiate multipart upload
const { UploadId } = await s3.send(
new CreateMultipartUploadCommand({
Bucket: process.env.AWS_S3_UPLOAD_BUCKET!,
Key: key,
ContentType: contentType,
})
);
if (!UploadId) throw new Error("Failed to create multipart upload");
// Generate presigned URL for each part (max 10,000 parts)
const partUrls = await Promise.all(
Array.from({ length: totalParts }, (_, i) =>
getSignedUrl(
s3,
new UploadPartCommand({
Bucket: process.env.AWS_S3_UPLOAD_BUCKET!,
Key: key,
UploadId,
PartNumber: i + 1, // Part numbers start at 1
}),
{ expiresIn: 3600 } // 1 hour per part URL
)
)
);
return { uploadId: UploadId, partUrls };
}
export async function completeMultipartUpload(
key: string,
uploadId: string,
parts: Array<{ PartNumber: number; ETag: string }>
): Promise<void> {
await s3.send(
new CompleteMultipartUploadCommand({
Bucket: process.env.AWS_S3_UPLOAD_BUCKET!,
Key: key,
UploadId: uploadId,
MultipartUpload: { Parts: parts },
})
);
}
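The server half above signs one URL per part; the client half must slice the File into matching byte ranges. The boundary arithmetic is easy to get wrong, so isolate it in a pure helper. A sketch assuming an 8 MiB part size (any size ≥5 MiB is valid for every part except the last):

```typescript
const PART_SIZE = 8 * 1024 * 1024; // 8 MiB; S3 requires ≥5 MiB per part except the last

interface PartRange {
  partNumber: number; // 1-based, matching UploadPartCommand
  start: number;      // inclusive byte offset for File.slice()
  end: number;        // exclusive byte offset
}

function planParts(fileSize: number, partSize = PART_SIZE): PartRange[] {
  const count = Math.ceil(fileSize / partSize);
  return Array.from({ length: count }, (_, i) => ({
    partNumber: i + 1,
    start: i * partSize,
    end: Math.min((i + 1) * partSize, fileSize),
  }));
}

// Usage sketch: file.slice(range.start, range.end) → PUT to partUrls[range.partNumber - 1],
// collecting the ETag response header from each part for completeMultipartUpload().
```

The same `partSize` must be used when computing `totalParts` for `createMultipartUpload`, or the signed URLs and slices will disagree.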
Cost and Timeline Estimates
| Scope | Team | Timeline | Cost Range |
|---|---|---|---|
| Presigned PUT + basic client upload | 1 dev | 1–2 days | $400–800 |
| + Progress tracking + Dropzone UI | 1 dev | 1–2 days | $400–800 |
| + Multipart upload for large files | 1 dev | 2–3 days | $600–1,200 |
| + Virus scanning (Lambda + ClamAV) | 1–2 devs | 2–3 days | $800–1,500 |
See Also
- SaaS CSV Export Pipeline
- AWS S3 SNS Patterns
- React Drag and Drop File Upload
- Next.js Environment Variables
- AWS IAM Least Privilege
Working With Viprasol
Direct-to-S3 file uploads require care at every layer: presigned URL expiry, content-type validation server-side (client headers can be spoofed), S3 CORS for your exact domain, metadata on the object for post-processing, and XHR for progress tracking. Our team builds the complete upload pipeline — presigned URL generation, database pre-record with status=pending, XHR upload with progress, confirm endpoint, and multipart for files over 50MB.
What we deliver:
- S3 Terraform: block public access, CORS for PUT/POST, quarantine bucket with 7d lifecycle
- Presigned URL route: content-type enum validation, size limit, safe filename with nanoid, metadata headers
- `useFileUpload` hook: state machine idle → getting-url → uploading → processing → done/error
- `uploadWithProgress`: XHR with `xhr.upload.addEventListener("progress")`
- Dropzone: drag-and-drop, file input click, progress bar, done/error states
- `createMultipartUpload` + `completeMultipartUpload` for files >100MB
Talk to our team about your file upload architecture →
Or explore our cloud infrastructure services.
About the Author
Viprasol Tech Team
Custom Software Development Specialists
The Viprasol Tech team specialises in algorithmic trading software, AI agent systems, and SaaS development. With 100+ projects delivered across MT4/MT5 EAs, fintech platforms, and production AI systems, the team brings deep technical experience to every engagement. Based in India, serving clients globally.
Need a Modern Web Application?
From landing pages to complex SaaS platforms — we build it all with Next.js and React.
Free consultation • No commitment • Response within 24 hours