
GDPR Data Export for SaaS: DSAR Fulfillment, Export Pipeline, and Right of Access

Implement GDPR right-of-access data export for SaaS: DSAR request handling, automated export pipeline, data classification, ZIP packaging, and audit trail with PostgreSQL and Node.js.

Viprasol Tech Team
December 26, 2026
13 min read


Under GDPR Article 15, any data subject can request a copy of all personal data your SaaS holds about them. Article 12(3) gives you one month to respond (extendable by two further months for complex cases); most teams operationalize this as a 30-day deadline. Article 20 adds the right to receive that data in a "structured, commonly used and machine-readable format." For most B2B SaaS products, this means a downloadable ZIP containing JSON or CSV exports of everything your database holds about a user.

Getting this wrong is expensive: €20M or 4% of global annual turnover, whichever is higher. Getting it right is an engineering problem that most teams solve manually, which doesn't scale.

This post covers the complete automated DSAR pipeline: request intake, data collection across services, assembly, secure delivery, and the audit trail that proves compliance.


What "All Personal Data" Means in Practice

Before writing code, catalog what you collect:

| Category | Examples | Tables |
| --- | --- | --- |
| Identity | Name, email, phone | users |
| Account | Plan, billing address, payment methods | subscriptions, billing_addresses |
| Activity | Login history, page views, feature usage | sessions, audit_logs, analytics_events |
| Content | Posts, files, comments | posts, uploads, comments |
| Communications | Emails sent, support tickets | email_logs, support_tickets |
| Financial | Invoices, payments | invoices, payments |
| Preferences | Settings, notification prefs | user_settings |
| Derived | Usage scores, trial status | user_health_scores |

The data export must cover everything, not just the obvious tables.
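
This catalog is worth keeping in code rather than in a wiki, so the export pipeline and the documentation can never drift apart. A minimal sketch of such a registry (the section and table names mirror the catalog above; the registry shape itself is our own convention, not something GDPR prescribes):

```typescript
// A single source of truth for "what personal data do we hold, and where".
// The export job and the README can both be generated from this list.
type DataCategory = {
  section: string;      // file name stem in the export ZIP, e.g. "profile.json"
  description: string;  // human-readable summary for the README
  tables: string[];     // backing tables, for the data-inventory audit
};

export const DATA_INVENTORY: DataCategory[] = [
  { section: "profile", description: "Identity and contact information", tables: ["users"] },
  { section: "account", description: "Plan, billing address, payment methods", tables: ["subscriptions", "billing_addresses"] },
  { section: "activity", description: "Login history, page views, feature usage", tables: ["sessions", "audit_logs", "analytics_events"] },
  { section: "content", description: "Posts, files, comments", tables: ["posts", "uploads", "comments"] },
  { section: "communications", description: "Emails sent, support tickets", tables: ["email_logs", "support_tickets"] },
  { section: "billing", description: "Invoices and payments", tables: ["invoices", "payments"] },
  { section: "settings", description: "Notification and account preferences", tables: ["user_settings"] },
  { section: "derived", description: "Usage scores, trial status", tables: ["user_health_scores"] },
];

// Flat list of every table the DSAR export covers; compare this against
// your schema in CI so a new personal-data table can't be added silently.
export function coveredTables(): string[] {
  return DATA_INVENTORY.flatMap((c) => c.tables);
}
```

A CI check that diffs `coveredTables()` against `information_schema.tables` (minus an explicit allow-list of non-personal tables) turns "did we forget a table?" from a legal review question into a failing test.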


Database Schema

-- migrations/20260101_dsar_requests.sql

CREATE TYPE dsar_status AS ENUM (
  'pending',          -- Received, not yet started
  'processing',       -- Export job running
  'ready',            -- ZIP ready for download
  'delivered',        -- User downloaded it
  'expired',          -- Download link expired
  'failed',           -- Export job failed
  'cancelled'         -- Withdrawn by user
);

CREATE TABLE dsar_requests (
  id                  UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  user_id             UUID NOT NULL REFERENCES users(id),
  team_id             UUID REFERENCES teams(id),    -- For DSAR on behalf of a team
  
  -- Request details
  request_type        TEXT NOT NULL DEFAULT 'access',  -- access | portability | erasure
  requested_by        TEXT NOT NULL,    -- email of requester (may differ from user)
  verified            BOOLEAN NOT NULL DEFAULT FALSE,
  verified_at         TIMESTAMPTZ,
  verification_token  TEXT UNIQUE,
  
  -- Processing
  status              dsar_status NOT NULL DEFAULT 'pending',
  started_at          TIMESTAMPTZ,
  completed_at        TIMESTAMPTZ,
  failed_at           TIMESTAMPTZ,
  failure_reason      TEXT,
  
  -- Output
  export_file_key     TEXT,             -- S3 key
  export_file_size    BIGINT,           -- bytes
  download_token      TEXT UNIQUE,      -- secure one-time token
  download_expires_at TIMESTAMPTZ,
  download_count      INTEGER NOT NULL DEFAULT 0,
  
  -- Audit
  -- Note: PostgreSQL rejects a GENERATED column here, because timestamptz +
  -- interval and the ::DATE cast are timezone-dependent (not immutable).
  -- A volatile DEFAULT evaluated at insert time gives the same result.
  due_date            DATE NOT NULL DEFAULT ((now() + INTERVAL '30 days')::DATE),
  notes               TEXT,
  metadata            JSONB NOT NULL DEFAULT '{}'::jsonb,
  created_at          TIMESTAMPTZ NOT NULL DEFAULT now(),
  updated_at          TIMESTAMPTZ NOT NULL DEFAULT now()
);

CREATE INDEX idx_dsar_user_id ON dsar_requests(user_id);
CREATE INDEX idx_dsar_status ON dsar_requests(status);
CREATE INDEX idx_dsar_due_date ON dsar_requests(due_date) WHERE status = 'pending';
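
The `dsar_status` enum implies a state machine, but nothing in the schema enforces legal transitions. A small guard in application code (our own sketch; the transition table is an interpretation of the enum comments above, not part of the migration) catches bugs like re-marking a failed export as delivered:

```typescript
// Legal transitions between dsar_status values. "failed" may go back to
// "processing" so the job queue can retry; terminal states have no exits.
const TRANSITIONS: Record<string, string[]> = {
  pending: ["processing", "cancelled"],
  processing: ["ready", "failed", "cancelled"],
  ready: ["delivered", "expired"],
  delivered: ["expired"],
  failed: ["processing"],
  expired: [],
  cancelled: [],
};

export function canTransition(from: string, to: string): boolean {
  return TRANSITIONS[from]?.includes(to) ?? false;
}
```

Call `canTransition(current, next)` before every `UPDATE dsar_requests SET status = ...` and throw if it returns false; the audit trail stays trustworthy only if statuses can't jump arbitrarily.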


Request Intake API

// app/api/privacy/dsar/route.ts
import { NextRequest, NextResponse } from "next/server";
import { z } from "zod";
import { db } from "@/lib/db";
import { getCurrentUser } from "@/lib/auth";
import { sendDSARVerificationEmail } from "@/lib/email/privacy";
import crypto from "crypto";

const DSARRequestSchema = z.object({
  requestType: z.enum(["access", "portability"]).default("access"),
  notes: z.string().max(1000).optional(),
});

export async function POST(req: NextRequest) {
  const user = await getCurrentUser();
  if (!user) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const body = await req.json();
  const parsed = DSARRequestSchema.safeParse(body);
  if (!parsed.success) {
    return NextResponse.json({ error: "Invalid request" }, { status: 400 });
  }

  // Rate limit: max 2 DSAR requests per 90 days
  const recentRequests = await db
    .selectFrom("dsar_requests")
    .select(db.fn.count("id").as("count"))
    .where("user_id", "=", user.id)
    .where("created_at", ">", new Date(Date.now() - 90 * 24 * 60 * 60 * 1000))
    .where("status", "!=", "cancelled")
    .executeTakeFirst();

  if (Number(recentRequests?.count ?? 0) >= 2) {
    return NextResponse.json(
      { error: "You can submit at most 2 data requests per 90 days." },
      { status: 429 }
    );
  }

  const verificationToken = crypto.randomBytes(32).toString("hex");

  const request = await db
    .insertInto("dsar_requests")
    .values({
      user_id: user.id,
      request_type: parsed.data.requestType,
      requested_by: user.email,
      verification_token: verificationToken,
      notes: parsed.data.notes,
    })
    .returning(["id", "due_date"])
    .executeTakeFirstOrThrow();

  // Send verification email (required β€” must confirm intent)
  await sendDSARVerificationEmail({
    to: user.email,
    name: user.name,
    requestId: request.id,
    verificationToken,
    requestType: parsed.data.requestType,
    dueDate: request.due_date,
  });

  return NextResponse.json({
    id: request.id,
    message: "Request received. Check your email to verify.",
    dueDate: request.due_date,
  });
}
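
The 90-day rate limit in the handler above is worth isolating as a pure predicate, both for testing and because GDPR only lets you refuse "manifestly unfounded or excessive" requests (Article 12(5)), so the limit should stay generous and documented. A sketch (function name and the default of 2 are ours, matching the endpoint above):

```typescript
// Returns true if the user may submit another DSAR request now.
// Mirrors the endpoint's query: count non-cancelled requests in the
// trailing 90-day window and allow up to maxPer90Days of them.
export function withinRateLimit(
  priorRequestDates: Date[],
  now: Date,
  maxPer90Days = 2
): boolean {
  const windowStart = now.getTime() - 90 * 24 * 60 * 60 * 1000;
  const recent = priorRequestDates.filter((d) => d.getTime() > windowStart).length;
  return recent < maxPer90Days;
}
```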

Verification endpoint:

// app/api/privacy/dsar/verify/route.ts
import { NextRequest, NextResponse } from "next/server";
import { db } from "@/lib/db";
import { enqueueJob } from "@/lib/jobs"; // adapter for your queue of choice

const appUrl = process.env.NEXT_PUBLIC_APP_URL!;

export async function GET(req: NextRequest) {
  const token = req.nextUrl.searchParams.get("token");
  if (!token) return NextResponse.redirect(`${appUrl}/privacy/dsar/invalid`);

  const request = await db
    .selectFrom("dsar_requests")
    .selectAll()
    .where("verification_token", "=", token)
    .where("verified", "=", false)
    .executeTakeFirst();

  if (!request) {
    return NextResponse.redirect(`${appUrl}/privacy/dsar/invalid`);
  }

  // Mark verified and enqueue export job
  await db.transaction().execute(async (trx) => {
    await trx
      .updateTable("dsar_requests")
      .set({
        verified: true,
        verified_at: new Date(),
        status: "processing",
        started_at: new Date(),
        verification_token: null, // Single-use token
      })
      .where("id", "=", request.id)
      .execute();

    // Enqueue background job (your queue of choice)
    await enqueueJob("dsar-export", {
      requestId: request.id,
      userId: request.user_id,
    });
  });

  return NextResponse.redirect(`${appUrl}/privacy/dsar/${request.id}/processing`);
}
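
The `enqueueJob` call above is deliberately abstract: in production you want a durable queue (BullMQ, pg-boss, SQS) so a crashed worker retries rather than silently dropping a DSAR. Purely to show the contract the verify endpoint relies on, here is a minimal in-process sketch (all names here are our own; this is not a real queue):

```typescript
type JobHandler = (payload: Record<string, unknown>) => Promise<void>;

const handlers = new Map<string, JobHandler>();
const queue: Array<{ name: string; payload: Record<string, unknown> }> = [];

// Workers register a handler per job name at startup.
export function registerJob(name: string, handler: JobHandler): void {
  handlers.set(name, handler);
}

// Producers (like the verify endpoint) only ever push a name + payload.
export async function enqueueJob(
  name: string,
  payload: Record<string, unknown>
): Promise<void> {
  queue.push({ name, payload });
}

// Process jobs in FIFO order. A real queue would catch failures and
// retry with backoff instead of letting the error propagate.
export async function drainQueue(): Promise<void> {
  while (queue.length > 0) {
    const job = queue.shift()!;
    const handler = handlers.get(job.name);
    if (!handler) throw new Error(`No handler registered for ${job.name}`);
    await handler(job.payload);
  }
}
```

Whatever queue you choose, make the DSAR export job idempotent on `requestId`: retries after a partial failure should overwrite, not duplicate.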

Export Pipeline: Data Collection

// jobs/dsar-export.ts
import crypto from "crypto";
import { db } from "@/lib/db";
import { streamZipToS3 } from "@/lib/storage/zip-to-s3";
import {
  collectUserData,
  collectAccountData,
  collectActivityData,
  collectContentData,
  collectBillingData,
  collectCommunicationsData,
  collectSettingsData,
} from "./dsar-collectors";
import { sendDSARReadyEmail } from "@/lib/email/privacy";

export async function processDSARExport(requestId: string, userId: string) {
  console.log(`Processing DSAR export for user ${userId}, request ${requestId}`);

  try {
    // Collect data from all sources in parallel
    const [
      userData,
      accountData,
      activityData,
      contentData,
      billingData,
      communicationsData,
      settingsData,
    ] = await Promise.all([
      collectUserData(userId),
      collectAccountData(userId),
      collectActivityData(userId),
      collectContentData(userId),
      collectBillingData(userId),
      collectCommunicationsData(userId),
      collectSettingsData(userId),
    ]);

    // Build manifest
    const manifest = {
      exportedAt: new Date().toISOString(),
      userId,
      requestId,
      dataSubjectEmail: userData.profile.email,
      sections: [
        "profile",
        "account",
        "activity",
        "content",
        "billing",
        "communications",
        "settings",
      ],
      note: "This export contains all personal data held about you as of the export date. Data is provided in JSON format.",
    };

    // Stream ZIP to S3
    const fileKey = `dsar/${requestId}/export-${Date.now()}.zip`;
    const fileSize = await streamZipToS3({
      key: fileKey,
      files: [
        { name: "README.txt", content: buildReadme(manifest) },
        { name: "manifest.json", content: JSON.stringify(manifest, null, 2) },
        { name: "profile.json", content: JSON.stringify(userData, null, 2) },
        { name: "account.json", content: JSON.stringify(accountData, null, 2) },
        { name: "activity.json", content: JSON.stringify(activityData, null, 2) },
        { name: "content.json", content: JSON.stringify(contentData, null, 2) },
        { name: "billing.json", content: JSON.stringify(billingData, null, 2) },
        { name: "communications.json", content: JSON.stringify(communicationsData, null, 2) },
        { name: "settings.json", content: JSON.stringify(settingsData, null, 2) },
      ],
    });

    // Generate secure download token
    const downloadToken = crypto.randomBytes(32).toString("hex");
    const downloadExpiry = new Date(Date.now() + 7 * 24 * 60 * 60 * 1000); // 7 days

    await db
      .updateTable("dsar_requests")
      .set({
        status: "ready",
        completed_at: new Date(),
        export_file_key: fileKey,
        export_file_size: fileSize,
        download_token: downloadToken,
        download_expires_at: downloadExpiry,
        updated_at: new Date(),
      })
      .where("id", "=", requestId)
      .execute();

    // Notify user
    await sendDSARReadyEmail({ requestId, userId, downloadExpiry });

    console.log(`DSAR export complete for request ${requestId}, size: ${fileSize} bytes`);
  } catch (error) {
    console.error(`DSAR export failed for request ${requestId}:`, error);

    await db
      .updateTable("dsar_requests")
      .set({
        status: "failed",
        failed_at: new Date(),
        failure_reason: error instanceof Error ? error.message : "Unknown error",
        updated_at: new Date(),
      })
      .where("id", "=", requestId)
      .execute();

    throw error; // Rethrow for job queue retry
  }
}


Data Collectors

// jobs/dsar-collectors.ts

export async function collectUserData(userId: string) {
  const profile = await db
    .selectFrom("users")
    .select([
      "id",
      "email",
      "name",
      "phone",
      "avatar_url",
      "email_verified_at",
      "created_at",
      "updated_at",
      "last_sign_in_at",
    ])
    .where("id", "=", userId)
    .executeTakeFirst();

  const teamMemberships = await db
    .selectFrom("team_members")
    .innerJoin("teams", "teams.id", "team_members.team_id")
    .select(["teams.name as teamName", "team_members.role", "team_members.created_at as joinedAt"])
    .where("team_members.user_id", "=", userId)
    .execute();

  return { profile, teamMemberships };
}

export async function collectActivityData(userId: string) {
  // Login sessions (last 90 days)
  const sessions = await db
    .selectFrom("user_sessions")
    .select(["id", "created_at", "ip_address", "user_agent", "expired_at"])
    .where("user_id", "=", userId)
    .where("created_at", ">", new Date(Date.now() - 90 * 24 * 60 * 60 * 1000))
    .orderBy("created_at", "desc")
    .limit(1000)
    .execute();

  // Audit log (all time)
  const auditEvents = await db
    .selectFrom("audit_logs")
    .select(["id", "action", "resource_type", "resource_id", "created_at", "ip_address", "metadata"])
    .where("user_id", "=", userId)
    .orderBy("created_at", "desc")
    .limit(5000)  // Cap at 5K to prevent giant exports
    .execute();

  // Feature usage analytics
  const analytics = await db
    .selectFrom("analytics_events")
    .select(["event_name", db.fn.count("id").as("count"), db.fn.max("created_at").as("lastSeen")])
    .where("user_id", "=", userId)
    .groupBy("event_name")
    .execute();

  return { sessions, auditEvents, analytics };
}

export async function collectBillingData(userId: string) {
  const subscriptions = await db
    .selectFrom("subscriptions")
    .select([
      "id", "plan", "status", "current_period_start",
      "current_period_end", "created_at", "cancelled_at",
    ])
    .where("user_id", "=", userId)
    .execute();

  const invoices = await db
    .selectFrom("invoices")
    .select([
      "id", "number", "amount_cents", "currency", "status",
      "issued_at", "paid_at", "metadata",
    ])
    .where("user_id", "=", userId)
    .orderBy("issued_at", "desc")
    .execute();

  // Omit full card numbers (PCI-DSS) β€” last 4 digits only
  const paymentMethods = await db
    .selectFrom("payment_methods")
    .select(["id", "type", "last_four", "brand", "exp_month", "exp_year", "created_at"])
    .where("user_id", "=", userId)
    .execute();

  return { subscriptions, invoices, paymentMethods };
}

export async function collectCommunicationsData(userId: string) {
  // Emails sent to the user
  const emailsSent = await db
    .selectFrom("email_logs")
    .select(["id", "template", "subject", "sent_at", "opened_at", "clicked_at"])
    .where("recipient_user_id", "=", userId)
    .orderBy("sent_at", "desc")
    .limit(500)
    .execute();

  // Support tickets
  const tickets = await db
    .selectFrom("support_tickets")
    .leftJoin("support_messages", "support_messages.ticket_id", "support_tickets.id")
    .select([
      "support_tickets.id",
      "support_tickets.subject",
      "support_tickets.status",
      "support_tickets.created_at",
      "support_messages.body",
      "support_messages.author_type",
      "support_messages.created_at as message_at",
    ])
    .where("support_tickets.user_id", "=", userId)
    .orderBy("support_tickets.created_at", "desc")
    .execute();

  return { emailsSent, tickets };
}
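
One collector subtlety: Article 15(4) says the right of access "shall not adversely affect the rights and freedoms of others," so the export must not disclose other people's personal data. Support threads are the usual offender: agent names and internal staff notes. A sketch filter (field names are illustrative and assume a shape like the `support_messages` join above):

```typescript
type SupportMessage = {
  body: string;
  author_type: "user" | "agent" | "internal";
  author_name?: string;
};

// Internal staff notes are omitted entirely; agent replies were sent to the
// data subject and belong in the export, but the agent's identity does not.
export function redactThirdParty(messages: SupportMessage[]): SupportMessage[] {
  return messages
    .filter((m) => m.author_type !== "internal")
    .map((m) =>
      m.author_type === "agent" ? { ...m, author_name: "[support agent]" } : m
    );
}
```

The same reasoning applies to comments, mentions, and shared documents: export the subject's side of the interaction, not the other party's identity or content.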

Streaming ZIP to S3

// lib/storage/zip-to-s3.ts
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";
import archiver from "archiver";
import { PassThrough } from "stream";

const s3 = new S3Client({ region: process.env.AWS_REGION });

interface FileEntry {
  name: string;
  content: string;
}

export async function streamZipToS3({
  key,
  files,
}: {
  key: string;
  files: FileEntry[];
}): Promise<number> {
  const passThrough = new PassThrough();
  const archive = archiver("zip", { zlib: { level: 6 } });

  let totalBytes = 0;
  passThrough.on("data", (chunk: Buffer) => {
    totalBytes += chunk.length;
  });

  archive.on("error", (err) => passThrough.destroy(err));
  archive.pipe(passThrough);

  // Start the upload BEFORE finalizing the archive. PutObjectCommand needs a
  // known Content-Length and nothing would be consuming the PassThrough, so a
  // plain PutObject call either fails or deadlocks on backpressure for large
  // exports. Upload from @aws-sdk/lib-storage streams the body in multipart
  // chunks while the archive is still being written.
  const upload = new Upload({
    client: s3,
    params: {
      Bucket: process.env.DSAR_EXPORT_BUCKET!,
      Key: key,
      Body: passThrough,
      ContentType: "application/zip",
      // Server-side encryption
      ServerSideEncryption: "aws:kms",
      // Tag for lifecycle policy (auto-delete after 30 days)
      Tagging: "export-type=dsar&auto-delete=true",
    },
  });

  for (const file of files) {
    archive.append(file.content, { name: file.name });
  }

  // finalize() resolves once every entry has been written to the stream
  await archive.finalize();
  await upload.done();

  return totalBytes;
}

Secure Download Endpoint

// app/api/privacy/dsar/[requestId]/download/route.ts
import { NextRequest, NextResponse } from "next/server";
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
import { sql } from "kysely";
import { db } from "@/lib/db";

const s3 = new S3Client({ region: process.env.AWS_REGION });

export async function GET(
  req: NextRequest,
  { params }: { params: Promise<{ requestId: string }> }
) {
  const { requestId } = await params;
  const token = req.nextUrl.searchParams.get("token");

  if (!token) return NextResponse.json({ error: "Missing token" }, { status: 400 });

  const request = await db
    .selectFrom("dsar_requests")
    .select([
      "id", "status", "export_file_key", "download_token",
      "download_expires_at", "download_count", "user_id",
    ])
    .where("id", "=", requestId)
    .executeTakeFirst();

  if (!request || request.download_token !== token) {
    return NextResponse.json({ error: "Invalid or expired link" }, { status: 404 });
  }

  if (request.status !== "ready") {
    return NextResponse.json({ error: "Export not ready" }, { status: 409 });
  }

  if (new Date() > request.download_expires_at!) {
    await db.updateTable("dsar_requests")
      .set({ status: "expired" })
      .where("id", "=", requestId)
      .execute();
    return NextResponse.json({ error: "Download link expired" }, { status: 410 });
  }

  // Generate pre-signed S3 URL (valid 1 hour)
  const command = new GetObjectCommand({
    Bucket: process.env.DSAR_EXPORT_BUCKET!,
    Key: request.export_file_key!,
    ResponseContentDisposition: `attachment; filename="data-export-${requestId}.zip"`,
  });

  const presignedUrl = await getSignedUrl(s3, command, { expiresIn: 3600 });

  // Track download count (raw SQL expression for the column increment)
  await db
    .updateTable("dsar_requests")
    .set({
      download_count: sql`download_count + 1`,
      status: "delivered",
      updated_at: new Date(),
    })
    .where("id", "=", requestId)
    .execute();

  // Redirect to pre-signed URL
  return NextResponse.redirect(presignedUrl);
}
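
The endpoint compares the supplied token with `!==`, which leaks timing information. With a 256-bit random token the practical risk is negligible, but a constant-time comparison costs one helper (a sketch; the function name is ours):

```typescript
import crypto from "crypto";

// Constant-time string comparison for download tokens.
// timingSafeEqual throws on length mismatch, so check lengths first.
export function tokensMatch(expected: string | null, supplied: string): boolean {
  if (!expected) return false;
  const a = Buffer.from(expected);
  const b = Buffer.from(supplied);
  if (a.length !== b.length) return false;
  return crypto.timingSafeEqual(a, b);
}
```

Swap `request.download_token !== token` for `!tokensMatch(request.download_token, token)` if you want the stricter behavior.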

Due Date Monitoring (Cron)

// app/api/cron/dsar-due-date-monitor/route.ts
export async function GET(req: NextRequest) {
  // Alert on requests approaching 30-day deadline
  const urgentRequests = await db
    .selectFrom("dsar_requests as d")
    .innerJoin("users as u", "u.id", "d.user_id")
    .select(["d.id", "d.due_date", "d.status", "u.email"])
    .where("d.status", "in", ["pending", "processing"])
    .where("d.due_date", "<=", new Date(Date.now() + 5 * 24 * 60 * 60 * 1000)) // 5 days
    .execute();

  if (urgentRequests.length > 0) {
    await notifyComplianceTeam({
      subject: `⚠️ ${urgentRequests.length} DSAR requests approaching deadline`,
      body: urgentRequests.map((r) => 
        `Request ${r.id}: due ${r.due_date}, status: ${r.status}`
      ).join("\n"),
    });
  }

  return NextResponse.json({ checked: urgentRequests.length });
}
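
The deadline triage in the cron handler is easier to reason about (and test) as a pure function. A sketch with our own tier names; the 5-day threshold matches the query above:

```typescript
// Classify a DSAR request by how close it is to its statutory due date.
export function dsarUrgency(
  dueDate: Date,
  now: Date = new Date()
): "overdue" | "urgent" | "ok" {
  const daysLeft = (dueDate.getTime() - now.getTime()) / (24 * 60 * 60 * 1000);
  if (daysLeft < 0) return "overdue";
  if (daysLeft <= 5) return "urgent";
  return "ok";
}
```

"overdue" should page someone, not just email: a blown Article 12 deadline is itself reportable conduct, independent of whether the export eventually ships.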

What to Include in README.txt

interface ExportManifest {
  exportedAt: string;
  requestId: string;
  dataSubjectEmail: string;
}

function buildReadme(manifest: ExportManifest): string {
  return `
GDPR DATA EXPORT
================

This file contains all personal data held about you by Viprasol Tech as of:
${manifest.exportedAt}

Request ID: ${manifest.requestId}
Email: ${manifest.dataSubjectEmail}

FILES IN THIS EXPORT
--------------------
- profile.json      Your account identity and contact information
- account.json      Team memberships and access permissions
- activity.json     Login sessions, feature usage, and audit events
- content.json      Content you have created on the platform
- billing.json      Subscription history, invoices, and payment methods
- communications.json  Emails and support ticket history
- settings.json     Notification preferences and account settings

DATA FORMATS
------------
All data is provided in JSON format. Timestamps are in ISO 8601 (UTC).
Currency amounts are in the smallest unit (e.g., cents for USD).

YOUR RIGHTS
-----------
Under GDPR you also have the right to:
- Correct inaccurate data: https://viprasol.com/privacy/correct
- Request deletion: https://viprasol.com/privacy/delete
- Withdraw consent: https://viprasol.com/privacy/consent
- Lodge a complaint: https://ico.org.uk/ (UK) or your local supervisory authority

Questions: privacy@viprasol.com
`.trim();
}

Cost and Timeline Estimates

| Component | Timeline | Cost (USD) |
| --- | --- | --- |
| Request intake + verification email | 1–2 days | $800–$1,600 |
| Data collection across 5–10 tables | 2–3 days | $1,600–$2,500 |
| ZIP packaging + S3 upload + secure download | 1–2 days | $800–$1,600 |
| Admin dashboard + due date monitoring | 1–2 days | $800–$1,600 |
| Full DSAR pipeline (automated) | 1–2 weeks | $5,000–$10,000 |
| Legal review of data inventory | 2–5 days | $2,000–$5,000 |

GDPR fine risk for non-compliance: up to €20M or 4% of global annual turnover, whichever is higher. The engineering cost is a rounding error by comparison.




Working With Viprasol

We help SaaS companies implement GDPR compliance engineering: from data mapping through automated fulfillment pipelines and privacy-by-design architecture reviews. Our team has shipped GDPR tooling for SaaS products with EU customer bases across fintech, health, and enterprise software.

What we deliver:

  • Complete DSAR intake and automated export pipeline
  • Data inventory and classification documentation
  • Deletion (right-to-erasure) implementation
  • Consent management and preference center
  • Privacy impact assessments for new features

Learn about our SaaS development services or contact us to discuss your compliance requirements.


About the Author


Viprasol Tech Team

Custom Software Development Specialists

The Viprasol Tech team specialises in algorithmic trading software, AI agent systems, and SaaS development. With 100+ projects delivered across MT4/MT5 EAs, fintech platforms, and production AI systems, the team brings deep technical experience to every engagement. Based in India, serving clients globally.

MT4/MT5 EA Development · AI Agent Systems · SaaS Development · Algorithmic Trading
