B2B API Integration: Connecting Enterprise Systems Without Losing Your Mind

Integrate enterprise APIs — ERP, CRM, HR systems — with practical patterns for rate limiting, data transformation, error handling, and sync strategies.

Viprasol Tech Team
March 30, 2026
12 min read

Enterprise API integrations are harder than they look. The APIs often have quirks: inconsistent pagination, rate limits that change without notice, batch endpoints that silently drop records, and authentication flows that require a phone call to the vendor's support team to understand.

This guide covers the patterns that survive contact with real enterprise systems — ERP, CRM, HRIS — and the implementation details that keep integrations running reliably in production.


Common Enterprise APIs and Their Quirks

System | API Style | Auth | Common Gotchas
Salesforce | REST + Bulk | OAuth2 | 15-minute access tokens, per-org API limits
SAP | REST / OData / BAPI | Basic, OAuth2, SAML | Per-system differences, versioning complexity
Workday | REST + RaaS reports | OAuth2 | Tenant-specific URLs, SOAP still common
NetSuite | REST + SuiteTalk SOAP | OAuth1/TBA | Record locking on concurrent writes
HubSpot | REST | OAuth2 | 100 req/10s limit, association API quirks
Microsoft Dynamics | OData | OAuth2 (Azure AD) | FetchXML for complex queries
QuickBooks Online | REST | OAuth2 | 500 req/min rate limit, sandbox != production

The first rule of enterprise integrations: read the changelog before assuming behavior. SAP S/4HANA Cloud behaves differently from SAP ECC. Salesforce's limits differ between orgs. Always verify against the target customer's instance.


Pattern 1: The Integration SDK Layer

Never call third-party APIs directly from your business logic. Wrap each integration behind a typed SDK that your application code imports.

// integrations/salesforce/client.ts
import axios, { AxiosInstance } from 'axios';
import { TokenCache } from '../shared/tokenCache';

interface SalesforceConfig {
  instanceUrl: string;
  clientId: string;
  clientSecret: string;
  refreshToken: string;
}

interface SalesforceContact {
  Id: string;
  FirstName: string;
  LastName: string;
  Email: string;
  AccountId: string;
  Phone?: string;
  CreatedDate: string;
  LastModifiedDate: string;
}

export class SalesforceClient {
  private http: AxiosInstance;
  private tokenCache: TokenCache;

  constructor(private config: SalesforceConfig) {
    this.tokenCache = new TokenCache(`sf:${config.instanceUrl}`);
    this.http = axios.create({ baseURL: config.instanceUrl });

    // Intercept every request to attach fresh access token
    this.http.interceptors.request.use(async (req) => {
      const token = await this.getAccessToken();
      req.headers.Authorization = `Bearer ${token}`;
      return req;
    });

    // Retry once on 401 with a fresh token; the flag prevents an infinite
    // retry loop if the refresh itself keeps producing bad tokens
    this.http.interceptors.response.use(
      (res) => res,
      async (error) => {
        if (error.response?.status === 401 && !(error.config as any).__retried) {
          (error.config as any).__retried = true;
          await this.tokenCache.invalidate();
          const token = await this.getAccessToken();
          error.config.headers.Authorization = `Bearer ${token}`;
          return this.http.request(error.config);
        }
        throw error;
      }
    );
  }

  private async getAccessToken(): Promise<string> {
    const cached = await this.tokenCache.get();
    if (cached) return cached;

    // The token endpoint expects form encoding; passing a plain object would
    // make axios serialize the body as JSON despite the header, so use
    // URLSearchParams to serialize explicitly
    const res = await axios.post(
      `${this.config.instanceUrl}/services/oauth2/token`,
      new URLSearchParams({
        grant_type: 'refresh_token',
        client_id: this.config.clientId,
        client_secret: this.config.clientSecret,
        refresh_token: this.config.refreshToken,
      }),
      { headers: { 'Content-Type': 'application/x-www-form-urlencoded' } }
    );

    const { access_token, expires_in } = res.data;
    // Salesforce token responses may omit expires_in; fall back to the
    // 15-minute access-token lifetime, minus a 60s safety margin
    const ttlMs = ((expires_in ?? 15 * 60) - 60) * 1000;
    await this.tokenCache.set(access_token, ttlMs);
    return access_token;
  }

  async getContacts(options: {
    modifiedAfter?: Date;
    limit?: number;
    offset?: number;
  } = {}): Promise<{ records: SalesforceContact[]; totalSize: number; done: boolean }> {
    let query = 'SELECT Id, FirstName, LastName, Email, AccountId, Phone, CreatedDate, LastModifiedDate FROM Contact';

    // SOQL datetime literals are unquoted ISO 8601, so toISOString() works here
    if (options.modifiedAfter) {
      query += ` WHERE LastModifiedDate > ${options.modifiedAfter.toISOString()}`;
    }

    query += ' ORDER BY LastModifiedDate ASC';

    // Salesforce caps OFFSET at 2,000 rows; beyond that, page with the
    // nextRecordsUrl from the query response or switch to the Bulk API
    if (options.limit) query += ` LIMIT ${options.limit}`;
    if (options.offset) query += ` OFFSET ${options.offset}`;

    const res = await this.http.get('/services/data/v59.0/query', {
      params: { q: query },
    });

    return res.data;
  }

  async upsertContact(externalId: string, data: Partial<SalesforceContact>): Promise<string> {
    const res = await this.http.patch(
      `/services/data/v59.0/sobjects/Contact/ExternalId__c/${externalId}`,
      data
    );
    return res.data.id;
  }
}
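The TokenCache imported above isn't shown in this post. Below is a minimal in-memory sketch; the method signatures (get, set with a TTL in milliseconds, invalidate) are inferred from how the client uses it, and a production version would likely back onto Redis so multiple workers share one token.

```typescript
// shared/tokenCache.ts — hypothetical in-memory version; a real deployment
// would likely use Redis so all processes share the same cached token
export class TokenCache {
  private token: string | null = null;
  private expiresAt = 0;

  // The key namespaces tokens per integration (e.g., `sf:${instanceUrl}`)
  constructor(private key: string) {}

  async get(): Promise<string | null> {
    if (this.token !== null && Date.now() < this.expiresAt) return this.token;
    return null;
  }

  async set(token: string, ttlMs: number): Promise<void> {
    this.token = token;
    this.expiresAt = Date.now() + ttlMs;
  }

  async invalidate(): Promise<void> {
    this.token = null;
    this.expiresAt = 0;
  }
}
```

The TTL is applied at set time, which is why the client subtracts a safety margin from the provider's reported expiry before caching.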


Pattern 2: Incremental Sync with Watermarks

Full syncs (re-fetching all data every time) are slow, expensive, and hit rate limits. Incremental sync fetches only records modified since the last successful sync.

// services/syncService.ts
import { PrismaClient } from '@prisma/client';
import { subMinutes } from 'date-fns';
import { chunk } from 'lodash';
import { SalesforceClient, SalesforceContact } from '../integrations/salesforce/client';
import { SyncStateRepository } from './syncStateRepo';  // module path assumed

interface SyncState {
  integrationId: string;
  entityType: string;
  lastSyncedAt: Date;
  cursor?: string;         // Some APIs use cursor pagination instead of timestamp
  status: 'idle' | 'running' | 'error';
  lastError?: string;
}

export class ContactSyncService {
  constructor(
    private sfClient: SalesforceClient,
    private db: PrismaClient,
    private syncStateRepo: SyncStateRepository
  ) {}

  async syncContacts(integrationId: string): Promise<{ synced: number; errors: number }> {
    const state = await this.syncStateRepo.get(integrationId, 'contact');
    
    // Overlap window: refetch records modified up to 5 minutes before the
    // last sync, to absorb clock skew between Salesforce and our server
    const modifiedAfter = state?.lastSyncedAt
      ? subMinutes(state.lastSyncedAt, 5)
      : new Date('2020-01-01');  // First sync: backfill from a fixed start date

    await this.syncStateRepo.setRunning(integrationId, 'contact');

    let synced = 0;
    let errors = 0;
    let offset = 0;
    const batchSize = 200;
    const syncStartedAt = new Date();

    try {
      while (true) {
        const { records, done } = await this.sfClient.getContacts({
          modifiedAfter,
          limit: batchSize,
          offset,
        });

        if (records.length === 0) break;

        // Process in parallel batches of 10
        const batches = chunk(records, 10);
        for (const batch of batches) {
          const results = await Promise.allSettled(
            batch.map(contact => this.upsertContact(integrationId, contact))
          );

          synced += results.filter(r => r.status === 'fulfilled').length;
          errors += results.filter(r => r.status === 'rejected').length;

          // Log failures but don't abort the sync
          results.forEach((result, i) => {
            if (result.status === 'rejected') {
              console.error(`Failed to sync contact ${batch[i].Id}:`, result.reason);
            }
          });
        }

        if (done || records.length < batchSize) break;
        offset += batchSize;
      }

      // Only update watermark after successful completion
      await this.syncStateRepo.setComplete(integrationId, 'contact', syncStartedAt);
    } catch (err) {
      await this.syncStateRepo.setError(integrationId, 'contact', String(err));
      throw err;
    }

    return { synced, errors };
  }

  private async upsertContact(integrationId: string, sf: SalesforceContact): Promise<void> {
    await this.db.contact.upsert({
      where: { 
        integrationId_externalId: { 
          integrationId, 
          externalId: sf.Id 
        } 
      },
      create: {
        integrationId,
        externalId: sf.Id,
        firstName: sf.FirstName,
        lastName: sf.LastName,
        email: sf.Email,
        phone: sf.Phone ?? null,
        externalCreatedAt: new Date(sf.CreatedDate),
        externalUpdatedAt: new Date(sf.LastModifiedDate),
      },
      update: {
        firstName: sf.FirstName,
        lastName: sf.LastName,
        email: sf.Email,
        phone: sf.Phone ?? null,
        externalUpdatedAt: new Date(sf.LastModifiedDate),
      },
    });
  }
}
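The SyncStateRepository the service depends on isn't defined in this post either. A hedged in-memory sketch is below: the method names (get, setRunning, setComplete, setError) are taken from the calls above, but the storage is an assumption, and production code would persist this in the database so watermarks survive restarts.

```typescript
// services/syncStateRepo.ts — in-memory sketch; persist to your DB in
// production so watermarks survive process restarts
// Mirrors the SyncState shape used above (cursor omitted for brevity)
interface SyncState {
  integrationId: string;
  entityType: string;
  lastSyncedAt?: Date;
  status: 'idle' | 'running' | 'error';
  lastError?: string;
}

export class SyncStateRepository {
  private states = new Map<string, SyncState>();

  private key(integrationId: string, entityType: string): string {
    return `${integrationId}:${entityType}`;
  }

  async get(integrationId: string, entityType: string): Promise<SyncState | null> {
    return this.states.get(this.key(integrationId, entityType)) ?? null;
  }

  async setRunning(integrationId: string, entityType: string): Promise<void> {
    const prev = await this.get(integrationId, entityType);
    this.states.set(this.key(integrationId, entityType), {
      integrationId,
      entityType,
      lastSyncedAt: prev?.lastSyncedAt,  // keep the old watermark while running
      status: 'running',
    });
  }

  async setComplete(integrationId: string, entityType: string, syncedAt: Date): Promise<void> {
    this.states.set(this.key(integrationId, entityType), {
      integrationId,
      entityType,
      lastSyncedAt: syncedAt,
      status: 'idle',
    });
  }

  async setError(integrationId: string, entityType: string, lastError: string): Promise<void> {
    const prev = await this.get(integrationId, entityType);
    this.states.set(this.key(integrationId, entityType), {
      integrationId,
      entityType,
      lastSyncedAt: prev?.lastSyncedAt,  // don't lose the last good watermark
      status: 'error',
      lastError,
    });
  }
}
```

Note that setRunning and setError preserve the previous watermark; only setComplete advances it, which is what makes a failed sync safe to retry.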

Pattern 3: Rate Limiting with Token Bucket

Enterprise APIs have rate limits that, if exceeded, result in 429 errors (and sometimes temporary bans). Implement a token bucket rate limiter that respects the API's documented limits:

// shared/rateLimiter.ts
export class TokenBucketRateLimiter {
  private tokens: number;
  private lastRefill: number;

  constructor(
    private maxTokens: number,       // e.g., 100 (HubSpot: 100 req/10s)
    private refillIntervalMs: number  // e.g., 10_000
  ) {
    this.tokens = maxTokens;
    this.lastRefill = Date.now();
  }

  async acquire(cost = 1): Promise<void> {
    // Refill tokens based on elapsed time
    const now = Date.now();
    const elapsed = now - this.lastRefill;
    const tokensToAdd = Math.floor(elapsed / this.refillIntervalMs) * this.maxTokens;
    
    if (tokensToAdd > 0) {
      this.tokens = Math.min(this.maxTokens, this.tokens + tokensToAdd);
      this.lastRefill = now;
    }

    if (this.tokens < cost) {
      // Tokens refill in whole-interval chunks (see above), so wait until
      // the next refill boundary rather than a fractional estimate
      const waitMs = this.refillIntervalMs - (Date.now() - this.lastRefill);
      await new Promise(resolve => setTimeout(resolve, Math.max(waitMs, 1)));
      return this.acquire(cost);
    }

    this.tokens -= cost;
  }
}

// HubSpot: 100 API calls per 10 seconds. (Salesforce instead enforces a
// rolling 24-hour org-wide quota; size the bucket to the target API's limit.)
const hubspotRateLimiter = new TokenBucketRateLimiter(100, 10_000);

// Wrap every outbound call
async function rateLimitedRequest<T>(fn: () => Promise<T>): Promise<T> {
  await hubspotRateLimiter.acquire();
  return fn();
}
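A client-side bucket doesn't guarantee the server never says no: limits are shared across all your processes and can change without notice. As a complement, here is a hedged sketch of a 429-aware retry wrapper. The err.response.status and retry-after shapes assume axios-style errors, and the file and function names are ours.

```typescript
// shared/retry429.ts — sketch; assumes axios-style errors (err.response.status,
// err.response.headers). Use alongside the token bucket, not instead of it.
export async function withRateLimitRetry<T>(
  fn: () => Promise<T>,
  maxRetries = 3
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err: any) {
      // Only retry 429s, and only up to maxRetries times
      if (err?.response?.status !== 429 || attempt >= maxRetries) throw err;

      // Prefer the server's Retry-After header (seconds); otherwise fall
      // back to capped exponential backoff with a little jitter
      const retryAfter = Number(err.response.headers?.['retry-after']);
      const delayMs = Number.isFinite(retryAfter) && retryAfter > 0
        ? retryAfter * 1000
        : Math.min(30_000, 2 ** attempt * 1000) + Math.random() * 250;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

Non-429 errors propagate immediately, so genuine failures (bad payloads, auth problems) surface instead of being masked by retries.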


Pattern 4: Resilient Outbound Sync

When pushing data back to the third-party system (e.g., updating Salesforce when your CRM changes), use a queue with retry rather than inline API calls:

// workers/outboundSync.ts
import { Worker } from 'bullmq';
import { db } from '../db';        // assumed shared Prisma client instance
import { redis } from '../redis';  // assumed shared IORedis connection
import { SalesforceClient, SalesforceContact } from '../integrations/salesforce/client';

interface OutboundSyncJob {
  integrationId: string;
  operation: 'create' | 'update' | 'delete';
  entityType: 'contact' | 'deal' | 'company';
  externalId?: string;
  payload: Record<string, unknown>;
}

const outboundWorker = new Worker<OutboundSyncJob>(
  'outbound-sync',
  async (job) => {
    const { integrationId, operation, entityType, externalId, payload } = job.data;
    
    const integration = await db.integration.findUnique({
      where: { id: integrationId },
    });
    if (!integration) {
      throw new Error(`Integration ${integrationId} not found`);
    }

    const sfClient = new SalesforceClient({
      instanceUrl: integration.instanceUrl,
      clientId: process.env.SF_CLIENT_ID!,
      clientSecret: process.env.SF_CLIENT_SECRET!,
      refreshToken: integration.refreshToken,
    });

    switch (operation) {
      case 'create':
      case 'update':
        if (!externalId) throw new Error('externalId required for upsert');
        await sfClient.upsertContact(externalId, payload as Partial<SalesforceContact>);
        break;
      case 'delete':
        if (!externalId) throw new Error('externalId required for delete');
        // Salesforce convention: set a custom "deleted" flag, not a hard delete
        await sfClient.upsertContact(externalId, { IsDeleted__c: true } as any);
        break;
    }

    await db.syncLog.create({
      data: {
        integrationId,
        direction: 'outbound',
        entityType,
        externalId: externalId ?? null,
        operation,
        status: 'success',
      },
    });
  },
  {
    connection: redis,
    concurrency: 5,
    // Note: retry settings (attempts, backoff) are job options in BullMQ.
    // Set them via the Queue's defaultJobOptions or per queue.add call,
    // not here; Worker options don't include them.
  }
);
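The retry policy belongs on the producer side: BullMQ takes attempts and backoff as job options when jobs are enqueued, not as Worker options. A sketch of the enqueue side follows; the queue name matches the worker above, while the redis import and module paths are assumptions.

```typescript
// services/outboundQueue.ts — producer-side wiring (sketch)
import { Queue } from 'bullmq';
import { redis } from '../redis';  // assumed shared IORedis connection
import type { OutboundSyncJob } from '../workers/outboundSync';  // assumes the interface is exported

export const outboundQueue = new Queue<OutboundSyncJob>('outbound-sync', {
  connection: redis,
  defaultJobOptions: {
    attempts: 5,
    backoff: { type: 'exponential', delay: 2000 },  // 2s, 4s, 8s, ...
    removeOnComplete: 1000,  // keep only the most recent completed jobs
    removeOnFail: false,     // keep failures around for inspection
  },
});

// Enqueue an update whenever a local contact changes
export async function queueContactUpdate(
  integrationId: string,
  externalId: string,
  payload: Record<string, unknown>
): Promise<void> {
  await outboundQueue.add('update-contact', {
    integrationId,
    operation: 'update',
    entityType: 'contact',
    externalId,
    payload,
  });
}
```

Because the options live on the Queue, every producer gets the same retry behavior without repeating it at each call site.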

Data Transformation Layer

Enterprise APIs return data in their own schema. Transform into your internal schema with an explicit mapper — don't let external field names leak into your domain model.

// integrations/salesforce/mappers.ts
import { SalesforceContact } from './client';
import { Contact } from '@prisma/client';

export function mapSalesforceContact(sf: SalesforceContact): Omit<Contact, 'id' | 'createdAt' | 'updatedAt'> {
  return {
    externalId: sf.Id,
    firstName: sf.FirstName ?? '',
    lastName: sf.LastName,
    email: sf.Email,
    phone: normalizePhone(sf.Phone),
    externalCreatedAt: new Date(sf.CreatedDate),
    externalUpdatedAt: new Date(sf.LastModifiedDate),
    // Internal fields not in Salesforce — set defaults
    source: 'salesforce',
    status: 'active',
    ownerId: null,  // Resolved separately
  };
}

function normalizePhone(phone?: string): string | null {
  if (!phone) return null;
  // Strip everything non-numeric
  const digits = phone.replace(/\D/g, '');
  // US numbers: 10 digits
  if (digits.length === 10) return `+1${digits}`;
  if (digits.length === 11 && digits[0] === '1') return `+${digits}`;
  return phone;  // Return original if we can't normalize
}

Integration Cost Reference

Scope | Timeline | Cost Range
Single CRM read sync (Salesforce → your DB) | 3–6 weeks | $8,000–20,000
Bidirectional CRM sync | 6–10 weeks | $18,000–40,000
ERP integration (NetSuite/SAP read) | 4–8 weeks | $12,000–30,000
Full ERP bidirectional | 8–16 weeks | $30,000–80,000
iPaaS (Zapier/Make for simple flows) | 1–2 weeks | $2,000–5,000 setup
Custom integration platform (multi-tenant) | 16–24 weeks | $80,000–200,000

iPaaS tools (Zapier, Make, Tray.io) work well for simple linear flows. Custom integration code is necessary for complex transformation logic, high-volume syncs, or when the iPaaS doesn't support the target API.


Working With Viprasol

We've built B2B integrations with Salesforce, HubSpot, NetSuite, SAP, Workday, QuickBooks, and a dozen more enterprise systems. Our integration work includes the SDK layer, incremental sync, rate limiting, error handling, and the operational monitoring to know when a sync has stalled.

Talk to our integration team about connecting your systems.


About the Author

Viprasol Tech Team

Custom Software Development Specialists

The Viprasol Tech team specialises in algorithmic trading software, AI agent systems, and SaaS development. With 100+ projects delivered across MT4/MT5 EAs, fintech platforms, and production AI systems, the team brings deep technical experience to every engagement. Based in India, serving clients globally.

Tags: MT4/MT5 EA Development · AI Agent Systems · SaaS Development · Algorithmic Trading
