Serverless & Edge Architecture for Full-Stack Apps in 2025
I've been building with serverless since AWS Lambda launched in 2014. Back then, it felt like magic: code that runs without servers. Now, in 2025, serverless has evolved into a comprehensive architecture pattern, and edge computing has joined the party to solve latency issues.
But here's what I've learned: serverless isn't always the answer, and edge isn't always faster. The key is knowing when and how to use each approach. Let me show you the complete picture.
Understanding the Spectrum: Traditional → Serverless → Edge
Traditional Servers
What it is:
- Always-running servers (VPS, dedicated hardware)
- You manage everything: OS, runtime, scaling, monitoring
When to use:
- Predictable, consistent traffic
- Long-running processes
- Legacy applications
- Full control requirements
Serverless
What it is:
- Functions run on-demand
- Auto-scaling from 0 to thousands of instances
- Pay per execution
When to use:
- Irregular traffic patterns
- Event-driven architectures
- Rapid development and deployment
- Cost optimization for low-traffic apps
Edge Computing
What it is:
- Code runs close to users geographically
- Ultra-low latency (< 50ms globally)
- Limited runtime environment (see the sketch after these lists)
When to use:
- Global applications
- Latency-critical operations
- CDN-like functionality
- Real-time personalization
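To make the "limited runtime environment" point concrete: edge runtimes generally expose Web-standard APIs (fetch, Request, Response, Web Crypto) rather than the full Node.js API surface, so handlers end up looking like the sketch below. This is a minimal, platform-agnostic illustration rather than any one provider's API; each platform wraps a handler like this in its own entry point.
// Minimal edge-style handler: only Web-standard APIs, no Node built-ins like fs or net.
export async function handleEdgeRequest(request: Request): Promise<Response> {
  const url = new URL(request.url);
  // Web Crypto is available at the edge; Node's crypto module typically is not
  const requestId = crypto.randomUUID();
  return new Response(JSON.stringify({ path: url.pathname, requestId }), {
    headers: {
      'Content-Type': 'application/json',
      'x-request-id': requestId,
    },
  });
}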
Serverless Platforms in 2025
Vercel Edge Functions
Perfect for Next.js applications and edge-first architectures.
// app/api/edge-example/route.ts
import { NextRequest, NextResponse } from 'next/server';
export const runtime = 'edge';
export async function GET(request: NextRequest) {
const country = request.geo?.country || 'US';
const city = request.geo?.city || 'Unknown';
return NextResponse.json({
message: `Hello from ${city}, ${country}!`,
timestamp: new Date().toISOString(),
region: process.env.VERCEL_REGION,
});
}
export async function POST(request: NextRequest) {
const body = await request.json();
// Process the data at the edge
const processed = {
...body,
processedAt: new Date().toISOString(),
region: process.env.VERCEL_REGION,
};
return NextResponse.json(processed);
}
Edge Middleware Example:
// middleware.ts
import { NextRequest, NextResponse } from 'next/server';
export function middleware(request: NextRequest) {
const country = request.geo?.country;
const pathname = request.nextUrl.pathname;
// Redirect based on geography
if (pathname === '/pricing' && country === 'IN') {
return NextResponse.redirect(new URL('/pricing/india', request.url));
}
// Add custom headers
const response = NextResponse.next();
response.headers.set('x-user-country', country || 'unknown');
response.headers.set('x-pathname', pathname);
return response;
}
export const config = {
matcher: [
'/((?!api|_next/static|_next/image|favicon.ico).*)',
],
};
AWS Lambda
The OG serverless platform, now with better cold start times and more runtime options.
// lambda/image-processor.ts
import { APIGatewayProxyHandler } from 'aws-lambda';
import { S3Client, GetObjectCommand, PutObjectCommand } from '@aws-sdk/client-s3';
const s3 = new S3Client({});
export const handler: APIGatewayProxyHandler = async (event) => {
try {
const { bucket, key } = JSON.parse(event.body || '{}');
// Get image from S3
const object = await s3.send(new GetObjectCommand({
Bucket: bucket,
Key: key,
}));
const imageBuffer = Buffer.from(await object.Body!.transformToByteArray());
// Process image (resize, optimize, etc.)
const processedImage = await processImage(imageBuffer);
// Upload processed image
await s3.send(new PutObjectCommand({
Bucket: bucket,
Key: `processed/${key}`,
Body: processedImage,
ContentType: 'image/webp',
}));
return {
statusCode: 200,
body: JSON.stringify({
message: 'Image processed successfully',
location: `s3://${bucket}/processed/${key}`,
}),
};
} catch (error) {
console.error('Error:', error);
return {
statusCode: 500,
body: JSON.stringify({
error: 'Failed to process image',
}),
};
}
};
async function processImage(imageBuffer: Buffer): Promise<Buffer> {
// Image processing logic here
// Could use Sharp, Canvas, or other libraries
return imageBuffer;
}
SAM Template:
# template.yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  ImageProcessorFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: dist/
      Handler: image-processor.handler
      Runtime: nodejs20.x
      Timeout: 30
      MemorySize: 1024
      Environment:
        Variables:
          NODE_ENV: production
      Events:
        ImageUpload:
          Type: Api
          Properties:
            Path: /process-image
            Method: post
Cloudflare Workers
Excellent for global edge computing with a generous free tier.
// worker.ts
interface Env {
KV_NAMESPACE: KVNamespace;
R2_BUCKET: R2Bucket;
}
export default {
async fetch(request: Request, env: Env): Promise<Response> {
const url = new URL(request.url);
const path = url.pathname;
// Handle different routes
if (path.startsWith('/api/')) {
return handleAPI(request, env);
}
if (path.startsWith('/cache/')) {
return handleCache(request, env);
}
return new Response('Hello from Cloudflare Workers!', {
headers: { 'Content-Type': 'text/plain' },
});
},
};
async function handleAPI(request: Request, env: Env): Promise<Response> {
const url = new URL(request.url);
const endpoint = url.pathname.replace('/api/', '');
switch (endpoint) {
case 'user':
return handleUser(request, env);
case 'analytics':
return handleAnalytics(request, env);
default:
return new Response('API endpoint not found', { status: 404 });
}
}
async function handleUser(request: Request, env: Env): Promise<Response> {
if (request.method === 'GET') {
const userId = new URL(request.url).searchParams.get('id');
const user = await env.KV_NAMESPACE.get(`user:${userId}`);
if (!user) {
return new Response('User not found', { status: 404 });
}
return new Response(user, {
headers: { 'Content-Type': 'application/json' },
});
}
if (request.method === 'POST') {
const userData = await request.json();
const userId = crypto.randomUUID();
await env.KV_NAMESPACE.put(
`user:${userId}`,
JSON.stringify({ ...userData, id: userId })
);
return new Response(JSON.stringify({ id: userId }), {
headers: { 'Content-Type': 'application/json' },
});
}
return new Response('Method not allowed', { status: 405 });
}
async function handleCache(request: Request, env: Env): Promise<Response> {
const cache = caches.default;
const cacheKey = new Request(request.url, request);
// Check cache first
let response = await cache.match(cacheKey);
if (!response) {
// Generate response
response = new Response(JSON.stringify({
timestamp: new Date().toISOString(),
cached: false,
}), {
headers: {
'Content-Type': 'application/json',
'Cache-Control': 'max-age=300', // 5 minutes
},
});
// Store in cache
await cache.put(cacheKey, response.clone());
}
return response;
}
Event-Driven Architecture
Serverless shines in event-driven systems.
Queue Processing
// Vercel: api/process-queue.ts
import { NextRequest, NextResponse } from 'next/server';
import { Queue, Worker } from 'bullmq';
import Redis from 'ioredis';
// BullMQ requires maxRetriesPerRequest: null on connections shared with a Worker
const redis = new Redis(process.env.REDIS_URL!, { maxRetriesPerRequest: null });
const emailQueue = new Queue('email', { connection: redis });
export async function POST(request: NextRequest) {
const { email, template, data } = await request.json();
// Add job to queue
await emailQueue.add('send-email', {
to: email,
template,
data,
}, {
delay: 1000, // 1 second delay
attempts: 3,
backoff: {
type: 'exponential',
delay: 2000,
},
});
return NextResponse.json({ success: true });
}
// Worker process (separate deployment)
export async function processEmailQueue() {
const worker = new Worker('email', async (job) => {
const { to, template, data } = job.data;
// Send email using your preferred service
await sendEmail(to, template, data);
return { sent: true, timestamp: new Date().toISOString() };
}, { connection: redis });
worker.on('completed', (job) => {
console.log(`Email sent: ${job.id}`);
});
worker.on('failed', (job, err) => {
console.error(`Email failed: ${job.id}`, err);
});
}
Webhook Processing
// api/webhooks/stripe.ts
import { NextRequest, NextResponse } from 'next/server';
import Stripe from 'stripe';
const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!);
export async function POST(request: NextRequest) {
const body = await request.text();
const signature = request.headers.get('stripe-signature')!;
let event: Stripe.Event;
try {
event = stripe.webhooks.constructEvent(
body,
signature,
process.env.STRIPE_WEBHOOK_SECRET!
);
} catch (error) {
console.error('Webhook signature verification failed:', error);
return NextResponse.json({ error: 'Invalid signature' }, { status: 400 });
}
// Process different event types
switch (event.type) {
case 'payment_intent.succeeded':
await handlePaymentSuccess(event.data.object as Stripe.PaymentIntent);
break;
case 'customer.subscription.created':
await handleSubscriptionCreated(event.data.object as Stripe.Subscription);
break;
case 'invoice.payment_failed':
await handlePaymentFailed(event.data.object as Stripe.Invoice);
break;
default:
console.log(`Unhandled event type: ${event.type}`);
}
return NextResponse.json({ received: true });
}
async function handlePaymentSuccess(paymentIntent: Stripe.PaymentIntent) {
// Update database, send confirmation email, etc.
console.log(`Payment succeeded: ${paymentIntent.id}`);
}
async function handleSubscriptionCreated(subscription: Stripe.Subscription) {
// Provision access, send welcome email, etc.
console.log(`Subscription created: ${subscription.id}`);
}
async function handlePaymentFailed(invoice: Stripe.Invoice) {
// Send dunning email, update account status, etc.
console.log(`Payment failed: ${invoice.id}`);
}
Database Strategies for Serverless
Serverless-Native Databases
// lib/db-strategies.ts
// 1. Vercel Postgres (Neon)
import { sql } from '@vercel/postgres';
export async function getUsersWithVercelPostgres() {
try {
const result = await sql`
SELECT id, name, email, created_at
FROM users
ORDER BY created_at DESC
LIMIT 10
`;
return result.rows;
} catch (error) {
console.error('Database error:', error);
throw error;
}
}
// 2. PlanetScale (Serverless MySQL)
import mysql from 'mysql2/promise';
const planetscale = mysql.createPool({
host: process.env.DATABASE_HOST,
user: process.env.DATABASE_USERNAME,
password: process.env.DATABASE_PASSWORD,
ssl: {
rejectUnauthorized: true,
},
});
export async function getUsersWithPlanetScale() {
const [rows] = await planetscale.execute(
'SELECT id, name, email, created_at FROM users ORDER BY created_at DESC LIMIT 10'
);
return rows;
}
// 3. Supabase (Serverless Postgres)
import { createClient } from '@supabase/supabase-js';
const supabase = createClient(
process.env.SUPABASE_URL!,
process.env.SUPABASE_ANON_KEY!
);
export async function getUsersWithSupabase() {
const { data, error } = await supabase
.from('users')
.select('id, name, email, created_at')
.order('created_at', { ascending: false })
.limit(10);
if (error) throw error;
return data;
}
// 4. DynamoDB (NoSQL)
import { DynamoDBClient } from '@aws-sdk/client-dynamodb';
import { DynamoDBDocumentClient, ScanCommand } from '@aws-sdk/lib-dynamodb';
const client = new DynamoDBClient({ region: 'us-east-1' });
const docClient = DynamoDBDocumentClient.from(client);
export async function getUsersWithDynamoDB() {
const command = new ScanCommand({
TableName: 'users',
Limit: 10,
});
const response = await docClient.send(command);
return response.Items;
}
Connection Pooling
// lib/db-pool.ts
import { Pool } from 'pg';
// Create connection pool (outside handler for reuse)
const pool = new Pool({
connectionString: process.env.DATABASE_URL,
max: 1, // Serverless: keep connections low
idleTimeoutMillis: 1000,
connectionTimeoutMillis: 1000,
});
export async function query(text: string, params?: any[]) {
const start = Date.now();
try {
const result = await pool.query(text, params);
const duration = Date.now() - start;
console.log('Query executed', { text, duration, rows: result.rowCount });
return result;
} catch (error) {
console.error('Query error', { text, error });
throw error;
}
}
// Graceful shutdown
process.on('SIGINT', () => {
pool.end(() => {
console.log('Database pool closed');
process.exit(0);
});
});
Hybrid Architecture Patterns
Sometimes you need a mix of traditional, serverless, and edge.
Microservices with Mixed Deployment
// Architecture example
const services = {
// Edge: Global, low-latency
auth: 'edge-function', // User auth, sessions
cdn: 'edge-function', // Asset delivery, caching
// Serverless: Scalable, event-driven
api: 'serverless-function', // REST API, CRUD operations
webhooks: 'serverless-function', // Payment processing, integrations
jobs: 'serverless-function', // Background jobs, email sending
// Traditional: Long-running, stateful
database: 'traditional-server', // Primary database
analytics: 'traditional-server', // Data processing, reporting
websocket: 'traditional-server', // Real-time connections
};
API Gateway Pattern
// middleware.ts - Route requests to appropriate services
import { NextRequest, NextResponse } from 'next/server';
export async function middleware(request: NextRequest) {
const { pathname } = request.nextUrl;
// Edge-handled routes
if (pathname.startsWith('/auth/') || pathname.startsWith('/cdn/')) {
return NextResponse.next(); // Handle at edge
}
// Serverless API routes
if (pathname.startsWith('/api/')) {
// Add correlation ID for tracing (request headers must be cloned to be modified)
const requestHeaders = new Headers(request.headers);
requestHeaders.set('x-correlation-id', crypto.randomUUID());
return NextResponse.next({ request: { headers: requestHeaders } });
}
// WebSocket connections - redirect to traditional server
if (pathname.startsWith('/ws/')) {
return NextResponse.redirect(
new URL(pathname, process.env.WEBSOCKET_SERVER_URL)
);
}
return NextResponse.next();
}
Cost Optimization
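Serverless bills are driven mostly by three inputs: how often a function runs, how long it runs, and how much memory it is allocated. Before tuning individual functions, it helps to sketch that math. The rates below are illustrative placeholders, not any provider's current pricing; plug in the numbers from your own pricing page.
// Rough, illustrative cost model for a pay-per-use function platform.
interface UsageEstimate {
  invocationsPerMonth: number;
  avgDurationMs: number;
  memoryMb: number;
}

function estimateMonthlyCost(
  usage: UsageEstimate,
  pricePerMillionInvocations = 0.20, // placeholder rate
  pricePerGbSecond = 0.0000166667 // placeholder rate
): number {
  // Compute cost scales with memory x duration x invocations (GB-seconds)
  const gbSeconds =
    (usage.memoryMb / 1024) * (usage.avgDurationMs / 1000) * usage.invocationsPerMonth;
  const computeCost = gbSeconds * pricePerGbSecond;
  const requestCost = (usage.invocationsPerMonth / 1_000_000) * pricePerMillionInvocations;
  return computeCost + requestCost;
}

// Example: 2M invocations/month, 120ms average duration, 256MB memory
console.log(estimateMonthlyCost({ invocationsPerMonth: 2_000_000, avgDurationMs: 120, memoryMb: 256 }));
A quick model like this makes the trade-off visible: doubling memory often shortens duration, so the cheapest configuration is rarely the smallest one.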
Function Sizing and Timeout Strategies
// vercel.json - Function configuration
{
"functions": {
"app/api/quick-operations/route.ts": {
"memory": 256,
"maxDuration": 10
},
"app/api/heavy-processing/route.ts": {
"memory": 1024,
"maxDuration": 60
},
"app/api/background-jobs/route.ts": {
"memory": 512,
"maxDuration": 300
}
}
}
Cold Start Mitigation
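The biggest practical lever is doing expensive initialization once per container rather than once per request: anything created at module scope survives across warm invocations. Here is a minimal sketch of that pattern, using an S3 client as a stand-in for any heavy dependency; the lazy getter and file path are assumptions for illustration, not a required API.
// lib/heavy-client.ts
import { S3Client } from '@aws-sdk/client-s3';

// Module scope is initialized once per container and reused on warm invocations
let client: S3Client | undefined;

export function getS3Client(): S3Client {
  // Lazy init: requests that never touch S3 skip the setup cost entirely
  if (!client) {
    client = new S3Client({ region: process.env.AWS_REGION });
  }
  return client;
}
The helper below complements this by flagging whether a given invocation hit a cold container, which is useful for logging and latency analysis: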
// lib/warm-up.ts
let isWarm = false;
export function markWarm() {
isWarm = true;
}
export function checkWarm(): boolean {
return isWarm;
}
// Usage in API routes (import checkWarm/markWarm from lib/warm-up.ts)
import { NextRequest, NextResponse } from 'next/server';
export async function GET(request: NextRequest) {
if (!checkWarm()) {
// First request - do minimal setup
markWarm();
console.log('Cold start detected');
}
// Your handler logic
return NextResponse.json({ warm: checkWarm() });
}
Monitoring and Debugging
Observability Stack
// lib/monitoring.ts
import { trace, SpanStatusCode } from '@opentelemetry/api';
const tracer = trace.getTracer('my-serverless-app');
export async function withTracing<T>(
name: string,
fn: () => Promise<T>
): Promise<T> {
const span = tracer.startSpan(name);
try {
const result = await fn();
span.setStatus({ code: SpanStatusCode.OK });
return result;
} catch (error) {
span.setStatus({
code: SpanStatusCode.ERROR,
message: error instanceof Error ? error.message : 'Unknown error',
});
throw error;
} finally {
span.end();
}
}
// Usage
export async function processOrder(orderId: string) {
return withTracing('process-order', async () => {
const order = await withTracing('fetch-order', () =>
fetchOrderFromDB(orderId)
);
const result = await withTracing('payment-processing', () =>
processPayment(order)
);
await withTracing('send-confirmation', () =>
sendConfirmationEmail(order.email)
);
return result;
});
}
The Bottom Line
Serverless and edge computing aren't just trends; they're architectural patterns that solve real problems:
Choose Serverless When:
- Variable or unpredictable traffic
- Event-driven workflows
- Rapid development cycles
- Cost optimization is important
- You want managed infrastructure
Choose Edge When:
- Global user base
- Latency is critical
- Simple processing logic
- CDN-like functionality needed
Choose Traditional When:
- Predictable, consistent load
- Long-running processes
- Full control requirements
- Complex stateful applications
Key takeaways:
- Mix and match approaches based on requirements
- Start simple, optimize based on real usage
- Monitor costs and performance closely
- Plan for cold starts in serverless
- Use appropriate databases for each architecture
- Implement proper observability from day one
The future is hybrid: using the right tool for each part of your system.
---
What's your experience with serverless? Have you tried edge computing yet? Share your architecture choices in the comments below.