Edge Functions for Ecommerce: What They’re Good At, What They Break, and Where the Boundary Should Sit
Edge functions get pitched as a free upgrade for any web app. They are not. They are a precise tool for a narrow set of jobs — and on the wrong workload they make your store slower, not faster. The teams that get edge runtime right ship pages that load in 200ms anywhere in the world. The teams that misuse it pay platform bills for an architecture that adds latency.
This piece walks through what edge functions are actually good at for ecommerce, what they’re bad at, and how to decide which parts of your storefront belong at the edge versus in your origin server.
What "edge" actually means
An edge function runs in a lightweight JavaScript runtime (V8 isolates, not full Node) deployed to hundreds of points of presence around the world. Cloudflare Workers, Vercel Edge Functions, Deno Deploy, Netlify Edge, and Fastly Compute are the main offerings. When a request hits, it executes at the PoP closest to the user — typically within 30–50ms of them — instead of crossing oceans to reach your origin.
The constraints that come with that speed are real:
- No full Node.js runtime. Most of npm doesn’t work. Native modules don’t work. Heavy frameworks (Next.js full Node mode, full Express apps) don’t work. You get fetch, Web Crypto, and a subset of Web APIs.
- Short execution budgets. Cloudflare Workers caps CPU time at 10ms on the free plan and 30 seconds on paid plans. Vercel Edge Functions must begin returning a response within 25 seconds. The runtime is for fast in-and-out work.
- Cold starts are negligible — but database round trips aren’t. The edge is fast because it’s close to the user. If your function then has to call a database in us-east-1, the user in Tokyo gets a 200ms penalty for the database hop, on every request.
This last point is the one most teams underestimate. The edge is fast for compute-local work. It is not fast for "query my Postgres in Virginia from a PoP in Sydney."
What edge functions are good at for ecommerce
The wins are real and specific:
A/B test variant assignment
Hash the user’s IP or session cookie, assign a variant, set a cookie. The whole thing runs in 5ms with no external calls. Edge handles this beautifully.
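A minimal sketch of deterministic variant assignment, using a dependency-free FNV-1a hash over a visitor ID (the 50/50 split and the variant names are assumptions, not a standard):

```typescript
// Deterministic A/B assignment: the same visitor ID always maps to the
// same variant, with no external calls. FNV-1a is a fast string hash,
// fine for bucketing but not for anything security-sensitive.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash >>> 0;
}

function assignVariant(visitorId: string): "A" | "B" {
  // 50/50 split; widen the modulus for more variants or weighted splits.
  return fnv1a(visitorId) % 2 === 0 ? "A" : "B";
}
```

In an edge handler you'd read the session cookie (or fall back to the client IP), call `assignVariant`, and emit a `Set-Cookie` header so the assignment sticks across requests.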
Geo routing and locale redirects
Reading the cf-ipcountry header (or equivalent) and redirecting EU users to /eu/, German speakers to /de/, etc. Pure header logic. Sub-50ms responses globally.
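The routing logic itself is a small pure function. A sketch, where the country-to-locale table is illustrative rather than exhaustive:

```typescript
// Map the two-letter country code from the CDN's geo header to a locale
// prefix. The table below is an example, not a complete mapping.
const LOCALE_BY_COUNTRY: Record<string, string> = {
  DE: "/de", AT: "/de", CH: "/de",
  FR: "/fr",
  GB: "/en-gb", IE: "/en-gb",
};

// Returns the redirect target for a request path, or null for no redirect.
function localeRedirect(country: string | null, path: string): string | null {
  if (!country) return null;
  const prefix = LOCALE_BY_COUNTRY[country.toUpperCase()];
  if (!prefix || path.startsWith(prefix)) return null; // already localized
  return prefix + path;
}
```

In a Cloudflare Worker, `country` comes from `request.headers.get("cf-ipcountry")`; a non-null result becomes a 302 redirect, all without leaving the PoP.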
Bot mitigation and rate limiting
Reject obvious bots, rate-limit suspicious patterns, validate User-Agent headers. Edge runtimes typically have built-in rate-limiter primitives that make this efficient.
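A fixed-window rate limiter sketch. Note the loud caveat: in-memory state is per-isolate and per-PoP, so this shows the logic only; a production edge setup backs the counter with a platform primitive (Durable Objects, KV, or the platform's built-in rate-limit API):

```typescript
// Fixed-window rate limiter. The Map lives in one isolate at one PoP,
// so this is a sketch of the algorithm, not a globally consistent limiter.
const windows = new Map<string, { windowStart: number; count: number }>();

function allowRequest(
  key: string,             // e.g. the client IP
  limit: number,           // max requests per window
  windowMs: number,        // window length in milliseconds
  now: number = Date.now(),
): boolean {
  const entry = windows.get(key);
  if (!entry || now - entry.windowStart >= windowMs) {
    windows.set(key, { windowStart: now, count: 1 }); // new window
    return true;
  }
  entry.count++;
  return entry.count <= limit;
}
```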
Auth token validation
Verifying a signed JWT can run entirely at the edge using Web Crypto. The auth check happens close to the user; only authenticated requests proceed to the origin. This shaves 100–200ms off every authenticated request.
Header-based personalization
Inject Set-Cookie headers, rewrite query strings, route to different cached variants based on existing cookies. The edge is the natural home for "route this request to the right cached version of this page."
Image optimization
Cloudflare Images, Vercel Image Optimization, and similar services run image resizing and format negotiation at the edge. Serving a webp or avif image resized for the requesting device, from a PoP near the user, is one of the highest-leverage performance moves available.
What edge functions are bad at for ecommerce
This is the section that needs more attention than it usually gets:
Anything that touches your primary database from a single region
If your Postgres or MySQL lives in one AWS region, an edge function in Sydney calling that database is making a 150–300ms round trip on every request. The edge runtime is fast; the network hop to your database is not. You’ve paid for “edge” and gotten centralized latency anyway.
The fix is one of: use an edge-native database (Cloudflare D1, Turso) replicated to PoPs; cache the data at the edge with a TTL and let the edge serve from cache; or don’t run the function at the edge — run it on a regional serverless function instead.
Cart and checkout logic
Cart state requires reads and writes against authoritative storage. Doing this from the edge against a centralized database is slow. Doing it from the edge against an edge KV store is fast but introduces eventual-consistency problems (a user adds an item from one region; their next request hits a different region; the item isn’t there yet).
For carts, run the logic on a regional serverless function or your origin server. Use the edge for the static skeleton of the cart page; do the data work where consistency is reliable.
Payment processing
Stripe webhook signature verification has historically required the official Stripe SDK, which uses Node-specific APIs (recent SDK versions add a Web Crypto provider, but the rest of a payment flow, order creation, database writes, email, still wants a full runtime close to your data). Run payment confirmation in your origin or in a regional serverless function. The edge is the wrong runtime for the workload.
Anything that needs significant memory or sustained CPU time
Server-rendering a complex product page with dozens of dynamic blocks, generating a PDF invoice, processing an image upload — these blow through edge limits. Move them to a regional function or your origin.
SEO-critical SSR
This one’s nuanced. Edge SSR can be fast, but if your page needs data from a database in another region, you’re back to the "200ms penalty per request" problem — and now it’s affecting your TTFB on the page that Google is crawling. For SEO-sensitive pages, measure carefully. Sometimes the regional Node function with proximity to the database is faster end-to-end.
The decision framework
For each route, ask three questions in order:
- Does this route need data from your primary database? If no — it’s a candidate for the edge. If yes — continue.
- Can the data be cached or replicated to the edge? If yes (catalogs, content, locale data) — cache it and run at the edge. If no (cart state, real-time inventory, user account data) — continue.
- Is the route latency-critical or latency-tolerant? Latency-critical (above-the-fold landing pages): consider edge with cached data and stale-while-revalidate. Latency-tolerant (account pages, checkout): run on a regional function close to the database.
The mistake to avoid is the "edge by default" mindset. Edge is a tool. Use it where it pays.
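The three questions above can be encoded as a routing helper. The route-descriptor fields and placement names are illustrative assumptions, not any platform's API:

```typescript
// The decision framework as a function from a route description to a placement.
type Route = {
  needsPrimaryDb: boolean;   // question 1: touches the primary database?
  dataCacheable: boolean;    // question 2: catalog/content vs. cart/inventory
  latencyCritical: boolean;  // question 3: above-the-fold vs. account/checkout
};

type Placement = "edge" | "edge-cached" | "edge-swr" | "regional";

function placeRoute(r: Route): Placement {
  if (!r.needsPrimaryDb) return "edge";               // Q1: pure edge candidate
  if (r.dataCacheable) return "edge-cached";          // Q2: cache it at the edge
  // Q3: uncacheable data. Latency-critical routes may still justify edge
  // with stale-while-revalidate; everything else runs near the database.
  return r.latencyCritical ? "edge-swr" : "regional";
}
```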
What this looks like in practice
For a typical Next.js storefront, we’d architect it like this:
- Edge middleware: A/B test assignment, geo redirects, auth token validation, locale routing.
- Edge functions / SSR: Static product pages with edge ISR, marketing landers, blog content. Cached aggressively, regenerated on data changes.
- Regional serverless: Cart routes, account routes, search-with-filters, checkout pages. Close to the database; full Node.js runtime where needed.
- Origin server: Stripe webhooks, payment processing, admin actions, anything that needs the full plugin ecosystem of WooCommerce or a long-running task.
The result is fast pages where speed matters and reliable infrastructure where consistency matters — with the boundary between them drawn deliberately, not by default.
Cost notes
Edge platform pricing has converged to roughly: Cloudflare Workers — free tier 100k requests/day, paid $5/month + $0.30/million; Vercel Edge Functions — bundled into Pro/Enterprise plans with generous included tiers; Deno Deploy — 1M requests/month free, $20/month + $0.30/million beyond.
For typical ecommerce traffic these are not the line items that matter. Where edge gets expensive is when teams over-route everything through edge middleware and start hitting compound bills across CPU time, KV reads, and bandwidth. Audit your edge spend quarterly; the bill will tell you which routes are doing too much work at the edge.
Practical takeaways
If you’re considering moving an ecommerce site toward edge runtime:
- Move geo redirects and A/B test assignment to the edge first. Highest leverage, lowest risk.
- Cache catalog data at the edge with explicit purge on product updates. This is where edge SSR earns its keep.
- Keep cart, checkout, and account logic at a regional function or origin. Don’t try to run them at the edge against a distant database.
- Measure end-to-end. The edge function timing in your dashboard is not the user’s experience — the database round trip is the cost you don’t see in the platform metrics.
If you’d like an honest assessment of which parts of your storefront would benefit from moving to edge runtime, our engineering team can walk through your application’s request patterns and tell you where the wins actually are. Most stores have two or three high-value moves; not twenty.
FAQ
Are edge functions faster than regional serverless functions?
For requests that don’t touch a centralized database, yes — typically by 100–300ms. For requests that do touch a centralized database, regional serverless functions in the same region as the database are often faster end-to-end. Measure your specific workload before assuming.
Can I run my entire Next.js app at the edge?
Technically you can opt every route into the edge runtime, but it almost always degrades performance. Cart, checkout, and database-heavy routes belong on Node. The Next.js team explicitly recommends a hybrid approach.
Should I use Cloudflare Workers or Vercel Edge Functions?
Cloudflare Workers if you’re not already on Vercel, want lower per-request pricing, and value Cloudflare’s broader product ecosystem (D1, KV, R2, Durable Objects). Vercel Edge Functions if you’re deploying a Next.js application and want the tightest framework integration. Both runtimes are competent; the choice is usually about the rest of your stack.
Does edge runtime work with WooCommerce?
Edge runtime doesn’t run PHP, so WooCommerce’s application code can’t run there directly. The pattern that works is a headless setup: WooCommerce on origin/regional infrastructure exposing a REST API or GraphQL endpoint, with an edge frontend consuming it and caching results aggressively.
What’s the most common edge runtime mistake?
Putting database calls in edge middleware. The middleware fires on every request, and the database call adds 100–300ms to every page load. Either cache the data at the edge or move that logic out of middleware entirely.
Written by OM, EtherLabz engineering. If you’re weighing edge runtime for a storefront and want a sanity check on where it actually pays off, get in touch.