A client rang me up on a Thursday afternoon in 2021 — antiques dealer based in Bath, wanted to take his monthly in-person auctions online. Simple enough, I thought. Then he said "and the bids need to update for everyone watching, live, no page refresh." Right. That's when a "simple WordPress job" turned into a two-week architecture conversation.
I've built over 12,000 sites at Seahawk Media, and real-time features are the ones that bite you if you don't plan them properly from the start. Polling every five seconds sounds fine until you have 200 bidders hammering a single endpoint simultaneously and your hosting bill doubles overnight. So let me show you exactly how I'd build a proper live auction platform today — using Next.js and Supabase — based on what I've actually shipped.
---
Why Next.js and Supabase for This Specifically
Look, there are a dozen ways to do real-time. Socket.io on a Node server, Ably, Pusher, Firebase — I've used all of them at various points. But the Next.js + Supabase combination earns its place here for a specific reason: Supabase Realtime is built on top of PostgreSQL's logical replication, which means your live bid updates and your persistent data layer are the same system. No syncing two sources of truth. No wondering if a bid that went into the WebSocket also made it into the database.
Supabase also gives you Auth, Row Level Security, and Storage out of the box. For an auction site, where "only the auction owner can end a lot" and "a user cannot bid on their own item" are actual business rules, RLS policies in Postgres are genuinely the right tool.
And Next.js because — honestly — App Router with Server Components means you can render the auction catalogue statically, keep SEO happy, and only hydrate the real-time bidding widget on the client. That split matters. You don't want to pay for dynamic rendering on a page that's 90% static content.
---
Designing the Schema First (Don't Skip This)
This is where most people rush and regret it later. I spent an embarrassing three days refactoring the Bath antiques client's schema mid-project because I hadn't thought through the bid history model properly.
Here's the core structure I now use:
- `profiles` — extends Supabase's `auth.users`; stores display name, verified bidder flag, and a `credit_balance` if you're doing deposit-based bidding
- `auctions` — the event itself; `starts_at`, `ends_at`, `status` (draft | live | closed), and `created_by`
- `lots` — individual items within an auction; `reserve_price`, `current_bid`, `current_bidder_id`, `lot_number`, `ends_at` (lots can have individual countdowns)
- `bids` — immutable append-only log; `lot_id`, `bidder_id`, `amount`, `placed_at`. Never update this table. Ever.
- `auction_participants` — a join table tracking who has registered for which auction (useful for deposit holds and notification targeting)
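As a concrete sketch, the two tables the bid path touches could be declared like this; the column names follow the list above, while the types, defaults and constraints are my own assumptions:

```sql
-- Sketch only: column names from the schema above, constraints assumed.
create table lots (
  id                uuid primary key default gen_random_uuid(),
  auction_id        uuid not null references auctions(id),
  lot_number        int not null,
  reserve_price     numeric not null,
  current_bid       numeric not null default 0,
  current_bidder_id uuid references profiles(id),
  status            text not null default 'draft',
  ends_at           timestamptz not null
);

create table bids (
  id        uuid primary key default gen_random_uuid(),
  lot_id    uuid not null references lots(id),
  bidder_id uuid not null references profiles(id),
  amount    numeric not null,
  placed_at timestamptz not null default now()
  -- Append-only by convention: grant INSERT and SELECT, never UPDATE or DELETE.
);
```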
The `current_bid` and `current_bidder_id` columns on `lots` are denormalised intentionally. Yes, you could derive them from the `bids` table on every read, but under concurrent load that query gets expensive fast. Denormalise it, keep the `bids` table as your audit log, and use a Postgres function to update `lots` atomically when a bid is accepted.
The Atomic Bid Function
This is the bit most tutorials skip. Race conditions in auctions are real. Two users submitting £520 at the same millisecond — what happens?
The answer is a Postgres function with `FOR UPDATE` locking on the lot row:
```sql
create or replace function place_bid(p_lot_id uuid, p_bidder_id uuid, p_amount numeric)
returns json as $$
declare
  v_lot lots%rowtype;
begin
  -- Lock the lot row so concurrent bids are serialised
  select * into v_lot from lots where id = p_lot_id for update;

  if v_lot.status != 'live' then
    return json_build_object('success', false, 'error', 'Lot is not live');
  end if;

  if p_amount <= v_lot.current_bid then
    return json_build_object('success', false, 'error', 'Bid too low');
  end if;

  if p_bidder_id = v_lot.current_bidder_id then
    return json_build_object('success', false, 'error', 'You are already the highest bidder');
  end if;

  insert into bids (lot_id, bidder_id, amount) values (p_lot_id, p_bidder_id, p_amount);

  update lots
     set current_bid = p_amount,
         current_bidder_id = p_bidder_id
   where id = p_lot_id;

  return json_build_object('success', true, 'new_bid', p_amount);
end;
$$ language plpgsql security definer;
```
Call this from your Next.js API route via `supabase.rpc('place_bid', {...})`. The `FOR UPDATE` lock means only one transaction wins per lot at any given moment. The losing transaction waits on the lock, then re-reads the row with the new `current_bid`, fails the amount check, and you return a friendly "someone just outbid you" message on the client.
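On the client, you can mirror those same checks before calling the RPC so obviously doomed bids never leave the browser; the Postgres function stays authoritative. A sketch, where `LotState` and `validateBid` are names I'm inventing here:

```typescript
// Client-side pre-check mirroring the validations inside place_bid.
// The database function remains the source of truth; this just avoids
// a round-trip for bids that would obviously be rejected.

interface LotState {
  status: string;                 // 'draft' | 'live' | 'closed'
  currentBid: number;             // mirrors lots.current_bid
  currentBidderId: string | null; // mirrors lots.current_bidder_id
}

export function validateBid(
  lot: LotState,
  bidderId: string,
  amount: number
): { ok: true } | { ok: false; error: string } {
  if (lot.status !== 'live') return { ok: false, error: 'Lot is not live' };
  if (amount <= lot.currentBid) return { ok: false, error: 'Bid too low' };
  if (bidderId === lot.currentBidderId)
    return { ok: false, error: 'You are already the highest bidder' };
  return { ok: true };
}
```

Run this before firing the RPC, and still handle a failure response from the server, since someone can outbid you between the check and the call.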
---
Row Level Security — The Auction Rules Layer
RLS is one of those things developers either love immediately or avoid because it feels opaque. I was in the avoidance camp until a fintech project at Seahawk taught me the hard way that enforcing access control only in application code is a single misconfigured API route away from disaster.
For an auction site, here are the policies that matter:
- **Anyone can read live lots** — `SELECT` on `lots` where `auctions.status = 'live'`
- **Only authenticated, verified bidders can insert bids** — check `profiles.verified_bidder = true` in the policy
- **Only the auction creator can update lot status** — `UPDATE` on `lots` where `auctions.created_by = auth.uid()`
- **Bid history is readable by the lot's auction creator and the bidder themselves** — no one else needs to see full bid history in real time
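As an illustration, the second rule could be written as a policy along these lines (a sketch; your policy names and exact checks will differ):

```sql
-- Sketch: allow INSERT on bids only for verified bidders acting as themselves.
alter table bids enable row level security;

create policy "verified bidders can bid"
on bids for insert
with check (
  auth.uid() = bidder_id
  and exists (
    select 1 from profiles
    where profiles.id = auth.uid()
      and profiles.verified_bidder = true
  )
);
```

Note that with `place_bid` running as `security definer`, this policy guards direct inserts through the client SDK rather than the RPC path, which is exactly why the function must repeat its own checks.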
The Supabase RLS documentation is genuinely good here — worth reading the section on security definer functions, because it interacts with how RPC calls like `place_bid` work.
One gotcha: if you use `security definer` on your Postgres function (as above), it runs with the function owner's privileges, bypassing RLS. That's intentional — you *want* the bid placement to bypass the bidder's RLS so it can lock and update the lot row. But it means you must enforce your own business logic checks inside the function, which the code above does.
---
Setting Up Supabase Realtime in Next.js
Here's where it actually gets satisfying. Supabase Realtime lets you subscribe to changes on a Postgres table — using WebSockets underneath — and the client SDK makes it almost embarrassingly simple.
In your auction lot page — a Client Component in Next.js App Router — you'd do something like:
```tsx
'use client'

import { useEffect, useState } from 'react'
import { createClientComponentClient } from '@supabase/auth-helpers-nextjs'

export default function LotBidDisplay({ lotId, initialBid }) {
  const [currentBid, setCurrentBid] = useState(initialBid)
  const supabase = createClientComponentClient()

  useEffect(() => {
    const channel = supabase
      .channel(`lot-${lotId}`)
      .on(
        'postgres_changes',
        { event: 'UPDATE', schema: 'public', table: 'lots', filter: `id=eq.${lotId}` },
        (payload) => {
          setCurrentBid(payload.new.current_bid)
        }
      )
      .subscribe()

    return () => {
      supabase.removeChannel(channel)
    }
  }, [lotId])

  return <div>Current bid: £{currentBid.toLocaleString()}</div>
}
```
Pass `initialBid` from a Server Component that fetches fresh data at request time. The client then takes over, listening for `UPDATE` events on that specific lot row. Every time `place_bid` runs successfully, Supabase broadcasts the change and every connected bidder's UI typically updates within about 100–300ms.
Handling the Countdown Timer
Lots usually have a countdown — "closes in 3:42". Do not trust the client clock for this. Derive the end time from `lots.ends_at` (stored in UTC in Postgres) and calculate remaining seconds on the client using `Date.now()`. Re-sync it every 60 seconds with a fresh fetch in case of drift. And add "soft close" logic: if a bid arrives within the last 60 seconds, extend `ends_at` by two minutes. That's standard auction behaviour and bidders expect it.
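That arithmetic is easy to get subtly wrong, so here it is as a pair of pure helpers; a sketch only, with the constants and function names my own:

```typescript
// Countdown and soft-close arithmetic. All times are epoch milliseconds;
// endsAt comes from lots.ends_at (stored in UTC).

const SOFT_CLOSE_WINDOW_MS = 60_000;     // last 60 seconds of the lot
const SOFT_CLOSE_EXTENSION_MS = 120_000; // extend by two minutes

// Remaining whole seconds, clamped at zero so the UI never shows negatives.
export function secondsRemaining(endsAt: number, now: number): number {
  return Math.max(0, Math.floor((endsAt - now) / 1000));
}

// If a bid lands inside the soft-close window, push ends_at out.
// The server must apply the same rule; this mirrors it for the UI.
export function applySoftClose(endsAt: number, bidAt: number): number {
  if (bidAt < endsAt && endsAt - bidAt <= SOFT_CLOSE_WINDOW_MS) {
    return bidAt + SOFT_CLOSE_EXTENSION_MS;
  }
  return endsAt;
}
```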
---
Next.js App Router Architecture for the Auction UI
The page structure I'd use:
```
app/
  auctions/
    page.tsx          ← Server Component, lists live auctions (ISR, revalidate: 60)
    [auctionId]/
      page.tsx        ← Server Component, fetches lots list server-side
      LotGrid.tsx     ← Client Component, subscribes to lot status changes
      [lotId]/
        page.tsx      ← Server Component, initial lot data + metadata for SEO
        BidPanel.tsx  ← Client Component, real-time bid display + bid form
```
The catalogue (`/auctions`) uses Incremental Static Regeneration with a 60-second revalidation. Individual lot pages render server-side on first load (for sharing, previewing, og:image generation), then hand off to client components for the live stuff.
One thing I always do: keep the `BidPanel` component lazy-loaded behind `dynamic(() => import('./BidPanel'), { ssr: false })`. It only makes sense client-side anyway, and it keeps your initial HTML payload lean for users on slow connections — which, if your auction audience skews older (as antique auctions tend to), matters more than you'd expect.
---
Authentication and the "Verified Bidder" Flow
Standard Supabase Auth with email/password or magic link works fine for sign-up. But auctions often need an extra step: bidder verification. You might need a credit card hold, ID verification, or just admin approval before someone can actually place a bid.
The pattern I use: a `verified_bidder` boolean on the `profiles` table, defaulting to `false`. After sign-up, the user sees a "Complete your registration" screen. Once approved (manually by admin, or automatically after a Stripe payment authorisation), you flip the flag. The RLS policy on bids checks it. They can browse and watch, but not bid until verified.
For Stripe payment authorisation holds, Payment Intents with `capture_method: 'manual'` are the right approach — you authorise a £50 hold, capture it if they win, release it if they don't. This dramatically reduces no-pay situations, which, trust me, are the bane of every online auction operator's existence.
---
Deployment, Performance and the Bits That Will Bite You
Deploy to Vercel — it's the obvious choice for Next.js and the edge network plays well with Supabase's global infrastructure. Make sure your Supabase project is in the AWS region closest to your Vercel deployment region. I've seen 40–60ms of completely unnecessary latency because someone deployed Vercel in `us-east-1` and Supabase in `eu-west-2`. Pick one region, put both there.
A few things that will cause you pain if you don't handle them upfront:
- **WebSocket connection limits.** Supabase's free tier allows around 200 concurrent Realtime connections. If your auction goes viral, that cap matters. Check your plan.
- **Optimistic UI for bids.** Show the bid immediately on the bidder's screen before the server confirms. If it fails (outbid, race condition), revert with an error. The 200–300ms server round-trip is imperceptible unless the UI waits for it.
- **Lot ending grace period.** Never close a lot exactly at `ends_at`. Give it a 2–3 second server-side buffer to allow in-flight bids that were submitted just before the deadline to process. Handle this in your `close_lot` scheduled function.
- **Email notifications.** Use Supabase Edge Functions with Resend or Postmark to send "You've been outbid" and "You won!" emails. Don't try to do this from your Next.js API routes — they can time out, and auction participants get genuinely annoyed if notifications are unreliable.
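The grace-period rule above reduces to a single comparison. Here's a sketch of the selection a scheduled `close_lot` job might run; the `LotRow` shape and `lotsToClose` name are mine, not Supabase's:

```typescript
// Pick which lots a scheduled close job should close right now.
// A lot is only eligible once ends_at plus a small buffer has passed,
// so in-flight bids submitted just before the deadline land first.

const CLOSE_GRACE_MS = 3_000; // the 2–3 second server-side buffer

interface LotRow {
  id: string;
  status: string; // 'draft' | 'live' | 'closed'
  endsAt: number; // epoch ms, mirrors lots.ends_at
}

export function lotsToClose(lots: LotRow[], nowMs: number): string[] {
  return lots
    .filter((l) => l.status === 'live' && nowMs >= l.endsAt + CLOSE_GRACE_MS)
    .map((l) => l.id);
}
```

The job itself would fetch live lots, run this filter, and update each returned lot's status inside a transaction.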
---
FAQ
How many concurrent bidders can Supabase Realtime handle?
Supabase's Pro plan supports up to 500 concurrent Realtime connections by default, with higher limits available. For most auction sites — unless you're running something the size of Sotheby's online — that's more than enough. If you expect thousands of simultaneous viewers, consider broadcasting lot updates through a single server-side channel rather than per-user subscriptions, and look at Supabase's Realtime Broadcast feature, which is more efficient for high fan-out scenarios.
Should I use Supabase Realtime or a dedicated service like Ably?
For most projects, Supabase Realtime is perfectly adequate and the integration is much simpler since your data is already in Supabase. I'd only reach for Ably or Pusher if you need sub-50ms latency globally, or if you're building something with millions of concurrent connections. An antiques auction, a charity fundraiser, a small art gallery's online sale — Supabase handles all of these fine.
What happens if a user's WebSocket connection drops mid-auction?
Supabase's client SDK will attempt to reconnect automatically. But you should always re-fetch the current lot state (`current_bid`, `ends_at`) on reconnection rather than trusting whatever was in local state before the drop. Add an `online`/`offline` event listener in your Client Component and trigger a fresh server fetch when the connection restores.
Can I use Next.js Server Actions to place bids instead of an API route?
Yes, and I've done it. Server Actions in Next.js 14 are convenient — they remove the boilerplate of a dedicated `/api/bid` route. The tradeoff is that Server Actions are a bit harder to rate-limit individually (you'd apply rate limiting at the middleware level rather than per-action). For a production auction site, I'd add Upstash Redis rate limiting in middleware to prevent a single user from spamming bid requests, regardless of whether you use Actions or API routes.
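The rule a rate limiter enforces is just a counter per user per window. This in-memory sketch shows the logic only; in production you'd back it with shared state such as Upstash Redis (their `@upstash/ratelimit` package), because each serverless instance has its own memory:

```typescript
// Fixed-window rate limiter: allow `limit` bids per windowMs per user.
// Illustrative only — in-memory state does not survive across serverless
// instances, so production deployments need Redis or similar.

export function makeRateLimiter(limit: number, windowMs: number) {
  const windows = new Map<string, { start: number; count: number }>();

  return function allow(userId: string, nowMs: number): boolean {
    const w = windows.get(userId);
    // First request, or the previous window has expired: start a new one.
    if (!w || nowMs - w.start >= windowMs) {
      windows.set(userId, { start: nowMs, count: 1 });
      return true;
    }
    if (w.count < limit) {
      w.count += 1;
      return true;
    }
    return false; // over the limit inside this window
  };
}
```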
How do I handle ties — two identical bids at the same time?
The `FOR UPDATE` lock in the `place_bid` Postgres function serialises concurrent bids, so technically ties can't happen at the database level. One will succeed; the other will fail with a "bid too low" response, because by the time it acquires the lock the first bid has become `current_bid` and the `p_amount <= v_lot.current_bid` check rejects an equal amount. First in, first served. That's standard auction practice and most bidders understand it.
---
The Bath antiques dealer, for what it's worth, has been running his monthly auctions online for over two years now. Peak concurrent bidders one Saturday evening hit 84 — his entire village apparently tuning in to watch a disputed lot of Georgian silver go for three times its reserve. Supabase didn't flinch. Next.js didn't flinch. The only thing that broke was his Wi-Fi, because he was running it from the shop floor.
Real-time is hard to think through upfront, but once the schema is solid and the atomic bid function is in place, the rest is mostly plumbing. Get the foundation right and you'll spend your time on the fun bits — the countdown animations, the "going once, going twice" UX — rather than debugging race conditions at midnight.
