streamtoolset

Engineering

Hot-swap overlay config with Durable Objects

The first time you tweak a font size on the dashboard and watch the OBS browser source update half a second later — without reloading the source — that's the feature. Here's the architecture under it: KV for durable state, a Durable Object for websocket fanout, postMessage for instant local preview.

The constraint

A streamer is on the toolset dashboard adjusting the chat overlay while OBS Studio is open in another window, with a browser source previewing the live overlay. They drag the font-size slider. What should happen?

The bad version: nothing, until they hit "Save" and right-click the browser source > Refresh. That's every paid overlay product five years ago.

The right version: the OBS browser source updates as they drag. Live. No save click, no refresh.

The constraint is that the dashboard tab and the OBS browser source are two unrelated browser contexts running on different machines (your streaming PC and your gaming PC, often). They don't share localStorage, don't share session, don't even know about each other's existence.

The architecture

Three pieces, three different layers:

  1. KV — Cloudflare Workers KV stores the durable config. This is the source of truth when you reload the browser source from scratch. Eventually consistent, fine for this.
  2. Durable Object — one DO instance per overlay-source ID. Holds open websockets to every connected browser source for that overlay. Receives writes from the dashboard via HTTP and fans out to every connected client over ws.
  3. postMessage — the in-dashboard live preview (right next to the slider) gets updates via postMessage instead of the round-trip through the DO. ~80ms debounced, message type toolset:config-preview. Feels instant because it is instant — same browser, no network hop.
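
The preview message itself can be a tiny typed envelope. A sketch of what the dashboard might post to the preview iframe — only the toolset:config-preview message type comes from this article; the field names and config shape are illustrative assumptions:

```typescript
// Hypothetical shape of the config blob the dashboard edits.
interface OverlayConfig {
  fontSize: number;
  fontFamily: string;
  showBadges: boolean;
}

// Envelope posted to the preview iframe. Only the message type
// "toolset:config-preview" is from the article; the rest is illustrative.
interface PreviewMessage {
  type: "toolset:config-preview";
  overlayId: string;
  config: OverlayConfig;
}

function makePreviewMessage(overlayId: string, config: OverlayConfig): PreviewMessage {
  return { type: "toolset:config-preview", overlayId, config };
}

// Dashboard side: previewIframe.contentWindow.postMessage(msg, origin)
// Preview side:   filter on event.data.type === "toolset:config-preview"
const msg = makePreviewMessage("overlay-123", {
  fontSize: 18,
  fontFamily: "Inter",
  showBadges: true,
});
```

The preview iframe filters on the type field so unrelated postMessage traffic in the dashboard can't be mistaken for a config update.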

Why a Durable Object and not just KV polling

KV is eventually consistent: reads can be served from edge cache for up to 60 seconds. If the browser source polled KV every 5 seconds, every edit would take the polling interval plus however stale the cached read is to show up — seconds to a minute of visible lag — and you'd burn a KV read per overlay every 5 seconds forever. That's 17,280 reads/day per active overlay before you've even started streaming.

A Durable Object pins to a single location, holds strong consistency, and lets us hold the websocket connection open. The dashboard PUTs to the DO once on save (and during throttled drags), the DO fans out to every connected browser source for that overlay-source ID, and KV gets the write asynchronously for the next time the source reconnects from cold.

KV is the persistence layer. The DO is the fanout layer. They're solving different problems.
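
The fanout half fits in a few lines. A minimal sketch with the websocket and KV types mocked as plain interfaces so it's self-contained — the class name OverlayFanout and the method shapes are illustrative, not the real Workers DO API:

```typescript
// Minimal stand-ins for the platform pieces, so the sketch runs anywhere.
interface SocketLike { send(data: string): void }
interface KVLike { put(key: string, value: string): void }

// One instance per overlay-source ID, mirroring the Durable Object above.
class OverlayFanout {
  private sockets = new Set<SocketLike>();
  private latest: string | null = null;

  constructor(private overlayId: string, private kv: KVLike) {}

  // A browser source connects: replay the latest config immediately.
  accept(ws: SocketLike): void {
    this.sockets.add(ws);
    if (this.latest !== null) ws.send(this.latest);
  }

  // Dashboard PUT: fan out to every connected source, then persist to KV.
  put(configJson: string): void {
    this.latest = configJson;
    for (const ws of this.sockets) ws.send(configJson);
    // In a real DO this write would be fire-and-forget (ctx.waitUntil-style),
    // since KV only needs to be right the next time a source cold-starts.
    this.kv.put(`overlay:${this.overlayId}`, configJson);
  }
}
```

The in-memory `latest` is what makes reconnects cheap: a source that drops and comes back gets the current config on handshake without touching KV.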

Why postMessage for the dashboard preview

The live preview iframe inside the dashboard is the same overlay code that runs in OBS — same component tree, same render path. If we routed dashboard-preview updates through the DO too, we'd have:

  • ~80ms round-trip on every keystroke / slider tick
  • A noisy ws conversation where 99% of frames are throwaway (the streamer dragging a slider produces 60+ updates per second; only the last one matters)
  • A confusing failure mode where the local preview lags the actual OBS source by the network round-trip delta

postMessage skips all of that. The slider posts a config blob to the preview iframe directly. We debounce at 80ms — long enough to coalesce a drag burst, short enough that the preview still feels reactive. The same 80ms debounce gate then fires a single PUT to the DO, which fans out to OBS.
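
The 80ms gate is an ordinary trailing-edge debounce. A sketch (the names are illustrative), with a flush() escape hatch so the final value of a drag always lands on pointerup:

```typescript
// Trailing-edge debounce: coalesces a burst of slider updates into one call.
function debounce<T>(fn: (value: T) => void, waitMs: number) {
  let timer: ReturnType<typeof setTimeout> | null = null;
  let pending: T | undefined;
  const fire = () => { timer = null; fn(pending as T); };
  return {
    update(value: T): void {
      pending = value;
      if (timer !== null) clearTimeout(timer);
      timer = setTimeout(fire, waitMs);
    },
    // Force the trailing call now (e.g. on pointerup / explicit save).
    flush(): void {
      if (timer !== null) { clearTimeout(timer); fire(); }
    },
  };
}

// The same gate drives both sinks: a postMessage to the preview iframe
// and a single PUT to the Durable Object.
const sent: number[] = [];
const gate = debounce<number>((fontSize) => sent.push(fontSize), 80);
gate.update(14); gate.update(15); gate.update(16); // a drag burst
gate.flush();                                      // pointerup
// sent is now [16]: only the last value of the burst fired.
```

Trailing-edge matters here: a leading-edge debounce would fire the *first* value of a drag and then go quiet, which is exactly the wrong frame to keep.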

End result: the dashboard preview feels native (no network involved), OBS lags by one network round-trip + 80ms debounce (~150ms total on a good connection — well under the perception threshold).

Failure modes worth thinking about

Browser source loses its websocket

OBS suspends background tabs. Connections drop. The browser source reconnects when it regains focus or after a 30s heartbeat miss, and the DO replays the latest config to it. The source of truth is the DO's in-memory state plus KV behind it; the client doesn't need to track anything itself.
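
The 30s heartbeat miss reduces to a tiny client-side watchdog. A sketch with an injectable clock so the logic is testable — the class and method names are assumptions, not the actual client code:

```typescript
// Tracks the last heartbeat from the DO; the browser source reconnects
// when isStale() reports a missed window (no ping for timeoutMs).
class HeartbeatWatchdog {
  private lastSeen: number;

  constructor(private timeoutMs: number, now: number) {
    this.lastSeen = now;
  }

  // Call on every ping frame from the Durable Object.
  beat(now: number): void {
    this.lastSeen = now;
  }

  // True when the heartbeat window has been missed and we should reconnect.
  isStale(now: number): boolean {
    return now - this.lastSeen > this.timeoutMs;
  }
}

// Usage with a 30s window (times in ms; pseudo-wiring, illustrative).
const dog = new HeartbeatWatchdog(30_000, 0);
dog.beat(10_000);                       // ping arrives at t=10s
const staleAt25s = dog.isStale(25_000); // false: 15s since last ping
const staleAt41s = dog.isStale(41_000); // true: 31s since last ping → reconnect
```

In the real client this check would run on an interval timer, and a true result tears down the socket and dials the DO again; the replay-on-handshake above makes that reconnect self-healing.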

DO eviction

DOs evict when idle. When the next request hits, the DO spins back up, rehydrates from KV, and re-accepts websocket connections. There's a brief window during cold-start where a config write could land but no clients are connected to fan it out to — fine, because KV has it and every reconnecting client will pull the latest on handshake.

Dashboard goes offline mid-drag

The preview iframe keeps updating (it's local). The DO stops getting writes, so OBS stops updating. On reconnect, a single final PUT lands and OBS catches up. The streamer sees the dashboard preview agree with OBS again within a frame.

Why this matters for a free product

Paid overlay products often gate "live config" behind a higher tier because their architecture was built when this was hard. With Cloudflare DOs, the marginal cost of holding a websocket open per active stream is approximately nothing — they're bundled into the $5/mo Workers Paid plan, no per-connection billing. There's no business reason to gate it.

The same DO pattern is what makes the Timer overlay's server-side deadline resilient to OBS restarts, and what drives the event list overlay's live push of new subs/cheers/raids. One websocket fanout primitive, three different products on top.