Multi-track timeline in the browser
A browser-based editor built around one timeline model, so the state you edit, the playback you see, and the file you export stay aligned.

The editor treats a video project as a normalized item graph: tracks, typed items, and assets shared between the interactive surface, live playback, and export. Preview and render are not separate authoring systems. That was the core architectural bet.
The result covers the full editing surface: canvas manipulation, multi-track timeline, type-aware inspector, caption generation, local-first media, and synchronized persistence.
Multiple entry points share the same editor shell. The shell reads and writes synchronized project state, then surfaces a normalized item graph that playback and export both consume directly.
Tracks, items, assets, dimensions, and fps live in a bounded history stack. Every structural edit is a snapshot entry. Undo and redo fall out without extra code.
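The snapshot-per-edit history can be sketched as a small bounded stack. This is an illustrative sketch, not the editor's actual implementation; the class and method names are hypothetical, and snapshots are assumed to be immutable values.

```typescript
// Hypothetical sketch of a bounded snapshot history stack.
// Every structural edit commits a full snapshot; undo/redo just move
// the "present" pointer, so they need no extra per-feature code.
class HistoryStack<S> {
  private past: S[] = [];
  private future: S[] = [];

  constructor(private present: S, private limit = 100) {}

  commit(next: S): void {
    this.past.push(this.present);
    if (this.past.length > this.limit) this.past.shift(); // keep history bounded
    this.present = next;
    this.future = []; // a fresh edit invalidates the redo branch
  }

  undo(): S {
    const prev = this.past.pop();
    if (prev !== undefined) {
      this.future.push(this.present);
      this.present = prev;
    }
    return this.present;
  }

  redo(): S {
    const next = this.future.pop();
    if (next !== undefined) {
      this.past.push(this.present);
      this.present = next;
    }
    return this.present;
  }

  get current(): S {
    return this.present;
  }
}
```

Because a snapshot captures the whole undoable layer at once, undo never has to know which feature produced the edit.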
Selection, editing modes, task state, trim indicators, and snap guides are volatile. They belong in the editor but not in history or synchronized storage. The separation is explicit in types.
Dozens of narrowly scoped React contexts replace a single large context object. More nesting, tighter render boundaries. In a dense interactive UI, that trade is worth it.
EditorState wraps two distinct layers. The undoable layer holds everything that belongs in project history. The transient layer holds everything that does not. The split is enforced structurally, not by convention.
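A structurally enforced split might look like the following type sketch. All names here (EditorState, UndoableState, TransientState, and the field shapes) are illustrative assumptions, not the real types.

```typescript
// Hypothetical sketch of the two-layer EditorState split.
type ItemId = string;

interface UndoableState {
  // Everything here is snapshotted into history and synced to shared storage.
  tracks: Record<string, { id: string; itemIds: ItemId[] }>;
  items: Record<ItemId, { id: ItemId; type: "video" | "text" | "caption"; assetId?: string }>;
  assets: Record<string, { id: string; src: string }>;
  dimensions: { width: number; height: number };
  fps: number;
}

interface TransientState {
  // Volatile editor state: never enters history or synchronized storage.
  selection: ItemId[];
  mode: "select" | "trim" | "drag";
  snapGuides: number[];
}

interface EditorState {
  undoable: UndoableState;
  transient: TransientState;
}
```

Because the two layers are separate fields rather than a naming convention, code that snapshots `undoable` cannot accidentally capture selection or drag state.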
The primary persistence path is synchronized shared storage. The provider reads undoable state on initialization, then writes a cleaned snapshot back on a debounce after every edit.
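The debounced write-back can be sketched as below. The `write` and `clean` callbacks are hypothetical stand-ins for the real provider API; only the debounce shape is the point.

```typescript
// Hedged sketch of a debounced persistence write: every edit schedules a
// write of a cleaned snapshot, and rapid edits collapse into one write.
function createDebouncedPersist<S>(
  write: (snapshot: S) => void, // assumed storage write, e.g. synced shared storage
  clean: (state: S) => S,       // assumed snapshot cleaner (strips transient fields)
  delayMs = 500,
) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (state: S) => {
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => write(clean(state)), delayMs);
  };
}
```

Debouncing here trades a small persistence lag for far fewer storage writes during drag-heavy editing.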
Loop mode, snapping, and layout preferences stay local. These are intentionally not shared project fields. Volatile UX preferences don't belong in synchronized project state.
The undo history provider applies mutations without committing first, then commits a no-op entry if needed. This guards against duplicate history entries from React Strict Mode double invocation in development.
When a file is dropped, a local preview source is created immediately, the file is cached client-side, and the item lands in the timeline. Upload starts asynchronously. Authoring continues even if upload fails.
Assets that already exist remotely arrive with metadata in place. They are inserted as remote-backed items and can stream immediately, with no local preparation cycle required.
Browser-managed blob URLs can become invalid over long sessions. The editor monitors cache key changes and revalidates URLs. If a local URL is unavailable, playback falls back to the remote source without intervention.
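The fallback resolution can be sketched with an injected probe, which keeps the strategy testable outside the browser. The types and names are assumptions; in the browser, the probe might be a fetch against the blob URL.

```typescript
// Hedged sketch of local-first source resolution: prefer the cached local
// (blob) URL, but fall back to the remote source if the local URL has gone
// stale, with no user intervention.
interface AssetSource {
  localUrl?: string; // browser-managed blob URL, may expire over long sessions
  remoteUrl: string; // always-valid remote source
}

async function resolvePlayableUrl(
  source: AssetSource,
  probe: (url: string) => Promise<boolean>, // injected validity check
): Promise<string> {
  if (source.localUrl && (await probe(source.localUrl))) {
    return source.localUrl;
  }
  return source.remoteUrl; // silent fallback
}
```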
Local file import and remote asset import produce the same normalized asset and item records. After that, insertion, playback, and export all read from the same structure. The import path doesn't matter downstream.
The editor has its own composition that calculates metadata from current items, derives final duration from content, and injects font information from text and caption items. The same layer graph the editor renders interactively is what the export uses for file output.
Because preview and export share a composition, a render-time surprise can only come from code that has actively diverged. The structural constraint makes drift visible as an implementation inconsistency rather than a runtime surprise.
The export route validates a small payload, builds render input from tracks, items, and assets, enriches with font metadata, and starts a background render. The progress route polls until the render completes, then triggers a browser download.
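The progress polling can be sketched as a small loop. The route and response shape here are assumptions for illustration, not the real API contract.

```typescript
// Hypothetical sketch of the export progress loop: poll until the background
// render reports completion, then hand back the download URL.
interface RenderProgress {
  done: boolean;
  url?: string; // download URL, present once the render completes
}

async function pollUntilDone(
  getProgress: () => Promise<RenderProgress>, // assumed wrapper around the progress route
  intervalMs = 1000,
): Promise<string> {
  for (;;) {
    const p = await getProgress();
    if (p.done && p.url) return p.url; // caller triggers the browser download
    await new Promise((r) => setTimeout(r, intervalMs));
  }
}
```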
Trimming one clip shifts effects, captions, and crossfades on neighboring tracks. Naive updates cascade into full re-renders across the timeline.
A drag-resize on a long clip fires hundreds of updates per second. Anything more expensive than constant-time breaks playback.
Live playback and final export carry subtly different timing assumptions. Exports drift from what the user saw on screen.
Browser-managed blob URLs can silently expire in long sessions. Without revalidation, assets disappear from playback without error.
Preview and export share the same composition root and item graph. Drift between them requires code to actively diverge.
Item, asset, and track relationships stay explicit. Undo/redo is tractable. Inspector, timeline, canvas, and export all read the same shape.
Uploads don't block editing. Files are cached client-side immediately. Authoring continues through upload failures and slow networks.
Synchronized shared storage makes collaborative persistence a first-class concern rather than an afterthought. Volatile UI state stays out of shared project state.
Dozens of narrowly scoped contexts instead of one object. More provider nesting, tighter render boundaries. Dense interactive UIs benefit from targeted subscriptions.
The feature was built on top of earlier editor scaffolding. Some older naming and structure remain visible. Faster delivery, visible evolutionary history.
Sharing a composition between editor and renderer requires care about what is player-only UI versus render-time logic. The boundary isn't enforced by the framework.
Local-first media requires tracking URL lifetime, cache consistency, and dual local/remote source resolution. The simplicity is in the UX, not in the implementation.
Multi-track edits stay smooth through aggressive drag interactions. Indexed state makes per-item updates constant-time regardless of timeline length.
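The per-item update enabled by indexed maps can be sketched as follows. The record shape is an assumption; the point is that a drag-resize touches one record by id while untouched records keep their references.

```typescript
// Hedged sketch: with items indexed by id, a resize is a direct lookup and a
// single-record replacement. Untouched records keep reference identity, so
// memoized React subtrees for other items can skip re-rendering.
type Items = Record<string, { id: string; start: number; duration: number }>;

function resizeItem(items: Items, id: string, duration: number): Items {
  const item = items[id];
  if (!item) return items; // unknown id: no-op, same reference
  return { ...items, [id]: { ...item, duration } };
}
```

With an array-of-items model, the same edit would mean a scan plus a rewrite of the list; the indexed form keeps the update targeted regardless of timeline length.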
Live preview and final export read the same selectors and the same composition, so a render-time surprise can only come from code that has actively diverged.
Undo/redo fall out of the state model at no extra cost. Shared sync is free at the persistence layer. Caption generation produces new editable entities without special-casing the editor.
Most timeline pain is in how state is represented, not what renders it. Normalized indexed maps made the rest of the system cheaper to build.
If preview and render disagree, the fix rarely belongs in the renderer. It belongs in shared state: correct it there once and both surfaces follow.
Committing to local-first media means the feature must manage blob URL lifetime and cache consistency. The implementation cost is real. The UX benefit of editing without waiting is worth it.
Synchronized storage is for project content. Transient UI state persisted to shared storage introduces subtle race conditions and stale rehydration bugs. The split is strict by design.
Multi-tenant notifications platform