Engine
The root object. It owns the AudioContext, the bus graph, the scheduler, and every loaded sound.
TL;DR
The Engine is the only object you construct directly. Everything else
(Bus, Sound, Voice, Parameter,
Snapshot) is reached through it. Lifecycle is explicit — the underlying
AudioContext is created on first use, never in the constructor — so
createEngine() is safe to call before any user interaction.
State machine
Four states, one terminal transition. unlock() is the only thing you
call manually; visibility/focus changes auto-suspend and auto-resume the context
behind the scenes.
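The transitions implied above can be sketched as plain data. This is an illustration inferred from the description, not part of the zvuk API; in particular, whether close() is legal from every non-terminal state is an assumption.

```typescript
// Illustrative transition table inferred from the prose above; not zvuk API.
// Assumption: close() can be called from any non-terminal state.
type EngineState = 'cold' | 'unlocking' | 'live' | 'closed';

const transitions: Record<EngineState, readonly EngineState[]> = {
  cold: ['unlocking', 'closed'],
  unlocking: ['live', 'closed'],
  live: ['closed'],
  closed: [], // terminal: no way out
};

function canTransition(from: EngineState, to: EngineState): boolean {
  return transitions[from].includes(to);
}
```

Auto-suspend/resume on visibility changes happens inside the 'live' state here; the library does not surface it as a separate state.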
Signal flow
Each play() call attaches a Voice to its bus's input node.
Voices feed through bus FX (if any), out the bus's output, into the master, and
onward to ctx.destination. Everything else is a variation on this graph.
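As a reading aid, the chain a voice traverses can be written out in order. The stage labels below are purely illustrative, not real node names in the library.

```typescript
// Illustrative stage labels for the routing described above; not zvuk API.
function voicePath(bus: string, hasFx: boolean): string[] {
  const stages = [`${bus}.input`];
  if (hasFx) stages.push(`${bus}.fx`); // bus FX sit between input and output
  stages.push(`${bus}.output`, 'master', 'ctx.destination');
  return stages;
}
```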
API surface
interface Engine {
  readonly state: 'cold' | 'unlocking' | 'live' | 'closed';
  readonly now: number; // current audio time, in seconds
  readonly context: AudioContext;
  unlock(): Promise<void>;
  close(): Promise<void>;
  loadSound(name: string, url: string | readonly string[], options?: LoadSoundOptions): Promise<Sound>;
  hasSound(name: string): boolean;
  sound(name: string): Sound; // throws SoundNotFoundError
  bus(name: string): Bus; // throws BusNotFoundError
  scheduleAt(audioTime: number, fn: () => void): () => void;
  parameter(name: string, initial?: number): Parameter;
  captureSnapshot(name: string): Snapshot;
  activeVoices(): readonly Voice[];
  onStateChange(fn: (s: EngineState) => void): () => void;
}

Live demo
The mixer dashboard below runs a real Engine with three buses, six
pre-loaded samples in .webm/.m4a pairs, and a live voice counter.
Recipes
Build at module load, unlock on click
import { createEngine } from 'zvuk';
const engine = createEngine({
  buses: {
    music: { level: 0.8 },
    sfx: { level: 1.0 },
    voice: { level: 1.0 },
  },
  master: { headroom: -3 },
});

await engine.unlock(); // call from a user gesture
engine.state; // 'cold' | 'unlocking' | 'live' | 'closed'
await engine.loadSound('coin', '/sfx/coin.webm', { bus: 'sfx' });
engine.sound('coin').play();
await engine.close(); // terminal — construct a new engine if needed

Watch state transitions
const off = engine.onStateChange((s) => {
if (s === 'live') console.log('audio is live; ctx time:', engine.now);
});
// later: off();

Sample-accurate scheduling
const beat = engine.now + 0.25; // 250 ms ahead in audio time
engine.scheduleAt(beat, () => engine.sound('downbeat').play());

Tear down on route change
// React example
useEffect(() => () => { void engine.close(); }, []);
// Vue example
onBeforeUnmount(() => { void engine.close(); });
// Plain SPA
router.beforeEach(async () => { await engine.close(); });

Pitfalls
createEngine() is cheap; only unlock() needs a user gesture.
Playing before unlock() leaves the underlying AudioContext suspended. Sounds will be silently dropped. Always await engine.unlock() first.
close() is terminal. Construct a fresh engine if you need audio again.
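The silent-drop pitfall can be fenced off with a small guard that funnels every play through an unlock check. This is a sketch built only on the documented state and unlock() members; ensureLive is not a zvuk API.

```typescript
// Sketch of a guard against playing while the context is still suspended.
// ensureLive is NOT a zvuk API; it uses only the documented `state` and
// `unlock()` members of Engine.
async function ensureLive(engine: { state: string; unlock(): Promise<void> }): Promise<void> {
  if (engine.state === 'closed') {
    throw new Error('Engine is closed; construct a new one.');
  }
  if (engine.state !== 'live') {
    await engine.unlock(); // must originate from a user gesture
  }
}

// Usage (inside a click handler):
// await ensureLive(engine);
// engine.sound('coin').play();
```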