# Getting started
## Install

```bash
npm install longpipe
```

That’s it — model weights stream from cdn.longpipe.dev on first init, so there’s no extra setup. (If you’d rather host the weights yourself, see Self-hosting weights.)
## Your first effect

```js
import { EffectsPipeline } from 'longpipe'

// Get any MediaStream — webcam, captureStream(), getDisplayMedia(), whatever
const camera = await navigator.mediaDevices.getUserMedia({ video: true })

const pipeline = new EffectsPipeline(camera, {
  background: 'blur', // 'blur' | 'none' | image URL | ImageBitmap | { color: '#0a3' } | ...
})

// pipeline.stream is a MediaStream — use it anywhere you'd use the original
videoElement.srcObject = pipeline.stream

await pipeline.ready
```
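As the comment above shows, the `background` option accepts several shapes: `'blur'`, `'none'`, an image URL, an `ImageBitmap`, or a `{ color }` object. If you persist a user's choice as a plain string, a small mapping helper keeps the constructor call tidy. A minimal sketch — `backgroundFromPref` is our own hypothetical utility, not part of longpipe:

```js
// Hypothetical helper (not part of longpipe): map a saved preference
// string to one of the shapes the `background` option accepts.
function backgroundFromPref(pref) {
  if (pref === 'blur' || pref === 'none') return pref // built-in modes
  if (pref.startsWith('#')) return { color: pref }    // solid color
  return pref                                         // assume image URL
}

// e.g. new EffectsPipeline(camera, { background: backgroundFromPref('#0a3') })
```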
## What’s happening

- The constructor returns synchronously. `pipeline.stream` is available immediately and emits passthrough until init completes.
- The SDK spawns a worker, picks the best model preset for your GPU (autotune), fetches weights, and starts compositing.
- `await pipeline.ready` resolves when the first composited frame lands.
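Because `pipeline.stream` emits passthrough before init completes, you can keep showing video even if the weight download is slow, and only cut over once the effect is live. A minimal sketch assuming nothing beyond the API shown above — `withTimeout` is our own utility, not part of longpipe:

```js
// Our own helper (not part of longpipe): resolve with a fallback
// value if the given promise takes longer than `ms` milliseconds.
function withTimeout(promise, ms, fallback) {
  const timer = new Promise(resolve => setTimeout(() => resolve(fallback), ms))
  return Promise.race([promise, timer])
}

// In the browser: wait up to 5s for the first composited frame,
// otherwise keep the raw camera stream.
// videoElement.srcObject = await withTimeout(
//   pipeline.ready.then(() => pipeline.stream), 5000, camera)
```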
## Next steps
- Pick a background other than blur
- Look at the full API reference
- Tune presets if autotune isn’t picking what you want