WINK Streaming has solved the browser compatibility problem. With our four-transport architecture, we deliver ultra-low latency where possible and graceful fallbacks everywhere else. WebTransport for modern browsers, WebSocket for Safari, fMP4 for compatibility - we're not waiting for standards, we're shipping TODAY.
See the 200-300ms latency for yourself (Chrome/Edge only)
Transport Mode | Port | Technology | Latency | Browser Support | Use Case |
---|---|---|---|---|---|
WebTransport Raw H.264 | 4443 | WebCodecs API | <100ms | Chrome/Edge/Firefox | Ultra-low latency interactive |
Native QUIC Raw | 4444 | Raw QUIC | <100ms | N/A (native apps) | Server relay, moq-rs tools |
WebTransport fMP4 | 4445 | MSE API | <200ms | Chrome/Edge/Firefox | Broader compatibility |
WebSocket fMP4 | 4446 | MSE API | 200-500ms | ALL browsers | Universal fallback |
Protocol | Typical Latency | Best Case | Worst Case | vs MoQ |
---|---|---|---|---|
MoQ (Our Implementation) | 200-300ms | 180ms | 400ms | Baseline |
WebRTC | 500ms-2s | 300ms | 3s | 2.5-10x slower |
SRT | 300-500ms | 250ms | 1s | 1.5-2.5x slower |
RTMP | 1-3s | 800ms | 5s | 5-15x slower |
HLS | 2-10s | 1.5s | 30s | 10-50x slower |
URL: https://moq.wink.co/moq-player.html
Requirements: Chrome or Edge with WebTransport enabled
What You'll See: Real-time video with 200-300ms latency from camera to screen
GitHub: https://github.com/winkmichael/mediamtx-moq
Language: Go (backend) + JavaScript (player)
License: MIT License - Free for any use
Integration: Full MediaMTX compatibility
Our MoQ implementation achieves ultra-low latency through several key innovations and hard-won discoveries. Here's the complete technical journey, including all the dead ends and breakthroughs.
```
┌──────────┐      ┌─────────────────────────────────────────┐      ┌──────────┐
│  Camera  │─────▶│              MediaMTX Core              │─────▶│ Browser  │
│  (RTSP)  │      │  ┌───────────────────────────────────┐  │      │  Player  │
└──────────┘      │  │         MoQ Server (WINK)         │  │      └──────────┘
                  │  ├───────────────────────────────────┤  │
                  │  │ :4443 WebTransport Raw H.264 ─────┼──┼──▶ Chrome/Edge (WebCodecs)
                  │  │ :4444 Native QUIC Raw ────────────┼──┼──▶ moq-rs, servers
                  │  │ :4445 WebTransport fMP4 ──────────┼──┼──▶ Chrome/Edge/Firefox (MSE)
                  │  │ :4446 WebSocket fMP4 Fallback ────┼──┼──▶ ALL browsers (MSE)
                  │  └───────────────────────────────────┘  │
                  └─────────────────────────────────────────┘
```
```yaml
# Enable MoQ server with all transport modes
moq: yes
moqAddress: :4443             # WebTransport Raw H.264 (WebCodecs)
moqAddressQuic: :4444         # Native QUIC for moq-rs tools
moqAddressFMP4: :4445         # WebTransport fMP4 (MSE)
moqAddressWebSocket: :4446    # WebSocket fallback (works everywhere)

# SSL/TLS Configuration (REQUIRED for WebTransport)
moqServerCert: /path/to/cert.pem
moqServerKey: /path/to/key.pem

# Optional: Frame batching for efficiency
moqFramesPerSegment: 3        # Batch 3 frames per fMP4 segment

# Path configuration
paths:
  live:
    source: rtsp://camera.example.com:554/stream
    sourceProtocol: tcp
    sourceOnDemand: no
```
```bash
# Generate self-signed certificate for testing
openssl req -x509 -newkey rsa:4096 -sha256 -days 365 \
  -nodes -keyout server.key -out server.crt \
  -subj "/CN=localhost" \
  -addext "subjectAltName=DNS:localhost,IP:127.0.0.1"

# For production, use Let's Encrypt
certbot certonly --standalone -d moq.yourdomain.com
```

```yaml
# Update mediamtx.yml with certificate paths
moqServerCert: /etc/letsencrypt/live/moq.yourdomain.com/fullchain.pem
moqServerKey: /etc/letsencrypt/live/moq.yourdomain.com/privkey.pem
```
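On the Go side, the certificate pair ultimately has to land in a `tls.Config` handed to both the WebTransport server and `quic.ListenAddr`. As a hedged sketch (the `selfSignedTLSConfig` helper is ours, not part of the repository), here is how a throwaway in-memory certificate for local testing can be built with nothing but the standard library:

```go
package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/tls"
	"crypto/x509"
	"crypto/x509/pkix"
	"encoding/pem"
	"fmt"
	"math/big"
	"time"
)

// selfSignedTLSConfig builds a tls.Config around a throwaway
// self-signed certificate - handy for local QUIC/WebTransport
// testing before real certificates are wired in.
func selfSignedTLSConfig() (*tls.Config, error) {
	priv, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		return nil, err
	}
	tmpl := x509.Certificate{
		SerialNumber: big.NewInt(1),
		Subject:      pkix.Name{CommonName: "localhost"},
		NotBefore:    time.Now(),
		NotAfter:     time.Now().Add(24 * time.Hour),
		DNSNames:     []string{"localhost"},
	}
	// Self-signed: the template is both subject and issuer.
	der, err := x509.CreateCertificate(rand.Reader, &tmpl, &tmpl, &priv.PublicKey, priv)
	if err != nil {
		return nil, err
	}
	keyDER, err := x509.MarshalECPrivateKey(priv)
	if err != nil {
		return nil, err
	}
	certPEM := pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: der})
	keyPEM := pem.EncodeToMemory(&pem.Block{Type: "EC PRIVATE KEY", Bytes: keyDER})
	cert, err := tls.X509KeyPair(certPEM, keyPEM)
	if err != nil {
		return nil, err
	}
	return &tls.Config{
		Certificates: []tls.Certificate{cert},
		NextProtos:   []string{"h3"}, // HTTP/3 ALPN, required for WebTransport
	}, nil
}

func main() {
	cfg, err := selfSignedTLSConfig()
	if err != nil {
		panic(err)
	}
	fmt.Println(len(cfg.Certificates), cfg.NextProtos[0])
}
```

In production you would of course load the Let's Encrypt files shown above with `tls.LoadX509KeyPair` instead; the `NextProtos` ALPN entry is the detail that most often trips people up.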
Transport | Port | Protocol | Container | Browser API | Latency |
---|---|---|---|---|---|
WebTransport Raw | 4443 | HTTP/3 + WebTransport | Raw H.264 Annex B | WebCodecs | <100ms |
Native QUIC | 4444 | Raw QUIC | Raw H.264 Annex B | N/A | <100ms |
WebTransport fMP4 | 4445 | HTTP/3 + WebTransport | Fragmented MP4 | MSE | <200ms |
WebSocket Fallback | 4446 | WebSocket over TLS | Fragmented MP4 | MSE | 200-500ms |
One of our biggest challenges was integrating with MediaMTX's path manager. The stream object exists immediately but the format description isn't populated until data arrives:
```go
// Initial attempt - this crashes!
res, err := pm.AddReader(pathName, &PathAddReaderReq{
	Author: c,
})
stream := res.Stream
stream.AddReader(reader, nil, nil, callback) // PANIC: stream.Desc is nil!

// Working solution - wait for stream to be ready
for attempts := 0; attempts < 50; attempts++ {
	if res.Stream.Desc != nil {
		break
	}
	time.Sleep(100 * time.Millisecond)
}
if res.Stream.Desc == nil {
	return fmt.Errorf("stream not ready after 5 seconds")
}

// Now we can safely find formats
var videoFormatH264 *format.H264
videoMedia := res.Stream.Desc.FindFormat(&videoFormatH264)
if videoMedia == nil {
	return fmt.Errorf("H.264 format not found")
}

// Register reader with proper format
res.Stream.AddReader(reader, videoMedia, videoFormatH264, func(u unit.Unit) {
	// Process frames
})
```
This timing issue took days to debug. Other protocols avoid the race only by accident of timing; MoQ's requirement to attach a reader the moment a client connects exposed it.
We spent weeks trying to connect browsers to our QUIC server on port 4444. Nothing worked. After diving deep into browser security models, we discovered that browsers simply will not open raw QUIC connections from a web page - WebTransport over HTTP/3 is the only sanctioned path.
Out of "confusion and frustration," we implemented both:
```go
// WebTransport for browsers (port 4443)
wtServer := webtransport.Server{
	H3: http3.Server{
		Addr:      ":4443",
		TLSConfig: tlsConfig,
	},
}

// Native QUIC for server-to-server (port 4444)
quicListener, err := quic.ListenAddr(":4444", tlsConfig, &quic.Config{
	MaxIdleTimeout:        5 * time.Minute,
	MaxIncomingStreams:    256,
	MaxIncomingUniStreams: 256,
})
```
This "mistake" became a feature - we're the first implementation with dual transport support!
MediaMTX provides Access Units containing multiple NAL units. WebCodecs expects complete frames in Annex B format. Here's what we learned:
```go
func processH264AccessUnit(au *unit.H264, h264Format *format.H264) []byte {
	var frameData []byte
	hasIDR := false
	startCode := []byte{0x00, 0x00, 0x00, 0x01}

	// Check if this AU contains an IDR frame
	for _, nalu := range au.AU {
		nalType := nalu[0] & 0x1F
		if nalType == 5 { // IDR frame
			hasIDR = true
			break
		}
	}

	// Critical: Add SPS/PPS before every IDR frame
	if hasIDR && h264Format.SPS != nil && h264Format.PPS != nil {
		frameData = append(frameData, startCode...)
		frameData = append(frameData, h264Format.SPS...)
		frameData = append(frameData, startCode...)
		frameData = append(frameData, h264Format.PPS...)
	}

	// Process NAL units - only include VCL NALUs (types 1-5)
	for _, nalu := range au.AU {
		nalType := nalu[0] & 0x1F

		// Skip non-VCL NAL units (SEI, AUD, etc.)
		if nalType < 1 || nalType > 5 {
			continue
		}

		// Add VCL NAL unit with start code
		frameData = append(frameData, startCode...)
		frameData = append(frameData, nalu...)
	}

	return frameData
}
```
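The NAL-type arithmetic in this function is worth isolating: the low five bits of a NALU's first header byte are its type, types 1-5 are VCL (picture) data, and type 5 specifically is an IDR keyframe. A standalone sketch (helper names are ours, for illustration only):

```go
package main

import "fmt"

// naluType extracts the H.264 NAL unit type from the low 5 bits
// of the first header byte.
func naluType(nalu []byte) int {
	return int(nalu[0] & 0x1F)
}

// isVCL reports whether a NALU carries picture data (types 1-5);
// SEI (6), SPS (7), PPS (8), AUD (9), etc. are non-VCL.
func isVCL(nalu []byte) bool {
	t := naluType(nalu)
	return t >= 1 && t <= 5
}

// isIDR reports whether a NALU is an IDR slice (keyframe).
func isIDR(nalu []byte) bool {
	return naluType(nalu) == 5
}

func main() {
	// 0x65: forbidden_zero_bit=0, nal_ref_idc=3, type=5 (IDR)
	fmt.Println(naluType([]byte{0x65}), isVCL([]byte{0x65}), isIDR([]byte{0x65}))
	// 0x06: SEI - non-VCL, exactly what the filter above skips
	fmt.Println(naluType([]byte{0x06}), isVCL([]byte{0x06}))
}
```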
Key discoveries:
- SPS/PPS must be re-sent before every IDR frame, or a decoder joining mid-stream has no parameter sets to start from.
- Only VCL NAL units (types 1-5) should reach the decoder; passing SEI and AUD units through can crash WebCodecs.
```javascript
// Complete browser player implementation (abridged)
class MoQPlayer {
    constructor(url) {
        this.webTransport = new WebTransport(url);
        this.videoDecoder = new VideoDecoder({
            output: frame => this.renderFrame(frame),
            error: e => console.error(e)
        });
        this.audioContext = new AudioContext();
        this.audioWorklet = null; // created once the worklet module is loaded
    }

    async receiveStreams() {
        const streams = this.webTransport.incomingBidirectionalStreams;
        for await (const stream of streams) {
            this.processStream(stream);
        }
    }

    async processStream(stream) {
        // Read MoQ messages
        const reader = stream.readable.getReader();
        while (true) {
            const {value, done} = await reader.read();
            if (done) break;

            // Parse MoQ message
            const message = this.parseMoQMessage(value);

            // Decode based on type
            if (message.type === 'video') {
                this.videoDecoder.decode(new EncodedVideoChunk(message.data));
            } else if (message.type === 'audio') {
                this.audioWorklet.port.postMessage(message.data);
            }
        }
    }
}
```
Audio would play for 1 second then stop. The problem: Web Audio API scheduling conflicts when buffering too many chunks.
```javascript
// WRONG - This kills audio after ~1 second
audioChunks.forEach(chunk => {
    playAudioChunk(chunk, audioContext.currentTime + offset);
    offset += duration;
});

// CORRECT - Sequential scheduling
class AudioPlayer {
    constructor() {
        this.audioContext = new AudioContext();
        this.audioWorklet = null;
        this.nextPlayTime = 0;
        this.audioQueue = [];
        this.setupWorklet();
    }

    async setupWorklet() {
        // Decode AAC off the main thread to prevent blocking
        await this.audioContext.audioWorklet.addModule('audio-processor.js');
        this.audioWorklet = new AudioWorkletNode(this.audioContext, 'aac-processor');
        this.audioWorklet.connect(this.audioContext.destination);
    }

    scheduleAudio(audioData) {
        // Decode AAC to PCM
        const pcmData = this.decodeAAC(audioData);

        // Create a stereo buffer sized to the decoded samples
        const audioBuffer = this.audioContext.createBuffer(
            2,                   // stereo
            pcmData.left.length, // samples per channel
            44100                // sample rate
        );

        // Copy data to buffer
        audioBuffer.copyToChannel(pcmData.left, 0);
        audioBuffer.copyToChannel(pcmData.right, 1);

        // Schedule playback
        const source = this.audioContext.createBufferSource();
        source.buffer = audioBuffer;
        source.connect(this.audioContext.destination);

        // Play at the right time - never earlier than "now"
        if (this.nextPlayTime < this.audioContext.currentTime) {
            this.nextPlayTime = this.audioContext.currentTime;
        }
        source.start(this.nextPlayTime);
        this.nextPlayTime += audioBuffer.duration;
    }
}
```
Key insight: Don't schedule everything at once. Process audio sequentially to maintain sync.
MoQ uses variable-length integers for efficiency. Here's our implementation:
```javascript
// Encoding varints for MoQ messages
function encodeVarInt(value) {
    if (value < 0x40) {
        return new Uint8Array([value]);
    } else if (value < 0x4000) {
        return new Uint8Array([
            0x40 | (value >> 8),
            value & 0xFF
        ]);
    } else if (value < 0x40000000) {
        return new Uint8Array([
            0x80 | (value >> 24),
            (value >> 16) & 0xFF,
            (value >> 8) & 0xFF,
            value & 0xFF
        ]);
    }
    // Up to 8 bytes for large values
}

// Decoding varints from stream
async function readVarInt(reader) {
    const { value: firstByte } = await reader.read();
    const prefix = firstByte & 0xC0;
    if (prefix === 0x00) {
        return firstByte;
    } else if (prefix === 0x40) {
        const { value: secondByte } = await reader.read();
        return ((firstByte & 0x3F) << 8) | secondByte;
    } else if (prefix === 0x80) {
        // Read 4 bytes total
        const bytes = await readBytes(reader, 3);
        return ((firstByte & 0x3F) << 24) |
               (bytes[0] << 16) |
               (bytes[1] << 8) |
               bytes[2];
    }
    // Handle 8-byte case
}

// MoQ message structure
class MoQMessage {
    static SETUP = 0x40;
    static SETUP_OK = 0x41;
    static SUBSCRIBE = 0x03;
    static SUBSCRIBE_OK = 0x04;
    static OBJECT = 0x00;

    static encodeSetup(role, version) {
        const msg = [];
        msg.push(...encodeVarInt(MoQMessage.SETUP));
        msg.push(...encodeVarInt(version));
        msg.push(...encodeVarInt(role)); // 0x01 = subscriber
        return new Uint8Array(msg);
    }

    static encodeSubscribe(trackName, trackAlias) {
        const msg = [];
        msg.push(...encodeVarInt(MoQMessage.SUBSCRIBE));
        msg.push(...encodeVarInt(1)); // subscribe ID
        msg.push(...encodeVarInt(trackAlias));
        msg.push(...encodeVarInt(trackName.length));
        msg.push(...new TextEncoder().encode(trackName));
        return new Uint8Array(msg);
    }
}
```
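For completeness, here is a hedged Go sketch of the same wire format, including the 8-byte case the JavaScript above leaves as a TODO. MoQ reuses QUIC's variable-length integers (RFC 9000, Section 16): the top two bits of the first byte select a 1-, 2-, 4-, or 8-byte encoding. Function names are ours:

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// encodeVarInt encodes v as a QUIC variable-length integer: the two
// high bits of the first byte give the total length
// (00=1, 01=2, 10=4, 11=8 bytes). Max encodable value is 2^62-1.
func encodeVarInt(v uint64) []byte {
	switch {
	case v < 1<<6:
		return []byte{byte(v)}
	case v < 1<<14:
		b := make([]byte, 2)
		binary.BigEndian.PutUint16(b, uint16(v)|0x4000)
		return b
	case v < 1<<30:
		b := make([]byte, 4)
		binary.BigEndian.PutUint32(b, uint32(v)|0x8000_0000)
		return b
	default:
		b := make([]byte, 8)
		binary.BigEndian.PutUint64(b, v|0xC000_0000_0000_0000)
		return b
	}
}

// decodeVarInt decodes a varint from the front of buf, returning the
// value and the number of bytes consumed.
func decodeVarInt(buf []byte) (uint64, int) {
	length := 1 << (buf[0] >> 6) // 1, 2, 4, or 8
	v := uint64(buf[0] & 0x3F)
	for i := 1; i < length; i++ {
		v = v<<8 | uint64(buf[i])
	}
	return v, length
}

func main() {
	for _, v := range []uint64{0x25, 0x3FFF, 0x12345, 1 << 40} {
		enc := encodeVarInt(v)
		dec, n := decodeVarInt(enc)
		fmt.Printf("value=%#x bytes=%d roundtrip=%v\n", v, n, dec == v)
	}
}
```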
Optimization | Impact | Implementation |
---|---|---|
Frame Buffering | Smooth 30fps playback | Buffer 3-5 frames, render at fixed interval |
Software Decoding | More reliable than hardware | hardwareAcceleration: 'prefer-software' |
Audio Worklet | No main thread blocking | Decode AAC in Web Worker |
Unidirectional Streams | Lower overhead | Each frame on new stream |
Skip SEI NALUs | Prevent decoder crashes | Filter NAL types 1-5 only |
**Chrome / Edge**
WebTransport: ✅ Supported
WebCodecs: ✅ Supported
Status: Works Today

**Firefox**
WebTransport: ⚠️ Behind Flag
WebCodecs: ⚠️ Partial
Status: Not Production Ready

**Safari (Desktop)**
WebTransport: ❌ Not Implemented
WebCodecs: ⚠️ Experimental
Status: No Timeline

**iOS (All Browsers)**
WebTransport: ❌ Not Implemented
WebCodecs: ❌ Not Available
Status: Completely Blocked
Without Safari support, MoQ cannot reach iPhone and iPad users, Mac users on the default browser, or any app built on a WebKit view. This means MoQ is currently unusable for consumer-facing applications.
Current MoQ implementations (including ours) use WebCodecs API to decode video in JavaScript. This is a clever workaround, but it's not a sustainable solution.
Issue | Impact | Why It Matters |
---|---|---|
CPU Usage | 3-5x higher than native | Drains battery, limits concurrent streams |
Frame Drops | Happens under load | JavaScript thread blocking causes stuttering |
Memory Usage | 2-3x higher | Frame buffers in JavaScript heap |
Complexity | 1000+ lines of code | vs `<video>` tag simplicity |
Audio Sync | Requires Web Workers | Complex timing coordination |
Mobile Performance | Often unusable | Mobile CPUs can't handle it |
```javascript
// This is what we have to do now (complex, inefficient):
const decoder = new VideoDecoder({...});
const transport = new WebTransport(url);
// ... 500 lines of complex stream handling ...

// This is what we SHOULD have (simple, efficient):
// <video src="moq://stream.example.com/live" autoplay></video>
```
Playing video in JavaScript is like building a car engine with Lego blocks. It's an impressive technical achievement that proves the concept works, but you wouldn't want to drive it to work every day.
- IETF finalizes QUIC (RFC 9000). Apple participates in the working group.
- W3C publishes the WebTransport draft. Chrome implements it. Apple stays silent.
- Google enables WebTransport by default. Safari: "No position yet."
- Multiple requests for Safari support. Apple: "Under consideration."
- Safari adds partial WebCodecs behind a flag. WebTransport: still nothing.
- Today: Safari has ZERO WebTransport support. No public timeline. No commitment.
On iOS, ALL browsers must use Safari's WebKit engine. This means Chrome, Firefox, and Edge on iOS are WebKit underneath - none of them can offer WebTransport either.
Apple's decision blocks MoQ on the entire iOS ecosystem - roughly 1 billion devices.
The Dilemma: content providers won't commit to MoQ until every browser can play it, and Apple faces little pressure to ship support until major content demands it. Someone has to move first. Our bet: Google will force the issue by making YouTube Live use MoQ.
Use Case | Ready? | Why / Why Not |
---|---|---|
B2B Surveillance (Chrome/Edge only) | ✅ YES | Controlled environment, can mandate browser |
Internal Corporate Streaming | ✅ YES | IT can standardize on Chrome/Edge |
Server-to-Server Relay | ✅ YES | Native QUIC works great |
Government Public Portals | ⚠️ MAYBE | Can't exclude Safari users |
Consumer Live Streaming | ❌ NO | Must support all browsers |
Mobile Apps | ❌ NO | iOS WebView doesn't support WebTransport |
Smart TV Apps | ❌ NO | TV browsers are years behind |
Solution | Latency | Browser Support | Pros | Cons |
---|---|---|---|---|
WebRTC | 500ms-2s | All browsers | Universal support, P2P capable | Complex, NAT issues, not for broadcast |
LL-HLS | 2-3s | All browsers | Apple standard, CDN friendly | Still seconds of delay |
WebTransport + MSE | 1-2s | Chrome only | Better than HLS | Complex, Chrome only |
MoQ (Today) | 200-300ms | Chrome/Edge only | Lowest latency, future-proof | No Safari, needs JavaScript decoder |
For B2B/Enterprise: Use MoQ if you can control browser choice
For B2C/Consumer: Stick with WebRTC or LL-HLS until Safari supports WebTransport
For Future-Proofing: Build MoQ support now, but keep fallbacks
```go
// Smart batching with keyframe awareness
type WebSocketFallbackConn struct {
	framesPerSegment int // Configurable (default: 3)
	videoFrameBuffer []*unit.H264
	audioFrameBuffer []*unit.MPEG4Audio
}

func (c *WebSocketFallbackConn) handleVideo(au *unit.H264) {
	c.videoFrameBuffer = append(c.videoFrameBuffer, au)

	// An IDR NALU (type 5) anywhere in the access unit marks a keyframe
	isKeyframe := false
	for _, nalu := range au.AU {
		if nalu[0]&0x1F == 5 {
			isKeyframe = true
			break
		}
	}

	// Flush on keyframe or when buffer full
	if isKeyframe || len(c.videoFrameBuffer) >= c.framesPerSegment {
		c.sendVideoSegment()
	}
}
```
Component | Latency | Notes |
---|---|---|
Encoding | 50-100ms | FFmpeg x264 encoding |
Network | 20-50ms | QUIC transport overhead |
Buffering | 50-100ms | Frame buffer in browser |
Decoding | 30-50ms | WebCodecs processing |
Rendering | 33ms | One frame at 30fps |
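As a sanity check, the rows of this budget can be summed: the best-case components total 183ms and the worst-case 333ms, which brackets the observed 200-300ms. A throwaway Go check of the arithmetic:

```go
package main

import "fmt"

// stage is one row of the latency budget table, in milliseconds.
type stage struct {
	name     string
	min, max int
}

// totalBudget sums the best- and worst-case latency across stages.
func totalBudget(stages []stage) (lo, hi int) {
	for _, s := range stages {
		lo += s.min
		hi += s.max
	}
	return lo, hi
}

func main() {
	stages := []stage{
		{"Encoding", 50, 100},
		{"Network", 20, 50},
		{"Buffering", 50, 100},
		{"Decoding", 30, 50},
		{"Rendering", 33, 33},
	}
	lo, hi := totalBudget(stages)
	fmt.Printf("end-to-end budget: %d-%dms\n", lo, hi) // 183-333ms
}
```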
```
MediaMTX Process:
├── RTMP Ingestion:    ~5%
├── H.264 Processing:  ~8%
├── MoQ Encoding:      ~2%
└── Network I/O:       ~3%
Total: ~18%

Browser:
├── WebTransport:      ~10%
├── Video Decode:      ~15%
├── Audio Decode:      ~5%
├── Rendering:         ~10%
└── JavaScript:        ~5%
Total: ~45%
```
Use Case | Issue | Workaround |
---|---|---|
High packet loss networks (>5%) | No advanced FEC | Use SRT instead |
Mobile devices | JavaScript decoding too heavy | Wait for native support |
Large scale broadcasting | No CDN integration yet | Use HLS for scale |
DRM protected content | No encryption support | Not applicable |
Live captions/subtitles | No metadata tracks | Overlay manually |
```yaml
# mediamtx.yml
logLevel: debug
logDestinations: [stdout]

# Enable MoQ specific logs
servers:
  moq:
    enable: yes
    debug: true
```
```
# Enable experimental features
chrome://flags/#enable-experimental-web-platform-features

# Check WebTransport status
chrome://webrtc-internals/
```
```bash
# Monitor QUIC traffic
sudo tcpdump -i any -nn port 4443 or port 4444

# Test with moq-rs tools
moq-relay --listen 0.0.0.0:4444
moq-pub --url https://localhost:4444/test video.mp4
```
This implementation wouldn't exist without the foundational work of many talented engineers and organizations:
To everyone who's tried to achieve sub-second latency and shared their failures and successes. The streaming community's openness made this possible.
We're not waiting for perfection. With four transport modes, frame batching, and WebSocket fallback, we've achieved 100% browser coverage TODAY. The ultra-low latency dream works in modern browsers, while fallbacks ensure nobody is excluded.
Apple's WebCodecs beta changes everything. Once Safari ships WebCodecs to stable, we'll have sub-200ms latency on iOS. WebTransport would be nice, but WebSocket fallback means we're not blocked. WINK Streaming is committed to pushing MoQ forward regardless of vendor politics.
The future is multi-transport. Pure MoQ for cutting-edge browsers, fMP4 for compatibility, WebSocket for universality. This isn't a compromise - it's pragmatic engineering that ships TODAY while building for tomorrow.
If you're in a controlled environment where you can mandate Chrome/Edge, MoQ is ready today.