If you’ve ever been told, “You can just send video in Base64 inside JSON,” stop right there. Base64 works for small images, debugging, or prototypes, but for serious video—especially live or large files—it’s a disaster: +33% bandwidth, extra CPU, higher latency, wasted memory. No exceptions.
Here’s how real streaming works, with raw bytes, chunking, and codecs, plus a practical WebSocket example.
Why Base64 Doesn’t Work
Base64 introduces serious overhead:
Bandwidth: +33% compared to raw binary
CPU: Encoding/decoding is expensive
Latency: Every chunk must be converted
Memory: Extra buffers for the string
Scalability: Impossible for live or large files
Base64 is only useful for tiny assets or debugging, never production.
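You can see the inflation directly. A minimal Node.js sketch (the 3 MB buffer stands in for real video data):

const raw = Buffer.alloc(3 * 1024 * 1024); // pretend this is 3 MB of video
const asBase64 = raw.toString("base64");
console.log(raw.length, asBase64.length); // 3145728 vs 4194304: exactly +33%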
How Real Streaming Works
1. Raw Binary Stream
Send the video as pure bytes, never text (see the sketch after this list). Options include:
TCP / UDP
WebSocket (binary)
HTTP chunked transfer
QUIC
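As a tiny illustration of "pure bytes, never text", here is a hypothetical Node.js TCP server that writes raw binary straight onto the wire (the port and payload are placeholders):

import net from "node:net";

net.createServer((conn) => {
  // Write raw bytes directly: no JSON, no Base64, no string conversion
  conn.write(Buffer.from([0x00, 0x00, 0x00, 0x01])); // e.g. an H.264 NAL start code
}).listen(1234);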
2. Chunking
Break the video into blocks (a sketch follows the list):
[chunk][chunk][chunk]
Typical size: 1–64 KB
Sequential order
No text encoding
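A minimal sketch of the chunking step, assuming the video already sits in a Buffer; the 16 KB size is just one point in the typical range:

const CHUNK_SIZE = 16 * 1024; // 16 KB, within the typical 1–64 KB range

function* chunks(buffer, size = CHUNK_SIZE) {
  for (let offset = 0; offset < buffer.length; offset += size) {
    yield buffer.subarray(offset, offset + size); // zero-copy view, preserves order
  }
}
// Usage (hypothetical names): for (const chunk of chunks(videoBuffer)) ws.send(chunk);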
3. Codec
You don’t send raw frames; codecs compress video efficiently:
H.264 / AVC (standard)
H.265 / HEVC (high efficiency)
VP9 / AV1 (royalty-free, high quality per bit of bandwidth)
The server sends compressed bytes, the client decodes them in real time.
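To make that concrete, here is a hedged browser-side sketch using the WebCodecs API. It assumes an already-open binary WebSocket named socket; the codec string, resolution, and bitrate are placeholders:

const encoder = new VideoEncoder({
  output: (chunk) => {
    const bytes = new Uint8Array(chunk.byteLength);
    chunk.copyTo(bytes);  // compressed H.264 bytes
    socket.send(bytes);   // ship them as raw binary
  },
  error: (e) => console.error(e),
});
encoder.configure({
  codec: "avc1.42E01E", // H.264 Baseline profile
  width: 1280,
  height: 720,
  bitrate: 2_000_000,   // 2 Mbit/s
});
// Call encoder.encode(videoFrame) for each captured frame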
Conceptual Example
Server (encoder):
Camera → H.264 encoder → byte stream → socket
Client (decoder):
socket → byte stream → decoder → frame → display
Web Options
| Option | Best for |
| --- | --- |
| WebRTC | Live, low latency |
| HLS | On-demand streaming |
| DASH | Adaptive streaming |
| WebSocket (binary) | Custom real-time prototypes |
Minimal WebSocket Example (JS)
Server (Node.js):
// Assumes ws is an open connection from the "ws" package
// and videoChunks is an array of Buffers
videoChunks.forEach(chunk => {
  ws.send(chunk); // Buffers go out as binary frames, no strings
});
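For context, a fuller (still minimal) version of that server, streaming a file in chunks with the ws package; the file name, port, and chunk size are assumptions:

import { WebSocketServer } from "ws";
import { createReadStream } from "node:fs";

const wss = new WebSocketServer({ port: 8080 });
wss.on("connection", (ws) => {
  // Stream the file chunk by chunk instead of loading it all into memory
  createReadStream("input.ts", { highWaterMark: 16 * 1024 })
    .on("data", (chunk) => ws.send(chunk)); // each Buffer is sent as a binary frame
});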
Client (browser):
const socket = new WebSocket("wss://example.com/video");
socket.binaryType = "arraybuffer";
socket.onmessage = (event) => {
  const chunk = new Uint8Array(event.data);
  // Decode using MediaSource or WebCodecs
};
Notice: no Base64, only raw bytes.
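On the decoding side, a hedged sketch of the MediaSource route. It assumes the server sends fragmented MP4 matching the codec string below, and that socket is the WebSocket from above:

const video = document.querySelector("video");
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  // The codec string is an assumption; it must match what the server encodes
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
  const queue = []; // appendBuffer rejects calls while the buffer is updating
  sb.addEventListener("updateend", () => {
    if (queue.length) sb.appendBuffer(queue.shift());
  });
  socket.onmessage = (event) => {
    const chunk = new Uint8Array(event.data);
    if (sb.updating) queue.push(chunk);
    else sb.appendBuffer(chunk);
  };
});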
Real Pipeline with FFmpeg (Conceptual)
ffmpeg -i input.mp4 \
-c:v libx264 -preset ultrafast -f mpegts udp://127.0.0.1:1234
libx264 → the codec
-f mpegts → a streaming-friendly container
udp:// → raw stream transmission
The client receives packets and feeds them directly into the decoder.
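A minimal sketch of that receiving side in Node.js; the decoder hand-off stays a comment, since it depends on your player:

import dgram from "node:dgram";

const udp = dgram.createSocket("udp4");
udp.on("message", (packet) => {
  // Each datagram carries MPEG-TS packets; feed them to your decoder here
  console.log(`received ${packet.length} bytes`);
});
udp.bind(1234); // the port ffmpeg sends to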
Base64 is fine for prototypes. For large files or live streaming, you need pure binary, chunking, codecs, and a proper protocol. With WebRTC, HLS, DASH, or binary WebSocket, you get low latency, high quality, and real scalability.
In a follow-up, I can show a full demo:
Convert a photo/video into a raw byte stream
Simulate frame → chunk → stream
Build a real WebRTC + FFmpeg production-ready pipeline
This is the serious world of video streaming.