HACKER Q&A
📣 sktrdie

Where is P2P live video streaming?


As home bandwidth grows for households worldwide, I'd imagine P2P solutions for live video streaming would be mainstream by now, the same way BitTorrent is mainstream for non-live content.

I understand that delivering live data with sub-10-second latency across a large swarm of peers is probably impossible (given the inconsistency of peers' network connections), but I think we'd be happy with 2-3 minute delays.

What are the technical limitations of this approach, and why has nothing like this caught on for streaming live events to swarms of thousands of users?


  👤 wmf Accepted Answer ✓
CDNs have gotten cheaper even faster than home uplink bandwidth has increased so P2P doesn't save enough money to offset its complexity.

👤 SahAssar
BitTorrent can prioritize which pieces to download in which order to keep the swarm healthy (I don't know how smart it is, but in my experience it seems to go for pieces with less availability first), and aside from people trying to play videos/audio before they fully download (like Popcorn Time, or picking "download in sequential order"), this works pretty well.

With a live stream you have a few problems:

* The data you just downloaded is only relevant to the swarm for a short time

* The whole swarm wants the same data at each point in time, and the only part of the swarm that has that data is the originator of the stream

* I'm by no means an expert in this, but isn't it harder to do DHT-type discovery in real time when the hashes of the content are new or always changing? New chunks would need to be broadcast over time, similar to how livestreaming with MPEG-DASH or HLS works, and those torrents would only be relevant to the swarm for a few minutes at most.
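The rolling-announcement idea in the last bullet can be sketched as a live playlist that only ever advertises the most recent chunk hashes, loosely modeled on an HLS media playlist. Everything here is illustrative: `LIVE_WINDOW`, the class name, and the SHA-256 chunk hashing are assumptions, not part of any real torrent or HLS implementation.

```python
import collections
import hashlib

LIVE_WINDOW = 6  # assumption: only the last 6 chunks stay relevant to the swarm


class LivePlaylist:
    """Hypothetical rolling announcement of chunk hashes for a live swarm."""

    def __init__(self):
        self.chunks = collections.OrderedDict()  # sequence number -> content hash
        self.next_seq = 0

    def announce(self, data: bytes) -> int:
        """Publish a new chunk's hash and expire the oldest ones."""
        seq = self.next_seq
        self.chunks[seq] = hashlib.sha256(data).hexdigest()
        self.next_seq += 1
        while len(self.chunks) > LIVE_WINDOW:
            self.chunks.popitem(last=False)  # oldest chunk drops out of the swarm's view
        return seq


pl = LivePlaylist()
for i in range(10):
    pl.announce(f"chunk-{i}".encode())
print(sorted(pl.chunks))  # → [4, 5, 6, 7, 8, 9]
```

Peers would re-fetch this small announcement frequently, which is the part that a static DHT lookup (keyed on a fixed infohash) doesn't naturally give you.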

Something that could (in my amateur/uneducated opinion/guess) work better is staggering the clients into groups with tiered delays. If you had X viewers spread over 10 groups, each staggered by a 10 s delay, with each video chunk being 10 s long, then the swarming could probably work better: the group with the longest delay would have the best smoothness (because 9/10 of the swarm already has the chunk, and 1/10 of the swarm is dedicated to uploading only that chunk) but the longest latency.

It would probably work better if the groups lower down the stack were larger too, so that each layer seeds more clients in the layer below it than the number of clients that seeded it from above.

Problems I can see: if there is a lot of churn in the higher groups, there would be latency from reconnections and re-seeding into those groups, plus the latency of broadcasting new chunks, but I have no idea how solvable or severe those would be.
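The staggered-group scheme above can be sketched in a few lines. Everything here is a guess at one possible design: the group count, chunk length, hash-based tier assignment, and the function names are all assumptions, not an existing protocol.

```python
import hashlib

CHUNK_SECONDS = 10  # assumed chunk duration, matching the example above
NUM_GROUPS = 10     # assumed number of delay tiers


def assign_group(peer_id: str) -> int:
    """Deterministically hash a peer id into a delay tier (hypothetical scheme)."""
    digest = hashlib.sha256(peer_id.encode()).digest()
    return digest[0] % NUM_GROUPS


def playback_delay(group: int) -> int:
    """Each tier plays exactly one chunk-length behind the tier above it."""
    return group * CHUNK_SECONDS


g = assign_group("peer-1234")
print(g, playback_delay(g))  # tier in [0, 10), delay in {0, 10, ..., 90} seconds
```

A deterministic hash keeps tier assignment stateless, so a reconnecting peer lands back in the same tier; a real design would probably also weight tier sizes so lower tiers are larger, as described above.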

Would be interesting to build a WebTorrent webapp that does something like this to see if it's feasible. If my assumptions are somewhat right, it might make WebRTC better for broadcast/conference use cases without using SFUs.



👤 Saris
Peertube does it.

Seems to work pretty well from my limited use so far.

I think Theta TV does as well, although it's not entirely clear how it all works (as is typical for most blockchain-based stuff).


👤 olyjohn
What about multicast? My understanding was that you subscribe to a multicast endpoint with a player, and that endpoint only needs to send one stream. The routers should handle all the multicasting and essentially "split" the stream out to the clients. It always seemed a bit ridiculous to have 500 people connecting to your one tiny connection...
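The "subscribe to a multicast endpoint" part is roughly what an IGMP group join looks like at the socket level. This is a minimal sketch of a receiver; the group address and port are hypothetical placeholders a broadcaster would publish, and in practice this only works where the network path supports multicast routing (which the public internet generally does not, which is part of the answer to the question).

```python
import socket
import struct


def open_multicast_receiver(group: str, port: int, iface: str = "0.0.0.0") -> socket.socket:
    """Bind a UDP socket and ask the kernel (via IGMP) to join a multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # ip_mreq: 4-byte group address + 4-byte local interface address
    mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(iface))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock


# Hypothetical group/port the broadcaster would publish:
# sock = open_multicast_receiver("239.1.2.3", 5004)
# data, addr = sock.recvfrom(2048)  # each datagram is one piece of the stream
```

The sender only ever transmits one copy; routers along the way duplicate packets toward subscribed subnets, which is exactly the "split the stream" behavior described above.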

👤 jcrawfordor
I think part of the problem is that users in livestreaming scenarios are much more latency-sensitive than you would expect, especially regarding inter-viewer synchronization, due to live chat. Just running a small centralized streaming system, I struggle constantly with optimizing for latency and viewer synchronization.

👤 swalls
The largest streaming service in South Korea is AfreecaTV, which is P2P. I think the problem, and why this wouldn't work for Twitch/YouTube etc., comes when expanding it globally, since you have places with very weak upload bandwidth and fewer users.