So why is it still not trivial to host a file by torrent on a private network?
If I have a set of machines behind a firewall, it's a huge pain to set up a qbittorrent instance and configure clients to use it. Even once all that is set up, the retrieval process is nothing like "wget magnet://...".
To be clear, projects like aria2 (C++) and webtorrent (nodejs) do exist, but try to use them for anything production-grade or in an automated fashion, and you'll find they are slow to start downloads and generally unreliable and flaky.
A true "wget"-style system for seeding and downloading from a private torrent swarm would be amazing for many use cases. I'm envisioning something like a program that monitors a folder, and any file that lands in the folder gets auto-seeded to a set of private clients.
What do you think?
The real question is why are torrents still the best P2P system around? Why do we not have something like SyncThing but for mass public sharing?
The other thing that happened is that newer P2P projects aim to replace the current internet entirely, and don't fully take advantage of the fact that we can have cheap centralized servers.
I would love to work on a project like this one day. I've done some proofs of concept for similar things, but making something practical and cross-platform is a bit too much for one person.
Lately I've been wondering whether true P2P is actually the way to go for a real next-gen system, or whether we should be getting back to a more refined version of the tracker concept.