I'd like to build a download accelerator (a sort of peer-to-peer, torrent-like application that accelerates HTTP downloads), but I'm not sure how to go about limiting connection speeds, prioritizing chunks from different hosts, etc. The idea is that there's a mechanism to auto-discover other servers hosting the same content, plus some p2p NAT-punching infrastructure so that peers can help serve collections of files.
Does anyone have any recommendations of required reading? Any suggestions on how you'd generally handle rate limiting and prioritization of different downloads?
How would you architect a system for downloading content from a bunch of http mirrors?
A particularly simple rate-limiting algorithm is the leaky bucket: https://en.wikipedia.org/wiki/Leaky_bucket
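A minimal sketch of the idea, in Python (the class and parameter names are my own illustration, not from any library): requests fill the bucket, the bucket drains at a fixed rate, and anything that would overflow gets rejected until enough has drained.

```python
import time

class LeakyBucket:
    """Toy leaky-bucket rate limiter, framed here in bytes/sec."""

    def __init__(self, rate_bytes_per_sec, capacity_bytes):
        self.rate = rate_bytes_per_sec   # drain rate (bytes/sec)
        self.capacity = capacity_bytes   # bucket size = burst allowance
        self.level = 0.0                 # current fill level
        self.last = time.monotonic()

    def try_send(self, nbytes):
        """Return True if nbytes fits in the bucket right now, else False."""
        now = time.monotonic()
        # Drain according to elapsed time since the last check.
        self.level = max(0.0, self.level - (now - self.last) * self.rate)
        self.last = now
        if self.level + nbytes <= self.capacity:
            self.level += nbytes
            return True
        return False

bucket = LeakyBucket(rate_bytes_per_sec=1024, capacity_bytes=4096)
assert bucket.try_send(4096)       # empty bucket: a full burst fits
assert not bucket.try_send(1)      # now full: rejected until it drains
```

In a downloader you'd typically put one bucket per connection (to cap each mirror) plus one global bucket, and have the chunk scheduler sleep or switch hosts when `try_send` fails.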
IMHO though it's not worth the trouble. CDNs are dirt cheap these days and give a far better end-user experience than peer downloads. The vast majority of home internet connections are still asymmetric (more download bandwidth than upload), and forcing users to share their very limited upstream bandwidth for your benefit is poor practice that will slow down their overall internet browsing.
Windows tried this, Blizzard tried this, many of the Russian game outfits did it in their patchers, etc., but I think it's mostly fallen out of favor as proper CDNs became more affordable.