The sources to be scraped are quite varied.
I’ve got a little bit of experience with Node/Express and Ruby/Rails. I’m more than happy to pick up Go, Python/Django, Elixir, or something else if those are more appropriate. I think my hesitation with going back to Node is that I slightly prefer statically typed languages, but I’m happy to use the best tool for the job.
My concern is compute/bandwidth costs, since the various scrapers will be running and rotating quite frequently.
I’m hoping you all could give some recommendations for a stack that makes it easy to run mass scheduled web scraping jobs with little overhead in order to reduce server costs. Thanks!
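For context, the kind of low-overhead setup I’m imagining is roughly this: scraping is I/O-bound, so a thread pool on one small box plus a simple interval scheduler should go a long way before compute costs matter. This is just a sketch using only Python’s standard library; the function names and the fake fetcher are placeholders, not a real implementation:

```python
import sched
import time
from concurrent.futures import ThreadPoolExecutor


def run_jobs(fetch, urls, max_workers=8):
    """Fetch all URLs concurrently. Scraping is I/O-bound, so a
    thread pool keeps one cheap machine busy instead of scaling out.
    `fetch` is injected so it can be a real HTTP call or a stub."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order in its results
        return list(pool.map(fetch, urls))


def schedule_repeating(scheduler, interval_s, job, cycles):
    """Re-queue `job` every `interval_s` seconds for `cycles` runs,
    using the stdlib sched module instead of a heavyweight queue."""
    def tick(remaining):
        job()
        if remaining > 1:
            scheduler.enter(interval_s, 1, tick, (remaining - 1,))

    scheduler.enter(0, 1, tick, (cycles,))
    scheduler.run()  # blocks until all scheduled runs complete


if __name__ == "__main__":
    # Placeholder fetcher; a real one would use urllib/requests.
    def fake_fetch(url):
        return f"scraped {url}"

    s = sched.scheduler(time.monotonic, time.sleep)
    schedule_repeating(
        s, 1.0,
        lambda: print(run_jobs(fake_fetch, ["site-a", "site-b"])),
        cycles=3,
    )
```

For anything beyond a handful of sources I’d assume you’d swap the stdlib scheduler for cron or a proper job queue, but this is the level of overhead I’m hoping to stay near.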
While it seems I could have built a slightly more (CPU &amp; memory) efficient system in Elixir, I’m afraid development of new features would be a bit slower, and my time is more precious than the machine’s.
Also, CPU &amp; memory are likely not the constraints in a scraping exercise. What you will likely find is that you get blocked by Cloudflare in week 2, and a superb backend won’t make a difference.