Docker can already build in layered steps that depend on each other and reuse earlier layers, and Nix exists to make builds deterministic; the two can even be combined. With that tech, one could build a deterministic CI system where the environment version is simply a date (e.g., start from an Ubuntu install as of January 2020, fully updated as of July 2023, then install my stuff). Spinning up that environment to test the next commit should then be extremely quick.
Furthermore, with a deterministic environment, rerunning a test could be made contingent on whether the files and system calls it uses have changed. The same test, depending on the same system and the same code, must give the same result.
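To make that concrete, here's a minimal sketch of the caching half in Python, under stated assumptions: the environment identifier, image digest, and file names are all made up, and it only hashes declared input files rather than tracing actual system calls (which would need something like strace or a sandbox). Rerun a test only when the hash of the pinned environment, the command, and the files it reads has changed; otherwise replay the recorded result.

```python
import hashlib
import json
import subprocess
from pathlib import Path

# Hypothetical identifier for the pinned environment: a base image digest plus
# the date the package mirror was frozen at. Both values are placeholders;
# in a real system they would come from the CI configuration.
ENVIRONMENT = {
    "base_image": "ubuntu@sha256:0000000000000000000000000000000000000000000000000000000000000000",
    "package_snapshot": "2023-07-01",
}

CACHE_DIR = Path(".test-cache")


def cache_key(test_cmd: list[str], input_files: list[Path]) -> str:
    """Hash everything the test result can depend on: the pinned environment,
    the command itself, and the contents of the files the test reads."""
    h = hashlib.sha256()
    h.update(json.dumps(ENVIRONMENT, sort_keys=True).encode())
    h.update(json.dumps(test_cmd).encode())
    for path in sorted(input_files):
        h.update(str(path).encode())
        h.update(path.read_bytes())
    return h.hexdigest()


def run_cached(test_cmd: list[str], input_files: list[Path]) -> int:
    """Rerun the test only if its inputs changed; otherwise reuse the cached result."""
    CACHE_DIR.mkdir(exist_ok=True)
    marker = CACHE_DIR / cache_key(test_cmd, input_files)
    if marker.exists():
        return int(marker.read_text())  # same inputs, same result: skip the rerun
    result = subprocess.run(test_cmd).returncode
    marker.write_text(str(result))
    return result


if __name__ == "__main__":
    # Example (hypothetical paths): rerun this test only when its own source
    # or the code under test changes.
    raise SystemExit(run_cached(
        ["python", "-m", "pytest", "tests/test_parser.py"],
        [Path("tests/test_parser.py"), Path("src/parser.py")],
    ))
```

In a real system the input set would be discovered automatically (by tracing the test run) and the cache would be shared across machines, but the core trick is just content-addressing the test by its inputs.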
The benefits are obvious: deterministic testing saves compute, because results for unchanged inputs can be reused instead of recomputed, and feedback to the developer gets faster.
Shouldn't this be tackled ASAP by big companies or startups out of their own interest?
What's the hurdle I am missing?
It's still doable I think, and I'd love to work towards it, but who is going to pay for it? Mountains of compute burned day after day might feel like waste to nerds like us, but to an investor it's opportunity.
Fixing it? Less opportunity.