Suppose we wrote our software in compiled languages again, built on the libraries shipped with the OS, went back to doing basic algorithmic optimization, and kept memory usage in check. A lot of the power gained over the years has been wasted, mainly because it was cheaper for organizations to drop low-quality crap on users and make them upgrade.
If the Moore party really is over, I expect organizations to start optimizing again, up to the point where performance is acceptable.
I see a parallel with plastic: both plastics and software are basically miracle materials, cheap ways to make almost anything in huge quantities. The result is a huge amount of wasteful practice in both cases.
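To make "basic algorithmic optimization" concrete, here is a minimal C sketch (illustrative only, not taken from any particular codebase): counting distinct values with a naive O(n^2) scan versus a sort-then-scan that is O(n log n) and needs only one temporary buffer.

    /* Illustrative sketch of a basic algorithmic optimization:
     * counting distinct values in an array. The naive version is
     * O(n^2); sorting a copy first drops it to O(n log n) with a
     * single temporary buffer as the only extra memory. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    static int cmp_int(const void *a, const void *b) {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);
    }

    /* O(n^2): for each element, check whether it appeared earlier. */
    static size_t count_distinct_naive(const int *v, size_t n) {
        size_t count = 0;
        for (size_t i = 0; i < n; i++) {
            int seen = 0;
            for (size_t j = 0; j < i; j++)
                if (v[j] == v[i]) { seen = 1; break; }
            if (!seen) count++;
        }
        return count;
    }

    /* O(n log n): sort a copy, then count the runs of equal values. */
    static size_t count_distinct_sorted(const int *v, size_t n) {
        if (n == 0) return 0;
        int *tmp = malloc(n * sizeof *tmp);
        if (!tmp) return 0;
        memcpy(tmp, v, n * sizeof *tmp);
        qsort(tmp, n, sizeof *tmp, cmp_int);
        size_t count = 1;
        for (size_t i = 1; i < n; i++)
            if (tmp[i] != tmp[i - 1]) count++;
        free(tmp);
        return count;
    }

    int main(void) {
        int v[] = {3, 1, 4, 1, 5, 9, 2, 6, 5, 3};
        printf("naive: %zu, sorted: %zu\n",
               count_distinct_naive(v, 10), count_distinct_sorted(v, 10));
        return 0;
    }

Nothing exotic, just the kind of change that used to be routine when cycles and memory were scarce.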
But going forward, it seems the technological solutions being applied to real problems don't require massive compute so much as clever parallelism.
Take ML inference as an example. Instead of running it on a CPU, you can build semi-specialized hardware that runs at a fraction of the power for the same unit of compute, because all the memory accesses are deterministic and there is far less data movement, no circuitry to decode/reorder instructions, no branch prediction, etc.
And then you can scale that hardware out as much as you need to get the compute width you want, at the cost of additional power draw.
So there really is no need to develop faster and faster chips, especially given that anything that functionally improves your day-to-day life (robotic servants, for example) can run in the MHz range with extreme accuracy, even kHz depending on the task.
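As a rough back-of-envelope in C (every number below is an assumed, round-figure illustration, not a measurement; energy per MAC varies widely by process node and design), the gap looks something like this:

    /* Back-of-envelope sketch of CPU vs. semi-specialized hardware for
     * ML inference. All figures are assumed round numbers; the point is
     * the ratio, not the absolutes. */
    #include <stdio.h>

    int main(void) {
        double macs_per_inference = 1e9;      /* assumed: ~1 GMAC model */
        double cpu_joules_per_mac  = 100e-12; /* assumed: ~100 pJ/MAC on a
                                                 CPU, dominated by decode,
                                                 cache traffic, data movement */
        double asic_joules_per_mac = 2e-12;   /* assumed: a few pJ/MAC when
                                                 the dataflow is fixed and
                                                 memory accesses are known */

        double cpu_j  = macs_per_inference * cpu_joules_per_mac;
        double asic_j = macs_per_inference * asic_joules_per_mac;

        printf("CPU:  %.3f J per inference\n", cpu_j);
        printf("ASIC: %.3f J per inference (%.0fx less)\n",
               asic_j, cpu_j / asic_j);
        return 0;
    }

Even with these made-up numbers, an order of magnitude or two of power savings is the kind of headroom that makes "just build more of them" viable.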
We're so hardware-rich it's amazing. When I was in college, a Cray-1 was the ultimate in computing: 80 million FLOPS sustained. The supercomputer in my pocket (a $200 Motorola smartphone) easily beats that, and it doesn't require three-phase power, raised flooring, or industrial cooling.
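The rough arithmetic, taking the 80 MFLOPS figure above and an assumed ballpark of ~10 GFLOPS for a budget phone's CPU cores (not a measurement, and the GPU would add more):

    /* Rough arithmetic behind the Cray-1 comparison. The 80 MFLOPS
     * sustained figure is the one quoted above; the smartphone number
     * is an assumed ballpark, not a benchmark result. */
    #include <stdio.h>

    int main(void) {
        double cray1_flops = 80e6;  /* ~80 MFLOPS sustained, as stated above */
        double phone_flops = 10e9;  /* assumed: ~10 GFLOPS for a budget
                                       smartphone CPU, more with the GPU */
        printf("Phone / Cray-1: roughly %.0fx\n", phone_flops / cray1_flops);
        return 0;
    }

On those assumptions the cheap phone is something like a hundred times the Cray, before you even count the GPU.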
You kids don't seem to realize just how good you've got it.
After all, why did we ignore the economics of fiat currency, money printing, and budget deficits focused on demand creation, while the entire time ignoring supply creation (which includes domestic production over outsourcing)? We saw with COVID, and are seeing today, what the cost is of ignoring supply and focusing only on demand. The simplest reason: supply infrastructure takes time and intense effort, on time spans beyond the next quarter - it takes 20 years to go from "Lab to Fab" for any technology. That's too much effort when you can just print some more money to goose GDP and profits.
It’s the same reason people like Malthus are typically wrong. He thought the world would be out of food by this point.
But he never predicted (nor could he have predicted) the advent of fertilizer and modern growing techniques, which vastly boosted the efficiency of food production.
People like this think the future will be like the past. But the future is never like the past because people solve problems.
As long as people are solving problems they are creating new knowledge and making progress.
There will still be enormous progress even as single-CPU speeds stall. And as I said earlier, we haven't really optimized for current hardware yet.
I think that's about it. CPUs haven't gotten notably faster. They've become more dense.
And 3D - well, it adds a whole new dimension …