HACKER Q&A
📣 moorequestion

Why aren't we more scared of Moore's Law ending/slowing down?


So much of our progress is currently dependent on computers getting faster to do more fantastical things. Why isn't everyone talking about our slow progress in increasing processor speed as a big concern? Or is it actually not that big a deal?


  👤 hyperman1 Accepted Answer ✓
'What hardware giveth, software taketh away'

Suppose we wrote our software in a compiled language again, built on the libraries delivered with the OS, did some basic algorithmic optimization again, and kept memory usage in check. A lot of the power gained has been wasted, mainly because it was cheaper for organizations to drop low-quality crap on users and make them upgrade.

If the Moore-party really is over, I expect organizations to start optimizing again, up to the point where performance is acceptable.
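
A toy Python sketch of the kind of basic algorithmic optimization I mean (the sizes are arbitrary; the point is the complexity class, not the exact timings):

    import time

    words = [str(i) for i in range(10_000)]
    queries = [str(i) for i in range(0, 20_000, 2)]

    # O(n) scan per query: the cheap-to-write version
    t0 = time.perf_counter()
    hits_list = sum(1 for q in queries if q in words)
    t_list = time.perf_counter() - t0

    # Same answer with a set: O(1) average per lookup
    word_set = set(words)
    t0 = time.perf_counter()
    hits_set = sum(1 for q in queries if q in word_set)
    t_set = time.perf_counter() - t0

    assert hits_list == hits_set
    print(f"list scan: {t_list:.3f}s, set lookup: {t_set:.3f}s")

One line of restructuring buys back orders of magnitude, and that kind of win is sitting everywhere in today's software.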

I see some parallels with plastic: both plastics and software are basically miracle materials, cheap ways to make almost anything in huge quantities. The result is hugely wasteful practices in both cases.


👤 ActorNightly
I don't think Moore's Law applies anymore, because it's based around general-purpose CPUs - the idea that the faster the CPU is, the more compute you can do on it, which in turn implies you can solve harder problems. But that largely assumes sequential execution.

Going forward, it seems the technological solutions that get applied to real problems need clever parallelism more than they need massive sequential compute.

I.e., take something like running ML inference. Instead of running it on a CPU, you can build semi-specialized hardware that runs at a fraction of the power for the same unit of compute, because all the memory locations are deterministic and there is far less data shuffling: no circuitry to decode/reorder instructions, no branch prediction, etc.

And then you can scale that hardware up as much as you need to give you width of compute, at the cost of additional power draw.

So there really is no need to develop faster and faster chips, especially given that anything that functionally improves your day-to-day life (robotic servants, for example) can run in the MHz range with extreme accuracy, even kHz depending on the task.
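
A back-of-the-envelope sketch of that trade-off in Python (every figure below is made up for illustration, not a real chip spec):

    # Throughput = parallel lanes x clock rate, so a slow, wide,
    # specialized part can outrun a fast, narrow, general-purpose one.
    cpu_clock_hz   = 5e9     # fast general-purpose core, assume 1 MAC/cycle
    cpu_lanes      = 1
    accel_clock_hz = 100e6   # "slow" MHz-range specialized part
    accel_lanes    = 1024    # fixed-function multiply-accumulate units

    cpu_macs   = cpu_clock_hz * cpu_lanes        # 5e9 MAC/s
    accel_macs = accel_clock_hz * accel_lanes    # ~1e11 MAC/s

    print(f"accelerator advantage: {accel_macs / cpu_macs:.0f}x")  # ~20x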


👤 mikewarot
I learned to program on a 1 MHz CPU with 64K of RAM and no storage. It's really hard for me not to be continually amazed at this laptop with its 2 GHz 4-core CPU, 8 gigabytes of RAM, 250 gigabytes of effectively 0 ms seek disk, and 100 megabit WiFi that connects to the world.

We're so hardware-rich it's amazing. When I was in college, a Cray-1 was the ultimate in computing... 80 million FLOPS sustained. The supercomputer in my pocket (a $200 Motorola smartphone) can easily beat that, and it doesn't require three-phase power, raised flooring, or industrial cooling.
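
Back-of-the-envelope (the phone figure is my rough assumption of an order of magnitude, not a benchmark):

    cray_1_flops = 80e6   # sustained, as above
    phone_flops = 10e9    # assumed: a modest modern phone CPU
    print(f"~{phone_flops / cray_1_flops:.0f}x a Cray-1")  # -> ~125x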

You kids don't seem to realize just how good you've got it.


👤 DeadMouseFive
Because the capacity hardware has right now is so insane, we haven't even begun to scratch the surface of the possible design space. Time to optimize that assembly again.

👤 mbrodersen
99.9999% of computers are being used for watching videos and texting. That is already a solved problem. So there is nothing to worry about ;)

👤 FunnyBadger
Most people don't really understand what Moore's Law is or what kept it going, whether technologically, economically, or socially. They are happy to enjoy the benefits, but plenty of people think iPhones simply appear at stores, ignoring all the supply chain details, etc.

After all, why did we ignore the economics of fiat currency, money printing, and budget deficits focused on demand creation while ignoring supply creation the entire time (which includes favoring domestic production over outsourcing)? We saw with COVID, and are seeing today, what the price is of ignoring supply and focusing only on demand. The simplest reason: supply infrastructure takes intense effort over time spans far beyond the next quarter - it takes 20 years to go from "Lab to Fab" for any technology. That's too much effort when you can just print some more money to goose GDP and profits.


👤 ThrowawayR2
The average developer never cared about performance in the first place since there's no incentive for it; RAM and CPU are paid for by other people. As long as their bloated server code fits within their department's large AWS budget or their bloated Electron app uses less than 4 GB of RAM, everything is A-OK to them.

👤 quantified
People do talk about it; it just isn't a big deal to most people, so most don't freak out about it. Processor and system architectures are also evolving, and faster chips that are more prone to errors aren't necessarily the answer. Solving actual problems is more interesting than doing fantastical things.

👤 wmf
Arguably there's been a larger slowdown in progress/innovation in all areas (energy, transportation, healthcare, etc.) and people haven't noticed/complained so the end of Moore's Law is just one more.

👤 tdehnel
Because it’s very unlikely that we know everything there is to know about making computers work faster.

It’s the same reason people like Malthus are typically wrong. He thought the world would be out of food by this point.

But he never predicted (nor could he have predicted) the advent of synthetic fertilizer and modern growing techniques, which vastly boosted efficiency in food production.

People like this think the future will be like the past. But the future is never like the past because people solve problems.

As long as people are solving problems they are creating new knowledge and making progress.


👤 mikewarot
Progress is not dependent on CPU speed anymore. The von Neumann model of serial computation has peaked, true enough, but we've only reached a local maximum. Distributed processing is still in its infancy.

There will still be enormous progress even as single-CPU speeds stall. That said, as I noted earlier, we haven't really optimized for current hardware yet.
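
A minimal sketch of how low the parallelism fruit still hangs, using only Python's standard library (busy_work is a stand-in for any CPU-bound task):

    from multiprocessing import Pool

    def busy_work(n):
        # stand-in for a CPU-bound task
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        jobs = [2_000_000] * 8

        # serial: one core does all the work
        serial = [busy_work(n) for n in jobs]

        # the same jobs spread across local cores; the same idea
        # scales out to many machines
        with Pool() as pool:
            parallel = pool.map(busy_work, jobs)

        assert serial == parallel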


👤 bravetraveler
We'll have to get back to a world where developers actively mind how their code comes out of the compiler.

I think that's about it. CPUs haven't gotten notably faster. They've become more dense.
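
For a compiled language that means reading the assembly; as a rough stand-in, Python's dis module shows what its bytecode compiler emits for a function:

    import dis

    def add_tax(price, rate=0.21):
        # add_tax is just a throwaway example function
        return price * (1 + rate)

    # One instruction per line: the closest Python gets to
    # "how the code came out of the compiler"
    dis.dis(add_tax)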


👤 eimrine
Not a big deal because top chips will go to miners.

👤 throwaway4good
Moore's Law is not slowing down / ending.

And 3D - well, it adds a whole new dimension …