HACKER Q&A
📣 stefanos82

How many CPU / GPU cycles are wasted unnecessarily?


For years I have been asking myself this question: how much code actually wastes CPU / GPU cycles, even assuming optimization flags are set to their highest level?

Is there actually such a thing even with those flags set, or not?

I'm really curious.


  👤 noselasd Accepted Answer ✓
How do you define wasted CPU cycles? The compiler wrongly emitting redundant code? The programmer doing something particularly silly? Python running code that would be 10x as efficient if rewritten in C?
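To illustrate that last category, here's a rough sketch (exact ratios vary wildly by machine and Python version): even without leaving Python, a hand-written loop pays per-iteration interpreter dispatch overhead that the C-implemented built-in `sum` avoids, so both "waste" and "savings" depend on where you draw the baseline.

```python
import timeit

def manual_sum(values):
    # Pure-Python loop: every iteration goes through interpreter
    # bytecode dispatch, so the same arithmetic costs far more cycles.
    total = 0
    for v in values:
        total += v
    return total

data = list(range(100_000))

loop_time = timeit.timeit(lambda: manual_sum(data), number=50)
builtin_time = timeit.timeit(lambda: sum(data), number=50)  # C-implemented

print(f"manual loop: {loop_time:.4f}s  built-in sum: {builtin_time:.4f}s")
```

On a typical CPython install the built-in wins by several times, yet neither version is "wrong" — which is exactly why "wasted cycles" needs a definition before it can be measured.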