Or when an old iPhone or Android is sitting in a cabinet gathering dust, when it could be used as a webcam, small home server, automation device, clock (I use an old iPad), etc.
I think so much compute power is being wasted and I'm not sure how to feel about that.
And when we purchase a new desktop workstation, equipped with even more staggering horsepower, we're totally OK with installing software that makes itself comfortable by consuming large fractions of this horsepower without any apparent need, all because the amazing hardware is able to mask the awfulness of the software.
E.g. I own a few high-powered workstations for fluid dynamics computations. If energy were free, I would just be running 'what-if' solves all the time for the hell of it. But because energy isn't free, they stay asleep unless I need them for what is deemed 'necessary' for research.
I have an M2 MacBook Air with the most RAM and storage that Apple would sell me. I'm sure its CPU sits at about 5% utilization almost the entire day. This was intentional on my part. I bought a laptop with enough "extra" capacity that it will be able to handle increased inefficiencies, bloat, and new features from ever-expanding software for the next several years.
So by buying one "over-specced" laptop now, I avoid buying two or three of them in the future. This is how I like to buy devices, especially since some components, like the screen or keyboard, are mostly fungible.
So many underpowered devices are produced that sit in drawers, collect dust, and end up in the landfill. For instance, Intel makes "Atom" devices that, if they don't die early because of design flaws, die early because they aren't really useful. There was a time when Intel was interested in having you buy a new computer because it was the best computer you ever had. Now they are interested in putting every vendor of parts out of business (except, seemingly, Synaptics) so they can get more of the BOM, even if this means the new computer you buy will be the worst computer you ever bought.
If there was some minimum standard of quality for devices, people wouldn't need to buy so many.
So I don't have a problem with, say, writing a script in Python even though it's not the most efficient use of my CPU, because I'm just looking to get things done. But when I have a problem that needs horsepower (and I know my 8-core 3.6 GHz machine can absolutely deliver it), I don't know how to tell it to do so. It doesn't help that many programmers' first thought would be to go for the cloud, when a single computer can be much faster than a bunch of AWS VMs.
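For what it's worth, Python's standard library can fan CPU-bound work out across all cores without touching the cloud. A minimal sketch, where `crunch` and its inputs are hypothetical stand-ins for whatever actually needs the horsepower:

```python
# Spread CPU-bound work across all cores with the standard library.
from concurrent.futures import ProcessPoolExecutor

def crunch(n: int) -> int:
    # Placeholder for real CPU-bound work.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [10_000_000] * 8  # one chunk of work per core
    # One worker process per core; processes sidestep the GIL for CPU-bound tasks.
    with ProcessPoolExecutor(max_workers=8) as pool:
        results = list(pool.map(crunch, inputs))
    print(results)
```

On an 8-core box this should keep all cores busy until the work runs out, which is about as close as a script gets to "telling it to deliver".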
Another example is the .kkrieger demo, which seems like wizardry when you've only seen similar things done in Unity/Unreal.
Consuming too much power is already a big ecological issue, and here in Europe it's becoming a financial problem too. I still want my computer to perform as well as possible, be it running a game, compiling some code, transcoding videos, or whatever, in that moment. But the rest of the time, it should consume as little power as possible. I've been looking into ways to upgrade, and most reviews spend 95% of their time/space on peak performance (basically torture tests) and peak power consumption (but more from a "what PSU/cooler do you need" perspective). Office-type work, which is 95% of what I actually use it for, is always an afterthought.
Back to your original question, I do understand where you're coming from. I'd love to use an old android phone as a webcam, it's just a shame that most don't work without a battery, and having a device like that permanently in a charging state will eventually lead to it ballooning.
No. My devices are rarely utilizing over 40% CPU and tend to last for a very long time. I know it isn't much but this helps keep some gear out of the landfill. I can always find a use for older hardware. This Linux PC I am writing this post on is coming up on 12 years old and is more than adequate for my purposes. I expect it to last at least another 10 years, though I may have to replace the spinning rust with an SSD at some point.
I recently bought my first smartphone despite not really needing one. I have mixed thoughts on how long this will last since I do not control the firmware yet. These devices are designed to be difficult to service. I will probably turn it into a glorified MP3 player, since it has a large battery, and just get the Caterpillar flip phone. I think the smartphone would make a great home audio entertainment system. I cannot find a logical reason to push it to the extreme.
My friends even tease me because in my hobby microcontroller-based electronics projects, I'll put in the wimpiest microcontroller that will do the job instead of just slapping an R-Pi or Arduino board in them. I defend myself on the basis of minimizing power requirements, but really, I just can't bear to see capabilities go to waste.
it doesn't make me feel bad when my boiler isn't burning diesel as fast as it can
or that my car is literally not even running right now
or that I am not cramming for certifications and new languages
or that my siblings are not working out or at college
or that my laptop is waiting for instructions from me instead of me waiting for it
But once the hardware is acquired, your goal should be to stretch its lifetime and spend the least energy on it, not the most.
As an example, it takes 22 sec to run a test suite on the M1; in our CI the same test suite takes 9 sec.
These might be due to software that has not been written yet.
To be honest, the computer seems like a high end office computer, so I think that use case seems spot on.
I also encourage everyone to be aggressive in putting things on the used market, or even giving it away for free (commonly in those local facebook "buy nothing" groups). For a lot of mass-produced things you can view the used market as a storage space. Sell/give it away one year, and if you need it the next year you'll probably be able to find someone else doing the same. On the flip side, avoid buying new stuff. Try to only buy used electronics, especially.
My hair clippers are unused 99.9% of the time.
There are benefits to having the ability to opportunistically burst to 100%, and some benefits aren't easily measurable in performance terms (having an up-to-date, secure MacBook).
We can find wasted potential in various places:
* the millions of people receiving poor education
* people working in jobs below their potential skillset
* galaxies with vast idle resources
* people spending time on logistics/bureaucracy
There's a world of opportunity out there for improvement.
On the other hand, I'm not super frugal to the point where I'm on an eight-year-old phone to never waste a CPU cycle or something.
For example, I've always had 'more computer' than I truly need. It hasn't turned out to be a bad thing though, because sometimes usage does surge. Maybe the situation calls for being able to build that thing really ridiculously quickly. Whatever.
I think 'opportunity cost' applies, and attempts at that calculus are beyond me.
Plus, if computing power isn't strictly necessary to achieve a task, but makes the experience smoother, is it really wasted? Nobody wants their web browser to feel like a PowerPoint presentation (which it kind of does on my Pinebook Pro, a $200 ARM laptop). Most people could daily it, but stuff running smoother is worth something.
After gifting technology to some family members, I realized that what matters most at the end of the day is that they enjoy using technology stress-free. You could get away with non-M1 MacBook Pros for presentations, but if using an M1 makes you feel good then you should use it.
If you're worried about e-waste, then pass down old technology. My sibling got a phone a lot younger than her peers because I love technology, and when I switched to a new model, I passed it down. She didn't care about the specs; like 99% of the population she doesn't think about compute power.
Compute power isn't a limited resource in the world. If someone has a need for compute power, they will probably find a way to use old technology on their own. That's not something you should feel bad or worry about. I don't.
People who buy new iPhones even once every 2 or 3 years kind of gross me out with that choice (lookin at you ATP). I have what I consider a phone that is nice enough for the money, and just don't consider anything beyond it to be a suitable use of money. Also don't push it to any limit, have a very standard set of minimally demanding activities I use it for.
I haven't had a car in 4 years. When I had one, it was easy enough to justify the stupid gas and insurance expenses. Since then, I'll periodically do the math on the ongoing and initial cost of getting even a used one, and I can't think of what I could do with it that would justify its expense and existence in my life.
A new GPU would be neat, but then I think "what game would be fun enough to warrant this purchase? what am I currently doing that would usefully push it" and the answer is nothing.
I use the heat overnight in winter.
There's no reason why distributed computing couldn't be made official, but there aren't all that many personal applications that could work across a public distributed system and tolerate long latencies and built-in unreliability.
It would be nice to see a massively redundant distributed web/cloud where every device was part of a global server system - as opposed to just a global system for accessing remote servers. But aside from the security issues, it would need a whole new set of protocols for handling extremely robust fault-tolerance.
That's a bit more complex than setting up a spare webcam. (I tried that with a spare iPhone once, but it ran too hot for extended use.)
It's okay to not use full capacity all the time because demand is varied and intermittent.
For old, discarded devices: it is the conjunction of Moore's law and planned obsolescence that makes this unsolvable on any large scale.
Suppose that Android phones were modular: not Project-Ara-modular, but almost custom-built-PC-modular (plus long-term software updates). Would you replace one that fast in the first place?
For new devices: the waste/potential is rather at the "software level", not the compute level. These are two very different things. It is not the CPU meter that matters.
And you know what, 20 years ago I was hopeful that a lot of people in 2020 would be using their computers more creatively. I was naive, of course: if you are not a curious person to start with, computers will rarely make you one.
Obviously the machine is a tool, to be used for what you need it for. Just because you have genitalia capable of making a thousand children doesn't mean that you need a thousand children. Besides, children are expensive to have, and so is electricity. Batteries don't grow on trees either.
But it also now annoys me to see people under-buying... people cheap out to save a few hundred bucks or a grand on something they are going to use more than most things they own. Like, yes, drop that cash and max out the storage on that laptop! Sure, it seems unnecessary now, but those few extra terabytes available in a couple of years are worth so much more in time savings later than the money now, when we're talking hardware you can't easily upgrade later... which is everything but PCs and servers.
The argument was, basically, you can't save cycles. You get so many per second, whether or not you use them. And not using a machine you dropped mid-six figures for seems like a terrible business waste.
Vastly cheaper machines, ecological crises, battery-powered devices, and security eclipsing performance in importance change that analysis a lot.
I hope to keep my Pixel 5 as long as possible. One thing I'm trying to do is to only use stock apps or lightweight FOSS apps where possible, since most "big" commercial apps seem to get slower/more resource intensive with time cough Discord cough.
My mom used to do this until I gifted her my max-spec Surface Pro a few years ago. She's probably saved more money by skipping semi-disposable laptops than buying it new would have cost.
And I guess I get annoyed that anyone would actually be advocating for that garbage hardware...
Buy that fancy MacBook, high-end phone, or shiny PC - buy quality. Buy longevity. Then use it as long as possible. It's better to underuse a powerful device that lasts 8-10 years than to buy a new low-end device every couple of years.
I don't feel the same way about personal devices. As an extreme example, I'm happy to have a gas mask and a microcontroller powered Geiger counter in my storage room for emergency purposes and never use them ever.
Your managers might be perfectly right to use a MBP because they get the best display for checking emails and browsing the Internet all day in a thin, solid and secure laptop.
Not all computer needs are measured in a CPU benchmark!
I buy new gen GPUs not because I need all of the compute, but because I only want to buy a new GPU every 5 years or so.
(Even if you use resistive heating to heat your living space, it's not necessarily a win to make a computer convert that electricity into heat and computation results. They're 100% efficient, but heat pumps are MORE than 100% efficient. So unless you really want the compute results, it's a waste of electricity.)
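A back-of-the-envelope version of that comparison (the COP of roughly 3 is my assumption of a typical heat pump, not a measured figure):

```python
# Heat delivered per kWh of electricity, assuming COP ~= 3 for the heat pump.
electricity_kwh = 1.0

computer_or_resistive = electricity_kwh * 1.0  # COP = 1: all electricity becomes heat
heat_pump = electricity_kwh * 3.0              # COP ~= 3: moves ~3 kWh of heat indoors

print(f"computer/resistive heater: {computer_or_resistive:.1f} kWh of heat")
print(f"heat pump:                 {heat_pump:.1f} kWh of heat")
```

So every kWh burned on compute-as-heat delivers roughly a third of the warmth a heat pump would, which is why the compute results have to be worth something on their own.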
Another angle that people always seem confused about is latency versus throughput. When you decide to look up cat pictures, your computer needs to perform billions of calculations in a very short period of time. That means that chips have to be able to do a lot of work in a short interval of time, or nobody will want them. A higher utilization could be achieved by building a machine that is less powerful, but spreads the cat picture rendering calculations out over time; if you want to see 8 cat pictures a day, and 100 compute units are required to display a cat picture, then you only need a CPU that can process 0.01 compute units per second. But, then the human is idle for 10,000 seconds in the time between wanting to see a cat picture and seeing the cat picture, which is annoying to the humans. So the chips have to be "overpowered" to be able to display cat pictures in a timeframe that is non-annoying to humans.
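To make the arithmetic concrete, here is the cat-picture example worked through (the "compute unit" figures are the hypothetical ones from the paragraph above):

```python
# Numbers from the cat-picture example above; "compute units" are hypothetical.
UNITS_PER_PICTURE = 100
PICTURES_PER_DAY = 8
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

# Throughput sized for *average* demand only:
avg_rate = PICTURES_PER_DAY * UNITS_PER_PICTURE / SECONDS_PER_DAY
print(f"average demand: {avg_rate:.4f} units/sec")  # ~0.01

# Latency on that minimal CPU: how long one picture takes on demand.
latency = UNITS_PER_PICTURE / avg_rate
print(f"latency per picture: {latency:.0f} sec (~{latency / 3600:.1f} h)")
```

A chip sized for average throughput makes you wait hours per picture; sizing for latency is exactly what makes it look "overpowered" the rest of the day.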
TL;DR: don't feel sad for engraved chunks of sand.
Now, there is something satisfying about optimizing everything (Factorio), especially in a group setting.
But I learned to live and let live.
Also, the M1/M2 are more about efficiency than they are about raw performance.
Think about that and weep.