Any voltage surge will damage nearly all internal components.
Modern devices almost universally have a switched-mode power supply (cheaper and better). They effectively isolate the low-voltage internal components. A brief voltage surge impacts only the diodes and capacitor on the input of the flyback supply. It's far easier to design those to be resilient to voltage spikes.
I wouldn't bother with a surge protector for any device made after 2004. They will be fine from all but a direct hit, and even if they get a direct hit, probably replacing a single 5 cent capacitor will fix them.
A surge is a sudden boost in excess of normal. A direct hit from lightning, for example, will fry your stuff. In my area the risk of a direct hit is once every several years.
More common in my area is the opposite problem: low supply. I live in one of the fastest growing areas of the US; this segment of the city has been in the top 5 fastest growing areas every year for the past 20 years. As a result we get brownouts more often than we'd like, since regular demand can exceed regular capacity.
Electricity is measured as power (wattage), which is voltage (electrical "pressure") times current (amperage, the rate of flow), but it's delivered at a rated voltage. In the US the nominal standard is 120 volts AC (commonly called 110), while most of the world uses 220-240 volts AC. If you have dirty power, the voltage you actually receive can sag below what your line is rated for.
Low supply (low voltage) means the current through your electronic devices increases in order to maintain the wattage the device is designed for. Increased current means increased heat generation, which shortens the life of the affected electronics. A battery backup is beneficial because if the supply sags far enough, the unit switches to its local battery, which ensures a more consistent supply of juice.
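To put rough numbers on that, here's a sketch using P = V × I for a constant-power load (the 500 W load and the 100 V sag are assumed example figures, not from any particular device):

```python
# Rough illustration: a constant-power device pulls more current
# when the supply voltage sags. All figures are assumed examples.

def current_draw(watts: float, volts: float) -> float:
    """Current (amps) a constant-power load pulls at a given voltage."""
    return watts / volts

LOAD_W = 500  # assumed example: a 500 W device

nominal = current_draw(LOAD_W, 120)   # at the US nominal 120 V
brownout = current_draw(LOAD_W, 100)  # during an assumed sag to 100 V

print(f"nominal:  {nominal:.2f} A")   # ~4.17 A
print(f"brownout: {brownout:.2f} A")  # 5.00 A
print(f"increase: {(brownout / nominal - 1) * 100:.0f}%")  # 20% more current
```

Since the extra heat in wiring and components scales with the square of the current (I²R), even a modest 20% current increase means noticeably more heating.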
Back then computer ownership was still proliferating, and computers were more expensive and always plugged in, so there was more preoccupation with protecting against an unknown.
Most people knew someone who'd had lightning strike their TV antenna, so they weren't protecting against a totally phantom risk. Many people also lived in older houses where wiring work wasn't to the standard of a home built in the last couple decades (I still live in such a home.)
I don't remember many people researching surge protector ratings in great detail, though, just being sure not to get a plain power strip. Some did know a lot.
I think the other big difference is that stuff was less portable back then. Computers, hi-fi, and consumer electronics were plugged into a fixed location and stayed there for the lifetime of the product. That's not the case anymore, so specialized surge protection equipment often doesn't make sense. You're not taking the surge protector to bed with your laptop. Or, you have an iPad.
1. The need for more outlets. 2. The belief this provided some protection.
Older homes had way fewer outlets than today's code requires, and it wasn't unusual to be short on plugs. Ungrounded outlets were also common. Enter the 6-outlet surge protector.
I lost a cable modem and router once from a surge through the cable line. So I now have a surge protector on the incoming internet cable line too.
https://www.grandviewresearch.com/industry-analysis/surge-pr...
It's not a "surge protector". It's a "powerstrip".
The key feature is that it's an extension cord with six outlets on it. It only has a power switch and circuit breaker because everybody else's model has those things, and they only add a penny to the cost.