Of course, no technology is guaranteed to continue improving forever. The sustained performance gains [1] reflect Apple's tremendous dedication and investment in the endeavor.
After the A11 chip, though, it became clear there was a non-zero probability that Apple could one day meet or surpass Intel.
Which raises the question: what underestimated technologies exist today that could overtake incumbent technologies within 5 years? Put another way: which technologies are at the A11 stage and not yet widely recognized for their potential?
[1] https://images.anandtech.com/doci/16226/perf-trajectory.png
People often pigeonhole Lidar as an "Augmented Reality enhancement," which massively underestimates what it is capable of. Lidar isn't new tech, but it has become cheaper and smaller, and, more importantly, the compute needed to process it has become massively more available (e.g. under-used machine learning chips in modern smartphones).
Lidar for mapping objects/rooms/buildings/etc. Lidar for mapping people/yourself.
For example, that custom countertop? No need for an in-person visit: just use Lidar to scan the space via our app, and we'll have it delivered in 4-6 weeks.
You want a custom-tailored suit? Just scan your whole body using Lidar, and we will custom-make it to those measurements abroad and have it delivered.
Got into a minor vehicle accident? Just scan the damaged section, and we'll be able to quote the exact cost of repairs.
Lidar is a new core, low-level sensor. It looks like a gimmick today because of the chicken-and-egg problem, but we'll see it eventually become completely standard and expected for normal life stuff. Boring stuff. AR isn't what it's about; that's just an easier sales pitch.
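To make the "scan it and measure it" pitch concrete, here is a minimal sketch (my own illustration, not any shipping SDK) that turns a scan into rough dimensions, assuming the point cloud has already been extracted as an N x 3 numpy array of meters; a real pipeline would do registration and denoising first.

    # Toy sketch: turning a Lidar scan into rough measurements.
    # Assumes the scan is already an N x 3 numpy array of (x, y, z)
    # points in meters; real pipelines would clean and register the
    # cloud first (e.g. with a library like Open3D).
    import numpy as np

    def rough_dimensions(points):
        """Axis-aligned bounding-box dimensions of a point cloud."""
        extent = points.max(axis=0) - points.min(axis=0)
        return extent  # (width, depth, height) in meters

    # Fake "countertop" scan: 10,000 noisy points on a 2.4m x 0.6m slab.
    rng = np.random.default_rng(0)
    slab = rng.uniform([0.0, 0.0, 0.89], [2.4, 0.6, 0.93], (10_000, 3))
    w, d, h = rough_dimensions(slab)
    print(f"countertop: {w:.2f}m x {d:.2f}m, ~{h*100:.0f}cm thick")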
Reconfigurable computing, such as FPGAs and more homogeneous computing fabrics, offers a way around the von Neumann bottleneck.
Reactive/dataflow programming, of the kind Verilog embodies, offers a way to utilize FPGAs and multiprocessing in systems that take full advantage of the available hardware, and route around Amdahl's law.
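For intuition on what "routing around Amdahl's law" is up against, here is the law itself as a few lines of illustrative Python (parallel fractions picked arbitrarily for the demo):

    # Amdahl's law: with parallel fraction p and n processing elements,
    # speedup = 1 / ((1 - p) + p / n). The serial fraction (1 - p)
    # caps the speedup no matter how much hardware you throw at it.
    def amdahl_speedup(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    for p in (0.50, 0.90, 0.99):
        row = "  ".join(f"n={n}: {amdahl_speedup(p, n):6.1f}x"
                        for n in (2, 16, 1024))
        print(f"p={p:.2f}  {row}")
    # Even at n=1024, p=0.90 tops out near 10x -- hence the appeal of
    # fabrics that shrink the serial coordination itself.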
Fully homomorphic encryption would be revolutionary, but the performance isn't acceptable at the moment. As soon as it becomes acceptable, it will be adopted swiftly in many domains.
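To make "homomorphic" concrete, here is a toy of the *partially* homomorphic case, using the textbook fact that unpadded RSA is multiplicatively homomorphic. Deliberately insecure (tiny primes, no padding), illustration only; real FHE schemes (BGV, CKKS, TFHE, ...) also support addition and are vastly more involved.

    # Toy partially homomorphic encryption: textbook (unpadded) RSA
    # satisfies Enc(a) * Enc(b) = Enc(a*b) mod n, so a server can
    # multiply values it cannot read. Wildly insecure as written.
    p, q = 61, 53
    n = p * q                    # 3233
    e, d = 17, 2753              # e*d = 1 (mod lcm(p-1, q-1))

    def enc(m): return pow(m, e, n)
    def dec(c): return pow(c, d, n)

    a, b = 7, 6
    c = (enc(a) * enc(b)) % n    # the server multiplies ciphertexts...
    print(dec(c))                # ...and decryption yields 42: a*b was
                                 # computed without ever exposing a or b.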
Until FHE gets there, fully encrypted memory will help prevent a lot of future attacks in the vein of Spectre and Meltdown (Heartbleed and ShellShock were software bugs, which encrypted memory wouldn't touch). Intel and AMD both seem dedicated to implementing it.
ECC memory would need to become universal to prevent RowHammer-style vulnerabilities. I would love to see that take off and be included as a mandatory part of a future DDR6. No prediction as to whether that will actually happen; the market segment that cares the most has already had it since forever.
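For a flavor of what ECC actually does, here is a toy Hamming(7,4) single-error-correcting code, a simplified cousin of the SECDED codes real ECC DIMMs use over 64-bit words (my own sketch, not any DIMM's actual layout):

    # Toy Hamming(7,4): 4 data bits plus 3 parity bits, so any single
    # flipped bit can be located and corrected.
    def encode(d3, d5, d6, d7):
        p1 = d3 ^ d5 ^ d7
        p2 = d3 ^ d6 ^ d7
        p4 = d5 ^ d6 ^ d7
        return [p1, p2, d3, p4, d5, d6, d7]       # positions 1..7

    def correct(c):
        c1, c2, c3, c4, c5, c6, c7 = c
        # The syndrome spells out the 1-based position of the bad bit.
        s = (c1^c3^c5^c7) | ((c2^c3^c6^c7) << 1) | ((c4^c5^c6^c7) << 2)
        if s:
            c[s - 1] ^= 1
        return [c[2], c[4], c[5], c[6]]           # data bits d3,d5,d6,d7

    word = encode(1, 0, 1, 1)
    word[4] ^= 1                  # simulate a RowHammer-style bit flip
    assert correct(word) == [1, 0, 1, 1]          # detected and fixed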
I played with survey-grade equipment for a day and it's crazy. Being able to repeatably measure positions and distances down to millimeters feels like magic. Imagine a VR headset that doesn't need trackers, a vacuum cleaner without camera/lidar, and so much more we can't even imagine without access to the technology. It's probably more than five years out though.
So having said that: machine learning.
It’s basically been proven that throwing more compute and data at a problem gives predictably better results. Getting to GPT10 is just a matter of money at this point. This straightforward scaling story leads inevitably to more resources spent on more compute (and different kinds of specialized computation), which simultaneously drives down prices and pushes performance forward.
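The published scaling results have a power-law shape; here is a toy curve with made-up constants (not fitted to any real model) just to show the shape of the bet:

    # Toy scaling curve: loss ~ a * C**(-alpha), the power-law shape
    # reported for language models. Constants are invented for
    # illustration, not fitted to any real system.
    a, alpha = 10.0, 0.05

    def loss(compute):
        return a * compute ** -alpha

    for c in (1e3, 1e6, 1e9, 1e12):      # relative compute budgets
        print(f"compute {c:8.0e} -> loss {loss(c):.2f}")
    # Every 1000x of compute buys a similar *relative* improvement:
    # predictable returns, at exponentially growing cost.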
Siri is the Altavista of AI. When the Google of AI appears in the next 5 years, it will rock our world harder than anyone expects. No one expected Google, even though plenty were hyping the internet. The same thing is happening with ML today.
For context: right now, I'm in Algiers, Algeria, with a 4Mbps ADSL connection that costs around $30/month for home usage, recently down to $15/month. The 2Mbps business plan is $77/month. Now, I've tried numerous times to upgrade this.
Fiber was priced at around $190/month for home usage, with many people reporting that upload is capped at 1Mbps. Yes, that's one megabit, not megabyte, per second. Even then, I was willing to get it.
20Mbps for business is... $505/month. It's cheaper for home use at... $77/month recently for 100Mbps, a price drop from $190/month a month ago... if you can get the company to install FTTH, and if you live somewhere 'with DSLAM', as they say. They can't serve downtown.
ADSL and fiber are only available through the state-owned company, of course, which also offers WiMAX and 4G. Other carriers have 4G too. It doesn't work where it matters, and it's capped. There used to be competitors in ADSL and WiMAX; they ran into trouble with the regulatory authority, which oversees everything, and were shut down.
I once needed someone remote to be on a video call with a physician. The ADSL went down. I sent airtime/4G credit to 8 numbers belonging to people who were present so they could have "internet" and be able to make the call. Those numbers were spread across the three different carriers. No internet.
Starlink will be huge: it bypasses state monopolies and regulating bodies. If authorities try to restrict imports, people will pay off customs agents to get receivers in, and a black market will spring up.
I hope that we see either an existing fast memory technology become cheap, or a new technology arise that will bridge this gap.
I think modern-day CPUs and GPUs remain underappreciated and not used to their full potential because of how hard it is to program around slow memory…
Seriously, we don't need esoteric new computer architectures: just give us cheap, fast memory in a dumb flat memory hierarchy that is easy to program for.
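A quick sketch of the pain being described (timings are machine-dependent; the gap is the point, not the absolute numbers): the same arithmetic over the same data, fast when the access pattern is cache-friendly and slow when it isn't.

    # Same 16M additions, two access patterns. The random gather blows
    # through the caches; the sequential one streams from memory.
    import time
    import numpy as np

    n = 1 << 24                           # 16M doubles, ~128 MB
    data = np.random.rand(n)
    seq = np.arange(n)                    # sequential indices
    rnd = np.random.permutation(n)        # same indices, shuffled

    def timed(idx):
        t0 = time.perf_counter()
        s = data[idx].sum()
        return s, time.perf_counter() - t0

    s1, t_seq = timed(seq)
    s2, t_rnd = timed(rnd)
    print(f"sequential: {t_seq:.3f}s   random: {t_rnd:.3f}s")
    # Identical work; only the memory access pattern differs.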
2. GNU/Linux smartphones (which currently cannot be daily drivers for most people).