HACKER Q&A
📣 xchaotic

What's the next big thing in computing / programming?


I managed to surf some of the software hype waves; some, like "AI", have not really benefited me in a major way, and some, like crypto, I now deliberately ignore. What do you think is the next big thing in programming or computers/the Internet in general? I'm thinking low-code or no-code tools that are actually composable by programmers, perhaps? How about something in the physical space - are we going to see more types of wearables, for example? Basically, I'm looking for ideas on what to focus my learning on in 2022.


  👤 skadamat Accepted Answer ✓
Real personal computing, that benefits end-users on their terms. In the Future of Coding community, we often use the analogy of "home cooking". As in, we have restaurants & professional chefs but not enough home cooks in computing. Tools & processes are super different when you're cooking for yourself / family vs for a restaurant!

There are a few trends that are tiptoeing on the edges here:

- tools for thought / creative tools (Muse, Nodes.io, Mem, etc)

- end-user programming (people trying everything from visual PL's to simpler / more constrained programming interfaces)

- local-first software (https://www.inkandswitch.com/local-first/)

- web3 - I'm biased and a bit skeptical of the fervor, but it matches some of the ethos / philosophy

- more customizable / hackable personal computing hardware (https://frame.work/)

- spatial + tactile computing so the body can participate in the thinking process (instead of only the 'head'): https://dynamicland.org/

- explorables, games / interactive viz to quickly transfer rich context: https://explorabl.es/

- video games, often indie ones, are exploring deep ideas: see Zachtronics games, Jonathan Blow's games, some of Annapurna Interactive funded games, etc. So many!

- open data: better end-user data tools so anyone can understand the systems around them


👤 gorgoiler
The granularity of computing seems to become finer by the day, and with it comes ever faster and more fluid tooling and creativity.

What would once be a metal machine onto which you load executable code has become a thing that you can run virtually. Containers, serverless lambdas, dynamic language runtimes. Everything becomes faster, more fungible (a zeitgeist word but I mean it in the old way) and easier to work with.

We’ve known for a long time that software will eat the world — the comings and goings of everyday life, once they are expressed as code, become infinitely easier to patch and improve using only a keyboard and display.

What’s new and ongoing is that software is eating itself. Another comment here expresses this as code becoming a data structure but that’s too pure for me. The practical improvements we’ll see don’t need to go that far and won’t be as zealous but we’ll reap the same rewards.

I’m personally looking forward to the next generations of infrastructure as code, where all the useful parts are factored out of the current monolithic ecosystems and become as composable as actual code is. It feels like, to draw an analogy with version control systems, we are at that 2005 boundary of CVS to SVN, and git is still around the corner.


👤 bawolff
I don't know, but I'm pretty sure it's going to be nothing that's listed here.

Everything listed here is stuff at the high end of the hype cycle. Everyone is already looking at it. The next big thing is going to be something nobody is looking at. If people are already looking, it would have already happened.


👤 nicoburns
I'm not sure if it will end up being the case, but I would like to hope that open firmware and/or hardware will become a big thing in the next decade. Open source software has been such a game changer, and it would be amazing if we could realise the same benefits in the embedded/hardware sphere.

👤 throwaway24124
Not personally excited for this, but I think the next big thing will be the internet and computers becoming "hyper-transactional" through crypto. I believe this is what companies like Facebook are really pushing toward when they talk about Web 3.0 and the metaverse. It will still, for the most part, be the same internet accessed through phones and computers, but videogame-style micro-transactions powered by crypto will become part of most websites to access content. Since every site having micro-transactions would probably piss off a lot of the general population, I think there will also be an explosion of new ways to earn very small amounts of crypto online, by doing mechanical turk style tasks, playing a new game, creating a bit of content for someone, etc. Average consumers will earn these small amounts of crypto, then immediately spend them on the content they want.

👤 Delphiza
Anything in sustainability.

Climate change and related issues are driving big changes that large corporates are already having to address regardless of impending legislation.

Energy efficiency, waste management, sustainable computing, carbon markets, regulatory compliance and reporting (ISO50001), smart grids, EV logistics... there are so many areas.

Anything that you choose to focus on next year will be immensely valuable and important when the demand suddenly picks up before the end of the decade. You can build on your existing technical foundations.


👤 jakobnissen
I think people will soon realize that code is a data structure, not text.

Like a hash map or an array, its text representation is merely a human-readable interface to it. I predict the capabilities of IDEs will expand, and to make their state persistent, they will begin to store IDE-specific files in your source code directory. Eventually this will morph into the source code itself being stored in binary format and opened/searched by editors.

The Unity game engine is pioneering in this domain, but it is too far ahead of its time. The transition to code-as-data will be gradual.
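Mainstream languages already hint at this: in Python, the standard library exposes the tree behind the text, so tooling can query code structurally rather than grep it. A minimal sketch:

```python
import ast

source = """
def greet(name):
    return f"hello, {name}"

def farewell(name):
    return f"bye, {name}"
"""

# Parse the text into a tree -- the "real" structure behind the characters.
tree = ast.parse(source)

# Query the tree directly instead of string-matching.
functions = [node.name for node in ast.walk(tree)
             if isinstance(node, ast.FunctionDef)]
print(functions)  # ['greet', 'farewell']

# The tree can be rendered back to (normalized) text on demand,
# which is the sense in which text is just one interface to the code.
print(ast.unparse(tree))
```

An IDE that persisted this tree, rather than the characters, would be a step toward the binary-source future described above.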


👤 eitland
Deno is the thing I think I can realistically hope for as a technological breakthrough in 2022.

If it breaks through, it can become huge. Not because of TypeScript as a native language. Not because it can create binaries. Not because it has an audited standard library or any other good reason.

But because it includes a simple but flexible way to prevent itself from doing things the author didn't want it to.

If we are extremely lucky other platforms will pick this up.


👤 conor_f
I think that decentralization is coming in a big way. The myriad scandals and corruption surrounding the largest tech companies are among the push forces, while platforms that support self-hosting with little technical expertise, and products encouraging the same, are the pull forces.

A small bit of confirmation for this in my mind is the increase in companies switching to a business model that acknowledges this: "We will host it for you for a fee, but we fully support self-hosting too," à la GitLab and Ghost (https://ghost.org/docs/hosting/).


👤 thom
Better, simpler data flow frameworks. Represent all your business's logic declaratively, or at least succinctly. Connect up all the inputs, batch or streaming. The platform automagically maintains your data, always consistent and up-to-date. Apps are just views on top of this that trigger more events later. Find a bug in some of your logic? Fix it and the entire dataset refreshes, with policies on how you want to deal with external integrations that trigger off the back of that. This is what a lot of enterprise data infrastructure already looks like, but cobbled together and slow and buggy. I don't think adding no/low code on top of that really moves the needle. Make it painless and joyful from beginning to end.
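As a toy illustration of that "fix the logic and the dataset refreshes" property, here is a hypothetical, minimal pull-based dataflow sketch in Python (the `Dataflow` class and its method names are invented for illustration; real systems such as Materialize or Differential Dataflow maintain these views incrementally and at scale):

```python
# Derived values are declared as pure functions of named inputs; every
# read recomputes downstream, so changing a rule refreshes the "dataset".

class Dataflow:
    def __init__(self):
        self.inputs = {}
        self.rules = {}   # name -> (fn, dependency names)

    def set(self, name, value):
        self.inputs[name] = value

    def derive(self, name, deps, fn):
        self.rules[name] = (fn, deps)

    def get(self, name):
        # Recompute on demand, so a fixed rule is reflected immediately.
        if name in self.inputs:
            return self.inputs[name]
        fn, deps = self.rules[name]
        return fn(*(self.get(d) for d in deps))

flow = Dataflow()
flow.set("orders", [100, 250, 40])
flow.derive("total", ["orders"], sum)
flow.derive("large_orders", ["orders"],
            lambda orders: [o for o in orders if o >= 100])

print(flow.get("total"))         # 390
print(flow.get("large_orders"))  # [100, 250]

# "Find a bug in some of your logic? Fix it" -- the derived view
# refreshes on the next read, no manual backfill.
flow.derive("large_orders", ["orders"],
            lambda orders: [o for o in orders if o > 200])
print(flow.get("large_orders"))  # [250]
```

The hard part the comment alludes to is doing this incrementally, consistently, and with policies for external side effects; this sketch only shows the programming model.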

If you're thinking of wearables, I personally don't think there's anything interesting there until someone can work out a decent high-bandwidth way of communicating with your device. Gimme something like subvocalisation that doesn't require me to wear something stupid or talk out loud.


👤 jeppesen-io
ARM will continue to make inroads in the server space as adoption picks up and as more and more projects target that architecture

Personal bias talking, but I'd expect to see a modest uptick in Nix[OS] interest

Kubernetes will transition from new and exciting to a mature platform. Expect fewer big new features and more minor improvements


👤 r_hoods_ghost
If Musk gets Starship to work and is successful in driving down the costs of launching material into LEO by another couple of orders of magnitude as intended, then you can expect to see an explosion in the number of cheap and cheerful cubesats, not to mention experimental LEO factories and general weird stuff floating around in orbit. There will be a massive need for people to write control code and to process the data coming in. Unlike in the past, the emphasis will be on knocking out "good enough" code, rather than something written to NASA standards, because the hardware being launched will be so cheap and the launch costs even cheaper. It is hard to get a grip on just how much more stuff is going to be up there by the end of the decade, all of which will require software written for it.

👤 RedBeetDeadpool
Hyper IOTization. I'm coining the term.

- pots that tell you when they are hot.

- bikes that say their tires are flat.

- weights that track how many reps you did.

- shoes that tell you: stop landing on your heels so hard.

- pillboxes that shove pills down your throat.

- pillboxes that shove politics down your throat.

- chargers that tell you their pronouns are he/him.

- coffee mugs that tell you the weather.

- pants that say you've worn them too long.

- windshields that tell you to stop texting.

- food containers that tell you maybe you shouldn't eat that leftover casserole.

IoT is gonna be the next big thing, get on the hype train 'cause this one's for real. Don't be left behind still writing SQL.


👤 edhelas
Reducing usage: an actual degrowth in the consumption of electronic devices, and especially in their power consumption.

If we do not choose to go there, physics will teach us the hard way (it has already begun).

Basically, any region/country that is able to do the same with less hardware and less energy will have a big advantage over other regions/countries in the upcoming years.


👤 dgudkov
No-code. It's not understood by software developers, but it introduces the joy of simple programming to non-technical people and that genie can't be put back in the bottle.

Serverless. Current clouds are a half-step because they partially abstract away hardware. Serverless is the ultimate cloud because it abstracts away all hardware.

Self-hosted open-source alternatives to public clouds. Public cloud providers use proprietary software to provide and manage cloud services. However, open-source alternatives will eventually appear that require only a pool of machines (virtual or not) to provide a set of cloud services - computation, storage, identity management, load balancing, etc. This will lead to decentralization and a proliferation of private self-managed clouds.


👤 WA
2022? Probably focus on the big things of 2021. Not much will change within a year.

2032? Hard to say, but it is quite possible that software and the web will actually look quite similar to what we have now. I found the concepts in this presentation quite intriguing [1]. Some things, like airplanes, haven't changed THAT much in the last 50 years, because they are good enough for their purpose.

[1]: https://idlewords.com/talks/web_design_first_100_years.htm


👤 aardvark179
If I had to guess at one thing that will become of increasing importance over the next decade it would be privacy and security related to that. I think some jurisdictions will force this issue, and companies will have to start treating large amounts of personal data that might be stolen as a liability rather than an asset.

This is likely to be a business change as much as a technological one, but it’s definitely going to have an effect.


👤 goalieca
Security is going to be the next big thing.

I’m hoping for a small revolution in 3rd party libs and a locking down on the supply chain as a start.


👤 kennu
Web3 will go through its current hype cycle peak (with lots of people both hyping and hating it), and in a few years we will see what actually comes of it. I predict we will find some interesting unforeseeable use cases for decentralized computing/currency, but it'll take some time.

👤 vzaliva
I believe that formal verification is the next big thing. Software reliability and security are becoming the most critical problems, and this is the way to tackle them. There is sufficient research in academia that could be used as a foundation to build tools for industry.
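For readers who haven't seen it, here is the smallest possible taste of what machine-checked proof looks like, in Lean 4 (one proof assistant among several; Coq and Isabelle offer the same guarantee): the file only compiles if every proof is complete.

```lean
-- Two tiny machine-checked proofs. The checker rejects the file
-- unless each proof is actually complete; at scale, the same
-- mechanism can verify compilers, kernels, and crypto code.
theorem two_plus_two : 2 + 2 = 4 := rfl

theorem add_flip (a b : Nat) : a + b = b + a := Nat.add_comm a b
```

Industrial examples of the approach include the seL4 verified microkernel and the CompCert verified C compiler.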

👤 kristov
I am possibly alone in thinking the biggest thing to happen in computing in the 80s/90s was the spreadsheet. It has had an immeasurable effect on the productivity of organizations large and small, to this day. So what does the 21st-century version of the spreadsheet look like?

👤 mikewarot
Here's my wish list of things to happen in 2022

Capability Based Security - I keep hoping this will be the year, as we're going to be in cybersecurity drama land until it gets adopted.

Rich text source - Why not include photos, and other stuff in your source code?

Tools that take source code, compile it into an abstract syntax tree, and let you tweak that tree, then spit back out refactored source code, possibly in a different language. Maybe it's possible to go all the way to LLVM and back to Pascal?
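Python's stdlib can already sketch that source → tree → tweak → source loop (though, notably, `ast.unparse` loses comments and formatting, which is exactly why true round-trip refactoring tooling remains an open problem):

```python
import ast

# Rename every *reference* to `old_total` by rewriting the syntax tree,
# something a regex cannot promise (it would also hit strings and
# unrelated substrings).

class Rename(ast.NodeTransformer):
    def visit_Name(self, node):
        if node.id == "old_total":
            node.id = "grand_total"
        return node

source = "old_total = price * qty\nresult = old_total + tax\n"
tree = ast.parse(source)
refactored = ast.unparse(Rename().visit(tree))
print(refactored)
# grand_total = price * qty
# result = grand_total + tax
```

Going "all the way to LLVM and back to Pascal" is the same idea with a lower-level, lossier intermediate representation.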

I would hope that someone comes up with a toolkit that puts a UI on the user's machine directly in touch with server code on the back end, without HTTPS or any of that type of cruft.


👤 tester756
I don't think anything BIG is going to happen in the next half-decade.

I bet that:

- autonomous cars will still suck.

- CPU process nodes will just get smaller and smaller, but nothing fancy, except maybe some weird shape concepts.

- the web (browsers) will still introduce some weird, not privacy-friendly stuff.

- FP-ish languages ain't gonna be mainstream; probably nothing cool enough for purists and good enough for the mainstream will appear.

- decreasing energy usage seems irrelevant, 'cause people will just use that energy some other way anyway.

- quantum computers will still just be fancy things in the lab.

__________________

Or I've found one that I hope will be big: air purifier cost and efficiency improvements! Shit's so expensive, yet so important.


👤 ronenlh
Someone already mentioned Unity as ahead of the curve.

I was impressed by it being a “Framework IDE.” It contains hundreds of visual tools and workflows where actual source code files are a small component that you drag and drop into widgets. Each widget is a visual representation of a class, and its source code is irrelevant.

Some of these tools are visual state machines, reactions to events, version control, collaboration. Most of us won’t like that change, as we’d lose a lot of control to opinionated rigid tools.

Maybe Xcode and Android Studio will evolve into that for the next generation of developers.


👤 f0e4c2f7
VR and 3D development.

It's much bigger than games.

There are some frameworks that are just now hitting their stride that make it 10x easier to build virtual worlds. Unreal Engine 5 in particular has been interesting to me. Even today we are seeing people who work full-time in VR (mostly lower-wage employees in low-cost-of-living countries). But I've also met people who are self-employed in VR, making avatars for people in VRChat and that sort of thing.

It's harder now with COVID, but if you have a chance to try VR somewhere, definitely do. It helps if you can experience it first-hand.


👤 wrnr
At the other end of no/low code, there is the adoption of advanced computer graphics techniques, originating in game programming and machine learning, that center on the GPU.

* Better GPU API on web and native with WebGPU

* Use of compute shaders to interface with trained NN models or to "replace" CSS.

* Easy reactive programming using immediate mode UI powered by GPU instead of CPU

* Data-driven design à la ECS, where your ECS lives in the cloud.

Not just for 3D AR/VR but also business/productivity apps in 2D.


👤 mmmBacon
As we hit physical limits of what we can power and cool, I expect software will have to be more aware of those limits.

Heterogeneous computing is going to be the norm in everything. Since we’ve trained a generation of programmers to be hardware-unaware, there is going to be an increasing need for people who understand how to tie all these different devices together to keep this huge group of people productive. I would also lump network devices into this category, as the trend to disaggregate accelerates and lower-latency access protocols/methods are needed.

Embedded SW is going to be more important as devices get more powerful. If we are truly to see AR devices I think this is definitely the case. The difference between FW/SW will continue to blur. Will also need people who understand HW to best optimize for power consumption.

Power consumption aware computing and profiling. As we hit physical limits of power density in data centers, we will need good ways to profile power consumption of code and power based optimization of code. This is obviously a thing already in mobile devices. But expect power aware computing to become more and more prevalent everywhere over time.


👤 mettamage
IMO, the following will take 5 to 10 years

IMO, VR will skyrocket when we have a portable headset that:

- is affordable

- has 4K resolution per eye

- has decent mixed reality capabilities (e.g. see the Varjo XR-3 as an example on YouTube)

- has a good omnidirectional treadmill that is affordable (i.e. walking, running and moving should be possible as they are normally). I find the current treadmills that I see (e.g. KatVR) a good step in the direction, but still too problematic.

Killer apps are (from concrete to abstract):

- Having multiple monitors in your room. Now you can travel with 10 monitors in your backpack.

- 90% real simulations (try Golf+, Eleven Table Tennis and Thrill of the Fight — those games are already there).

- Mark Zuckerberg's vision of the Metaverse, which I interpret as being able to mix and match hanging out and playing games together in VR (note: more companies than only Meta could work on this). Reasoning: gaming together is simply a specialized form of hanging out.

- Software-izing hardware interfaces. The more stuff we put in VR/AR (especially force feedback mechanisms), the better we can mimic the feel of hardware interfaces. In certain cases it will feel real enough, and then you can import the entire hardware device into software/VR/AR, and suddenly the production of such a device is much cheaper. For example: a realistic electronic keyboard in VR. I don't know how to solve the force feedback in this case, but I do know there is other hardware for which this must be easier.

Source: I bought an Oculus Quest 2 a month ago to play around with VR while constantly asking what it is lacking and what will make it awesome.


👤 CodeGlitch
Digital Personal Assistants. What we have now are like the early mobile phones: Large, bulky and don't work very well.

Imagine a personal assistant that you could have a full on conversation with. I don't believe this requires full-blown AGI to achieve, but improvements in language and human understanding by ML might get us there.

Think something like the Computer from Star Trek NG.


👤 captaincrowbar
(1) The cloud will continue to grow. Cloud-related stuff like Azure and Docker will continue to be a good learning investment.

(2) Once Microsoft works enough of the bugs out, which I'm guessing will take another 2-3 years, I suspect Windows on ARM will be the Next Big Thing, or at least one of the Big Next-ish Things. ARM supplanting Intel will be a general Thing.


👤 phendrenad2
Moore's Law ended. We're just adding more cache to devices now. RAM speeds haven't increased in several years. PCIe 5.0 requires expensive circuit boards that no one can afford. GPUs are out of stock everywhere.

Maybe the next big thing is doing more with less. Rust and Zig come to mind.


👤 adampk
Spatial computing.

If all of reality is able to be piped into Unity (structure of static environment, motion of sensors, dynamic object poses) you can build Reality Apps the same as you can build video games.

You can give people spidey-senses or "the force". Everything will have a UX.


👤 badrabbit
Confidential computing and Web 3.0-style computing, where servers never touch user data. I also predict apps being developed to support non-ad revenue streams. And I don't think the IoT hype is done yet, especially if you include things like wearables.

👤 DantesKite
I’m surprised nobody has mentioned object recognition.

There’s lots of reasons why robots aren’t currently picking up trash across every beach in America 24 hours a day. Or cooking for you whenever you want.

One of them is the inability to identify objects the way humans do. But it’s getting better. Slowly. There’s no fundamental wall that’s been hit yet.

Think about that for a moment. Let it simmer.

Think about cameras that can identify humans and structures. Roombas that can identify cats and walls before they hit them.

Think about all the work that can suddenly be automated because object representation hits an inflection point.

It’s going to change humanity. The story of our species. It’s going to be a perennial wonder.


👤 Winterflow3r
TinyML - running ML models on devices with severely limited power supplies
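The core trick, in a deliberately simplified sketch: quantize float32 weights to int8 so a model fits in a microcontroller's tiny flash and runs on integer hardware. (Real toolchains such as TensorFlow Lite for Microcontrollers do per-tensor or per-channel quantization with calibration; this toy version just scales symmetrically.)

```python
# Symmetric int8 quantization of a handful of weights.
weights = [0.91, -0.42, 0.003, -1.27, 0.66]

scale = max(abs(w) for w in weights) / 127   # map the largest weight to 127
quantized = [round(w / scale) for w in weights]   # stored as int8 (1 byte each)
dequantized = [q * scale for q in quantized]      # reconstructed at inference

print(quantized)  # [91, -42, 0, -127, 66]
max_error = max(abs(w - d) for w, d in zip(weights, dequantized))
print(f"worst-case error: {max_error:.4f}")  # bounded by scale / 2
```

Four bytes per weight become one, and the multiply-accumulate loop can run on integer-only MCUs with no FPU, which is most of what "severely limited power supplies" implies.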

👤 willcipriano
Homomorphic encryption, if perfected would revolutionize cloud computing. Imagine if you didn't have to trust the nodes, people could sell spare clock cycles from their personal machines.
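A toy illustration of the idea using textbook RSA, which happens to be multiplicatively homomorphic (toy key sizes only; real FHE schemes such as BFV or CKKS support addition too and use far larger parameters):

```python
# Textbook RSA with classic toy parameters: an untrusted node can
# multiply ciphertexts without ever seeing the plaintexts.
p, q = 61, 53
n = p * q                # 3233, public modulus
e = 17                   # public exponent
d = 2753                 # private exponent: (e * d) % 3120 == 1

encrypt = lambda m: pow(m, e, n)
decrypt = lambda c: pow(c, d, n)

a, b = 7, 12
ca, cb = encrypt(a), encrypt(b)

# The "cloud" computes on ciphertexts only:
c_product = (ca * cb) % n

print(decrypt(c_product))  # 84 == 7 * 12, recovered without trusting the node
```

Selling spare clock cycles safely needs *fully* homomorphic schemes (arbitrary circuits, both + and ×), which exist since Gentry's 2009 construction but are still orders of magnitude slower than plaintext computation.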

👤 garethmcc
Have you looked at the recent crop of serverless technologies that truly abstract away even considering the underlying hardware platform? I am talking about tools such as Serverless Cloud, where you write code and it infers the hardware needed underneath and creates it for you automatically. https://www.serverless.com/cloud

👤 bflesch
Automation of real-world things (robotics)

👤 rocgf
I'm really just a mediocre software developer, I don't have any special insight.

But to me it looks like some things to keep on the radar: quantum computers, machine learning and more applications of it, possibly blockchain/smart contracts (I'm very unsure about this one).


👤 JayStavis
Synthetic media is definitely going to be big, for better or for worse.

It's so obvious that visual mediums are just going to continue to dominate time spent on screens. It seems odd that lots of the pixels we see today are manually put there by an artist or a sensor.


👤 AtomicOrbital
Almost all software today is static, whereas living organisms are constantly exposed to new challenges and are constantly learning to adapt and survive, both as individuals and as a species. The developmental steps every individual grows through from infancy onward contribute to what some call embedded cognition. Inspiration from developmental biology, regarding this constantly changing morphology of not only the abstract capabilities but the physical implementation of the biological organism, will influence new approaches to software architecture. Self-healing software systems are a baby step in this direction.

👤 zachlatta
Right now I am super excited about computer graphics, high speed consumer networking (10gbe in homes with wireless to match!), high quality home-built CNC and 3D printing machines (like the Prusa machines), and VR!

👤 dhux
I would say webassembly.

👤 randomNumber7
GPU computing will probably grow a lot. As it's physically impossible to scale up the frequency, we need to parallelize programs to increase the speed of computation.

👤 freemint
Multitenancy IoT. A lot of IoT infrastructure just serves one person or company. Once that changes large scale IoT investments will make a lot more sense.

👤 0235005
I would like to say federated social networks, but it all depends on how big companies are treated by US antimonopoly rulings.

👤 austincheney
Decentralization.

Not: Web3, blockchain, or cryptocurrency

I mean actual online communications not requiring a server, account, or some Ponzi scheme.


👤 emteycz
VR/AR will be big one day.

👤 lifeplusplus
Heavy regulations on every aspect of digital world from ecom to gaming.

👤 justin66
Plastics.

👤 softwaredoug
Unpopular opinion, but chase old things, not new ones.

The big investment in your career will be ideas that stand the test of time. Relational Databases, data structures, programming paradigms, math, etc. Even in machine learning, just knowing core statistics and regression is more important than HottestNewArchitecture :TM:.

Learn the ideas underneath these ideas, and you'll really grow and be able to better evaluate / learn new ideas.


👤 ivan_gammel
Regulatory technology will dominate for the rest of 2020s, while we are trying to replace the Wild West of unregulated Internet with something that has less negative effects on society and less overhead.

Privacy and accessibility are extremely difficult and challenging tasks. Fake news, disinformation, echo chambers and cultural bubbles are growing problems. We basically need to rebuild plenty of things that are part of our daily life, and this will divert significant resources from the "next big things". More likely, digital society 2.0 is the next big thing that will replace Meta and Google.


👤 kevinwang
It's still ai/ml, I'd say.

👤 andredeen
Wearables, as soon as we fix the power problem.

👤 hermann123
Spam blocking will be a big thing!

👤 mrfusion
Multimodal transformers

👤 thehappypm
Real security. Maybe whole new architectures. Would be a huge leap forward.

👤 7thaccount
1.) Price sensitive demand in retail electricity markets.

In short, everyone will participate in the wholesale markets (on the retail side) through automated bids. You put in how much you want to pay, and as electricity becomes more expensive, eventually you're curtailed. With the uncertainty of a fully renewable fleet (assuming nukes don't get built en masse), demand will have to match whatever supply is available. This is starting in baby steps today, but will likely extend into other areas as well. This will take a lot of software at several levels.
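A hypothetical sketch of how such automated bid clearing might work (all names and numbers invented for illustration; real wholesale markets clear with optimization over network constraints, not a simple merit-order stack):

```python
# Each home submits "I'll pay up to X per kWh for Y kWh", generators
# offer supply, and the market clears at a uniform price; demand
# bidding below that price is curtailed.

def clear_market(demand_bids, supply_offers):
    """demand_bids: (max_price, kwh); supply_offers: (min_price, kwh)."""
    demand = sorted(demand_bids, reverse=True)   # willing to pay most, first
    supply = sorted(supply_offers)               # cheapest generation first
    matched, price = 0.0, None
    di = si = 0
    d_left = s_left = 0
    while True:
        if d_left == 0:
            if di == len(demand): break
            d_price, d_left = demand[di]; di += 1
        if s_left == 0:
            if si == len(supply): break
            s_price, s_left = supply[si]; si += 1
        if d_price < s_price: break              # remaining bids: curtailed
        qty = min(d_left, s_left)
        matched += qty
        d_left -= qty; s_left -= qty
        price = s_price                          # marginal generator sets price
    return matched, price

bids = [(0.30, 5), (0.18, 10), (0.05, 8)]        # (max $/kWh, kWh)
offers = [(0.10, 6), (0.15, 12)]                 # (min $/kWh, kWh)
print(clear_market(bids, offers))
```

Here the 8 kWh bid at $0.05 is curtailed because it's below the marginal generator's cost, which is the "you put in how much you want to pay" mechanism in miniature.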

2.) Not something I'd expect to happen in the next decade, but I think automatic and frequent voting will eventually happen if security is ever improved.

As of now, it is still a long way off, and there is a relevant xkcd about a developer horrified about exposing voting to modern software development. The "demarchists" from the "Revelation Space" series by Alastair Reynolds do this: they're constantly polled and vote on issues, and government is largely based on that. I can't see that happening in the US (as a direct democracy rather than a representative one), but it could be used as a tool to help those in power make decisions. We have smartphones that could do this today.

3.) I'd really like to move away from Windows/Mac/Unix at some point in the next 50 years for something far more simple.

Tablets don't seem very productive, and the average desktop OS is an absolute cluster f. Honestly, I'd love to have a very simple computer with a fast processor and lots of RAM that is somehow similar in nature to a C64, in that it's fairly simple to just boot right in and give it some commands to move something around the screen and build a game in half an hour. I want more, though... something similar to Wolfram Mathematica built in. The internet, but no web browser black hole [you'd have to fix the internet first :) ].

In short, throw out the unmaintainable mess that exists today in favor of a better, more holistic design. Maybe the Smalltalk-80 approach, but Forth, Rebol, and Tcl stick out more to me. Maybe something with the power of a modern computer but designed with far greater simplicity in mind. No automatic spyware pushed by Microsoft, no incomprehensible bloat.

There was a group looking into this sort of thing that reached out to me on HN once, but I've been unable to find their page. I believe they used sea vessels as a metaphor for computing, where a "dinghy" is what you would ultimately want. I know things are built on the shoulders of giants, but computing seems to have lost its way in some respects. Why is it so hard to do GUI stuff in 2021? There are 1000 frameworks and they're all insanely complicated.


👤 asdfsd234234444
Web3

👤 lxi92
Notches.

👤 clavicat
Hopefully, realizing how unnecessarily shitty and bloated nearly all existing software is and rediscovering an appreciation for simplicity, reliability, and performance. There’s so much low-hanging fruit to be picked yet we let it rot on the vine.

Projects like Zig I think are the first signs that we are getting fed up with what we have.


👤 nathias
Crypto by another name.