HACKER Q&A
📣 MathCodeLove

What technology is “cutting edge” in 2022?


I feel like I'm still hearing about the same ML and AI advancements that were touted in 2016. What's some technology that is actually cutting edge and unknown to most laymen?


  👤 juxtapose Accepted Answer ✓
For programming languages, dependent types.

DT has been a hot topic in the PL community recently. It massively enhances the capability of a type system by turning it into a comprehensive logic system, so you can encode whatever properties you'd like to enforce into a type signature. Theorem provers have been taking advantage of the Curry-Howard correspondence for some time, but the implications of DT for real-world programming are still not well understood (we need more real-world projects written in DT languages). There are also ambitious projects that want to bring DT into the mainstream.

If you are interested, you can take a look at Lean [1], Idris [2], and a few others [3,4]. Often these languages have esoteric syntax, but there are projects using a more conventional syntax, too, e.g. Cicada [5]. "The Little Typer" [6] is a pretty good introduction to this topic.

[1] https://leanprover.github.io
[2] https://www.idris-lang.org
[3] https://github.com/agda/agda
[4] https://coq.inria.fr
[5] https://cicada-lang.org
[6] https://mitpress.mit.edu/books/little-typer
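To give a small taste of the idea, here is a sketch in Lean 4 syntax of the classic introductory example: a vector type that carries its length in its type, so "take the head of an empty list" becomes a compile-time type error rather than a runtime crash. (The names here are illustrative, not from any particular library.)

```lean
-- A length-indexed vector: the length is part of the type.
inductive Vec (α : Type) : Nat → Type where
  | nil  : Vec α 0
  | cons : α → Vec α n → Vec α (n + 1)

-- `head` only accepts non-empty vectors, so no runtime emptiness
-- check is needed: `Vec.head Vec.nil` simply does not type-check.
def Vec.head : Vec α (n + 1) → α
  | .cons x _ => x

-- The return type itself proves that appending adds the lengths.
def Vec.append : Vec α m → Vec α n → Vec α (n + m)
  | .nil,       ys => ys
  | .cons x xs, ys => .cons x (xs.append ys)
```

The point is that `append`'s signature is a theorem ("the output length is the sum of the input lengths") that the compiler checks for you.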


👤 protontypes
Spatial Finance aka Geospatial ESG: Measuring the environmental impact of companies from space to assess how green a company is. Sustainable investors need this information to overcome the problem of greenwashing. Here is some more information on this topic:

[1] https://www.wwf.org.uk/sites/default/files/2022-01/Geospatia...

[2] https://www.cgfi.ac.uk/spatial-finance-initiative/

[3] https://www.oxfordeo.com/post/near-real-time-water-stress


👤 unixhero
- Retro-game GAN upscaling, although it appeared in 2019 with ESRGAN (Enhanced Super-Resolution Generative Adversarial Networks). I think this will explode this year. https://www.theverge.com/2019/4/18/18311287/ai-upscaling-alg...

- Demakes of computer games into 16-bit titles. Weird and useless, but there is something interesting going on here: https://youtu.be/qtNytQXVnx8


👤 3np
Zero-knowledge proofs. Several major cryptographic achievements have been unlocked only in recent years, facilitating both privacy and efficiency for certain setups that were only theorized before. There’s still a lot to be done for developer tooling and libraries. The foundation is solid but the ecosystem is nascent.

So far real-world applications have been mostly in the blockchain/cryptocurrency space (privacy/anonymity for Zcash and Tornado/AZTEC, Ethereum L2s, Bulletproofs in Monero, etc) but there’s so much untapped potential for other domains still.
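To make the idea concrete, here is a toy sketch of a Schnorr proof of knowledge of a discrete logarithm, one of the classic sigma protocols behind this space, made non-interactive with the Fiat-Shamir heuristic. The tiny parameters are purely illustrative assumptions; real systems use large groups or elliptic curves.

```python
import hashlib
import secrets

# Toy parameters for illustration only -- nowhere near secure.
p = 1019            # safe prime: p = 2q + 1
q = (p - 1) // 2    # prime order of the subgroup of squares mod p
g = 4               # generator of that order-q subgroup

def prove(x):
    """Prove knowledge of x with y = g^x mod p, without revealing x."""
    y = pow(g, x, p)
    r = secrets.randbelow(q)
    t = pow(g, r, p)                      # commitment
    # Fiat-Shamir: derive the challenge by hashing the transcript.
    c = int.from_bytes(hashlib.sha256(f"{t}:{y}".encode()).digest(), "big") % q
    s = (r + c * x) % q                   # response
    return y, t, s

def verify(y, t, s):
    c = int.from_bytes(hashlib.sha256(f"{t}:{y}".encode()).digest(), "big") % q
    # g^s = g^(r + cx) = t * y^c, so this holds iff the prover knew x.
    return pow(g, s, p) == (t * pow(y, c, p)) % p
```

The verifier learns that the prover knows x, but nothing about x itself; modern zk-SNARK/STARK systems generalize this from "I know a discrete log" to "I know an input satisfying an arbitrary program".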


👤 lewisjoe
Here's a list I maintain and keep an eye on:

- WebAssembly / WASI on the server (for serverless tech, cost efficient containers, plugin environment for systems, etc) https://twitter.com/vettijoe/status/1484507483788161026?s=21

- CRDTs and their implications for local-first software (for me this is a better bet as a technology to architect your solution around, much better than what blockchain provides)

- Provably correct programs (modern languages like Rust, Kotlin and Swift all ship with a flavour of this idea; for example, it's possible to write null-error-free programs in all of them)
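As an illustration of the CRDT point above, here is a minimal sketch of a grow-only counter (G-Counter), one of the simplest CRDTs: replicas update independently while offline, and merging is commutative, associative and idempotent, so all replicas converge regardless of sync order. That convergence guarantee is what makes local-first architectures workable.

```python
class GCounter:
    """Grow-only counter CRDT: one slot per replica, merge by element-wise max."""

    def __init__(self, node_id):
        self.node_id = node_id
        self.counts = {}          # replica id -> that replica's count

    def increment(self, n=1):
        # A replica only ever increments its own slot.
        self.counts[self.node_id] = self.counts.get(self.node_id, 0) + n

    def value(self):
        return sum(self.counts.values())

    def merge(self, other):
        # Element-wise max is commutative, associative and idempotent,
        # so merges can happen in any order, any number of times.
        for nid, c in other.counts.items():
            self.counts[nid] = max(self.counts.get(nid, 0), c)
```

Two replicas that increment concurrently and then merge (in either direction) end up with the same value, with no coordination and no conflict resolution code.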


👤 fergie
AR (augmented reality).

Considering how good AR is _right now_, we have very few practical applications for it. It has been quite good for at least a couple of decades (I studied AR in the 90s and we had some pretty amazing demos even back then).

Snapchat is doing some amazing, if silly, AR stuff, and hardware capabilities around graphics are improving all the time (especially with the use of graphics cards for crypto mining). In fact, graphics are getting to the stage where it's genuinely difficult to tell at a glance what is real and what isn't.

Basically this is what Facebook is going for with the Metaverse, but history tells us that it will probably not be an incumbent that develops the best products.

There will be some cool/scary stuff coming.


👤 monkeydust
I recently tried telepresence software from a start-up that allowed me to control a robotic factory arm almost a thousand miles away through an Oculus Quest.

It definitely had that 'wow' moment for me, although the product was not yet production-grade.

It did open my mind to possibilities around the future of work that I had not considered before - with mixed feelings. I mean, think of a person remotely operating a robot arm that prepares a food dish. In some ways it can be considered a bit depressing, but in others it opens up opportunities that have never existed before.

It's a sort of halfway point between no automation and full automation, for certain tasks where real intelligence is needed or is cheaper.


👤 simula67
The following projects seem promising:

* OpenCompute[1]: A Facebook-led initiative to design and build open-source hardware, mainly for data centers but also for the enterprise.

* RISC-V[2]: An open source ISA and alternative to x86 and ARM.

* Cloud Native Computing Foundation[3]: An industry collaboration for building cloud infrastructure.

* Teleoperation[4]: Remotely operating machines, with humans taking over in situations where the AI is not sure of the correct action to take. For example, before we get self-driving vehicles, we may get vehicles that are operated remotely. The vehicle will have cameras that stream data to operators, who could be sitting in an office and who in turn send control signals wirelessly to the vehicle. This model could also be applied to drones, humanoid robots, etc. More jobs could be sent to low-wage economies, and workers there could perform work that previously required physical presence in high-wage economies.

[1] https://www.opencompute.org/

[2] https://riscv.org/

[3] https://www.cncf.io/

[4] https://en.wikipedia.org/wiki/Teleoperation


👤 ahevia
I was pleasantly surprised by some of the examples in Google Research themes list from this past year: http://ai.googleblog.com/2022/01/google-research-themes-from...

E-graphs (explored in Julia, among other languages) are another cool space being worked on.
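For a feel of what's underneath: an e-graph is built around a union-find structure that records which expressions have been proven equal, letting an optimizer explore many rewrites without committing to one. A minimal union-find sketch (the expression strings are just illustrative):

```python
class UnionFind:
    """Union-find with path halving: the core bookkeeping inside an e-graph."""

    def __init__(self):
        self.parent = {}

    def find(self, x):
        # Lazily register unseen elements as their own root.
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            # Path halving: point x at its grandparent as we walk up.
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        # Merge the equivalence classes of a and b.
        self.parent[self.find(a)] = self.find(b)

uf = UnionFind()
uf.union("a*2", "a<<1")   # record that these expressions are equivalent
print(uf.find("a*2") == uf.find("a<<1"))  # → True
```

A real e-graph adds hashconsing and congruence closure on top (if `a ~ b` then `f(a) ~ f(b)`), which is where the "equality saturation" magic comes from.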


👤 helltone
WebGPU, particularly as a portable way to do compute

👤 lolive
Probably off-topic, but the notion of a second brain (not the Elon Musk stuff) is, to me, emerging as a problem to solve. For years there have been discussions about GTD, bullet journals, mind maps, etc., to help us deal with the information deluge, whereas the (primitive) tools available were notebooks, calendars and post-its. The current battle among PKM (Personal Knowledge Management) solution providers suggests, to me, that we are slowly starting to investigate the proper UX for a second brain (i.e. a place to store the transient and persistent data that you encounter in real life and that you want to manage OUTSIDE of your brain).

👤 dr_dshiv
Text to image generation, like GLIDE or DALL-E.

👤 senordelicioso
We can now make 20-tesla magnets.

https://hackaday.com/2021/09/27/commonwealth-fusions-20-tesl...

Nuclear fusion cometh, and it will change absolutely everything.


👤 vroomik
A photonic processor is coming out this year, claimed to be up to 10x faster than an Nvidia A100 on BERT while using 90% less energy - https://lightmatter.co/products/envise/

👤 dr_dshiv
Oscillatory computing.

Csaba, G., & Porod, W. (2020). Coupled oscillators for computing: A review and perspective. Applied Physics Reviews, 7(1), 011302.


👤 aricz
VR - Check out the Oculus Quest 2 for a cheap, wireless, standalone consumer device. If it had better resolution I think it could hit the mainstream. Or maybe it's already happening.

👤 staticelf
For web dev, I would say stuff like Liveview / Hotwire.

It makes you faster at developing complex web applications that would otherwise take much longer with a traditional SPA framework. You don't have to worry about handling server communication, how to send data, or in what format; all updates are handled in a consistent manner, and it's scarily easy to push live updates to all connected clients.

Of course, these things are getting more popular now and a lot of people on HN may already know about it but I would assume most laymen still have no clue that this is a thing.


👤 cblconfederate
Cell reprogramming

👤 rektide
Non-industrial tech. These huge, massive excesses of computation aren't really relevant, compared to the frontiers of computing becoming a personally possible, revolutionary potentiality. Progress is disparate & not well proven. But big computing is both cutting edge and increasingly irrelevant & boutique, not broadly impactful. We will leave our impactlessness shortly, start to explore & find edges that bring us somewhere good, expand our lands.

👤 defanor
It seems that many of the answers point to a general CS area or a trending topic within it, but I'm pretty sure that progress happens in all areas (see arXiv's CS page [1], for instance), as well as in topics outside of the hyped ones, and each area or topic has its own "cutting edge" bits (new research).

[1] https://arxiv.org/archive/cs


👤 tester756
I'd say high-NA EUV lithography.

👤 aditzup
I think typing biometrics has made significant advances in the last year. I know it's something that has been in the making for a while, but there are companies out there offering some very interesting solutions, such as continuous authentication.

👤 Hendrikto
In the NLP community, an example of cutting-edge research happening right now is token-free Transformers.
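Concretely, "token-free" means the model consumes raw bytes (as ByT5 does) instead of a learned subword vocabulary, so nothing can ever be out-of-vocabulary. A sketch of the input side:

```python
# Token-free models skip the learned subword tokenizer entirely and feed
# the Transformer a sequence of raw byte IDs: the whole "vocabulary" is 0..255.
def byte_ids(text: str) -> list[int]:
    """Map text to its UTF-8 byte sequence -- no tokenizer to train or ship."""
    return list(text.encode("utf-8"))

print(byte_ids("héllo"))  # → [104, 195, 169, 108, 108, 111]
```

The trade-off is longer input sequences (one position per byte rather than per subword), which is exactly what the current research on efficient byte-level architectures tries to address.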

👤 sealeck
WebAssembly

👤 BerislavLopac
For client-server APIs, I'd say it's using OpenAPI as the specification, as opposed to just for documentation and testing.

It feels like it shouldn't be "cutting edge", but it's still not used as much as it should be.


👤 harpratap
On Networking side - eBPF

👤 sAbakumoff
Neuralink plans to implant chips in human brains in 2022. What could be more cutting-edge?

👤 shp0ngle
Dark blockchain

👤 dbavaria
DeFi - decentralized financial instruments implemented via smart contracts on a blockchain.