In 2010, if you were in the right places and looking at the right trends (Theano, ImageNet, CUDA, the explosion of available datasets), you could reasonably predict that machine learning would be a good investment. (And ironically, it's probably one of the worst things to get into right now due to a potential bullwhip effect.)
Now I fully understand that hindsight is 20/20, and that any new and untested technology will be highly uncertain. But if you're working on something exciting, could you make a pitch as to why your specific field will take off in the next 10 years and why now is a good time to invest in it? Rust? WASM? Maybe even robotics?
(For my own prediction: I think application-specific integrated circuits are super interesting to look into for the next 10 years. The slowing of Moore's Law and Dennard scaling, and the ever increasing focus on specialized hardware, all point to ASICs becoming more interesting. Personally I know very little about this field, but it feels like the age of homogeneous computing is over and the next 10 years could be an exciting time to be able to go `full-stack`.)
I was once told, around 1980, by a highly respected IT person, that C had no commercial application. A few years later, I was told by a college's industry advisory committee that Unix would not be used widely.
Arthur C Clarke's First Law: “When a distinguished but elderly scientist states that something is possible, they are almost certainly right. When they state that something is impossible, they are very probably wrong.” (rephrased to remove sexism)
Alan Kay: “The best way to predict the future is to invent it.”
Don't; go learn past technologies instead. People leaving school for tech in 10 years likely aren't going to have stood up servers and configured services. We'll be lucky if they've used anything that isn't a web GUI on a large cloud operation. Red Hat has already signaled it is going in this direction by tossing out its EX300 exam in favor of the EX294. I believe we will continue seeing this trend going forward. If I had to guess, there's going to be a growing need for, and shrinking supply of, competent sysadmins over the next 10 years.
I have looked into mobile development in years past and I always got the impression that it was a hot mess: everything ever-changing, everything undocumented, and tons of languages and frameworks to choose from. I was immediately turned off by how complex, immature, and unstable it all was.
To me, the most important selling point of Ionic is that I can write code once (Ionic UI Framework + Angular or React or Vue), then run it through Capacitor and it spits out a NATIVE mobile app that runs my web code in a WebView (not a PWA, though they support PWAs too).
I don't have to know anything about Android or SwiftUI. If I want to access native features (such as the camera or location) I simply use a Capacitor plugin; again, zero native code, the plugin handles it. There are plugins for things like storage, clipboard, file system, haptics, and more (a minimal sketch follows after the links below).
If you were ever turned off by the complexity of mobile app development, take a look at Ionic. If you know HTML, CSS, and a JS framework, and can learn their (simple) UI framework, you can write a fully functional app without knowing anything about native coding.
[1] https://ionicframework.com/ // https://github.com/ionic-team/ionic-framework
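To give a feel for what "zero native code" looks like in practice, here is a minimal TypeScript sketch using two Capacitor plugins. The package names and option fields are assumptions based on the current Capacitor plugin docs (`@capacitor/camera`, `@capacitor/geolocation`) and may differ by version; it's an illustration, not a definitive implementation.

```typescript
// Minimal sketch: calling native device features from plain web code
// via Capacitor plugins. Assumes @capacitor/camera and
// @capacitor/geolocation are installed; option names may vary by version.
import { Camera, CameraResultType } from '@capacitor/camera';
import { Geolocation } from '@capacitor/geolocation';

export async function takePhotoAndLocate(): Promise<void> {
  // Opens the platform's native camera UI; no Android/iOS code required.
  const photo = await Camera.getPhoto({
    quality: 90,
    resultType: CameraResultType.Uri,
  });

  // Asks the OS for the device's current position through the same
  // plugin mechanism.
  const position = await Geolocation.getCurrentPosition();

  console.log('photo URI:', photo.webPath);
  console.log('lat/lng:', position.coords.latitude, position.coords.longitude);
}
```

The same TypeScript runs on Android, iOS, and the web; Capacitor routes each plugin call to the native implementation under the hood.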
What I expect in ten years is that IoT will take off, but this time it will have to be invisible to the user. An AI solution will observe the user, train on their behavior, and anticipate their needs, trying to provide what is needed in a just-in-time manner. This will happen in both private and public spaces, so preparing some privacy and anti-tracking techniques might prove useful.
Jump on the next hype topic any FANG company is talking about. Every time, you get a lot of developers jumping blindly into it even if they don't need it, and you can earn a lot of money by being one of the first in, and again by moving people away from it once it turns out not to be a silver bullet for everyone.
Think about big data and MapReduce 10+ years ago: everybody was using it, many for workloads under 100 GB that you could really just throw onto one beefy machine. Take NoSQL databases, with everyone rebuilding joins and transactional semantics until they switched back to a traditional database. Or the microservices hype, which is fading more and more.
There are not that many inventions that really last, and nobody is able to predict them very well. But being an early adopter of some FANG hype gets you somewhere.
I hope we will see more proof assistants connected to more normal languages, but I wouldn't hold my breath for it, though. :)
I doubt I'll be any good or do anything with that knowledge though.