I have been wrestling with this exact feeling (and everything you posted) since the first day they released ChatGPT to the public. As a software engineer / maker, I can't possibly overstate how essential it is for me to be in the flow: writing software, building things, and finding novel solutions to problems. If you take this away from my day-to-day job, I have no idea what I would do with myself. I know I would be absolutely miserable.
People who are blindly "enthusiastic" and expressing their excitement about being able to "build things they never built before" "with such ease" are totally missing the mark. They're being short-sighted and not thinking through the logical implications of what it means when (in a few years) products can be created simply by telling a computer what you wish for. Here is what happens then:
(1) The value and uniqueness of all products drops to near-0
(2) You can't create a business out of these "products" because anyone else can build them
(3) The only one profiting is the person or entity that owns the AI platform you're building your products on.
Why people are not revolting against companies whose mission statement is to "build AGI" is beyond me. It's just humans being completely oblivious, as always. It's not like we haven't seen this with climate change.
I think we're living through a big evolution in tech, and as a developer myself I think it will take time to really adopt these tools (and not fear them). For now, to be honest, I don't see any big advantage to using AI for code generation. However, I think tools like ChatGPT make great dev rubber ducks today. Sometimes it helps you think and puts you on the right path.
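To make the rubber-duck idea concrete, here is a minimal sketch of scripting that workflow with the openai Python client; the model name, the prompts, and the bug description are placeholder assumptions for illustration, not anything from the comment above.

```python
# Minimal rubber-duck sketch, assuming the openai Python client (v1+) is
# installed and OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; swap in whatever you have access to
    messages=[
        {
            "role": "system",
            "content": (
                "You are a rubber-duck debugging partner. Ask clarifying "
                "questions and point out gaps in my reasoning before "
                "suggesting any fix."
            ),
        },
        {
            "role": "user",
            "content": (
                "Hypothetical bug: my cache invalidation hook fires twice per "
                "write. Here is how I think the write path works: ..."
            ),
        },
    ],
)

print(response.choices[0].message.content)
```

The value here is less the model's answer than being forced to state the problem precisely, which is what rubber ducking has always been about.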
> There’s an odd feeling that whatever I am going to build will just get gobbled away by some big tech company.
I could be wrong, but for now we're still on the hype train: AI companies are showing off everything they can to impress and raise as much money as possible (because their own subsistence depends on huge amounts of money they're not able to generate by themselves). Don't get me wrong, what they are showing us is extremely impressive. But today I tend to think these tools will give an edge to the builders who imagine and build apps and tools with something AI consumes but doesn't have... creativity, taste and imagination.
I might be a fool to believe that, but if I compare it to cooking: even if there are restaurants that only reheat stuff that tastes good, people still appreciate a good meal prepared with love by a real chef :)
So my only advice here is: be a chef, build what makes sense to you, be happy with what you're doing. Learn, try, break and build, but just don't stop doing it.
This also has a beneficial side effect of showing you first-hand just how bad this tooling is currently and how far away it is from replacing builders wholesale. And to be clear, I'm not just talking about leveraging closed proprietary APIs for these efforts; I'm actually referring more to building your own training and inference stacks from scratch. The pedagogical impact alone can make up most of the ROI for your time spent.
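For concreteness, here is a minimal sketch of what a from-scratch training-plus-inference stack can look like at its smallest, in plain PyTorch; the toy model, random data, and hyperparameters are illustrative assumptions, not a recommendation.

```python
# Toy next-token model trained end to end, assuming PyTorch is installed.
# Data, model size, and hyperparameters are placeholders for illustration.
import torch
import torch.nn as nn

vocab_size, seq_len, batch_size = 64, 32, 16
# Fake "corpus": random token sequences, one extra token for the targets.
data = torch.randint(0, vocab_size, (1024, seq_len + 1))

class TinyLM(nn.Module):
    def __init__(self, vocab, dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.head(h)  # (batch, time, vocab) logits

model = TinyLM(vocab_size)
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
loss_fn = nn.CrossEntropyLoss()

# Training loop: predict token t+1 from tokens up to t.
for step in range(100):
    idx = torch.randint(0, data.size(0), (batch_size,))
    batch = data[idx]
    inputs, targets = batch[:, :-1], batch[:, 1:]
    logits = model(inputs)
    loss = loss_fn(logits.reshape(-1, vocab_size), targets.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

# "Inference" stack at its smallest: greedy decoding from a prompt.
with torch.no_grad():
    tokens = torch.randint(0, vocab_size, (1, 4))
    for _ in range(16):
        next_tok = model(tokens)[:, -1].argmax(dim=-1, keepdim=True)
        tokens = torch.cat([tokens, next_tok], dim=1)
```

Everything a production stack adds (tokenization, attention, distributed training, KV caches, serving) layers on top of this loop, which is why even a toy version teaches a lot.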
Google showcases Veo or whatever it is with a small, boring, stupid spot with Donald Glover, and I feel nothing. OpenAI demos GPT-4o and all I can think is "damn, that TTS is horny". Asking AI to make art for you does not scratch the dull itch all humans have to create. Asking an AI to create is not creation; it's management. Even using it as part of the process implies a larger, more comprehensive creative process the AI sits within, which none of these companies (Google, OpenAI, Runway) clearly knows anything about.
My take in general is: Big Tech lost the mandate of heaven like six years ago; OpenAI had it for a bit, but at this point they've also lost it. It's actually kinda the case that no one has it right now. My innate response is to feel depression at Everything AI we're seeing happen, but with a little bit of effort I find that feeling sits right next to excitement: that I'm probably not the only one feeling this, that what these companies are doing is actually, literally boring, that it's loved most by Elon thread-bros on XXXwitter, and that the world by and large is actually kinda holding its breath right now. It's the same feeling I had during crypto: that this can't be It, please God this can't be the thing that drives humanity forward for the next twenty years; and crypto wasn't, and AI probably won't be either. Doesn't mean it won't be useful, but it does mean that there will be a next thing, and I'm most excited to help build that.
In the era of the camera, people kept painting. Some still paint landscapes — others were driven to create new forms of expression.
Poets still exist in the era of tweets, and directors shoot on 35mm in the age of TikTok.
Change is constant — but there's always joy in finding something you love and diving deeply into it.
Very few startups are finding success, and when they do it's short-lived, because their product gets replaced or natively integrated as a new capability of the existing major players.
There will be success stories, but I haven't seen many novel ideas built with new AI tech; most are just an overlay on top of an LLM that people aren't willing to pay for.
I think the right state of mind is to explore and keep thinking about real-world applications, even though at the moment it's basically just every existing SaaS inserting AI into their existing products.
Completely agree, and this is a sign that they're just riding the hype train. Why does that make you feel disillusioned? What do you want to build or are building that would get gobbled away?
I have certainly become more of a recluse on the internet. I'm still figuring things out... at this time I think I'll pull more inwards. Do my own thing. Create my own art. Maybe I'll see you on the flip side? Maybe not.
It will stop when they run out of data, or when the training data needs to be isolated from the datasets created by GenAI.
So create something based on human creation. Sell it to the LLM trainers. It's the new business model....
> whatever I am going to build will just get gobbled away by some big tech company
This has been happening since the industry existed. X company copies/acquires Y and things go to die. New things pop up, the cycle repeats.
Sounds like you're going through a rough patch.
I went through Stanford CS in the mid-1980s, just as the "AI Winter" was starting. The expert system guys were claiming Strong AI Real Soon Now, but expert systems didn't do much of anything. Now that was disillusioning. All those smart, famous people in AI going nowhere and in denial about it.
This time it works. Sure, LLMs aren't really that bright, but they've blown through the Turing test and can do most of the work of the chattering classes. If only they knew when they were totally wrong. Even more impressive to me is that machine learning now works for robot balance and coordination. Even automatic driving works now. That gives hope that "common sense" might emerge. We badly need that, because we can't trust current systems with anything important.
To the developer who likes to come up with creative solutions to problems: if your solution has not been done before, and done many times over, the AI is not going to invent your idea. At least not today's AI! :-)
Another concern is the potential disappearance of entry-level positions for junior developers due to AI-driven automation. As AI systems become more sophisticated and capable of performing tasks once reserved for human programmers, the job market may undergo a significant shift. This could make it increasingly difficult for newcomers to the field to gain a foothold and establish their careers.
This part sounds like you are going through a rough patch. Like you are feeling down, not sure why, and are digging around for reasons.
What you are saying is legit. What you are feeling is legit. But it would be worth doing some deep, chill soul-searching to figure out if these really are the reasons. Maybe you need a hike in the woods. Maybe you want a job with a better mission behind it. Maybe I'm full of shit.
Either way, best to laugh at cringey demos. Learn how to use AI. What it's good for and what it's not. Use it or don't. Just build stuff people need either way.
1) I just don't think I have the intellect or ideas to excel in the field as I would need to.
2) Current-day academic process was antithetical to how I wanted to work. An absolute slog.
3) AI started to reach peak exuberance around the time I quit, such that I looked at the job market and saw nothing but scammy startups milking VCs for easy cash, with dozens and dozens of candidates lined up for unrealistic salaries. Everyone everywhere was, and still is, misinterpreting what these models are and what they can actually do.
The trough of disappointment is going to be apocalyptic for this one. Once NNs are making decisions that face tough legal accountability and auditing, they are going to be dropped like a tungsten rod from God.
I'm now in a pretty comfortable job doing analysis in an engineering field, so it all worked out for me. I love the work and problems I have to solve; while the pay isn't top-tier, I'm comfortable with steadily improving my software engineering and analysis skills, rather than running the grant treadmill or competing in the ML researcher job market.
Excel, Word, Flight Simulator, Internet Explorer etc
LLMs trained on ungodly amounts of internet and private data could be that.
You and I should be worried. It’s a dog eat dog world out there.
A definition of AGI is “do what humans do”. Better, faster, cheaper.
Best course of humanity is that this takes a few decades to be as good as humans. If it’s rapid, it’s going to be very disruptive.
Perhaps I'm just in the same boat - shouldn't be drawing any sort of emotional support from that.
I didn't want to reply to anybody in particular so as to mitigate the provocation since this is HN. But this feels like a common sentiment among transhumanist etc types. Or maybe I'm imagining, but I bet a lot of us are feeling this way so I thought I'd call it out. I have no expectation of receiving any understanding. The future is not for people like us.
All LLMs are good at that type of pattern matching and assistance.
But just because we have a long battle ahead of us doesn't mean it's over. Hackers built the internet and AI and Silicon Valley, for all its fucked-up idiosyncrasies, and I think we (/our predecessors) should be proud of that.
This will make me lose any authority I have, but I unironically suggest watching Mr. Robot, a dramatic fiction about an individual "hacker"-type in times of great technological upheaval, and IMO it puts forth a compelling analysis of the various ethical and practical barriers that present themselves therein. For example, "how to influence technology without working at a giant company"
For a passion project done for accolades on your originality, with money as an afterthought, I can see that seeming more pointless.
For an analogy: grocery stores don't need to be original; you can put one right across the street from another, and we still need more grocery stores.
Go build something that doesn't suck. They're not even trying to compete with that.
Also related: https://youtu.be/nkdZRBFtqSs
In general terms: Big Tech knows that IT literacy has spread enough to make people think again about classic desktops, hosting their own services and so on, which puts a business model rooted in lock-in and other people's ignorance under threat. They successfully pushed laptops, but failed to go further with netbooks and mobile as "the cloud-integrated platform"; Chromebooks proved to be a limited success, and people still want to own their own data, even if far fewer of them than we would need. Email is almost a synonym for webmail now, but even with modern antispam sheriffs it is still a success; modern social networks, after Usenet, gained great popularity but ultimately lost much of it; hardware is cheap enough that most people in the Western world could have a small machine room at home; and pieces like https://tech.ahrefs.com/how-ahrefs-saved-us-400m-in-3-years-... or https://tech.ahrefs.com/how-ahrefs-gets-a-billion-dollar-wor... have started to discuss how cheap and effective owning your own infrastructure can be. ML systems are a potential answer to that for Big Tech, and not a small show. They are also very convenient for hiding all sorts of nasty things and for isolating people without them easily noticing.
Aside from the dream of "talking to computers like humans," or of "smart devices" that follow us and ease our lives to the point of becoming another species by transferring our consciousness into something built in a factory to finally reach immortality, pure intellect is still a thing, as it has always been.
Bottom line: we are in a declining and aging society where schools were reformed to generate legions of useful idiots (https://www.theatlantic.com/ideas/archive/2020/08/i-was-usef...) because they are easy to manage, like Ford-model workers. Unfortunately they are not just workers, the lowest base of society, but EVERY LEVEL of the social pyramid, and while becoming ignorant is easy, becoming literate is not.
It's about time we accept that we need to know how to use computers: the idea of a chimp being able to operate one was and still is a myth, and so is the idea that we can make eye-candy UIs that demand no knowledge from users. We see CLIs coming back in umpteen forms, we will see document UIs coming back, and Big Tech is desperate to figure out how to bring them back, since we need them, without giving power to users and without losing their digital dominance. I can't predict the future, but I'm pretty convinced we will again see bad tech gain ground without fully succeeding.