Every AI CEO is selling the disappearance of developers within 6 months. But I don't see any developers complaining about LLMs stealing our jobs, or that LLMs are infringing copyright and are bad for humanity.
I can't see any fundamental difference between GenAI and vibe coding; to me they are the same thing.
At any rate, developers complain about vibe coding on here all the time: that it steals code from open source projects without obeying the license, which is a copyright violation, and that subtle bugs in AI-produced code make products worse.
The rants against GenAI are generally about copyright, but also aesthetic in nature. The aesthetic rants don't really carry over to code, though only partly, since there are still complaints about the bad code AI generates.
Not sure if you're just trolling, but um... just look at every AI thread for the last year.
GenAI and vibe coding aren't the same thing because the end product is different. While you can find art and poetry in code, generally, the code is a means to an end. It's more the medium than the message.
I don't want to completely discount genAI, because I think it has its place, but it's also a pollutant. People value human feelings, originality, and authenticity, which are cornerstones of art. GenAI is not those things, and feels like a fake to many people. It's muzak vs music. The industry plant that looks good, but has no real talent.
Personally, I don't think LLMs are the death of developers. Most software is a big steaming pile, held together with band-aids and duct tape, and the demand for it is never ending. Any tools that help us improve on the current situation are welcome IMHO.
Artists, on the other hand, already have a hard enough time just scraping by. Many take on soulless work, making corporate stock art or editorial copy, just to pay the rent. It's not what they want to be doing, but at least it's using their skill set. GenAI is arguably fine for generating that kind of content, but now the artists are out of a job and probably will just stop being artists in order to survive. Software isn't going anywhere, but artists of all kinds are dwindling. GenAI isn't helping on that front.
Then there's the whole "reality" issue... Code is just code, but genAI is making it harder to tell what's real, which probably isn't helpful for society in general.
With other content, people get to see it and feel something ever so slightly discordant. Sometimes it looks good at first glance, but then some details are off when you look properly. Errors a human would never make.
In either case there are plenty of people who generate content and don't care; they simply want the output. But for those who do want some certainty about quality, it's much the same.
1. AI CEOs oversell, by a lot. The OpenAI CFO's admission that they are cooked unless the US government bails them out is a tell.
2. The (almost) purely utilitarian nature of software code contrasts with the more personally meaningful aim of art in general (although the two do converge when we're talking about purpose-fit artwork: design/music for ads or shopping centres, for instance). That, in my view, makes most of the difference, given the following.
Vibe coding is mostly a very well evolved (albeit not perfect or deterministic) code completion/linting/review tool. But it does bypass, for the user/coder, a LOT of the intellectual work needed to reach the same result. By that I mean it is highly detrimental to the user/coder's intellect, and because of this it becomes highly detrimental to the employer too, especially if the employer shrinks its own workforce.
A software company that extensively uses AI instead of hiring competent (and junior) people faces the same fate as a company that just stops hiring: it will go out of business or be bought within a few years, because it has outsourced control over its own process, or over the process/product it sells. That's also why treating Engineering or R&D as a cost center only makes sense in the "accounting sense", not in the "common sense", but that's just one example of how MBAs fucked up the world.
It certainly trained on existing open source codebases, where code reuse is encouraged, although the license status of the output code is indeed a question. Did it train on closed source/proprietary codebases? That's an open question. Does it threaten developer jobs? I'm not sure; see above.
"art" GenAI is a whole other beast, operating (and training) on a whole other order of magnitude of quantities of artwork that are very opinionated, original, and for which authors/owners have NOT given their consent to be used neither in training, neither in the output. People promoting GenAI dismiss the objections and practice of those owners showing a poor understanding of the process that is art, and a glaring contempt of the copyright law. Did it train on copyrighted works? Yes. Does it track how? No. Does it compensate people? No.
Does it make comparable quality work in output? No, because it's automated and it completely misses the point.
Does this threaten the original artists who put in the work, then? Yes, because a lot of people who have money (hence power) but shit taste and no understanding of the art process believe it can replace real people trained in and dedicated to that process and the particular media they work with. And they invest their money where they believe it will further this replacement and make them more money.
But it literally, from start to finish, makes no sense. And that's precisely the point of the process that is art: through actual personal and group work, making sense out of something.
A machine, an algorithm, does not do that. The art is mangled in the training/labelling process. The prompt is crap, and always will be, compared to the specifics and accidentals of the original work used in the training step.