- It costs more jobs than it creates.
- It's the new meme tech, à la SaaS, Cloud, etc., that I have to tolerate now. I love seeing a "Chat with Bing!" button, that's great.
- It's flush with cash due to the US having tonnes of play money to throw around, enabling irresponsible behavior such as pricing below cost.
- It's an unsolved ethical problem. By comparison, sampling in music requires attribution.
- It lowers the barrier to entry for bad actors to gain legitimacy. Good actors never needed great art, or anything else AI can do, as long as they made things with care and the skills they had.
- The world didn't need more content. It is awash with content already. A lot of the content we have now is not good, and AI isn't going to make it magically better.
A better question is what problems does AI really solve? Are those benefits worth the massive cost?
When I see something and I know that it was created with child labor, it induces the same disgust that AI products do. Perhaps I can do great and good things with some tool or product made with child labor, but that doesn't change the ethical abomination at the core of that product.
If AI isn't paired with UBI, then we are simply on a collision course for the elimination of tens of thousands of, admittedly awful, jobs. What are all those people going to do? Truck drivers, petty artists, call center workers, etc.
"AI" OTOH does not exist unless one adopts a strange definition of "intelligence".
An intelligent person can tell us how they arrived at a conclusion. "AI" cannot.
That's a massive time sink when the conclusion is wrong. The process is a black box.
Even with old search techniques one can understand the process used to arrive at the results. When results are not what we want, we can understand why.
As such, the predictions that AI will "replace all artists" are obviously way overblown. At best, it will be a helpful tool, along the lines of Photoshop or After Effects.
So until this settles, I don't trust it much, not because of the technology itself but because of the people who are milking it at the moment.
Too old for this shit.
(yes, that's a personal problem and doesn't relate to the merit of the technology)
I always thought the term "artificial intelligence" had a sort of disabling effect, as if there were an intelligence outside of ourselves that serves to drive us in X direction, good or bad. "Technological progress" implies we are the ones driving the changes and the problems they will invariably bring. We sort of grasp that this tech will cause profound impacts on society of some vague quality, enough to leave an "ethics" section in every white paper that comes with freely distributed code and instructions for use, yet we continue plowing on regardless of what those impacts could possibly be. How sustainable is this? Will there ever come a time when uploading code or even papers to GitHub for anyone to consume becomes taboo from the stigma and change that's been inflicted on us?
I think the inflection point for those problems creeping into society at a visible everyday level is on a much quicker time scale than AGI. Sometimes I think it's like equipping people with pistols that shoot precision-guided homing bullets - not so much on the scale of a civilization-ending scenario, but it changes the game in its own significant ways. Look at comments accusing others of using ChatGPT to write their responses for them. I think most tech can cause these effects and it's worth questioning what it's meant to accomplish as they're created or used.
At times I wonder if the end stage of any given intelligent civilization is to delegate all parts of its thought process to technology that can be engineered to be superior, with all the consequences that entails, because there's no point in staying stuck forever with the tech that's already there. The thought that scares me the most is that the revolution might not be directed by governments or angry anarchists, but indirectly, by bored machine learning engineers sitting in their rooms contributing just one more paper or PyTorch implementation towards an inflection point in humankind, because it's fun and rewarding to them.
And even if we're supposed to stop advancing this tech to prevent irreversible societal change, would it even be possible if we tried? There are 8 billion of us on Earth and metric tons of GPUs in existence. The question of whether progress can ever be halted, in a state such as ours, in the name of self-preservation is one I'll probably be keeping in mind for the rest of my lifetime.