HACKER Q&A
📣 MetaWhirledPeas

What will happen as AI costs increase?


We are living in the "penetration pricing" phase of AI, where costs are absorbed by seemingly endless investment. What will be the practical fallout when prices inevitably rise?


  👤 CM30 Accepted Answer ✓
A lot of companies will probably shut down or drastically limit their AI usage due to rising costs. A small or medium-sized business dependent on ever-growing AI expenses is in a really bad position, and could well go under.

I heard a few companies ended up going back to hiring actual employees for work that was previously done by LLMs, so there's a chance we could see more of that too. We might also see a few try to make it work with older or local models.


👤 scorpioxy
What always happens. A market correction followed by going back to a reasonable state, until the next bubble of course.

In my opinion, LLMs are useful for many things, but not for anything and everything, and definitely not in the way the boosters are claiming. This is not a popular opinion when you're inside the bubble or have something to gain from it. So when there's a downturn, things will hopefully stabilize, with LLMs becoming just another tool for automating certain things. It feels crazy saying this these days; I've been told I'm out of touch for thinking this way, and who knows, maybe that's true.


👤 krapp
What do you think will happen? How does supply and demand work? Practically every business and government in existence is existentially dependent on AI, speculation on it is the only thing keeping the world from global financial collapse. It's "too big to fail" at a scale that dwarfs the financial crisis of 2008.

You'll pay the fucking danegeld is what you'll do, and keep paying it, because you reorganized your entire existence around and mortgaged your future on a closed proprietary third party service's business model that is now a single point of failure for our entire technological civilization, making its market value practically infinite.

That's a collective "you" there, by the way, not "you" personally.


👤 ipaddr
Fewer people will use the frontier models, and those who do will pay more. Progress will slow. OpenAI will sell your chat data. You will get an AI tax. Companies will use less of it.

Hopefully new ways to deliver similar quality will be discovered.

The stock market will pop.

Prices will go up for people inside the moat.


👤 atleastoptimal
Prices are going down. Just look at open-source models: you can run the equivalent of a SOTA model from 8 months ago on your laptop.

👤 MehdiBelkacem
Token anxiety is real. What worked for me: prompt caching on fixed system prompts cut my Anthropic bill by ~60% overnight. Most devs don't realize cache reads are 10x cheaper than regular input tokens on Claude (writes cost a bit more than uncached input, but you pay that once per cache lifetime).
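For anyone who hasn't tried it, here's a minimal sketch of the request shape for Anthropic's prompt caching: you mark the large, fixed system prompt with a `cache_control` breakpoint so repeated calls read it from cache instead of paying full input price each time. This only builds the payload (the model name and prompt text are placeholders); you'd pass it to the Messages API via the `anthropic` SDK.

```python
# Sketch: a Messages API payload with the fixed system prompt marked
# cacheable. The prompt text and model name here are placeholders.

FIXED_SYSTEM_PROMPT = (
    "You are a support assistant. <long fixed instructions go here>"
)

def build_request(user_message: str) -> dict:
    """Build a request whose system prompt carries a cache_control breakpoint."""
    return {
        "model": "claude-sonnet-4-5",
        "max_tokens": 512,
        "system": [
            {
                "type": "text",
                "text": FIXED_SYSTEM_PROMPT,
                # Written to the cache on the first call, then read from
                # cache (much cheaper than full input price) on later calls.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": [{"role": "user", "content": user_message}],
    }

req = build_request("Where is my order?")
```

The key detail is that everything up to the breakpoint must be byte-identical across calls, which is why this pays off on fixed system prompts but not on content that changes per request.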

Local models for classification/routing, with the frontier model reserved for generation, is the other move; but the latency tradeoff is real if you're in a user-facing flow.
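The split described above looks roughly like this. Both `local_classify` and `frontier_generate` are hypothetical stubs standing in for a small local model and an expensive API call respectively; the heuristic inside the classifier is just for illustration.

```python
# Sketch: route with a cheap local step, escalate only open-ended
# requests to the (expensive, higher-latency) frontier model.
# `local_classify` and `frontier_generate` are hypothetical stubs.

def local_classify(text: str) -> str:
    """Stand-in for a small local model: crude label for the request type."""
    if "?" in text or text.lower().startswith(("how", "why", "what")):
        return "generate"  # open-ended: needs the frontier model
    return "route"         # simple intent: handle locally

def frontier_generate(text: str) -> str:
    """Stand-in for a frontier-model API call."""
    return f"frontier answer to: {text}"

def handle(text: str) -> str:
    label = local_classify(text)
    if label == "generate":
        return frontier_generate(text)  # pay for quality only here
    return f"handled locally: {label}"
```

The win is that the bulk of traffic (greetings, simple intents) never touches the metered API; the cost is the extra local hop in front of every user-facing request.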