What is everyone else trying to do? Why are they buying whole 1000-GPU clusters? Is each of them trying to train its own LLM from scratch? If not, what are they doing?
- self-driving (e.g. Tesla's Dojo supercomputer)
- computational chemistry, drug discovery, protein folding
- physics simulations
- the many things the US national labs do (https://www.scientificamerican.com/article/new-exascale-supe...)
That’s your answer. Beyond those, it’s hard to imagine what GPT-5 will be capable of.