More obvious things:
* Internet enabled models (WebGPT)
* More context / bigger memory (GPT-4)
* Bigger models / more params (GPT-4)
* ChatGPT you can talk to in voice, back and forth (this will be a cool demo, but it won't be useful until we can run it locally with low latency)
* MSFT will integrate ChatGPT into all MS office products (Clippy 2.0)
Less obvious things:
* Open-source text models get more popular - e.g. GLM-130B, which its authors report is superior to GPT-3 in zero- and one-shot learning
* ChatGPT competitors, now that we understand how it was built (for example, built on PaLM)
* Lots of personalized generative art (apply styles to your house, your pets, your product photos)
* AI generated astroturfing
* We'll start running out of text and image data for training models at massive scale
What else?
I'm not sure how much computing power or disk space all of this will require.
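On the disk-space question, a rough back-of-envelope is possible: the weights of a model take roughly (parameter count) × (bytes per parameter) on disk. This sketch assumes nothing beyond that arithmetic; the example sizes (130B for GLM-130B, 175B for a GPT-3-scale model) are the publicly stated parameter counts.

```python
# Back-of-envelope: disk space needed just to store a model's weights.
def weights_size_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Weight storage in GB at a given precision
    (2 bytes/param = fp16, 4 bytes/param = fp32)."""
    return n_params * bytes_per_param / 1e9

# GLM-130B at fp16: ~260 GB of weights alone
print(weights_size_gb(130e9))       # 260.0

# A 175B-parameter (GPT-3-scale) model at fp32: ~700 GB
print(weights_size_gb(175e9, 4))    # 700.0
```

This counts only the weights; training adds optimizer state and activations on top, and the training *data* (hundreds of billions of tokens) is a separate, comparably large footprint.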