What happened to all the GPT-3 products?
I remember the flurry of product demos that were essentially "I built a GPT-3 app to automate Thing X". I think this peaked when we saw GPT-3 generating React code (or at least it appeared to).
Have any of these materialized?
All the GPT-3 products seemed to be overpriced.
They were nice - there were things that generated lyrics and such - but the payment model would be something like 14 cents per roll, and that's not worth it. The only pricing scheme I liked was AI Dungeon's.
Maybe it's a little overpriced - everyone thinks they can use AI for everything, and perhaps they're right, but the solution is a bit heavyweight for now. I have a site that I'd love to plug into one of these (http://random-character-generator.com). I don't think it needs anything as hardcore as GPT-3, but there aren't a lot of options out there.
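For something that doesn't need to be as hardcore as GPT-3, a locally run GPT-2 could be one option. This is just a rough sketch using the Hugging Face transformers library; the prompt and the character-generator use case are illustrative assumptions, not anything the site actually runs.

```python
# Sketch: generating character blurbs with a small local GPT-2 model
# instead of paying per call to GPT-3. Prompt text is a made-up example.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the "random" characters reproducible while testing

prompt = "Character: a retired starship mechanic who"
results = generator(prompt, max_new_tokens=40, num_return_sequences=3)

for r in results:
    print(r["generated_text"])
    print("---")
```

The output quality is obviously nowhere near GPT-3, but for short flavor text it might be good enough, and there's no per-roll cost.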
We'll see now that the hype has died down. Then again, I wouldn't want to build my business on top of someone else's API.
OpenAI wins by default if they start to compete against you.
Most of the initial products are solutions in search of a problem. They looked cool because of the AI novelty, but they need to solve a real problem to see sustained usage. Eventually some killer app will emerge, or the OpenAI team will have to figure it out on their own. This is a common challenge for technology platforms.
Recently I've received a lot of e-mails from VCs that seem to be auto-generated and just rephrase text from our website. So far none of the VCs have admitted this, of course, but I'm pretty sure they don't write those by hand. I'm not sure whether they're based on GPT-3 either, but the timing might suggest so.
AI Dungeon is a text-based game where the player chats with GPT. I played it several months ago.
GPT-3 access appears to be hard to get from OpenAI, which has probably limited experimentation. That's possibly why we're not seeing good products yet.
It's not going to happen overnight. That doesn't mean the products aren't revolutionary, just that the training sets, infrastructure, know-how, etc. aren't there yet. It's not a simple case of importing the model and running it: you need a second dataset to fine-tune the model to your needs. An analogy for this post: it's like coming up with a new piece of theory for electric cars and then asking why it isn't deployed on the roads six months later.
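To make the "second dataset" point concrete, here's a minimal sketch of what domain fine-tuning looks like, using a small GPT-2 model as a stand-in (GPT-3's weights aren't public, so tuning it goes through OpenAI's hosted service instead). The file name and hyperparameters are placeholder assumptions.

```python
# Sketch: fine-tuning GPT-2 on a domain-specific text file so its outputs
# match your product's needs. "domain_corpus.txt" is an assumed placeholder
# containing one example of your target text per line.
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("gpt2")

dataset = load_dataset("text", data_files={"train": "domain_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True,
                                 remove_columns=["text"])

# mlm=False -> plain causal language-modeling objective
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="tuned-model",
                           num_train_epochs=1,
                           per_device_train_batch_size=4),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
trainer.save_model("tuned-model")
```

Collecting and cleaning that corpus is usually the hard part, which is exactly the infrastructure and know-how gap the comment is describing.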