Aren't these essentially pre-configured prompts? I see custom GPTs as helpful, shareable prompt templates. Maybe it's also a good strategy to convert people to a paid ChatGPT plan.
I also tried to create a Hacker News GPT that integrates with the HN Algolia API, but had a hard time making it call the correct API endpoints and handle edge cases: https://chat.openai.com/g/g-BIfVX3cVX-hackernews-gpt
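For context, the HN Algolia search API is just a public REST endpoint, so the call the GPT action needs to make is simple; the hard part is getting the model to construct it reliably. A minimal sketch of building a request URL for that endpoint (the helper name and defaults here are my own, not part of the API):

```python
import urllib.parse

def hn_search_url(query: str, tags: str = "story", page: int = 0) -> str:
    """Build a request URL for the public HN Algolia search API.

    `tags` filters results (e.g. "story", "comment"); `page` is zero-based.
    """
    base = "https://hn.algolia.com/api/v1/search"
    params = urllib.parse.urlencode({"query": query, "tags": tags, "page": page})
    return f"{base}?{params}"

# Fetching and parsing the JSON response is then a plain HTTP GET,
# e.g. with urllib.request or requests; each hit carries fields like
# "title", "url", and "objectID".
print(hn_search_url("custom GPTs"))
```

In my experience the model tends to get the base URL right but fumbles things like the `tags` filter and pagination, which is exactly the kind of edge case I couldn't make the GPT handle consistently.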
> Aren't these essentially pre-configured prompts?
They also include custom knowledge and actions, and I don't think these are simply appended to the prompt. As far as I know, though, OpenAI hasn't published how this data is fed into the model internally.
In general, I've had mostly bad experiences with models that use retrieval or external APIs. It's just too brittle, because the model needs a very precise understanding of the request to retrieve what you want, not fuzzy language. By far the best experience is when you can put everything directly into the prompt.
I think they have a way to go.
I'm really not on board with the current hype. It's not useful for my work (web development), but if I have to write something in business attire, I do find it useful.
It’s still nothing major.
It's all about exploring the potential of GPTs!