I started getting too many responses that were clearly incorrect. I'd point out the correction, to which the response was, "You are indeed correct [rephrases the answer correctly]"
Okay, great, but what about all the responses where I don't actually know the topic enough to know it's incorrect?
Lately ChatGPT's confidence has been really high while it hallucinates in the middle of really basic technical stuff. It will say what a command should be for something really mundane like the `ip` command in Linux and then sprinkle in things that don't exist... you start to lose confidence.
I think LLMs are better than search engines but if I have to fact check everything, I'll switch to another LLM or go back to a search engine.
I don't know what has changed in ChatGPT but it's worse lately.
I still find ChatGPT 4 well worth the money; the coding is way better than 3.5. I wonder what their system prompt is, because I don't get as good results from the API (maybe I'm doing it wrong).
If you want to use GPTs or generate images with Dall-E, then ChatGPT Plus is a no-brainer, I guess.
But if your focus is on GPT-4, then I highly recommend using the API instead.
There are multiple pros to using an API key:
- You pay for what you use. I've been using an API key exclusively, and most months it costs me around $5-$10
- Your data is not used for training. This is important for a privacy-minded user like me. You can disable training in ChatGPT too, but you will lose chat history
- No message limit. There are rate limits to prevent abuse, but generally you don't have the message cap that ChatGPT imposes
- You can choose previous GPT models or other open-source models via
Depending on the application, you can also get these:
- Access to multiple AI services: OpenAI, Azure, or OpenRouter
- Local LLMs via Ollama etc.
- Build custom AI workflows
- Voice search & text-to-speech etc.
- Deeper integrations with other apps & services
There are also a few cons:
- No GPTs support yet
- If you use Dall-E a lot, then ChatGPT Plus is more affordable; generating images via the Dall-E API can be quite expensive
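To get a feel for the pay-per-token point above, here is a tiny cost estimator sketch. The per-token prices below are illustrative placeholders only (check OpenAI's pricing page for real numbers), and the function name is my own, not part of any SDK:

```python
# Sketch: estimate the cost of one API request from token counts.
# PRICES holds placeholder figures (USD per 1,000 tokens), NOT current
# rates -- they are here only to show the pay-per-use arithmetic.

PRICES = {  # model -> (input price, output price) per 1K tokens
    "gpt-4": (0.03, 0.06),
    "gpt-3.5-turbo": (0.0005, 0.0015),
}

def chat_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for a single request, given token counts."""
    inp, out = PRICES[model]
    return input_tokens / 1000 * inp + output_tokens / 1000 * out

# A longish chat turn: 1,500 tokens of context in, 500 tokens out.
print(round(chat_cost("gpt-4", 1500, 500), 4))           # 0.075
print(round(chat_cost("gpt-3.5-turbo", 1500, 500), 4))   # 0.0015
```

The gap between models is why the "don't use GPT-4 for simple tasks" advice matters: the same request can differ in cost by more than an order of magnitude.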
Edit: Some tips when using an API Key:
- You pay for the tokens used (basically the combined length of your questions and the AI's answers). The price per chat message is not high, but the app usually has to resend the whole conversation to OpenAI with every message, which is what makes it expensive. Make sure to pick an app that allows you to limit the chat context.
- Don't use GPT-4 for tasks that don't require deeper reasoning capabilities. I find that GPT-3.5 Turbo is still very good at simple tasks like grammar fixes, improving your writing, translations...
- You can even use local LLMs if your machine can run Ollama
- Use different system prompts for different tasks: for example, I have one prompt for coding tasks and a different one for writing tasks. It usually gives much better results.
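"Limit the chat context" from the tips above can be as simple as keeping the system prompt plus only the newest messages that fit a token budget. Here is a minimal sketch; the ~4-characters-per-token estimate is a rough heuristic (a real app would use a tokenizer like tiktoken), and all names are illustrative:

```python
# Sketch: trim conversation history to a token budget before sending
# it to the API. Token counts are rough estimates, not exact.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token."""
    return max(1, len(text) // 4)

def trim_context(messages, budget=3000):
    """Keep the system prompt plus the newest messages within budget."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    used = sum(estimate_tokens(m["content"]) for m in system)
    kept = []
    for m in reversed(rest):  # walk from newest to oldest
        cost = estimate_tokens(m["content"])
        if used + cost > budget:
            break
        kept.append(m)
        used += cost
    return system + list(reversed(kept))

history = [
    {"role": "system", "content": "You are a concise coding assistant."},
    {"role": "user", "content": "old question " * 200},
    {"role": "assistant", "content": "old answer " * 200},
    {"role": "user", "content": "What does `ip addr show` do?"},
]
trimmed = trim_context(history, budget=700)
print([m["role"] for m in trimmed])  # ['system', 'assistant', 'user']
```

With a tight budget the old turns are dropped first, so each request stays cheap while the latest question and the system prompt always go through.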
Shameless plug: I've been building a native ChatGPT app for Mac called BoltAI[0]; give it a try.
[0]: https://boltai.com
If it's about model output, I highly recommend custom GPTs. Taking 15 minutes to an hour to play around with a custom prompt to get it to work how you want is incredibly worth it.
It kind of reminds me of my Quest VR.
For me, the only thing missing from the web UI is the ability to search past chats.
Only the mobile app offers this feature.
But I'd never pay for it myself; it's not worth it at the current quality.
I was trying to translate a short text and it refused because it contained a part that violated their terms. Wth.