Codex is too slow. Is there any solution?
The Codex backend is of good quality and the frontend is average, but most importantly it is too slow. I wonder if OpenAI will improve it.
Sonnet and Gemini are just as good and faster. Can't speak for Grok.
It seems to work with fewer issues than CC Opus.
I don’t mind if it takes longer as long as the answer is correct more often.
You can always do other work while one chat is running.
The new 0.47 has better performance now, IMHO.