The best use case for what you're talking about is to have it write pseudocode that you have no intention of running. For example, if you're trying to figure out how to architect or connect some things and you haven't done that before, a conversation with ChatGPT might help you see how to put the parts together. This approach has the advantage that you can ask for the example in another language, or have it explain individual parts of the syntax to you.
However... the code it gives you won't be the real parts and won't actually work. So not only is it not a programming language, it isn't even a great tool for teaching programming. I would say GitHub Copilot is a better "buddy" to have at your side if you're learning programming. (And even better if you're already good, because it frequently suggests the correct completion for what you're already typing.)
If so, then deciding the answer is "No" doesn't require a discussion of what a programming language is (though I'd argue a programming language needs to keep the programmer in control, for some definition of "in control", and I doubt using ChatGPT does that).
Plus, imagine a small layer around the LLM that takes the generated code, compiles it (if needed), and executes it.
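A minimal sketch of what such a layer could look like (in Python, assuming the model wraps its answer in ``` fences; the actual LLM API call is left out and the response is just passed in as a string):

```python
import re
import subprocess
import sys
import tempfile


def run_generated_code(llm_response: str) -> subprocess.CompletedProcess:
    """Extract the first fenced code block from an LLM reply and run it.

    llm_response is assumed to be the raw text returned by whatever chat
    or completion API you're using; producing it is outside this sketch.
    """
    # Pull out the first ```...``` block; fall back to the raw text if none.
    match = re.search(r"```(?:python)?\n(.*?)```", llm_response, re.DOTALL)
    code = match.group(1) if match else llm_response

    # Write the snippet to a temp file and run it in a fresh interpreter,
    # so a syntax error or crash doesn't take down the wrapper itself.
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name

    return subprocess.run([sys.executable, path], capture_output=True, text=True)


# Hypothetical usage: check the result and, on failure, you could feed
# result.stderr back into the next prompt to ask the model for a fix.
result = run_generated_code("```python\nprint('hello from the LLM')\n```")
print(result.returncode, result.stdout, result.stderr)
```

The interesting part isn't the execution itself but the loop you could build on top of it: compile/run errors become feedback for the next prompt.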