Any suggestions?
For something more tailored to LLMs and less math-intensive, "Artificial Intelligence: A Guide for Thinking Humans" by Melanie Mitchell provides a solid overview of the field while touching on the societal implications and philosophy behind AI, which can be a refreshing complement to the more technical material.
Lastly, if you want the latest on GPT-3, OpenAI's paper "Language Models are Few-Shot Learners" (available on arXiv) is worth digging into. It's not a book, but it's written by the model's creators and gives an in-depth look at its architecture, capabilities, and limitations without third-party interpretation or excessive simplification.
Pair these with the original papers and blog posts from OpenAI regarding ChatGPT and you'll have a well-rounded view that's both deep and accessible. Remember to occasionally check out new content on arXiv and follow relevant AI researchers on Twitter for the latest insights. Best of luck in your endeavor to master the knowledge of LLMs!
Good books on this topic:
1) are rare
2) take time to write
I also don't know whether there's much incentive to spend a lot of time on an elaborate piece of work when it's likely to be obsolete within 1-2 years. There's certainly an incentive to *look* like you've written "the book" on LLMs, which is why the space is flooded with rushed, low-quality books.
The good thing is that you can learn everything you need to know without a book on the topic: papers, tutorials, videos, code repositories, etc.
That might give you some answers.