> Is Prompt Engineering a Thing?
Yes, it's a dumb name for the skill of modifying your prompts and questions to an LLM so that they produce better results than if you just asked plainly for what you wanted. As language models get better, this skill might become obsolete.
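To make that concrete, here's a rough sketch of the difference - the prompts are hypothetical and the openai Python client is just one way you might send them, not anything specific to a particular setup:

```python
from openai import OpenAI  # pip install openai

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A plain ask: you get whatever the model feels like giving you.
plain = "Summarize this article: <article text>"

# An "engineered" version of the same ask: role, constraints, and output
# format are spelled out, which usually makes the result more usable.
engineered = (
    "You are an editor writing for a technical audience. "
    "Summarize the article below in exactly 3 bullet points, "
    "each under 20 words, and end with a one-sentence takeaway.\n\n"
    "Article: <article text>"
)

for prompt in (plain, engineered):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat model works here
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message.content, "\n---")
```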
> I'm trying to research the subject but I don't see much evidence that companies are racing to hire prompt engineers.
Because it's not really a job. Think of it like using the Google search engine - being able to search well is something you can get better at, but being a "Google search-er" isn't a career or a job you'll see openings for.
Otherwise it'll probably be a nice-to-have skill on top.
Personally, I have experimented with customizing prompts for creating Anki cards, and I guess you could call this prompt engineering:
https://neurotechnicians.com/p/generative-ai-and-anki-part-1...
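Roughly, the idea looks like this - not the exact prompt from that post, just a minimal sketch of asking a model for cards in a format Anki can import:

```python
from openai import OpenAI  # pip install openai

client = OpenAI()  # assumes OPENAI_API_KEY is set

# Hypothetical prompt: the real one is longer and more tuned.
CARD_PROMPT = (
    "Turn the notes below into Anki cards. "
    "Output one card per line, with the front and back separated by a tab "
    "character and no extra commentary, so the result can be imported into "
    "Anki as a tab-separated file.\n\n"
    "Notes:\n{notes}"
)

def make_cards(notes: str) -> str:
    """Ask the model for tab-separated Anki cards built from the notes."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": CARD_PROMPT.format(notes=notes)}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    cards = make_cards("Mitochondria are the organelles that produce ATP.")
    print(cards)  # paste or import the output into Anki
```

Most of the "engineering" ends up living in the prompt text itself: pinning down the output format, the card style, and what to leave out.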
From the perspective of whether there's a skill to writing prompts that get good results from LLMs - yes, definitely. It's just that ~no companies are truly hiring for that as a full-time role.
I wrote more on this a few weeks back here: https://llm-utils.org/How+to+become+a+prompt+engineer - but it basically says what I wrote above, just with some more details.
Ideally, language models should understand a question that isn't well formed, the way Google can take a query that lacks a ton of context and still figure out what the user means.