Why should I learn BERT/transformers/etc. when LLMs like GPT-4 exist?
It's easier. Setting up a Docker container[0] that replicates OpenAI's endpoints is faster than signing up for an OpenAI account and wiring up payment processing.
Plus, I don't really trust OpenAI's data policies. Running the model locally gives me the option to airgap my inference machine if needed, something OpenAI can't offer until they support on-prem deployments.
[0] https://github.com/go-skynet/LocalAI
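Because LocalAI mirrors OpenAI's API shape, existing client code mostly just needs its base URL repointed at localhost. A minimal sketch of the request you'd send, using only the standard library; the port 8080 and the model name "ggml-gpt4all-j" are assumptions and depend on how your LocalAI instance is configured:

```python
import json
import urllib.request

# Assumed LocalAI base URL; 8080 is a common default, adjust to your setup.
BASE_URL = "http://localhost:8080/v1"

# Standard OpenAI-style chat-completions payload.
payload = {
    "model": "ggml-gpt4all-j",  # hypothetical: use whatever model you loaded
    "messages": [
        {"role": "user", "content": "Classify the sentiment of: 'I love this.'"}
    ],
}

req = urllib.request.Request(
    BASE_URL + "/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Only send the request if a LocalAI instance is actually running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
print(req.full_url)
```

The same request body works against api.openai.com, which is the point: you can swap between the hosted service and an airgapped box without rewriting your client.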
Costs aside, why should I learn older NLP methods when I can simply ask ChatGPT to give me the sentiment of a sentence, classify texts, etc.?
Sounds similar to the popular question: why learn arithmetic when calculators exist?
What do you mean "learn" BERT?