HACKER Q&A
📣 saliagato

Is an automated support chatbot powered by GPT-3 a bad idea?


I'm exploring the possibility of creating a support chatbot with GPT-3. Given how well GPT-3 can perform when fine-tuned on a large dataset, I think it could power a support chatbot.

Support chatbots need to handle a wide range of topics, and GPT-3's ability to generalize from a relatively small amount of data could make it well suited to the task.

To train the chatbot, I would collect a dataset of customer support conversations and fine-tune GPT-3 on it.
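To make that plan concrete, here is a minimal sketch of turning raw support transcripts into the prompt/completion JSONL records that GPT-3 fine-tuning expects. The transcript structure, file name, and prompt formatting are my own assumptions, not a definitive pipeline:

```python
import json

# Hypothetical raw transcripts: each is a list of (speaker, text) turns.
transcripts = [
    [("customer", "My invoice shows the wrong amount."),
     ("agent", "Sorry about that! I can correct the invoice for you.")],
    [("customer", "How do I change my email address?"),
     ("agent", "Go to Settings > Account and click 'Edit email'.")],
]

def to_examples(transcript):
    """Turn each customer -> agent exchange into one prompt/completion pair.

    GPT-3 fine-tuning consumes JSONL records of the form
    {"prompt": ..., "completion": ...}; a fixed separator at the end of
    the prompt and a leading space in the completion are conventional.
    """
    examples = []
    for (spk_a, text_a), (spk_b, text_b) in zip(transcript, transcript[1:]):
        if spk_a == "customer" and spk_b == "agent":
            examples.append({
                "prompt": f"Customer: {text_a}\nAgent:",
                "completion": " " + text_b,
            })
    return examples

# Write one JSON object per line, as the fine-tuning endpoint expects.
with open("support_finetune.jsonl", "w") as f:
    for transcript in transcripts:
        for ex in to_examples(transcript):
            f.write(json.dumps(ex) + "\n")
```

You would then point a fine-tuning job at the resulting file; how many such examples are enough is the real open question, and a handful like this is certainly too few.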

One challenge with this approach is that the chatbot might not handle unusual or unexpected customer queries well, but this shouldn't be a problem if the dataset is large enough.

I'm not sure if any business would ever use it.

Can you find any reasons why creating a business around this would be a bad idea?


  👤 bell-cot Accepted Answer ✓
How large, wide, and well-curated a training set do you have for it?

How exposed to malicious users will this chatbot be?

What's your PR and legal exposure, if the chatbot sometimes says regrettable things?


👤 warning26
The key problem you're going to run into is the bot making promises it can't actually keep.

For example, a real support tech might say something like "Sure, let me reset your account information." If your bot says that, it won't actually be able to do it, and you'll be left with a frustrated customer.


👤 drdeca
Maybe you could use it as an autocomplete to make an actual support person able to handle more queries simultaneously? Idk.