One of the obstacles I was facing was curating, analyzing, and transforming relatively large amounts of data, most of it text, into meaningful information.
With the arrival of GPT it seems that this road is now, if not fully, at least partially open.
I've already done some preliminary testing and it looks very promising.
What would have taken me months (to come up with a rather mediocre solution) now seems achievable in weeks, if not days.
Before I get too excited I want to understand the potential implications of depending on a third-party service like OpenAI.
My biggest concern is OpenAI changing their terms of service or increasing their fees so quickly and disproportionately that it kills my project overnight.
I guess I could use OpenAI to validate my idea, even get some traction, and then try to implement my own GPT using publicly available libraries, somehow collecting the data I need to train the model.
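One common way to keep that migration path open is to hide the provider behind a thin interface from day one, so the rest of the product never calls OpenAI directly. A minimal sketch (the class and function names here are made up for illustration, not from any real library):

```python
from abc import ABC, abstractmethod


class TextCompleter(ABC):
    """Hypothetical provider-agnostic interface; the name is an assumption."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        ...


class OpenAICompleter(TextCompleter):
    """Would wrap the OpenAI API; the call itself is omitted in this sketch."""

    def complete(self, prompt: str) -> str:
        raise NotImplementedError("call the OpenAI API here")


class LocalModelCompleter(TextCompleter):
    """Plan B: a self-hosted open-source model behind the same interface."""

    def complete(self, prompt: str) -> str:
        raise NotImplementedError("run a locally hosted model here")


def summarize(completer: TextCompleter, text: str) -> str:
    # Business logic depends only on the interface, so swapping
    # providers later is a change at the call site, not a rewrite.
    return completer.complete(f"Summarize: {text}")
```

If OpenAI's terms or pricing change, only the one wrapper class has to be replaced.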
Would that even be possible for an average guy with lots of software development experience but only a very rough understanding of NLP, LLMs, and GPT?
I don't need a chatbot or anything like it, and I could use data from my business domain only, so it would be a tiny subset of what OpenAI has.
I know it will probably still be quite large, perhaps on the order of hundreds of millions of tokens, but hopefully by the time that comes it will be financially feasible thanks to Moore's Law (?)
What are people's thoughts on incorporating OpenAI and similar services into their own products and services?
What's your mid/long-term strategy or plan B in case OpenAI isn't viable for you anymore?
We built Video Tap to convert our video content into high-quality blog posts. We used to do this manually with transcripts, but it was still taking a lot of time and energy, and video content alone doesn't really translate well into written content.
Thanks to OpenAI we're able to get almost-publish-ready blog posts in minutes.
https://videotapit.com if you'd like to try it.