Where do you host your Llama 2 model?
I'm currently hosting it on AWS, but it's proving quite expensive, especially during development. I'm actively exploring more cost-effective options, as I'm concerned about server costs once the project moves into production.
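One commonly suggested way to trim development costs (a hedged sketch, not necessarily what the poster uses) is to load a 4-bit quantized Llama 2 7B with Hugging Face `transformers` and `bitsandbytes`, so it fits on a smaller, cheaper GPU instance or even a local workstation while iterating. The model ID and packages below are assumptions for illustration.

```python
# Minimal sketch: load Llama 2 7B chat in 4-bit quantization to cut VRAM needs.
# Assumes access to the meta-llama/Llama-2-7b-chat-hf weights and that
# transformers, accelerate, and bitsandbytes are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed model ID for illustration

# 4-bit NF4 quantization roughly quarters the memory footprint vs. fp16,
# which is what lets the model run on a much cheaper GPU during development.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # let accelerate place layers on the available GPU
)

prompt = "Summarize the trade-offs of on-demand vs. spot GPU instances."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For production traffic the calculus changes, but for development this kind of quantized setup can keep you off the largest (and priciest) instance types.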