HACKER Q&A
📣 hospitalJail

What are you using to host local LLM models that are accessible via API?


I briefly tried Oobabooga months ago, but never got llama.cpp working. What are people using for this?

I will need to fine-tune, and as mentioned in the title, I want to host the model on a computer and access it via API.
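
To be concrete, the kind of setup I'm after looks roughly like the sketch below: a model served locally behind an OpenAI-style HTTP endpoint that I can hit from a script. This assumes something like llama.cpp's bundled llama-server, which I understand exposes a /v1/chat/completions route; the host, port, and model name here are placeholders, not a working config.

    # Sketch of the client side I have in mind. Assumes a local server
    # (e.g. llama.cpp's llama-server) exposing an OpenAI-compatible API.
    # Host, port, and model name are placeholders.
    import requests

    def ask(prompt: str) -> str:
        # POST a chat completion request to the local endpoint
        resp = requests.post(
            "http://localhost:8080/v1/chat/completions",
            json={
                "model": "local-model",  # placeholder; many local servers ignore this
                "messages": [{"role": "user", "content": prompt}],
                "temperature": 0.7,
            },
            timeout=120,
        )
        resp.raise_for_status()
        # Return the assistant's reply text
        return resp.json()["choices"][0]["message"]["content"]

    if __name__ == "__main__":
        print(ask("Hello from my local model"))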

Any thoughts to get me closer would be appreciated.


  👤 compressedgas Accepted Answer ✓