HACKER Q&A
📣 behnamoh

Function Calling with Local LLMs?


Arguably, the most important reason I've stayed with OpenAI's models is the ability to do function calling. Having searched online for quite a while, I haven't found any free, open-source models that offer this feature.


  👤 gsuuon Accepted Answer ✓
You won't get it built in, but there are plenty of biased-sampling projects that can constrain a model's output to valid JSON. It's actually going to be more reliable with a local model, since you can mask or bias the logits before each token is sampled (a minimal sketch follows the links below).

[1] https://github.com/ggerganov/llama.cpp#constrained-output-wi...

[2] https://github.com/gsuuon/ad-llama (mine)
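
To make the logit-masking idea concrete, here is a minimal sketch using Hugging Face transformers with gpt2 as a stand-in for a local model (both are assumptions, not something from this thread). The character whitelist is deliberately crude: real constrained-output projects, like llama.cpp's grammars, track parser state to decide which tokens are legal at each step.

    import torch
    from transformers import (
        AutoModelForCausalLM,
        AutoTokenizer,
        LogitsProcessor,
        LogitsProcessorList,
    )

    class AllowedTokens(LogitsProcessor):
        """Mask out every token not in the allowed set before sampling."""
        def __init__(self, allowed_ids):
            self.allowed_ids = torch.tensor(sorted(allowed_ids))

        def __call__(self, input_ids, scores):
            mask = torch.full_like(scores, float("-inf"))
            mask[:, self.allowed_ids] = 0.0
            return scores + mask  # disallowed tokens end up at -inf

    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    # Crude whitelist: only tokens built from JSON-ish characters.
    # A real grammar would allow different tokens at different steps.
    json_chars = set('{}[]":,0123456789.truefalsnul -')
    allowed = [i for i in range(len(tok)) if set(tok.decode([i])) <= json_chars]
    allowed.append(tok.eos_token_id)  # let generation terminate

    prompt = 'Reply with a JSON object: {"name":'
    inputs = tok(prompt, return_tensors="pt")
    out = model.generate(
        **inputs,
        max_new_tokens=30,
        do_sample=True,
        logits_processor=LogitsProcessorList([AllowedTokens(allowed)]),
    )
    print(tok.decode(out[0][inputs["input_ids"].shape[1]:]))

Because the mask is applied to the logits, the model literally cannot emit a disallowed token, which is why this approach is more reliable than prompting a hosted model and hoping it returns valid JSON.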