HACKER Q&A
📣 djangovm

What stack to use for LLM on my local Mac?


For learning purposes, I want to solve the following problem:

Take my notes --> feed them into a model that can run on my Mac (32GB M1 Pro, 10 cores) --> ask questions before my next meeting to get some context based on my notes.

What stack should I use, and how do I start? Please assume zero knowledge on my part besides Python, Java, a theoretical understanding of vector search, and basic ML.


  👤 givenkiban1 Accepted Answer ✓
Use Ollama to manage and download models locally.

There's a Python `ollama` module that lets you run inference against those models.

That's the first part.
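To sketch the second part (retrieval plus question answering): Ollama serves a local HTTP API on port 11434 by default, so a stdlib-only version of the asker's pipeline might look like the following. The keyword-overlap retrieval is a deliberately crude stand-in for real vector search, and the model name `llama3` is an assumption (it must have been pulled with `ollama pull llama3` and `ollama serve` must be running):

```python
import json
import re
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint


def _words(text: str) -> set[str]:
    """Lowercase alphanumeric tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))


def top_chunks(notes: str, question: str, k: int = 3) -> list[str]:
    """Crude retrieval: rank paragraph-sized chunks by word overlap with the question."""
    chunks = [c.strip() for c in notes.split("\n\n") if c.strip()]
    q_words = _words(question)
    ranked = sorted(chunks, key=lambda c: -len(q_words & _words(c)))
    return ranked[:k]


def ask(notes: str, question: str, model: str = "llama3") -> str:
    """Send the most relevant note chunks plus the question to a local Ollama model."""
    context = "\n\n".join(top_chunks(notes, question))
    payload = {
        "model": model,  # assumption: this model has already been pulled
        "messages": [
            {"role": "system", "content": f"Answer using only these notes:\n{context}"},
            {"role": "user", "content": question},
        ],
        "stream": False,  # return one JSON object instead of a token stream
    }
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

Swapping `top_chunks` for embedding-based similarity (e.g. via an embedding model served by Ollama and cosine distance) is the natural next step once the plumbing works.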


👤 friendlynokill
I use LM Studio: https://lmstudio.ai/
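LM Studio can also run a local server that speaks the OpenAI-compatible chat API (default `http://localhost:1234/v1`). A minimal sketch, assuming that server is running with a model loaded — the model name is a placeholder, since LM Studio answers with whichever model is loaded:

```python
import json
import urllib.request

LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"  # LM Studio's default server


def build_chat_request(notes: str, question: str) -> dict:
    """Build an OpenAI-style chat payload that grounds the answer in the notes."""
    return {
        "model": "local-model",  # placeholder; the loaded model is used regardless
        "messages": [
            {"role": "system", "content": f"Answer using only these notes:\n{notes}"},
            {"role": "user", "content": question},
        ],
    }


def ask(notes: str, question: str) -> str:
    """POST the payload to the local server and return the assistant's reply."""
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=json.dumps(build_chat_request(notes, question)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because the endpoint is OpenAI-compatible, the same payload shape works unchanged if you later point it at a different backend.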