HACKER Q&A
📣 ramesh31

Minimum viable home lab for training LLaMA-sized models?


  👤 brucethemoose2 Accepted Answer ✓
A 48GB Nvidia GPU for LLaMA 65B. Maybe 2x24GB?

Less for LLaMA 33B, which is still very good.

Use a repo that supports QLoRA, like https://github.com/OpenAccess-AI-Collective/axolotl
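The 48GB figure follows from QLoRA quantizing the base model to roughly 4 bits per weight, with the trainable LoRA adapters and optimizer state adding comparatively little. A rough back-of-envelope sketch (the `bytes_per_param` and `overhead_gb` values are assumptions, not measured numbers; real usage depends on sequence length, batch size, and adapter rank):

```python
def qlora_vram_gb(n_params_billion: float,
                  bytes_per_param: float = 0.55,
                  overhead_gb: float = 8.0) -> float:
    """Rough VRAM estimate for QLoRA fine-tuning.

    Assumes ~4-bit (NF4) base weights (~0.5 bytes/param plus quantization
    constants), with a flat allowance for LoRA adapters, optimizer state,
    and activations. A sketch, not a guarantee.
    """
    weights_gb = n_params_billion * bytes_per_param
    return weights_gb + overhead_gb

print(f"LLaMA 65B: ~{qlora_vram_gb(65):.0f} GB")  # fits on a 48GB card
print(f"LLaMA 33B: ~{qlora_vram_gb(33):.0f} GB")
```

By this estimate 65B lands just under 48GB, which is why a single 48GB card (or 2x24GB with model sharding) is about the floor, while 33B leaves more headroom.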