Playing around with huge models, inferencing, training, etc. (just for learning purposes)
Running modest general-purpose, code, and image-generation LLMs/models while my PC is on. The goal is to replace the GPT-4/Claude subscriptions and have a usable local AI setup that's integrated with my tools as well
Actually using the PC for work, not as a server. This means I need to be able to dual-boot between desktop Linux and Windows without constant driver/upgrade problems
PyTorch/TensorFlow for creating and training my own models
Some Blender and 3D stuff
Questions =>
Does it really matter which 4090 I pick, and why? If it does, should I go for a liquid-cooled card (like the MSI Suprim X) or an air-cooled one?
Can I split the training load between the cards even with self-created models in PyTorch/TensorFlow, like I can with Ollama and other pre-trained LLMs?
7950X3D vs 7970X - is the Threadripper really worth it? Will it improve my 4090s' performance by being a more powerful companion, or will I just hurt myself money- and performance-wise with a server CPU?
How much RAM, and what is the thought process for arriving at a number? And depending on the motherboard, can I run that amount of RAM at full speed?
I live in India currently but will probably move to the US in the next 2-3 years, which means I'll run into the 1600 W circuit limit. Keeping that in mind, should I even consider a 2000 W PSU, since that buffer won't ever be utilized anyway, or are there workarounds? Also, how far above their advertised power do CPUs/GPUs spike under load?
In the future, will a third or even a fourth 4090 make sense in terms of price-to-performance, or would I be better off with a dual A6000 (or equivalent) setup at that point?
How do I go about selecting the right motherboard (with the right number of PCIe x16 slots), since PCPartPicker and online reviews are no help?
Is the H9 Flow big enough to comfortably fit and air-cool dual 4090s, or should I go for a taller Lian Li case?
I know this is a dumb question, but which model and inference workload will push these cards to the absolute limit?
Feel free to suggest any corrections or ideas for my current spec list
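On the question of splitting training across both cards: yes, PyTorch can shard each batch across multiple GPUs even for self-written models. A minimal sketch using `nn.DataParallel` (the toy model here is a made-up stand-in; for serious training, `DistributedDataParallel` is the recommended approach):

```python
import torch
import torch.nn as nn

# Hypothetical toy model -- stands in for any self-written nn.Module.
model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10))

# DataParallel replicates the model on every visible GPU and splits each
# batch across them; with no (or one) GPU it just runs on a single device.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)  # would use both 4090s automatically

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

x = torch.randn(32, 64, device=device)  # batch of 32, sharded 16/16 on 2 GPUs
out = model(x)
print(out.shape)  # torch.Size([32, 10])
```

Note that gradients are synchronized over PCIe on consumer boards (4090s have no NVLink), so scaling is good for data-parallel training but a single model larger than 24 GB of VRAM still won't fit on one card without model-parallel or offloading tricks.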
Current Specs In Mind =>
CPU : 7950x3D (if I decide against the threadripper, otherwise 7970x)
CPU Cooler : Noctua NH-D15
GPU : 2x 4090s
Motherboard : ProArt X670E-Creator WiFi or ASUS TUF Gaming X670E-Plus (I think both of these have the right number of slots??)
Memory : 128 GB (4x 32) G.SKILL Ripjaws
Primary Storage : 4x Samsung 990 Pro 2TB SSDs (I test multiple distros and kernels at the same time and it's just easier to dedicate a drive to each one rather than make multiple partitions)
Secondary Storage : WD Blue SA510 4 TB 2.5" SSD (sata)
Case : NZXT H9 Flow
PSU : be quiet! Dark Power Pro 13 1600 W 80+ Titanium
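On the 128 GB choice, one common thought process is to size system RAM around the largest model you expect to stage or offload, plus everything else the machine runs. A back-of-the-envelope sketch (all numbers below are illustrative assumptions, not measurements):

```python
# Back-of-the-envelope RAM sizing -- every number here is an assumption
# picked for illustration, not a benchmark.
params = 70e9            # largest model you might stage, e.g. 70B parameters
bytes_per_param = 2      # fp16/bf16 weights
weights_gb = params * bytes_per_param / 1e9   # full-precision-ish footprint
overhead_gb = 32         # OS, browser, IDE, Blender, dataset buffers

print(f"fp16 weights: ~{weights_gb:.0f} GB")            # ~140 GB
print(f"4-bit quant:  ~{weights_gb / 4:.0f} GB")        # ~35 GB
print(f"comfortable:  ~{weights_gb / 4 + overhead_gb:.0f} GB total")
```

By this math, 128 GB comfortably covers quantized 70B-class models plus offload headroom, while full fp16 70B weights would not fit; that's one concrete way to land on a number instead of guessing.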
But if you want as quiet as possible, liquid cooling will [almost] always beat air cooling
The problem with liquid cooling is ensuring you can still upgrade whatever you have "liquid cooled" later if you want/need to