HACKER Q&A
📣 zaptheimpaler

What can I do with 48GB of RAM?


Hi HN,

After a mobo upgrade, I have ended up with an ungodly 48GB of 3200 MHz DDR4 RAM. This is a ridiculous amount to have on a personal machine for me. What are some cool things I can do with this much RAM?

All ideas are welcome: video/audio editing, databases, running an OS off a ramdisk, anything.


  👤 cehrlich Accepted Answer ✓
Run Slack and MS Teams at the same time

👤 sillysaurusx
I see you mentioned running an OS off a ramdisk. I recommend this, just to see how incredibly fast it can be.

And also how incredibly not-fast. The fact is that most applications are memory-bandwidth bound once you eliminate the disk as a bottleneck, not CPU bound. So when you run off a ramdisk, it doesn't actually help as much as you'd expect.

But! One really neat thing you can do is to save VM checkpoints, so that backing up your computer is as simple as checkpointing the VM. So there are other advantages.

Doing some video editing is fun too, and 3D modeling. Ever want to dabble with ZBrush? Now's your chance. Get yourself a nice big monitor and Wacom tablet. Yum.

(And then, y'know, set the hobby down and never touch it again, just like the rest of us. But it's fun while it lasts.)


👤 georgia_peach
In linux-land, sometimes I'll use a 2-3G ramdisk (tmpfs) for `$HOME/.cache` just to reduce wear-and-tear on my SSD. The web browsers put a ton of junk there, and I rarely reboot my machine.
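A tmpfs mount like that can be declared in `/etc/fstab` so it survives reboots. A sketch, assuming a user named `alice` with UID/GID 1000; adjust the path, size, and IDs to your own setup:

```
# /etc/fstab -- mount a 3G RAM-backed filesystem over ~/.cache
# (assumes user "alice", UID/GID 1000; adjust to taste)
tmpfs  /home/alice/.cache  tmpfs  rw,nosuid,nodev,size=3G,mode=700,uid=1000,gid=1000  0  0
```

Everything written there vanishes on reboot, which is exactly what you want for browser cache junk.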

👤 tanelpoder
Have fun with in-memory columnar databases or SQL engines and see how fast they are (the ones that use CPU-efficient data structures and data element access, plus SIMD processing). For example, Apache Arrow DataFusion [1].

Edit: Also, run a cluster of anything (in VMs or containers) and muck around killing individual cluster nodes or just suspending them/throttling them to be extremely slow to simulate a brownout/straggling cluster node.

[1] https://github.com/apache/arrow-datafusion
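DataFusion is the linked engine; as a zero-dependency stand-in, Python's stdlib `sqlite3` with a `:memory:` database shows the basic idea of querying data that lives entirely in RAM (DataFusion adds the columnar layout and SIMD on top):

```python
import sqlite3

# An entirely in-memory database: no disk I/O on the query path.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user_id INTEGER, bytes INTEGER)")
con.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(i % 100, i * 7 % 1000) for i in range(100_000)],
)

# Aggregate over all 100k rows without ever touching storage.
(total,) = con.execute("SELECT SUM(bytes) FROM events").fetchone()
print(total)
```

With 48GB you can load entire datasets this way and see how much of "database performance" was really just disk latency.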


👤 karmakaze
Make a relatively simple application, say an async video chat app. Build it with 'micro'services for everything (e.g. thumbnail generator, contacts, groups, sending, receiving, email/sms notifications). Deploy all of them in containers with redundancy and use a distributed datastore in VMs (to simulate separate machines, run some of them in different timezones).

Alternatively, try running Elasticsearch to index something.


👤 1MachineElf
You can devote 1/3 of it to CISA's Malcolm, which has a minimum requirement of 16GB: https://github.com/cisagov/Malcolm

As for the other 2/3... ZFS, Google Chrome, or Electron apps maybe?


👤 bravetraveler
If you like playing with different things (Operating systems, misc software) - virtual machines are fun.

I allocate 32 of my 128GB to 'hugepages' - basically reserved areas of memory for the VMs (or other capable software) to use. It helps performance a bit.

Aside from that, I make pretty liberal usage of tmpfs (memory backed storage) for various things. When building software with many files it can make a big difference.

Fedpkg/mock will chew through 40-50GB depending on what I'm building/the chroot involved.
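The hugepage reservation mentioned above is a one-line sysctl on Linux. A sketch, scaled down to the OP's 48GB (8 GiB reserved as 2 MiB pages; the amount is an assumption to adjust):

```
# /etc/sysctl.d/hugepages.conf -- reserve 8 GiB as 2 MiB hugepages
# (4096 pages x 2 MiB = 8 GiB; pick a number that fits your VM allocations)
vm.nr_hugepages = 4096
```

Memory reserved this way disappears from the general pool, so only carve out what your VMs will actually pin.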


👤 fsflover
Install Qubes OS: https://qubes-os.org. Sometimes my 32 GB is not enough.

👤 tiernano
My main workstation has 192GB of RAM (also running twin 20-core Xeon Golds plus a shed load of SSDs/HDDs/NVMes)... Anyway, long story boring, I run VMs for Dev/Test as required... That's the main reason I got such a machine. [EDIT] I am running Windows Server 2022 on this box, not a standard Windows Desktop OS, and use Hyper-V for the Hypervisor...

👤 shanselman
Open a second tab in chrome

👤 itake
1/ If you're into ML, hosting vector search databases can be expensive.

At a super high level, an ML model converts content (text, images, or audio) into vectors (aka embeddings). Similar content should generate similar embeddings, so more RAM lets you keep more embeddings in memory, allowing you to search over more of them. More RAM also makes training models easier.

2/ Data leaks can be fun to explore, but are often gigabytes of data. More ram makes them faster to query.
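Under the hood, a minimal in-memory vector search is just brute-force cosine similarity over every stored embedding. A toy sketch, with made-up 3-dimensional vectors standing in for real embeddings (production systems use approximate-nearest-neighbour libraries instead of a linear scan):

```python
import math

def cosine(a, b):
    # Cosine similarity: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "embedding" index kept entirely in RAM.
index = {
    "cat": [0.9, 0.1, 0.0],
    "dog": [0.8, 0.2, 0.1],
    "car": [0.0, 0.1, 0.9],
}

def search(query_vec, k=2):
    # Rank every stored vector against the query: O(n) per lookup,
    # which is exactly why more RAM = more searchable embeddings.
    ranked = sorted(index, key=lambda w: cosine(index[w], query_vec), reverse=True)
    return ranked[:k]

print(search([0.85, 0.15, 0.05]))
```

Swap the dict for millions of real embeddings and the linear scan is what your 48GB buys you headroom for.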


👤 virtualritz
A normal desktop machine for a 3D visual effects artist doing complex shots and/or (offline) rendering of those commonly has (at least) 64GB of RAM nowadays.

Get Houdini Apprentice and if you like it, upgrade to Houdini Indie and get a free 3Delight license for rendering and try to max that combo out.

Clarisse is another option.


👤 Krisjohn
https://github.com/myspaghetti/macos-virtualbox

Put macOS and Linux (I recommend Ubuntu MATE for desktop) on there as a couple of VMs and poof, a much less ungodly 16GB of RAM per OS.


👤 arghwhat
Assuming a decent OS, all that RAM is already full of filesystem caches, so you are already benefitting from it.

Just use your computer as you planned. There is nothing "cool" about large memory applications anyway. But maybe next time, don't waste money on RAM you don't really need.


👤 anonu
This isn't that much... If you want more, you can go to your favorite cloud provider and spin up many hundreds of gigs of RAM in an instant. Of course it will be slower in the cloud. But still... not much use for this on a personal machine IMO.

👤 bitwize
Spin up a few VMs, build a Kubernetes cluster.

Alternatively, create a VM with an interesting application (for example a Genera VLM) and run that in the background with your other stuff.

Run several Electron apps at once (VSCode, Teams, Slack, etc.).


👤 coldblues
Install Gentoo and set up tmpfs on 2/3 of your RAM for compiling.
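The usual Gentoo recipe is to put the build directory on tmpfs via `/etc/fstab`. A sketch; the 32G size is an assumption for a 48GB box, and some huge packages may need more:

```
# /etc/fstab -- compile in RAM on Gentoo (size is an assumption; adjust)
tmpfs  /var/tmp/portage  tmpfs  size=32G,uid=portage,gid=portage,mode=775,nosuid,noatime,nodev  0  0
```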

👤 johndel
Chess analysis is one of the most RAM consuming processes you can try.
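Engines like Stockfish take their transposition-table size as a plain UCI option, so the RAM goes straight into the hash table. A sketch of the UCI dialogue you'd type at the engine prompt (32768 MB and 8 threads are assumptions sized for a 48GB machine):

```
uci
setoption name Hash value 32768
setoption name Threads value 8
position startpos
go depth 30
```

A bigger hash table means fewer re-searched positions, which is why deep analysis eats RAM so happily.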

👤 LinAGKar
Put /tmp on tmpfs if it isn't already. And use profile-sync-daemon

👤 asdz
Try running Chrome with multiple tabs :)

👤 rvieira
24 Chrome tabs?

👤 charcircuit
Run firefox

👤 jeffbee
Weird approach to the subject. If you didn't have an application in mind, why did you upgrade the computer at all?