These days when I do literature reviews on trending research, I find that most of the useful models need an insane amount of GPU power to train. As a student in a small lab, we lack GPU resources. Is it still meaningful for me to stay in this lab to learn deep learning? I feel like all my work is just a toy... feeling lost, any advice?
Thankfully, you can get pretty good results finetuning on a much smaller, cheaper GPU like a 3060 Ti. Going from GPU-poor to GPU-rich is easier than you might think.
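For concreteness, here is a minimal sketch of the kind of parameter-efficient finetuning (LoRA via Hugging Face's peft library) that fits comfortably in the ~8 GB of VRAM a 3060 Ti gives you. The model, dataset, and hyperparameters below are just placeholders, not a recipe you have to follow:

```python
# Minimal LoRA finetuning sketch for an ~8 GB consumer GPU.
# Model name, dataset, and hyperparameters are illustrative placeholders.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, Trainer, TrainingArguments
from peft import LoraConfig, get_peft_model

model_name = "gpt2"  # placeholder; swap in any small causal LM you care about
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# LoRA trains only small adapter matrices (a fraction of the parameters),
# which is what keeps optimizer state and gradients cheap on a small card.
lora = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()

# A tiny slice of data and short sequences keep activation memory low.
data = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")

def tokenize(batch):
    out = tokenizer(batch["text"], truncation=True, max_length=256, padding="max_length")
    out["labels"] = out["input_ids"].copy()
    return out

data = data.map(tokenize, batched=True, remove_columns=data.column_names)

args = TrainingArguments(
    output_dir="lora-out",
    per_device_train_batch_size=4,
    gradient_accumulation_steps=4,  # simulate a larger batch without more VRAM
    fp16=True,                      # mixed precision halves activation memory
    num_train_epochs=1,
    logging_steps=50,
)

Trainer(model=model, args=args, train_dataset=data).train()
```

The knobs that matter most on a small card are the ones above: a small base model, LoRA instead of full finetuning, short sequences, mixed precision, and gradient accumulation in place of a big batch size.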