My gut feeling is that some (most?) of these advances are not actually that robust or useful. But still, there's so much to keep in mind. I really like working in this field, but how do I keep myself up to date with all these AI achievements and filter the signal from the noise?
This feels different from 3+ years ago. Back then we regularly saw radically different models working better. But there hasn't been a "Transformer moment" or "ResNet moment" or "GAN moment" for quite a while now. In 2014-2018 we had these wow moments every year. Now the wow isn't about new developments but more like "wow, I didn't expect this to work so well with more data."
seriously, i can't even grok the math behind those things; it's way beyond me.
but to be fair, hn posts and comments have been great so far at coming up with possible use cases for these things.
also writing code and prompts as a side gig. i've accumulated over 40 GB of trained models and datasets from various sources.