HACKER Q&A
📣 praveen9920

How do you catch up on research into LLMs, Transformers, etc.?


I have been trying to catch up on the research of the last 5 to 10 years on transformers, GANs, LLMs, etc. There are some amazing resources out there explaining some aspects of it. I have also started reading the original papers for better context. However, most of them refer to earlier papers, and I feel like I am falling into a rabbit hole, discovering more and more research the deeper I go.

How do you keep up with such fast-moving research?

Here are some resources I use:

1. https://paperswithcode.com/

2. Two Minute Papers (YouTube)

3. https://github.com/diff-usion/Awesome-Diffusion-Models

4. https://www.litmaps.com/ (for finding references; see the sketch below)
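Since chasing reference chains is exactly where the rabbit hole opens up, here is a minimal sketch (assuming Python with the `requests` library) that pulls a paper's reference list from the Semantic Scholar Graph API and sorts it by citation count, so you can triage which prior papers are worth reading first. The arXiv ID below is just an example; swap in whatever paper you are reading:

```python
import requests

# Minimal sketch: fetch the references of a paper via the
# Semantic Scholar Graph API (free for light, unauthenticated use).
# Example paper ID: "Attention Is All You Need" (the Transformer paper).
PAPER_ID = "arXiv:1706.03762"
URL = f"https://api.semanticscholar.org/graph/v1/paper/{PAPER_ID}/references"

resp = requests.get(URL, params={"fields": "title,year,citationCount", "limit": 100})
resp.raise_for_status()

# Each entry wraps the cited work under the "citedPaper" key.
refs = [r["citedPaper"] for r in resp.json().get("data", [])]
refs = [r for r in refs if r.get("citationCount") is not None]

# Print the ten most-cited references: a rough proxy for which
# prior papers are "load-bearing" for the one you are reading.
for ref in sorted(refs, key=lambda r: r["citationCount"], reverse=True)[:10]:
    print(f'{ref["citationCount"]:>7}  {ref.get("year")}  {ref["title"]}')
```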


  👤 hahnchen Accepted Answer ✓
This is a question I asked recently:

https://news.ycombinator.com/item?id=38652736


👤 pizza
- check r/LocalLLaMA daily

- follow (amongst hundreds/thousands of other accounts..): ak, aran komatsuzaki, lucidrains, rivershavewings, wizardlm, blinkdl (rwkv), georgi gerganov, thebloke, yannic kilcher, yann lecun, hazyresearch, igor carron, calccon/charles martin, birchlabs, jeremy howard, geohot, patrick kidger, gwern, adam nemecek


👤 friendlynokill
If you haven't already, I highly suggest going through Andrej Karpathy's Neural Networks: Zero to Hero course. He covers some of the history and evolution of neural networks.

https://karpathy.ai/zero-to-hero.html


👤 jerpint
Yannic Kilcher's YouTube channel is filled with very relevant and helpful paper reviews.