A) Search past HN comments on hn.algolia.com, or
B) Post a new 'Ask HN'.
LLMs could provide a new way to find answers within a corpus. Approaches like this have been described elsewhere, e.g. (a rough sketch of the retrieval-plus-LLM pattern follows the links below):
- https://github.com/openai/openai-cookbook/blob/main/examples...
- https://news.ycombinator.com/item?id=34477543
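For the curious, the gist of that cookbook-style approach is roughly: embed the comments, pull the most relevant ones for a question, and let the model answer from that context only. This is just a toy sketch under my own assumptions, not anyone's actual implementation; the model names, the tiny in-memory "corpus", and the prompt are placeholders.

    # Toy retrieval-plus-LLM sketch (assumed model names; corpus is a placeholder)
    import numpy as np
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    comments = [
        "Use hn.algolia.com to search old threads.",
        "Ask HN posts work best with a concrete, answerable question.",
    ]

    def embed(texts):
        resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
        return np.array([d.embedding for d in resp.data])

    corpus_vecs = embed(comments)  # in practice: embed once, store in a vector index

    def answer(question, k=2):
        q = embed([question])[0]
        scores = corpus_vecs @ q  # cosine similarity (these embeddings are unit length)
        context = "\n".join(comments[i] for i in np.argsort(-scores)[:k])
        chat = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {"role": "system", "content": "Answer using only the HN comments provided."},
                {"role": "user", "content": f"Comments:\n{context}\n\nQuestion: {question}"},
            ],
        )
        return chat.choices[0].message.content

    print(answer("Where can I search past HN discussions?"))

Swap the brute-force dot product for a proper vector index once the corpus is millions of comments.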
I keep expecting someone (maybe minimaxir or simonw?) to post a 'Show HN: Get your question answered by the collective wisdom of HN', but no one has so far (unless I missed the submission?).
Is someone already working on this?
Don't get me wrong: the idea could be nice, but isn't it time to think twice about all this before applying the latest technological fad?
To the sibling comment asking about doing this locally: there’s really no need for an LLM, much less for GPT-3. All you need is, well, attention. Sentence-transformer embeddings. Perhaps even just fastText.
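Something like this is all it takes for the embeddings-only route (a rough sketch with sentence-transformers; the model name and sample comments are just placeholders, and for millions of comments you'd want a real index such as FAISS instead of in-memory search):

    # Pure semantic search over comments, no LLM involved
    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer("all-MiniLM-L6-v2")  # small model, runs fine on CPU

    comments = [
        "hn.algolia.com is the fastest way to search old threads.",
        "Ask HN gets better answers when the question is specific.",
        "fastText embeddings are cheap and surprisingly decent.",
    ]
    corpus_emb = model.encode(comments, convert_to_tensor=True, normalize_embeddings=True)

    def search(question, k=2):
        q_emb = model.encode(question, convert_to_tensor=True, normalize_embeddings=True)
        hits = util.semantic_search(q_emb, corpus_emb, top_k=k)[0]
        return [(comments[h["corpus_id"]], h["score"]) for h in hits]

    for text, score in search("How do I find old HN answers?"):
        print(f"{score:.3f}  {text}")

You get back the most relevant existing comments ranked by similarity, which is often all the "answer" you need.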