HACKER Q&A
📣 JPLeRouzic

LLMs have billions of parameters; what about old inference platforms?


50 years ago there was SHRDLU [0], an early attempt at building AI with reasoning capabilities.

30 years ago there were amazing inference platforms like Cyc [1] that were supposed to address problems similar to those now submitted to ChatGPT-like platforms.

It could deal with special cases because it had a community-of-agents architecture (specialized reasoning modules), each with its own data structure and algorithm.

Yet their sizes were tiny compared to LLMs: roughly 1000 times smaller.

What if they were scaled to today's sizes? Would they still only match ChatGPT-like systems in capability, or would they be better?

[0] https://en.wikipedia.org/wiki/SHRDLU

[1] https://en.wikipedia.org/wiki/Cyc


  👤 isaacfrond Accepted Answer ✓
I found an example of successful reasoning by Cyc (pretty hard to find, actually) [0], from 2006.

Cyc was asked whether the burning of the papal bull by Martin Luther was a social event. It concluded that it was an event, that it must have had attendees, and so it must have been a social event.

ChatGPT-4 finds this an easy question. Possibly it has read the paper, though, so maybe this is not fair. Anyway, here is the output.

Prompt: Would you say that the burning of the papal bull by Martin Luther was a social event?

Yes, the burning of the Papal Bull by Martin Luther can be considered a social event as it had significant social, religious, and political implications during that time. The event occurred on December 10, 1520, when Luther publicly burned the Papal Bull "Exsurge Domine," which had been issued by Pope Leo X earlier that year. This act was a direct challenge to the authority of the Catholic Church and the Pope, and it contributed to the unfolding of the Protestant Reformation, which ultimately resulted in the formation of various Protestant denominations and significant changes in European society. The event attracted attention and support from various sectors of society, and it became a symbol of resistance against the religious and political status quo.

[0]: An Introduction to the Syntax and Content of Cyc.


👤 isaacfrond
You could combine Cyc with an LLM using embeddings. First you select, from the Cyc database, which entries are relevant to your prompt. This can use low-level AI techniques, e.g., compute an embedding of the prompt and look for the database entries closest to it. Then add the relevant entries you found as context to the prompt. The LLM will then have the additional information.

This is already used to generate answers to support questions: you look for relevant FAQs in a knowledge base, then have the LLM write a specific answer using the additional knowledge-base information. A rough sketch of that pipeline is below.
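
Here is a minimal sketch, assuming hypothetical embed() and complete() wrappers around whichever embedding model and LLM API you use; the knowledge-base entries are made-up stand-ins for Cyc assertions, not actual Cyc content.

    # Retrieval-augmented prompting sketch: pick the knowledge-base entries
    # closest to the question by embedding similarity, then pass them to the
    # LLM as extra context. embed() and complete() are hypothetical wrappers.
    import numpy as np

    knowledge_base = [
        "Burning a document is an event.",
        "An event that people attend is a social event.",
        "Martin Luther burned the papal bull Exsurge Domine in 1520.",
    ]

    def embed(text: str) -> np.ndarray:
        """Return an embedding vector for text (plug in your model here)."""
        raise NotImplementedError

    def complete(prompt: str) -> str:
        """Return the LLM's answer to a prompt (plug in your API here)."""
        raise NotImplementedError

    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def answer(question: str, top_k: int = 2) -> str:
        # 1. Embed the question and every knowledge-base entry.
        q_vec = embed(question)
        scored = [(cosine(q_vec, embed(entry)), entry) for entry in knowledge_base]
        # 2. Keep the top_k entries most similar to the question.
        context = [entry for _, entry in sorted(scored, reverse=True)[:top_k]]
        # 3. Prepend them to the prompt so the LLM sees the extra facts.
        prompt = "Context:\n" + "\n".join(context) + "\n\nQuestion: " + question
        return complete(prompt)

In practice you would precompute and index the knowledge-base embeddings once, rather than embedding every entry on each query.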

I doubt Cyc would add much, though. As I recall it was filled with trivial world knowledge: that you typically shave in the morning, that people need to eat. Things ChatGPT has absolutely no problem with.


👤 ftxbro
Those aren't built with machine learning or statistics. They are built symbolically, perhaps more like Wolfram Alpha. Cyc could probably be cleverly integrated with LLMs in a similar way to how Wolfram Alpha has been integrated!