HACKER Q&A
📣 dr_dshiv

How to Define the Singularity?


I know there are differences of opinion on this, but I’d like to keep an open mind about dramatic, imminent changes to human society (because it is a risk worth considering).

My concern is that we move the goal posts so rapidly that “beating the Turing Test” is now meh.

How might we define the singularity in a meaningful manner so that we can have useful discussions about it? What metrics should we attend to?

I have the same overlapping questions about AGI. I’d give 50% odds of AGI within 5 years, but I’d need a clear-cut way to win or lose that bet.


  👤 nradov Accepted Answer ✓
I define the "singularity" as misinterpreting a logistic curve as an exponential curve. It is essentially a secular religion based on faith in accelerating rates of progress, but that's simply unscientific extrapolation.
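
To make that concrete, here is a small self-contained sketch (my own illustration, not from the comment; the growth rate and carrying capacity are arbitrary, chosen only to show the shape): while a logistic curve is still far below its ceiling, it is numerically almost indistinguishable from an exponential, and the divergence only shows up once saturation kicks in.

    import math

    def exponential(t, a=1.0, r=1.0):
        # Unbounded growth: a * e^(r*t)
        return a * math.exp(r * t)

    def logistic(t, k=1000.0, a=1.0, r=1.0):
        # Bounded growth toward carrying capacity k, starting at a when t = 0
        return k / (1 + ((k - a) / a) * math.exp(-r * t))

    for t in range(0, 13, 2):
        exp_v, log_v = exponential(t), logistic(t)
        print(f"t={t:2d}  exponential={exp_v:12.1f}  logistic={log_v:9.1f}  ratio={log_v/exp_v:.3f}")

With these made-up numbers the two curves agree to within a few percent up through t = 4, after which the logistic flattens toward its ceiling while the exponential keeps climbing; an observer sampling only the early points cannot tell which curve they are on.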

There is no scientific basis for assigning a percentage to the odds of developing a true human-level AGI within a particular period. About all we can confidently state is that it's probably not impossible and that we appear to be making some progress toward the goal. But no one can honestly claim to know how far away the goal is or quantify our rate of forward progress.

I think that Ray Kurzweil had a pretty good practical definition of intelligence: “Intelligence is the ability to use optimally limited resources – including time – to achieve goals.” So far we can't build an AGI that fully meets that definition even at the level of a rat, or perhaps even a cockroach.


👤 transfire
When you can ask an AI program to write a superior version of itself.