What is the Singularity, and how does it relate to Chalmers’s idea that humans must integrate with superintelligent AI if we are to survive and flourish in the future?
The term “singularity” was first introduced into the discussion about technology and the future of humanity in 1983 by the science fiction writer Vernor Vinge[1] and was subsequently popularized in 2005 by the futurist Ray Kurzweil in his book The Singularity is Near: When Humans Transcend Biology.[2]
The word “singularity” just means “a unique event with … singular implication.”[3] So, it is a one-time event, and it has a single dominant implication. The one-time event is the merging of biology with technology. The singular implication is a new level of intelligence, experience, and limitless existence made possible as human minds are eventually uploaded into avatars, virtual realities (that are indistinguishable from prime reality), or robots.
Why does Kurzweil think the Singularity will take place?
Kurzweil quotes the legendary information theorist John von Neumann, who said in the 1950s that “the ever-accelerating progress of technology … gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.”[4]
The key word in the von Neumann quote is “accelerating.” The key idea supporting Kurzweil’s argument is that the pace of technological change we are currently experiencing is not merely fast but exponential, and that advancing technologies will keep it growing at that exponential rate.
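To see why acceleration does so much work in the argument, it helps to compare exponential with linear growth. The short Python sketch below is purely illustrative and not drawn from Kurzweil’s text; the two-year doubling period is an assumed figure chosen to echo Moore’s-law-style trends, not a claim about any particular technology.

```python
# Illustrative sketch: exponential vs. linear growth in a capability.
# Assumption (hypothetical): the capability doubles every 2 years,
# compared with a capability that improves by a fixed amount each year.

def exponential_growth(start: float, doubling_period_years: float, years: float) -> float:
    """Capability after `years` if it doubles every `doubling_period_years`."""
    return start * 2 ** (years / doubling_period_years)

def linear_growth(start: float, increment_per_year: float, years: float) -> float:
    """Capability after `years` if it grows by a fixed increment each year."""
    return start + increment_per_year * years

if __name__ == "__main__":
    for years in (10, 20, 40):
        exp = exponential_growth(start=1.0, doubling_period_years=2.0, years=years)
        lin = linear_growth(start=1.0, increment_per_year=1.0, years=years)
        print(f"after {years:2d} years: exponential ≈ {exp:>12,.0f}x, linear = {lin:.0f}x")
    # Over 40 years, doubling every 2 years yields roughly a millionfold increase
    # (2**20 = 1,048,576), while the linear curve reaches only 41x.
```

The point of the sketch is simply that, under these assumed parameters, the exponential curve dwarfs the linear one within a few decades; this is the intuition behind Kurzweil’s claim that accelerating progress makes a singularity plausible in the near term.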
What will life be like, according to Kurzweil, after the Singularity takes place?
Assume the merger takes place. In the aftermath of the Singularity, according to Kurzweil, intelligence will begin to “saturate the matter and energy in its midst” and eventually “spread out from its origin on Earth.”[5]
Human life, too, will be transformed in at least three areas, according to Kurzweil.
First, consider our intelligence.
Second, consider our experiences in life.
Third, consider our existence.
How should we assess the Singularity movement? Is the Singularity near?
I’d say that there is a sense in which the Singularity is near, or nearer, and an important sense in which the Singularity is quite far. Let’s begin with the sense in which the Singularity is near.
When futurists first started talking about the technological Singularity, AI was theoretical but not actual. According to many metrics, AI currently outperforms humans in several areas of our cognitive lives, including handwriting recognition, speech recognition, image recognition, reading comprehension, and language understanding. Given certain assumptions, then, it is reasonable to think that this information and speed explosion will continue and that we will soon have superintelligent AI. So, in this sense, we are nearer.
But in a crucial sense, we are far from the Singularity, for two reasons.
First, the information explosion and the speed explosion cannot go on forever.
Second, even if we, or AI, were successful in creating a superintelligent artificial agent, it would not be possible for humans to merge or integrate with such a machine.
Finally, assume that the Singularity does happen and biology and technology seamlessly merge. We might ask: would human life, as conceived by Kurzweil, be good and valuable? Here too, there are reasons to be skeptical.
So, is the Singularity near? The short answer is that it is as near as it might ever be, but unattainably far away.
[1] Vernor Vinge, “First Word,” Omni (January 1983), 10; as discussed in Chalmers, “The Singularity,” 172.
[2] Ray Kurzweil, The Singularity is Near: When Humans Transcend Biology (Viking, 2005).
[3] Kurzweil, “Superintelligence and Singularity,” 160. All references to Kurzweil’s work are from a reproduction of chapter one of his book, The Singularity is Near, published in Science Fiction and Philosophy: From Time Travel to Superintelligence, 2nd ed., ed. Susan Schneider (Malden, MA: Wiley Blackwell, 2016).
[4] Kurzweil, “Superintelligence and Singularity,” 149.
[5] Kurzweil, “Superintelligence and Singularity,” 159. What does he mean by saturate? Kurzweil explains: “by saturating, I mean utilizing the matter and energy patterns for computation to an optimal degree, based on our understanding of the physics of computation.” Ibid., 166.