The announcement of Ray Kurzweil’s Singularity University project (and the inevitable backlash against it) has people talking about the S-word again… much to the chagrin of transhumanist thinkers like Michael Anissimov, who points out that there are three competing ‘schools’ of thinking about the Singularity, each hinging on a different interpretation of a word that has, as a result, lost any useful meaning.
The “Accelerating Change” school is probably the closest to Kurzweil’s own philosophy, but Kurzweil’s quasi-religious presentation style (not to mention his judicious hand-waving and fact-fudging) also makes it the easiest to attack.
Anissimov finds himself closer to the “Event Horizon” and “Intelligence Explosion” schools:
These other schools point to the unique transformative power of superintelligence as a discrete technological milestone. Is technology speeding up, slowing down, staying still, or moving sideways? Doesn’t matter — the creation of superintelligence would have a huge impact no matter what the rest of technology is doing. To me, the relevance of a given technology to humanity’s future is largely determined by whether it contributes to the creation of superintelligence or not, and if so, whether it contributes to the creation of friendly or unfriendly superintelligence. The rest is just decoration.
To many people, that may not actually sound any more reassuring than Kurzweil’s exponential curve of change – if anything, it may sound less so. And with good reason:
That’s the thing about superintelligence that so offends human sensibilities. Its creation would mean that we’re no longer the primary force of influence on our world or light cone. It’s funny how people then make the non sequitur that our lack of primacy would immediately mean our subjugation or general unhappiness. This comes from thousands of years of cultural experience of tribes constantly killing each other. Fortunately, superintelligence need not have the crude Darwinian psychology of every organism crafted by biological evolution, so such assumptions do not hold in all cases. Of course, superintelligence might be created with just that selfish psychology, in which case we would likely be destroyed before we even knew what happened. Prolonged wars between beings of qualitatively different processing speeds and intelligence levels are science fiction, not reality.
Superintelligence sounds like a bit of a gamble, then… which is exactly why its proponents suggest we need to study it more rigorously, so that – when the inevitable happens – we’re not annihilated by our own creations.
But what’s of relevance here are the sudden attempts by a number of transhumanist and Singularitarian thinkers to distance themselves from Kurzweil’s P. T. Barnum schtick in search of greater respectability for their less sensationalist ideas. Philosophical schisms have a historical tendency to become messy; while I don’t expect this one to result in bloodshed (although one can’t completely rule out some Strossian techno-jihad played out in near-Earth orbit a hundred years hence), I think we can expect some heated debate in the months to come.