Nothing divides opinion like the future – it’s human nature; we all love to take a stab at predicting what’s to come. But it’s also human nature to disagree over what cannot yet be proven (as a glance at the past confirms).
So, Vernor Vinge – the computer scientist and sf novelist who coined the term ‘Technological Singularity’ as used in this context during a presentation back in the eighties, and has talked about it ever since in his fiction and elsewhere – provides the capstone article to a special Singularity edition of IEEE Spectrum, defending the concept against the criticisms levelled at it by various scientists, economists and philosophers.
“The best answer to the question ‘Will computers ever be as smart as humans?’ is probably ‘Yes, but only briefly.’”
For some odd reason IEEE neglected to solicit Warren Ellis’s opinion, so he supplied it himself:
“When you read these essays and interviews, every time you see the word ‘Singularity,’ I want you to replace it in your head with the term ‘Flying Spaghetti Monster.’”
As always, if you want the apogee of cynicism, Ellis is your man; he’s the bucket of cold water thrown over the mating dogs of enthusiasm.
But other opinions are available, as the adverts say – George Dvorsky’s response to Ellis, for example:
“The day is coming, my friends, when Singularity denial will seem as outrageous and irresponsible as the denial of anthropogenic global warming. And I think the comparison is fair; environmentalists are often chastised for their “religious-like” convictions and concern. It’s easy to mock the Chicken Littles of the world.”
What do Futurismic’s readership think about the Singularity – awesome sf-nal literary metaphor, or looming technological likelihood? [image by binary koala]
11 thoughts on “Singularity season – nerd rapture or inconvenient truth?”
If the singularity is simply the moment when the intellectual capacity of artificial machines exceeds the intellectual capacity of all the human beings on the planet then we need to ask the question: what is intelligence?
The problem is we still haven’t decided what “intelligence” is, let alone “smartness.” So to say that the singularity is the moment a computer becomes as smart as a human isn’t very meaningful.
Kurzweil seems to talk more in terms of biological intelligence and machine intelligence combining, improving and augmenting each other.
My bet: around 2045 there will be a sudden surge of engineering innovation, creativity, and problem solving. This will be caused by the combination of human intelligence with non-human intelligence, all transparently linked together. Aging geeks will say: “Aha, that’s the singularity!”
Non-human intelligence includes things like evolutionary software, neural networks from animals, data-mining, software learning techniques, and pattern recognition using large quantities of data.
There won’t be one single godlike AI that can’t be unplugged, and human beings will probably still be walking around looking like human beings – but we will all benefit from the developments in medicine, science, mathematics, engineering, culture, and education that will occur because of, and then proceed to enhance, the singularity.
It isn’t really a matter of faith, and it won’t be the answer to all our problems, but something that looks like a singularity will probably happen during the next fifty years or so.
I believe there’s an argument to the effect that a Singularity is by definition unrecognisable until it has been passed; in that case, perhaps it follows that we’ll never be sure if it’s possible until it’s already happened.
‘I believe there’s an argument to the effect that a Singularity is by definition unrecognisable until it has been passed’
A statement whose sentiments themselves smack of fuzzy religious mysticism.
I’m with Mr Ellis on this one – the Singularity makes for entertaining fictions, but there are more pressing matters for humanity to attend to than imminent technological rapture. Fifty years of grindingly slow progress in AI research doesn’t lead me to believe we’ll be welcoming our new AI overlords any time soon.
He’s obviously hit a raw nerve among the Singularity boosterists – it’s not easy having your religious convictions questioned.
“For some odd reason IEEE neglected to solicit Warren Ellis’s opinion,”
The discovery of the New World in 1492 was a singularity of sorts. In fact, I’d go as far as to say that History (with a capital letter) is – in essence – a series of singularities.
@Warren Ellis – well, they should have thought of the traffic you’d bring, if nothing else. 🙂
It seems like an OK story premise, but in real life it sounds like a lot of other predictions that didn’t pan out, like bases on Mars by 1999, or small entrepreneurial groups wiping out the lumbering corporate dinosaurs, or maybe even the state withering away to be replaced by a dictatorship of the proletariat.
But just to be a complete weasel, never say never.
Yes, the Singularity predictions could very well be wrong. But that’s no reason to send the whole mess off to the religion compactor, where points are genealogized rather than argued, and claims are assumed to be too irrational to investigate.
The “nerd rapture” idea is bad because it’s used to avoid taking the ideas seriously. Dvorsky’s post makes this point very well (though not in the sensational portion quoted above).
Dvorsky’s “refutation” barely addresses Ellis other than to make the melodramatic “not dignifying you with a response” gambit. He spends fully three-quarters of the rest of his post preaching to his choir on how to improve the Singularity’s marketing image.
And with ideas such as “Let the critics have it and show them no quarter” he’s not doing his side any favors in making them appear as anything but pseudo-religious technological zealots.
Most of Dvorsky’s post is hype. However, I think the substance is roughly this part:
It’s mocking, but it makes a point: that Ellis, even faced with the considered views of experts, prefers to sidestep them; to attack without engaging their substance at all.
Who are the experts on the “singularity”? There are none, because there is no clear, agreed-upon definition of “the singularity.” Robin Hanson’s analogy to “historical singularities,” in the IEEE Spectrum issue, is a level-headed overview that offers some perspective. Vernor Vinge himself says the “singularity” may not happen, and gives several scenarios to illustrate why not.
When a field such as the “singularity”, climate studies, or any other nascent intellectual or scientific area lacks a clarity of focus, it is ripe for sceptics and infidels.