The presumably pseudonymous Extropia DaSilva neatly sums up my problems with Kurzweil-branch Singularitarianism (and one of the major reasons I repeatedly identify as a fellow traveller of the transhuman project rather than a card-carrier) over at H+ Magazine; in a nutshell, the Singularity is the ultimate authorial handwave:
The ancient Greeks also gave us the phrase “deus ex machina,” which the Oxford Dictionary defines as “an unexpected power or event saving a seemingly hopeless situation, especially as a contrived plot device in a play or novel.” A deus ex machina makes audiences and readers roll their eyes when they encounter it in a play or a story, and we should likewise roll our eyes when we encounter a deus ex machina being used to resolve all questions regarding the feasibility of achieving transhuman goals within our lifetime. “The Singularity will fix it” is a deus ex machina.
It also turns transhumanism into an infinitely variable explanation. Just like the myth of Demeter, you can continue to believe in the swift and inevitable success of transhuman dreams if you can invoke a godlike power that can fix anything. Hell, you can even posit a total rewrite of the laws of physics, thanks to the Singularity hacking the program that runs the universe. So even if some of our dreams turn out to violate physical laws, there is no reason to abandon faith.
To me, there is something deeply troubling about using the Singularity as a kind of protective barrier against all skepticism regarding the likelihood of achieving transhuman goals within a generation. It is difficult to reason with people who use the Singularity concept in this way, and even harder to have a logical debate with them. They have a deus ex machina to hand that can demolish any argument designed to show that transhuman dreams will not inevitably come true within our lifetime. This kind of reaction takes reasonable, scientific expectations of a brighter future and pushes them dangerously close to being an irrational pseudo-religion. And I find pseudo-religions boring.
I actually find pseudo-religions fascinating, but as subjects rather than objects; indeed, it’s the current schismatic/metastasising phase of transhumanism-as-pop-culture-meme that attracts my interest, far more than the promises of the technology on which it builds. The latter is pure speculation, which has its own intellectual rewards, but the former feels more like a chance for me to observe the way ideas spread and mutate in the antfarm of a networked global society.
(Now that I’ve typed that out, I realise it makes me sound like some sort of cultural peeping-tom. Ah, well. 🙂 )
However, lest we throw the baby out with the bathwater here, I think the Singularity has a certain value in its ability to attract attention. From a personal perspective, I would probably never have discovered transhumanism if it weren’t for the rash of science fictional Singularities I encountered over the last decade or so. Compressing the transformative power of technological change (and our convergence with such) into a momentary timeframe makes the underlying point – that we change as our technologies change, and that the relationship is a positive feedback loop – much clearer to the uninitiated.
However, that same temporal compression chimes with the transcendent Final Trumps of apocalyptic religions, and there’s a very frustrating human tendency to read metaphorical truths as literal ones (which I claim no immunity from myself, I might add); explicitly reframing the Singularity as a rhetorical narrative device might make it a slightly more useful thing, but I suspect the real root of the problem is that we all secretly long for something to swoop in and fix everything for us (hence the 90s popularity of the alien intervention narrative – surely an extant intelligence greater than [or perhaps merely different to] our own offers hope of us surviving our imminent civilisational bottlenecks?). The Singularity’s seductiveness lies in its tendency to brush aside unanswerable questions.
I imagine anyone who’s written fiction knows the temptation of the deus ex machina; the only alternative to its deployment is to think hard and rationally about ways to overcome an obstacle. Religions – and their rationality-tinged descendants, like Singularitarianism – are an inevitable by-product of human intellectual laziness.
Transhumanism is a philosophy, but Singularitarianism is a cult.
7 thoughts on “The Singularity is the deus ex machina of the transhumanist narrative”
What a great post. Don’t disagree with a thing.
…. And now you know why I’m not writing Singularity-fic any more.
(Well, aside from the collaboration with Cory Doctorow, but that’s a different beast — and anyway, it’s just finishing off a project we began about six or seven years ago.)
Great article! Singularity does evoke a sense of the mythical and also the romantic, particularly in Kurzweil’s interpretation because he so passionately wants this to recreate a life that was. Most rational people will not see this as possible, but this author’s point about humans evolving with their technology in a feedback loop seems extremely relevant to the future of humanity and it must lead somewhere…?
I’m currently reading H.G. Wells’ The Outline of History. Even though his dates need some revision, he correctly points out that humans have only occupied the earth for a very small fraction of the history of life. In turn, the historical period is a very small fraction of human existence. The Industrial Revolution is almost the last sentence in that narrative so far. Computers as we know them hadn’t even entered into the picture when Wells wrote. The Singularity isn’t singular, it’s ongoing — and even though ancient peoples would look upon us as gods, we have no more solved the critical problems of humanity than they had.
The Singularity, as I understand it, pretty much happens whether anyone believes in it or not. Deus ex machina? Only if you want it to be, and I can see how people might want and need to do that. But what does it change, other than their state of mind, if they do?
I agree about the Kurzweil-branch Singularitarians; they are, if anything, damaging to transhumanism by encouraging complacency rather than solving the obstacles that are in the way of transhumanist goals.
However, those who accept Yudkowsky’s analysis seem to be a bit more wary about the whole scenario. Notably, they seem to be more worried about the AI itself than an ever-accelerating parade of gadgets. I don’t think we can dismiss the prediction of AI causing volatile results that are likely to be hostile to human interests with ease.
> The Singularity’s seductiveness lies in its tendency to brush aside unanswerable questions.
Yes, that would certainly seem to be what appeals to some people. Also, you can imagine that weakly godlike intelligences might find it a cinch to solve problems that the human race has been unable to solve despite trying for millennia, such as ‘how do we avoid death, permanently?’ I don’t see anything wrong with entertaining the possibility of this happening, since we really don’t know enough about the motivations of these hypothetical minds to say for certain what they would or would not do. But when I see people predicting what ‘will’ happen post-singularity (not just ‘might’ or ‘could’) I see a significant misunderstanding of the whole ‘singularity’ concept.
If you read Stross’ ‘Accelerando’, you might think we glimpse posthumanity in the amazing technological capabilities like the exocortex Manfred’s glasses access, or the Matrioshka brain Amber visits. But none of these are posthuman technologies, because a human thought them up. Where we do glimpse the posthuman, I think, is when Aineko is asked a question that ‘cannot be encoded in any human grammar’, or when the Wunch make references to ‘untranslatable concepts’. What we glimpse here is something beyond the ability to ‘answer unanswerable questions’. It is the ability to ‘ask unaskable questions’. It’s like the difference between ‘I don’t understand quantum physics’ and ‘a dog doesn’t understand quantum physics’. The difference being, I KNOW I don’t understand it. A dog doesn’t know this. It’s completely oblivious to quantum physics and its mind is quite unable to perceive this ignorance. Possibly there are facets of the universe to which human minds are just as oblivious, but which posthumans might access. If posthuman minds can ask questions which ‘cannot be encoded in any human grammar’ and use that knowledge to shape their world, how profoundly strange would that reality seem to unaugmented human minds? How can you predict the motivations of beings capable of asking questions your mind is oblivious to? You can’t, and so the closer you are to predicting what will happen post-singularity, the further you are from understanding what singularity MEANS.