Via George Dvorsky, here’s the trailer for Transcendent Man, the forthcoming film about the life and work of Ray Kurzweil:
I’m pretty convinced that Kurzweil actually believes what he says, though only time will tell whether he’s right or not. However, this trailer doesn’t do much to disrupt Kurzweil’s image as a kind of pseudo-religious techno-prophet; disengaging from the subject matter and looking purely at the language and framing, it seems to set him up as a misunderstood Messiah, and that tends to fire up my instinctive BS detector much more than speculations on the developmental curve of technology.
What’s your take on Kurzweil – deluded crank or visionary genius? Or something in between?
7 thoughts on “Ray Kurzweil: the Movie”
Kurzweil, to his credit, has made specific, testable predictions about advances towards the singularity. His predictions for 2009 (made several years earlier) shot way beyond the mark of where things actually stand today. That, of course, calls into question his predictions about 2019 and beyond.
I’ve had a keen interest in AI ever since my undergrad CS days (1970s) and actually took two graduate AI classes, one as an undergrad and one as a grad student a few years later; had I finished my master’s degree, my area of focus would almost certainly have been AI, and I still have an entire shelf filled with AI texts (mostly dated at this point).
That said, AI is one of those rare areas in computer science/information technology where predictions are consistently too optimistic (instead of being too conservative). Virtually every projected AI timeline has turned out to be wrong; again, witness Kurzweil’s predictions for 2009 and beyond. We sometimes forget that Moore’s Law does not apply to software, just hardware. ..bruce..
He’ll be quoted a lot, in two ways. First, for the cases where he was right and called it first; that will be a long list, since many of his predictions are open-ended enough to fit many outcomes. But still, he will win a Nobel prize almost effortlessly, because he has called things people have been laughing at for decades (and will for at least a decade more).
He will also be recalled for being criminally optimistic. And I mean criminally, as in people cursing his name. I think the future will be pretty good and pretty awful at the same time, but more than anything very disorienting. Ray overglamorized things to sell his books, and with reason.
His construct, his books, his presentation annoy me for being so bloody *safe*. It’s like Little House on the Prairie safe. He doesn’t make any bold statements; unlike de Garis, who keeps saying the same bold alarmist thing over and over (“Cosmists!! Terrans!! Doom! Doom!”), he just smiles like a Buddha.
At this stage, I would start collecting all the wilder memes and making progressively bolder statements, with caveats.
He might even create a load of copyrightable ideas in this manner, and what has he got to lose saying something like “Any nation that doesn’t have gigabit connections in its biggest cities by 2010 will risk being equivalent to a third world nation by 2020”.
“By 2020 burglary by miniature robots will be a bigger criminal industry than illegal trade in cannabis”.
“Before 2030 an autonomous software virus will hack a human brain.”
My take on Kurzweil is that he’s trivially right when he identifies “the Singularity” as a period of accelerating technical change in the 21st century.
This kind of exponential growth of technological capability and knowledge (not to mention financial wealth) has been going on for > 200 years now.
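As a toy illustration of what compounding at a fixed doubling time implies (the doubling time here is an assumption for illustration, echoing the popular two-year statement of Moore’s Law, not a figure from the comment above):

```python
# Toy sketch of exponential growth under an assumed fixed doubling time.
# All numbers are illustrative, not measurements.

def growth_factor(years: float, doubling_time: float = 2.0) -> float:
    """Total multiplicative growth after `years` at the given doubling time."""
    return 2 ** (years / doubling_time)

# After 40 years at a 2-year doubling time, capability has grown 2**20-fold:
print(round(growth_factor(40)))  # 1048576
```

The point of the sketch is only that the growth factor compounds multiplicatively, which is why even modest doubling times produce the million-fold changes such comments allude to.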
As to his specific predictions: well, @bfwebster above mentioned that at least they’re testable and specific.
But I’ve always been struck by the spiritual and philosophical component of his writings (particularly in TSiN): where he talks of “physicalism” (as opposed to materialism, which has some negative connotations) and “patternism” (referring to the pattern of information that makes up a human mind).
Kurzweil is a technological spiritualist. There’s nothing wrong with that, and it’s interesting to watch a new religion being born.
Kurzweil? Brilliant engineer, admirable entrepreneur, excellent example of how smart people can fail to understand how exponential growth really works (or doesn’t).
“Ptolemaic Productions”? Indeed, epicycles on epicycles are going to be required to make Kurzweil’s predictions come true in the manner or the time frame he insists on. He spends so much of his time making detailed predictions about the creation of artificial consciousness and the “inevitable” technology of uploading, without once admitting that we don’t have a clue at this point what these things are. It’s not that we are missing engineering knowledge; we fundamentally don’t yet know what we’re talking about. Kind of like making detailed predictions based on knowledge of heredity before the discovery of DNA and the genetic code. Any assumptions you make would completely miss massive complications like epigenetics, variant gene expression and expression networks, the information content of DNA versus the information required to construct a human body, etc.
Like bfwebster, I’ve studied AI, as well as neural nets and their relationship to real neurons, theories of brain function, and the philosophy of consciousness. I simply don’t believe that his predictions take into account many factors that will affect how much and how quickly we can learn about how the human mind functions. I don’t think Kurzweil’s a charlatan, but I do think he’s fooling himself, in part because he wants to believe that these technologies will be available to him before he dies.
Bruce, that is a false argument you are using.
We currently make excellent use of gravity and of theories about gravity, without understanding it. The suggestion Ray puts forth – that intelligent tool-using civilization is entering a transitional period of objectively great importance, a change that hasn’t happened in probably a billion years in this region of the galaxy – is an observation of a trend with significant implications.
The problem is, you cannot anticipate it, any more than you can anticipate an earthquake. No response has any meaning. Plus it wouldn’t be a single singularity – it could be any of a thousand different singularity-type transitions, ranging from a Skynet extinction to an Accelerando-type funhouse ride.
Khannea: but we do have theories of gravity that give us many testable hypotheses, and even some idea of how we might use gravity in more sophisticated ways in engineering. As for consciousness, it’s not only that we don’t know how it works, it’s that we don’t even know for sure if it exists; there is certainly no generally agreed-on, coherent description of what it is or what it does. And if we can’t even agree on what it is, how can we make accurate predictions about when we will be able to store it, modify it, or play it back into other physical substrates? To get back to gravity, our current theories allow the creation of wormholes which can short-circuit intervals of time and space. Does this mean we can predict to within a decade when we will be able to build and use them? I don’t think so.
It seems reasonable to say that “intelligent tool using civilization is entering a transitional period of objectively great importance”, but that’s already happened at least three times in the last 10,000 or 12,000 years. I’d certainly argue that the invention of agriculture was a great transition; it made possible permanent groups of more than 30 or so humans, and eventually cities. Then the development of writing allowed knowledge to be transmitted across space and time with much higher accuracy and in much larger amounts than oral tradition could manage, which made culture more robust and gave it the same exponential growth curve as population and, later, science, engineering, and even semiconductor device density. And finally the scientific method, coupled with the development of mathematics as a model of physical and other systems, produced a growth of knowledge that could be applied to increasing life span, communication speed, and the power to modify the world, again making the human experience very different.
Singularities aren’t new to humans (and there have been a total of at least six in the evolution of life as well; I’ll give you a cite for that if you want), but we tend to think of the previous ones as blending into the overall trend of history that produced us. I think the difference between the lives of hunter-gatherers and those of the city dwellers of Assyria or Babylon was as great as the difference will be between us and whatever humans become in the next century or so. That doesn’t mean I have any way of predicting what that difference will be.