If I could choose one science fiction author in whose head-space I could spend a lengthy holiday (equipped with copious note-taking equipment and a barrel of synaptic cognition enhancers, naturally), Rudy Rucker would be my first choice by a country mile. Despite having a justified reputation as a quirky and colourful writer, he’s a ferociously smart guy.

“VR isn’t ever going to replace RR (real reality). We know that our present-day videogames and digital movies don’t fully match the richness of the real world. What’s not so well known is that computer science provides strong evidence that no feasible VR can ever match nature. This is because there are no shortcuts for nature’s computations. Due to a property of the natural world that I call the “principle of natural unpredictability,” fully simulating a bunch of particles for a certain period of time requires a system using about the same number of particles for about the same length of time. Naturally occurring systems don’t allow for drastic shortcuts.”
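Rucker’s “no shortcuts” claim is essentially Wolfram’s notion of computational irreducibility. Here is a minimal Python sketch using the standard Rule 30 cellular automaton (my choice of illustration, not Rucker’s): no closed-form shortcut for its centre column is known, so finding the value at step N appears to require actually running all N steps.

```python
# Rule 30: each new cell is left XOR (centre OR right). Despite this
# trivial update rule, no one knows a formula that jumps straight to
# step N of the centre column -- you have to simulate every step.

def rule30_step(cells):
    """Apply one Rule 30 update to a row of 0/1 cells (with wraparound)."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

def centre_column(width, steps):
    """Evolve a single '1' cell and record the centre cell at each step."""
    row = [0] * width
    row[width // 2] = 1
    column = []
    for _ in range(steps):
        column.append(row[width // 2])
        row = rule30_step(row)
    return column

print(centre_column(101, 20))
```

The centre column passes standard randomness tests; that apparent need to “just run it” is the discrete analogue of Rucker’s principle of natural unpredictability.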

I thought the Singularity was supposed to be when computers become intelligent. As far as simulations go, I remember an old adage: “No natural system can be accurately simulated by anything less complex than itself.”

This seems like a silly position to take. You don’t need to mimic reality with 100% accuracy in order to make it “good enough”. Would you personally be fooled if a billion particles took a few shortcuts and didn’t play out their interactions with the ugly complexity that a few times 10^23 atoms can? Of course not. You only personally care about macroscopic things. If a computer “fudges” the other 10^23 particles, you will never notice.

An ideal simulation of reality would show you exactly as much complexity as you need. Walking around? Eh, you probably don’t need to worry about anything smaller than a millimetre, and everything else can be fudged as long as it behaves “as it should”. Dropping something into a particle smasher? OK, now you might need to run a more intensive simulation that eats more computing resources, or else people will find holes.
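The level-of-detail idea above can be sketched in a few lines of Python. Everything here (the region dictionaries, the `detail_radius` cutoff) is invented for illustration: regions near the observer get an expensive per-particle update, everything else gets a cheap aggregate “fudge”.

```python
# Toy level-of-detail world update: fine-grained physics only where
# someone is looking, bulk statistics everywhere else.

def step_region(region, observer_pos, dt=1.0, detail_radius=5.0):
    """Advance one region of a 1-D world by one time step."""
    if abs(region["centre"] - observer_pos) < detail_radius:
        # Fine path: move every particle individually (expensive).
        region["particles"] = [x + v * dt for x, v in
                               zip(region["particles"], region["velocities"])]
        region["mode"] = "fine"
    else:
        # Coarse path: shift only the bulk average (cheap fudge).
        region["mean_x"] += region["mean_v"] * dt
        region["mode"] = "coarse"
    return region

# One region next to the observer, one far away.
near = {"centre": 0.0, "particles": [0.1, -0.2], "velocities": [1.0, 0.5],
        "mean_x": 0.0, "mean_v": 0.75}
far = {"centre": 100.0, "particles": [99.9, 100.1], "velocities": [1.0, 0.5],
       "mean_x": 100.0, "mean_v": 0.75}

step_region(near, observer_pos=0.0)
step_region(far, observer_pos=0.0)
print(near["mode"], far["mode"])  # fine coarse
```

Real game engines do the same thing with mesh and physics level-of-detail; the particle-smasher case is just a region whose `detail_radius` demands the fine path.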

In fact, if you suspect that you are in a simulation instead of the “real world”, one thing you can do is try to get it to show you inconsistencies by forcing it to make intensive calculations that take up more computing resources than it can handle.

I agree with Rindan.

To quote from Rucker’s post:

“This is because there are no shortcuts for nature’s computations. Due to a property of the natural world that I call the “principle of natural unpredictability,” fully simulating a bunch of particles for a certain period of time requires a system using about the same number of particles for about the same length of time. Naturally occurring systems don’t allow for drastic shortcuts.”

Rucker’s argument is fair enough as far as it goes, but the whole point of the statistical mechanics developed by Gibbs, Maxwell and Boltzmann is that once a system contains enough particles you can make accurate statistical statements about it as a whole.

So we have the gas laws, the laws of thermodynamics, and so on.
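A quick numerical illustration of the statistical-mechanics point: individual particle velocities are random, but bulk averages over many particles become sharply predictable (the law of large numbers). This is a hedged sketch, with velocities drawn from a standard normal distribution for simplicity rather than a physical Maxwell–Boltzmann distribution.

```python
# Average speed of n particles with standard-normal velocities.
# Individual speeds are random; the sample mean converges on the
# analytic expectation E[|N(0,1)|] = sqrt(2/pi) ~ 0.798 as n grows.
import math
import random

def mean_speed(n):
    """Sample mean of |v| over n random particle velocities."""
    return sum(abs(random.gauss(0, 1)) for _ in range(n)) / n

expected = math.sqrt(2 / math.pi)
for n in (10, 1000, 100000):
    print(n, round(mean_speed(n), 3), "expected", round(expected, 3))
```

With 10 particles the estimate wanders; with 100,000 it pins down the analytic value to a couple of decimal places. That convergence is exactly why gas laws work without tracking individual molecules.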

Another point worth making is that current developments in spintronics (computations using the “spin” of electrons) offer a layer of computation beneath that of atomic matter.

I concede that at some point “fudging” will have to take place, but as I pointed out before, statistical mechanics isn’t really fudging: diffusion can be accurately modelled without having to model every single damn particle.
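The diffusion claim is easy to check numerically. The sketch below (my own toy model, not anything from Rucker’s post) compares the mean squared displacement of an ensemble of 1-D random walkers against the analytic diffusion result: for unit steps of unit duration, ⟨x²⟩ = t, with no molecular dynamics in sight.

```python
# Mean squared displacement of unbiased +/-1 random walkers.
# Diffusion theory predicts <x^2> = 2*D*t; with step = dt = 1 we have
# D = 1/2, so <x^2> should come out close to t.
import random

def msd(n_walkers, t_steps):
    """Mean squared displacement of n independent 1-D random walkers."""
    total = 0.0
    for _ in range(n_walkers):
        x = 0
        for _ in range(t_steps):
            x += random.choice((-1, 1))
        total += x * x
    return total / n_walkers

t = 100
print("simulated:", msd(20000, t), "analytic:", t)
```

The ensemble average lands within about a percent of the analytic value, which is the statistical-mechanics “shortcut” in miniature: one diffusion coefficient stands in for 20,000 trajectories, and in the continuum limit for 10^23 of them.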

Anyway, my gut feeling is that if something like a singularity happens it will be much weirder than simply grinding the Earth up into nanomachines and then running a simulated Earth on them.

I mean, c’mon, if you’re a superhuman intelligence, what’s the first thing you’re going to do? Create the perfect lay? Work out the formula for the perfect cup of tea? (According to Douglas Adams, of course, that’s a far harder computational problem than almost anything else…)



Um, the entire quantum-probability sort of ickiness is a 100% pure shortcut of exactly the kind he claims nature doesn’t have.

Not that I totally disagree with everything else he says.

PS here is a little singularity comic 🙂 -> http://4lfa.com/page.php/20080125
