Better shelve that plan for a time-hopping cultural tour of Renaissance Venice; physicists from the University of Hong Kong are convinced they’ve shown time travel to be impossible.
The possibility of time travel was raised 10 years ago when scientists discovered superluminal – or faster-than-light – propagation of optical pulses in certain specific media, the team said. It was later found to be a visual effect, but researchers thought it might still be possible for a single photon to exceed light speed. Du, however, believed Einstein was right, and set out to end the debate by measuring the ultimate speed of a single photon, which had not been done before.
“The study, which showed that single photons also obey the speed limit c, confirms Einstein’s causality; that is, an effect cannot occur before its cause,” the university said.
I’ll give it a maximum of two years before someone has a counter-theory that makes it possible once again…
Just because I get a deep kick of joy from being able to type out a post title like that and have it actually mean something. From Wired UK, reassurance that we’re (probably) not living in a holographic universe:
Hogan’s interpretation of results from the GEO600 gravitational wave experiment had shown a quantum fuzziness — a sort of pixellation — at incredibly small scales, suggesting that what we perceive as the universe might be projected from a two-dimensional shell at its edge.
However, a European satellite that should be able to measure these small scales hasn’t found any quantum fuzziness at all, contradicting the interpretation of the GEO600 results and indicating that the pixellation of spacetime, if it exists, is considerably smaller than predicted.
Well, I’ll be sleeping more soundly tonight.
Via Next Big Future, Doctor Suzanne Gildert of the excellently-named Physics & Cake blog takes apart the [science fictional / Singularitarian] concept of computronium, and does a pretty good job of explaining why it probably isn’t possible:
… as we see, atoms are already very busy computing things. In fact, you can think of the Universe as computing itself. So in a way, matter is already computronium, because it is very efficiently computing itself. [This reminds me of Rudy Rucker’s theories about gnarl and universe-as-computing-substrate – PGR] But we don’t really want matter to do just that. We want it to compute the things that WE care about (like the probability of a particular stock going up in value in the next few days). But in order to make the Universe compute the things that we care about, we find that there is an overhead – a price to pay for making matter do something different from what it is meant to do. Or, to give a complementary way of thinking about this: The closer your computation is to what the original piece of matter would do naturally, the more efficient it will be.
So in conclusion, we find that we always have to cajole matter into computing the things we care about. We must invest extra resources and energy into the computation. We find that the best way of arranging computing elements depends upon what we want them to do. There is no magic ‘arrangement of matter’ which is all things to all people, no fabled ‘computronium’. We have to configure the matter in different ways to do different tasks, and the most efficient configuration varies vastly from task to task.
If it’s not too meta a get-out clause, perhaps we could develop some sort of nanotech system for reconfiguring computational substrate matter into the most appropriate arrangement for the task at hand? Talk about an abstraction layer… 🙂
Interesting news from that weird and wonderful intellectual space where physics and theology trade slow, dignified blows; new research into the effects of varying the cosmological constant swings out like a haymaker from the atheist corner and knocks at least one God-of-the-gaps out of the ring [via SlashDot].
… although positive, the cosmological constant is tiny, some 122 orders of magnitude smaller than Planck’s constant, which itself is a small number.
So Page and others have examined the effects of changing this constant. It’s straightforward to show that if the constant were any larger, matter would not form into galaxies and stars, meaning that life could not form, at least not in the form we know it.
So what value of the cosmological constant best encourages galaxy and star formation, and therefore the evolution of life? Page says that a slightly negative value of the constant would maximise this process. And since life is some small fraction of the amount of matter in galaxies, then this is the value that an omnipotent being would choose.
In fact, he says that any positive value of the constant would tend to decrease the fraction of matter that forms into galaxies, reducing the amount available for life.
Therefore the measured value of the cosmological constant, which is positive, is evidence against the idea that the constants have been fine-tuned for life.
I guess the obvious theist retort would be that God’s ineffable decision to use a sub-optimal value for the constant is a test of our faith… hi-ho, anthropic principle!
I’m a bit of a physics geek. Not that I can do the math. But I’ve always wanted to know how the world works, and physics is the very coolest science for that. The foundation. So I decided to find three bits of news in physics to put forward as a little gift for my fellow science geeks – a bit of how the world might work for the holiday season.