All posts by Tom James

Coding for cars

Apparently the media and navigation systems of a high-end Mercedes now require more lines of code than a 787 Dreamliner:

Boeing’s new 787 Dreamliner, scheduled to be delivered to customers in 2010, requires about 6.5 million lines of software code to operate its avionics and onboard support systems.

Alfred Katzenbach, the director of information technology management at Daimler, has reportedly said that the radio and navigation system in the current S-class Mercedes-Benz requires over 20 million lines of code alone and that the car contains nearly as many ECUs as the new Airbus A380 (excluding the plane’s in-flight entertainment system).

Bruce Sterling's Heavy Weather describes a considerably more awesome car than an S-class, but I can't find my copy to tell you how many lines of code that one needed (I remember it was specified somewhere).

[via Charles Stross][image by Aitor Escauriaza on flickr]

What is the Buxton index?

An interesting science-fictional concept concerning institutional longevity, via the late pioneering computer scientist Edsger W. Dijkstra, in this essay (EWD 1175):

The Buxton Index of an entity, i.e. person or organization, is defined as the length of the period, measured in years, over which the entity makes its plans.

For the little grocery shop around the corner it is about 1/2, for the true Christian it is infinity, and for most other entities it is in between: about 4 for the average politician who aims at his re-election, slightly more for most industries, but much less for the managers who have to write quarterly reports.

The Buxton Index is an important concept because close co-operation between entities with very different Buxton Indices invariably fails and leads to moral complaints about the partner.

It's a concept that helps explain a lot of attitudes and responses towards issues like climate change, environmental destruction, and DRM.

In each case the two parties are thinking in terms of completely different Buxton indices: short-term profit vs. long-term survival in the case of AGW, and short-term data security vs. long-term preservation of cultural artefacts in the case of DRM.
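Dijkstra's example values lend themselves to a quick sketch. This is purely illustrative: the indices are his, but the ratio-based "mismatch" measure is an invented stand-in for his qualitative claim that cooperation across very different horizons fails.

```python
# Dijkstra's example Buxton indices: planning horizon measured in years.
buxton = {
    "corner grocery shop": 0.5,
    "average politician": 4,
    "quarterly-report manager": 0.25,
    "true Christian": float("inf"),
}

def horizon_mismatch(a, b):
    """Crude proxy for Dijkstra's claim: the further apart two entities'
    planning horizons are, the worse their cooperation fares."""
    hi, lo = max(buxton[a], buxton[b]), min(buxton[a], buxton[b])
    return hi / lo

# A politician planning 4 years out vs. a manager planning one quarter:
mismatch = horizon_mismatch("average politician", "quarterly-report manager")
```

A 16x gap in horizons between a politician and a quarterly-report manager; between either of them and an entity planning for eternity, the mismatch is unbounded, which is rather the point.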

[via this comment in Charlie’s Diary][image from Parksy1964 on flickr]

US and Russian satellites collide

On Tuesday a satellite owned by the US company Iridium collided with an inoperative Russian satellite nearly 780 km above the Earth:

The risk to the International Space Station and a shuttle launch planned for later this month is said to be low.

The impact produced massive clouds of debris, and the magnitude of the crash is not expected to be clear for weeks.

There are thousands of man-made objects orbiting the earth, but this is thought to be the first time two intact spacecraft have hit each other, the BBC’s Andy Gallacher in Miami says.

Unfortunately, as Earth orbit becomes more and more crowded (the number of orbiting objects larger than 10 cm passed 10,000 in 2007 and is still increasing), the risk grows of a cascade effect, in which one collision produces a cloud of debris that goes on to cause further collisions, eventually leaving millions of tiny fragments that pose a major and ongoing hazard to space exploration.
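The feedback at the heart of that cascade (more objects, more possible collisions, more objects) can be sketched in a toy model. Every constant below is invented for illustration; this is in no way real orbital mechanics.

```python
# Toy debris-cascade model: expected collisions scale with the number of
# object pairs, and each collision adds a cloud of new fragments.
# collision_rate and fragments_per_collision are made-up illustrative values.

def simulate_cascade(objects, years, collision_rate=1e-9,
                     fragments_per_collision=1_000):
    history = [objects]
    for _ in range(years):
        pairs = objects * (objects - 1) / 2
        expected_collisions = collision_rate * pairs
        objects += expected_collisions * fragments_per_collision
        history.append(objects)
    return history

# Start from roughly the 2007 figure of 10,000 tracked objects.
history = simulate_cascade(objects=10_000, years=50)
```

Even in this crude sketch the yearly growth accelerates, because every fragment added increases the number of possible collision pairs for the next year.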

Given the risk of hindering future space exploration, is it worth pushing for an Earth orbit cleanup (and is such an idea even feasible)?

[from the BBC][image from Joe Hastings on flickr]

Reverse engineering the brain

Researchers describe how it might one day be possible to simulate large parts of the human cortex on a computer, and how this could lead to functional human-equivalent AI:

Software simulation of the human brain is just one half the solution. The other is to create a new chip design that will mimic the neuron and synaptic structure of the brain.

That’s where Kwabena Boahen, associate professor of bioengineering at Stanford University, hopes to help. Boahen, along with other Stanford professors, has been working on implementing neural architectures in silicon.

One of the main challenges to building this system in hardware, explains Boahen, is that each neuron connects to others through 8,000 synapses. It takes about 20 transistors to implement a synapse, so building the silicon equivalent of 220 trillion synapses is a tall order, indeed.

This is a different approach from the more traditional AI research that has been going on for decades: instead of trying to write artificially intelligent programs by hand using techniques like knowledge representation or commonsense reasoning, researchers are now concentrating on reverse-engineering the only extant example of general intelligence we have.
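The scale Boahen describes is easy to sanity-check with back-of-the-envelope arithmetic, using only the figures from the quote above:

```python
# Back-of-the-envelope check of the figures quoted above.
synapses_per_neuron = 8_000
transistors_per_synapse = 20
total_synapses = 220e12          # 220 trillion synapses

# Implied neuron count: 220 trillion / 8,000 = 27.5 billion,
# roughly in line with estimates for the human cortex.
neurons = total_synapses / synapses_per_neuron

# Transistor budget for a silicon equivalent: 4.4 quadrillion.
transistors = total_synapses * transistors_per_synapse
```

4.4 quadrillion transistors is several orders of magnitude beyond any chip yet built, which is why "a tall order" is putting it mildly.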

[at Wired][image from bschmove on flickr]

Genes, genomes, and skiffy

Ken MacLeod has a monograph up on genomics, sociology and science fiction at the Genomics Forum:

Social scientists are less likely than natural scientists to star as villains or heroes in SF. Their work, however, has deeply influenced the genre.

At first or second or third hand – directly, through popularizations, and as refracted through mass media – anthropology, economics, sociology, and political theory have all raised questions to which SF writers have imagined answers.

As well as highlighting the importance of sociology and economics to the development of science fiction, MacLeod suggests a reading list of novels relevant to his topic. He also compliments us literary SF fans:

Written SF (whose core readership and reviewers are more scientifically informed than the general public) usually has to hew to stricter standards of scientific plausibility…

Damn straight.

[via Ken MacLeod][image from Todd Huffman on flickr]