Tag Archives: computing

Brain electrodes: in and out

Following on nicely from Paul’s discussion of direct-to-brain broadband – and Robert Koslover’s comment – here we have news of the first read-write brain electrode from a company called IMEC:

Today’s deep-brain stimulation probes use millimeter-size electrodes. These stimulate, in a highly unfocused way, a large area of the brain and have significant unwanted side effects.

IMEC’s design and modeling strategy allows developing advanced brain implants consisting of multiple electrodes enabling simultaneous stimulation and recording. This strategy was used to create prototype probes with 10 micrometer-size electrodes and various electrode topologies.

These new design approaches open up possibilities for more effective stimulation with fewer side effects, reduced energy consumption due to focusing the stimulation current on the desired brain target, and closed-loop control adapting the stimulation based on the recorded effect.
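That closed-loop idea – stimulate, record the effect, adjust, repeat – is the really interesting bit, and it’s easier to grasp with a toy example. Here’s a minimal sketch in Python of what such a feedback loop might look like in principle; to be clear, nothing below reflects IMEC’s actual design, and the linear “brain response”, the gain and the current limits are all invented purely for illustration:

```python
# Hypothetical sketch of closed-loop stimulation control. Nothing here
# reflects IMEC's actual design: the linear "brain response", the gain
# and the current limits are invented for illustration.

def closed_loop_step(recorded_uV, target_uV, current_uA, gain=0.1):
    """One control iteration: nudge the stimulation current so the
    recorded neural response tracks a desired target level."""
    error = target_uV - recorded_uV
    # Proportional controller: adjust current in the direction that
    # shrinks the error, clamped to a (made-up) safe range.
    return max(0.0, min(current_uA + gain * error, 100.0))

# Toy run: a crude stand-in brain that responds linearly to stimulation.
current = 10.0
for _ in range(50):
    recorded = 0.8 * current  # pretend measured response, in microvolts
    current = closed_loop_step(recorded, target_uV=40.0, current_uA=current)

print(f"settled stimulation current: {current:.1f} uA")  # converges to ~50 uA
```

A real probe would face noisy signals, nonlinear tissue response and hard safety constraints, of course, but the control principle – adapt the stimulation based on the recorded effect – is the same.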

Presumably the avenue towards direct-to-brain broadband will run through ever more sophisticated devices of this kind, possibly travelling via wirehead-style ecstasy generators.

[from this press release from IMEC, via Technovelgy][image from IMEC press release]

Would you sign up for direct-to-brain broadband?

In a “twenty-questions” style interview with author Michael Grant over at The Guardian, I was struck by his answer to the final question:

What piece of technology would you most like to own?

I want a Google chip implanted in my brain. Wire up my cerebrum. I’m perfectly serious. I want all access, all the time.

Now, despite his protestations of seriousness, I rather suspect he’s exaggerating for effect. But even so, I found myself wondering whether I’d go for such a connection myself, if the opportunity arose. Let’s assume for a moment (and not too hypothetically) that such an always-on link could be achieved without surgical intervention – high-powered wearable computing, wireless broadband link, some sort of cyberpunk data-shades assemblage for interface, all that jazz. Is it still as transgressive and extreme an idea if you could just take it all off whenever you chose to? After all, I already spend upwards of ten hours a day connected to the internet*; the technological leap to being able to do so without having to be here at my desk seems like a small skip of convenience from where I’m sitting right now.

Now, imagine that Grant’s implants actually existed – how differently would a person with such capabilities interact with the world, and with other people? Would they have something of the autist or savant about them, or would instantaneous access to the knowledge and conversation of the web enhance their abilities to socialise? What work would they do (or want to do), and what jobs would they be denied?

Sure, these are all established questions that arise from reading cyberpunk literature – but to be kicked into that mode of thinking by a throwaway line in an interview with a YA author? It’s a weird wired world, and no mistake.

[ * – Yes, I know it shows. Be nice. ]

Mobile Massively Multiplayer – Warcraft on the iPhone

Here’s some big news for the gamers among you (provided it’s not an elaborate and well-produced hoax) – a World of Warcraft client that runs on the iPhone.

Found via The Guardian, where Greg Howson asks whether the cramped screen real-estate and network lag make it worth bothering with. I figure that’s an academic question, really; if I (a) played WoW and (b) had an iPhone, I’d be mad keen for a mobile version – and if you’re both an iPhone owner and an MMO geek, getting the best of both at once has to be an appealing prospect…

But more to the point (and the main reason I called it out), it’s another SF Prophecy Point on the leaderboard for Charlie Stross, who included mobile MMO gaming as a core trope in his 2007 novel Halting State. Two years from science fiction to reality – things move fast, don’t they?

“Good enough” computing – will the recession kill off Microsoft?

This speculative futurism thing is starting to spread! Keir Thomas, Linux columnist for PC World, has posted a future retrospective piece that looks back from 2025 to the present day as the dawn of “good enough” computing… and the beginning of the end for Microsoft.

The lack of desire to relinquish XP by users was part of what became known as the “Good Enough” revolution in both software and hardware. At the beginning of the 21st century, computing hardware had evolved sufficiently to reach a level of performance that allowed for speedy execution of virtually all common computing tasks. Prior to this, the only way to guarantee good performance was to buy expensive cutting-edge hardware. But now chips costing just a few dollars offered more performance than most people would ever need.

Upgrading became less a matter of getting a better PC than of simply replacing old and broken computers with newer models. Ever resourceful during the Great Recession that struck in the early 21st century, PC manufacturers responded with ultra-cheap but “good enough” computers (both laptops and desktops) that were designed to be simple slot-in replacements for existing computers. PC manufacturers had already carved this route with netbook computers, where the goal was to be cheap and usable, with few if any frills.

Obviously there’s an element of fun-poking to Thomas’s piece (alongside the enduring positivity of the committed Linux evangelist), but as a piece of speculative futurism it’s a solid and plausible job. The details may well work out differently – and I’d be surprised to see even the recently-beleaguered Microsoft drop out of the game quite that easily – but the idea of computing as a commodity was raised by Charlie Stross a year and a half ago, and by many others since. As the line between mobile devices and ‘proper’ computers continues to blur (and convergence with phone handsets accelerates), Thomas’s future doesn’t look too fictional at all. [via the spiritual home of the Linux-takes-all story, SlashDot; image by Matthew Verso]

The silicon brain

Most attempts to simulate the function of organic brains using computers have been software simulations – models built with code, if you like. An international team of computer scientists have been trying the other approach, however: building computer hardware that mimics the dense interconnection of brain cells.
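To make “models built with code” concrete: the workhorse of many software simulations is something like the leaky integrate-and-fire neuron sketched below, stepped forward in time and replicated by the millions. This is a textbook toy with made-up parameters, not anything lifted from the projects mentioned in the article:

```python
# A toy leaky integrate-and-fire neuron: the simplest kind of
# brain-cell model a software simulation is built from. All
# parameters here are illustrative, not from any real project.

def simulate_lif(input_current, dt=0.1, tau=10.0,
                 v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0):
    """Integrate membrane voltage over time; return spike times (ms)."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        # Voltage leaks back toward rest, driven upward by input current.
        v += dt * ((v_rest - v) / tau + i_in)
        if v >= v_thresh:              # threshold crossed: fire and reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

# Feed a constant input for 100 ms of simulated time and watch it fire.
print(simulate_lif([2.0] * 1000))      # a handful of regularly spaced spikes
```

The catch is that stepping millions of these equations forward in software eats serious computing power – which is exactly the bottleneck the hardware approach is meant to sidestep.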

The hope is that recreating the structure of the brain in computer form may help to further our understanding of how to develop massively parallel, powerful new computers, says Meier.

This is not the first time someone has tried to recreate the workings of the brain. One effort called the Blue Brain project, run by Henry Markram at the Ecole Polytechnique Fédérale de Lausanne, in Switzerland, has been using vast databases of biological data recorded by neurologists to create a hugely complex and realistic simulation of the brain on an IBM supercomputer.

[snip]

The advantage of this hardwired approach, as opposed to a simulation, Meier continues, is that it allows researchers to recreate the brain-like structure in a way that is truly parallel. Getting simulations to run in real time requires huge amounts of computing power. Plus, physical models are able to run much faster and are more scalable. In fact, the current prototype can operate about 100,000 times faster than a real human brain. “We can simulate a day in a second,” says Meier.

A day in a second, huh? That’s straight out of your favourite Singularity sf story, right there. [image by neurollero]
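It’s a quote that survives a quick back-of-the-envelope check, too, if you take the claimed speedup at face value:

```python
# Sanity check on "we can simulate a day in a second", taking the
# claimed 100,000x speedup over biological real time at face value.
SECONDS_PER_DAY = 24 * 60 * 60     # 86,400 seconds of brain time
SPEEDUP = 100_000                  # claimed hardware speedup factor

print(SECONDS_PER_DAY / SPEEDUP)   # 0.864 -> just under a second per day
```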

Transhumanists talk a great deal about the inevitability of human-equivalent artificial intelligence in the very near future, and it’s easy to dismiss them as dreamers until you read an article like this. I’m not saying that silicon brainware means the Singularity is inevitable, or even likely… but I think I’ll start learning to speak in machine code. Y’know, just in case.