The implanted chip, according to the MIT team behind it, features a “microfabricated polyimide stimulating electrode array with sputtered iridium oxide electrodes” which is placed on the user’s retina via a specially developed surgical technique. There are also “secondary power and data receiving coils”.
Once the implant is in place, wireless transmissions are made from outside the head. These induce currents in the receiving coils of the nerve chip, meaning that it needs no battery or other power supply. The electrode array stimulates the nerves feeding the optic nerve, so generating an image in the brain.
The wireless signals, for use in humans, would be generated by a glasses-style headset equipped with cameras or other suitable sensors and transmitters tuned to the coils implanted in the head.
For now, however, the system has only been tried out in Yucatan minipigs. Three of the diminutive Mexican porkers have had the Star Trek/Gibsonesque implants for seven months, but as yet it’s difficult to tell just how well they work – as the pigs aren’t talking. The MIT boffins have fitted them with instrumented-up contact lenses to try to get an idea of what effects the implants have.
If you really need me to prompt you towards imagining the awesome and/or weird stuff that might happen as a result of this technology becoming readily available, I suspect you’re reading the wrong website. 🙂 [image by striatic]
The folks at Technology Review have run up a top ten of futurismic display/interface combos, all on display at SIGGRAPH 2009. I particularly like the haptic holography from researchers at the University of Tokyo:
The virtual objects appear in mid-air thanks to an LCD and a concave mirror. The sensation of touching the objects is created using an ultrasound device positioned below the LCD and mirror.
It’ll be interesting to see whether people end up using more traditional haptic setups like glove-and-goggles combinations, or choose something based on holography and sound waves.
Also note that Wii remotes are used as off-the-shelf sensors – the street, or academia, finds its own uses for things.
…macaque monkeys using brain signals learned how to move a computer cursor to various targets. What the researchers learned was that the brain could develop a mental map of a solution to achieve the task with high proficiency, and that it adhered to that neural pattern without deviation, much like a driver sticks to a given route commuting to work.
“The profound part of our study is that this is all happening with something that is not part of one’s own body. We have demonstrated that the brain is able to form a motor memory to control a disembodied device in a way that mirrors how it controls its own body. That has never been shown before.”
This is an exciting development. Developing the means to control prosthetics as if they were part of your own body would improve the lives of paraplegics, and even offer the possibility of extending baseline human abilities.
This one’s doing the rounds everywhere, and with some justification. I try to steer away from pure OMG TECH! posts here at Futurismic, but if this doesn’t kick you right in the cyberpunk-sensawunda gland with a big pair of hob-nailed boots… well, you’re obviously not as massive an unreconstructed nerd as I am, basically.
See what I mean? As I remarked to a friend on Twitter last night, I’ll cheerfully trade my mortal soul to the first cellphone provider that offers me something that can do all that. Awesome. [via Hack-a-Day and many others]
Because sketching a new product shape is more expressive and intuitive for engineers than traditional mouse-and-menu design interfaces, the new system gives users more creative freedom and a shorter learning curve.
By providing greater freedom in conceptual design phases and alleviating costly redesign issues, the new technology will have an immediate impact on a multitude of industries, Carnegie Mellon researchers said.