Tag Archives: gadgets

Neural interfaces: the state of the market

Back in May we dipped into a heavy H+ Magazine article to find out about the cutting edge of neural interface research, the theoretical boundary-pushing stuff. While it’s fun to know where things are (or might be) going, like all good cyberpunks we’re much more interested in what we can realistically get our hands on right now; the things the street could be busily finding its own uses for. So head on over to this short piece at ReadWriteWeb, which is a neat list of six real products with basic neurointerface abilities, just waiting to be hacked or repurposed for something awesome [via TechnOccult].

Actually, the latter two are research devices rather than commercially available gizmos, but even so, those proofs-of-concept will need to be monetized at some point, AMIRITE? And of the real products on offer, I think this is my favourite:

[T]he Emotiv EPOC neuroheadset […] features 14 saline-based sensors and a gyroscope. Primarily marketed to gamers, the device also helps people with disabilities regain control of their lives. Included with the device is the EmoKey, which is a lightweight application running in your computer’s background. It allows you to map out thought-controlled keystrokes. This headset is the preferred device of the Dartmouth Mobile Sensing Group, which created a brain-to-mobile interface that allows you to call your friends by thinking about them.

If any smart hacker types in the audience would like to kludge one of these things up so I can do all my blogging and editorial work without having to move my arms, drop me a line so we can discuss funding, OK?
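In the meantime, for the curious, here’s a rough sketch in Python of what that EmoKey-style “map thoughts to keystrokes” business boils down to. The event stream and the command names below are invented purely for illustration (the real Emotiv SDK has its own API), so treat this as working pseudocode rather than something you could plug a headset into:

```python
import random
import time

# Hypothetical mapping from detected "mental commands" to keystrokes.
# These command names are made up for illustration; a real EmoKey profile
# maps whatever detections the headset's SDK actually exposes.
KEY_MAP = {
    "push": "PAGE_DOWN",    # think "push" to scroll down
    "pull": "PAGE_UP",      # think "pull" to scroll up
    "blink_hard": "ENTER",  # a deliberate hard blink to confirm
}

CONFIDENCE_THRESHOLD = 0.7  # ignore weak, probably-spurious detections


def fake_headset_events():
    """Stand-in for a real EEG event stream: yields (command, confidence) pairs."""
    commands = list(KEY_MAP) + ["neutral"]
    while True:
        yield random.choice(commands), random.random()
        time.sleep(0.25)


def send_keystroke(key):
    """Placeholder: a real version would inject the keystroke into the OS."""
    print(f"[keystroke] {key}")


def run():
    for command, confidence in fake_headset_events():
        key = KEY_MAP.get(command)
        if key and confidence >= CONFIDENCE_THRESHOLD:
            send_keystroke(key)


if __name__ == "__main__":
    run()
```

The hard part, obviously, is the signal processing that produces those detections in the first place; the dictionary lookup is the easy bit.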

Sing the body electric: be your own batteries

Back in the early eighties, my father had a joke he loved to tell non-engineers about the then-nascent technology of mobile phones; the punchline sees the customer, heretofore staggered by the miniaturisation of the handset he’s just bought, daunted by the ludicrous size of the power supply.

While those days are long behind us (and my father should be posthumously forgiven, as he started working with computers when they still filled entire floors), the problem remains: the more electronic hardware we want to carry around with us, the more we need a reliable and equally portable source of juice to keep it all running. And given that our near future is posited to be crammed with everyware, ubicomp, body area networks and cyberpunkish augmented reality contact lenses, there’s money in being the first to come up with the solution.

Money or military advantage, perhaps… indeed, good ol’ DARPA is one of the big players in this field, because the amount of hi-tech kit the average soldier has to cart about is becoming a serious issue (not least for the soldiers themselves). Their proposed solution? Scavenge the wasted energy from the human body carrying the kit [via BoingBoing]:

Obviously, our bodies generate heat—thermal energy. They also produce vibrations when we move—kinetic energy. Both forms of energy can be converted into electricity. Anantha Chandrakasan, an MIT electrical engineering professor, who is working on the problem with a former student named Yogesh Ramadass, says the challenge is to harvest adequate amounts of power from the body and then efficiently direct it to the device that needs it.

In the case of harnessing vibrations, Chandrakasan and his colleagues use piezoelectric materials, which produce an electric current when subjected to mechanical pressure. For energy scavenging, ordinary vibrations caused by walking or even just nodding your head might stimulate a piezo material to generate electricity, which is then converted into the direct current (DC) used by electronics, stored in solid-state capacitors and discharged when needed. This entire apparatus fits on a chip no larger than a few square millimeters. Small embedded devices could be directly built onto the chip, or the chip could transmit energy wirelessly to nearby devices. The chip could also use thermoelectric materials, which produce an electric current when exposed to two different temperatures—such as body heat and the (usually) cooler air around us.

It’s a good idea (though it remains to be seen how useful it’ll be; I suspect the efficiency of gadgets will need to increase in order to meet the available energy harvest halfway), but it raises the question: how much wasted energy could we harvest if we were sufficiently motivated to do so? Think of it as a kind of energy freeganism – dumpster-diving for watt-minutes. Wind, solar and tidal power are three taps on the natural world, but what about the environment we’ve made in our own image?

People have thought about harvesting the energy of footsteps to power subway stations; why not do the same with shopping malls (hence ensuring that the energy used is directly proportional to the actual throughput of shoppers)? Might there be some way of harnessing the gravitational flexing of very tall buildings, in addition to covering them with solar cells and heat exchangers and hell knows what else? I reckon we’d be able to think of lots of sources of energy we currently overlook as too trivial, if only we really needed to… necessity is the mother of frugality, after all.
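To put some very rough numbers on that footstep idea, here’s a back-of-envelope sketch in Python. Every figure in it is an assumption of mine (ballpark guesses, not measurements), so read the output as an order-of-magnitude estimate at best:

```python
# Back-of-envelope: how much could a shopping mall scavenge from footsteps?
# All of these numbers are assumptions for illustration only.

JOULES_PER_STEP = 1.0      # optimistic guess at harvestable energy per footstep
VISITORS_PER_DAY = 30_000  # a busy mall, plucked out of the air
STEPS_PER_VISITOR = 1_000  # a decent wander around the shops

joules_per_day = JOULES_PER_STEP * VISITORS_PER_DAY * STEPS_PER_VISITOR
kwh_per_day = joules_per_day / 3.6e6  # 1 kWh is 3.6 million joules

print(f"Harvested energy: {joules_per_day:,.0f} J/day, about {kwh_per_day:.1f} kWh/day")

# For scale: one old-fashioned 100 W bulb burning around the clock uses 2.4 kWh/day.
bulb_equivalents = kwh_per_day / 2.4
print(f"Roughly {bulb_equivalents:.1f} hundred-watt bulbs running all day, every day.")
```

Which rather reinforces the efficiency point above: plenty there to trickle-charge low-power kit, but nobody is going to light a shopping mall with its own footsteps.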

Gizmo landscapes, gonzo worldbuilding

Here’s an interesting thought experiment which feels science fictional to me – not science fictional in the “making things up about the future” sense, but in the “teasing the bigger picture out of smaller things” sense. Rob Holmes of mammoth invites us to think about the infrastructural landscapes that support a single use of a consumer-level technological artefact; his Zeitgeist-friendly example is, naturally, some web browsing done on an iPhone somewhere in Brooklyn [via MetaFilter].

The iPhone, however, is not only dependent upon highly developed systems in its production, as Banham acknowledges all such objects have always been, but is also now equally dependent in its operation upon a vast array of infrastructures, data ecologies, and device networks.  Even acknowledging this, though, and realizing that its operative value comes from its ability to tap those data ecologies and attendant socially-constituted bodies of knowledge, it is still possible to miss the landscapes that it produces. Until we see that the iPhone is as thoroughly entangled into a network of landscapes as any more obviously geological infrastructure (the highway, both imposing carefully limited slopes across every topography it encounters and grinding/crushing/re-laying igneous material onto those slopes) or industrial product (the car, fueled by condensed and liquefied geology), we will consistently misunderstand it.

His preliminary examples include the mines that supply the rare semiconductor elements used in chips and touchscreens, the factory megacomplexes where they’re designed and built, the server farms that prop up the internet that the phone connects to, and the transceiver arrays that provide the last wireless step in that connection. The point is plain: there’s a whole lot of stuff behind the gadgets in our pockets and satchels that we don’t really think about when we use them.

This is very much like “systems thinking”, which I imagine most of Futurismic‘s readers are already familiar with (because you all seem pretty clued up on the science and tech side of things), but which is, as far as I can tell, a fairly uncommon mindset in the population at large (a fact exploited to the fullest by politicians, among others).

But it struck me that it’s also rather like the worldbuilding that informs science fiction: the iPhone is the story, and the infrastructure is the imagined world in which it’s set. In both cases, the end user doesn’t need to know anything about the infrastructure, and will probably actively resist being told about it (infodump!). In both cases, that will to ignorance allows the writer/manufacturer a lot of leeway with the infrastructure: so long as the thing works, who cares how it works?

I’ll be honest – I’m not sure where I’m going with this, or even if it’s going anywhere at all*. But it gave me a brain-chime, so I’m throwing it out here by way of recording the thought, and to see if any of you can pick up the ball and run with it. Any ideas?

[ * I started writing this post immediately after a post-lunch triple espresso; make of that what you will. ]

Life-size telepresence robots make their appearance

A few years ago I seem to recall a spate of SF stories in Asimov’s and elsewhere that dealt with the concept of telepresence: humans controlling robots at a distance, immersed in a virtual-reality world that made what happened to the robot feel much more real than merely sitting at a control panel manipulating a joystick.

Well, human-sized telepresence robots are beginning to make their appearance. California company Anybots debuted its Anybots QA telepresence robot at the Consumer Electronics Show in January (Via Gizmag):

The robot’s 802.11g wireless connectivity allows 20 FPS video at 640×480 resolution captured by the QA’s two 5MP color cameras and full duplex, high fidelity sound to be sent back to the user’s Mac or PC running the client software. A 7-inch color LCD screen in the QA’s chest can display the remote user to give long distance interactions that human touch while navigation comes courtesy of QA’s onboard 5.5 yard range LIDAR (Light Detection and Ranging), which functions like RADAR but uses light instead of radio waves.

Standing at 5 foot tall, the QA can also bend to 2 foot high to interact more easily with people sitting. The robot’s rechargeable Li-ion battery gives 4-6 hours of operation and allows QA to reach speeds of up to 6 MPH on his two 12-inch diameter wheels.

The company is developing other robots that walk, jump and run on two legs. One of them reportedly has a fully articulated hand that will permit the operator to perform a wide range of tasks. The QA is expected to be available for purchase later this year at around $30,000 U.S.
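As an aside, those video specs only add up because of compression; here’s a quick sanity check in Python, using raw frame sizes and the nominal 802.11g rate (my arithmetic, not anything from Anybots):

```python
# Quick sanity check: raw (uncompressed) 640x480 video at 20 frames per second.
width, height, fps = 640, 480, 20
bytes_per_pixel = 3  # 24-bit colour

raw_bits_per_second = width * height * bytes_per_pixel * 8 * fps
raw_mbps = raw_bits_per_second / 1e6
print(f"Raw video stream: {raw_mbps:.0f} Mbit/s")  # roughly 147 Mbit/s

# 802.11g tops out at a nominal 54 Mbit/s, and real-world throughput is far lower,
# so the stream clearly has to be compressed before it ever hits the air.
print(f"Compression needed versus nominal 802.11g: at least {raw_mbps / 54:.1f}x")
```

And that’s before you account for real-world wireless throughput, which is a good deal less than the nominal figure.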

Maybe someday we’ll all sit at home at a computer and send out robots to do our jobs and errands, and never leave the house.

Someone should make a movie about it…

Order your surrogate robot today!

(Image: Anybots.)


MIT researchers create cheap "sixth-sense" ubiquitous computing device

The era of ubiquitous computing progresses apace (Via PhysOrg):

US university researchers have created a portable “sixth sense” device powered by commercial products that can seamlessly channel Internet information into daily routines.

The device created by Massachusetts Institute of Technology (MIT) scientists can turn any surface into a touch-screen for computing, controlled by simple hand gestures.

The gadget can even take photographs if a user frames a scene with his or her hands, or project a watch face with the proper time on a wrist if the user makes a circle there with a finger.

The MIT wizards cobbled a Web camera, a battery-powered projector and a mobile telephone into a gizmo that can be worn like jewelry. Signals from the camera and projector are relayed to smart phones with Internet connections.

According to the researchers, the gadget (unveiled by MIT researcher Pattie Maes at the Technology, Entertainment, Design [TED] conference currently underway in Long Beach, California) uses about $300 U.S. worth of store-bought components. It can recognize items on store shelves and retrieve and project information about those products; look at an airplane ticket and let the user know whether the flight is on time; recognize books in a book store and project reviews or author information from the Internet onto blank pages; and recognize articles in newspapers and retrieve the latest related stories or video from the Internet. You can interact with the data using any surface, even your hand if nothing else is available. “Maybe in ten years we will be here with the ultimate sixth-sense brain implant,” Maes said.
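The gesture recognition is the bit that intrigues me most. As I understand it, the prototype tracks coloured marker caps on the user’s fingertips through the webcam; here’s a toy sketch in Python with OpenCV of that sort of colour-blob tracking. It’s entirely my own illustration (the colour range and thresholds are guesses), not the MIT code:

```python
import cv2
import numpy as np

# Toy fingertip tracking: look for a brightly coloured marker cap in the webcam
# feed and report its position. The HSV range below is a rough guess at "red-ish";
# a real setup would calibrate one range per fingertip colour.
LOWER_HSV = np.array([0, 120, 120])
UPPER_HSV = np.array([10, 255, 255])

capture = cv2.VideoCapture(0)  # default webcam

while True:
    ok, frame = capture.read()
    if not ok:
        break

    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    if contours:
        biggest = max(contours, key=cv2.contourArea)
        if cv2.contourArea(biggest) > 200:  # ignore specks of noise
            x, y, w, h = cv2.boundingRect(biggest)
            centre = (x + w // 2, y + h // 2)
            cv2.circle(frame, centre, 8, (0, 255, 0), 2)
            print("fingertip at", centre)  # a real system would build gestures from this

    cv2.imshow("marker tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```

Framing a photograph with your hands, in a scheme like this, is just four of those blobs arranging themselves into a rectangle.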

Forget about trekking to the Wizard. Dorothy should have got the Scarecrow one of these.

(Image: Leonard Low, Concept for augmented reality mobile phone, via Wikimedia Commons.)
