Dark Silicon: an end to Moore’s Law?

Paul Raven @ 02-08-2011

From the New York Times:

A paper presented in June at the International Symposium on Computer Architecture summed up the problem: even today, the most advanced microprocessor chips have so many transistors that it is impractical to supply power to all of them at the same time. So some of the transistors are left unpowered — or dark, in industry parlance — while the others are working. The phenomenon is known as dark silicon.

As early as next year, these advanced chips will need 21 percent of their transistors to go dark at any one time, according to the researchers who wrote the paper. And in just three more chip generations — a little more than a half-decade — the constraints will become even more severe. While there will be vastly more transistors on each chip, as many as half of them will have to be turned off to avoid overheating.
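The trend the researchers describe can be sketched with some back-of-envelope arithmetic. This is a toy model, not the paper's: the scaling factors below (transistor count doubling each generation while per-transistor switching power improves by only 1.6×, against a fixed chip power budget) are illustrative assumptions, tuned so the curve roughly echoes the quoted figures.

```python
# Toy model of dark silicon growth once Dennard scaling fails.
# Assumptions (illustrative, not from the ISCA paper): transistor
# count doubles each generation, per-transistor switching power
# falls by only 1.6x, and the chip's total power budget is fixed.

def dark_fraction(generations, count_scale=2.0, power_scale=1.6):
    count, per_transistor_power, budget = 1.0, 1.0, 1.0
    fractions = []
    for _ in range(generations):
        count *= count_scale
        per_transistor_power /= power_scale
        # Fraction of transistors the power budget can actually feed:
        powered = min(1.0, budget / (count * per_transistor_power))
        fractions.append(1.0 - powered)
    return fractions

for gen, dark in enumerate(dark_fraction(4), start=1):
    print(f"generation {gen}: {dark:.0%} of transistors dark")
```

With these numbers the dark fraction climbs from roughly a fifth of the chip to around half within three generations, which is the shape of the problem the quote describes: the transistors keep arriving faster than the power budget can feed them.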

Personally, I’m not going to sing a requiem for Moore’s Law just yet; plenty of brick walls have been predicted for it before, and engineers have always found a way around them eventually. That said, there are limits to almost everything, and perhaps silicon architecture will finally hit its ceiling. I think the real question to ask here is “would that be a bad thing?” An upper limit on computing power might just lead to software that uses what’s available more efficiently…

(Top marks for the suitably doomy and mysterious moniker “dark silicon”, though; that’s a post-cyberpunk novel title just waiting to be used.)


Throw another process log in the data furnace, darling

Paul Raven @ 27-07-2011

Via Slashdot, an intriguing idea comes a-squirming out of Microsoft Research: the data furnace. You know how your computer hardware chucks out a whole lot of heat as a waste product? Well, imagine how much a datacentre has to cope with. So why not put that waste heat to good use, and use it to heat people’s homes?

The genius of this idea is that Data Furnaces would be provided by companies that already maintain big cloud presences. In exchange for providing power to the rack, home and office owners will get free heat and hot water — and as an added bonus, these cloud service providers would get a fleet of mini urban data centers that can provide ultra-low-latency services to nearby web surfers. Of course the electricity cost would be substantial — especially in residential areas — but even so, the research paper estimates that, all things considered, between $280 and $324 can be saved per year, out of the $400 it costs to keep a server powered and connected in a standard data center. From the inverse point of view, heating accounts for 6% of the total US energy consumption — and by piggybacking on just half of that energy, the IT industry could double in size without increasing its power footprint.
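The quoted savings range comes from a cost comparison that can be sketched in a few lines. The $400/year conventional figure is from the quote above; the decomposition below (avoided cooling and real-estate costs, offset by a residential electricity premium) is a hypothetical breakdown for illustration only, with the line items chosen to bracket the paper's $280–$324 range.

```python
# Sketch of the data furnace cost comparison. The $400/year figure
# is from the quoted article; the individual line items below are
# hypothetical illustrations, not the paper's actual numbers.

CONVENTIONAL_COST = 400.0  # $/year per server in a standard data center

def data_furnace_cost(cooling_saved, space_saved, elec_premium):
    """Net annual cost per server hosted as a data furnace: the
    operator stops paying for cooling and data-center floor space,
    but pays a premium for residential electricity."""
    return CONVENTIONAL_COST - cooling_saved - space_saved + elec_premium

# Two hypothetical scenarios bracketing the paper's savings range:
best = CONVENTIONAL_COST - data_furnace_cost(cooling_saved=200, space_saved=150, elec_premium=26)
worst = CONVENTIONAL_COST - data_furnace_cost(cooling_saved=200, space_saved=150, elec_premium=70)
print(f"estimated annual savings: ${worst:.0f}-${best:.0f} per server")
```

The interesting design point is that the electricity premium is the one cost that goes *up* in a residential setting, so the scheme lives or dies on how much cooling and real-estate expense it avoids.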

You will have, of course, already thought of the most obvious objection or snag:

The main problem with Data Furnaces, of course, is physical security. Data centers are generally secure installations with very restricted access — which is fair enough, when you consider the volume and sensitivity of the data stored by companies like Facebook and Google. The Microsoft Research paper points out that sensor networks can warn administrators if physical security is breached, and whole-scale encryption of the data on the servers would ameliorate many other issues. The other issue is server management — home owners won’t want bearded techies knocking on their door every time a server needs a reboot — but for the most part, almost everything can now be managed remotely.

An interesting idea, certainly, but one that still depends on the extant hierarchical model of CPU/storage/bandwidth distribution. Better still (at least for this anarchist) would be for every home to have its own datacentre, with multiple redundant backups stored as fragments across other machines’ drives in a torrent-like fashion. Flops and bytes are already arguably basic utilities for life (for the more privileged among us, at least), and are unlikely to become less essential to us barring some existential-risk-scale catastrophe… so the ubiquitous home server becomes as inevitable as the microwave oven. Sure, that model’s not without its risk scenarios, but it devolves responsibility for (and management of) said risk to the end user, removing it from the corporation or government. Of course, not everyone sees that degree of personal responsibility for risk as a net social good… 🙂
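The torrent-like redundancy idea above can be sketched as fragment placement across peers. This is a toy, not a real protocol: the peer names and round-robin placement are invented for illustration, and a production system would use erasure coding rather than plain replication.

```python
# Toy sketch of torrent-style redundancy for home servers: split a
# blob into fragments and place `replicas` copies of each fragment
# on distinct peers, so the data survives the loss of any
# (replicas - 1) machines. Illustrative only; real systems would
# use erasure coding and a proper peer-selection protocol.

from itertools import cycle

def place_fragments(blob: bytes, peers: list, frag_size: int = 4, replicas: int = 2):
    fragments = [blob[i:i + frag_size] for i in range(0, len(blob), frag_size)]
    placement = {peer: [] for peer in peers}
    slots = cycle(peers)
    for idx, frag in enumerate(fragments):
        chosen = set()
        while len(chosen) < replicas:  # pick `replicas` distinct peers
            chosen.add(next(slots))
        for peer in chosen:
            placement[peer].append((idx, frag))
    return placement

layout = place_fragments(b"flops and bytes are basic utilities",
                         peers=["attic-box", "nextdoor-nas", "cafe-server"])
for peer, frags in layout.items():
    print(peer, "holds fragments", [i for i, _ in frags])
```

Each fragment ends up on two of the three (hypothetical) home servers, so any single machine can vanish without data loss — which is exactly the property that lets risk devolve to the edges instead of pooling in one corporate datacentre.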

More obviously still, though, the flaw in the data furnace plan is that it overlooks the most logical response to waste heat: developing more efficient computing hardware. After all, we have far more flops and bytes than the average domestic application really demands by this point… so instead of chasing BiggerBetterFasterMore, we could maybe chase SmallerCoolerLighterLess.


Form and function

Paul Raven @ 08-04-2011

As I progress into my thirties, I’m becoming more aware that I belong to a demographic targeted with nostalgia-based marketing. In terms of pop culture ephemera, I’ve remained relatively immune – the mainstream music and fashion of the eighties repelled me at the time, and has not lost its power to do so – but there is no escape; the technology industry has matured to an extent which allows it to mine its own past for aesthetic triggers that hit us lifelong early adopters like a punch to the gut, even when the product itself is quite obviously pointless in practical terms.

Case in point: Commodore returning to the computer hardware market with Linux-powered PCs dolled up in the form factors of its classic consumer-level home computers. This is the C64x:

Commodore C64x

Hi-ho, atemporality; there’s no point whatsoever in buying one of those unless you’re jonesing for the “authenticity” of the near past (which is itself pretty close to mythological anyway). Though we’re not quite at the point where ubicomp is a reality, Commodore’s “new” products represent an interesting point in the commodification curve of computing. Function is so cheap and easy to produce that form no longer has to play second fiddle; there’s more computing juice in your smartphone than was used to run the entire Apollo moon landing program, and you can shoehorn a usable computer into pretty much any container you desire. (Worth noting that this was an enthusiast’s hobby long before the manufacturers jumped on the bandwagon; casemodding has transcended its initial geeks-only cachet thanks to economies of scale.)

When computers first arrived, they looked like the vast, complex and aesthetically sterile engineering devices that they were. Now computing is sufficiently ubiquitous that they can look like whatever we want them to look like (which means that making them look like older and significantly less powerful machines is a momentary fillip of aesthetic irony; expect an imminent rash of computers that don’t look anything like what folk of my age-bracket think of when we hear the word “computer” – remember the Sandbenders custom computer from Bill Gibson’s Idoru?). The end-point of the curve will be the point where computers become effectively invisible; I hesitate to predict a solid time-scale for that, but I’d be surprised if it takes more than another decade.


Computronium == unobtainium

Paul Raven @ 09-03-2011

Via Next Big Future, Dr Suzanne Gildert of the excellently-named Physics & Cake blog takes apart the [science fictional / Singularitarian] concept of computronium, and does a pretty good job of explaining why it probably isn’t possible:

… as we see, atoms are already very busy computing things. In fact, you can think of the Universe as computing itself. So in a way, matter is already computronium, because it is very efficiently computing itself. [This reminds me of Rudy Rucker’s theories about gnarl and universe-as-computing-substrate – PGR] But we don’t really want matter to do just that. We want it to compute the things that WE care about (like the probability of a particular stock going up in value in the next few days). But in order to make the Universe compute the things that we care about, we find that there is an overhead – a price to pay for making matter do something different from what it is meant to do. Or, to give a complementary way of thinking about this: The closer your computation is to what the original piece of matter would do naturally, the more efficient it will be.

So in conclusion, we find that we always have to cajole matter into computing the things we care about. We must invest extra resources and energy into the computation. We find that the best way of arranging computing elements depends upon what we want them to do. There is no magic ‘arrangement of matter’ which is all things to all people, no fabled ‘computronium’. We have to configure the matter in different ways to do different tasks, and the most efficient configuration varies vastly from task to task.

If it’s not too meta a get-out clause, perhaps we could develop some sort of nanotech system for reconfiguring computational substrate matter into the most appropriate arrangement for the task at hand? Talk about an abstraction layer… 🙂


What Watson did next

Paul Raven @ 24-02-2011

Impressed by Watson’s Jeopardy! victory? Found yourself with the urge to build your own (scaled down) supercomputer artificial intelligence in your basement using nothing but off-the-shelf hardware and open-source software? IBM’s very own Tony Pearson has got your back. [via MetaFilter; please bear in mind that not all basements will be eminently suited to a research project of this scale]

Meanwhile, fresh from whuppin’ on us slow-brained meatbags, Watson’s seeking new challenges in the world of medicine [via BigThink]:

The idea is for Watson to digest huge quantities of medical information and deliver useful real-time information to physicians, perhaps eventually in response to voice questions. If successful, the system could help medical experts diagnose conditions or create a treatment plan.

… while other health-care technology can work with huge pools of data, Watson is the first system capable of usefully harnessing the vast amounts of medical information that exists in the form of natural language text—medical papers, records, and notes. Nuance hopes to roll out the first commercial system based on Watson technology within two years, although it has not said how sophisticated this system will be.

Ah, good old IBM. My father used to work for them back in the seventies and early eighties, and it’s kind of amusing to see that their age-old engineering approach of building an epic tool before looking for a use to put it to hasn’t changed a bit…

