Dark Silicon: an end to Moore’s Law?

Paul Raven @ 02-08-2011

From the New York Times:

A paper presented in June at the International Symposium on Computer Architecture summed up the problem: even today, the most advanced microprocessor chips have so many transistors that it is impractical to supply power to all of them at the same time. So some of the transistors are left unpowered — or dark, in industry parlance — while the others are working. The phenomenon is known as dark silicon.

As early as next year, these advanced chips will need 21 percent of their transistors to go dark at any one time, according to the researchers who wrote the paper. And in just three more chip generations — a little more than a half-decade — the constraints will become even more severe. While there will be vastly more transistors on each chip, as many as half of them will have to be turned off to avoid overheating.
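The arithmetic behind those projections is worth making explicit. Here's a toy model (my own illustrative scaling factors, not numbers from the paper): if transistor counts double each generation but energy efficiency per transistor improves more slowly, then under a fixed power budget the fraction of the chip that must stay dark grows generation by generation.

```python
def dark_fraction(generations, density_scale=2.0, power_eff_scale=1.4):
    # Toy model with made-up constants: transistor count doubles per
    # generation, but the number of transistors the fixed power budget
    # can feed grows more slowly (the post-Dennard-scaling problem).
    transistors = density_scale ** generations
    affordable = power_eff_scale ** generations  # how many we can power
    powered = min(1.0, affordable / transistors)
    return 1.0 - powered

for gen in range(4):
    print(gen, round(dark_fraction(gen), 2))
```

With these invented factors the dark fraction climbs from zero to roughly two-thirds over three generations, which is qualitatively the trajectory the paper describes.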

Personally, I’m not going to sing a requiem for Moore’s Law just yet; many brick walls have been predicted for it, and they’ve always been engineered around eventually. That said, there are limits to almost everything, and perhaps silicon architecture will finally reach its apogee. I think the real question to ask here is “would that be a bad thing?” An upper limit on computing power might just lead to software that uses what’s available more efficiently…

(Top marks for the suitably doomy and mysterious moniker “dark silicon”, though; that’s a post-cyberpunk novel title just waiting to be used.)

Form and function

Paul Raven @ 08-04-2011

As I progress into my thirties, I’m becoming more aware of my status as a demographic that is targeted with nostalgia-based marketing. In terms of pop culture ephemera, I’ve remained relatively immune – the mainstream music and fashion of the eighties repelled me at the time, and has not lost its power to do so – but there is no escape; the technology industry has matured to an extent which allows it to mine its own past for aesthetic triggers that hit us lifelong early adopters like a punch to the gut, even when the product itself is quite obviously pointless in practical terms.

Case in point: Commodore returning to the computer hardware market with Linux-powered PCs dolled up in the form factors of their classic consumer-level home computers. This is the C64x:

Commodore C64x

Hi-ho, atemporality; there’s no point whatsoever in buying one of those unless you’re jonesing for the “authenticity” of the near past (which is itself pretty close to mythological anyway). Though we’re not quite at the point where ubicomp is a reality, Commodore’s “new” products represent an interesting point in the commodification curve of computing. Function is so cheap and easy to produce that form no longer has to play second fiddle; there’s more computing juice in your smartphone than was used to run the entire Apollo moon landings program, and you can shoehorn a useable computer into pretty much any container you desire. (Worth noting that this was an enthusiast’s hobby long before the manufacturers jumped on the bandwagon; casemodding has transcended its initial geeks-only cachet thanks to economies of scale.)

When computers first arrived, they looked like the vast, complex and aesthetically sterile engineering devices that they were. Now computing is sufficiently ubiquitous that they can look like whatever we want them to look like (which means that making them look like older and significantly less powerful machines is a momentary fillip of aesthetic irony; expect an imminent rash of computers that don’t look anything like what folk of my age-bracket think of when we hear the word “computer” – remember the Sandbenders custom computer from Bill Gibson’s Idoru?). The end-point of the curve will be the point where computers become effectively invisible; I hesitate to predict a solid time-scale for that, but I’d be surprised if it takes more than another decade.

The end of the PC era is 18 months away

Paul Raven @ 08-12-2010

So claims this piece at ComputerWorld, anyhow, parroting the findings of a market research firm about the unit-numbers of smartphones and tablet devices to be shipped when compared with sales of “traditional” personal computers [via SlashDot]:

It may be seen as a historic shift, but it is one that tells more about the development of a new market, mobile and tablet computing, than the decline of an older one, the PC. Shipments of personal computers will continue to increase even as they are surpassed by other devices.

IDC said worldwide shipments this year of app-enabled devices, which include smartphones and media tablets such as the iPad, will reach 284 million. In 2011, makers will ship 377 million of these devices, and in 2012, the number will reach 462 million shipments, exceeding PC shipments. One shipment equals one device.

I think an end to the dominance of the PC is pretty inevitable, and indeed has been happening for some time – I don’t know many people whose home computer isn’t a laptop, for instance, which seems indicative of a desire for computing-as-convenient-commodity rather than computer-as-installation or computer-as-machine.

But will they vanish completely from the consumer marketplace? I’m not so sure… I use a desktop tower by choice, because I like to be able to build, maintain and upgrade my hardware myself, but that marks me as a relic of sorts, and an inheritor of my father’s engineer-esque attitudes to computers*. But as devices get cheaper, more powerful and more disposable, that impetus may fade away.

Whether or not disposability is a path we should be pleased to follow is another question entirely, of course…

[ * My first PC was his handed-down 8086, which he insisted I help him assemble and test; with hindsight, that’s one of those incredibly pivotal moments in a life. ]

Cheaper, more open tablets: this is exactly why I had no interest in buying an iPad

Paul Raven @ 13-07-2010

No, I’m not about to start bitching about Apple’s flagship gizmo and what it can or can’t do (although, if you want to buy me a beer or two in meatspace, I’d be more than happy to give you my two uninformed but moderately passionate cents on that).

Instead, I’m just going to point to evidence of exactly what I’ve been saying would happen: that within a very short amount of time after the iPad’s launch, you’d be able to get cheaper hardware with the same or greater functionality, and run a FOSS operating system on it that lets you get applications from anywhere you choose. So, via eBooknewser, here’s a guide to hacking the US$200 Pandigital Novel tablet device so it’ll run the Android operating system. Come Christmas time this year, there’ll be dozens of machines just like that kicking around all over the place, only cheaper still.

Speaking of Android, there’s a lot of noise about the way that Google are working on a kind of visual development system designed to let folk with minimal coding knowledge develop apps that will run on Android – again, a stark contrast to the walled-garden quality control of Apple’s development kits. Sure, the Android market will be flooded with crap and/or dodgy apps as a result… but letting the good stuff bubble to the top is what user rating systems and [editors/curators/gatekeepers] are for, right?

The Processor Wars

Paul Raven @ 12-07-2010

There are many ways to make a profit; one of them is to make a better product than the competition, but sometimes that alone is not enough, especially when you make the components of complex devices like computers. So maybe you could think about building loopholes into your product that make the competition’s product look inferior when used in the same system? There are suggestions that’s what nVidia has been doing:

PhysX is designed to make it easy for developers to add high-quality physics simulation to their games, so that cloth drapes the way it should, balls bounce realistically, and smoke and fragments (mostly from exploding barrels) fly apart in a lifelike manner. In recognition of the fact that game developers, by and large, don’t bother to release PC-only titles anymore, NVIDIA also wisely ported PhysX to the leading game consoles, where it runs quite well on console hardware.

If there’s no NVIDIA GPU in a gamer’s system, PhysX will default to running on the CPU, but it doesn’t run very well there. You might think that the CPU’s performance deficit is due simply to the fact that GPUs are far superior at physics emulation, and that the CPU’s poor showing on PhysX is just more evidence that the GPU is really the component best-equipped to give gamers realism.

Some early investigations into PhysX performance showed that the library uses only a single thread when it runs on a CPU. This is a shocker for two reasons. First, the workload is highly parallelizable, so there’s no technical reason for it not to use as many threads as possible; and second, it uses hundreds of threads when it runs on an NVIDIA GPU. So the fact that it runs single-threaded on the CPU is evidence of neglect on NVIDIA’s part at the very least, and possibly malign neglect at that.

Whether it is malign remains to be seen (Occam’s Razor may well apply here, but then again it may not), but this is still an interesting development: in a world where most new inventions are part of larger systems, the battle for sales isn’t simply a matter of making your own product better. Granted, talking down the value of a competitor’s product has been a core strategy of public relations for years, but actually attenuating that value in deployment strikes me as being something pretty new, if only because it wasn’t really possible before. Unless anyone can suggest a situation where this has happened before?
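The quoted threading point is easy to illustrate: in a simple rigid-body integrator, each body’s update depends only on that body’s own state, so the work splits cleanly across threads with no coordination. A minimal Python sketch of that partitioning (purely illustrative, and obviously not NVIDIA’s actual code; CPython’s GIL means a real engine would do this in native threads):

```python
from concurrent.futures import ThreadPoolExecutor

def integrate_chunk(chunk, dt=0.016):
    # Each body's update reads and writes only its own state, so
    # disjoint chunks can be processed independently, with no locks.
    for body in chunk:
        body["vy"] -= 9.81 * dt          # gravity
        body["x"] += body["vx"] * dt
        body["y"] += body["vy"] * dt
    return chunk

def step_parallel(bodies, workers=4):
    # Split the body list into one slice per worker and integrate
    # each slice on its own thread.
    size = max(1, len(bodies) // workers)
    chunks = [bodies[i:i + size] for i in range(0, len(bodies), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(integrate_chunk, chunks))
    return [b for chunk in results for b in chunk]

bodies = [{"x": 0.0, "y": 10.0, "vx": 1.0, "vy": 0.0} for _ in range(1000)]
bodies = step_parallel(bodies)
```

Because the chunks are disjoint, no locking is needed; that independence is exactly why a single-threaded CPU path reads as neglect rather than necessity.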
