Tag Archives: efficiency

Wicked Problems and ends to limitless [x]

That Steelweaver post on Reality As A Failed State I mentioned a few days back really did the rounds. So I’m going to link to Karl Schroeder at Charlie Stross’s blog once again, and without any sense of shame – he’s been quiet for ages, but he’s spooling out a year’s worth of good shizzle over the space of a few weeks at the moment, and I think he’s a voice worth paying attention to.

Here he is talking about the “metaproblems” that Steelweaver mentioned, which have not only been known and named (as “wicked problems”) for some time, but are already a subject of intense study… which is a good thing, too.

It is not the case that wicked problems are simply problems that have been incompletely analyzed; there really is no ‘right’ formulation and no ‘right’ answer. These are problems that cannot be engineered. The anger of many of my acquaintances seems to stem from the erroneous perception that they could be solved this way, if only those damned republicans/democrats/liberals/conservatives/tree-huggers/industrialists/true believers/denialists didn’t keep muddying the waters. Because many people aren’t aware that there are wicked problems, they experience the failure to solve major complex world issues as the failure of some particular group to understand ‘the real situation.’ But they’re not going to do that, and granted that they won’t, the solutions you work on have to incorporate their points-of-view as well as your own, or they’re non-starters. This, of course, is mind-bogglingly difficult.

Our most important problems are wicked problems. Luckily, social scientists have been studying this sort of mess since, well, since 1970. Techniques exist that will allow moderately-sized groups with widely divergent agendas and points of view to work together to solve highly complex problems. (The U.S. Congress apparently doesn’t use them.) Structured Dialogic Design is one such methodology. Scaling SDD sessions to groups larger than 50 to 70 people at a time has proven difficult–but the fact that it and similar methods exist at all should give us hope.

Here are a few wicked problems I think are exemplary. I touched on one of them yesterday, in fact, namely the roboticisation curve in manufacturing; far from liberating the toiling masses in some utopian fusion of Marx and capitalism, it might well increase the polarisation and widen the gap between the poor masses and the super-rich elites, a process that Global Dashboard’s Alex Evans refers to as “jobless growth”:

In some developed economies (and especially the US), research suggests that job opportunities are increasingly being polarised into high and low skill jobs, while middle class jobs are disappearing due to “automation of routine work and, to a smaller extent, the international integration of labour markets through trade and, more recently, offshoring”. Meanwhile, data also show that while more women are entering the global labour force, the ‘gender gap’ on income and quality of work is widening between women and men. These trends raise a number of critical uncertainties for employment and development to 2020.

If automation of routine work genuinely is a more significant factor in developed economy job polarization than international trade or offshoring, then the implication is that developing economies may increasingly also fall prey to job polarisation as new technologies emerge and become competitive with human labour between now and 2020. Chinese manufacturing and Indian service industry jobs could increasingly be replaced by technology, for example, and find their existing rates of inequality exacerbated still further.

And here’s a serendipitous look at the economics of a world where replicators and 3d printing become cheap enough to be ubiquitous [via SlashDot]:

Prices for 3D printers are tumbling. Even simple systems often cost tens of thousands of dollars a decade ago. Now, 3D printers for hobbyists can be had for a fraction of that: MakerBot Industries offers a fully assembled Thing-O-Matic printer for just $2,500, and kits for building RepRap printers have sold for $500. The devices could be on track for mass-production as home appliances within just a few years.

So, will we all soon be living like Arabian Nights sultans with a 3D printing genie ready to grant our every wish? Could economies as we know them even survive in such a world, where the theoretically infinite supply of any good should drive its value toward zero?

The precise limitations of replicator technology will determine where scarcity and foundations for value will remain. 3D printers need processed materials as inputs. Those materials and all the labor required to mine, grow, synthesize or process them into existence will still be needed, along with the transportation costs to bring them to the printers. The energy to run a replicator might be another limiting factor, as would be time (would you spend three days replicating a toaster if you could have one delivered to your home in an hour?). Replicators will also need inputs to tell them how to make specific objects, so the programming and design efforts will still have value.

[…]

Perhaps the most important limitation on the replicator economy may be competition from good old mass production. Custom-tailored suits may be objectively better than off-the-rack outfits, but people find that the latter are usually the more sensible, affordable purchase. Mass production—especially by factories adopting nimble 3D-printing technologies—can still provide marvelous economies of scale. So even when it is theoretically possible for anyone to fabricate anything, people might still choose to restrict their replicating to certain goods—and to continue making their tea with a store-bought teabag.

The unspoken underpinning of that last paragraph (as hinted by my bolding) is the important bit: the economies of scale of fabbing will see more and more human labour replaced by machines – machines that don’t need holidays, or even sleep; machines that don’t get tired and make a higher percentage of dud iterations as a result; machines that, before too long, will be able to make other machines as required. The attraction of such a system to Big Capital (and small capital, too) is pretty obvious.

And all in the name of chasing perpetual infinite growth, a central assumption of most modern economic thought (or at least the stuff I’ve encountered so far) that relies on a lot of other assumptions… like, say, the assumption that we’ll always be able to either produce more energy, or use the amount we have available more efficiently [via MetaFilter]:

It seems clear that we could, in principle, rely on efficiency alone to allow continued economic growth even given a no-growth raw energy future (as is inevitable). The idea is simple. Each year, efficiency improvements allow us to drive further, light more homes, manufacture more goods than the year before—all on a fixed energy income. Fortunately, market forces favor greater efficiency, so that we have enjoyed the fruits of a constant drum-beat toward higher efficiency over time. To the extent that we could continue this trick forever, we could maintain economic growth indefinitely, and all the institutions that are built around it: investment, loans, banks, etc.

But how many times can we pull a rabbit out of the efficiency hat? Barring perpetual motion machines (fantasy) and heat pumps (real; discussed below), we must always settle for an efficiency less than 100%. This puts a bound on how much gain we might expect to accomplish. For instance, if some device starts out at 50% efficiency, there is no way to squeeze more than a factor of two out of its performance.

[…]

Given that two-thirds of our energy resource is burned in heat engines, and that these cannot improve much more than a factor of two, more significant gains elsewhere are diminished in value. For instance, replacing the 10% of our energy budget spent on direct heat (e.g., in furnaces and hot water heaters) with heat pumps operating at their maximum theoretical efficiency effectively replaces a 10% expenditure with a 1% expenditure. A factor of ten sounds like a fantastic improvement, but the overall efficiency improvement in society is only 9%. Likewise with light bulb replacement: large gains in a small sector. We should still pursue these efficiency improvements with vigor, but we should not expect this gift to provide a form of unlimited growth.

On balance, the most we might expect to achieve is a factor of two net efficiency increase before theoretical limits and engineering realities clamp down. At the present 1% overall rate, this means we might expect to run out of gain this century. Some might quibble about whether the factor of two is too pessimistic, and might prefer a factor of 3 or even 4 efficiency gain. Such modifications may change the timescale of saturation, but not the ultimate result.
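As a quick sanity check on those numbers, here’s a minimal back-of-the-envelope sketch in Python. The 10% direct-heat share and the factor-of-ten heat-pump improvement are taken straight from the quoted passage; treating the 1%-per-year gain as compounding until it hits the factor-of-two ceiling is my own reading of the argument, so take it as illustration rather than gospel.

    import math

    # Quoted assumptions: ~10% of the energy budget goes on direct heat
    # (furnaces, hot water heaters), and an ideal heat pump could deliver
    # the same heat for roughly a tenth of the input energy.
    direct_heat_share = 0.10
    heat_pump_gain = 10

    new_spend = direct_heat_share / heat_pump_gain   # 10% becomes 1%
    overall_saving = direct_heat_share - new_spend   # 0.09, i.e. a 9% society-wide saving
    print(f"Society-wide saving from ideal heat pumps: {overall_saving:.0%}")

    # How long does a compounding 1%-per-year efficiency gain take to deliver
    # the factor of two that theory allows before clamping down?
    years_to_double = math.log(2) / math.log(1.01)
    print(f"Years to reach a factor-of-two gain at 1%/yr: {years_to_double:.0f}")  # ~70 years

Seventy-odd years of headroom, in other words, which is exactly why the quoted piece talks about running out of gain this century.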

So it ain’t just Moore’s Law that could be running into a brick wall real soon. A whole lot of caltrops on the highway to the future, then… and we’re still arguing about how to bolt more governors and feedback loops onto fundamentally broken politicoeconomic systems. Wicked problems, indeed. It’s hard not to feel bleak as we look into the eye of this abyss, but Schroeder suggests there’s a way out:

Here’s my take on things: our biggest challenges are no longer technological. They are issues of communication, coordination, and cooperation. These are, for the most part, well-studied problems that are not wicked. The methodologies that solve them need to be scaled up from the small-group settings where they currently work well, and injected into the DNA of our society–or, at least, built into our default modes of using the internet. They then can be used to tackle the wicked problems.

What we need, in other words, is a Facebook for collaborative decision-making: an app built to compensate for the most egregious cognitive biases and behaviours that derail us when we get together to think in groups. Decision-support, stakeholder analysis, bias filtering, collaborative scratch-pads and, most importantly, mechanisms to extract commitments to action from those that use these tools. I have zero interest in yet another open-source copy of a commercial application, and zero interest in yet another Tetris game for Android. But a Wikipedia’s worth of work on this stuff could transform the world.
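Purely to make that wish-list concrete, here’s a rough sketch of the sort of objects such a tool might need to track. Every name and field below is my own illustrative invention rather than anything Schroeder specifies, and a real system would obviously need far more than this.

    from dataclasses import dataclass, field

    @dataclass
    class Stakeholder:
        name: str
        stated_interests: list[str] = field(default_factory=list)

    @dataclass
    class Proposal:
        summary: str
        author: Stakeholder
        # Raw material for the bias-filtering and stakeholder-analysis steps.
        objections: list[str] = field(default_factory=list)

    @dataclass
    class Commitment:
        """The bit Schroeder stresses: an attributable promise to act."""
        stakeholder: Stakeholder
        action: str
        due: str  # kept as plain text in this sketch, e.g. an ISO date

    @dataclass
    class Session:
        problem_statement: str
        stakeholders: list[Stakeholder] = field(default_factory=list)
        proposals: list[Proposal] = field(default_factory=list)
        commitments: list[Commitment] = field(default_factory=list)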

Digital direct democracy, in other words, with mechanisms built in to ameliorate the broken bits of our psychology. Oh, sure, you can scoff and say it’ll never work, but even a flimsy-looking boat starts looking like it’s worth a shot when the tired old paddle-steamer starts doing its Titanic impersonation in the middle of the swamp. What Schroeder (and many others) are suggesting is eminently possible; all we lack is the political will to build it.

And it’s increasingly plain that we’re not going to find that will in the bickering halls of the incumbent system; it’s only interested in maintaining its own existence for as long as possible, and damn the consequences.

Which is why we need to turn our backs on that system and build its replacement ourselves.

Peak Travel

Trends suggest that the demand for transit is flattening out in the industrialized West. Ars Technica:

… prior to recent years, two forms of transit have driven most of the growth in miles travelled, and thus energy use: air and car travel. And, although air travel has continued to increase, car travel has started to decline (a trend that predates the oil price shock of recent years). As a result, since 2003, total miles travelled have flattened out and have started to decline in some countries. This flattening out is even more apparent when graphed against per-capita GDP. Here, most countries show a flattening out once they hit a per-capita GDP of $25,000 (in the US, the figure is $35,000, while Sweden shows a continuing rise).

There are lots of individual features hidden within these general trends. For example, the US drop in the energy intensity of car travel stalled once mileage standards languished in the 1990s. In contrast, European countries started raising their gasoline taxes around the same time, and experienced the opposite trend. Longer flights are also less energy-intensive, which means that domestic air travel is less energy-intensive in nations like Australia, Canada, and the US simply as a function of geography.

Nevertheless, the authors argue that the GDP-related trends, which are more consistent across countries, suggest that there might be some common factors underlying the decline in travel, such as urbanization, increased taxes, aging populations, a saturation of automobile ownership, and a basic desire not to spend any more time behind the wheel. Carpooling has also seemed to decline to the point where it probably won’t go down much further.

The folk behind the study are wisely reluctant to project into the future, though they suggest that “continued, steady growth in travel demand cannot be relied upon.”

I fully expect that the next few weeks will see a rash of pundits suggesting that this flattening of trends means we can stop worrying about carbon emissions and climate change, to be met by a rash of counter-claims at the opposite extreme; between all the shouting, nothing of note will be achieved. Both sides can call me back when they start basing their narratives on the evidence, rather than crowbarring the evidence into their narrative. This Red vs Blue bullshit is starting to bore me.

Tax ’em back into town?

The UK iteration of Wired is doing a themed issue entitled “Rebooting Britain”, kicking around ideas for changing the face of an already-changing nation for the better. Many of them could be more broadly applied to any Western/developed nation, but a few of them address issues that are somewhat more specific to the UK. For example, Britain is apparently one of the very few nations where the percentage of people living in cities is not increasing; this doubtless has a lot to do with deep-rooted notions of the romance and allure of country living that inform the English psyche, though the increasing proliferation of surveillance and petty bureaucracy in urban areas may well be a contributing factor too.

But the rural lifestyle is disproportionately expensive from an environmental perspective; people who live in the country need to drive further and more often, they need to use more energy for heating their homes, and so on. So, P D Smith suggests, why don’t we tax the rural lifestyle heavily to encourage people back into more efficient city living?

To create a low-carbon economy we need to become a nation of city dwellers. We tax cigarettes to reflect the harm they do to our health: we need to tax lifestyles that are damaging the health of the planet – and that means targeting people who choose to live in the countryside. We need a Rural Living Tax. Agricultural workers and others whose jobs require them to live outside cities would be exempt. The revenue raised could be used to build new, well-planned cities and to radically upgrade the infrastructure of existing cities.

We have an opportunity to create an urban renaissance, to make cities attractive places to live again – not just for young adults, but for families and retired people, the groups most likely to leave the city. Turning our old cities into “smart cities” won’t be easy or cheap, but in a recession this investment in infrastructure will boost the economy. We need to learn to love our cities again, because they will help us to save the planet.

It’s a nice idea and well-meant, but there are some pretty obvious flaws to the suggestion. First and foremost, Smith seems to have overlooked the fact that the affluent middle classes who are at the centre of the migration into the countryside are the most politically active slice of the UK population, and no government in its right mind (if such a thing exists) is going to risk alienating them by crushing their dreams of “getting away from it all” with their hard-earned money.

Another problem is the assumption that country living is necessarily less energy efficient. As the months pass, more and more middle-class jobs will fall into the sphere of knowledge work, making them ideally suited for telecommuting… which is something that businesses looking to save on their payroll overheads are starting to wake up to. Offer the chance to work from home in exchange for a smaller pay-packet, and there’ll be a significant take-up.

Plus country houses – while usually bigger and less efficient than city dwellings – are more easily retrofitted for energy efficiency, and more likely to have the money spent on them by their owners rather than through government grants. If there’s a clear economic benefit to investing in a “greener” household, you can bet your life the middle class will be all over it like a rash, especially once a few trendsetters start doing it and trumpeting the benefits.

And let’s not forget that homes in the countryside are theoretically closer to domestic sources of food; with a little logistical planning and some smart entrepreneurship, even small villages could become efficient nexuses for local produce distribution. Hell, they could even start aiming for self-sufficiency and community agriculture, like the Pennines town of Todmorden, which is showing signs of successfully shifting toward community farming and a “locavore” lifestyle [via Global Guerrillas, of all places].

In short, there are definite downsides to the British rural exodus, but using the blunt instrument of tax to reverse it is bound to fail. Better, surely, to embrace the rural shift and let economics do the hard work for you?

Zipf’s Law – modelling the megalopolis

More statistical sensawunda in the urban environment. Remember us mentioning that guy who suggested that cities can be considered as super-organisms? Well, a mathematician chap called Steven Strogatz dropped into the New York Times blogs to talk about Zipf’s Law and other statistical phenomena that surround our urban environments:

The mathematics of cities was launched in 1949 when George Zipf, a linguist working at Harvard, reported a striking regularity in the size distribution of cities. He noticed that if you tabulate the biggest cities in a given country and rank them according to their populations, the largest city is always about twice as big as the second largest, and three times as big as the third largest, and so on. In other words, the population of a city is, to a good approximation, inversely proportional to its rank. Why this should be true, no one knows.

[…]

For instance, if one city is 10 times as populous as another one, does it need 10 times as many gas stations? No. Bigger cities have more gas stations than smaller ones (of course), but not nearly in direct proportion to their size. The number of gas stations grows only in proportion to the 0.77 power of population. The crucial thing is that 0.77 is less than 1. This implies that the bigger a city is, the fewer gas stations it has per person. Put simply, bigger cities enjoy economies of scale. In this sense, bigger is greener.

The same pattern holds for other measures of infrastructure. Whether you measure miles of roadway or length of electrical cables, you find that all of these also decrease, per person, as city size increases. And all show an exponent between 0.7 and 0.9.

Now comes the spooky part. The same law is true for living things. That is, if you mentally replace cities by organisms and city size by body weight, the mathematical pattern remains the same.
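If you want to feel the shape of those two regularities without trawling through census data, a few lines of Python will do it; the city size, the constant and the 0.77 exponent below are illustrative assumptions rather than real figures from the article.

    def zipf_population(largest_city_pop: float, rank: int) -> float:
        """Rank-size rule: the rank-n city is roughly 1/n the size of the largest."""
        return largest_city_pop / rank

    def gas_stations(population: float, k: float = 0.02, beta: float = 0.77) -> float:
        """Sublinear scaling: infrastructure grows as population**beta, with beta < 1."""
        return k * population ** beta

    # Assume (purely for illustration) a largest city of 8 million people.
    for rank in range(1, 6):
        pop = zipf_population(8_000_000, rank)
        stations = gas_stations(pop)
        print(f"rank {rank}: pop {pop:>9,.0f}, "
              f"stations per 10,000 people: {stations / pop * 10_000:.2f}")
    # The per-capita figure is smallest for the rank-1 city and grows as you move
    # down the ranking: the "bigger is greener" economy of scale Strogatz describes.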

It looks as if there are a lot of things that mathematical analysis could tell us about the cities we live in. The question is, are these properties inherently emergent, or could we design our urban environments more effectively and adjust some of those efficiency values in the process? [image by tylerdurden1]

Recycling waste heat in computers to increase efficiency

The ever-louder whining of my computer’s processor fan is a constant reminder that there’s a lot of energy wasted in modern microprocessors (and that it’s high time I replaced the ageing beast with a machine less likely to collapse at any moment).

While we’re unlikely to be offered room-temperature computer systems any time soon, engineers in the emerging field of phononics are looking at ways to harvest that waste heat and make computers more efficient in the process:

It exploits the fact that some materials can only exchange heat when they are at similar temperatures. The small memory store at the heart of their design is set to either a 1 or 0 temperature by an element that can rapidly shunt in or draw out heat. The store itself is sandwiched between two large chunks of other materials.

One of those materials is constantly hot, but can only donate heat to the memory store when that too is hot, in the 1 state. The material on the other side of the memory patch is always kept cold, but can draw heat away from the store whatever state it is in.
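To make the logic of that (rather than the physics) a bit more concrete, here’s a toy model in Python; the temperatures, leak and refresh rates are arbitrary numbers of my own, so treat it as a cartoon of the described behaviour, not a simulation of the device.

    # Cartoon of the thermal memory bit: a cold reservoir always leaks heat out,
    # but the hot reservoir can only top the store up when it is already hot.
    def step(store_temp: float, hot=1.0, cold=0.0, leak=0.05, refresh=0.5) -> float:
        store_temp -= leak * (store_temp - cold)        # cold side always draws heat away
        if store_temp > 0.5:                            # "similar temperatures" condition
            store_temp += refresh * (hot - store_temp)  # hot side can donate heat
        return store_temp

    state = 1.0                        # write a "1" by shunting heat in
    for _ in range(50):
        state = step(state)
    print(round(state, 2))             # ~0.95: the hot state is continually refreshed

    state = 0.0                        # write a "0" by drawing heat out
    for _ in range(50):
        state = step(state)
    print(round(state, 2))             # 0.0: the hot reservoir can't couple to a cold store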

Early days yet, of course, but maybe thermal computing will give Moore’s Law another stay of execution when we reach the practical limits of circuit integration. [via SlashDot; image by Ioan Sameli]