Battlefield Morality 2.0

Paul Raven @ 23-02-2009

To brighten your Monday morning, here’s some speculation on robot morality – though not from one of the usual sources. Nick Carr bounces off a Times Online story about a report from the US Office of Naval Research which “strongly warns the US military against complacency or shortcuts as military robot designers engage in the ‘rush to market’ and the pace of advances in artificial intelligence is increased.”

Carr digs into the text of the report itself [pdf], which demonstrates a caution somewhat at odds with the usual media image of the military-industrial complex:

Related major research efforts also are being devoted to enabling robots to learn from experience, raising the question of whether we can predict with reasonable certainty what the robot will learn. The answer seems to be negative, since if we could predict that, we would simply program the robot in the first place, instead of requiring learning. Learning may enable the robot to respond to novel situations, given the impracticality and impossibility of predicting all eventualities on the designer’s part. Thus, unpredictability in the behavior of complex robots is a major source of worry, especially if robots are to operate in unstructured environments, rather than the carefully‐structured domain of a factory.

The report goes on to consider potential training methods, and suggests that some sort of ‘moral programming’ might be the only way to ensure that our artificial warriors don’t run amok when exposed to the unpredictable scenario of a real conflict. Perhaps Carr is a science fiction reader, because he’s thinking beyond the obvious answers:

Of course, this raises deeper issues, which the authors don’t address: Can ethics be cleanly disassociated from emotion? Would the programming of morality into robots eventually lead, through bottom-up learning, to the emergence of a capacity for emotion as well? And would, at that point, the robots have a capacity not just for moral action but for moral choice – with all the messiness that goes with it?

It’s a tricky question; essentially the military want to have their cake and eat it, replacing fallible meat-soldiers with reliable mechanical substitutes that can do all the clever stuff without the emotional trickiness that the ability to do clever stuff brings with it as part of the bargain. [image by Dinora Lujan]

I’d go further still, and ask whether that capacity for emotion and moral action actually defeats the entire point of using robots to fight wars – in other words, if robots are supposed to take the place of humans in situations we consider too dangerous to expend real people on, how close do a robot’s emotions and morality have to be to their human equivalents before it becomes immoral to use them in the same way?
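For what it’s worth, the ‘moral programming’ the report gestures at is usually imagined as something like an explicit constraint layer sitting between a learned controller and the machine’s actuators – along the lines of Ron Arkin’s proposed ‘ethical governor’. Here’s a minimal, purely hypothetical Python sketch of the idea (none of these names, rules or thresholds come from the report); it illustrates why hard-coded rules are auditable in a way that learned behaviour isn’t, but also why they don’t make the learned part any more predictable:

```python
# Toy sketch (not from the ONR report): a hard-coded "moral constraint"
# layer vetoing whatever a learned policy proposes. All names and rules
# here are hypothetical, purely to illustrate the programmed-vs-learned
# distinction discussed above.

from dataclasses import dataclass


@dataclass
class Action:
    kind: str                        # e.g. "engage", "observe", "withdraw"
    target_confirmed_hostile: bool
    civilians_in_blast_radius: int


def learned_policy(observation) -> Action:
    """Stand-in for an opaque, learned controller whose outputs
    we cannot fully predict in advance."""
    # ... imagine a trained neural network here ...
    return Action("engage", target_confirmed_hostile=False,
                  civilians_in_blast_radius=3)


def permitted(action: Action) -> bool:
    """Explicit, auditable rules: the 'moral programming' part."""
    if action.kind != "engage":
        return True
    return (action.target_confirmed_hostile
            and action.civilians_in_blast_radius == 0)


def act(observation) -> Action:
    proposed = learned_policy(observation)
    # The constraint layer can only veto and fall back to a safe default;
    # it can't make the underlying policy any more predictable.
    return proposed if permitted(proposed) else Action("withdraw", False, 0)


print(act(observation=None).kind)   # -> "withdraw"
```

The catch, of course, is exactly the one the report flags: the veto layer only constrains what the learned part is allowed to do, not what it will try to do.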


Screw optimism – this is a global guerilla century

Paul Raven @ 10-02-2009

John Robb isn’t going to give you the news you want to hear. Nope, sorry – the Depression scenario has already emerged fully, and the results are not going to be pretty as we transition into a new politico-economic era in its wake:

A global depression, in and of itself, isn’t the end of the world. However, it can set in motion unexpected events (black swans) — as in how the last depression catalyzed WW2. The revisionist effort to this economic collapse isn’t likely to be a surge in ideology or nationalism. Instead, we can expect an organic realignment as small groups of people form new primary loyalties (either to violent manufactured tribes or resilient communities), slot themselves into open source movements, and challenge a wheezing group of incumbent nation-states. This is a global guerrilla century.

So, not exactly a rosy outlook… and a poke in the eye for the Positive Manifesto school of sf, perhaps. That said, there are plenty of starting points in Robb’s material for the more dystopian-leaning writer to tackle! [image by Keith Bacongco]

But what do you think – is Robb looking at a worst-case scenario and seeing Mad Max re-runs, or is he being generous with the possibilities of civilisational collapse?


Only the smart die young

Paul Raven @ 19-12-2008

You’d probably think that intelligence would be an asset on the modern battlefield, and hence that the smart soldiers would be the ones to survive, right?

Well, as logical as that sounds, it may not be the case: a study comparing records from Scottish army units in WW2 with education-system records from about a decade earlier suggests that the average IQ of those who survived the war was lower than that of those who lost their lives.


Are the Olympics a convenient smokescreen for the conflict in Georgia?

Paul Raven @ 12-08-2008

When I first heard the news about Russia’s invasion of Georgia last Friday (not coincidentally via Twitter rather than the mainstream media), my immediate thought was “well, you timed that neatly, didn’t you?”

I then shrugged it off as paranoid cynicism on my part, but it appears I’m not alone in suspecting that Russia quite deliberately waited until the world was busy watching the Olympic Games before launching their strike on Georgia. [Via Sentient Developments]

And the more I think about it, the more likely it seems – after all, DDoS cyberwarfare is part of the military game-plan now, so why not use current events to enhance the fog of war a little bit?

Judging by this morning’s typically vapid radio news bulletins here in the UK (fifteen seconds on Georgia, two minutes on the Olympics, two minutes on soccer), it appears to be a pretty effective tactic, albeit one that exploits our natural tendency to ignore bad news unless we feel it affects us directly.

The only remotely pleasant side to this line of thought is the possibility that one day wars will be fought entirely through media channels, obviating the need for the death and displacement of thousands of innocent people. Yeah, so I’m a dreamer. Sue me. [image from Wikimedia Commons]


Geoengineering – a new form of warfare?

Paul Raven @ 02-01-2008

Jamais Cascio has been having some unsettling thoughts about the potential of geoengineering technologies to provide nation-states with subtle yet powerful alternatives to conventional warfare:

“Geoengineering as a military strategy would appear to offer a variety of benefits. Research can be done out in the open, taking advantage of civilian work on anti-global warming geoengineering ideas. If my argument that nuclear weapons and open-source warfare have made conventional warfare essentially obsolete is correct, climate-based warfare would offer an alternative non-nuclear weapon, one that would be out of the reach of non-state actors. And the more we learn about how human activities alter the climate — in order to alter those activities — the more options might open up for intentionally harmful manipulation.”

Yikes. How’s that for taking the edge off your new year optimism, eh? 😉

Still, it strengthens my theory that nation-states are a root cause of a lot of the challenges we face. Call me a hippie if you will, but isn’t it high time we got over this arbitrary geographical factionalism and realised we’re all in the same boat? [Image by Cikaga Jamie]

