We are all cyborgs

Paul Raven @ 12-01-2011

That’s probably not news to most of you regulars here (especially not those of you who followed along with the #50cyborgs project), but this short sharp TED video featuring cyborg anthropologist Amber Case manages to explain in simple terms what “we are all cyborgs” actually means. (Hint: it’s not that the machines are taking over.)

Like I say, somewhat entry-level by Futurismic standards (unless I’ve misjudged y’all), but a good one to show to folks who can’t seem to get past the tabloid tech-terror stories, perhaps.

[ This one’s via all sorts of people, by the way, but the two sources that got bookmarked this time round were George Dvorsky and the grinding.be posse. ]

The story of ourselves

Paul Raven @ 17-11-2010

The New Scientist CultureLab blog is running an interesting set of pieces about storytelling in the (post-)modern world (for which there is, regrettably, no single unifying tag or category to which I can link you); it’s probably due to a global swelling of interest in such matters coinciding with my own self-education curve, but in the last few years it’s felt like everything has started to boil down to narratives – the stories we graft on to our experiences so that we can make sense of the world.

Of course, by the terms of the theory, that is a narrative in and of itself… but before we get caught in an infinite loop of meta, let’s skip to this article that wonders how the changing structure of the narratives we produce in our art and culture will affect the ones we produce in our heads.

Gazzaniga […] thinks that this left-hemisphere “interpreter” creates the unified feeling of an autobiographical, personal, unique self. “The interpreter sustains a running narrative of our actions, emotions, thoughts, and dreams. The interpreter is the glue that keeps our story unified, and creates our sense of being a coherent, rational agent. To our bag of individual instincts it brings theories about our lives. These narratives of our past behaviour seep into our awareness and give us an autobiography,” he writes. The language areas of the left hemisphere are well placed to carry out these tasks. They draw on information in memory (amygdalo-hippocampal circuits, dorsolateral prefrontal cortices) and planning regions (orbitofrontal cortices). As neurologist Jeffrey Saver has shown, damage to these regions disrupts narration in a variety of ways, ranging from unbounded narration, in which a person generates narratives unconstrained by reality, to denarration, the inability to generate any narratives, external or internal.


If we create our selves through narratives, whether external or internal, they are traditional ones, with protagonists and antagonists and a prescribed relationship between narrators, characters and listeners. They have linear plots with a fixed past, a present built coherently on it, and a horizon of possibilities projected coherently into the future. Digital technologies, on the other hand, are producing narratives that stray from this classic structure. New communicative interfaces allow for novel narrative interactions and constructions. Multi-user domains (MUDs), massively multiplayer online role-playing games (MMORPGs), hypertext and cybertext all loosen traditional narrative structure. Digital narratives, in their extremes, are co-creations of the authors, users and media. Multiple entry points into continuously developing narratives are available, often for multiple co-constructors.

These recent developments seem to make possible limitless narratives lacking the defining features of the traditional structures. What kinds of selves will digital narratives generate? Multi-linear? Non-fixed? Collaborative? Would such products still be the selves we’ve come to know and love?

As heady as these implications seem, we should not get carried away. From a literary perspective, digital narrative’s break with tradition will either be so radical that the products no longer count as narrative – and so no longer will be capable of generating narrative selves – or they will still incorporate basic narrative structure, perhaps attenuated, and continue to produce recognisable narrative selves.

Or, to put it another way, “we just don’t know, so we’ll have to wait and see”. But it’s fascinating stuff, if only for the tantalising offer of a place where literary theory, anthropology and hard neuroscience might one day all meet up… and that would be an awesome place to spend one’s life theorising, don’t you think? 🙂

Looking back on Cyborg Month

Paul Raven @ 01-10-2010

When Tim Maly invited me to contribute to the 50 Posts About Cyborgs project, I had a nagging suspicion that I’d have a run-in with impostor syndrome… and I was right. The nearly complete run of posts (49 of them linked from the Tumblr above as I type this) contains some of the smartest and most brain-expanding material I’ve read in a long, long time, from some incredibly erudite writers and thinkers. If you have any interest whatsoever in the post-modern human condition in a technology-saturated world, in where we came from as a species and where we’re going, or in what being (post?)human actually means, then there’ll be something there for you to enjoy – so go read.

And many thanks to Tim for inviting me to take part; I’m one proud impostor. 🙂

Unbranding and the hipster backlash

Paul Raven @ 26-08-2010

I have an awkward but passionate relationship with academic discussions of popular culture. Expansion: I’ve always found popular culture more interesting as an observer than as a participant, but I think the line between those two states is becoming thinner and fuzzier (if, indeed, it ever existed at all beyond my own desperate, continuous and largely futile attempts to see myself as separate from any form of cultural majority in my current social environment*).

You see, I had a minor revelation on the way to Tesco the other evening, in which I realised that part of the difficulty with, say, writing reviews of books or music in a networked world, is that you can’t isolate any one cultural artefact from the world in which it exists, or from its creator (not entirely), or from its consumers and detractors. To review effectively – to critique – is an act of comparative cultural anthropology, performed in a room lit only by a Maglite velcroed to one’s own forehead. Context is everything. The character and intellectual history of the critic is crucial to your understanding their understanding of the subject of their critique. The critic’s greatest insights (and, by the same token, greatest blindspots) are necessarily invisible to her. To paraphrase Douglas Adams, the critic can’t see her biases for the same reason that a tourist stood in Trafalgar Square can’t see England.

And so much for rambling pseudophilosophical cultural discourse. (Hey, it was a fun paragraph to write. I may even have meant most of it.) But back to the point: culture, fashion, trends, memes. Cyclic shifts. The mainstream’s need to reappropriate marginal culture (because, based as it is on a pattern of consumerism, it cannot create, only refine and reiterate); marginal culture’s parasitic defiance, goading and mockery/pastiche/satire of the mainstream’s current obsessions (because the urge to create is almost indistinguishable from the urge to destroy).

What am I going on about?

Like, hipsters, y’know? Right. Wired UK piece, academics and psychology types talking about the pivot point where a self-identified outsider culture reaches a critical mass and becomes a self-parody, attacks its own body-politic like cancer or some sort of immune system failure; Pop Will Eat Itself (dos dedos, mis amigos). Swarms of Johnny-come-latelys displace the boys and girls from back-in-the-day to the sound of chorused mutterings of “sell-outs and cash-ins”, “we-were-here-first”, “the-early-albums-were-waaaaay-better”. In-group identifiers become terms of disparagement outside the group; inside the group, further divisions of nomenclature attempt to reposition the speaker in relation to the recent immigrant influx invading their cultural space (“he’s no hipster, he’s a scenester; sooooo bogus”). Meanwhile, businesses spring up and rot away in parallel with the swells and breakers of cultures rising and falling, happy remoras (remorae?) on the big dumb whale-shark of Youth. (RIP, American Apparel; couldn’t happen to a more horrifying homogeniser of urban try-hards.)

Whoa, check myself – still waffling b*llocks! Cut to the chase with academic concision:

In order to distance themselves from the hipster caricature, true indie consumers** use a number of techniques.

The first is “aesthetic discrimination”, whereby you dismiss those who accuse you of being a hipster as uninformed outsiders who don’t have sophisticated enough tastes to be able to discriminate between the hipster caricature and the authentic indie consumer.

The second technique is “symbolic demarcation”. Those indie consumers who engage in aesthetic discrimination tend to have an intellectual command of indie culture and are socially recognised as people who are in the know. Because of this status, they can afford to dismiss any resemblances to the hipster icon as irrelevant.

They might also rename the hipster caricature as something else, eg “scenester”, thus placing the worst traits associated with a “hipster” into a new, distinct definition. Creating a new category helps solidify the contrast between legitimate indie consumers and those who simply want to be part of a fashionable scene.

The third technique is “proclaiming (mythologised) consumer sovereignty”. This sees the person consciously reframe their interests in the indie field to show their autonomy from the dictates of fashion.

“Our findings suggest how backlash against identity categories such as hipster or metrosexual could generate complex and nuanced identity strategies that enable consumers to retain their tastes and interests while protecting these tastes from trivializing mythologies,” the authors conclude.

(Before you feel too smug, we all do this. Granted, most of us reading this site don’t do it while wearing ironic Rayban knockoffs or penny loafers under rolled-up drainpipe jeans, but we all do it. Genre fandom especially is full of this stuff, though it moves more slowly. Hell, even the transhumanists do it, though they use even bigger words than anyone else in the process. Othering is a hard-wired human thing, goes way back to pre-speech phases of socialisation. Them-and-us; hard habit to quit.)

But so what? Well, say you’re a marketer for fashion brands (or for a new author, or an advocate for a new school of transcendent philosophy). Making your own brand/author/philosophy look good is incredibly hard to achieve reliably… even more so nowadays, with the memetic flux swirling so fast. Yesterday’s viral sensation is today’s lingering and sniffly common cold. So what to do? Instead of giving your brand to cultural icons that reflect the aspirations of your target subculture, you give your rival brands to cultural icons who embody the opposite of those aspirations [via BoingBoing]. Couture-marketing psy-ops. Sounds ridiculous, a possible indicator of the end of civilisation (wring hands, mutter about the Romans, miss point entirely). But with clarity born of hindsight, this morning’s revelation, triggered by the two articles linked above and prompting the rapid-fire unedited writing of this little screed:

William Gibson’s been writing this stuff for years.

How does he keep doing that?

Related: Slate “interviews” Kanye West by slicing up his Twitter output. The Village Voice claims this as the chiselled headstone of the music magazine: who needs the middleman to broadcast their personal brand, if all they’ll do is distort it? The Village Voice fails to recognise that pop culture consumers are like fuzz-rock guitarists: distortion always sounds better than clean signal. Boutique stomp-boxes all round!

[ * So, yes, science fiction fandom was a pretty inevitable landing-spot, I suppose. But which came first, the estrangement or my enjoyment of the literature thereof?*** Answers on the back of an Urban Outfitters till receipt… ]

[ ** Not entirely sure about these notional “true indie consumers”. Neophiliacs would probably be a fairer word, albeit an arguably less flattering one. ]

[ *** And so much for pathos. ]

Defining society: the anthropologist’s dilemma

Paul Raven @ 10-08-2010

Keith Hart thinks he’s uncovered anthropology’s biggest challenge, and the issue that’s hampering its progress as a science: defining the word ‘society’ in a way that makes sense for the times we live in.

I believe that humanity is caught precariously in transition between two notions of where society is located, the nation-state and the world. The dominance of the former in the 20th century fed the ethnographic revolution in anthropology which, rather than following the needs of colonial empire as is commonly assumed, was in fact an attempt to make the national model of society universal by finding its principles everywhere, even in so-called primitive societies. These principles included cultural homogeneity, a bounded location and an ahistorical presumption of eternity. The centrality of the state to such a concept of nation was negated by the study of stateless societies in these terms.

Clearly world society is not yet a fact in the same sense as its principal predecessor. But the need to make a world society fit for all humanity to live in is urgent for many reasons that I don’t need to spell out. Retention of ethnography (which first emerged in Central Europe to serve a nation-building project) as our main professional model has made most of us apologists for a fragmented and static vision of the human predicament, reinforcing a rejection of world history that amounts to nothing less than, “Stop the world, I want to get off”. We no longer study exotic rural places in isolation from history, but, in abandoning that exclusive preoccupation, we have failed to bring the object, theory and method of anthropology up to date.

Note the similarities to concerns about the nation-state as dominant identifier coming from all sorts of other disciplines (as frequently documented on this ‘ere blog, among other places). Nationality is increasingly coming to be seen as the hollow sham it has always been. Think about it: the problem with identifying with a nation is that you’re identifying with nothing more than a word and a piece of multicoloured cloth. The ideological continuity that nationality implies is a complete fiction: if I’m “proud to be English”, am I proud of the same things Churchill would have held dear? Is it the citizens of England I identify with, or its values and laws, or even the physical ground itself, that territory which is no longer the map? None of these things are constants; they are different now to how they were a year ago, a decade ago, a century ago. No one chose where to be born… so why this fanaticism for a fluke of geography and childhood survival statistics? You don’t see people born on a Wednesday singing anthems about the wonderfulness and well-earned superiority of Wednesdays, do you?

“England” (or “America”, or “China”, or or or… ) is a hollow word, and the vacuum at its heart is easily filled by people with agendas that have nothing to do with bringing people together. You don’t bring people together by labelling them, by gathering them beneath a banner; that’s the definition of segregation. Nationality is at best meaningless, and at worst extremely dangerous. Nationality is apartheid.  It’s an idea that makes no logical sense in a networked world, where geography increasingly constrains only your individual access to physical resources. Until we get past the idea of ‘society’ being something to which we may belong, but to which some (most!) other humans do not, solutions to all our most pressing problems as a species will continue to elude us.

My two cents, there. 🙂
