Tag Archives: media

Attention economics: sub-prime celebrities

There’s sometimes deep truth in flippant analogies. Well, there is in my world, anyway… and here’s an example, as The Guardian‘s Aditya Chakrabortty compares celebrity to shonky mortgages: if you sell too many of the latter masquerading as the real thing, the whole system ends up collapsing in the wake of the (admittedly huge) short-term gains you make from it.

As for the assertion that fame is sought only by a desperate few wannabes, think again. Extrapolating from surveys, the developmental psychologist Orville Gilbert Brim estimates that 4 million American adults (out of a total of 200 million) describe fame as their most important life goal. The proportions are only slightly lower in Germany and urban China.

[…]

If you define fame as being known by strangers, then newspapers, cinema and especially TV have always driven the spread of celebrity. Yet, until very recently, that attention has customarily been at a gradient: the public used to look up to their stars; now they are minded to look down.

[…]

Think back to Wall Street’s sub-prime crisis. That was a story of lenders so desperate for market share and quick profit that they were chucking big sums at people who didn’t warrant it. The tale is very similar in the celebrity-media industry.

Your TV used to be the equivalent of a ratings agency, exposing you only to AAA-rated talent. Now, however, it asks you to keep up with the Kardashians; watch a Hilton or an Osbourne muddle through the real world, and, yes, be a guest at Katie Price’s latest wedding. The fundamentals of all these celebs are, frankly, ropey, and yet viewers are invited to invest time and emotional equity in them.

Resonances there with our ongoing discussion about gatekeepers and experts in the world of publishing; gatekeeper failure really can collapse a thriving market.

More pertinently, I think I’ve always viewed social currencies like fame (or its more localised little brother, popularity) in economic terms, even long before I knew what economics actually was*. Chakrabortty’s model would need to factor in some of fame’s more curious properties, though: the way it can, in certain circumstances, be gifted to another without any loss of personal worth, for instance, or the way one can collapse one’s own federal reserve completely without any help or interference from others, or any intended expense on one’s part.

Shorter version: anyone who wants to code a detailed version of Whuffie has a whole lot of work ahead of them. But the human brain, jacked into the cyborg extension of ourselves we call the media, can run those insanely complex calculations without knowing consciously how they work… score one up for the meat. 😉
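(To make that semi-concrete, here’s a toy sketch in Python of the sort of model you’d be starting from – with the caveat that every class and method name below is my own invention, not a claim about how Doctorow’s Whuffie actually works. The trick is to treat reputation not as a balance you hold, like money, but as something recomputed on demand from everyone else’s current regard for you; both of fame’s curious properties above then fall out naturally.)

```python
from collections import defaultdict

class WhuffieLedger:
    """Toy Whuffie-ish reputation model (names and mechanics are mine,
    not Doctorow's). Reputation isn't a balance you hold: it's
    recomputed on demand from everyone else's current regard for you."""

    def __init__(self):
        # regard[admirer][target] = how much esteem admirer holds for target
        self.regard = defaultdict(dict)

    def esteem(self, admirer, target, amount):
        """Record (or revise) one person's regard for another.
        Gifting esteem costs the admirer nothing: fame, unlike a
        dollar handed over, is non-rivalrous."""
        self.regard[admirer][target] = amount

    def score(self, person):
        """A person's standing: the sum of everyone's current regard."""
        return sum(held.get(person, 0.0) for held in self.regard.values())

    def scandal(self, person):
        """A self-inflicted collapse: every admirer independently
        re-evaluates, and the 'reserve' simply ceases to exist --
        nothing is transferred to anyone else."""
        for held in self.regard.values():
            held.pop(person, None)

ledger = WhuffieLedger()
ledger.esteem("alice", "bob", 10.0)
ledger.esteem("carol", "bob", 5.0)
print(ledger.score("bob"))  # 15.0
ledger.scandal("bob")
print(ledger.score("bob"))  # 0.0, and nobody else gained a thing
```

The design choice doing all the work there is that esteem lives in the admirers rather than the admired – which is exactly why it can be gifted freely and lost unilaterally, and also exactly why a full-scale version would be such a nightmare to build.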

[ * This is a not-too-subtly coded way of saying that I wasn’t hugely popular at school, and spent a lot of time trying to rationalise why that was. I’d have doubtless been better served by not thinking about it, hence appearing to have been less of a massive nerd, and hence becoming more popular. Ah, hindsight… 🙂 ]

Microsoft Kinect: The Call of the Womb

Blasphemous Geometries by Jonathan McCalmont


I have never been to the festival of hubris and chest-thumping that is the American video games industry’s yearly trade-fair E3 (a.k.a. ‘E Cubed’, a.k.a. ‘Electronic Entertainment Expo’), but the mere thought of it makes me feel somewhat ill. A friend of mine once attended a video game trade fair in Japan. He returned not with talk of games, but of the dozens of overweight middle-aged men who practically came to blows as they jostled for the best angle from which to take up-skirt photographs of the models manning the various booths.

As disturbing and sleazy as this might well sound, it still manages to cast Japanese trade shows in a considerably better light than a lot of the coverage that came out of E3. Every so often, an event or an article will prompt the collection of sick-souled outcasts known as ‘video game journalists’ into a fit of ethical navel-gazing: are their reviews too soft? are their editorial processes too open to commercial pressures? do they allow their fannishness to override their professional integrity? Oddly enough, these periodic bouts of hand-wringing never coincide with E3.

E3 is a principles-free zone as far as video game reporting is concerned: journalists travel from all over the world to sit in huge conference halls where they are patronised to within an inch of their wretched lives by people from the PR departments of Nintendo, Microsoft and Sony. At a time when cynicism and critical thinking might allow a decent writer to cut through the bullshit and provide some insights into the direction the industry is taking, most games writers choose instead to recycle press releases and gush about games that are usually indistinguishable from the disappointing batch of warmed-over ideas dished out the previous year. At least the creepy Japanese guys had an excuse for wandering around a trade fair doused in sweat and sporting huge hard-ons.

Microsoft Kinect with Xbox 360


Watch this movie: We Live In Public

Promo poster for We Live In Public

A couple of nights ago, I sat down and watched We Live In Public, Ondi Timoner’s award-winning documentary about Josh Harris, Pseudo.com, the Quiet experiment, and the eponymous project that involved Harris streaming every mundane moment of his life onto the web for anyone to watch. I was particularly amazed that Quiet – a darkly and deliberately Orwellian behavioural experiment involving real people, one that not only prefigures but utterly eclipses much of the more recent reality television – isn’t better known and more widely discussed. (I believe it was a big influence on Douglas Rushkoff, who appears as an interviewee in the film, and who was certainly part of the New York dot-com boom scene that floated Harris to prominence – the same scene, I presume, that influenced and informed Rushkoff’s flawed but fascinating novel The Ecstasy Club.)

The same applies to Harris, who comes across as a fascinating and damaged genius and visionary who foresaw – and concretised – many of the privacy and publicy issues that are hot button topics on today’s intertubes. I’m not sure I believe that Harris’ vision of a totally mediated world is inevitable, or even possible, but the extremity of the example he created is a valuable lesson and cautionary tale… as is his life as a whole.

The caveat here is that Timoner’s previous big success (and Sundance Festival winner) is the controversial rockumentary Dig!, which has been accused by Anton Newcombe of The Brian Jonestown Massacre of portraying him and his band in a selectively negative light as compared to the film’s other main subjects, The Dandy Warhols. Much as I’m a fan of Newcombe and his work, however, it’s pretty clear that he’s a damaged genius (like Harris, though in a very different manner), and whether or not Timoner’s editing really was deliberately skewed to cast Newcombe as the bad penny will remain a mystery to anyone who wasn’t involved in the project. Sensation sells, after all… and the footage of Quiet in We Live In Public makes much of its more shocking aspects; I guess what I’m saying is that the same pinch of salt you’d apply to any other modern media is surely worth using here.

But that pinch of salt does nothing to negate a powerful story, and one that I think any internet habitué should watch. Residents of the United Kingdom have another 22 days (as of publication of this post) to watch it for free on Channel 4’s 4oD service, and I urge you to take advantage of it while you can. Everyone else – keep your eyes peeled for an opportunity of your own. This is a hugely important document in the history of mediated network culture.

Did the Iranian “Twitter Revolution” actually happen?

You know, I’m always advising people not to believe everything they read, but I’m just as bad at taking that advice as anyone else – we all give credence to the stories we want to believe, I guess (and hell knows that media companies know how to exploit that).

So, remember the Twitter Revolution in Iran? That there was a revolution is not in question – but was it really powered by social media? That’s not so clear [via MetaFilter]:

… it is time to get Twitter’s role in the events in Iran right. Simply put: There was no Twitter Revolution inside Iran. As Mehdi Yahyanejad, the manager of “Balatarin,” one of the Internet’s most popular Farsi-language websites, told the Washington Post last June, Twitter’s impact inside Iran is nil. “Here [in the United States], there is lots of buzz,” he said. “But once you look, you see most of it are Americans tweeting among themselves.”

A number of opposition activists have told me they used text messages, email, and blog posts to publicize protest actions. However, good old-fashioned word of mouth was by far the most influential medium used to shape the postelection opposition activity. There is still a lively discussion happening on Facebook about how the activists spread information, but Twitter was definitely not a major communications tool for activists on the ground in Iran.

[…]

To be clear: It’s not that Twitter publicists of the Iranian protests haven’t played a role in the events of the past year. They have. It’s just not been the outsized role it’s often been made out to be. And ultimately, that’s been a terrible injustice to the Iranians who have made real, not remote or virtual, sacrifices in pursuit of justice.

I’m starting to wonder if a faith in the hierarchy-corrosion of modern communications systems isn’t becoming a core plank of what, for want of a less contentious or partisan label, we might call the postmodern progressive liberal platform. Maybe because we feel ourselves to have been liberated from something by the internet (even though we’re not sure what it is that we’ve been liberated from), we think that it can deliver liberation to others from things that are far more oppressive and powerful (at least at the level of curtailment of individual freedoms) than we have the context and experience to understand? That political revolution can be as safe, easy (and fun!) as our spare time whiled away on social media? (See also: the illusion of participation produced by slacktivism.)

Or maybe it’s just old-fashioned and fallacious Golden Age pulp technophilia: “Twitter is the future! The future is something we progress toward! Democracy in Iran would be progress! Therefore Twitter will help create progress toward democracy in Iran!”

I’m having a weird week; I’ve been spending a lot of time thinking about how we make pretty much everything into a story that reflects what we already believe to be true. The trouble with dwelling on that for a while is that you reach a point where you realise that, if that assumption is true, then that assumption is also part of a narrative that’s reinforcing itself through you. Which is a pretty weird psychological and philosophical paradox… not to mention being remarkably unconducive to getting anything practical done.

The multiphrenic world: Stowe Boyd strikes back on “supertasking”

… which is really a neologism for its own sake (a favourite gambit of Boyd’s, as far as I can tell). But let’s not let that distract us from his radical (and lengthy) counterblast to a New York Times piece about “gadget addiction”, which chimes with Nick Carr’s Eeyore-ish handwringing over attention spans, as mentioned t’other day:

The fear mongers will tell us that the web, our wired devices, and remaining connected are bad for us. It will break down the nuclear family, lead us away from the church, and channel our motivations in strange and unsavory ways. They will say it’s like drugs, gambling, and overeating, that it’s destructive and immoral.

But the reality is that we are undergoing a huge societal change, one that is as fundamental as the printing press or harnessing fire. Yes, human cognition will change, just as becoming literate changed us. Yes, our sense of self and our relationships to others will change, just as it did in the Renaissance. Because we are moving into a multiphrenic world — where the self is becoming a network ‘of multiple socially constructed roles shaping and adapting to diverse contexts’ — it is no surprise that we are adapting by becoming multitaskers.

The presence of supertaskers does not mean that some are inherently capable of multitasking and others are not. Like all human cognition, this is going to be a bell-curve of capability.

As always, Boyd is bullish about the upsides; personally, I think there’s a balance to be found between the two viewpoints here, but – doubtless due to my own citizenship of Multiphrenia – I’m bucking the neophobics and leaning a long way toward the positives. And that’s speaking as someone who’s well aware that he’s not a great multitasker…

But while we’re talking about the adaptivity of the human mind, MindHacks would like to point out the hollowness of one of the more popular buzzwords on the subject, namely neuroplasticity [via Technoccult, who point out that Nick Carr uses the term a fair bit]:

It’s currently popular to solemnly declare that a particular experience must be taken seriously because it ‘rewires the brain’ despite the fact that everything we experience ‘rewires the brain’.

It’s like a reporter from a crime scene saying there was ‘movement’ during the incident. We have learnt nothing we didn’t already know.

Neuroplasticity is common in popular culture at this point in time because mentioning the brain makes a claim about human nature seem more scientific, even if it is irrelevant (a tendency called ‘neuroessentialism’).

Clearly this is rubbish and every time you hear anyone, scientist or journalist, refer to neuroplasticity, ask yourself what specifically they are talking about. If they don’t specify or can’t tell you, they are blowing hot air. In fact, if we banned the word, we would be no worse off.

That’s followed by a list of the phenomena that neuroplasticity might properly be referring to, most of which are changes in the physical structure of the brain rather than cognitive changes in the mind itself. Worth taking a look at.