Tag Archives: attention

Attention economics: sub-prime celebrities

There’s sometimes deep truth in flippant analogies. Well, there is in my world, anyway… and here’s an example, as The Guardian’s Aditya Chakrabortty compares celebrity to shonky mortgages: if you sell too many of the latter masquerading as the real thing, the whole system ends up collapsing in the wake of the (admittedly huge) short-term gains you make from it.

As for the assertion that fame is sought only by a desperate few wannabes, think again. Extrapolating from surveys, the developmental psychologist Orville Gilbert Brim estimates that 4 million American adults (out of a total of 200 million) describe fame as their most important life goal. The proportions are only slightly lower in Germany and urban China.

[…]

If you define fame as being known by strangers, then newspapers, cinema and especially TV have always driven the spread of celebrity. Yet, until very recently, that attention has customarily been at a gradient: the public used to look up to their stars; now they are minded to look down.

[…]

Think back to Wall Street’s sub-prime crisis. That was a story of lenders so desperate for market share and quick profit that they were chucking big sums at people who didn’t warrant it. The tale is very similar in the celebrity-media industry.

Your TV used to be the equivalent of a rating-agency, exposing you only to AAA-rated talent. Now however, it asks you to keep up with the Kardashians; watch a Hilton or an Osbourne muddle through the real world, and, yes, be a guest at Katie Price’s latest wedding. The fundamentals of all these celebs are, frankly, ropey, and yet viewers are invited to invest time and emotional equity in them.

Resonances there with our ongoing discussion about gatekeepers and experts in the world of publishing; gatekeeper failure really can collapse a thriving market.

More pertinently, I think I’ve always viewed social currencies like fame (or its more localised little brother, popularity) in economic terms, even long before I knew what economics actually was*. Chakrabortty’s model would need to factor in some of fame’s more curious properties, though: the way it can in some circumstances be gifted to another without any loss of personal worth, for instance, or the way one can collapse one’s own federal reserve completely without any help or interference from others, or any intended expense on one’s own part.

Shorter version: anyone who wants to code a detailed version of Whuffie has a whole lot of work ahead of them. But the human brain, jacked into the cyborg extension of ourselves we call the media, can run those insanely complex calculations without consciously knowing how they work… score one up for the meat. 😉
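
For the terminally curious, here’s the sort of thing I mean – a minimal Python sketch of a Whuffie-ish reputation ledger. To be clear, this is my own toy illustration rather than anything from Doctorow, and the class and method names are invented; it just makes concrete two of the curious properties mentioned above: esteem can be gifted without costing the giver anything, and a holder can torch their own balance entirely unaided.

```python
# A toy, purely illustrative model of a Whuffie-like reputation currency.
# It is NOT Doctorow's Whuffie, just a sketch of two of fame's odd properties:
#   1. "Gifting" esteem to someone costs the giver nothing.
#   2. A holder can wipe out their own balance with no outside help.

from collections import defaultdict


class ReputationLedger:
    def __init__(self):
        # esteem[holder][admirer] = how much that admirer rates the holder
        self.esteem = defaultdict(dict)

    def admire(self, admirer, holder, amount):
        """Unlike money, this credits the holder without debiting the admirer."""
        self.esteem[holder][admirer] = self.esteem[holder].get(admirer, 0) + amount

    def self_destruct(self, holder, scale=0.0):
        """Model a self-inflicted scandal: every admirer's esteem collapses at once."""
        for admirer in self.esteem[holder]:
            self.esteem[holder][admirer] *= scale

    def worth(self, holder):
        return sum(self.esteem[holder].values())


if __name__ == "__main__":
    ledger = ReputationLedger()
    ledger.admire("alice", "bob", 10)   # costs alice nothing
    ledger.admire("carol", "bob", 5)
    print(ledger.worth("bob"))          # 15
    ledger.self_destruct("bob")         # bob torpedoes his own reputation
    print(ledger.worth("bob"))          # 0.0
```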

[ * This is a not-too-subtly coded way of saying that I wasn’t hugely popular at school, and spent a lot of time trying to rationalise why that was. I’d have doubtless been better served by not thinking about it, hence appearing to have been less of a massive nerd, and hence becoming more popular. Ah, hindsight… 🙂 ]

Maybe it doesn’t matter that the internet is “making us stupid”

High-profile internet naysayer and technology curmudgeon Nick Carr is cropping up all over the place; these things happen when one has a new book in the offing, y’know*. He’s the guy who claims that Google is making us stupid, that links embedded in HTML sap our ability to read and understand written content (cognitive penalties – a penalty that even the British can do properly, AMIRITE?), and much much more.

The conclusions of Carr’s new book, The Shallows – that, in essence, we’re acquiring a sort of attention deficit problem from being constantly immersed in a sea of bite-sized and interconnected info – have been given a few polite kickings, such as this one from Jonah Lehrer at the New York Times. I’ve not read The Shallows yet, though I plan to; nonetheless, from the quotes and reviews I’ve seen so far, it sounds to me like Carr is mapping the age-related degradation of his own mental faculties onto the world as a whole, and looking for something to blame.

I should add at this point that, although I disagree with a great number of Carr’s ideas, he’s a lucid thinker, and well worth reading. As Bruce Sterling points out, grumpy gadfly pundits like Carr are useful and necessary for a healthy scene, because the urge to prove them wrong drives further innovation, thinking, research and development. He’s at least as important and worth reading as the big-name webvangelists… who all naturally zapped back at Carr’s delinkification post with righteous wrath and snark. The joy of being a mere mortal is, surely, to watch from a safe point of vantage while the gods do battle… 😉

But back to the original point: there’s always a trade-off when we humans acquire new technologies or skills, and what’s missing from commentators decrying these apparent losses is any suggestion that we might be gaining something else – maybe something better – as part of the deal; technological symbiosis is not a zero-sum game, in other words. Peripherally illustrating the point, George Dvorsky points to some research that suggests that too good a memory is actually an evolutionary dead end, at least for foraging mammals:

These guys have created one of the first computer models to take into account a creature’s ability to remember the locations of past foraging successes and revisit them.

Their model shows that in a changing environment, revisiting old haunts on a regular basis is not the best strategy for a forager.

It turns out instead that a better strategy is to inject an element of randomness into a regular foraging pattern. This improves foraging efficiency by a factor of up to 7, say Boyer and Walsh.

Clearly, creatures of habit are not as successful as their opportunistic cousins.

That makes sense. If you rely on the same set of fruit trees for sustenance, then you are in trouble if these trees die or are stripped by rivals. So the constant search for new sources of food pays off, even if it consumes large amounts of resources. “The model forager typically spends half of its traveling time revisiting previous places in an orderly way, an activity which is reminiscent of the travel routes used by real animals,” say Boyer and Walsh.

They conclude that memory is useful because it allows foragers to find food without the effort of searching. “But excessive memory use prevents the forager from updating its knowledge in rapidly changing environments,” they say.
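
To make the trade-off concrete, here’s a crude toy simulation in the same spirit – emphatically not Boyer and Walsh’s actual model, and every parameter in it is invented for illustration. Food patches are stripped when visited and occasionally change; a forager that never revises its remembered “best patch” keeps returning to old haunts, while one that mixes in some random wandering does far better in a shifting environment:

```python
# A crude toy simulation (not Boyer and Walsh's model) of memory versus
# randomness in foraging. Patches deplete when visited and occasionally
# change; the forager remembers the best yield it ever saw at each patch
# and, by habit, revisits whichever patch that memory rates highest.

import random


def forage(explore_prob, n_patches=50, steps=2000, seed=1):
    rng = random.Random(seed)
    food = [rng.uniform(0, 10) for _ in range(n_patches)]  # current food per patch
    memory = {}   # patch -> best yield ever seen there (never revised downward)
    total = 0.0

    for _ in range(steps):
        # The environment changes: now and then a patch is stripped or regrows.
        if rng.random() < 0.2:
            food[rng.randrange(n_patches)] = rng.uniform(0, 10)

        if memory and rng.random() > explore_prob:
            # Habit: revisit the old haunt remembered as most rewarding.
            patch = max(memory, key=memory.get)
        else:
            # Randomness: wander to a patch picked at random.
            patch = rng.randrange(n_patches)

        gathered = food[patch]
        total += gathered
        food[patch] = 0.0  # the visit strips the patch
        # "Excessive memory use": knowledge is never updated downwards.
        memory[patch] = max(memory.get(patch, 0.0), gathered)

    return total


if __name__ == "__main__":
    print("pure habit:    ", round(forage(explore_prob=0.0)))
    print("some wandering:", round(forage(explore_prob=0.3)))
```

Run it and the creature of habit limps along on whatever happens to regrow at its one favourite spot, while the occasional wanderer hauls in an order of magnitude more – a cartoon version of the point, but the shape of the result is the same.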

This reminds me of the central idea behind Peter Watts’ Blindsight – the implication that intelligence itself, which we tend to think of as the inevitable high pinnacle of evolutionary success, is actually a hideously inefficient means to genetic survival, and that as such, we’re something of an evolutionary dead end ourselves. Which in turn reminds me of my mention of evolutionary “arms races” the other day; perhaps, instead of being in an arms race against our own cultural and technological output as a species, we’re entering a sort of counterbalancing symbiosis with it. Should we start considering technology as a part of ourselves rather than a separate thing? Are we not merely a species of cyborgs, but a cyborg species?

[ * The irony here being that almost all the discussion and promotion of Carr’s work that does him any good occurs… guess where? Hint: not in brick’n’mortar bookstores. ]

The attention economy: curation by duration

This Short Manifesto on the Future of Attention by Michael Erard pushed a lot of my buttons, and I reckon it’ll be of some considerable interest to other art creators and consumers (writers and readers, for example, which is most of you lot):

I imagine attention festivals: week-long multimedia, cross-industry carnivals of readings, installations, and performances, where you go from a tent with 30-second films, guitar solos, 10-minute video games, and haiku to the tent with only Andy Warhol movies, to a myriad of venues with other media forms and activities requiring other attention lengths. In the Nano Tent, you can hear ringtones and read tweets. A festival organized not by the forms of the commodities themselves but of the experience of interacting with them. Not organized by time elapsed, but by cognitive investment: a pop song, which goes by quickly, can resonate for days; a poem, which can go by more quickly, sticks through a season. A festival in which you can see images of your brain on knitting and on Twitter.

I imagine a retail sector for cultural products that’s organized around the attention span: not around “books” or “music” but around short stories and pop songs in one aisle, poems and arias in the other. In the long store: 5,000 piece jigsaw puzzles, big novels, beer brewing equipment, DVDs of The Wire. Clerks could suggest and build attentional menus. We would develop attentional connoisseurship: the right pairings of the short and long.

Has a hint of the science fictional about it, but doesn’t seem implausible by any means given the way the web is mutating creation and commerce. But this bit deserves special attention:

I imagine an attention tax that aspiring cultural producers must pay. A barrier to entry. If you want people to read your book, then you have to read books; if you want people to buy your book, then you buy books. Give your attention to the industry of your choice. Like indie musicians have done for decades, conceive of the scene as an attention economy, in which those who pay in (e.g., I go to your shows) get to take out (e.g., come to my show). It would also mitigate one oft-claimed peril of the rise of the amateur, which is that they don’t know from quality: consuming many other examples from a variety of sources, even amateur producers would generate a sense of what’s good and what’s bad: in other words, in their community they’d evolve a set of standards. This might frustrate the elitists, who want to impose their standards. But standards would, given enough time, emerge.
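
As a thought experiment (mine, not Erard’s – the class names and numbers below are entirely made up), the reciprocity rule is simple enough to state in code: you can only draw attention out of the scene up to what you’ve already paid in.

```python
# A minimal, hypothetical sketch of Erard's "pay in to take out" attention
# economy -- nothing he specifies, just the reciprocity rule made literal:
# you may only draw attention out of the scene up to what you've paid in.

from collections import defaultdict


class AttentionScene:
    def __init__(self):
        self.credit = defaultdict(float)   # hours of attention each member has paid in

    def pay_in(self, member, hours):
        """Reading others' books, going to their shows, etc."""
        self.credit[member] += hours

    def request_attention(self, member, hours):
        """Asking the scene to attend to your own work."""
        if self.credit[member] < hours:
            return False                   # barrier to entry: pay in first
        self.credit[member] -= hours
        return True


if __name__ == "__main__":
    scene = AttentionScene()
    scene.pay_in("new_writer", 3.0)                  # reads three hours of others' work
    print(scene.request_attention("new_writer", 5))  # False: hasn't paid in enough
    print(scene.request_attention("new_writer", 2))  # True
```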

This sounds very much like the online short fiction scene to me, albeit a more highly evolved version thereof, and the parallel with the indie music scenes, especially at a local level, is palpable. I’d be tempted to make “economy” and “ecosystem” interchangeable, though. What do you think – will curation of niche artforms become a form of crowdsourced consensus of attention?

(This is yet another link from Joanne McNeil of Tomorrow Museum, who I’ll stop linking to just as soon as she stops posting really interesting stuff… which hopefully won’t be any time soon.)

Furniture with character

How can we fix our relentless habit of buying new stuff to replace perfectly functional older stuff? James Pierce of Indiana University has an idea: design household objects to interact with us periodically and engage our attention beyond their established roles:

For instance, he has designed a table with an embedded digital counter that displays the number of heavy objects that have been placed on it during its lifetime. The counter becomes blurry or erratic if someone drops a heavy object on the table, only later returning to the correct count.

Another approach is cheeky misbehaviour, such as a lamp that dims if you leave it on for too long; shaking the lamp “wakes” it again. Or a clock that occasionally shows the wrong time, only to correct itself and display a message that it was just joking.
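
Purely as illustration – this is not Pierce’s actual design, just the behaviour described above sketched in Python with invented names and timings – the cheeky lamp amounts to a very small state machine:

```python
# A hypothetical sketch (not Pierce's actual design) of the "cheeky lamp"
# described above: left on too long, it sulks and dims; a shake wakes it.

import time


class CheekyLamp:
    def __init__(self, patience_seconds=2 * 60 * 60):
        self.patience = patience_seconds   # how long before the lamp gets bored
        self.switched_on_at = None
        self.dimmed = False

    def switch_on(self):
        self.switched_on_at = time.monotonic()
        self.dimmed = False

    def switch_off(self):
        self.switched_on_at = None
        self.dimmed = False

    def shake(self):
        """A shake 'wakes' the lamp: full brightness, and its patience resets."""
        if self.switched_on_at is not None:
            self.switched_on_at = time.monotonic()
            self.dimmed = False

    def tick(self):
        """Called periodically by the lamp's controller; returns brightness."""
        if self.switched_on_at is None:
            return 0.0                      # off
        if time.monotonic() - self.switched_on_at > self.patience:
            self.dimmed = True              # been on too long: sulk
        return 0.3 if self.dimmed else 1.0
```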

There’s more than a hint of Philip K Dick in that idea… and as much as I can see where Pierce is trying to go, it’s surely a bit too playful and arty to actually swing in the real world. I don’t know about you, but if I had a clock that periodically lied about what time it was, I’d replace it sooner rather than later!

What the US election tells us about how marketing is changing

Given that the bulk of Futurismic’s readers are US-based, I doubt we’re going to get much sense out of you for a few days while the smoke clears… and if you’ve come here looking for a respite from the election topicality, please accept my apologies and this uber-cute box of puppies. [No, for real, live webcam feed! Via MeFi, of course.]

But for the rest of us (and those of you tuning in regardless) here’s a topical (and pretty much non-partisan) observation from the master marketer Seth Godin, who notes that this year’s presidential election has turned a lot of old marketing truisms on their head. F’rexample:

TV is over. If people are interested, they’ll watch. On their time (or their boss’s time). They’ll watch online, and spread the idea. You can’t email a TV commercial to a friend, but you can definitely spread a YouTube video. The cycle of ads got shorter and shorter, and the most important ads were made for the web, not for TV. Your challenge isn’t to scrape up enough money to buy TV time. Your challenge is to make video interesting enough that we’ll choose to watch it and choose to share it.

Zing! Of course, most of us know that already, but hey – it’s nice to feel ahead of the curve sometimes, ain’t it? 🙂 [image by Scott89]