
Unbranding and the hipster backlash

I have an awkward but passionate relationship with academic discussions of popular culture. I’ve always found popular culture more interesting as an observer than as a participant, but I think the line between those two states is becoming thinner and fuzzier (if, indeed, it ever existed at all beyond my own desperate, continuous and largely futile attempts to see myself as separate from any form of cultural majority in my current social environment*).

You see, I had a minor revelation on the way to Tesco the other evening, in which I realised that part of the difficulty with, say, writing reviews of books or music in a networked world, is that you can’t isolate any one cultural artefact from the world in which it exists, or from its creator (not entirely), or from its consumers and detractors. To review effectively – to critique – is an act of comparative cultural anthropology, performed in a room lit only by a Maglite velcroed to one’s own forehead. Context is everything. The character and intellectual history of the critic is crucial to your understanding of their understanding of the subject of their critique. The critic’s greatest insights (and, by the same token, greatest blindspots) are necessarily invisible to her. To paraphrase Douglas Adams, the critic can’t see her biases for the same reason that a tourist stood in Trafalgar Square can’t see England.

And so much for rambling pseudophilosophical cultural discourse. (Hey, it was a fun paragraph to write. I may even have meant most of it.) But back to the point: culture, fashion, trends, memes. Cyclic shifts. The mainstream’s need to reappropriate marginal culture (because, based as it is on a pattern of consumerism, it cannot create, only refine and reiterate); marginal culture’s parasitic defiance, goading and mockery/pastiche/satire of the mainstream’s current obsessions (because the urge to create is almost indistinguishable from the urge to destroy).

What am I going on about?

Like, hipsters, y’know? Right. Wired UK piece, academics and psychology types talking about the pivot point where a self-identified outsider culture reaches a critical mass and becomes a self-parody, attacks its own body-politic like cancer or some sort of immune system failure; Pop Will Eat Itself (dos dedos, mis amigos). Swarms of Johnny-come-latelys displace the boys and girls from back-in-the-day to the sound of chorused mutterings of “sell-outs and cash-ins”, “we-were-here-first”, “the-early-albums-were-waaaaay-better”. In-group identifiers become terms of disparagement outside the group; inside the group, further divisions of nomenclature attempt to reposition the speaker in relation to the recent immigrant influx invading their cultural space (“he’s no hipster, he’s a scenester; sooooo bogus”). Meanwhile, businesses spring up and rot away in parallel with the swells and breakers of cultures rising and falling, happy remoras (remorae?) on the big dumb whale-shark of Youth. (RIP, American Apparel; couldn’t happen to a more horrifying homogeniser of urban try-hards.)

Whoa, check myself – still waffling b*llocks! Cut to the chase with academic concision:

In order to distance themselves from the hipster caricature, true indie consumers** use a number of techniques.

The first is “aesthetic discrimination”, whereby you dismiss those who accuse you of being a hipster as uninformed outsiders who don’t have sophisticated enough tastes to be able to discriminate between the hipster caricature and the authentic indie consumer.

The second technique is “symbolic demarcation”. Those indie consumers who engage in aesthetic discrimination tend to have an intellectual command of indie culture and are socially recognised as people who are in the know. Because of this status, they can afford to dismiss any resemblances to the hipster icon as irrelevant.

They might also rename the hipster caricature as something else, eg “scenester”, thus placing the worst traits associated with a “hipster” into a new, distinct definition. Creating a new category helps solidify the contrast between legitimate indie consumers and those who simply want to be part of a fashionable scene.

The third technique is “proclaiming (mythologised) consumer sovereignty”. This sees the person consciously reframe their interests in the indie field to show their autonomy from the dictates of fashion.

“Our findings suggest how backlash against identity categories such as hipster or metrosexual could generate complex and nuanced identity strategies that enable consumers to retain their tastes and interests while protecting these tastes from trivializing mythologies,” the authors conclude.

(Before you feel too smug, we all do this. Granted, most of us reading this site don’t do it while wearing ironic Rayban knockoffs or penny loafers under rolled-up drainpipe jeans, but we all do it. Genre fandom especially is full of this stuff, though it moves more slowly. Hell, even the transhumanists do it, though they use even bigger words than anyone else in the process. Othering is a hard-wired human thing, goes way back to pre-speech phases of socialisation. Them-and-us; hard habit to quit.)

But so what? Well, say you’re a marketer for fashion brands (or for a new author, or an advocate for a new school of transcendent philosophy). Making your own brand/author/philosophy look good is incredibly hard to achieve reliably… even more so nowadays, with the memetic flux swirling so fast. Yesterday’s viral sensation is today’s lingering and sniffly common cold. So what to do? Instead of giving your brand to cultural icons that reflect the aspirations of your target subculture, you give your rivals’ brands to cultural icons who embody the opposite of those aspirations [via BoingBoing]. Couture-marketing psy-ops. Sounds ridiculous, a possible indicator of the end of civilisation (wring hands, mutter about the Romans, miss point entirely). But it brings us, with the clarity born of hindsight, to this morning’s revelation, triggered by the two articles linked above and prompting the rapid-fire unedited writing of this little screed:

William Gibson’s been writing this stuff for years.

How does he keep doing that?

Related: Slate “interviews” Kanye West by slicing up his Twitter output. The Village Voice claims this as the chiselled headstone of the music magazine: who needs the middleman to broadcast their personal brand, if all they’ll do is distort it? The Village Voice fails to recognise that pop culture consumers are like fuzz-rock guitarists: distortion always sounds better than clean signal. Boutique stomp-boxes all round!

[ * So, yes, science fiction fandom was a pretty inevitable landing-spot, I suppose. But which came first, the estrangement or my enjoyment of the literature thereof?*** Answers on the back of an Urban Outfitters till receipt… ]

[ ** Not entirely sure about these notional “true indie consumers”. Neophiliacs would probably be a fairer word, albeit an arguably less flattering one. ]

[ *** And so much for pathos. ]

Tearing down the walls between “boy” and “girl”

Well, this is heartening: an opinion piece in New Scientist arguing in favour of dismantling the gender divide.

Yes, boys and girls, men and women, are different. But most of those differences are far smaller than the Men are from Mars, Women are from Venus stereotypes suggest. Nor are the reasoning, speaking, computing, empathising, navigating and other cognitive differences fixed in the genetic architecture of our brains. All such skills are learned, and neuro-plasticity – the modification of neurons and their connections in response to experience – trumps hard-wiring every time. If men and women tend towards different strengths and interests, it is due to a complex developmental dance between nature and nurture that leaves ample room to promote non-traditional skills in both sexes.

The obvious place to start looking for behavioural differences between the sexes is infancy. Yet even here they are often in the eye of the beholder. In a classic experiment, researchers cross-dress babies to fool people that they are interacting with a child of the opposite sex. Volunteers tend to comment more on the physical strength and negative emotions of babies they believe to be boys, and on the beauty and positive emotions of babies they believe to be girls.

[…]

So should we abandon our search for the “real” differences between the sexes? Yes. There is almost nothing we do with our brains that is hard-wired: every skill, attribute, and personality trait is moulded by experience. At no time are children’s brains more malleable than in early life – the time when parents are so eager to learn the baby’s sex, project it to others and unconsciously express stereotyped impressions of their child.

It’s a timely topic, brought into the public eye by celebrity gossip (what else?): Angelina Jolie’s decision to let her four-year-old daughter dress as she pleases – short haircut, traditionally “male” clothes – is a pretty good barometer for comparing the opinions of different demographics. For example, compare the Feministing headline for this story (“Angelina Jolie responds to gender policing of Shiloh”) with that from FOX Nation (“Angelina Jolie Lets Daughter Gender-Bend?”).

Sadly, essentialist views of gender differences are deeply entrenched in the conservative and fundamentalist worldviews, both of which tend to place adherence to tradition above and beyond the well-being and freedom of the individual; regular readers of this site probably don’t need reminding that I tend to see things quite the other way round. Nonetheless, it’s great to see this topic becoming a matter for public discussion; sure, it’ll stir up a whole lot of dumb uninformed invective (from extremist positions on both sides of the debate, sadly), but cultural change comes with friction as standard.

And who knows – maybe we’ll end up with a society that finds the notion of applying experimental hormone treatments to your unborn child in the hope of nipping any potential gender ambiguity in the bud to be a repugnant act of cultural eugenics. Fingers crossed, eh?

Attention economics: sub-prime celebrities

There’s sometimes deep truth in flippant analogies. Well, there is in my world, anyway… and here’s an example, as The Guardian‘s Aditya Chakrabortty compares celebrity to shonky mortgages: if you sell too many of the latter masquerading as the real thing, the whole system ends up collapsing in the wake of the (admittedly huge) short-term gains you make from it.

As for the assertion that fame is sought only by a desperate few wannabes, think again. Extrapolating from surveys, the developmental psychologist Orville Gilbert Brim estimates that 4 million American adults (out of a total of 200 million) describe fame as their most important life goal. The proportions are only slightly lower in Germany and urban China.

[…]

If you define fame as being known by strangers, then newspapers, cinema and especially TV have always driven the spread of celebrity. Yet, until very recently, that attention has customarily been at a gradient: the public used to look up to their stars; now they are minded to look down.

[…]

Think back to Wall Street’s sub-prime crisis. That was a story of lenders so desperate for market share and quick profit that they were chucking big sums at people who didn’t warrant it. The tale is very similar in the celebrity-media industry.

Your TV used to be the equivalent of a rating-agency, exposing you only to AAA-rated talent. Now, however, it asks you to keep up with the Kardashians; watch a Hilton or an Osbourne muddle through the real world, and, yes, be a guest at Katie Price’s latest wedding. The fundamentals of all these celebs are, frankly, ropey, and yet viewers are invited to invest time and emotional equity in them.

Resonances there with our ongoing discussion about gatekeepers and experts in the world of publishing; gatekeeper failure really can collapse a thriving market.

More pertinently, I think I’ve always viewed social currencies like fame (or its more localised little brother, popularity) in economic terms, even long before I knew what economics actually was*. Chakrabortty’s model would need to factor in some of fame’s more curious properties, though: the way it can, in some circumstances, be gifted to another without any loss of personal worth, for instance, or the way one can collapse one’s own federal reserve completely without any help or interference from others, or any intended expense on one’s part.

Shorter version: anyone who wants to code a detailed version of Whuffie has a whole lot of work ahead of them. But the human brain, jacked into the cyborg extension of ourselves we call the media, can run those insanely complex calculations without knowing consciously how they work… score one up for the meat. 😉
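
For what it’s worth, here’s a minimal and entirely hypothetical Python sketch of just those two curious properties – invented class name, fields and numbers, nothing to do with Doctorow’s actual design – showing how oddly they sit next to ordinary money:

# A toy Whuffie-ish reputation ledger; names and numbers are made up for illustration.
from dataclasses import dataclass, field

@dataclass
class Reputation:
    name: str
    score: float = 0.0
    endorsements: dict = field(default_factory=dict)  # who this holder has boosted, and by how much

    def endorse(self, other: "Reputation", amount: float) -> None:
        # Unlike money, praising someone costs the giver nothing:
        # the recipient gains, the giver's own score is untouched.
        other.score += amount
        self.endorsements[other.name] = self.endorsements.get(other.name, 0.0) + amount

    def self_destruct(self, scandal_factor: float = 0.5) -> None:
        # And unlike money, you can torch your own reserve single-handedly,
        # no help or interference from anyone else required.
        self.score *= (1.0 - scandal_factor)

if __name__ == "__main__":
    critic = Reputation("critic", score=100.0)
    band = Reputation("band", score=10.0)
    critic.endorse(band, 25.0)       # band gains 25; critic still holds 100
    band.self_destruct()             # band halves its own standing, unaided
    print(critic.score, band.score)  # 100.0 17.5

Making that endorsement step decay over time, weight itself by the endorser’s own standing, or propagate through a social graph is exactly where the “whole lot of work” kicks in.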

[ * This is a not-too-subtly coded way of saying that I wasn’t hugely popular at school, and spent a lot of time trying to rationalise why that was. I’d have doubtless been better served by not thinking about it, hence appearing to have been less of a massive nerd, and hence becoming more popular. Ah, hindsight… 🙂 ]

The multiphrenic world: Stowe Boyd strikes back on “supertasking”

… which is really a neologism for its own sake (a favourite gambit of Boyd’s, as far as I can tell). But let’s not let that distract from his radical (and lengthy) counterblast to a New York Times piece about “gadget addiction”, which chimes with Nick Carr’s Eeyore-ish handwringing over attention spans, as mentioned t’other day:

The fear mongers will tell us that the web, our wired devices, and remaining connected are bad for us. It will break down the nuclear family, lead us away from the church, and channel our motivations in strange and unsavory ways. They will say it’s like drugs, gambling, and overeating, that it’s destructive and immoral.

But the reality is that we are undergoing a huge societal change, one that is as fundamental as the printing press or harnessing fire. Yes, human cognition will change, just as becoming literate changed us. Yes, our sense of self and our relationships to others will change, just as it did in the Renaissance. Because we are moving into a multiphrenic world — where the self is becoming a network ‘of multiple socially constructed roles shaping and adapting to diverse contexts’ — it is no surprise that we are adapting by becoming multitaskers.

The presence of supertaskers does not mean that some are inherently capable of multitasking and others are not. Like all human cognition, this is going to be a bell-curve of capability.

As always, Boyd is bullish about the upsides; personally, I think there’s a balance to be found between the two viewpoints here, but – doubtless due to my own citizenship of Multiphrenia – I’m bucking the neophobics and leaning a long way toward the positives. And that’s speaking as someone who’s well aware that he’s not a great multitasker…

But while we’re talking about the adaptivity of the human mind, MindHacks would like to point out the hollowness of one of the more popular buzzwords on the subject, namely neuroplasticity [via Technoccult, who point out that Nick Carr uses the term a fair bit]:

It’s currently popular to solemnly declare that a particular experience must be taken seriously because it ‘rewires the brain’ despite the fact that everything we experience ‘rewires the brain’.

It’s like a reporter from a crime scene saying there was ‘movement’ during the incident. We have learnt nothing we didn’t already know.

Neuroplasticity is common in popular culture at this point in time because mentioning the brain makes a claim about human nature seem more scientific, even if it is irrelevant (a tendency called ‘neuroessentialism’).

Clearly this is rubbish and every time you hear anyone, scientist or journalist, refer to neuroplasticity, ask yourself what specifically they are talking about. If they don’t specify or can’t tell you, they are blowing hot air. In fact, if we banned the word, we would be no worse off.

That’s followed by a list of the phenomena that neuroplasticity might properly be referring to, most of which are changes in the physical structure of the brain rather than cognitive changes in the mind itself. Worth taking a look at.

A sci-fi rock’n’roll odyssey at Clarkesworld

Long-term readers of this here site are probably aware that my other huge cultural obsession (besides science fiction literature, natch) is rock music, and that I’ve spent a fair amount of time over the last few years drawing comparisons and connections between the two scenes.

So imagine my joy (if you will) when I saw that this month’s issue of Clarkesworld contains an article by Jason Heller that traces the history of science fictional futurism and narrative through the canon of rock music since Bowie’s “Space Oddity”! And better still (because this is the multimedia information super-content-highway-tubes, kids) it’s full of embedded video so you can actually hear and see what he’s on about.

Not for the first time (though almost always at moments when I have more than enough pressing demands on my time), I find myself thinking that there’s enough scope for me to write a non-fiction book on the cross-pollination of sf/f/h and rock music… anyone want to crowdfund me to spend a year on that? Maybe Heller would like to co-write… *opens email client*