… which is really a neologism for its own sake (a favourite gambit of Boyd’s, as far as I can tell). But let’s not be distracted from his radical (and lengthy) counterblast to a New York Times piece about “gadget addiction”, which chimes with Nick Carr’s Eeyore-ish handwringing over attention spans, as mentioned t’other day:
The fear mongers will tell us that the web, our wired devices, and remaining connected are bad for us. It will break down the nuclear family, lead us away from the church, and channel our motivations in strange and unsavory ways. They will say it’s like drugs, gambling, and overeating, that it’s destructive and immoral.
But the reality is that we are undergoing a huge societal change, one that is as fundamental as the printing press or harnessing fire. Yes, human cognition will change, just as becoming literate changed us. Yes, our sense of self and our relationships to others will change, just as it did in the Renaissance. Because we are moving into a multiphrenic world — where the self is becoming a network ‘of multiple socially constructed roles shaping and adapting to diverse contexts’ — it is no surprise that we are adapting by becoming multitaskers.
The presence of supertaskers does not mean that some are inherently capable of multitasking and others are not. Like all human cognition, this is going to be a bell-curve of capability.
As always, Boyd is bullish about the upsides; personally, I think there’s a balance to be found between the two viewpoints here, but – doubtless due to my own citizenship of Multiphrenia – I’m bucking the neophobics and leaning a long way toward the positives. And that’s speaking as someone who’s well aware that he’s not a great multitasker…
But while we’re talking about the adaptivity of the human mind, MindHacks would like to point out the hollowness of one of the more popular buzzwords on the subject, namely neuroplasticity [via Technoccult, who point out that Nick Carr uses the term a fair bit]:
It’s currently popular to solemnly declare that a particular experience must be taken seriously because it ‘rewires the brain’ despite the fact that everything we experience ‘rewires the brain’.
It’s like a reporter from a crime scene saying there was ‘movement’ during the incident. We have learnt nothing we didn’t already know.
Neuroplasticity is common in popular culture at this point in time because mentioning the brain makes a claim about human nature seem more scientific, even if it is irrelevant (a tendency called ‘neuroessentialism’).
Clearly this is rubbish and every time you hear anyone, scientist or journalist, refer to neuroplasticity, ask yourself what specifically they are talking about. If they don’t specify or can’t tell you, they are blowing hot air. In fact, if we banned the word, we would be no worse off.
That’s followed by a list of the phenomena that neuroplasticity might properly refer to, most of which are changes in the physical structure of the brain rather than cognitive changes in the mind itself. Worth taking a look at.