… which is really a neologism for its own sake (a favourite gambit of Boyd’s, as far as I can tell). But let’s not distract from his radical (and lengthy) counterblast to a New York Times piece about “gadget addiction”, which chimes with Nick Carr’s Eeyore-ish handwringing over attention spans, as mentioned t’other day:
The fear mongers will tell us that the web, our wired devices, and remaining connected are bad for us. It will break down the nuclear family, lead us away from the church, and channel our motivations in strange and unsavory ways. They will say it’s like drugs, gambling, and overeating, that it’s destructive and immoral.
But the reality is that we are undergoing a huge societal change, one that is as fundamental as the printing press or harnessing fire. Yes, human cognition will change, just as becoming literate changed us. Yes, our sense of self and our relationships to others will change, just as it did in the Renaissance. Because we are moving into a multiphrenic world — where the self is becoming a network ‘of multiple socially constructed roles shaping and adapting to diverse contexts’ — it is no surprise that we are adapting by becoming multitaskers.
The presence of supertaskers does not mean that some are inherently capable of multitasking and others are not. Like all human cognition, this is going to be a bell-curve of capability.
As always, Boyd is bullish about the upsides; personally, I think there’s a balance to be found between the two viewpoints here, but – doubtless due to my own citizenship of Multiphrenia – I’m bucking the neophobics and leaning a long way toward the positives. And that’s speaking as someone who’s well aware that he’s not a great multitasker…
But while we’re talking about the adaptivity of the human mind, MindHacks would like to point out the hollowness of one of the subject’s more popular buzzwords, namely neuroplasticity [via Technoccult, who point out that Nick Carr uses the term a fair bit]:
It’s currently popular to solemnly declare that a particular experience must be taken seriously because it ‘rewires the brain’ despite the fact that everything we experience ‘rewires the brain’.
It’s like a reporter from a crime scene saying there was ‘movement’ during the incident. We have learnt nothing we didn’t already know.
Neuroplasticity is common in popular culture at this point in time because mentioning the brain makes a claim about human nature seem more scientific, even if it is irrelevant (a tendency called ‘neuroessentialism’).
Clearly this is rubbish and every time you hear anyone, scientist or journalist, refer to neuroplasticity, ask yourself what specifically they are talking about. If they don’t specify or can’t tell you, they are blowing hot air. In fact, if we banned the word, we would be no worse off.
That’s followed by a list of the phenomena that neuroplasticity might properly be referring to, most of which are changes in the physical structure of the brain rather than cognitive changes in the mind itself. Worth taking a look at.
Hi Paul, and welcome to our club of Bad Multitaskers Anonymous. I too have the feeling that society (and the internet in particular) strongly pushes people towards multitasking; but also that we (I, for sure) are dangerously near the limits of what our brains evolved to do, and consequently what they are actually capable of doing.
It seems to me that people can only increase the number of tasks they work on simultaneously at the price of more errors in the outcomes. People really do manage to do more things (such as working, or using social networks, or going to the gym, or… educating their children) in a given time span than in the past, but the quality of the results of all that activity is lower.
Why should we have to become multitasking virtuosos without any new tools to help us? The reason oral recitation and the ability to memorize thousands of lines of text ceased to be an important measure of intellectual achievement was that writing and the various tools created from it (the aide-mémoire, the written history, etc.) removed the need for that sort of recall, replacing it with methods that were more accurate over multiple recitations, easier to transmit over distance and time, and simpler and quicker to copy.

I don’t expect multitasking to become so common as to not attract comment until we have more and better tools to support it; for instance, a central device that aggregates, triages, and prioritizes all demands on a user’s attention. Right now I have to deal with email, voicemail, real-time phone, IM, Twitter, Facebook, LiveJournal, RSS feeds, broadcast and cable TV, YouTube, broadcast and internet radio, and my doorbell, mostly in separate windows or on separate hardware devices. When I can see (and/or hear) all of the possible sources of demands on my attention in one place, stacked so that I can end my response to one and pick the next one to deal with, then I’ll consider multitasking a routine skill. Until then, only those people with a preternatural skill will be good at it.
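That ‘central device’ is, at bottom, a single priority queue fed by every channel. Here’s a minimal Python sketch of the idea; every name in it (AttentionAggregator, Demand, the priority numbers) is purely illustrative, not any existing tool’s API:

```python
# A sketch of the commenter's hypothetical "attention aggregator":
# every channel (email, IM, doorbell, ...) pushes demands into one
# queue, which triages them by priority and arrival order so the
# user handles exactly one demand at a time.
import heapq
import itertools
from dataclasses import dataclass, field

@dataclass(order=True)
class Demand:
    priority: int                       # lower number = more urgent
    arrived: int                        # tie-breaker: first come, first served
    source: str = field(compare=False)  # e.g. "email", "doorbell"
    summary: str = field(compare=False)

class AttentionAggregator:
    def __init__(self):
        self._queue = []
        self._counter = itertools.count()

    def push(self, source, summary, priority):
        """A channel registers a new demand on the user's attention."""
        heapq.heappush(self._queue,
                       Demand(priority, next(self._counter), source, summary))

    def next_demand(self):
        """End the current task and pick the most urgent remaining one."""
        return heapq.heappop(self._queue) if self._queue else None

agg = AttentionAggregator()
agg.push("email", "weekly report due", priority=3)
agg.push("doorbell", "someone at the door", priority=1)
agg.push("twitter", "@mention from a friend", priority=5)

while (d := agg.next_demand()) is not None:
    print(f"[{d.source}] {d.summary}")
# -> doorbell first, then email, then twitter
```

Pushing a demand into the queue is the easy part for each channel; the hard part is assigning the priorities, which is exactly the triage work that no current tool does for us.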