Tag Archives: language

Dolphin diplomacy

In a passably neat segue from yesterday’s post about the potential personhood of higher animals, here’s news of some research that purports to show not just that dolphins have their own language, but that they can use it to talk their way out of fights [via, not surprisingly, George Dvorsky again].

Now, I’m no marine biologist (nor do I play one on television), but I rather suspect the conclusion here is speculative, especially given that the lede of the piece mentions that “[t]he study reveals the complexity and our lack of understanding about the communication of these marine mammals”. That’s not to say I don’t think it’s possible, just that I’m not sure there’s any way we could prove the assertion without someone who spoke dolphin like a native… and given that a fair bit of their communication is based on body language as well as sound (or so I was once told), I think that’s probably a fair distance in the future.

That said, if dolphins really can talk their way out of fights, they’re doing better than a lot of the humans I knew in my late teens, and we could probably do with a few of them on the UN Security Council.

Techlepathy: decoding words from brain signals

Another piece slots into the mind-machine interface puzzle: via George Dvorsky comes news that University of Utah neuroboffins have decoded individual words from recordings of brain activity made with implanted microelectrodes.

The University of Utah research team placed grids of tiny microelectrodes over speech centers in the brain of a volunteer with severe epileptic seizures. The man already had a craniotomy – temporary partial skull removal – so doctors could place larger, conventional electrodes to locate the source of his seizures and surgically stop them.

Using the experimental microelectrodes, the scientists recorded brain signals as the patient repeatedly read each of 10 words that might be useful to a paralyzed person: yes, no, hot, cold, hungry, thirsty, hello, goodbye, more and less.

Later, they tried figuring out which brain signals represented each of the 10 words. When they compared any two brain signals – such as those generated when the man said the words “yes” and “no” – they were able to distinguish brain signals for each word 76 percent to 90 percent of the time.

As always with this sort of story, though, it’s early days yet:

When they examined all 10 brain signal patterns at once, they were able to pick out the correct word any one signal represented only 28 percent to 48 percent of the time – better than chance (which would have been 10 percent) but not good enough for a device to translate a paralyzed person’s thoughts into words spoken by a computer.

“This is proof of concept,” Greger says. “We’ve proven these signals can tell you what the person is saying well above chance. But we need to be able to do more words with more accuracy before it is something a patient really might find useful.”

So you’ll have to wait a little longer for that comfy little skull-cap that’ll read your as-yet-unwritten novel straight out of your head (worse luck). But proof-of-concept’s better than nothing, especially for a technology that – even comparatively recently – was considered to be pure science fiction.
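For the curious, the gap between those pairwise and ten-way figures is largely a matter of chance levels. Here’s a toy sketch in Python (nothing to do with the Utah team’s actual analysis, and using entirely made-up “neural” features) showing how the same classifier can look impressive on two-word discrimination, where chance is 50%, yet much weaker at picking one word from ten, where chance is 10%:

```python
# Toy illustration only (not the Utah team's method): why pairwise word
# discrimination scores can run far ahead of ten-way decoding accuracy.
import numpy as np
from itertools import combinations
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
WORDS = ["yes", "no", "hot", "cold", "hungry",
         "thirsty", "hello", "goodbye", "more", "less"]
N_PER_WORD, N_FEATURES = 40, 16   # hypothetical trial and feature counts

# Give each word a noisy "signature" across the features.
signatures = rng.normal(0.0, 1.0, size=(len(WORDS), N_FEATURES))
X = np.vstack([sig + rng.normal(0.0, 2.0, size=(N_PER_WORD, N_FEATURES))
               for sig in signatures])
y = np.repeat(np.arange(len(WORDS)), N_PER_WORD)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.5, stratify=y, random_state=0)

# Ten-way decoding: chance is 1 in 10, i.e. 10%.
clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print(f"10-way accuracy: {clf.score(X_te, y_te):.0%} (chance 10%)")

# Pairwise decoding: chance is 1 in 2, i.e. 50%, so scores look far better.
pair_scores = []
for a, b in combinations(range(len(WORDS)), 2):
    tr, te = np.isin(y_tr, (a, b)), np.isin(y_te, (a, b))
    pair_clf = LinearDiscriminantAnalysis().fit(X_tr[tr], y_tr[tr])
    pair_scores.append(pair_clf.score(X_te[te], y_te[te]))
print(f"mean pairwise accuracy: {np.mean(pair_scores):.0%} (chance 50%)")
```

Run it and the ten-way score comes out well below the mean pairwise score on exactly the same data, which is the same pattern as in the quoted figures.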

Why isn’t there a gender-neutral pronoun?

Actually, there are dozens of gender-neutral pronouns, and that’s true even if you limit your search to the science fiction canon. But calls for a gender-neutral pronoun are much older than you might have thought, as the Oxford University Press blog explains, and we still haven’t managed to adopt one [via TheBigThink]:

Such discussions in the 1880s and 90s did nothing to shake up the pronoun paradigm, and nothing came of subsequent proposals for heer, hie, ha, hesh, thir, she (together with shis and shim), himorher, se, heesh, hse, kin, ve, ta, tey, fm, z, ze, shem, se, j/e, jee, ey, ho, po, ae, et, heshe, hann, herm, ala, de, ghach, han, he, mef, ws, and ze [a list with dates and sources for many of these pronouns can be found here].

Flash forward to 1978, when The Times (of London) prints a letter in response to yet another call for a new “unisex” pronoun set, advocating le, lim, ler, and lers. (And another correspondent tersely suggests it.)

Despite this wealth of coinage, there is still no widely accepted gender-neutral pronoun. In part, that’s because pronoun systems are slow to change, and when change comes, it is typically natural rather than engineered.

For those of us who work with words, of course, there are canonical rulesets to which we are supposed to adhere. But it’s the ruleset of grammar that long forbade the use of the singular they:

… despite the almost universal condemnation of the coordinate he or she by supporters of gender-neutral pronouns, the rule books now opt for he or she and not an invented word to replace the generic he. Students who once were taught that the masculine pronoun must always be used in cases of mixed or doubtful gender are now taught instead to use coordinate forms, not for gender balance or grammatical precision, but simply because that’s the new rule. Those writers who question the rule, who realize that multiple he-or-she’s just don’t make for readable prose, won’t seek out a new gender-neutral pronoun. Instead they’ll recast some sentences as plural, and for the rest they’ll just take their chances with singular they. After all, if you, which is also gender neutral, can serve both for singular and plural, why can’t they do the same? In any case, after more than 100 attempts to coin a gender-neutral pronoun over the course of more than 150 years, thon and its competitors will remain what they always have been, the words that failed.

Regular readers may have noticed that I tend to use the singular they wherever possible – indeed, I’ve been called out on it in the comments here once or twice, so that grammatical rule dies hard. I really can’t remember when I started doing it, either; I’m not sure whether I was taught that way at school (though I doubt it, given the conservatism of my education).

All this, I suppose, makes gender-neutral pronouns a case study in the seemingly universal human urge to create multiple new rules in order to fix a problem that could be obviated by dropping or loosening a single old rule…

Close conversation really is a meeting of minds

Behind the inevitable allusions to Star Trek, this is an interesting story: scientific evidence that the brain waves of someone listening closely to another person’s speech can synchronise with those of the speaker.

The evidence comes from fMRI scans of 11 people’s brains as they listened to a woman recounting a story.

The scans showed that the listeners’ brain patterns tracked those of the storyteller almost exactly, though they trailed 1 to 3 seconds behind. But in some listeners, brain patterns even preceded those of the storyteller.

“We found that the participants’ brains became intimately coupled during the course of the ‘conversation’, with the responses in the listener’s brain mirroring those in the speaker’s,” says Uri Hasson of Princeton University.

Hasson’s team monitored the strength of this coupling by measuring the extent of the pattern overlap. Listeners with the best overlap were also judged to be the best at retelling the tale. “The more similar our brain patterns during a conversation, the better we understand each other,” Hasson concludes.
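For the technically minded: the study worked from fMRI time courses, but the basic notion of “pattern overlap with a lag” is easy to sketch. Here’s a minimal, hypothetical Python example, with fabricated signals standing in for the speaker’s and listener’s responses, that estimates coupling strength and delay by scanning time-lagged correlations:

```python
# Toy sketch (not Hasson's actual pipeline): estimate speaker-listener
# coupling as the peak time-lagged correlation between two time series.
import numpy as np

rng = np.random.default_rng(1)
TR = 1.0        # hypothetical sampling interval, in seconds
n = 300

speaker = rng.normal(size=n)
speaker = np.convolve(speaker, np.ones(5) / 5, mode="same")  # smooth it

# Fabricate a listener who tracks the speaker with a ~2-second delay.
listener = np.roll(speaker, 2) + rng.normal(scale=0.5, size=n)

def lagged_corr(a, b, lag):
    """Correlation of a[t] against b[t + lag]."""
    if lag > 0:
        return np.corrcoef(a[:-lag], b[lag:])[0, 1]
    if lag < 0:
        return np.corrcoef(a[-lag:], b[:lag])[0, 1]
    return np.corrcoef(a, b)[0, 1]

lags = range(-5, 6)
corrs = [lagged_corr(speaker, listener, k) for k in lags]
best_lag, best_r = max(zip(lags, corrs), key=lambda lc: lc[1])
print(f"peak coupling r = {best_r:.2f} at lag {best_lag * TR:+.0f} s")
# A positive peak lag means the listener trails the speaker, as with the
# 1-3 second delay above; a negative lag would mean anticipation, as in
# the listeners whose patterns preceded the storyteller's.
```

The lag at which the correlation peaks plays the role of the 1-to-3-second delay in the study, and the height of the peak stands in for the strength of the coupling.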

Apparently (and completely unsurprisingly) an unfamiliar language acts as a barrier to this synchronisation – if you can’t understand the person who’s speaking, you can’t “click” with them. This is probably the best argument for a single global language that I can think of… but I wonder whether poor comprehension of the same language would produce results similar to those for a completely foreign one?

Foreign Accent Syndrome

Strange things are afoot in the language centres of our brains. Just the other week, we had the story of the Croatian teenager who woke from a coma unable to speak her native language, but mysteriously fluent in German. Slightly more mundane (but still pretty weird) is Foreign Accent Syndrome, wherein traumas and/or triggers unknown cause the afflicted person to speak their native tongue in a seemingly foreign accent.

So much we don’t understand about that cauliflower of grey meat…