Destroying the chrysalis of your online childhood

Google CEO Eric Schmidt has suggested that in the near future every person might be entitled to change their name on reaching adulthood, in order that they can live unhindered by the online record of their youthful indiscretions. Now, Eric Schmidt is almost certainly a much smarter man than I am… but that idea is clearly batshit nuts, especially coming from a Google bigwig.

I mean, think about it – most people use pseudonyms and handles online as it is. How many names and identities will you be allowed to abandon? One? Some? All? And as the vigilante efforts of Anonymous prove time and time again, even folk making a big effort to conceal their identity can have it exposed against their will. If Schmidt is implying (as he seems to be – it’s hard to tell from such a throwaway comment, to be fair) that the digital record would remain after this identity disconnection, how exactly would you prevent people from doing a combination of internet sleuthing and good old-fashioned meatspace gumshoe work in order to connect xCrazyLarry1989x to Lawrence Michaels, aspiring tort lawyer and gubernatorial candidate for his local Chamber of Commerce?

Given the way web technology keeps advancing, it may not even be much of an effort. Hell, Google itself offers a neat little app called Goggles that can identify famous faces and objects by comparing them to archived images on the web (for certain values of ‘famous’: John Scalzi is apparently famous enough for this to work); given the sheer number of photos on the Facebook profiles of most young people, how are you going to prevent someone using this sort of image search and linking your newly-renamed Adult Person to the child they were?

Simple answer: you’re not. If what Schmidt actually means is that there’ll be some sort of legally-enshrined disconnect between an adult and their behaviour before adulthood, then maybe it’s not quite such a crazy suggestion… but it’s still pretty crazy. Personally, I tend to agree with Stowe Boyd and others: I think we underestimate the common sense kids apply to social media based on the high-profile idiocy of a tiny minority, and I think we overestimate the impact that youthful (or even not so youthful) indiscretions in the digital fossil record will have on how the people who left them behind will be viewed. As a crude numbers-from-the-air example: if one in five kids is pictured somewhere on the web taking a hit from a bong, is society more likely to (a) refuse to employ 1/5 of the population, or (b) figure that kids smoking weed really isn’t such a big deal?

Trouble is, we keep applying the social mores of today to the society of a decade hence. Think about how different the world felt just five years ago; attitudes change fast. By the time the internet’s knowledge of our past is sufficient to be causing problems for the majority of people, my bet is that we’ll be worrying about something else entirely. Or, to put it another way: when you have evidence that pretty much everyone has been a little bit naughty at some point in their lives, your assessment of how much naughtiness is forgivable will shift accordingly. Transgression is implicitly assessed against a baseline of ‘normality’; a searchable childhood for everyone will move that baseline. In fact, I’d even go so far as to suggest it’ll be the people with squeaky-clean pasts who end up looking the most suspicious…

Probabilistic processing: the analogue computer waits in the wings

Digital processing has the advantage of versatility – the utter ubiquity of computer technology is a testament to that. But digital logic has to use lots of bits to represent large ranges of values; perhaps some applications – spam filtering, for instance, or pattern analysis – would run better and faster on a system that allowed for analogue values “in the raw”, so to speak?

Lyric’s innovation is to use analogue signals instead of digital ones, allowing probabilities to be encoded directly as voltages. Their probability gates represent zero probability as 0 V, and certainty as VDD. But unlike digital logic, for which these are the only options, Lyric’s technology allows probabilities between 0 and 1 to use voltages between 0 and VDD. Each probabilistic bit (“pbit”) stores not an exact value, but rather the probability that the value is 1. The technology allows a resolution of about 8 bits; that is, it can discriminate between about 2⁸ = 256 different values (different probabilities) between 0 and VDD.

By creating circuits that can operate directly on probabilities, much of the extra complexity of digital circuits can be eliminated. Probabilistic processors can perform useful computations with just a handful of pbits, with a drastic reduction in the number of transistors and circuit complexity as a result.
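To make the voltage-encoding idea concrete, here’s a minimal Python sketch of how a pbit’s value might be quantized. To be clear, this is my own illustration, not Lyric’s actual design: the supply voltage of 3.3 V and the function names are assumptions, and their real hardware is analogue rather than software like this.

```python
# A minimal sketch (not Lyric's real implementation) of encoding a
# probability as a quantized voltage. VDD = 3.3 V is an assumed supply
# voltage; LEVELS reflects the ~8-bit resolution described above.

VDD = 3.3
LEVELS = 2 ** 8  # 256 distinguishable probability levels


def probability_to_voltage(p: float) -> float:
    """Encode a probability in [0, 1] as one of 256 voltage levels in [0, VDD]."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must lie in [0, 1]")
    level = round(p * (LEVELS - 1))  # snap to the nearest representable level
    return level / (LEVELS - 1) * VDD


def voltage_to_probability(v: float) -> float:
    """Decode a voltage in [0, VDD] back into the probability it encodes."""
    return v / VDD
```

Zero probability comes out as 0 V and certainty as VDD; everything in between gets snapped to the nearest of the 256 intermediate levels, so the worst-case rounding error is half a level, about 0.2%.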

This could so easily be an excerpt from a Rudy Rucker story… or a Neal Stephenson novel, for that matter.

Transhumanist science clash! Kurzweil vs. Myers

Say what you will about transhumanism, but one thing’s for certain: it really polarises opinion, and nowhere more so than in the halls of academia and scientific research. Observe: Wired/Gizmodo had a chat with Singularitarian-in-chief Ray Kurzweil, who restated his theory (considered unrealistically optimistic by some transhumanists) that we’ll be able to reverse-engineer the human brain and simulate it with computers within a decade or so.

Here’s how that math works, Kurzweil explains: The design of the brain is in the genome. The human genome has three billion base pairs or six billion bits, which is about 800 million bytes before compression, he says. Eliminating redundancies and applying loss-less compression, that information can be compressed into about 50 million bytes, according to Kurzweil.

About half of that is the brain, which comes down to 25 million bytes, or a million lines of code.
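For what it’s worth, the chain of figures is at least internally consistent (the raw count is actually ~750 million bytes, which he rounds to 800 million). Here it is as a quick sketch; note that the ~25 bytes-per-line figure at the end is my own inference from his “million lines of code” claim, not something he states:

```python
# Kurzweil's back-of-envelope chain, step by step.
# Each base pair encodes one of 4 bases, i.e. 2 bits; 8 bits per byte.

base_pairs = 3_000_000_000
bits = base_pairs * 2                # 6 billion bits
raw_bytes = bits // 8                # 750 million bytes ("about 800 million")

compressed_bytes = 50_000_000        # his lossless-compression estimate
brain_bytes = compressed_bytes // 2  # "about half of that is the brain"

lines_of_code = 1_000_000
bytes_per_line = brain_bytes // lines_of_code  # ~25 bytes per line (inferred)
```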

Now enter PZ Myers, prominent atheism advocate (I like to think of him as “Dawkins’ Bulldog”, though I’m not sure Dawkins really needs a bulldog in the way that Darwin did) and vigorous debunker of fringe science. Broad claims in the Kurzweil vein are like a red rag to Myers, especially on his home turf of genetic biology, and he’s not afraid of mixing in a little ad hominem disparagement with his rejoinders, either:

Kurzweil knows nothing about how the brain works. Its design is not encoded in the genome: what’s in the genome is a collection of molecular tools wrapped up in bits of conditional logic, the regulatory part of the genome, that makes cells responsive to interactions with a complex environment. The brain unfolds during development, by means of essential cell:cell interactions, of which we understand only a tiny fraction. The end result is a brain that is much, much more than simply the sum of the nucleotides that encode a few thousand proteins. He has to simulate all of development from his codebase in order to generate a brain simulator, and he isn’t even aware of the magnitude of that problem.

[…]

To simplify it so a computer science guy can get it, Kurzweil has everything completely wrong. The genome is not the program; it’s the data. The program is the ontogeny of the organism, which is an emergent property of interactions between the regulatory components of the genome and the environment, which uses that data to build species-specific properties of the organism. He doesn’t even comprehend the nature of the problem, and here he is pontificating on magic solutions completely free of facts and reason.

Now, I’m not taking sides here*; I don’t know enough computer science or evolutionary biology to cut into either interpretation. But a high-minded slapfight like this is always of interest, because it highlights just how seriously some very intelligent people take the issue. Kurzweil has more than a tinge of the evangelist about him, which is (I suspect) a large part of what bothers Myers about him, but there’s obviously something powerful about the idea (the meme?) of transhumanism/singularitarianism that Myers evidently feels makes it worth fighting.

Ideas that get people arguing are important ideas. I consider myself a fellow traveller of transhumanism for this very reason; the ways we imagine tomorrow say a lot about where we are today, and vice versa. There’s a lot to learn by listening to both sides, I think.

[ * Yeah, yeah, I know, I’ve got marks on my ass from sitting on the fence. That’s just how I roll, baby; you want clenched-fist advocacy of anything but the right to think for yourself, you’re gonna need to read a different blog. ]

Paranormal biofantasy: zombie ants, hungry vampires

I pretty much never talk about the “paranormal romance” or “urban fantasy” tropes here at Futurismic, partly because they rarely say much about the real future in anything more than very vague metaphorical terms (the ubiquity of the shambling undead as a symbol of the subliminal horror of a greying society where the elderly prey on the financial vitality of the young and healthy?), and partly because talking about vampires and zombies and werewolves in the genre blogosphere is a bit like whispering your shopping list in the mosh-pit at a Slayer gig.

But put the roots of those tropes into some sort of scientific context, and I’m all over it like the tribal tattoo on an ass-kicking heroine’s lower back. So, ladies and gentlemen: zombie ants. Zombie ants that have been mind-controlled by a parasitic fungus for nearly fifty million years.

The finding shows that parasitic fungi evolved the ability to control the creatures they infect in the distant past, even before the rise of the Himalayas.

The fungus, which is alive and well in forests today, latches on to carpenter ants as they cross the forest floor before returning to their nests high in the canopy.

The fungus grows inside the ants and releases chemicals that affect their behaviour. Some ants leave the colony and wander off to find fresh leaves on their own, while others fall from their tree-top havens on to leaves nearer the ground.

The final stage of the parasitic death sentence is the most macabre. In their last hours, infected ants move towards the underside of the leaf they are on and lock their mandibles in a “death grip” around the central vein, immobilising themselves and locking the fungus in position.

OK, so the fate of rainforest bugs and freaky fungi may not seem all that existentially terrifying, but this sort of parasitic manipulation occurs closer to home – remember toxoplasma, the cat parasite that may (be sure to emphasise the ‘may’) be responsible for human neurotic behaviour patterns?

And in deepest darkest Peru, no one is finding vampirism sparkly and smoulderingly attractive (yet strangely supportive of Christianised notions of sexual abstinence and submissive femininity): swarms of vampire bats are on the rampage, and have attacked more than 500 people. The only immortality that bite is going to give you is a third page sidebar in your local paper as the first person to die of rabies in living memory.

Living with less: digital lifestyles versus consumer materialism

Seems like you can’t have a good idea these days without it turning into some sort of cult or movement… maybe that’s always been the case, but 24/7 journalism and social media certainly speed up the process. Aaaaaaanyways, here’s a BBC article on technohipster types who’re shedding the majority of their material possessions in favour of computer hardware and cloud-based communications and data storage.
It’s kind of romantic, in a somewhat smug and self-aware po-mo kind of way: the New Nomadism! A reaction to the consumerist lust-for-stuff that helped bring us to global financial collapse, etc etc. What it fails to take into account is that there are hundreds of thousands of people living just as nomadic a lifestyle, only without the luxuries of a fresh Macbook Air and a custom-built fixie; having too much stuff is very much a #firstworldproblem, and as much as it’s satisfying to see a turn away from that, it’s frustrating to see how, already, it’s destined to be repackaged and sold as a lifestyle trend.

If I was in the cloud computing business right now, I’d be thinking real hard about how to market (and mark up!) my tools and services to precisely these sorts of people: people who are financially and geographically fortunate enough to see sparse living as something worth paying for (as opposed to being the only game in town, as it is for most folks living out of a couple of bags).

That said, I can see the benefits… hell, I’ve even experienced some of them. My own recent relocation saw me sell off my entire music collection, for instance; I realised I never played my CDs in a player, so I just ripped them all to a hard drive and sold them off. There were nearly a thousand of them, and do you know what the biggest surprise was? How hard it was to get people to buy them, even priced at just £1 each. Another couple of years (or even less), and you’ll have to give physical music media away. Even now, as new promos keep pouring through my letterbox, I increasingly view them as an imposition on my space… like a meatspace version of bacn, I guess.

It would have been much more pragmatic of me to replace my books with an ereader, but there I drew the line; my library is my major fetish, the last real outlet for my deeply-ingrained middle-class collector’s impulse, and while I may have culled a lot of crap from it, there’s a lot of books that I simply can’t bear to part with. It’s irrational, but I don’t think a bit of irrationality is all that harmful to anything other than my own bank balance… though ask me again after the next time I have to move house. Close to a thousand books is a whole lot of heavy boxes to shift, and they take up a lot of space.

What the BBC piece (and the technomad quotes that prop it up) skips over is the infrastructure that makes such a nomadic lifestyle possible. Ubiquitous wireless broadband, for instance; I’m guessing these people wouldn’t be so keen on living the way they do if they couldn’t remain connected to the world from wherever they’re currently laying their hat. And there’s a whole bunch of unexamined Western privilege beneath the surface: safe places to crash or couch-surf, cheap places to rent over short periods, comparatively low incidences of property theft, kitchen utensils cheap enough to throw out or give away each time you move… these hidden costs are carried by the societies these people live in. Which isn’t to portray these people as parasites (far from it!), but it’s worth bearing in mind to counteract some of the digital_Beatnik utopian vibe of the thing.

Going back to my own downsizing, I found that necessity was the motivator… I inherited a real packrat mindset from my late father, and it dies hard. But now I’ve started, it’s easier to see other things that I know (rationally) I could (and indeed should) get rid of. But emotional attachments are very powerful things; whatever you might think of Buddhism as a religion, that’s one aspect of human psychology it really nails. It can be done, though; Futurismic‘s very own peripatetic columnist Sven Johnson tells me his possessions consist of a desk, a decent ergonomic chair, a computer and a duffle full of clothes. As a freelance industrial designer, he doesn’t really need much else – and it means moving to where the work is becomes a much less painful process.

What would it take to make you give up the majority of your physical possessions? And what’s the one thing you really couldn’t bear to part with, even though you know you don’t need it?

Presenting the fact and fiction of tomorrow since 2001