I think this is about the third or fourth variation of this story I’ve seen in the last few years, but nonetheless – The Guardian has a brief piece wherein philosopher Nick Bostrom suggests we should be thinking ahead about what rights we will need to grant to our sentient machines.
Which is very well-meant, I suppose. But science fiction author Peter Watts takes a rather different view of the necessity for robot rights — basically, that there isn’t any.
“I’ve got no problems with enslaving machines — even intelligent machines, even intelligent, conscious machines — because as Jeremy Bentham said, the ethical question is not ‘Can they think?’ but ‘Can they suffer?’ You can’t suffer if you can’t feel pain or anxiety; you can’t be tortured if your own existence is irrelevant to you. You cannot be thwarted if you have no dreams — and it takes more than a big synapse count to give you any of those things. It takes some process, like natural selection, to wire those synapses into a particular configuration that says not I think therefore I am, but I am and I want to stay that way. We’re the ones building the damn things, after all. Just make sure that we don’t wire them up that way, and we should be able to use and abuse with a clear conscience.”
How about you – are you looking forward to running your Roomba ragged, or planning to kennel your Aibo when you go on holiday? [Image by Plutor]
[tags]robotics, rights, ethics, technology[/tags]