Sex robots and thought crimes

A recent op-ed in The New York Times discusses some ethical and moral issues with sex robots. In particular, it examines the issues raised by one such robotic personality, dubbed “Frigid Farrah,” which the company describes in its FAQ as “reserved and shy”.

From the op-ed:

Frigid Farrah is not alone in providing her user with a replica of a human partner without the nagging complication of consent. […]

One of the authors of the Foundation for Responsible Robotics report, Noel Sharkey, a professor of artificial intelligence and robotics at the University of Sheffield, England, said there are ethical arguments within the field about sex robots with “frigid” settings.

“The idea is robots would resist your sexual advances so that you could rape them,” Professor Sharkey said. “Some people say it’s better they rape robots than rape real people. There are other people saying this would just encourage rapists more.”

Like the argument that women-only train compartments are an answer to sexual harassment and assault, the notion that sex robots could reduce rape is deeply flawed. […]

Rape is not an act of sexual passion. It is a violent crime. We should no more be encouraging rapists to find a supposedly safe outlet for it than we should facilitate murderers by giving them realistic, blood-spurting dummies to stab.

Now, I agree with the condemnation of rape. To its credit, True Companion, the company behind Frigid Farrah, now says in its FAQ (in a passage that may or may not have been there at the time the op-ed was published):

We absolutely agree with Laura Bates, campaigner and founder of the Everyday Sexism Project, that “rape is not an act of sexual passion…”

Roxxxy, our True Companion sex robot is simply not programmed to participate in a rape scenario and the fact that she is, is pure conjecture on the part of others. […]

Frigid Farrah can be used to help people understand how to be intimate with a partner.

Rape simply isn’t an interaction that Roxxxy supports nor is it something that our customers are requesting.

That said, I would much rather someone rape a human-like robot than a real person. I understand others may not feel the same way, but I would compare an attempt to outlaw rape-scenario robot software, on the grounds that it could be used to practice for the rape of real women, to banning video games like the Grand Theft Auto series because they can be used to rehearse crime and evading arrest. If you think the latter is just plain silly, you understand how I see the former.

The logical extreme of this (and I use the word “logical” a bit loosely here) is to make it a crime to partake in a simulated rape involving a sex robot. We already have laws that have run amok, outlawing the possession of simulated child porn that involves no actual children. The original purpose of child porn laws was to prevent child abuse, complementing the laws against child abuse itself. If there are no children involved, all you really have left is thoughtcrime (à la George Orwell’s 1984). People have been arrested and convicted for possessing only simulated child porn. While I find even simulated child porn repulsive, having our laws establish crimes where there is no real victim makes me queasier still.

Not surprisingly, there have also been calls for child sex robots to be banned, at least in Britain, per the op-ed. I would certainly hope the child-sized versions are in much lower demand. I find the sexual abuse of children particularly revolting, and I would expect most of decent society to concur. Again, better a robot than the real thing: the perverts get their perverse desires satisfied, no humans are harmed, and everyone wins.

Unless, of course, the legislators run amok and outlaw them. When child-sized sex robots are outlawed, only outlaws will have child-sized sex robots, and yet another thoughtcrime will be on the books. Can someone please remind our legislators that George Orwell’s 1984 is not an instruction manual?

Matt McMullen, quoted in an article from The Guardian linked in the op-ed, sums it up rather nicely:

“Is it ethically dubious to force my toaster to make my toast?”