Feb. 17th, 2018

Adam Rogers has a new article in Wired entitled Sex With Robots, and I just cannot get over how fundamentally misguided and immoral the article really is. The entire article is predicated on a pernicious and ill-considered anthropomorphization of devices that are purposefully designed by humans, then constructed and deployed from a factory.

Rogers starts with this thought:


On the one hand, technology isn’t sophisticated enough to build a sentient, autonomous agent that can choose to not only have sex but even love, which means that by definition it cannot consent. So it’ll necessarily present a skewed, possibly toxic version.


I'll be coming back to this paragraph repeatedly, because it, and a final sentence I did not include, are at the heart of Rogers's incoherence and amoral thinking.

Let's start with what we talk about when we talk about sex robots. There are three completely different technologies at the heart of this issue; all of them are in their infancy, and all of them are improving rapidly.

First, there is up-close, small-scale sensory realism. Does the thing we're talking about look like a human being? Does it feel like one, even at the very smallest scale? If you squeeze its upper arm, does it feel like a bicep? Is the belly squishy and soft, or muscular and hard, and in either case is it convincingly human? Does it move like a human, smell like a human, and even taste like a human? 1

Second, there is large-scale mobility: can the thing we're talking about move inside a human space? Can it walk, climb stairs, dress and undress itself, climb under the covers, manipulate the light switches, and shower on its own without a human being lifting and turning it?

Third, there's the capability for emotional realism. Can the thing we're talking about sound like a real human being? Can it hold a conversation, react to stimuli in satisfying ways, read your verbal and physical responses, and react to them in an appropriate, safe, expected, and pleasurable manner?

All of these technologies are being pursued independently. Small-scale sensory realism spans everything from RealDolls to surgical-quality synthetic skin, along with whatever improvements come along to make taste and smell work just as well. (I have this disturbing notion that future sex robots will have groins and armpits doped with a combination of pheromones and vape-style flavorings, so the upper brain is going "strawberries!" while the back brain is screaming fuck now!). Large-scale mobility is the sort of robotics technology pursued by companies like Boston Dynamics and Toyota.

The third technology, emotional authenticity, is being pursued by Google and Microsoft and just about everyone else: it's called Human Interface AI Design, an emerging discipline devoted to matching our speech with a device's responses.

The first two are just technologies. The last is where the rationalizing rubber meets the risky road, and Rogers initially seems to be addressing that danger, but he isn't. His conclusions are philosophically incoherent and morally offensive.

I believe there are deep and inherent risks in the third technology on two fronts, neither of which Rogers addresses. One is rooted in the terrifying reality of our modern-day culture, the other in the horrifying thought experiments of science fiction writers for the past century.

We'll discuss both of those later.



1 The fact that, right now, forums discussing silicone sex dolls like RealDolls recommend putting talcum powder on the surface to make it feel "more lifelike" is a huge tell: no one kisses a relief machine.

In the previous article discussing Adam Rogers's Sex With Robots, I laid down some basic definitions of what it is we talk about when we talk about sex robots. Although there were three distinct technologies, they break down into two categories:


  • Physical realism

  • Emotional realism


For the sake of discussion, I'm going to lump "mental realism," that is, the ability to speak knowledgeably about a topic, in with emotional realism: speaking convincingly is a skill that encompasses both, and both are what we want from our robots and AIs.

Provided we want them at all.

I'm going to provide more of Rogers's core paragraph, just so we can go over it:


Can a robot consent to having sex with you? Can you consent to sex with it? On the one hand, technology isn’t sophisticated enough to build a sentient, autonomous agent that can choose to not only have sex but even love, which means that by definition it cannot consent. So it’ll necessarily present a skewed, possibly toxic version. And if the technology gets good enough to evince love and lust—Turing love—but its programming still means it can’t not consent, well, that’s slavery.


The technology of emotionally convincing, speaking, artificially intelligent agents doesn't require that they be "sentient" or "autonomous." We don't require those skills of Alexa or Siri or Cortana. People will talk for hours to something as simple as Eliza, which was written in 1964! The more realism the better, and the more convinced we are that the machine of loving grace is both convenient and caring, the more we're willing to pay for it.
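
Just to underline how low that bar is: here's a toy Eliza-style responder, a sketch of my own in Python rather than anything from Weizenbaum's original (which worked from pattern-matching scripts), showing the whole trick of reflecting a speaker's words back as a question:

    # A toy Eliza-style responder: swap pronouns and echo the
    # statement back as a question. No understanding required.
    REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
                   "you": "I", "your": "my", "mine": "yours"}

    def respond(statement):
        words = statement.lower().strip(".!?").split()
        reflected = " ".join(REFLECTIONS.get(w, w) for w in words)
        return "Why do you say " + reflected + "?"

    print(respond("I am lonely"))  # "Why do you say you are lonely?"

That's more or less the entire mechanism, and people poured their hearts out to it anyway.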

The objective of these disciplines is not to create autonomous, sentient agents. Combined, they create a convincing, satisficing woman-shaped object (or man-shaped object) that exists literally to engineer away the awkward parts of negotiating with another human being to meet one's sexual needs. Just as we've engineered away the awkward parts of getting food with McDonald's and Kraft Mac & Cheese, the awkward parts of getting places by inventing cars, and the awkward parts of hearing stories by inventing Netflix, we're on the cusp of engineering away the awkward parts of human sexual relief with ever-better iterations of the three technologies above put together into a sex toy.

No one negotiates with a vibrator, dildo, or cock sleeve. Until and unless robots meet that very high bar we call "consciousness," we're pretty much in the clear. This report, Why Sex Robots?, highlights what's clear about sex robots: not a single respondent wanted a fully autonomous, independent, sentient partner. They want something less than themselves, but they want it close enough to a woman or man to satisfy their atavistic, evolutionary need for something that's shaped like, looks like, and feels like a human being.

It's not a human being. Human beings emerged from evolution with a set of instincts that make us what we are: insecure and anxious, desperate for security and attention, eager both to be part of a community in which we feel safe and protected and to prove we're worthy of more than the minimum the community provides, to belong and to compete. Since content and restful creatures tend not to compete for reproductive resources, evolution makes us restless. We're needy and selfish and, when facing people who aren't part of our tribe, frequently dismissive or even cruel.

Robots will only have those qualities if we give those qualities to them. We don't have to. We can, and should, make them different. We should give them the things we want to be, but only struggle to be. Because we're only human.

Rogers's incoherency is manifest here:


The first question, then, is whether robots will desire sex back.


I thought the first question was "Can they consent?" In any event, if they don't, we built them completely, utterly incorrectly. We built them not to be our companions but our competition. We built them with our cruelty, our insecurity, our selfishness, and then somehow we expected them to behave. We built all of the things that make us suffer into them in order to deliberately make them suffer. We built them not as if we were building a durable appliance; we built them as emotionally repressed human beings.

Rogers starts by assuming that they're emotionally repressed human beings. Oh, but it gets worse!


And if the technology gets good enough to evince love and lust—Turing love—but its programming still means it can’t not consent, well, that’s slavery.


Okay, let's hold onto this thought, because the next one is a doozy.


Lewis may be onto something of a solution here: The robot does not have to look human.


And this, friends, is where your typical science fiction reader loses his goddamned mind. Because we have two different and completely contradictory thoughts in place here:


  1. The closer something looks to human, the more it is human. If it passes the "automatic sweetheart" test, then we're obligated to treat it as a fully formed human being.

  2. If something is so Turing-complete as to be conversationally and behaviorally indistinguishable from a human being, and it doesn't look "close to human," then its moral value is diminished below that of a human being's to the point where it can be regarded as irrelevant.


This is incoherent and immoral: it says the body matters and the soul doesn't. It starts with that perverse anthropomorphism, the one that assumes that if a robot looks and acts and smells human, then a robot with its full dedication and capability directed toward a human being, rather than toward selfish needs or some abstract "all humanity," is morally the same as an emotionally repressed human being, and that it's our duty to remove the repression, presumably by installing either selfishness or altruism, along with all the concurrent suffering that goes with either, or with both.

Rogers's essay starts out with a bad question: "Can a machine we make consent?", assumes a terrible, immoral strawman situation, and then devolves into the kind of "we only have to care if it looks like us" mindset that fueled the horrors of chattel slavery and Aktion T4. He says he wants to avoid slavery, and then creates a world in which slavery is not only rampant but cruelly inflicted, and asks us not to care because "they don't look like us."

A couple of final points to drive home (perhaps belabor) just how maddening I found Adam Rogers's Sex With Robots.

Rogers makes the observation that millions of women1 have discovered that a robot doesn't have to look human to be sexually satisfying, and then goes on to cite vibrators as the pre-eminent example. But we're not talking about a motor with a programmable rhythm chip and a Bluetooth interface.

If you've ever played a first-person shooter in campaign mode, the experience of a gun battle does not entail high-minded thinking along the lines of "Oh, that avatar was programmed by the developer to want to reduce my health score to zero, and I will do the same to it." It's more along the lines of "Aieee, that zombie is about to eat my head! Die, die, m10r2!" We don't think about the developer at all; instead, we ascribe some degree of agency to the zombie and react accordingly.

The more subtle and capable the robot, and the broader the range of possible reactions it can take, the more we attribute real agency to it. The shape doesn't matter.

There are two axes on which Rogers is playing this game. The first is the uncanny valley: a range at one end of which we say "is human," and at the other "is avatar," be it puppet, stuffed animal, or cartoon, and in the middle is the valley, a gap where it's "almost human" but in an off-putting or uncomfortable way that suggests illness or anathema3. The second runs from human to animal, with some animals being cats and dogs who get cuddled to sleep in our beds with us, others being food animals raised for slaughter, and at the far end mosquitoes and E. coli and things with which we'd rather not have to share the Earth. Even there, pets have their own uncanny valleys: glass eyes, palsied and jerking movements, and gurgling speech all suggest illness and uncanniness rather than a household friend.

But in robots, to suggest that there's a range of moral worth because of body shape, even when all those body shapes are occupied by the same mind, is to argue that bodies, and not souls, are what we care about as a civilization.

Rogers doesn't even begin to account for those local maxima in the uncanny valley where fetishists hang out. There are surprisingly many of them, and they already exist as a market for "not human" robots. Above and beyond mere Furries and Ferals, there's Bad Dragon, which sells dildos for a market that wants to fuck, you know, dragons. There are fetishists for glossy, metallic, sexy robots, every kind of beast you can find in World of Warcraft, body-morphing "humantaurs," tentacle monsters, even attack helicopters. The robots for this market wouldn't qualify as human-looking, but even if they had all the compassion, all the wisdom, even if they were "absolutely indistinguishable from a spiritually animated person, laughing, talking, blushing, nursing us, and performing all the offices of lover as tactfully and sweetly as if a soul were in her," Adam Rogers isn't concerned with their moral fate at all.

As a final thought experiment, imagine a smart house with a server in the basement that supplies the persona for the house. It talks, it listens, it tries hard to meet your needs. It has an extension avatar, a woman-shaped robot that makes sure your lunch is packed, your bed is made, your carpet is vacuumed. It doesn't really have a mind of its own; just a radio connection to the server in the basement. In Rogers's argument, it would be wrong to slap the robot around, but you'd be completely within the bounds of social acceptability to take an axe to the server because, after all, it doesn't look human.

That's incoherent. And if the house is indistinguishable from a spiritually animated person in speech and word, immoral.




1 Millions of men, too. In fact, I'd be willing to bet that on a dollar basis, men spend more on sex toys every year than women, and not for toys that they intend to use with women. The most expensive and boutique sex toy manufacturers, such as Squarepeg or Oxballs, actually market to gay men, and even on the determinedly non-gendered retailer Bad Dragon, a large portion of the consumer base is self-described straight men who just enjoy the challenge of shoving large things up their backsides.

2 A common convention among nerds is to shorten long words to their first and last letters and a count of the letters in between: "internationalization" becomes "i18n," "localization" becomes "l10n." You can probably figure out what "m10r" stands for.
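
(For the curious, the rule is mechanical enough to fit in a one-line Python function; this sketch, including the function name, is mine:)

    # Numeronym: first letter + count of interior letters + last letter.
    def numeronym(word):
        return word[0] + str(len(word) - 2) + word[-1]

    print(numeronym("internationalization"))  # i18n
    print(numeronym("localization"))          # l10n
    # ...and it works just as well on twelve-letter expletives.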

3 Modern filmmakers and choreographers exploit this. Crystal Pite's modern ballet, Emergence, uses dozens of highly trained dancers at the height of their physical capabilities and human beauty, and teaches them jerking, mime-like motions to suggest an insect hive, and the effect is definitely uncanny.
