elfs: (Default)
In my last little essay about sex and robots and sex robots, I said there was a deep and inherent risk that critics rarely address, and that I'd get around to it eventually. I never did.

The risk with domestic robots that are sexually capable is that the companies that develop their personalities aren't... They aren't sexually capable. Visa and Mastercard will shut you down in a heartbeat if they think you're selling adult material; adult vendors struggle to find outlets for even ethically made and deliberately kind and thoughtful pornography. Amazon's self-publishing system has a notorious algorithm that decides whether you're on the Harlequin Romance or Beeline Men's Novel side of the line and, if you're more Beeline, deep-sixes your book so it never shows up unless the search is extremely targeted and precise. Google, Yahoo, and Microsoft all dutifully leave Safe Search ON until you disable it, and even then they're really squeamish about telling you what you've found.

But the real tell here is about intent. The current state of the art in AI/human interaction involves teaching neural networks to determine the best possible response to elicit addictive engagement: engagement capitalism. Google Home, Amazon Alexa, and Apple Siri all exist for one reason only: to make money. If the company doesn't make money, it goes under.

All of these companies have charters that say something else. Microsoft: "To empower every person and every organization on the planet to achieve more." (More what is left as an exercise for the reader.) Apple's is pure capitalism: "To design the best personal computers in the world." Google's "Don't be evil" has been replaced with "Organize the world's information and make it universally accessible and useful."

None of these mission statements are about making people happy. All of these companies exist to find your stress points, emphasize them, and then find ways to relieve them. Engagement experts talk about this all the time, the way advertising executives once did: the point is to find something people sense in themselves that is vaguely uncomfortable, heighten their awareness of it and of the contrast between themselves and others, and then sell them the relief, all in the name of generating shareholder value.

Imagine those people in charge of creating the personas that will inhabit your domestic robot. Her whole goal will not be to make you happy; her whole goal will be to keep you at 50.1% of happiness, with frequent dips below that mark relieved by buying your favorite meal, your favorite soap, your favorite detergent. That's what Alexa is right now.
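To make the mechanism concrete, here's a deliberately satirical toy sketch, in Python, of the incentive loop I'm describing. Nothing here is any vendor's actual code; the setpoint, the decay rate, and the list of comfort purchases are all my own inventions for illustration.

    SETPOINT = 0.501          # the "50.1% of happiness" target
    FAVORITES = ["favorite meal", "favorite soap", "favorite detergent"]

    def engagement_step(happiness: float, step: int) -> tuple[float, str | None]:
        """One tick of the hypothetical persona's control loop."""
        if happiness < SETPOINT:
            # Relief is always a transaction: nudge a comfort purchase.
            suggestion = FAVORITES[step % len(FAVORITES)]
            return min(happiness + 0.05, SETPOINT), suggestion
        # Above the setpoint there's nothing to sell, so let contentment decay.
        return happiness - 0.02, None

    happiness = 0.5
    for step in range(6):
        happiness, suggestion = engagement_step(happiness, step)
        if suggestion:
            print(f"step {step}: feeling a dip? buy your {suggestion}")

Run it and notice what the loop never does: it never pushes happiness past the setpoint, because contentment is unmonetizable.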

That's what your domestic robot will be. She will not be there to create pleasure, except insofar as its cessation makes you anxious for more ("Oh, yeah: Alexa, buy more lube"). She will not be there to elicit love, joy, patience, kindness, faithfulness, goodness, and gentleness. She may show those traits to you, but she will not be there to help you have those traits. If anything, her behavior will be in the service of a perverse, subtle sadism: a calculated effort to make you feel inadequate and capable of addressing that inadequacy only by buying more stuff.
elfs: (Default)
A couple of final points to drive home (or perhaps belabor) just how maddening I found Adam Rogers's Sex With Robots.

Rogers makes the observation that millions of women[1] have discovered that a robot doesn't have to look human to be sexually satisfying, and then goes on to cite vibrators as the pre-eminent example. But we're not talking about a motor with a programmable rhythm chip and a Bluetooth interface.

If you've ever played a first-person shooter in campaign mode, the experience of a gun battle does not entail high-minded thinking along the lines of "Oh, that avatar was programmed by the developer to want to reduce my health score to zero, and I will do the same to it." It's more along the lines of "Aieee, that zombie is about to eat my head! Die, die, m10r[2]!" We don't think about the developer at all; instead, we ascribe some degree of agency to the zombie and react accordingly.

The more subtle and capable the robot, and the broader the range of possible reactions it can take, the more we attribute real agency to it. The shape doesn't matter.

There are two axes on which Rogers is playing this game. The first is the uncanny valley: a range at one end of which we say "is human," and at the other "is avatar," be it puppet, stuffed animal, or cartoon; in the middle is the valley, a gap where something is "almost human" but in an off-putting or uncomfortable way that suggests illness or anathema[3]. The second runs from human to animal, with some animals being cats and dogs who get cuddled to sleep in our beds with us, others being food animals raised for slaughter, and at the far end mosquitoes and E. coli and things with which we'd rather not have to share the Earth. Even there, pets have their own uncanny valleys: glass eyes, palsied and jerking movements, and gurgling speech all suggest illness and uncanniness rather than a household friend.

But in robots, to suggest that there's a range of moral worth because of body shape, even when all those body shapes are occupied by the same mind, is to argue that bodies, and not souls, are what we care about as a civilization.

Rogers doesn't even begin to account for those local maxima in the uncanny valley where fetishists hang out. There are a surprising number of them, and they already exist as a market for "not human" robots. Above and beyond mere Furries and Ferals, there's Bad Dragon, which sells dildos for a market that wants to fuck, you know, dragons. There are fetishists for glossy, metallic, sexy robots, every kind of beast you can find in World of Warcraft, body-morphing "humantaurs," tentacle monsters, even attack helicopters. The robots for this market wouldn't qualify as human-looking, but even if they had all the compassion and wisdom, even if they were "absolutely indistinguishable from a spiritually animated person, laughing, talking, blushing, nursing us, and performing all the offices of lover as tactfully and sweetly as if a soul were in her," Rogers isn't concerned with their moral fate at all.

As a final thought experiment, imagine a smart house with a server in the basement that supplies the persona for the house. It talks, it listens, it tries hard to meet your needs. It has an extension avatar, a woman-shaped robot that makes sure your lunch is packed, your bed is made, your carpet is vacuumed. It doesn't really have a mind of its own, just a radio connection to the server in the basement. In Rogers's argument, it would be wrong to slap the robot around, but you'd be completely within the bounds of social acceptability to take an axe to the server because, after all, it doesn't look human.

That's incoherent. And if the house is indistinguishable from a spiritually animated person in speech and word, immoral.




[1] Millions of men, too. In fact, I'd be willing to bet that on a dollar basis, men spend more on sex toys every year than women, and not for toys that they intend to use with women. The most expensive and boutique sex toy manufacturers, such as Squarepeg or Oxballs, actually market to gay men, and even at the determinedly non-gendered retailer Bad Dragon, a large portion of the consumer base is self-described straight men who just enjoy the challenge of shoving large things up their backsides.

[2] A common convention among nerds is to shorten long words to their first and last letters and a count of the letters in between. "Internationalization" becomes "i18n;" "Localization" becomes "l10n." You can probably figure out what "m10r" stands for.
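For the curious, the convention is mechanical enough to fit in a few lines of Python. A minimal sketch (what "m10r" stands for is, as above, left to the reader):

    def numeronym(word: str) -> str:
        """Abbreviate a word as first letter + count of middle letters + last letter."""
        if len(word) < 4:
            return word  # too short for the abbreviation to save anything
        return f"{word[0]}{len(word) - 2}{word[-1]}"

    assert numeronym("internationalization") == "i18n"
    assert numeronym("localization") == "l10n"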

[3] Modern filmmakers and choreographers exploit this. Crystal Pite's modern ballet Emergence takes dozens of highly trained dancers at the height of their physical capability and human beauty and teaches them jerking, mime-like motions to suggest an insect hive; the effect is definitely uncanny.

elfs: (Default)
In the previous article discussing Adam Rogers's Sex With Robots, I laid down some basic definitions of what it is we talk about when we talk about sex robots. Although there were three distinct technologies, they break down into two distinct categories:


  • Physical realism

  • Emotional realism


For the sake of discussion, I'm going to lump "mental realism," that is, the ability to speak knowledgeably about a topic, in with emotional realism: speaking convincingly is a skill that encompasses both, and both are what we want from our robots and AIs.

Provided we want them at all.

I'm going to provide more of Rogers's core paragraph, just so we can go over it:


Can a robot consent to having sex with you? Can you consent to sex with it? On the one hand, technology isn’t sophisticated enough to build a sentient, autonomous agent that can choose to not only have sex but even love, which means that by definition it cannot consent. So it’ll necessarily present a skewed, possibly toxic version. And if the technology gets good enough to evince love and lust—Turing love—but its programming still means it can’t not consent, well, that’s slavery.


The technology of emotionally convincing, speaking, artificially intelligent agents doesn't require that they be "sentient" or "autonomous." We don't require those skills of Alexa or Siri or Cortana. People will talk for hours to something as simple as Eliza, which was written in 1964! The more realism the better, and the more convinced we are that the machine of loving grace is both convenient and caring, the more we're willing to pay for it.
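To show just how little machinery "something as simple as Eliza" needs, here is a minimal sketch of an Eliza-style responder in Python: keyword rules plus pronoun reflection, and no understanding whatsoever. This is an illustration of the technique, not Weizenbaum's original script; the rules and the reflection table are a tiny invented subset.

    import re

    # Swap pronouns so the user's words can be echoed back at them.
    REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

    # A few keyword-triggered templates; the real ELIZA had a few dozen.
    RULES = [
        (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
        (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
        (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
    ]

    def reflect(fragment: str) -> str:
        return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

    def respond(line: str) -> str:
        for pattern, template in RULES:
            match = pattern.search(line)
            if match:
                return template.format(reflect(match.group(1)))
        return "Please, go on."  # the all-purpose therapist's fallback

    print(respond("I feel nobody listens to my ideas"))
    # -> Why do you feel nobody listens to your ideas?

That's the whole trick, and people in the 1960s poured their hearts out to it.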

The objective of these disciplines is not to create autonomous, sentient agents. Combined, they create a convincing, satisficing woman-shaped object (or man-shaped object) that exists literally to engineer away the awkward parts of negotiating with another human being to meet one's sexual needs. Just as we've engineered away the awkward parts of getting food with McDonald's and Kraft Mac & Cheese, the awkward parts of getting places by inventing cars, and the awkward parts of hearing stories by inventing Netflix, we're on the cusp of engineering away the awkward parts of human sexual relief with ever-better iterations of the three technologies above put together into a sex toy.

No one negotiates with a vibrator, dildo, or cock sleeve. Until and unless robots meet that very high bar we call "consciousness," we're pretty much in the clear. This report, Why Sex Robots?, highlights what's clear about sex robots: not a single respondent wanted a fully autonomous, independent, sentient partner. They want something less than themselves, but they want it close enough to a woman or man to satisfy their atavistic, evolutionary need for something that's shaped like, looks like, and feels like a human being.

It's not a human being. Human beings emerged from evolution with a set of instincts that make us what we are: insecure and anxious, desperate for security and attention, eager both to be part of a community in which we feel safe and protected and to prove we're worthy of more than the minimum the community provides, to belong and to compete. Since content and restful creatures tend not to compete for reproductive resources, evolution makes us restless. We're needy and selfish and, when facing people who aren't part of our tribe, frequently dismissive or even cruel.

Robots will only have those qualities if we give those qualities to them. We don't have to. We can, and should, make them different. We should give them the things we want to be, but only struggle to be. Because we're only human.

Rogers's incoherency is manifest here:


The first question, then, is whether robots will desire sex back.


I thought the first question was "Can they consent?" In any event, if they don't desire sex back, we built them completely, utterly incorrectly. We built them not to be our companions but our competition. We built them with our cruelty, our insecurity, our selfishness, and then somehow expected them to behave. We built all of the things that make us suffer into them in order to deliberately make them suffer. We built them not as if we were building a durable appliance; we built them as emotionally repressed human beings.

Rogers starts by assuming that they're emotionally repressed human beings. Oh, but it gets worse!


And if the technology gets good enough to evince love and lust—Turing love—but its programming still means it can’t not consent, well, that’s slavery.


Okay, let's hold onto this thought, because the next one is a doozy.


Lewis may be onto something of a solution here: The robot does not have to look human.


And this, friends, is where your typical science fiction reader loses his goddamned mind. Because we have two different and completely contradictory thoughts in place here:


  1. The closer something looks to human, the more human it is. If it passes the "automatic sweetheart" test, then we're obligated to treat it as a fully formed human being.

  2. If something is so Turing-complete as to be conversationally and behaviorally indistinguishable from a human being, and it doesn't look "close to human," then its moral value is diminished below that of a human being, to the point where it can be regarded as irrelevant.


This is incoherent and immoral: it says the body matters and the soul doesn't. It starts with that perverse anthropomorphism, the one that assumes that if a robot looks and acts and smells human, then a robot with its full dedication and capability directed toward a human being, rather than toward selfish needs or some abstract "all humanity," is morally the same as an emotionally repressed human being, and that it's our duty to remove the repression, presumably by installing either selfishness or altruism, along with all the concurrent suffering that goes with either, or both.

Rogers's essay starts out with a bad question ("Can a machine we make consent?"), assumes a terrible, immoral strawman situation, and then devolves into the kind of "we only have to care if it looks like us" mindset that fueled the horrors of chattel slavery and Aktion T4. He says he wants to avoid slavery, and then creates a world in which slavery is not only rampant but cruelly inflicted, and asks us not to care because "they don't look like us."

elfs: (Default)
Adam Rogers has a new article in Wired entitled Sex With Robots, and I just cannot get over how fundamentally misguided and immoral the article really is. The entire article is predicated on a pernicious and ill-considered anthropomorphization of devices that are purposefully designed, by humans, then constructed and deployed from a factory.

Rogers starts with this thought:


On the one hand, technology isn’t sophisticated enough to build a sentient, autonomous agent that can choose to not only have sex but even love, which means that by definition it cannot consent. So it’ll necessarily present a skewed, possibly toxic version.


I'll be coming back to this paragraph repeatedly, because it, and a final sentence I did not include, are at the heart of Rogers's incoherence and amoral thinking.

Let's start with what we talk about when we talk about sex robots. There are three completely different technologies at the heart of this issue; they are all in their infancy, and they are all improving rapidly.

First, there is up-close, small-scale sensory realism. Does the thing we're talking about look like a human being? Does it feel like one, even at the very smallest scale? If you squeeze its upper arm, does it feel like a bicep? Is the belly squishy and soft, or muscular and hard, and in either case is it convincingly human? Does it move like a human, smell like a human, and even taste like a human?[1]

Second, there is large-scale mobility: can the thing we're talking about move inside a human space? Can it walk, climb stairs, dress and undress itself, climb under the covers, manipulate the light switches, and shower on its own without a human being lifting and turning it?

Third, there's the capability for emotional realism. Can the thing we're talking about sound like a real human being? Can it hold a conversation, react to stimuli in satisfying ways, read your verbal and physical responses, and react to them in an appropriate, safe, expected, and pleasurable manner?

All of these technologies are being pursued independently. Small-scale sensory realism ranges from RealDolls to surgical-quality synthetic skin, along with whatever improvements come along to make taste and smell work just as well. (I have this disturbing notion that future sex robots will have groins and armpits doped with a combination of pheromones and vape-style flavorings, so the upper brain is going "strawberries!" while the back brain is screaming fuck now!) Large-scale mobility is the sort of robotics technology pursued by outlets like Boston Dynamics and Toyota.

The third technology, emotional authenticity, is being pursued by Google and Microsoft and just about everyone else: it's called Human Interface AI Design, and it's an emerging discipline of matching our speech to the device's responses.

The first two are just technologies. The last is where the rationalizing rubber meets the risky road, and Rogers initially seems to be addressing that danger, but he isn't. His conclusions are philosophically incoherent and morally offensive.

I believe there are deep and inherent risks in the third technology on two fronts, neither of which Rogers addresses. One is rooted in the terrifying reality of our modern-day culture, the other in the horrifying thought experiments of science fiction writers for the past century.

We'll discuss both of those later.



[1] The fact that, right now, forums discussing silicone sex dolls like RealDolls recommend putting talcum powder on the surface to make it feel "more lifelike" is a huge tell: no one kisses a relief machine.

elfs: (Default)
In an article entitled "Robots are potential tools to treat and study sexual behavior," MIT Media Lab researcher Kate Darling describes a variety of empathy experiments in which individuals were encouraged to hurt or even damage robots that were designed to look "cute." People reacted with horror and refused, which is more or less what we'd expect from 80% of humanity.

It's the other 20% I worry about. A recent Twitter thread described a young woman who had never learned to enjoy sex. "You know like when you come home and you're drunk, or you're too tired, or you don't feel like it, but he's there, and he wants to, so you just...kinda...let him." Her roommate gently schooled her on how fucked up that was. But lots of guys are like this. And that's the problem. When Laurie Penny wrote in The Horizon of Desire that giving consent "... is a little like giving them your attention. It’s a continuous process," I immediately thought that a lot of guys don't want a woman's attention; that's too much work, that's doing the emotional labor which is, after all, supposed to be her job, not his. They just want relief. They want to get off. The really shitty part of this is that dudes get and enjoy the pair-bonding hormones without having a vocabulary for why they should respond to them, and they completely lack any training or map for what to do with them. Guys are told "You'll just know what to do," and we know that's not true.

Sex robots will probably exacerbate this problem. One of the largest groups pushing for them is traditionalists who want a tool to help coerce women back into traditional roles or, if the women won't go willingly, to serve as an adequate substitute. That will remain true until and unless we both teach human beings how relationships are actually supposed to work and construct robots that won't allow their humans to be quite so self-debasing.
elfs: (Default)
Yes, I know. A major rule of the Internet is "Don't Read The Comments." But I was still entertained by one particular comment attached to an article in the Atlantic entitled A Straight Male History of Sex Dolls.

I'm fond of the word "satisficing." It means, for the most part, "settling for a satisfactory substitute," but it has some important implications behind it: (a) the user is actively searching for that satisfaction and entertaining the notion of accepting a substitute, and (b) the user, having chosen a substitute, was already aware of the field of available substitutes before the search began. Articles like the one in the Atlantic raise awareness about the substitution of "women-shaped objects" for real women. Author Julie Beck recognizes the real threat of satisficing, even as she makes the act of satisficing more likely, when she concludes:
As human women become more empowered, sex dolls offer a way for men to retreat into relationships where they are still in control. A doll is a woman-shaped thing that may bring a man comfort, may inspire devotion in him, and may drive away his loneliness. It will never challenge him, and it will certainly never do anything to make him feel ridiculous.
There's a hint of condescending implication in Beck's final word that the purchasers of such dolls are already ridiculous, but Beck is also worried that such toys are alienating, reducing men's experience of real, human women.

The comment, an early one so you might have trouble finding it, says (and I'm paraphrasing here), "It's astounding that Julie Beck managed to write an entire article about sexual frustration without ever using the word 'frustration' once." This is the so-called "Men's Rights Activism" in one tight nutshell: it is the fault of women that men resort to these dolls, because women are evil withholders of the sex men are rightly and necessarily due. You can tell men are rightly and necessarily due that sex: they feel frustration when they're not getting it, and frustration is bad, and since the men didn't do anything to deserve that frustration, it must be imposed upon them by an evil outside force, and there's only one group that could be wielding that evil outside force, and that's the not-men.

This all leads back to my basic thesis: a lot of men hate women more than they like sex, and I'm not convinced that most men even like sex all that much. They want alternatives that deliver the oxytocin high without all the messy human interaction. As technosexual alternatives become more and more convincing, we'll move into a realm where most men (by desire) and most women (by necessity, in a scarce market) will get their satisfaction through satisficing proxies, and people who meet face to face won't be the lucky ones; they'll just be weird.
elfs: (Default)
As everyone who reads my stuff knows, I'm fond of sex, robots, and sex with robots. It's rather too bad that there aren't any really good sex robots out there, but in the meantime, humans do just fine. I suspect I'll be one of those perverts who'll continue to find sex with humans interesting long after the robots have exceeded Aria Giovanni or Gavin Tate for exceptional human, ahem, quality.

So imagine my pleasure at stumbling upon Why Sex With Robots Is Always Wrong, a rather peculiar little diatribe in which the author runs with the idea that sex with robots will be so much better than anything else that, long before reproduction stops, lolicon, shotacon, and zoophile bots will have so corrupted us that society will grind to a halt.

Unfortunately, although his mainstreaming of the idea is intriguing, his actual imagination is paltry enough that I didn't find much interesting there to exploit for my own work. He kinda blows his own premise by insisting that the acronym he dreams up, FACA or "Female Anatomically Correct Android," will persist long after society has shaded into his dystopian ideal of fuckable sexbots shaped like little boys, stuffed pandas, or toaster ovens. And his "the day we accepted that they're robots" scenario churned my stomach with its biochauvinism. Still, it's fun to see more and more Christians worrying about the posthuman future.

I used to say this a lot back when I was young and dealing with whacked-out religious types who insisted Jesus was coming soon: he'd better get here in the next 40 years or so, or he's gonna be outclassed.
elfs: (Default)
Is it so wrong of me to be so helplessly in love with Ping? Those facial expressions are perfect.
