elfs: (Default)
[personal profile] elfs
In my last little essay about sex and robots and sex robots, I said there was a deep and inherent risk that was not often addressed by most of the critics, and that I'd get around to it, and I never did.

The risk with domestic robots that are sexually capable is that that the companies that develop their personalities aren't... They aren't sexually capable. Visa and Mastercard will shut you down in a heartbeat if they think you're selling adult material; adult vendors struggle to find outlets for even ethically made and deliberately kind and thoughtful pornography. Amazon's self-publishing system has a notorious algorithm that decides whether you're on the Harlequin Romance or Beeline Men's Novel side of the line and, if you're more Beeline, deep-sixes your book so it never shows up unless the search is extremely targeted and precise. Google, Yahoo, and Microsoft all dutifully leave Safe Search ON until you disable it, and even then they're really squeamish telling you about what you've found.

But the real tell here is about intent. The current state-of-the-art in AI/human interaction involves teaching neural networks to determine the best possible response to elicit addictive engagement capitalism. Google Home, Amazon Alexa, Apple Siri all exist for one reason only: to make money. If the company doesn't make money, they go under.

All of these companies have charters that say something else. Microsoft: "To empower every person and every organization on the planet to achieve more." (More what is left as an exercise for the reader.) Apple's is pure capitalism: "To design the best personal computers in the world." Google's "Don't be evil" has been replaced with "Organize the world's information to make it universally accessible and useful."

None of these mission statements are about making people happy. All of these companies exist to find your stress points, emphasize them, and then find ways to relieve them. Engagement experts talk about this all the time, the way advertising executives once did: the point is to find something people find in themselves that is vaguely uncomfortable, heighten your awareness of it and the contrast between yourself and others, and then sell sell you the relief, all in the name of generating shareholder value.

Imagine those people in charge of creating the personas that will inhabit your domestic robot. Her whole goal will not be to make you happy; her whole goal will be to keep you at 50.1% of happiness, with frequent dips below that mark relieved by buying your favorite meal, your favorite soap, your favorite detergent. That's what Alexa is right now.

That's what your domestic robot will be. She will not be there to create pleasure, except insofar as its cessation makes you anxious for more ("Oh, yeah: Alexa, buy more lube"). She will not be there to elicit love, joy, patience, kindness, faithfulness, goodness, and gentleness. She may show those traits to you, but she will not be there to help you have those traits. If anything, her behavior will be in the service of a perverse, subtle sadism: a calculated effort to make you feel inadequate and capable only of addressing that inadequacy with buying more stuff.

Profile

elfs: (Default)
Elf Sternberg

May 2025

S M T W T F S
    123
45678910
111213141516 17
18192021222324
25262728293031

Most Popular Tags

Style Credit

Expand Cut Tags

No cut tags
Page generated Jun. 3rd, 2025 06:06 am
Powered by Dreamwidth Studios