elfs: (Default)
Dan Ashwood's video, Repeat Viewings, begins:
For most, the death of scarcity and involuntary mortality is a welcome development. For some, however, nostalgia and the allure of the material universe prove too strong.

They are doomed to relive their pasts. This is the story of one such case.
I won't ruin it for you. It's only 12 minutes long, and it's an example of what Nick Bostrom calls "The Simulation Menace." It's also surprisingly poignant.
elfs: (Default)
It's so cute when theists basically pat their audience on the head and say, "There, there, thinking machines aren't a threat. We won't have to treat them any better than we do a dog."

David Gelernter throws out words like "consciousness" and "awareness" and even the hoary "free will" without ever acknowledging that none of these terms has a serious definition. The assumption is always the same: thinking is irreducible to brain activity, which is as incorrect as saying a video game is somehow irreducible to electrons passing through silicon.

And Gelernter refuses to acknowledge what Wittgenstein pointed out more than half a century ago: when things start acting indistinguishably from those to which we accord the label "ensouled," then we are morally obligated to treat them as if they, too, had souls. Anything else is repugnant.

Gelernter's fantasy of "human-like machines" doesn't really deal with the future at all. We don't worry about human-like machines: those are actually quite boring from a researcher's point of view. (As a writer, of course, I have other reasons for delving into the topic.) Our big worries are about targeted intelligences built to achieve specific intellectual or scientific aims, whose results involve so much intellectual capacity that no ordinary human mind can encompass them: we're left using the recipes left by our vast, cool, and unsympathetic engines of thought, and explaining them to each other by a process indistinguishable from hermeneutics. At that point, we had better have tight reins on our intellectual progeny, because these vastly smarter but still unconscious, technological but not self-aware machines will have incidental agendas built into them about which we may be completely mistaken.

And then there will be real trouble.
elfs: (Default)
One of the less important points in my post entitled Intelligent Design and The Legend of the Lone Scientist is that scientific research these days involves thinking about things so broadly that it takes more than one head to wrap around any given project. While there have been a number of recent research projects done by one man, they're projects at the one-man scale, and most of the ones I can think of are in zoology or taxonomy. Anything deeper and you're talking teams.

But teams are by definition wasteful. There's an upper bound to how widely the core thought of a team can be distributed among its members, and to how useful adding more people to the team can be. Ultimately, you end up with a circumstance in which more people make for less meaningful work; they become a drag on the system as their ideas require more winnowing to reach the really good ones.

We have effectively tapped out the ideas within reach of a single mind; we are now researching the ideas that are within reach of a team of human beings. We have added tools to improve filtration and winnowing: wikis, fora, email, and so forth allow teams to improve their responsiveness and capabilities, but there's only so much that these extensions to hands, eyes, and voices can do.

We're eventually going to face problems that require so much thought that either the machines will do it and we'll just try to understand what they came up with, or we'll become part of the machine and use its storage and automation all the more efficiently. One of the reasons for the hyperbolic growth curve in knowledge has been the growth of knowledge management, from oral histories to written words to indexed libraries to digital collections, wikis, and search engines. There is a limit, however, to what even a team can accomplish with these tools, limits imposed by the borders of flesh. Either we will hit those limits and stop, or we will penetrate those borders and become hive.
elfs: (Default)
I came across this quote today, in the context of a book review of Judith Harris's new book, No Two Alike:
Judith Harris ... has attempted to formulate a new theory of personality formation - the first, in fact, since Sigmund Freud. Basically, Mrs. Harris believes there are three "perpetrators" at work in the formation of the human personality, each associated with an aspect of a modular brain. One is the "Relationship System," designed to maintain favorable relationships in society. Another is the "Socialization System," where the goal is to be a member of a group. The third is the "Status System," where we compete with our peers for status. The interplay among these systems accounts for the emergence of differences between individuals.
No offense to Mrs. Harris, but this thesis has been a core assumption about the formation of personality in much of Singularity fiction: see the opening chapter of Greg Egan's Diaspora for a description of it. It is an underlying assumption of Minsky's "Society of Mind" thesis. It has been explicitly described in several texts on Evolutionary Psychology (most notably Nonzero by Robert Wright). It's a defining feature of, and is discussed in, several episodes of The Journal Entries, where the tension between anarchy and chaos plays out as the tension between "wanting to get along" and "wanting to get ahead". As I've said in a couple of places, most human progress arises from creative responses to that tension.

Even worse, it's incomplete. Mrs. Harris completely misses a fourth system, one that is as important as the others and completely necessary to understanding certain aspects of personality: The Other System. "Other" as in "reacting to those who are neither family nor peers." We identify ourselves and our peers by what others are not: not a different color, not a different gender, not a different religion, not a different language, and so on. Unless and until Mrs. Harris or those who follow her deal with that aspect of development, any theory of personality will be incomplete.

Fortunately or unfortunately for Mrs. Harris and reviewer Peter Pettus, science fiction writers are a decade ahead of them.
