Where priorities lie.
Mar. 20th, 2007 07:54 am

I had to say this yesterday: "This isn't an argument that can be won with words. It can only be demonstrated by experiment and outcome. Either you're right or I am. I don't have time to argue with you on Usenet about hypotheticals. I have a book to write and an audience to reach. Sorry, Greg."
(Although, truth to tell, I am looking forward to Incandescence.)
Best quote from the conversation came from fallenpegasus: "A Harrison Bergeron civilization with an IQ of 900 is just as miserable as one with an IQ of 80." I'm keeping that.
What was the gist of the disagreement?
Date: 2007-03-20 08:16 pm (UTC)

Re: What was the gist of the disagreement?
Date: 2007-03-20 09:18 pm (UTC)

Greg seems to believe that there's nothing structurally significant to the brain that impacts one's ability to consciously grasp difficult concepts; beyond a certain point, there is no more you can add to a human brain to make it "smarter." The disagreement seems to be about the arbitrariness of Egan's limits, the notion that Beyond Some Point there is no place for human consciousness to go. Pegasus' quote exactly captures the nature of Egan's argument: Once we're smart in XYZ ways, we'll all agree that Greg Egan was a prophet, stop thinking about this singularity nonsense, and accept a hard cap on further self-improvement. It will be a Harrison Bergeron universe of people immeasurably smarter than us. As I said, it's the arbitrariness of his argument that irks me.
Egan started frothing a bit toward the end about how the pro-Singularity people looked forward to hunting him down someday. I decided I had better things to do. The only way to determine who's right, Egan or Vinge, is to run the experiment.