Under the Aegis of Transhuman Spiderman + Other Considerations (and Listener Feedback Catch-up)
Fiction
The Metamorphosis of Prime Intellect, by Roger Williams
The Golden Age, by John C. Wright
Utopia, LOL?, by Jamie Wahls (+audio version)
Red Legacy and Other Stories, by Eneasz Brodski
Non-Fiction
Making Beliefs Pay Rent
The Psychological Unity of Humankind, by Eliezer Yudkowsky
– counterpoint: The Psychological Diversity of Mankind, by Kaj Sotala
Dunbar’s Number/Monkey Sphere
CEV
IQ actually doesn’t correlate with unhappiness. It’s either uncorrelated or weakly positively correlated with happiness! /surprised
That backwards audio effect thing was kinda creepy and scary to me for some reason. Like the world ceases to be comprehensible all of a sudden or something.
I managed to influence the topic of the next discussion with a simple question, and now you will think it was your idea all along…
*evil laughter*
On the topic of boxing: gloves make the sport far more dangerous. Gloves protect the hands and let you punch a head harder and more often.
Steroids are very much transhuman. Sports like bodybuilding are filled with people completely unlike natural humans because of the changes that large doses of steroids make. There is a large, well-studied body of wisdom about their use in extremes like that. I don’t think there is a significant body of study relating to their responsible use, though. I know there are some uses of steroids during puberty to increase adult height, and that is super cool, but that is along the lines of weird Russian eugenics programs. They are also really good for making injuries heal faster. I would love to see clinical trials for uses like that. That is a transhuman project we could see now.
You touched briefly on Dunbar’s work in this episode. It’s a bit more complex than you explained, and a bit more far-reaching than the single number.
Dunbar correlated brain size with social group size. So the short version is that yes, a posthuman of some sort will almost certainly have a higher number. Dunbar’s number is actually a set of many numbers that match up to various intimacy levels. The levels basically start at 5 and triple from there, more or less (so the famous 150 is actually about 135, but 150 is easier to remember and fits within the variation).
It has been 5 years since I read his book, so bear with me. Your 5 are basically the people you see every day and would openly sob for at their funerals. Your 15 are close friends: people who might put you in their wedding, or whom you would make your pallbearers. Your 50 are basically your community, and your 150 are all of the people who register as real people on an INSTINCTIVE emotional level. It goes out further, but the numbers get super nebulous at that point.
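For the numerically inclined, here is a rough Python sketch of that tripling pattern. The layer labels are just my shorthand for the groups described above, not Dunbar’s exact terminology:

# Dunbar's layers as described above: start at 5 and
# multiply by roughly 3 at each step outward.
base, ratio = 5, 3
labels = ["inner circle", "close friends", "community", "'real people' limit"]

for i, label in enumerate(labels):
    print(f"{label}: ~{base * ratio ** i}")

# inner circle: ~5
# close friends: ~15
# community: ~45            (rounded to 50 above)
# 'real people' limit: ~135 (usually rounded to 150)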
He has a bunch of examples in the book, comparing them to military unit and sports team sizes. There was an interesting one about Amish communities and how they will split a community on purpose to keep the population at a level that can be controlled by peer pressure alone.
Back to the point: the potential bad is that, if you buy his research, it could mean that an infinitely smart human cares deeply about an infinite number of people. So it would be like every Chinese sweatshop worker is your grandma, or whatever, and you care about them like you would care about her in those conditions.
On the upside, you care about everybody super deeply. So as long as nothing bad is happening, that basically means things are sort of great?
Cue the complaint that if you care about everyone, no one is special to anyone. (You can’t comprehend that mental state, and the same thing can be said about the 5 people you already care about.)
This was one of the more interesting discussions of my novel, The Metamorphosis of Prime Intellect, that I’ve found online. (I found your podcast via the website referrers, yo.) I would just like to say that the theory that Chapter 8 was a put-on job by Prime Intellect is canon, as I do intend to write a sequel based on this idea. (OTOH, I’ve been planning to write this sequel since 2004, and it has proven difficult. But then so was MOPI; that book took 12 years from original inspiration to actual incarnation, so TOPI is on schedule.)

The fundamental point I was trying to make, the point which I realized made it possible to write a story about this thing that wasn’t even called the Singularity when I wrote it, is that the fundamental problem with the Singularity isn’t the AI, it’s humans. Humans do not deal well with having anything we want given to us with no effort, a fact that we’ve been able to observe since the first time we started putting crowns on people’s heads and telling them they can do whatever they want.

The thing that saved Caroline’s life also subverted everything she ever believed in. God, work, learning, money: all meaningless now. So she sees nothing left of herself but an animal exploring the feelings her body is capable of. Nothing else matters to her at this point. And as Lawrence, who built PI in the first place and deliberately sat back and let it cause the Singularity instead of trying to stop it, learns from her, she might be at least a little right about that.
Actually, the perceived correlation between artists and misery might be somewhat overstated.
“There are plenty of geniuses who are not mentally ill, and there are plenty of mentally ill people who aren’t geniuses. Sometimes you have the two combined. When you have geniuses who have such prominence, like Philip Seymour Hoffman, or Robin Williams, or John Nash, they make you think that it is more common than it is. One in four people in this country annually have a mental illness that impairs their function. That’s pretty common. The illness is pervasive. Genius is much more rare.”
– Dr. Lloyd Sederer, Medical Director of the New York State Office of Mental Health