We welcomed the one, the only, Eliezer Yudkowsky to the Bayesian Conspiracy for a quick chat.
His latest project is Arbital, an ambitious effort to improve online discussion, starting with online explanation. The goal is to do for difficult explanations – and someday, complicated arguments in general – what Wikipedia did for centralizing humanity’s recounting of agreed-on facts.
The initial demo page is an explanation of Bayes’s Rule.
About Eliezer in his (heavily truncated by me) own words:
One Eliezer Yudkowsky writes about the fine art of human rationality. That Eliezer Yudkowsky’s work can be found in the Rationality section.
The other Eliezer Yudkowsky concerns himself with Artificial Intelligence. Very shortly – on a historical scale, that is – we can expect technology to break the upper bound on intelligence that has held for the last few tens of thousands of years. For more on this see the Singularity tab.
My short fiction, miscellaneous essays, and various other things can be found under Other.
I am a full-time Research Fellow at the Machine Intelligence Research Institute, a small 501(c)(3) public charity supported primarily by individual donations.
A majority of all my written material is currently stored at the community blog Less Wrong.
We asked him questions ranging from his opinions on AI, to rationality, to his popular Harry Potter fanfiction.
Eliezer brought up a few concepts, and you will definitely want these links:
- He spoke about potential types of AI, including Sovereign and Genie.
- Read the full Fun Theory Sequence, which is summarized in the 31 Laws of Fun.
- Eliezer wants you to see this spifftastic Guide to Bayes’s Rule.
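As a quick illustration of the rule those guides teach, here is a minimal sketch in Python. The numbers (disease prevalence, test sensitivity, false-positive rate) are made up for the example, not taken from the episode:

```python
# Bayes's Rule: P(H|E) = P(E|H) * P(H) / P(E)
# Illustrative numbers (assumptions for this sketch): a condition with 1%
# prevalence, a test with 90% sensitivity and a 9% false-positive rate.
prior = 0.01            # P(H): base rate of the condition
p_e_given_h = 0.90      # P(E|H): test sensitivity
p_e_given_not_h = 0.09  # P(E|~H): false-positive rate

# P(E) by the law of total probability
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)

# Posterior probability of the condition given a positive test
posterior = p_e_given_h * prior / p_e
print(round(posterior, 3))  # -> 0.092
```

Even with a fairly accurate test, a positive result only raises the probability to about 9%, because the base rate is so low; this base-rate effect is exactly the intuition the linked guides build up.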
We will be reposting that last link, I’m sure. As always, you can email us at BayesianConspiracyPodcast@gmail.com with your valued feedback, or comment at the subreddit.
Other Relevant Links
Surely You’re Joking, Mr. Feynman!
A Step Farther Out, by Jerry Pournelle
The first Miles Vorkosigan book – The Warrior’s Apprentice
The movie Beneath (it doesn’t deserve all those stars)
Three Worlds Collide (also – audio version)
The Hidden Complexity of Wishes (“With a safe genie, wishing is superfluous. Just run the genie.”)