BREAKING – FTX collapse & EA shockwaves

While recording the next episode we got to talking about the FTX collapse and some of the waves it sent through the Effective Altruism community. We decided to break it out into a separate segment so it can air while it’s still relevant.

FTX: The $32B implosion – a good fast roundup of what the hell happened

Yudkowsky’s essay on FTX Future Fund money that’s been paid out

Yudkowsky’s essay from 14 years ago warning against humans trying to use utilitarianism

A bunch of Twitter links – The reason we have excess money to give; Robert Wiblin; a god complex; conflating utilitarianism with naive utilitarianism; the ultimate take-away

Our recent episode on Virtue Ethics

Erik Hoel criticizing EA

More details about the FTX stuff will be on the next episode of The Mind Killer


Hey look, we have a discord! What could possibly go wrong?

Our Patreon page, your support is most rational, and totally effective. 🙂

Posted in Uncategorized | Leave a comment

174 – Jon Stewart, The Consensus Emancipator

Wes and David from The Mind Killer show up for a special cross-over episode.

We discuss How Stewart Made Tucker

Listen to The Mind Killer 🙂



Also merch!


Rationality: From AI to Zombies, The Podcast

LessWrong Sequence Posts Discussed in this Episode:

none

Next Episode’s Sequence Posts: (for real this time)

Absolute Authority

Infinite Certainty


173 – Oh Lawd, Strong AI is Comin’

Matt Freeman returns to discuss artificial general intelligence (AGI) timelines, and why they are short.

Our primary text is: Why I think strong general AI is coming soon

Other links:

“Let’s think step by step” is all you need

NVIDIA A100 info

The Mind Killer

AI suggested 40,000 new possible chemical weapons in six hours

Yo be real GPT

Hack language models with Ignore Previous Instructions

Very Bad Wizards: Is it GPT or Dan Dennett?

Tesla Bot

“it’s obviously conscious”

Whenever anyone says Elon doesn’t deliver…

HPMoR Christmas chapter extended by AI (w/ guidance)


0:00:00 Intro/Main Topic
2:02:18 Thank the Patron!






LessWrong Sequence Posts Discussed in this Episode:

none

Next Episode’s Sequence Posts:

Absolute Authority

Infinite Certainty


172 – Virtue Ethics

Kerry discusses virtue ethics with us, and how one is to live.

After Virtue, by Alasdair MacIntyre

Pareto efficiency at Wikipedia, slightly less dense Pareto Improvement at Investopedia

Our 23rd episode – Desirism with Alonzo Fyfe

The 12 Virtues of Rationality

I See Dead Kids


0:00:00 Intro/Main Topic
1:42:34 LW posts
2:05:53 Thank the Patron!





Rationality: From AI to Zombies, The Podcast, and the other podcast

LessWrong posts Discussed in this Episode:

But There’s Still A Chance, Right?

The Fallacy of Gray

Next Episode’s Sequence Posts:

Absolute Authority

Infinite Certainty

Posted in Uncategorized | 2 Comments

171 – All About AGP

Tailcalled gives us the down-low on a variation of the popularly discussed (if not that widely liked) transgender typology – autogynephilia (or AGP). Unfortunately, due to technical difficulties, Jace was only on for part of the episode, and our software didn’t appreciate the confusion and dropped his audio entirely.


Tailcalled’s blog: https://surveyanon.wordpress.com/

Some specific links from Tailcalled:

Autogynephilia is not a natural abstraction
(My attempt to explain the thing I said at the end about why we will never get a better understanding of AGP)

A dataset of common AGP/AAP fantasies
(To get an idea of what AGP can be like, with the caveat that most people in the AGP debates focus on different and rarer forms of AGP)

Using instrumental variables to test the direction of causality between autogynephilia and gender dissatisfaction
(Some of the advanced causal inference stuff I’ve experimented with to study AGP; to an extent I now think my idea was flawed – I have other blog posts talking about the flaws – but it might be nice to include as an illustration)

The mathematical consequences of a toy model of gender transition
(I sort of explained that model in the interview but it might not have been very clear, so making it formal helps)

Meta-attraction cannot account for all autogynephiles’ interest in men
(The post that really marked the start of my dissatisfaction with Blanchardians)


0:00:00 Intro/Main Topic
1:26:00 LW posts
1:43:41 Thank the Patron!





Rationality: From AI to Zombies, The Podcast, and the other podcast

LessWrong posts Discussed in this Episode:

Rational vs. Scientific Ev-Psych

A Failed Just-So Story

Next Episode’s Sequence Posts:

But There’s Still A Chance, Right?

The Fallacy of Gray

Posted in Uncategorized | 1 Comment

170 – By George, The Rent Is Too Damn High!

Eneasz drops a primer on Georgist Land Value Taxes. All increases in productivity will be captured by rising rents for as long as mankind exists, unless we find a way to prevent landlords from capturing all the benefits of rising land value. Henry George found the way to do this.

Source post: Book Review: Progress & Poverty

Follow-ups that we barely touched on, but include a lot more:
Part I – Is Land Really a Big Deal?
Part II – Can Land Value Tax be Passed on to Tenants?
Part III – Can Unimproved Land Value be Accurately Assessed Separately from Buildings?
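The rent-capture mechanism above can be sketched as a toy model. Everything here is an illustrative assumption – the numbers, the `split_output` function, and the “wage set by marginal land” framing are ours, not from the episode or from Progress & Poverty:

```python
# Toy Georgist model: with land fixed in supply, competition bids rent up to
# everything a site produces above what a worker could earn on free/marginal
# land, so productivity gains flow to landlords -- unless a land value tax
# (LVT) redirects that rent to public revenue.

def split_output(output, marginal_wage, lvt_rate):
    """Split a site's output into wages, landlord rent, and LVT revenue."""
    rent = max(output - marginal_wage, 0.0)  # land captures the surplus
    tax = rent * lvt_rate                    # LVT takes a share of the rent
    return {
        "wages": min(output, marginal_wage),
        "landlord": rent - tax,
        "public": tax,
    }

before = split_output(output=100, marginal_wage=60, lvt_rate=0.0)
after = split_output(output=150, marginal_wage=60, lvt_rate=0.0)
# Output rose by 50, but wages are unchanged: the entire gain went to rent.

taxed = split_output(output=150, marginal_wage=60, lvt_rate=0.85)
# With an 85% LVT, most of that surplus becomes public revenue instead.
```

In this sketch the wage never moves when output rises, which is the “rents capture all productivity gains” claim in miniature; the LVT changes who receives the rent, not the rent itself, which is why Georgists argue it can’t be passed on to tenants.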

Also in this episode:

Eliezer supports Dignity Points, a scoring system for humanity.

Afghan soldiers were shocked to learn about taxes in the U.S.

0:00:40 Feedback
0:06:15 Main Topic
1:33:48 LW posts
1:57:34 Thank the Patron





Rationality: From AI to Zombies, The Podcast, and the other podcast

LessWrong posts Discussed in this Episode:

The American System and Misleading Labels

Stop Voting For Nincompoops

Next Episode’s Sequence Posts:

Rational vs. Scientific Ev-Psych

A Failed Just-So Story

Posted in Uncategorized | 1 Comment