Worried about AGI running amok in the near future? John Wentworth thinks we can align AI. In 10-15 years. With greater than 50/50 probability. And he has a plan!
We discuss The Plan, its merits, and how it has progressed over the past year.
Primary Sources:
The Plan
The Plan – 2022 Update
Also discussed:
The Basic Foundations for Agent Models sequence
The Telephone Theorem
The “Minimal Latents” Approach to Natural Abstractions
Help With The Plan, Get The Skills, Save The World:
Read The Embedded Agency Sequence
Join SERI MATS! (see also SERI MATS tag on LessWrong)
Apply for funding from The Long-Term Future Fund
56:05 – Guild of the Rose Update
57:36 – Feedback
58:20 – LW posts
1:19:09 – Thank the Patron
We now partner with The Guild of the Rose. Check them out!
Hey look, we have a Discord! What could possibly go wrong?
Our Patreon page – your support is most rational, and totally effective. (also merch)
Next Episode’s Sequence Posts: