The Conspirators talk cryonics. Spoiler alert, they are signed up (and signing up soon) for the big freeze. Katrina is a cryocrastinator.
The in-depth cryonics article that sparked this conversation, “Why Cryonics Makes Sense” by Tim Urban: Wait But Why Essay
Steven mentioned the agent he went through: Rudi Hoffman’s Website
The two major US companies: Cryonics Institute FAQ Page; Alcor FAQ page
One of many stories of 23-yr-old Kim Suozzi getting last-minute cryo funding (yay!)
Gwern focuses on Plastination in his usual super-comprehensive style (also, I guess we were mispronouncing it in this episode. Doh!)
At the top of Gwern’s post he links to several estimates for calculated probabilities of revival, but we specifically called out Robin Hanson’s, so here’s a link to his ~6% reasoning.
Eliezer’s referenced post about cryonics being a bad way to erase a person. That’s really just a small part of it though. This post is amazing and fairly short and Eneasz in particular cannot recommend it highly enough.
This American Life’s episode on Testosterone.
Phil Goetz on Why People Want To Die
Rudi Hoffman‘s professional site – great guy who has helped many of us with our cryo-funding insurance.
If you’re in Colorado, shout-out to Shani Sorensen for cryo-friendly Life Insurance
The word for electrocuting people in therapy is ECT or electroconvulsive therapy. A much better example though would be people with seizures.
i am a rationalist and very into incest and bestiality.
I hedged on whether to delete this comment or engage with it. I’ll take a chance and see if you’re open to understanding this better.
I don’t know anyone who’s “into” incest or bestiality, but part of being a clear thinker means having good reasons for what you believe. If the only reason you say it’s bad or wrong is “it’s yucky” then that’s what many rationalists will call a bad reason. Incest is gross to me personally, but that’s not a sufficient reason for me to propose it be morally prohibited.
My one (uninformed) qualm is that if people get really into cryonics, the number of organ donors in the world would decrease. I think people who are okay with being chopped up and distributed after death would also be the people who would be less afraid of cryonics. Personally, I’m signed up to donate my organs in the event of my death, and if my body is frozen on the chance that I can still be saved, several other people who needed my heart, kidneys, etc, might die.
Also, if it catches on fast, it might slow down research in perfecting cryonics and methods that can revive cryonics patients (taking away bodies that would have otherwise been used as test subjects), meaning there is less of a chance for me to be revived before my time runs out.
I think about the organ donations quite a bit. I meant to bring it up on the episode. I’ve been an organ donor since I got my first license, and I think it’s an entirely good thing to do; there are very few good reasons not to. Since I signed up for cryonics, I’ve realized that my organs will almost certainly get frozen with me, which seems like a waste.
The question here is what percentage of people prefer whole-body preservation over neuropreservation (since, as I understand it, you can freely donate your organs in the second case)?
I’m not sure I’ve understood your reasoning behind more people -> less probability of revival; my intuition lies in the opposite direction.
Right, if you just preserve your head, the rest of your body may be donated. Hopefully we can alter the incentives for organ donation to appeal to more people in general, whether cryo is popular or not.
Yes, I think it’s widely agreed that more buy-in means a greater chance for success!
I’m wondering if you know a good place to read about different life insurance options. (If there is a resource that actually focuses on cryonics as an objective of insurance, that would be totally great.)
There are a lot of different options (some of them discussed in the episode), and it’s hard to figure out what the optimal solution is for this case.
So the main point is obviously being able to pay the costs of preservation at the moment of death (whenever it occurs). From this perspective, term insurance kind of sucks, because you’re much more likely to die later in your life, after the term has expired.
Now you can consider several solutions for that – either buying whole life insurance (which, as I understand, is quite a bit more expensive) or going with an option like Eneasz’s (which is also not great due to inflation, as was mentioned).
Another option could be a mixed life insurance/investment product (not sure how that works).
Or buying basic term insurance and investing money in an index fund (for example), with the goal of being able to pay the costs of preservation directly once the insurance term runs out. But this approach also has a lot of caveats to consider…
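That last strategy can be sketched with some back-of-envelope numbers. Every figure below is a hypothetical assumption for illustration (actual preservation costs, premiums, and market returns vary widely), not advice or a quote from anyone in the thread:

```python
# Rough sketch of the "term insurance + index fund" strategy.
# All numbers are hypothetical assumptions for illustration only.

preservation_cost = 100_000   # assumed cost due at time of death, USD
monthly_invested = 300        # assumed monthly index-fund contribution, USD
annual_return = 0.05          # assumed real (inflation-adjusted) return
term_years = 30               # assumed length of the term policy

balance = 0.0
monthly_rate = annual_return / 12
for month in range(term_years * 12):
    # grow last month's balance, then add this month's contribution
    balance = balance * (1 + monthly_rate) + monthly_invested

print(round(balance))                     # nest egg when the term expires (~250k)
print(balance >= preservation_cost)       # does it cover preservation by then?
```

Under these particular assumptions the fund comfortably exceeds the preservation cost by the time the term policy lapses, but the result is very sensitive to the assumed return and contribution, which is exactly the kind of caveat mentioned above.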
So if you know about any decent exploration of this topic, I would be really glad if you can share a link here.
Thank you!
There are also other things to consider (like the growing costs of preservation: http://lesswrong.com/lw/8fe/cryonics_costs_given_estimates_are_low/)
Hey Harcisis,
When I obtained a policy through Rudi, he was very detailed about my many options. I think a phone call or email to his office would be your best bet for good information.
I get the following response when I try to open the referenced address:
“Access to this server is forbidden from your client”
Also, I live in Europe (at least for now), and as I understand it, Rudi works only with US customers?
Independent sources to read would still be great, too.
When I finally get around to doing this, I’ll put more details here.
First, thanks for your great podcast. I’m really enjoying each episode!
You discussed that most people in the developed world would in principle be able to afford cryonics, but you didn’t really discuss the ethical consequences of spending a substantial amount of money on it. Like most rationalists, I consider myself a consequentialist and try to live as an Effective Altruist. So on the question of whether I should sign up for cryonics, the main issue isn’t whether I can afford it, but rather whether I can justify spending $100k on a single-digit probability that I might survive.
To illustrate this with a simple calculation: there is a large amount of evidence suggesting that providing mosquito bed nets saves the life of a child per $2,000 spent. So instead of paying $100k for cryonics, I could donate this money to AMF and save 50 children from dying of malaria. Additionally, this would prevent many more non-lethal malaria infections, which also create suffering. Let’s be generous and assume there is a 5% chance of surviving if I sign up for cryonics. I would therefore be valuing my own life as much as the lives of 50 * 1/0.05 = 1000 African children who could be saved instead. Additionally, the evidence for the effectiveness of AMF is very strong, while the probability that cryonics might work carries much larger uncertainties. The number of lives saved, suffering prevented, or additional QALYs generated could be even larger with a less conservative (i.e., riskier) choice of charity.
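The arithmetic in that comparison can be laid out explicitly. All figures are the commenter’s own assumptions (the $2,000-per-life estimate for AMF and the generous 5% revival probability), not established facts:

```python
# Back-of-envelope version of the comment's EA calculation.
# All numbers are the commenter's assumptions, not established facts.

cryonics_cost = 100_000       # assumed total cost of cryopreservation, USD
cost_per_life_saved = 2_000   # assumed AMF cost to save one child's life, USD
p_revival = 0.05              # generously assumed probability cryonics works

# Lives that the same money would save if donated instead:
lives_saved_instead = cryonics_cost / cost_per_life_saved
print(lives_saved_instead)    # 50.0

# Paying the full cost for only a 5% chance at one life means implicitly
# valuing that life at (lives forgone) / (probability of revival):
implied_valuation = lives_saved_instead / p_revival
print(implied_valuation)      # ~1000 lives
```

This matches the 50 * 1/0.05 = 1000 figure in the comment: the lower the assumed revival probability, the higher the implicit valuation of one’s own life.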
However, I think there might be a way out of this dilemma. If I assume that EA doesn’t take over the world and that extreme poverty is still an issue in the far future when I wake from my cryo sleep, I could presumably pay back the equivalent amount of QALYs that I destroyed with my selfish decision to spend so much money on cryonics. However, to break even in expectation I would have to pay back at least the cryonics cost divided by the survival probability over my second life, assuming that future charities are at least as cost-effective per QALY as today’s. These are quite a few assumptions, so there is a high likelihood that I won’t be able to prevent more suffering in the future than I could now.
Additionally, since investing in helping extremely poor people today has a very high return on investment, I find it unlikely that I would be able to achieve more good by signing up for cryonics than by donating this money today to the most effective charities.
I’m glad you like them! 🙂
I am of the opinion that people are allowed to prioritize their own existence over that of others, and the existence of close loved ones over that of far-off strangers, etc. In terms of straight utilitarian calculations, this doesn’t have much support (though there is some). However, in terms of practical utilitarianism, the case is far stronger. As was pointed out in “Morality should be Moral”, spreading one’s moral system may be immoral if one’s moral system brings misery to those who adopt it. If a moral system demands that its adherents kill themselves in the prime of their lives in order to donate their organs to save 20 other people, that may in fact be immoral, regardless of the QALY calculations. This is the personal-morality reason to sign up for cryo even as an EA.
Furthermore, a moral system that kills its own members will fail to be passed on in the human population, regardless of how good it may be. If Effective Altruism demands I kill myself (which is literally how I interpret the argument as given), I will simply not be an Effective Altruist. Generally, given a choice between suicide and signing up to be a Nazi, most people will get busy on justifying why Nazi-ism isn’t actually THAT bad. Which is why I consider it very important to never force such a moral choice on someone. It will rapidly make the world a *worse* place. This is the systematic-strategy reason to not oppose cryo if you’re an EA.
Finally, as an EA, it is practically guaranteed that you are still doing vastly more good in your life than 99% of the population *even after* reducing your EA giving budget by the amount needed to fund cryo. You get a pass here. No one can be perfect, no one is expected to be perfect; don’t kill yourself trying to be perfect. You seem to put no value on your own existence, which I think is a mistake. The future would be better with people like you in it. Please join us there. 🙂
This is not rationality, but is related to this topic. Arjen Anthony Lucassen created a rock opera called “Lost in the New Real” where the protagonist got himself frozen and was brought back sometime in the future. The album goes on about what times are now like for people. Give it a listen, you might enjoy it.
Link: https://www.youtube.com/watch?v=MsXmtjLMH18
I read a series of books (The Bobiverse Trilogy) wherein the protagonist is put into cryo for about 100 years, and is woken up by a fundamentalist theocratic regime to be the AI in a von Neumann probe.
Pingback: 146 – What Is A Rationalist? | The Bayesian Conspiracy