Podcast: Play in new window | Download
Slate Star Codex article – You Kant Dismiss Universalizability
Wikipedia page on the Revelation Principle
Eliezer’s post on LessWrong – Ends Don’t Justify Means (Among Humans)
Another LessWrong post – Protected From Myself
Louis CK bit on Lying (1:46 long)
Short book by Sam Harris on Lying (six minute preview of audiobook)
Scott Alexander post on LessWrong – The Worst Argument in the World
The Prevalence of Lying in America: Three Studies of Self-Reported Lies
Kim B. Serota, Timothy R. Levine, & Franklin J. Boster
Eliezer_Yudkowsky 05 July 2009 comment on Not Technically Lying by Psychohistorian
Are animals capable of deception or empathy? Implications for animal consciousness and animal welfare
S Kuczaj, K Tranel, M Trone, H Hill
Animal Welfare. Special Issue 10:161- 173 (2001)
Comadena, Mark E. “Accuracy in detecting deception: Intimate and friendship relationships.” Annals of the International Communication Association 6.1 (1982): 446-472.
Comadena’s study finds that friends and spouses detect deception better than acquaintances do, but friends detect it better than spouses. So closeness helps your lie-detection ability up to a point, and past that point it starts to hurt instead of help. Some other studies show no significant difference between the detection rates of strangers and people in close relationships.
Listening to this episode, it struck me that you stayed pretty close to one model: you talk to one or a few other people and work out the different reasons you have for saying things you believe to be true or false. That wasn’t all of it, but it was a big focus.
I think there could be a bit more to this subject.
When it comes to everyday life, with people you actually meet a lot, I think your reasoning had a lot to say.
One of your main reasons for telling the truth was that it gives you a trustworthy reputation, but there are several cases where this doesn’t really work. If the people you are talking to have a very different model of the world, and have a hard time separating the truth about the world from their own beliefs and yours, then they might very well think of you as a liar even though you say what you believe and that belief is in fact true.
In other cases you may have no history together, and other factors may completely determine the reputation an individual believes you to have. This can go both ways: people might believe strongly in your honesty even though you often lie, and if I remember correctly there is even a psychological effect at play here where people are less likely to spot lies the further into a con they go. (I think R. Cialdini talks about this.)
There are also cases where many easily checked or reasonable-sounding truthful statements are mixed in with a small number of reasonable-sounding lies or half-truths, to try to shift beliefs on critical issues.
To a Bayesian, truth and falsehood are just limiting cases of the highly likely and the highly unlikely, and any discussion of lies should address what lies in between.
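To make that framing concrete, here is a minimal sketch (my own illustrative numbers, not anything from the episode) of a listener’s Bayesian update on a claim asserted by a partially honest speaker. The resulting credence lands strictly between 0 and 1, which is exactly the “in between” territory:

```python
# Minimal Bayesian update sketch (illustrative numbers, not from the episode).
# prior: listener's credence that the claim is true before hearing it asserted.
# p_assert_if_true / p_assert_if_false: how likely the speaker is to assert
# the claim if it is true vs. if it is false (i.e. to lie).
def credence_after_assertion(prior, p_assert_if_true, p_assert_if_false):
    weight_true = prior * p_assert_if_true
    weight_false = (1 - prior) * p_assert_if_false
    return weight_true / (weight_true + weight_false)

# A mostly honest speaker: asserts true claims 90% of the time,
# and lies (asserts false claims) 30% of the time.
posterior = credence_after_assertion(0.5, 0.9, 0.3)
print(posterior)  # 0.75 — more confident than before, but nowhere near certainty
```

Only in the limiting case of a perfectly honest speaker (who never asserts false claims) does the posterior hit 1; everywhere else you are trading in degrees of belief, not binary truth.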
There are also the techniques of misdirection, where you don’t answer the question or stay on the subject but just pick something else. This can sometimes be done entirely without lying and entirely without telling technical truths.
Even in a completely honest conversation, the stakes and values matter.
Consider a chess game where you’re playing against your friend and they tell you to make a move that exposes your king, promising they won’t use the opportunity. In this case they are either lying or not true to their own motives as far as you understand them.
In this case you are often better off not taking their offer, unless you know what their actual motive is and that it binds them to their promise, or unless you can recover (say, you are playing two simultaneous games of chess and you both make this kind of move at the same time). The analogy may not be very strong, but I hope the point is clear.
You also tried some thought experiments in which you imagined that you could not lie. I found this a bit curious. Which probabilities in your model were you trying to shift? What would the world look like if you couldn’t lie, and what would enforce it? I think you may have been a bit quick to draw conclusions here.
One way I might imagine this happening is if your entire brain state were known from the outside. In that case you could technically lie, but the entity reading your brain state would know every time you said something that didn’t match your model of the world. They wouldn’t know whether what you said was true of the world, though, so the way to ‘lie’ in such a world might be to completely deceive yourself when no one is looking, if that were possible. The question is whether this counts as lying =p Maybe it’s some sort of temporal lie, where your past self is lying and your present self is not. The concept of a lie breaks down a little.
I’m not sure a world where lying is impossible is even consistent. I think you might need entities able to completely model other agents and those agents’ restricted environment. Could you imagine a fly you couldn’t fool? But the exact definition of lying used is important here.
So as usual this all comes down to which possible futures you think the world is being pulled toward. Is it possible to build a computer system that couldn’t output statements its own model considers false? Could it cheat by not explicitly calculating probabilities for things and saying them anyway? There is a lot to unpack, but I won’t do it here.
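As a toy illustration of that question (my own sketch, with an assumed cutoff, not a real proposal), here is an agent whose output channel is gated by its own belief model — it can stay silent, but it cannot assert a statement its model scores as likely false:

```python
# Toy sketch (my own illustration): an agent whose assertions are gated
# by its own belief model. It may refuse to speak, but it cannot assert
# a statement its model considers likely false.
THRESHOLD = 0.5  # assumed cutoff: below this, the model "considers it false"

class GatedReporter:
    def __init__(self, beliefs):
        # beliefs: statement -> subjective probability that it is true
        self.beliefs = beliefs

    def assert_statement(self, statement):
        p = self.beliefs.get(statement, 0.0)  # unknown claims are refused
        if p < THRESHOLD:
            return None  # refuses: its model considers the claim false
        return statement

agent = GatedReporter({"the sky is blue": 0.99, "pigs can fly": 0.01})
print(agent.assert_statement("the sky is blue"))  # "the sky is blue"
print(agent.assert_statement("pigs can fly"))     # None
```

Note that this gate only blocks outright false assertions: the agent can still mislead by selectively choosing which true statements to make, which is exactly the kind of cheating the question above worries about.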
Finally, I’m a bit confused by Eneasz’s comment that he considered the natural world to be cute and non-deceiving (paraphrasing, and maybe not remembering correctly). I thought you guys had discussed such things before, and I feel Katrina would have rolled her eyes a bit. 😉
There might be more to say about this but I’ll end here.
In the end I guess I agree that truth is good, but there are also good reasons for lying.
Like when you invent a possible ultimate death machine (nukes, superhuman general artificial intelligence, etc.) – or maybe we should even lie about the possible existence of such things. I certainly hope some kinds of things would be, or are, kept secret. And in some such cases you don’t tell half-truths – you lie!
Btw I’ve been following the show since the beginning and I really do enjoy it.
This is the first time I’ve taken the time to comment; I hope you can use it, even if just to pick it apart =)
Even though you are not experts in most of the things you talk about, you always take at least a good first-order stab at things, and you make the subjects spark an interest. That in turn often makes me want to read more.
Just remember that the power ladder sometimes goes very far.
Keep up the good work
Thanks for the comment and for the encouraging closing remarks. For the most part, we’re people with an average amount of knowledge in the topics we cover, though we try to do a little studying or bring on more knowledgeable people. I think we’ll engage your comment in more detail on the air sometime, but I’ll respond briefly to a couple of things.
One, it is hard to anticipate and control the beliefs of other people from individual encounters (you mention that someone might think you’re lying even if you’re not), but I had meant to spend more time on the non-reputation reasons for being honest. The main reason I stick to honesty is that it is way less effort! Sure, there’s some awkwardness to overcome, but I’ve been close to people who went through phases where they lied more or less compulsively, and juggling all of that wears a person down.
Two, I totally agree that if some way to destroy the world were discovered, I’d hope the findings weren’t published online or shared at all. Did you read Eliezer’s short story Three Worlds Collide? There’s something in there that relates to this, but I won’t spoil it. 🙂
Thanks for the response. (And thanks for even reading my huge response :))
I guess I can support some sort of heuristic rule of thumb like: avoid lying if you can, unless winning is really important, and even then be careful.
(Winning = achieving some important goal you actually care about.)
I’m sympathetic to some form of consequentialism so I think of simple rules mostly as describing the best action in limiting cases.
Simple rules can be very useful though just because they are so compact.
I’ve listened to Three Worlds Collide through the HPMOR podcast, along with the rest of that podcast. Also most of the Sequences. I’m also an aspiring physicist, so the concept hits close to home. =p
Point taken about the compulsively lying friend. I’ve had a friend like that, and people got hurt. Still, if it’s compulsive, maybe the rule wouldn’t work very well for them. But maybe I’m taking you too literally =p that kind of lying is not what I would propose either.
I just realized that you were the one mostly defending honesty. To be clear I think I agree with that. Thinking of lying as good or bad is a bit of a false dichotomy. Lying is hard and unpleasant but also sometimes the only tool. The question is under what circumstances it is important enough to use the tool.
Interestingly the point Eneasz raised about submitting fiction and ticking boxes for minority status around 34 minutes in is an area where I have an almost exactly mirrored position. I lie and claim I don’t fit into those categories because I have a strong opposition to normalising discrimination by factors of birth. I feel like this is a case where lying is morally acceptable as a way to minimise harm.
I’ve also lied in the interest of not losing perks due to factors that are outside of my control. When I signed up for cryonics, I signed a piece of paper that said I had a religious objection to being autopsied so that it would essentially bar someone from ruining my chances at being preserved for that reason. If there was a formal document that said I had a secular objection, I’d have signed that instead, but I didn’t want to get screwed just because I wasn’t religious.
Oh interesting! That is very admirable, I wish I could buy you a drink or something. Have you been published yet?
This episode may have (sort of) changed my mind. I would have previously said “lying is in itself immoral,” but I now realize that what makes lying wrong is that it is an action that generally causes suffering, sometimes immense. Applied to the axe-murderer example, we clearly see that lying creates an outcome that minimizes suffering.
This opens the door to some potentially pervasive convenient lie causing the least suffering (e.g. religious worldviews), but living with a false model of reality seems to cause a great deal of suffering for oneself and others.
I still think that radical honesty is a correct goal, and the discomfort it may cause is the responsibility of the discomforted, as it is a manifestation of convenient delusion.
Two works of fiction that explore lying in interesting ways:
1. The Truth Machine by James Halperin
Explores the implications of a 100% trustworthy lie detector that works at a distance and is accessible to everyone. What are the implications of a society without lies? How does politics work? Business negotiations? Mental health? And what does the initial transition from a society of liars to a society of truth-tellers look like?
2. Remembrance of Earth’s Past (3 books: The Three Body Problem, The Dark Forest, and Death’s End) by Cixin Liu.
Humanity is under attack by space-faring aliens with vastly superior technology, the ability to halt Earth’s technical progress, and the ability to spy on all human activity, communications, and electronic systems. The sole advantage for humanity: our ability to deceive.