2 – What is this “Sanity Waterline” you speak of?

Can the Sanity Waterline be raised? Should it be? And just what does that mean anyway? Based on the post “Raising The Sanity Waterline”.

Also relevant to today’s discussion: Joy In The Merely Real, Christian Bale Did Nothing Wrong, and You Kant Dismiss Universalizability (particularly section III)

Link to the TAM discussion panel.


9 Responses to 2 – What is this “Sanity Waterline” you speak of?

  1. JJ says:

    I think the main issue is that people are so resistant to change, and once they get into a pattern of gathering data from one place, they stop looking for others. My parents were actually very good at teaching me logical thinking skills; however, when I was young, they were both staunch young-Earth creationists. Now my father, at least, believes in evolution, so I know he can make logical choices, but he is still very Christian.
    I considered myself Christian until around last year, which coincided with my leaving home and going to college. When I lived with my parents, the only media I consumed was strictly religious and conservative, and I really believe that crippled me. The things that gave me my strongest convictions in God were documentaries and seemingly sound scientific explanations my father showed me. Now that I no longer live with my parents, I consume much more liberal media and can weigh it against what I used to know to come to rational decisions. I don’t believe I was irrational before, just coming to a false conclusion because of a lack of data. My father only consumes conservative media, and so he can only come to the conclusions that it lets him come to.
    I wish I could talk to my parents about it, but I know they would never listen to me outright, because they definitely don’t see me as an adult. In that way, I guess I’m like Harry trying to talk to Mr. Verres.

    Thanks for the awesome podcast, guys, it has really been making me think!

    • BayesianAdmin says:

      Thank you for the comment!

      I think a large part of practical rationality is the skill of knowing how to find and assess various pieces of information and sources. Also, I think we concluded something that’s basically correct when we said that one typically needs a motivation to want to change their beliefs.

  2. David youssef says:

    This is a bit of a rambling post, and I know that already. I hope you can forgive me, but it comes at a time when I am contemplating the same idea.

    I’m sure many of you have read the book Thinking, Fast and Slow by Daniel Kahneman. There he describes how, if the human brain is a biological computing substrate, its energy usage depends on which operating system is running at a given moment. The part of us that does rationality is really, really energy-intensive, whereas our illogical, fast brain is really good at conserving energy on a daily basis. This information taken on its own means that you make people rational not by imbuing them with knowledge but by building a set of habits, techniques, and really simple rules of thumb. When looking from the outside at someone who has done this, what you notice is that it doesn’t actually make them appear “smarter” on the surface. Applying this will not raise your maximum level of complex problem solving, but every decision you make will be slightly smarter and wiser than those of the people around you who are not applying these skills. You guys mentioned at the beginning of this podcast that for every big catastrophic failure there is a less exciting failure earlier; I think that for every major success there is a smaller, much less obvious success at the beginning. Imbuing a set of rational habits in people will make very little appreciable difference on the surface, but when those big decisions do come up, you are better trained and better prepared to make them. So whether someone is rational is really difficult to tell except in those moments where they are most tried.

    Now I’m going to get into a little bit of esoteric weirdness. I figure this is as good a place as any to get some criticism. Let’s take the idea of operating systems on the brain another step further. We know, because of the Turing completeness theorem (at least I think that’s the right proof), that anything that can compute can theoretically compute anything. Extending this analogy, we can say there is probably an infinite, or at least arbitrarily large, number of mind states, with different benefits, detriments, and biases built in, that a human being can access. Looking at what I wrote above about how a rational person would look from the outside, you notice that while many people fit this criterion, one group that consistently fits it is people in esoteric Eastern philosophies, Buddhist and Taoist monks specifically. The state of mind known as enlightenment in Eastern philosophy is closest described as a flow state. What if flow state is a different operating system that happens to sit in a weird local minimum of energy usage? It doesn’t need to be significantly higher in cognitive ability than your fast brain to essentially make you wiser at all times, especially if you can maintain it for longer periods than rational thinking. I feel like a good way to attack this problem would be from two angles at once: one angle would be moving people philosophically and psychologically toward flow state / enlightenment, to raise their minimum and their baseline. The second would be to use classic rationalist thought, if such a thing really exists, and training to build a set of habits and heuristics.
    So part of this training would include asking yourself “What do I know, and how do I know it?” every time you are introduced to a problem. This immediately gets you thinking on a meta-level, not only about your evidence but about where you got it. And from the other side, meditating and learning to get the internal feel for when you’re lying to yourself (which you guys mention in your next podcast) is really a form of meditation. Hope this wasn’t too long and crazy. Really enjoying it so far, and I will be anticipating your next podcast with excitement.

  3. David youssef says:

    I also wanted to talk about why I think raising the sanity waterline doesn’t necessarily mean less religion or faith. A lot of people in the West view those who are religious as willing to either ignore evidence that disagrees with them or add fake evidence. A lot of Eastern faiths take a different approach: they see the same data set as any rationalist atheist but use a different filter for interpreting it. If their filter still gives them effective results but within a different mental framework, is their faith less rational?

    • Eneasz Brodski says:

      A lot of Western faiths also claim that they aren’t ignoring or adding data, but simply interpreting existing data with a different filter. Ken Ham claims this. If your filter leads you to claim that dinosaurs existed alongside humans in the recent past, there’s a problem with your filter. Filters should be examined the same way any belief would be, and so far I haven’t met any faith-based filter that is clearer than a non-faith one.

      • David youssef says:

        I’m willing to agree that a faith-based filter cannot be cleaner than a similar filter without faith. But reality is more nuanced than that. For example, you bring up a breed of American Christianity that, in my opinion, is completely making up data, not just reinterpreting it, but literally ignoring the parts of the data that disagree with them and making up a funny story. Compare this with the faith of Taoism. Those who follow the Tao Te Ching are taught to appreciate reality and love it. Yes, we do believe that there is a force that moves things in certain directions. But part of that belief is that you can discover which way the universe wants you to go by looking at what the universe is honestly doing already. A faith-based way of saying this is that I believe the universe is a living example of the Tao. There can be no higher calling than learning to understand the universe. Self-deception only brings you further away from the truth, so you could even say the Taoists were worried about biases in our thinking thousands of years ago.

        Another example is Hinduism, where they specifically say that this universe we observe is not the only one; in fact, there is a spiritual plane. But they make it very clear that science and reason are the tools we use on this plane, and that ignoring them is just falling further into the illusion cast over the minds of men, which they call Maya.

        I agree that spiritual filters will never allow perfect clarity. When we talk about raising the sanity waterline, are we going to indulge in mental fantasies of making everyone perfect rationalists? I think that letting go of all faith is much more difficult for a lot of people than it is for a majority of the rationalist community, for whatever reason. Given that, I hope that maybe we can push people into changing their spiritual filters to be more reasonable.

        I bring this up because I feel like a lot of religions actually do have really great wisdom buried inside them, and too quickly and too often rationalists are willing to throw out the baby with the bathwater. Also, given the above belief that most people need some kind of religion to structure their moral lives, instead of discouraging it or disparaging the lack of rationality, we should ask ourselves which faith(s) have the greatest effect on lowering the sanity waterline, and encourage people away from those sets of beliefs.

        • Eneasz Brodski says:

          It seems to me that you are ignoring the fundamental question of rationality: “Why do I believe that what I believe is true?” What is the method you used to arrive at your beliefs? Is that method sound?

          Outside of extreme situations, the method one uses is more important than any particular belief, because if you have a good method, your beliefs are self-correcting. On the other hand, if you have correct beliefs but a bad method for acquiring and correcting beliefs, then your beliefs will become more and more flawed over time, until you end up with something like religion.

          Thus the question: “WHY do I believe what I believe?” The question that forces you to inspect your method of belief acquisition and correction. Anyone who cares enough about the truth and about understanding the universe will eventually ask themselves this. And if they pursue that process honestly, they will eventually discard mysticism and the supernatural. Afterwards you can judge which religion has the greatest net positive (or least net negative) and choose to promote that one. Liberal Christianity is certainly far better than fundamentalist Christianity. But I would still feel like a liar for endorsing any system of belief that relies on mysticism and supernaturalism. It may be a bullet you’re willing to bite, if you believe that the majority of people cannot live without a religious framework. I think you underestimate people.

  4. clacke says:

    In whatever video I heard NdGT bringing up the 7%, his point was clearly along the lines of “7% of the brightest people are theist, so you clearly can’t say that religious people are plain stupid. Convince 100% of the clever people and then you might have a case. Atheism is not as obvious as gravityism.”

    • BayesianAdmin says:

      That was the impression in my (Steven’s) memory as well. I liked Eneasz’s charitable interpretation, but I had the idea in my head somewhere that Tyson was being sort of wishy-washy with it.
