Keeping Ideas at a Distance Using Probability

We often talk about ideas by using phrases like “I believe X.” But what do we mean when we say that we “believe” in an idea? Do we mean that we have 100% confidence that the idea is true?

Let’s hope not. Even statements that we would all say we very strongly believe, like “tomorrow the sun will rise” and “I am not a robot”, should not be assigned 100% probability. While we can be very, very, very certain that the sun will rise tomorrow and that we are not robots, we cannot be absolutely 100% certain. Tomorrow the sun could be destroyed by some process (perhaps one that we don’t even understand), and there is a non-zero probability that we are the result of some extraordinarily secret and amazingly advanced government robotics project. It is simply not reasonable to view belief as a claim of absolute certitude.

Phrases like “I believe” can also be problematic because they can imply group membership. For instance, if you say “I believe that our government should use less market regulation”, but you actually view this as a statement identifying yourself as a libertarian, then it may be difficult for you to engage in rational truth-seeking debate. Arguments in favor of regulation may now be processed by your brain as attacks on your in-group, which means that you may feel a strong urge to deny them no matter how solid they are. And admitting that regulation is a good idea may require an adjustment to your thoughts about who you are, or a reconsideration of how reliable the opinions of other members of your group are. A lot more may be at stake in the argument for you than just the truth about the facts of the case.

So what is a productive way to think about situations where our brain says to us “I believe”? Perhaps we can view this as a claim that X is likely to be true. If we take this perspective, then there are some methods that we can use to get a rough idea of the probability that we are implicitly assigning to X.

Imagine a stranger comes up to you and offers to make a bet with you. You will win one dollar from the stranger if X is in fact true, and you will pay the stranger D dollars if X turns out not to be true. We will assume for the purpose of this thought experiment that an all-knowing oracle will instantly provide you both with the correct answer.

Now, the question to ask yourself is: what is the largest value of D (the number of dollars you owe if X turns out not to be true) such that you would be willing to play this game? If you claim that a certain “I believe” statement corresponds to a 99% chance that X is true, and yet you are unwilling to risk even $20 on this game, then your thinking has probably gone wrong. According to that 99% probability assignment, you will be taking a 1% chance of losing the $20, but will have a 99% chance of making $1, which means that the expected value (i.e. average value) of the game is plus 79 cents (so games like this will lead you to profit a decent amount, on average). You are also only risking $20, which for many people is a small enough amount of money that there won’t be any noticeable life consequences from losing it. So if you are not willing to put even $20 on the line for this bet, then it is likely that one of the following things is true: (1) your estimate of a 99% chance of X being true does not really reflect your implicitly believed probability, (2) you are being unreasonably risk averse, or (3) $20 has substantial value to you, which is why you aren’t willing to risk losing it.

On the flip side, suppose that you took your “I believe X” statement as reflecting only a 60% confidence that X is true. Now the gamble is much more likely to go against you, and even if you only put up $1.20 against the other person’s $1.00 (rather than $20 as before), the expected value of the game is only plus 12 cents (0.6 × $1.00 − 0.4 × $1.20 = $0.12).
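
For readers who want to check the arithmetic, here is a minimal Python sketch of the expected-value calculation for the two cases above (the function name is mine, not something from the post):

    def bet_ev(p_true, d_stake):
        # Expected value of the bet: win $1 if X is true (probability p_true),
        # lose d_stake dollars if X is false (probability 1 - p_true).
        return p_true * 1.00 - (1.0 - p_true) * d_stake

    print(round(bet_ev(0.99, 20.00), 2))  # 0.79 -> the "plus 79 cents" case
    print(round(bet_ev(0.60, 1.20), 2))   # 0.12 -> the "plus 12 cents" case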

Hence, we see that the amount you would be willing to bet is an implicit measure of how strong your belief really is (though it will also necessarily be influenced by your risk aversion). Thinking about these bets won’t yield exact belief probabilities, but it can help you determine whether the implicit probability you assign to X is more like 99%, 99.999%, or 60%.
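
Setting the expected value of the bet to zero makes the connection explicit: risking D dollars to win $1 breaks even exactly when your probability for X equals D / (1 + D), so the largest stake you will accept implies a probability. A minimal sketch of this inversion (the helper name is my own):

    def implied_probability(d_max):
        # Solve p * 1 - (1 - p) * d_max = 0 for p: the belief probability at
        # which risking d_max dollars to win $1 exactly breaks even.
        return d_max / (1.0 + d_max)

    print(implied_probability(99.0))  # 0.99 -> risking $99 implies ~99% confidence
    print(implied_probability(1.5))   # 0.6  -> risking $1.50 implies ~60% confidence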

Another way to try to convert “I believe” statements into probabilistic statements is to ask yourself, “how surprised would I be if X turned out not to be true?” If the answer is, “about as surprised as I would be if I tried to guess how a spun coin would land ten times, and got all ten correct”, then you’ve got a probabilistic estimate of about 1 in 1000. Different levels of surprise can be thought of as roughly corresponding to probabilities.
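
The coin benchmark is easy to verify: calling ten fair coin spins correctly in a row has probability (1/2)^10, or about 1 in 1000:

    # Probability of calling 10 fair coin spins correctly in a row.
    p = 0.5 ** 10
    print(p)      # 0.0009765625
    print(1 / p)  # 1024.0 -> roughly "1 in 1000"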

Yet another handy trick is to ask yourself, “how often, when I believed things this strongly, was I correct in the past?” Of course, you should only count examples where you quite definitively found out the true answer afterwards. If in the past when you’ve “strongly believed” something, it turned out to be true about 90% of the time, then you now have a probabilistic estimate of sorts about future “strong beliefs”.
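
As a sketch, this track-record method is just a hit rate over past strong beliefs that were later definitively settled (the data below is invented purely for illustration):

    # Hypothetical record: True where a past "strong belief" turned out correct,
    # counting only cases whose truth was later definitively settled.
    past_strong_beliefs = [True, True, True, False, True, True, True, True, True, False]
    hit_rate = sum(past_strong_beliefs) / len(past_strong_beliefs)
    print(hit_rate)  # 0.8 -> treat ~80% as the rough probability for similar future beliefs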

None of these methods is foolproof or totally rigorous, but they can still be very useful. One of the advantages of trying to convert your “I believe X” statements into “I’m about P% certain that X is true” statements is that doing so removes some of the ego investment we may have in X being true. In the latter case we are openly admitting that X might turn out to be false, even if our estimate of the probability P was very well calculated. What we are claiming to believe now is not X itself, but the probability of X given the information that we are aware of. It will now be easier psychologically to face up to the truth if X turns out to be false.

Converting to rough probability estimates is also useful because it forces us to attempt to pin down what we are really claiming. If we are being ambiguous in our claim, we are more likely to realize this when we think about how much we’d be willing to bet on the claim. Ambiguity is either going to increase our uncertainty in the answer, or make the bet unverifiable.

Thinking in terms of probabilities also helps avoid the biases that can come from implied group membership. If we say that something is likely, we probably won’t feel as though we have just made a claim to belong to a certain group, and others will probably not hold us to such a claim either.

A final benefit of making statements in terms of likelihood rather than belief is that it makes it easier to change our minds in front of others. If I say “I believe X”, then a person makes an argument against X, and I flip to saying “You’re right”, I may seem like I lack strong convictions, am easily persuaded, or believe things for bad reasons. These are all traits that I may be judged for having. On the other hand, consider how it sounds if I say “Based on the information I have seen, I think it is probable that X is true”, then someone provides an argument against X, and I reply with “Good point, taking that information into account, X doesn’t seem as likely.” In this case, I am more likely to end up sounding like a careful thinker who updates his beliefs based on the new evidence he encounters.

When you say “I think it is quite likely that less market regulation would be good for the U.S. in terms of GDP growth”, that statement is more precise, more self-reflective, and more likely to lead to productive discussion than if you just say “I believe that our government should use less market regulation.” Converting to probabilities, even rough ones like “very likely” or “a bit better than a 50% chance”, can lead to more productive discussions and less bias. So ask yourself:

  • What probability do I really assign to this statement?
  • How much would I bet on this?
  • How surprised would I be if this turned out to be false?
  • How often have I been wrong in the past when I felt this strongly?

Influences: Robin Hanson, Eliezer Yudkowsky, Divia Melwani

Comments



  1. I’ve found these self-questions useful when trying to make disciplined poker decisions, both for avoiding impatient mistakes and for overcoming fear when making aggressive, risky plays.

  2. Awesome article 🙂

    I like how you combine the psychological principle of man’s natural tendency to strive for “consistency of beliefs” (i.e., once someone publicly announces something, they are significantly more likely to defend that belief whether they fully feel that way or not, thus deepening their convictions) with expected value.

    Something fun: understanding this principle can be helpful when trying to persuade someone of a certain viewpoint. Think along the lines of classical conditioning / association 🙂