No matter how intelligent, rational, or knowledgeable you may be, you are going to be wrong pretty regularly. And when dealing with complex topics like politics, people, or philosophy, you’ll be wrong far more often than that. Even if you’ve freed yourself from thinking in true/false dichotomies, and made the effort to convert your beliefs into probabilities or degrees of belief, you’ll still be wrong by way of assigning high probabilities to false propositions.
Most people underestimate how often they are wrong. Not only is there a common human tendency to overestimate one’s own abilities, but beliefs have the property that they feel right to us when we focus on them. So even if we admit that we likely hold a number of false beliefs, it’s easy to go on acting as though each of our individual beliefs is beyond serious doubt. Worse still, we assume that if a belief of ours hasn’t yet been proven wrong, then it’s right (it feels that way, after all), so it seems to us that we have made far fewer errors than we really have.
It’s disturbing to discover we’ve been mistaken about something important – especially when we’ve wasted time or effort because of the belief, or expressed the belief in front of others. So we’re incentivized to come up with justifications for why we weren’t actually wrong. We try to avoid psychological discomfort, and we try to save face in front of others. But there is a healthier way to think about wrongness: recognizing that we have an error rate.
Since we have to assume that we will be wrong sometimes, we can think of ourselves as having a frequency with which things we claim are actually false (or, if we’re thinking probabilistically, a rate at which we assign high probabilities to false propositions). As was pointed out in the comments below, it may be helpful to think of your error rate as context-specific: we make errors more frequently when discussing philosophy than when remarking on the weather. But if you wanted a single overall rate, you could define it, for example, as the fraction of the last 1000 claims you made that were not actually true (or not even very nearly true). This rate will differ from, but generally be quite predictive of, the fraction of your next 1000 claims that will be wrong.
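To make the definition concrete, here is a minimal sketch of how one might compute an overall and a per-context error rate from a record of past claims. The claim log, the context labels, and the function name are all invented for illustration; real claims are rarely this easy to score as true or false.

```python
from collections import defaultdict

# Hypothetical log of past claims: (context, turned_out_true).
# In reality you would only have this for claims whose truth
# you later managed to check.
claims = [
    ("weather", True), ("weather", True), ("weather", False),
    ("philosophy", True), ("philosophy", False), ("philosophy", False),
    ("politics", False), ("politics", True),
]

def error_rates(claims):
    """Return (overall_rate, per_context_rates) for a list of
    (context, turned_out_true) records."""
    wrong = sum(1 for _, ok in claims if not ok)
    overall = wrong / len(claims)
    # context -> [wrong_count, total_count]
    by_context = defaultdict(lambda: [0, 0])
    for context, ok in claims:
        by_context[context][1] += 1
        if not ok:
            by_context[context][0] += 1
    per_context = {c: w / n for c, (w, n) in by_context.items()}
    return overall, per_context

overall, per_context = error_rates(claims)
print(f"overall error rate: {overall:.2f}")  # 4 wrong out of 8 -> 0.50
for context, rate in sorted(per_context.items()):
    print(f"  {context}: {rate:.2f}")
```

On this toy log the philosophy rate (2/3) comes out well above the weather rate (1/3), matching the point that a single overall number hides real variation across subject matter.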
Our error rate is connected to the chance that any one of our individual beliefs will be wrong, though we obviously should be much more confident in some of our beliefs than others. When evaluating the probability of a particular belief being right, there are a variety of indicators to look at. For example, we should be more skeptical of one of our beliefs if a large percentage of smart people with relevant knowledge dispute it, or if we have a strong incentive (financial or otherwise) to believe it, or if we can’t discuss the belief without feeling emotional.
Once we fully accept that we have an error rate, we can think about wrongness in a new light: we can expect to be wrong with regularity, especially when reasoning about complex subjects. Once we start expecting to be wrong, it is no longer as disturbing to find that we are wrong in a particular case. This is merely a confirmation of our own prediction: we were right that being wrong is a common occurrence for us. Seen that way, being wrong doesn’t have to be so frightening. When it happens, it indicates that our error rate may be slightly higher than we previously believed, but it is not abnormal.
Estimating our actual error rate is hard, in part because we’re wrong much more often than we notice. So even in theory it doesn’t work simply to count the times we’ve discovered we were wrong as a fraction of the number of things we’ve claimed were true. Nonetheless, we can benefit psychologically from remembering that we have an error rate, even if we don’t know what that rate is.
If in your experience you’re almost never wrong, that indicates a serious problem: it is far more likely that you are wrong fairly regularly (and are simply bad at processing the counterevidence that should alert you to your wrongness) than that you really are wrong so infrequently. Put another way: failure to detect your own wrongness doesn’t imply you’re right; it indicates you’re very likely deceived about your rate of wrongness. Presumably you’ve noticed that those around you are wrong quite regularly. Do you really think you’re the incredibly rare exception who is pretty much always right?
When you deeply accept that you’re wrong with a certain error rate, it becomes easier to convert fear of being wrong into curiosity about where your wrongness is occurring. Whereas seeking out your thinking failures may have scared you before, it may now seem dangerous not to seek them out: you already know that you’re going to be repeatedly wrong, so the responsible thing is to figure out where that wrongness lies.
Yet another advantage of thinking about your error rate is that it naturally leads to thinking about how to reduce this rate. This can be done by learning to rely on more reliable procedures for forming beliefs (something I’ll say much more about later), and using these procedures to check what you previously believed to be true.
Remember: you too have an error rate. You don’t need to fear being wrong. Instead, you should expect it.