Quantifying Confidence: a method for critiquing my thinking

09 Feb 2024
 

Poker, wrote Maria Konnikova in Wired back in 2020, is an excellent way to learn how to make decisions: “The betting in poker isn’t incidental. It’s integral to the learning process. Our minds learn when we have a stake, a real stake, in the outcome of our learning.”

I am not myself suggesting taking up poker and gambling. However, I have found that “taking a stake” is helpful in clarifying my thoughts. I learned this the hard way. During the last election, I rashly said “there is just no way that Trump would be the Republican nominee.” An old friend of mine, Michael Jaffarian, felt there was a fairly strong chance that he would be. (Neither of us, to my knowledge, is a Trump supporter; this was just his judgment that I had misjudged the chances.) He bet me a pizza, and I lost. Since then, I have remembered that incident whenever I have made a confident statement (and I have certainly been aware since that a 28% chance of getting elected is not insignificant).

Recently I have begun thinking about the short-term scenarios for the future that I envision. I’ve been asking myself, how much would I bet on this? A pizza? I began measuring it in terms of orders of magnitude: $1? $10? $100? $1,000?

Thinking about how much I would bet and why forces me to at least mentally break an estimate down into its component steps. If I think the RSF is likely to win in Sudan, or at least come to substantially control the population centers, what are the components of that reasoning? What would have to happen for the population centers to be controlled? How much of that is happening? Are there indicators that the RSF is planning to take the remaining ones? I judge the probability of a not-yet-occurred event on the basis of these indicators: is it “more probable than not” (>50%), “very probable” (>75%), “almost certain” (>90%), or one of the inverses: less probable than not (<50%), improbable (<25%), or almost certainly not (<10%)?
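
To make those bands concrete, here is a minimal sketch in Python. The cut-offs are the ones named above; the function name and label strings are simply my own illustrative shorthand:

```python
# Map a numeric probability estimate to the qualitative bands described above.
# The 50/75/90 cut-offs come from the text; everything else is illustrative.

def probability_band(p: float) -> str:
    """Map a probability estimate (0.0 to 1.0) to a qualitative label."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must be between 0 and 1")
    if p > 0.90:
        return "almost certain"
    if p > 0.75:
        return "very probable"
    if p > 0.50:
        return "more probable than not"
    if p >= 0.25:
        return "less probable than not"
    if p >= 0.10:
        return "improbable"
    return "almost certainly not"

if __name__ == "__main__":
    for estimate in (0.95, 0.80, 0.60, 0.40, 0.20, 0.05):
        print(f"{estimate:.2f} -> {probability_band(estimate)}")
```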

On the basis of that, how confident am I in the estimate? Would I bet $1 (not very confident, but enough to “play the round”)? $10, the equivalent of a reasonably priced book? $100, a significant investment for me? $1,000, the equivalent of an iPad and some additional tech equipment?

I would think that the more certain I am, the more willing I should be to bet a large amount. Yet here’s the interesting thing: even though I am not actually making a bet, I have found that merely imagining the bet causes a reflexive evaluation of the odds. That reflex is useful, because if I tell myself I’m “almost certain” (i.e., >90%), and yet my reflex flinches at betting more than, say, $1 or $10, then I begin re-examining the assumptions of my certainty. Am I really certain? If I were 90% certain, why wouldn’t I be willing to bet more?
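
One way to picture this “flinch test” is as a consistency check between a stated confidence and the largest bet I would actually stake. The mapping of confidence levels to dollar tiers below is my own illustrative assumption, not a rule from the post:

```python
# A sketch of the "flinch test": compare stated certainty with the bet
# I would actually stake. The confidence-to-bet mapping is an assumption
# made for illustration, not a fixed rule.

BET_FOR_CONFIDENCE = {
    0.50: 1,      # "play the round"
    0.75: 10,     # a reasonably priced book
    0.90: 100,    # a significant investment
    0.99: 1000,   # an iPad and some extra tech equipment
}

def implied_bet(confidence: float) -> int:
    """Return the bet size implied by a stated confidence level."""
    amount = 1
    for threshold, bet in sorted(BET_FOR_CONFIDENCE.items()):
        if confidence >= threshold:
            amount = bet
    return amount

def flinch_test(stated_confidence: float, comfortable_bet: int) -> str:
    """Flag a mismatch between stated certainty and the bet I would risk."""
    implied = implied_bet(stated_confidence)
    if comfortable_bet < implied:
        return (f"Flinch: {stated_confidence:.0%} certainty implies a ${implied} bet, "
                f"but I would only stake ${comfortable_bet}. Re-examine the assumptions.")
    return "No flinch: the bet matches the stated certainty."

if __name__ == "__main__":
    # "Almost certain" (90%) but only willing to risk $10:
    print(flinch_test(0.90, 10))
```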

The fact that I have this reflexive disagreement and can recognize it is useful. If my mind always said “well, of course I’d bet $1,000,” then I’d suspect confirmation bias is at play. I have yet to encounter the opposite situation, where I’d say, “oh, I’d bet more: $1,000 or $10,000,” which would suggest I should be more certain than I initially estimated.

As a side note: I’m here using “% chances” as a measure of my estimate of my certainty, not as a measure of probability. Percentages in probability don’t quite work this way in real life. For example, if a weather forecaster says “there is a 90% chance of rain today,” he means that out of 100 days that match this day’s particular climate model, 90 of them will feature rain somewhere within the area being modeled. That means rain is very likely, but it also means that, if the model holds true, 10 of those days should not feature rain. In other words, if 100 days that fit the model don’t include roughly 10 days without rain, the model is poorly calibrated.
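
A quick simulation makes the calibration point concrete. The 90% figure is the forecaster’s number from the example above; everything else is invented for illustration:

```python
# Simulate 100 days forecast at "90% chance of rain". If the forecast is
# well calibrated, roughly 10 of those days should stay dry.
import random

random.seed(42)  # make the illustration repeatable

days = 100
dry_days = sum(1 for _ in range(days) if random.random() >= 0.90)
print(f"Out of {days} days forecast at 90% rain, {dry_days} stayed dry "
      f"(a well-calibrated forecast should leave roughly 10).")
```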

The mental bet set against the percentages is a useful calibration, but how do I decide whether my estimate was right? I’ve taken this a step further and created a small Google Sheet where I record my hypothesis, my estimate of its likelihood, and a bet. Then, at the end of six months, I’ll look back and see what my “score” was. It’s an easy way to hold myself accountable for my thinking process. If many of my estimates were wrong, then I need to revisit how I am actually thinking about those scenarios.
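
Here is a rough sketch of what that review step could look like. The entries are invented, and the Brier score is just one common way to grade probability estimates; the sheet itself doesn’t prescribe a scoring rule:

```python
# Review a list of recorded estimates at the end of the period.
# The records below are made-up examples of (hypothesis, probability, bet, outcome).

records = [
    ("Hypothetical scenario A", 0.90, 100, True),
    ("Hypothetical scenario B", 0.60, 10,  False),
    ("Hypothetical scenario C", 0.25, 1,   False),
]

# Brier score: mean squared error between estimate and outcome
# (0 = perfect; 0.25 is what always guessing 50% would score).
brier = sum((p - (1.0 if happened else 0.0)) ** 2
            for _, p, _, happened in records) / len(records)

# Net result of the imaginary bets, assuming even odds.
net = sum(bet if happened else -bet for _, _, bet, happened in records)

print(f"Brier score over {len(records)} estimates: {brier:.3f}")
print(f"Net result of the imaginary bets: ${net}")
```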

There’s another situation where this process is less helpful: a Bayesian mindset, where one updates an estimate in mid-stream. John Maynard Keynes reportedly said, “When the facts change, I change my mind. What do you do, sir?” Bayesian thinking essentially starts with a set of priors, examines current observations, and then updates the estimated probability of a future scenario. The updated estimate becomes the new set of priors, which is used in future estimates.
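
A minimal example of that update, with numbers invented purely to show the mechanics of turning a prior into a posterior when new evidence arrives:

```python
# Bayes' rule: revise a prior estimate in light of a new observation.

def bayes_update(prior: float, p_evidence_given_h: float,
                 p_evidence_given_not_h: float) -> float:
    """Return P(H | evidence) via Bayes' rule."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1.0 - prior)
    return numerator / denominator

if __name__ == "__main__":
    prior = 0.60  # initial estimate that the scenario will occur
    # A new observation judged twice as likely if the scenario is on track:
    posterior = bayes_update(prior, p_evidence_given_h=0.8,
                             p_evidence_given_not_h=0.4)
    print(f"Prior {prior:.0%} -> posterior {posterior:.0%}")
    # The posterior becomes the new prior for the next observation.
```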

Updating the scenario in mid-stream has the effect of “changing the game.” It’s an effective way of figuring out what is happening as events draw closer in time, but it’s not a good way to evaluate the decision-making process unless we record all revisions to the scenario. I might do this by way of “taking a counter-bet.” For example, I might bet $10 that the RSF will conquer Sudan. Three months from now, a Western power might decide to intervene. At that point, I might make a “counter-bet” of $10 that the RSF would lose. The net score at the end of six months might be $0 at best. This is not necessarily the most elegant solution, but it might be workable.
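
A sketch of that bookkeeping, using the numbers from the RSF example above. Each revision is recorded as a new entry rather than overwriting the original bet, so the whole decision trail survives:

```python
# Record bets and counter-bets as separate ledger entries of
# (date, position, stake in dollars). The entries mirror the example above.

ledger = [
    ("month 0", "for: the RSF will conquer Sudan", 10),
    ("month 3", "against: counter-bet after a Western power intervenes", 10),
]

staked_for = sum(stake for _, position, stake in ledger
                 if position.startswith("for"))
staked_against = sum(stake for _, position, stake in ledger
                     if position.startswith("against"))

# Whichever way the scenario resolves, one entry wins and the other loses,
# so the two $10 positions net out to $0 at the six-month review.
print(f"Staked for: ${staked_for}, staked against: ${staked_against}, "
      f"net exposure: ${staked_for - staked_against}")
```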

Updating one’s understanding of the short-term future context of critical decisions is an important part of life, and the best model I know for that is Bayesian. But recording key decisions, why I made them, and how confident I was when I originally made them is key to updating how I think about things in general.
