Why it's wise to think in bets

 

The secret is to make peace with walking around in a world where we recognize that we are not sure and that’s okay. As we learn more about how our brains operate, we recognize that we don’t perceive the world objectively. But our goal should be to try.

Annie Duke’s “Thinking in Bets” is basically a long essay with an extremely valuable message. Under a plethora of entertaining anecdotes about professional poker, it contains a valuable framework for making decisions in an uncertain world. This requires accepting uncertainty and being intellectually honest. Good decision-making habits compound over time.
Thinking in Bets is a slightly less nerdy and less nuanced complement to pair with “Fortune’s Formula”. It also fits in well with some of the more important behavioral finance books, such as Misbehaving, The Hour Between Dog and Wolf, Kluge, etc.

I’ve organized some of my highlights and notes from Thinking in Bets below.

The implications of treating decisions as bets made it possible for me to find learning opportunities in uncertain environments. Treating decisions as bets, I discovered, helped me avoid common decision traps, learn from results in a more rational way, and keep emotions out of the process as much as possible.

Outcome quality vs decision quality

We can get better at separating outcome quality from decision quality, discover the power of saying, “I’m not sure,” learn strategies to map out the future, become less reactive decision-makers, build and sustain pods of fellow truthseekers to improve our decision process, and recruit our past and future selves to make fewer emotional decisions. I didn’t become an always-rational, emotion-free decision-maker from thinking in bets. I still made (and make) plenty of mistakes. Mistakes, emotions, losing—those things are all inevitable because we are human. The approach of thinking in bets moved me toward objectivity, accuracy, and open-mindedness. That movement compounds over time.

Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.
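The split between decision quality and luck can be made concrete with a quick simulation (my own sketch, not from the book): two runs of the exact same positive-expectation bet differ wildly in the short term, purely through variance, while the long run reveals the underlying decision quality. The `play` function and its parameters here are hypothetical illustrations.

```python
import random

def play(edge=0.55, rounds=20, seed=None):
    """Simulate repeated even-money bets won with probability `edge`.

    Returns the net result in units bet. An edge above 0.5 means the
    decision to bet is good, regardless of any single outcome.
    """
    rng = random.Random(seed)
    return sum(1 if rng.random() < edge else -1 for _ in range(rounds))

# Same decision quality (edge), very different short-run luck:
print([play(seed=s) for s in range(5)])  # short runs vary; some may be negative

# Over a long horizon, decision quality dominates luck:
# expected value per round is 0.55 - 0.45 = 0.10, so roughly 10,000 over 100,000 rounds
print(play(rounds=100_000, seed=0))
```

Outcomes from identical decisions diverge in the short run; only the aggregate exposes the skill component.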

Why are we so bad at separating luck and skill? Why are we so uncomfortable knowing that results can be beyond our control? Why do we create such a strong connection between results and the quality of the decisions preceding them?

Certainty is an illusion

Trying to force certainty onto an uncertain world is a recipe for poor decision making. To improve decision making, learn to accept uncertainty. You can always revise beliefs.

Seeking certainty helped keep us alive all this time, but it can wreak havoc on our decisions in an uncertain world. When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer. We will pound a lot of square pegs into round holes to maintain the illusion of a tight relationship between our outcomes and our decisions.

There are many reasons why wrapping our arms around uncertainty and giving it a big hug will help us become better decision-makers. Here are two of them. First, “I’m not sure” is simply a more accurate representation of the world. Second, and related, when we accept that we can’t be sure, we are less likely to fall…

Our lives are too short to collect enough data from our own experience to make it easy to dig down into decision quality from the small set of results we experience.

Incorporating uncertainty into the way we think about our beliefs comes with many benefits. By expressing our level of confidence in what we believe, we are shifting our approach to how we view the world. Acknowledging uncertainty is the first step in measuring and narrowing it. Incorporating uncertainty in the way we think about what we believe creates open-mindedness, moving us closer to a more objective stance toward information that disagrees with us. We are less likely to succumb to motivated reasoning since it feels better to make small adjustments in degrees of certainty instead of having to grossly downgrade from “right” to “wrong.” When confronted with new evidence, it is a very different narrative to say, “I was 58% but now I’m 46%.” That doesn’t feel nearly as bad as “I thought I was right but now I’m wrong.” Our narrative of being a knowledgeable, educated, intelligent person who holds quality opinions isn’t compromised when we use new information to calibrate our beliefs, compared with having to make a full-on reversal. This shifts us away from treating information that disagrees with us as a threat, as something we have to defend against, making us better able to truthseek. When we work toward belief calibration, we become less judgmental.
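One way to make those “small adjustments in degrees of certainty” mechanical is Bayes’ rule (my illustration, not Duke’s): hold a belief as a probability and nudge it with each piece of evidence, instead of flipping between “right” and “wrong”. The numbers below, starting from the 58% in the quote, are hypothetical.

```python
def update(prior, likelihood_if_true, likelihood_if_false):
    """Bayes' rule: revise P(belief) after seeing one piece of evidence.

    likelihood_if_true  = P(evidence | belief is true)
    likelihood_if_false = P(evidence | belief is false)
    """
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Start 58% confident; see evidence twice as likely if the belief is false.
p = update(0.58, 0.30, 0.60)
print(round(p, 2))  # → 0.41: a downgrade in degree, not a flip to "wrong"
```

Each update is small and reversible, which is exactly what makes disconfirming evidence feel like information rather than a threat.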

In an uncertain world, the key to improving is to revise, revise, revise.

Not much is ever certain. Samuel Arbesman’s The Half-Life of Facts is a great read about how practically every fact we’ve ever known has been subject to revision or reversal. We are in a perpetual state of learning, and that can make any prior fact obsolete. One of many examples he provides is about the extinction of the coelacanth, a fish from the Late Cretaceous period. A mass-extinction event (such as a large meteor striking the Earth, a series of volcanic eruptions, or a permanent climate shift) ended the Cretaceous period. That was the end of dinosaurs, coelacanths, and a lot of other species. In the late 1930s and independently in the mid-1950s, however, coelacanths were found alive and well. A species becoming “unextinct” is pretty common. Arbesman cites the work of a pair of biologists at the University of Queensland who made a list of all 187 species of mammals declared extinct in the last five hundred years.

Getting comfortable with this realignment, and all the good things that follow, starts with recognizing that you’ve been betting all along.

The danger of being too smart

The popular wisdom is that the smarter you are, the less susceptible you are to fake news or disinformation. After all, smart people are more likely to analyze and effectively evaluate where information is coming from, right? Part of being “smart” is being good at processing information, parsing the quality of an argument and the credibility of the source. So, intuitively, it feels like smart people should have the ability to spot motivated reasoning coming and should have more intellectual resources to fight it. Surprisingly, being smart can actually make bias worse. Let me give you a different intuitive frame: the smarter you are, the better you are at constructing a narrative.

… the more numerate people (whether pro- or anti-gun) made more mistakes interpreting the data on the emotionally charged topic than the less numerate subjects sharing those same beliefs. “This pattern of polarization . . . does not abate among high-Numeracy subjects.”

It turns out the better you are with numbers, the better you are at spinning those numbers to conform to and support your beliefs. Unfortunately, this is just the way evolution built us. We are wired to protect our beliefs even when our goal is to truthseek. This is one of those instances where being smart and aware of our capacity for irrationality alone doesn’t help us refrain from biased reasoning. As with visual illusions, we can’t make our minds work differently than they do no matter how smart we are. Just as we can’t unsee an illusion, intellect or willpower alone can’t make us resist motivated reasoning.

The Learning Loop

Thinking rationally is largely about revising and refuting beliefs (link to reflexivity). By going through the learning loop faster, we gain an advantage. This is similar to John Boyd’s concept of the OODA loop.

We have the opportunity to learn from the way the future unfolds to improve our beliefs and decisions going forward. The more evidence we get from experience, the less uncertainty we have about our beliefs and choices. Actively using outcomes to examine our beliefs and bets closes the feedback loop, reducing uncertainty. This is the heavy lifting of how we learn.

Chalk up an outcome to skill, and we take credit for the result. Chalk up an outcome to luck, and it wasn’t in our control. For any outcome, we are faced with this initial sorting decision. That decision is a bet on whether the outcome belongs in the “luck” bucket or the “skill” bucket. This is where Nick the Greek went wrong. We can update the learning loop to represent this sorting step. Think about this like we are an outfielder catching a fly ball with runners on base. Fielders have to make in-the-moment game decisions about where to throw the ball.

Key message: how poker players adjust their play from experience determines how much they succeed. This applies to any competitive endeavor in an uncertain world.

Intellectual Honesty

The best players analyze their performance with extreme intellectual honesty. This means that even when they win, they may end up being more focused on the errors they made, as told in this anecdote:

In 2004, my brother provided televised final-table commentary for a tournament in which Phil Ivey smoked a star-studded final table. After his win, the two of them went to a restaurant for dinner, during which Ivey deconstructed every potential playing error he thought he might have made on the way to victory, asking my brother’s opinion about each strategic decision. A more run-of-the-mill player might have spent the time talking about how great they played, relishing the victory. Not Ivey. For him, the opportunity to learn from his mistakes was much more important than treating that dinner as a self-satisfying celebration. He earned a half-million dollars and won a lengthy poker tournament over world-class competition, but all he wanted to do was discuss with a fellow pro where he might have made better decisions.

I heard an identical story secondhand about Ivey at another otherwise celebratory dinner following one of his now ten World Series of Poker victories. Again, from what I understand, he spent the evening discussing in intricate detail with some other pros the points in hands where he could have made better decisions. Phil Ivey, clearly, has different habits than most poker players—and most people in any endeavor—in how he fields his outcomes.

Habits operate in a neurological loop consisting of three parts: the cue, the routine, and the reward. A habit could involve eating cookies: the cue might be hunger, the routine going to the pantry and grabbing a cookie, and the reward a sugar high. Or, in poker, the cue might be winning a hand, the routine taking credit for it, the reward a boost to our ego. Charles Duhigg, in The Power of Habit, offers the golden rule of habit change….

Being in an environment where the challenge of a bet is always looming works to reduce motivated reasoning. Such an environment changes the frame through which we view disconfirming information, reinforcing the frame change that our truthseeking group rewards. Evidence that might contradict a belief we hold is no longer viewed through as hurtful a frame. Rather, it is viewed as helpful because it can improve our chances of making a better bet. And winning a bet triggers a reinforcing positive update.

Note: intellectual honesty + thinking clearly = thinking in bets

Good decisions compound

One useful model is to view everything as one big, long poker game, so the result of any individual game won’t upset you as much. Furthermore, good decision-making habits compound over time. The key is to keep developing good long-term habits, even as you deal with the challenges of a specific game.

The best poker players develop practical ways to incorporate their long-term strategic goals into their in-the-moment decisions. The rest of this chapter is devoted to many of these strategies designed to recruit past- and future-us to help with all the execution decisions we have to make to reach our long-term goals. As with all the strategies in this book, we must recognize that no strategy can turn us into perfectly rational actors. In addition, we can make the best possible decisions and still not get the result we want. Improving decision quality is about increasing our chances of good outcomes, not guaranteeing them. Even when that effort makes a small difference—more rational thinking and fewer emotional decisions, translated into an increased probability of better outcomes—it can have a significant impact on how our lives turn out. Good results compound. Good processes become habits, and make possible future calibration and improvement.
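The compounding claim can be made quantitative (my arithmetic, not the book’s): even a tiny per-decision improvement multiplies across many decisions. The 1% figure below is purely hypothetical.

```python
# A hypothetical 1% better expected outcome per decision, compounded
# multiplicatively over many decisions:
edge_per_decision = 1.01
for n in (10, 100, 500):
    print(n, round(edge_per_decision ** n, 2))
# After 100 slightly better decisions the cumulative factor is about 2.7 —
# a hundred 1% edges more than double the expected result.
```

The point is not the specific numbers but the shape: process improvements that look negligible per decision dominate over a long enough game.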

At the very beginning of my poker career, I heard an aphorism from some of the legends of the profession: “It’s all just one long poker game.” That aphorism is a reminder to take the long view, especially when something big happened in the last half hour, or the previous hand—or when we get a flat tire. Once we learn specific ways to recruit past and future versions of us to remind ourselves of this, we can keep the most recent upticks and downticks in their proper perspective. When we take the long view, we’re going to think in a more rational way.

Life, like poker, is one long game, and there are going to be a lot of losses, even after making the best possible bets. We are going to do better, and be happier, if we start by recognizing that we’ll never be sure of the future. That changes our task from trying to be right every time, an impossible job, to navigating our way through the uncertainty by calibrating our beliefs to move toward, little by little, a more accurate and objective representation of the world. With strategic foresight and perspective, that’s manageable work. If we keep learning and calibrating, we might even get good at it.
