The secret is to make peace with walking around in a world where we recognize that we are not sure and that’s okay. As we learn more about how our brains operate, we recognize that we don’t perceive the world objectively. But our goal should be to try.
Annie Duke’s “Thinking in Bets” is essentially a long essay with an extremely valuable message. Beneath a plethora of entertaining anecdotes about professional poker, it contains a valuable framework for making decisions in an uncertain world. This requires accepting uncertainty and being intellectually honest. Good decision-making habits compound over time.
Thinking in Bets is a slightly less nerdy and less nuanced complement to “Fortune’s Formula”. It also fits in well with some of the more important behavioral finance books, such as Misbehaving, The Hour Between Dog and Wolf, and Kluge.
I’ve organized some of my highlights and notes from Thinking in Bets below.
The implications of treating decisions as bets made it possible for me to find learning opportunities in uncertain environments. Treating decisions as bets, I discovered, helped me avoid common decision traps, learn from results in a more rational way, and keep emotions out of the process as much as possible.
Outcome quality vs decision quality
We can get better at separating outcome quality from decision quality, discover the power of saying, “I’m not sure,” learn strategies to map out the future, become less reactive decision-makers, build and sustain pods of fellow truthseekers to improve our decision process, and recruit our past and future selves to make fewer emotional decisions. I didn’t become an always-rational, emotion-free decision-maker from thinking in bets. I still made (and make) plenty of mistakes. Mistakes, emotions, losing—those things are all inevitable because we are human. The approach of thinking in bets moved me toward objectivity, accuracy, and open-mindedness. That movement compounds over time.
Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.
Why are we so bad at separating luck and skill? Why are we so uncomfortable knowing that results can be beyond our control? Why do we create such a strong connection between results and the quality of the decisions preceding them?
Certainty is an illusion
Trying to force certainty onto an uncertain world is a recipe for poor decision making. To improve decision making, learn to accept uncertainty. You can always revise beliefs.
Seeking certainty helped keep us alive all this time, but it can wreak havoc on our decisions in an uncertain world. When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer. We will pound a lot of square pegs into round holes to maintain the illusion of a tight relationship between our outcomes and our decisions.
There are many reasons why wrapping our arms around uncertainty and giving it a big hug will help us become better decision-makers. Here are two of them. First, “I’m not sure” is simply a more accurate representation of the world. Second, and related, when we accept that we can’t be sure, we are less likely to fall into the trap of black-and-white thinking.
Our lives are too short to collect enough data from our own experience to make it easy to dig down into decision quality from the small set of results we experience.
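To see why small samples are so treacherous, here is a minimal simulation (my own illustration, not from the book) of a bettor who genuinely has the best of it, winning 55% of even-money bets. Even with that real edge, short runs frequently end in the red, so outcomes alone can't reveal decision quality:

```python
import random

def simulate(win_prob, n_bets, n_trials=2_000, seed=1):
    """Estimate the fraction of trials in which a bettor with a genuine
    edge (win_prob > 0.5) still ends at or below break-even after
    n_bets even-money bets."""
    rng = random.Random(seed)
    losing_trials = 0
    for _ in range(n_trials):
        net = sum(1 if rng.random() < win_prob else -1 for _ in range(n_bets))
        if net <= 0:
            losing_trials += 1
    return losing_trials / n_trials

# A 55% edge is an excellent decision, yet over 20 bets a large share
# of runs still lose; over 500 bets, skill finally dominates luck.
print(simulate(0.55, 20))   # roughly 0.4
print(simulate(0.55, 500))  # near zero
```

The same logic runs in reverse: a terrible decision can look brilliant over a handful of outcomes, which is exactly the "resulting" trap Duke warns about.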
Incorporating uncertainty into the way we think about our beliefs comes with many benefits. By expressing our level of confidence in what we believe, we are shifting our approach to how we view the world. Acknowledging uncertainty is the first step in measuring and narrowing it. Incorporating uncertainty in the way we think about what we believe creates open-mindedness, moving us closer to a more objective stance toward information that disagrees with us. We are less likely to succumb to motivated reasoning since it feels better to make small adjustments in degrees of certainty instead of having to grossly downgrade from “right” to “wrong.” When confronted with new evidence, it is a very different narrative to say, “I was 58% but now I’m 46%.” That doesn’t feel nearly as bad as “I thought I was right but now I’m wrong.” Our narrative of being a knowledgeable, educated, intelligent person who holds quality opinions isn’t compromised when we use new information to calibrate our beliefs, compared with having to make a full-on reversal. This shifts us away from treating information that disagrees with us as a threat, as something we have to defend against, making us better able to truthseek. When we work toward belief calibration, we become less judgmental.
In an uncertain world, the key to improving is to revise, revise, revise.
Not much is ever certain. Samuel Arbesman’s The Half-Life of Facts is a great read about how practically every fact we’ve ever known has been subject to revision or reversal. We are in a perpetual state of learning, and that can make any prior fact obsolete. One of many examples he provides is about the extinction of the coelacanth, a fish from the Late Cretaceous period. A mass-extinction event (such as a large meteor striking the Earth, a series of volcanic eruptions, or a permanent climate shift) ended the Cretaceous period. That was the end of dinosaurs, coelacanths, and a lot of other species. In the late 1930s and independently in the mid-1950s, however, coelacanths were found alive and well. A species becoming “unextinct” is pretty common. Arbesman cites the work of a pair of biologists at the University of Queensland who made a list of all 187 species of mammals declared extinct in the last five hundred years.
Getting comfortable with this realignment, and all the good things that follow, starts with recognizing that you’ve been betting all along.
The danger of being too smart
The popular wisdom is that the smarter you are, the less susceptible you are to fake news or disinformation. After all, smart people are more likely to analyze and effectively evaluate where information is coming from, right? Part of being “smart” is being good at processing information, parsing the quality of an argument and the credibility of the source. So, intuitively, it feels like smart people should have the ability to spot motivated reasoning coming and should have more intellectual resources to fight it. Surprisingly, being smart can actually make bias worse. Let me give you a different intuitive frame: the smarter you are, the better you are at constructing a narrative.
… the more numerate people (whether pro- or anti-gun) made more mistakes interpreting the data on the emotionally charged topic than the less numerate subjects sharing those same beliefs. “This pattern of polarization . . . does not abate among high-Numeracy subjects.”
It turns out the better you are with numbers, the better you are at spinning those numbers to conform to and support your beliefs. Unfortunately, this is just the way evolution built us. We are wired to protect our beliefs even when our goal is to truthseek. This is one of those instances where being smart and aware of our capacity for irrationality alone doesn’t help us refrain from biased reasoning. As with visual illusions, we can’t make our minds work differently than they do no matter how smart we are. Just as we can’t unsee an illusion, intellect or willpower alone can’t make us resist motivated reasoning.
The Learning Loop
Thinking rationally is largely about revising and refuting beliefs (a theme echoed in Soros’s concept of reflexivity, discussed later). By going through a learning loop faster we are able to gain an advantage. This is similar to John Boyd’s concept of the OODA loop.
We have the opportunity to learn from the way the future unfolds to improve our beliefs and decisions going forward. The more evidence we get from experience, the less uncertainty we have about our beliefs and choices. Actively using outcomes to examine our beliefs and bets closes the feedback loop, reducing uncertainty. This is the heavy lifting of how we learn.
Chalk up an outcome to skill, and we take credit for the result. Chalk up an outcome to luck, and it wasn’t in our control. For any outcome, we are faced with this initial sorting decision. That decision is a bet on whether the outcome belongs in the “luck” bucket or the “skill” bucket. This is where Nick the Greek went wrong. We can update the learning loop to represent this like so: Think about this like we are an outfielder catching a fly ball with runners on base. Fielders have to make in-the-moment game decisions about where to throw the ball.
Key message: how poker players adjust their play from experience determines how much they succeed. This applies to any competitive endeavor in an uncertain world.
The best players analyze their performance with extreme intellectual honesty. This means that even when they win, they may end up being more focused on errors they made, as told in this anecdote:
In 2004, my brother provided televised final-table commentary for a tournament in which Phil Ivey smoked a star-studded final table. After his win, the two of them went to a restaurant for dinner, during which Ivey deconstructed every potential playing error he thought he might have made on the way to victory, asking my brother’s opinion about each strategic decision. A more run-of-the-mill player might have spent the time talking about how great they played, relishing the victory. Not Ivey. For him, the opportunity to learn from his mistakes was much more important than treating that dinner as a self-satisfying celebration. He earned a half-million dollars and won a lengthy poker tournament over world-class competition, but all he wanted to do was discuss with a fellow pro where he might have made better decisions. I heard an identical story secondhand about Ivey at another otherwise celebratory dinner following one of his now ten World Series of Poker victories. Again, from what I understand, he spent the evening discussing in intricate detail with some other pros the points in hands where he could have made better decisions. Phil Ivey, clearly, has different habits than most poker players—and most people in any endeavor—in how he fields his outcomes. Habits operate in a neurological loop consisting of three parts: the cue, the routine, and the reward. A habit could involve eating cookies: the cue might be hunger, the routine going to the pantry and grabbing a cookie, and the reward a sugar high. Or, in poker, the cue might be winning a hand, the routine taking credit for it, the reward a boost to our ego. Charles Duhigg, in The Power of Habit, offers the golden rule of habit change….
Being in an environment where the challenge of a bet is always looming works to reduce motivated reasoning. Such an environment changes the frame through which we view disconfirming information, reinforcing the frame change that our truthseeking group rewards. Evidence that might contradict a belief we hold is no longer viewed through as hurtful a frame. Rather, it is viewed as helpful because it can improve our chances of making a better bet. And winning a bet triggers a reinforcing positive update.
Note: intellectual honesty + thinking clearly = thinking in bets.
Good decisions compound
One useful model is to view everything as one big long poker game. Therefore the result of individual games won’t upset you so much. Furthermore, good decision making habits compound over time. So the key is to always be developing good long term habits, even as you deal with the challenges of a specific game.
The best poker players develop practical ways to incorporate their long-term strategic goals into their in-the-moment decisions. The rest of this chapter is devoted to many of these strategies designed to recruit past- and future-us to help with all the execution decisions we have to make to reach our long-term goals. As with all the strategies in this book, we must recognize that no strategy can turn us into perfectly rational actors. In addition, we can make the best possible decisions and still not get the result we want. Improving decision quality is about increasing our chances of good outcomes, not guaranteeing them. Even when that effort makes a small difference—more rational thinking and fewer emotional decisions, translated into an increased probability of better outcomes—it can have a significant impact on how our lives turn out. Good results compound. Good processes become habits, and make possible future calibration and improvement.
At the very beginning of my poker career, I heard an aphorism from some of the legends of the profession: “It’s all just one long poker game.” That aphorism is a reminder to take the long view, especially when something big happened in the last half hour, or the previous hand—or when we get a flat tire. Once we learn specific ways to recruit past and future versions of us to remind ourselves of this, we can keep the most recent upticks and downticks in their proper perspective. When we take the long view, we’re going to think in a more rational way.
Life, like poker, is one long game, and there are going to be a lot of losses, even after making the best possible bets. We are going to do better, and be happier, if we start by recognizing that we’ll never be sure of the future. That changes our task from trying to be right every time, an impossible job, to navigating our way through the uncertainty by calibrating our beliefs to move toward, little by little, a more accurate and objective representation of the world. With strategic foresight and perspective, that’s manageable work. If we keep learning and calibrating, we might even get good at it.
“Backgammon: The Cruelest Game” provides a guide to some key principles of backgammon, and contains analysis of several games between top players. It also gets philosophical about the vicissitudes of randomness that make backgammon so challenging and intriguing:
From the start there is a complicated interplay of possibilities, probabilities, good fortune and bad, which influences every facet of the game. In backgammon, to seek position is to take certain calculated risks, and because all players are ruled by the dictates of the dice, or by chance, which Karl von Clausewitz, the nineteenth-century military theorist, described as “an agency indifferent to the actor’s preference for the outcomes,” no player is ever in control of his particular destiny. One of the game’s chief tactics, then, is to shield oneself against the dice. The player with the strongest position can withstand a greater number of unfavorable rolls, or “bad luck,” than can the more weakly protected player, who, because he failed to protect himself, is more easily assaulted and overrun.
Nonetheless, no matter how cunningly you play, you are virtually always vulnerable. One unexpected horror roll can undermine the best positions and derange the most sensible of plans; this is both the charm and the frustration of the game. The best players know they must employ the craftiest of tactics, not because of the dice, but in spite of them. It is the enormously high luck factor in backgammon that causes it to be a game of skill. Without luck or accident, the game would not only be monotonous, but infinitely less skillful.
In backgammon, to be skillful is to be self-protective. At any given point in the game, the better players are aware of Murphy’s Law, which states that if anything can go wrong, it will. Given the whimsical nature of the dice, all players have a chance in the game, but some players have more chances than others, because they have created an environment in which the more propitious is more likely to occur.
In backgammon, an understanding of the correct percentage moves in specific situations qualifies as “inside information” and will enable you to win in the long run. But not every time, alas, and often not even in what you believe to be crucial games. This condition must be accepted philosophically, of course, and should not deter you from continuing a detailed study of the game.
Learning to think probabilistically is one of the most critical skills one can master. Nate Silver’s The Signal and the Noise: Why So Many Predictions Fail–but Some Don’t is a valuable book on thinking probabilistically and forecasting in an uncertain environment. It compares and contrasts examples across multiple disciplines, including weather forecasting, seismology, finance, and more.
This book pairs well with Against the Gods, Fortune’s Formula, and Superforecasting. Against the Gods is, in my opinion, the most important book on the development of probabilistic thinking. Early civilizations were good with geometry and logic, but helpless with uncertainty. Ironically, it was gamblers and heretics who moved mankind forward by developing the science of probability, statistics, and ultimately risk management. Fortune’s Formula shows the connection between information theory, gambling, and correct position sizing for investors. It helps answer the question: when you have a slight edge, how much should you bet? Nate Silver draws heavily on Superforecasting. Particularly important is the idea of “foxes and hedgehogs”. Foxes are multidisciplinary, adaptable, self-critical, tolerant of complexity, cautious, and empirical. In contrast, hedgehogs are specialized, stalwart, stubborn, order-seeking, confident, and ideological. As you might expect, foxes make far better forecasters than hedgehogs, even though hedgehogs make for better television.
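The position-sizing question at the heart of Fortune’s Formula is answered by the Kelly criterion. A minimal sketch (my own, not from any of these books) for the simple case of a bet that pays b-to-1 and loses the stake otherwise:

```python
def kelly_fraction(p, b):
    """Kelly bet fraction for a bet won with probability p that pays
    b-to-1 on a win.  f* = (b*p - (1 - p)) / b; when the edge is
    negative, the correct bet is zero."""
    f = (b * p - (1 - p)) / b
    return max(f, 0.0)

# A coin that lands your way 55% of the time at even money (b = 1):
# f* = (0.55 - 0.45) / 1 = 0.10, i.e. bet 10% of bankroll.
print(kelly_fraction(0.55, 1.0))
print(kelly_fraction(0.40, 1.0))  # negative edge: bet nothing
```

Betting more than the Kelly fraction actually reduces long-run growth, which is why "how much" matters as much as "whether".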
Anyway, here are a few key insights from my notes on The Signal and the Noise.
1) Data is useless without context.
There are always patterns to find in data, but it’s critical to understand the theory behind the system you are studying to avoid being fooled by noise. This is true in forecasting the weather, investing, betting on sports, or any other probabilistic endeavor. The ability to understand context is also a critical advantage humans have over computer programs.
“Statistical inferences are much stronger when backed up by theory or at least some deeper thinking about their root causes.”
The importance of understanding context comes to the forefront when you compare humanity’s success with weather forecasting against its relative failure with earthquake forecasting.
“Chaos theory is a demon that can be tamed (weather forecasts did so, at least in part). But weather forecasters have a much better theoretical understanding of the earth’s atmosphere than seismologists do of the earth’s crust. They know, more or less, how weather works, right down to the molecular level. Seismologists don’t have that advantage.”
The ability to understand context is what separates success from failure in all pursuits dealing with uncertainty. The profile of professional sports gambler Bob Voulgaris is highly instructive. Voulgaris focuses on NBA basketball. A key insight is that Voulgaris has powerful tools for analyzing data, and he makes good use of them, but he also has a deep understanding of the qualitative subtleties of how NBA basketball works. Obvious statistical patterns are quickly incorporated into betting lines, whether they are signal or noise. Voulgaris looks deeper, and finds places where the line misprices true probabilities.
“Finding patterns is easy in any data-rich environment; that’s what mediocre gamblers do. The key is in determining whether the patterns represent noise or signal.”
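To make “mispricing” concrete, here is a minimal sketch (my own illustration, not Silver’s, with hypothetical odds and probabilities) of how a bettor might compare the win probability implied by a betting line with their own deeper estimate:

```python
def implied_prob(american_odds):
    """Break-even win probability implied by American odds."""
    if american_odds < 0:
        return -american_odds / (-american_odds + 100)
    return 100 / (american_odds + 100)

def expected_value(true_prob, american_odds, stake=100.0):
    """EV of the bet, assuming your estimate of the true probability is right."""
    if american_odds < 0:
        payout = stake * (100 / -american_odds)
    else:
        payout = stake * (american_odds / 100)
    return true_prob * payout - (1 - true_prob) * stake

# A standard -110 line implies a ~52.4% break-even probability.
# If deeper analysis says the team really wins 56% of the time,
# the line is mispriced and the bet has positive expected value.
print(implied_prob(-110))          # ~0.524
print(expected_value(0.56, -110))  # positive EV per $100 staked
```

The edge lives entirely in the gap between the implied probability and the true one; if the pattern driving your 56% estimate is noise, the same arithmetic quietly turns negative.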
2) Beware of overconfidence
“… the amount of confidence someone expresses in a prediction is not a good indication of its accuracy; to the contrary, these qualities are often inversely correlated.”
3) Think big, and think small. Mix the macro and the micro.
“Good innovators typically think very big, and they think very small. New ideas are sometimes found in the most granular of details where few others bother to look. And they are sometimes found when you are doing your most abstract and philosophical thinking, considering why the world is the way that it is and whether there might be an alternative to the dominant paradigm.”
This is reminiscent of the “global micro” approach used by several managers profiled in Inside the House of Money: Top Hedge Fund Traders on Profiting in the Global Markets.
4) Recognize the Value of Bayesian Thinking
The work of Thomas Bayes forms the framework underlying how good gamblers think.
Bayes was an English minister who argued in his theological work that admitting our own imperfections is a necessary step on the way to redemption. His most famous work, however, was “An Essay towards Solving a Problem in the Doctrine of Chances,” which was not published until after his death. One interpretation of the essay concerns a person who emerges into the world (i.e., Adam, or someone leaving Plato’s cave) and sees the sun rise for the first time:
“At first he does not know whether this is typical or some sort of freak occurrence. However, each day that he survives and the sun rises again, his confidence increases that it is a permanent feature of nature. Gradually, through this purely statistical form of inference, the probability that he assigns to his prediction that the sun will rise again tomorrow approaches (although never exactly reaches) 100 percent.”
In essence, beliefs on probability are updated as new information comes in.
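The sunrise argument was later formalized by Laplace as the rule of succession: after s successes in n trials, estimate the probability of success on the next trial as (s + 1) / (n + 2). A minimal sketch (my own illustration, not from Silver’s book):

```python
from fractions import Fraction

def rule_of_succession(successes, trials):
    """Laplace's rule of succession: after observing `successes` in
    `trials` independent attempts, estimate the probability of success
    on the next attempt as (s + 1) / (n + 2)."""
    return Fraction(successes + 1, trials + 2)

# After a single sunrise the estimate is only 2/3; after 10,000
# consecutive sunrises it approaches, but never reaches, certainty.
print(rule_of_succession(1, 1))                   # 2/3
print(float(rule_of_succession(10_000, 10_000)))  # ~0.9999
```

The estimate never hits 1 exactly, which captures the chapter's point: confidence grows with evidence, but certainty stays out of reach.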
Ironically, Bayes’s philosophical work was extended by the mathematician and astronomer Pierre-Simon Laplace, who was likely an atheist. Although Laplace believed in scientific determinism, he was frustrated by the disconnect between what he believed to be the perfection of nature and human imperfections in understanding it, particularly with regard to astronomical observations. Consequently, he developed measuring techniques that relied on probabilistic inferences rather than exact measurements. “Laplace came to view probability as a waypoint between ignorance and knowledge.” The combined work of Laplace and Bayes led to a simple expression concerned with conditional probability. In essence, Bayesian math can be used to tell us the probability that a theory or hypothesis is true, given that some event has happened.
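That conditional-probability expression is Bayes’ rule: P(H|E) = P(E|H)·P(H) / [P(E|H)·P(H) + P(E|¬H)·P(¬H)]. A minimal sketch with hypothetical numbers of my own choosing (not an example from the book):

```python
def bayes(prior, likelihood, false_alarm):
    """Posterior P(H|E) via Bayes' rule, where `likelihood` is P(E|H)
    and `false_alarm` is P(E|not H)."""
    evidence = likelihood * prior + false_alarm * (1 - prior)
    return likelihood * prior / evidence

# A hypothesis held with 30% prior credence; the observed evidence is
# 80% likely if the hypothesis is true, but only 20% likely otherwise.
posterior = bayes(prior=0.30, likelihood=0.80, false_alarm=0.20)
print(round(posterior, 3))  # 0.632
```

One piece of moderately diagnostic evidence moves a 30% belief to about 63%, not to certainty; repeated application of the rule is exactly the "update as new information comes in" loop described above.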
5) The road to wisdom is to be less and less wrong.
Forecasting, or at least operating in an uncertain environment, is an iterative process.
Nate Silver titles one of the chapters “Less and Less Wrong,” as an homage to the Danish mathematician, scientist, inventor, and poet Piet Hein, author of Grooks:
The road to wisdom? — Well, it’s plain
and simple to express:
Err
and err
and err again
but less
and less
and less.
George Soros treats developments in financial markets as a historical process. In The Alchemy of Finance, he outlines his theory of reflexivity, discusses historical developments in markets, and describes a real time “experiment” he undertook while running the Quantum fund in the 1980s.
Markets are an ideal laboratory for testing theories: changes are expressed in quantitative terms, and the data are easily accessible.
Three of the key interrelated concepts in his framework are anti-equilibrium, imperfect knowledge, and reflexivity.
In markets, equilibrium is a very rare special case. Further, adjustments rarely lead to a new equilibrium. The economy is always in adjustment.
According to George Soros:
If we want to understand the real world we must divert our gaze from a hypothetical final outcome, and concentrate our attention on the process of change that we observe all around us.
In trying to deal with macroeconomic developments, equilibrium analysis is totally inappropriate. Nothing could be further removed from reality than the assumption that participants base their decisions on perfect knowledge. People are groping to anticipate the future with the help of whatever guideposts they can establish. The outcome tends to diverge from expectations, leading to constantly changing expectations and constantly changing outcomes. The process is reflexive.
The stock market is, of course, a perfect example:
The concept of an equilibrium seems irrelevant at best and misleading at worst. The evidence shows persistent fluctuations, whatever length of time is chosen as the period of observation. Admittedly, the underlying conditions that are supposed to be reflected in stock prices are also constantly changing, but it is difficult to establish any firm relationship between changes in stock prices and changes in underlying conditions. Whatever relationship can be established has to be imputed rather than observed.
So it’s better to focus on the nature and direction of ongoing adjustments, rather than trying to identify an equilibrium.
Perhaps even more problematic than the exclusive focus on rarely occurring equilibrium conditions is the assumption of perfect knowledge. Perfect knowledge is impossible. Everything is a provisional hypothesis, subject to improvement. Soros makes the bias of market participants the centerpiece of his analysis.
In the natural sciences, the thinking of participants and the events themselves can usually be separated. However, when people are involved, there is an interplay between thoughts and actions. There is a partial analogy to Heisenberg’s uncertainty principle. The basic deductive-nomological approach of science is inadequate here; probabilistic generalization, or some other novel scientific method, is preferable.
Thinking plays a dual role. On the one hand, participants seek to understand the situation in which they participate; on the other, their understanding serves as the basis of decisions which influence the course of the events. The two roles interfere with each other.
The influence of this idea is inseparable from the theory of imperfect knowledge.
The participants’ perceptions are inherently flawed, and there is a two-way connection between flawed perceptions and the actual course of events, which results in a lack of correspondence between the two.
This two-way connection is what Soros calls “reflexivity.”
The thinking of participants, exactly because it is not governed by reality, is easily influenced by theories. In the field of natural phenomena, scientific method is effective only when its theories are valid; but in social, political, and economic matters, theories can be effective without being valid.
“Effective” here means having an impact. For example, in a bubble, the cost of capital for some companies drops absurdly low relative to the risk of their respective enterprises. Consequently, some businesses that would otherwise have died may go on to survive. (An example from two decades after The Alchemy of Finance was written: Peter Thiel mentions, in an interview in Inside the House of Money, that PayPal did a massive capital raise right at the height of the tech bubble, even though it didn’t need the money at the time.) On the flip side, a depression can be self-fulfilling if businesses are unable to refinance.
This seems to be especially true in the credit markets:
Loans are based on the lender’s estimation of the borrower’s ability to service his debt. The valuation of the collateral is supposed to be independent of the act of lending; but in actual fact the act of lending can affect the value of the collateral. This is true of the individual case and of the economy as a whole. Credit expansion stimulates the economy and enhances collateral values; the repayment or contraction of credit has a depressing influence both on the economy and on the valuation of collateral. The connection between credit and economic activity is anything but constant; for instance, credit for building a new factory has quite a different effect from credit for a leveraged buyout. This makes it difficult to quantify the connection between credit and economic activity. Yet it is a mistake to ignore it.
This is reminiscent of Hyman Minsky’s Financial Instability Hypothesis.
In terms of the stock market, Soros asserts: (1) Markets are always biased in one direction or another. (2) Markets can influence the events that they anticipate.