Category: Philosophy

Why it's wise to think in bets

 

The secret is to make peace with walking around in a world where we recognize that we are not sure and that’s okay. As we learn more about how our brains operate, we recognize that we don’t perceive the world objectively. But our goal should be to try.

Annie Duke's "Thinking in Bets" is basically a long essay with an extremely valuable message. Beneath a plethora of entertaining anecdotes about professional poker, it contains a valuable framework for making decisions in an uncertain world. This requires accepting uncertainty and being intellectually honest. Good decision-making habits compound over time.
Thinking in Bets is a slightly less nerdy and less nuanced complement to pair with "Fortune's Formula". It also fits in well with some of the more important behavioral finance books, such as Misbehaving, The Hour Between Dog and Wolf, and Kluge.

I’ve organized some of my highlights and notes from Thinking in Bets below.

The implications of treating decisions as bets made it possible for me to find learning opportunities in uncertain environments. Treating decisions as bets, I discovered, helped me avoid common decision traps, learn from results in a more rational way, and keep emotions out of the process as much as possible.

Outcome quality vs decision quality

We can get better at separating outcome quality from decision quality, discover the power of saying, “I’m not sure,” learn strategies to map out the future, become less reactive decision-makers, build and sustain pods of fellow truthseekers to improve our decision process, and recruit our past and future selves to make fewer emotional decisions. I didn’t become an always-rational, emotion-free decision-maker from thinking in bets. I still made (and make) plenty of mistakes. Mistakes, emotions, losing—those things are all inevitable because we are human. The approach of thinking in bets moved me toward objectivity, accuracy, and open-mindedness. That movement compounds over time.

Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.

Why are we so bad at separating luck and skill? Why are we so uncomfortable knowing that results can be beyond our control? Why do we create such a strong connection between results and the quality of the decisions preceding them?

Certainty is an illusion

Trying to force certainty onto an uncertain world is a recipe for poor decision making. To improve decision making, learn to accept uncertainty. You can always revise beliefs.

Seeking certainty helped keep us alive all this time, but it can wreak havoc on our decisions in an uncertain world. When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer. We will pound a lot of square pegs into round holes to maintain the illusion of a tight relationship between our outcomes and our decisions.

There are many reasons why wrapping our arms around uncertainty and giving it a big hug will help us become better decision-makers. Here are two of them. First, “I’m not sure” is simply a more accurate representation of the world. Second, and related, when we accept that we can’t be sure, we are less likely to fall

Our lives are too short to collect enough data from our own experience to make it easy to dig down into decision quality from the small set of results we experience.

Incorporating uncertainty into the way we think about our beliefs comes with many benefits. By expressing our level of confidence in what we believe, we are shifting our approach to how we view the world. Acknowledging uncertainty is the first step in measuring and narrowing it. Incorporating uncertainty in the way we think about what we believe creates open-mindedness, moving us closer to a more objective stance toward information that disagrees with us. We are less likely to succumb to motivated reasoning since it feels better to make small adjustments in degrees of certainty instead of having to grossly downgrade from “right” to “wrong.” When confronted with new evidence, it is a very different narrative to say, “I was 58% but now I’m 46%.” That doesn’t feel nearly as bad as “I thought I was right but now I’m wrong.” Our narrative of being a knowledgeable, educated, intelligent person who holds quality opinions isn’t compromised when we use new information to calibrate our beliefs, compared with having to make a full-on reversal. This shifts us away from treating information that disagrees with us as a threat, as something we have to defend against, making us better able to truthseek. When we work toward belief calibration, we become less judgmental.

In an uncertain world, the key to improving is to revise, revise, revise.

Not much is ever certain. Samuel Arbesman’s The Half-Life of Facts is a great read about how practically every fact we’ve ever known has been subject to revision or reversal. We are in a perpetual state of learning, and that can make any prior fact obsolete. One of many examples he provides is about the extinction of the coelacanth, a fish from the Late Cretaceous period. A mass-extinction event (such as a large meteor striking the Earth, a series of volcanic eruptions, or a permanent climate shift) ended the Cretaceous period. That was the end of dinosaurs, coelacanths, and a lot of other species. In the late 1930s and independently in the mid-1950s, however, coelacanths were found alive and well. A species becoming “unextinct” is pretty common. Arbesman cites the work of a pair of biologists at the University of Queensland who made a list of all 187 species of mammals declared extinct in the last five hundred years.

Getting comfortable with this realignment, and all the good things that follow, starts with recognizing that you’ve been betting all along.

The danger of being too smart

The popular wisdom is that the smarter you are, the less susceptible you are to fake news or disinformation. After all, smart people are more likely to analyze and effectively evaluate where information is coming from, right? Part of being “smart” is being good at processing information, parsing the quality of an argument and the credibility of the source. So, intuitively, it feels like smart people should have the ability to spot motivated reasoning coming and should have more intellectual resources to fight it. Surprisingly, being smart can actually make bias worse. Let me give you a different intuitive frame: the smarter you are, the better you are at constructing a narrative.

… the more numerate people (whether pro- or anti-gun) made more mistakes interpreting the data on the emotionally charged topic than the less numerate subjects sharing those same beliefs. “This pattern of polarization . . . does not abate among high-Numeracy subjects.”

It turns out the better you are with numbers, the better you are at spinning those numbers to conform to and support your beliefs. Unfortunately, this is just the way evolution built us. We are wired to protect our beliefs even when our goal is to truthseek. This is one of those instances where being smart and aware of our capacity for irrationality alone doesn’t help us refrain from biased reasoning. As with visual illusions, we can’t make our minds work differently than they do no matter how smart we are. Just as we can’t unsee an illusion, intellect or willpower alone can’t make us resist motivated reasoning.

The Learning Loop

Thinking rationally is largely about revising and refuting beliefs (see the notes on reflexivity below). By going through a learning loop faster, we are able to gain an advantage. This is similar to John Boyd's concept of an OODA loop.

We have the opportunity to learn from the way the future unfolds to improve our beliefs and decisions going forward. The more evidence we get from experience, the less uncertainty we have about our beliefs and choices. Actively using outcomes to examine our beliefs and bets closes the feedback loop, reducing uncertainty. This is the heavy lifting of how we learn.

Chalk up an outcome to skill, and we take credit for the result. Chalk up an outcome to luck, and it wasn’t in our control. For any outcome, we are faced with this initial sorting decision. That decision is a bet on whether the outcome belongs in the “luck” bucket or the “skill” bucket. This is where Nick the Greek went wrong. Think about this like we are an outfielder catching a fly ball with runners on base. Fielders have to make in-the-moment game decisions about where to throw the ball.

Key message: How poker players adjust their play from experience determines how much they succeed. This applies to any competitive endeavor in an uncertain world.

Intellectual Honesty

The best players analyze their performance with extreme intellectual honesty. This means that if they win, they may end up being more focused on errors they made, as told in this anecdote:

In 2004, my brother provided televised final-table commentary for a tournament in which Phil Ivey smoked a star-studded final table. After his win, the two of them went to a restaurant for dinner, during which Ivey deconstructed every potential playing error he thought he might have made on the way to victory, asking my brother’s opinion about each strategic decision. A more run-of-the-mill player might have spent the time talking about how great they played, relishing the victory. Not Ivey. For him, the opportunity to learn from his mistakes was much more important than treating that dinner as a self-satisfying celebration. He earned a half-million dollars and won a lengthy poker tournament over world-class competition, but all he wanted to do was discuss with a fellow pro where he might have made better decisions. I heard an identical story secondhand about Ivey at another otherwise celebratory dinner following one of his now ten World Series of Poker victories. Again, from what I understand, he spent the evening discussing in intricate detail with some other pros the points in hands where he could have made better decisions. Phil Ivey, clearly, has different habits than most poker players—and most people in any endeavor—in how he fields his outcomes. Habits operate in a neurological loop consisting of three parts: the cue, the routine, and the reward. A habit could involve eating cookies: the cue might be hunger, the routine going to the pantry and grabbing a cookie, and the reward a sugar high. Or, in poker, the cue might be winning a hand, the routine taking credit for it, the reward a boost to our ego. Charles Duhigg, in The Power of Habit, offers the golden rule of habit change….

Being in an environment where the challenge of a bet is always looming works to reduce motivated reasoning. Such an environment changes the frame through which we view disconfirming information, reinforcing the frame change that our truthseeking group rewards. Evidence that might contradict a belief we hold is no longer viewed through as hurtful a frame. Rather, it is viewed as helpful because it can improve our chances of making a better bet. And winning a bet triggers a reinforcing positive update.

Note: intellectual honesty and thinking clearly = thinking in bets

Good decisions compound

One useful model is to view everything as one big, long poker game. That way, the result of individual games won't upset you so much. Furthermore, good decision-making habits compound over time. So the key is to always be developing good long-term habits, even as you deal with the challenges of a specific game.

The best poker players develop practical ways to incorporate their long-term strategic goals into their in-the-moment decisions. The rest of this chapter is devoted to many of these strategies designed to recruit past- and future-us to help with all the execution decisions we have to make to reach our long-term goals. As with all the strategies in this book, we must recognize that no strategy can turn us into perfectly rational actors. In addition, we can make the best possible decisions and still not get the result we want. Improving decision quality is about increasing our chances of good outcomes, not guaranteeing them. Even when that effort makes a small difference—more rational thinking and fewer emotional decisions, translated into an increased probability of better outcomes—it can have a significant impact on how our lives turn out. Good results compound. Good processes become habits, and make possible future calibration and improvement.

At the very beginning of my poker career, I heard an aphorism from some of the legends of the profession: “It’s all just one long poker game.” That aphorism is a reminder to take the long view, especially when something big happened in the last half hour, or the previous hand—or when we get a flat tire. Once we learn specific ways to recruit past and future versions of us to remind ourselves of this, we can keep the most recent upticks and downticks in their proper perspective. When we take the long view, we’re going to think in a more rational way.

Life, like poker, is one long game, and there are going to be a lot of losses, even after making the best possible bets. We are going to do better, and be happier, if we start by recognizing that we’ll never be sure of the future. That changes our task from trying to be right every time, an impossible job, to navigating our way through the uncertainty by calibrating our beliefs to move toward, little by little, a more accurate and objective representation of the world. With strategic foresight and perspective, that’s manageable work. If we keep learning and calibrating, we might even get good at it.

The value of improvisation and informal processes

Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed summarizes the key dangers of centrally managed social engineering projects. It's a bit dense, but well worth it. It shows the similarities between many seemingly different disasters caused by top-down control and central planning. Case studies include modernist architecture, Soviet collectivization, the herding of rural people into villages in Africa, and early errors in scientific agriculture.

Anyone trying to build and manage an organization needs to be aware of the lessons in this book.

One key lesson is that practical knowledge, informal processes, and improvisation in the face of unpredictability are indispensable.

“The formal scheme was parasitic on informal processes that, alone, it could not create or maintain. To the degree that the formal scheme made no allowance for those processes or actually suppressed them, it failed both its intended beneficiaries and ultimately its designers as well.”

Radically simplified designs for social organization seem to court the same risks of failure courted by radically simplified designs for natural environments.

It makes the case for the resilience of both social and natural diversity, and a strong case for the limits of what can be known about complex social orders. Avoid reductive social science.

Four elements of centrally planned disasters

According to the book, four elements are necessary for a full-fledged disaster caused by state-initiated social engineering:

  • The administrative ordering of nature and society
  • “High modernist ideology”: a faith that borrowed the legitimacy of science and technology; an uncritical, unskeptical, and therefore unscientific optimism about the possibilities for comprehensive planning of human settlement and production, often expressed in aesthetic terms as well
  • An authoritarian state willing to use the full weight of its coercive power to realize the second element
  • A prostrate civil society lacking the capacity to resist these plans

“By themselves they are unremarkable tools of modern statecraft; they are as vital to the maintenance of our welfare and freedom as they are to the designs of a would-be modern despot. They undergird the concept of citizenship and the provision of social welfare just as they might undergird a policy of rounding up undesirable minorities.”

Overlooking ecology

The discussion of the ecological disasters caused by forestry regulation in 18th and 19th century Germany is instructive:

The metaphorical value of this brief account of scientific production forestry is that it illustrates the dangers of dismembering an exceptionally complex and poorly understood set of relations and processes in order to isolate a single element of instrumental value. The instrument, the knife, that carved out the new, rudimentary forest was the razor-sharp interest in the production of a single commodity. Everything that interfered with the efficient production of the key commodity was implacably eliminated. Everything that seemed unrelated to efficient production was ignored. Having come to see the forest as a commodity, scientific forestry set about refashioning it as a commodity machine. Utilitarian simplification in the forest was an effective way of maximizing wood production in the short and intermediate term. Ultimately, however, its emphasis on yield and paper profits, its relatively short time horizon, and, above all, the vast array of consequences it had resolutely bracketed came back to haunt it.

Department of unintended consequences

I like to joke about wanting to start a department of unintended consequences to oversee economic policy. Central planners often fail because they arrogantly fail to foresee the unintended consequences of their policies.

“The door and window tax established in France under the Directory and abolished only in 1917 is a striking case in point. Its originator must have reasoned that the number of windows and doors in a dwelling was proportional to the dwelling’s size. Thus a tax assessor need not enter the house or measure it but merely count the doors and windows. As a simple, workable formula, it was a brilliant stroke, but it was not without consequences. Peasant dwellings were subsequently designed or renovated with the formula in mind so as to have as few openings as possible. While the fiscal losses could be recouped by raising the tax per opening, the long-term effects on the health of the rural population lasted for more than a century.”

See also: Goodhart’s Law

Department of unintended consequences

To: Democrats, Republicans

CC: Sportsmanship, Books

Re: Department of Unintended Consequences

I don’t talk politics around here much but…

One thing the government needs is a “Department of Unintended Consequences”, or Unintended Consequences Ministry. This department will be in charge of analyzing the potential unintended consequences of any proposed policy put forth by any government department. I suggest that it hire the best computer engineers to help build simulations using the most advanced AI and game-simulation techniques. Perhaps this Unintended Consequences Ministry will help the broader government avert misguided actions, avoid harmful long-term consequences, and identify prudent courses of action.

This critical department will start out with a small budget. But funds are tight, so other departments may have to make a few small cuts.

This department may rise in importance to become a fourth component of the balance of power in the American system. I volunteer to be head of this department, and will accept a market compensation package.

Sincerely,

Paul C Wonk

Optimizing An Organized Mind

How can one maximize mental performance? The Organized Mind: Thinking Straight in an Age of Information Overload by Daniel Levitin works toward an answer to this question. The book's ideas on offloading things to external systems and its organizational techniques are very similar to those in David Allen's Getting Things Done. However, The Organized Mind provides much more historical and scientific background and context. Further, The Organized Mind avoids being overly prescriptive, and instead gives readers ideas on how to best optimize for their own situation.

Some of my highlights on the key themes of the book:

Getting the mind into the right mode

One useful framework that the book develops is the idea of the mind as functioning in different modes. An important component of high performance is the ability to use the right mode at the right time.

There are four components in the human attention system: the mind-wandering mode, the central executive mode, the attention filter, and the attention switch, which directs neural and metabolic resources among the mind-wandering, stay-on-task, or vigilance modes.

Remember that the mind-wandering mode and the central executive work in opposition and are mutually exclusive states; they’re like the little devil and angel standing on opposite shoulders, each trying to tempt you. While you’re working on one project, the mind-wandering devil starts thinking of all the other things going on in your life and tries to distract you. Such is the power of this task-negative network that those thoughts will churn around in your brain until you deal with them somehow. Writing them down gets them out of your head, clearing your brain of the clutter that is interfering with being able to focus on what you want to focus on. As Allen notes, “Your mind will remind you of all kinds of things when you can do nothing about them.”

The task-negative or mind-wandering mode is responsible for generating much useful information, but so much of it comes at the wrong time.

Creativity involves the skillful integration of this time-stopping daydreaming mode and the time-monitoring central executive mode.

Insights into how human memory works

The book delineates the nuances of human memory by comparing it to systems in the physical world.

Being able to access any memory regardless of where it is stored is what computer scientists call random access. DVDs and hard drives work this way; videotapes do not. You can jump to any spot in a movie on a DVD or hard drive by “pointing” at it. But to get to a particular point in a videotape, you need to go through every previous point first (sequential access). Our ability to randomly access our memory from multiple cues is especially powerful. Computer scientists call it relational memory. You may have heard of relational databases— that’s effectively what human memory is.

Having relational memory means that if I want to get you to think of a fire truck, I can induce the memory in many different ways. I might make the sound of a siren, or give you a verbal description (“ a large red truck with ladders on the side that typically responds to a certain kind of emergency”).
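
To make the random-access, multiple-cue idea concrete, here is a minimal Python sketch of my own (not from the book): several cues index the same record, so any one of them retrieves the "memory" directly, whereas a sequential scan has to walk every record first. The cues and records are invented purely for illustration.

```python
# Minimal sketch (not from the book): cue-based "relational" lookup vs. sequential scan.
# The cues and records below are invented purely for illustration.

memories = [
    {"id": 1, "thing": "fire truck", "cues": {"siren", "large red truck", "ladders", "emergency"}},
    {"id": 2, "thing": "beach trip", "cues": {"sand", "waves", "sunscreen"}},
]

# Sequential access: scan every record until one matches (like rewinding a videotape).
def sequential_lookup(cue):
    for record in memories:
        if cue in record["cues"]:
            return record["thing"]
    return None

# Random / relational access: index every cue up front so any cue jumps straight to the record.
cue_index = {}
for record in memories:
    for cue in record["cues"]:
        cue_index.setdefault(cue, set()).add(record["thing"])

print(sequential_lookup("siren"))   # fire truck
print(cue_index.get("ladders"))     # {'fire truck'} -- a different cue, same memory
print(cue_index.get("emergency"))   # {'fire truck'}
```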

This feature can lead to either valuable insights or being overwhelmed, depending on how it is controlled:

If you are trying to retrieve a particular memory, the flood of activations can cause competition among different nodes, leaving you with a traffic jam of neural nodes trying to get through to consciousness, and you end up with nothing.

Categorization is key to mental functioning.

This ability to recognize diversity and organize it into categories is a biological reality that is absolutely essential to the organized human mind.

Shift burdens to external systems

You might say categorizing and externalizing our memory enables us to balance the yin of our wandering thoughts with the yang of our focused execution.


Is it really necessary to have a meeting?

A lot of time and money is wasted on unnecessary corporate meetings. Since the early days of Amazon, Jeff Bezos has taken a unique approach to meetings.


At a management offsite in the late 1990s, a team of well-intentioned junior executives stood up before top brass and gave a presentation on a problem indigenous to all large organizations: the difficulty of coordinating far-flung divisions. The junior executives recommended a variety of different techniques to foster cross-group dialogue and afterward seemed proud of their own ingenuity. Then Jeff Bezos, his face red and the blood vessel in his forehead pulsating, spoke up.

“I understand what you are saying, but you are completely wrong,” he said.

“Communication is a sign of dysfunction. It means people aren’t working together in a close, organic way. We should be trying to figure out a way for teams to communicate less with each other, not more.”

…At that meeting and in public speeches afterward, Bezos vowed to run Amazon with an emphasis on decentralization and independent decision-making. “A hierarchy isn’t responsive enough to change,” he said. “I’m still trying to get people to do occasionally what I ask. And if I was successful, maybe we wouldn’t have the right kind of company.”

Bezos’s counterintuitive point was that coordination among employees wasted time, and that the people closest to problems were usually in the best position to solve them. That would come to represent something akin to the conventional wisdom in the high-tech industry over the next decade. The companies that embraced this philosophy, like Google, Amazon, and, later, Facebook, were in part drawing lessons from theories about lean and agile software development. In the seminal high-tech book The Mythical Man-Month, IBM veteran and computer science professor Frederick Brooks argued that adding manpower to complex software projects actually delayed progress. One reason was that the time and money spent on communication increased in proportion to the number of people on a project.
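
Brooks's communication argument can be made concrete with a back-of-the-envelope calculation (my own sketch, not from the book): if every pair of people on a project needs a communication channel, the number of channels grows as n(n-1)/2, so coordination cost rises roughly with the square of the team size.

```python
# Sketch of Brooks's communication-overhead argument: pairwise channels grow
# quadratically with team size, so adding people adds coordination cost fast.

def communication_channels(n):
    """Number of distinct pairs among n people: n * (n - 1) / 2."""
    return n * (n - 1) // 2

for team_size in (3, 10, 50, 200):
    print(team_size, "people ->", communication_channels(team_size), "channels")
# 3 people -> 3 channels
# 10 people -> 45 channels
# 50 people -> 1225 channels
# 200 people -> 19900 channels
```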

When you do have a meeting, make it useful

Of course, some meetings are necessary. There is value in the cross-pollination of thoughts among intelligent people, and some processes do require explicit coordination and discussion. In practice, however, many hours are wasted on routine updates, grandstanding, and “thinking out loud”. To ensure meetings were productive, Bezos required the person leading a meeting to write detailed prose explaining their thoughts. The first half hour or so of every meeting would be silent reading time. This ensured everyone thought deeply and expressed complete thoughts cogently.

Meetings no longer started with someone standing up and commanding the floor as they had previously at Amazon and everywhere else throughout the corporate land. Instead, the narratives were passed out and everyone sat quietly reading the document for fifteen minutes—or longer. At the beginning, there was no page limit, an omission that Diego Piacentini recalled as “painful” and that led to several weeks of employees churning out papers as long as sixty pages. Quickly there was a supplemental decree: a six-page limit on narratives, with additional room for footnotes.

Education of a Wandering Man: The Ultimate Autodidact

Louis L’Amour was an autodidact’s autodidact. John Wayne called him the most interesting man in the world. L’Amour spent the first couple of decades of his adulthood wandering across the country, and around the world, doing odd jobs and obsessively reading whatever he could find. Only much later did he become a famous novelist. Education of a Wandering Man is a quasi-autobiography in which he describes the trajectory of his life, and the evolution of his thinking, in terms of the places he traveled and the books he read.

L’Amour spent years as a hobo, hopping trains from town to town, working various jobs. In each town he would visit the local library.

It’s important to note that, unlike a bum, a hobo is ready and willing to work.

To properly understand the situation in America before the Depression, one must realize there was great demand for seasonal labor, and much of this was supplied by men called hoboes.
Over the years the terms applied to wanderers have been confused until all meaning has been lost. To begin with, a bum was a local man who did not want to work. A tramp was a wanderer of the same kind, but a hobo was a wandering worker and essential to the nation’s economy.

…Many hoboes would start working the harvest in Texas, and follow the ripening grain north through Oklahoma, Kansas, and Nebraska into the Dakotas. During harvest season, when the demand for farm labor was great, the freight trains permitted the hoboes to ride, as the railroads were to ship the harvested grain, and it was in their interest to see that labor was provided.

 

He also worked on merchant ships, and traveled throughout Asia and most of the world. He would find books for free or cheap wherever he went, reading 100+ books per year.  For example:

Byron’s Don Juan I read on an Arab dhow sailing north from Aden up the Red Sea to Port Tewfik on the Suez Canal. Boswell’s The Life of Samuel Johnson I read while broke and on the beach in San Pedro. In Singapore, I came upon a copy of Annals and Antiquities of Rajasthan by James Tod.

Although he never earned formal degrees, L’Amour understood the value of books and knowledge:

Books are precious things, but more than that, they are the strong backbone of civilization. They are the thread upon which it all hangs, and they can save us when all else is lost.
…Knowledge is like money: To be of value it must circulate, and in circulating it can increase in quantity and, hopefully, in value.

He wrote 89 novels, and clearly a lot of ideas came from paying close attention when he travelled:

People are forever asking me where I get my ideas, but one has only to listen, to look, and to live with awareness. As I have said in several of my stories, all men look, but so few can see. It is all there, waiting for any passerby.
… for a writer, everything is grist for the mill, and a writer cannot know too much. Sooner or later everything he does know will find its uses.

As with reading, L’Amour never let the challenges of a transient lifestyle interfere with writing:

“I began my writing in ship’s fo’c’sles, bunkhouses, hotel rooms- wherever I could sit down with a pen and something to write on.”

L’Amour also spent time boxing in various small towns, and coaching other fighters. I’ve seen references online to a 51-8 professional record, although I wasn’t able to verify it.

In the later years of his life, L’Amour spent more time in his personal library. His deep knowledge of the world gave him perspective:

Surely, the citizens and the rulers of Babylon and Rome did not see themselves as a passing phase. Each in its time believed it was the end-all of the world’s progression. I have no such feeling. Each age is a day that is dying, each one a dream that is fading.

Goodhart’s law and the fall of Nick Schorsch: The infamous mousepad

The House of Cards that Nick Schorsch built was destined to collapse for a variety of reasons. But what started the demise was then-CFO of ARCP Brian Block just making up some numbers in a spreadsheet. This led to ARCP revealing a $23 million accounting misstatement. After that it became nearly impossible for the non-traded programs to raise new capital, and a whole slew of bad behavior and examples of egregious mismanagement soon came to light (I’ve highlighted examples of their questionable corporate governance before). ARCP changed its name to Vereit, but the whole American Realty Capital complex of affiliated entities that depended on new fundraising would never recover.

ARCP’s culture was obsessively focused on achieving financial projections, especially for adjusted funds from operations (AFFO), a preferred Wall Street metric for REITs. According to Investment News:

In fact, the company gave employees computer mouse pads with 2014 AFFO guidance on them. “AFFO per share greater than $1.16,” the computer mousepad declared. “First believe it, then achieve it.”

I was able to independently verify the existence of this infamous mousepad. Here is a (deliberately obscured) photo:

Nick Schorsch designed this mousepad, which specified the AFFO target for the company.

This mousepad is a manifestation of “Goodhart’s Law” in action. Named after economist Charles Goodhart, this states that

When a measure becomes a target, it ceases to be reliable.

Goodhart’s law is very similar to “Campbell’s Law” named after social scientist Donald Campbell. Campbell’s law states:

The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.

When people are incentivized to achieve one metric above all else, their behavior will result in the number ceasing to have its original meaning. Goodhart’s law was originally used to describe how monetary policy targets led to distortion. Recent examples of this phenomenon include the reclassification of crimes to reduce crime statistics, and the abuse of academic citations. In Capital Returns: Investing Through the Capital Cycle, Edward Chancellor highlighted Goodhart’s law as a reason not to conduct investment analysis based exclusively on the single metric of earnings per share growth. The ARCP incident certainly wasn’t the first time that Goodhart’s law led people to fudge the accounting numbers.

Goodhart’s law inevitably leads to a waste of resources. One example from Soviet nail factories illustrates this in a big way:

The goal of central planners was to measure performance of the factories, so factory operators were given targets around the number of nails produced. To meet and exceed the targets, factory operators produced millions of tiny, useless nails. When targets were switched to the total weight of nails produced, operators instead produced several enormous, heavy and useless nails.
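
A toy calculation of my own (not from any of the sources cited here) makes the mechanism explicit: whichever single proxy the planner scores, the "optimal" factory output drifts away from what was actually wanted. The nail designs, labor budget, and weights below are invented for illustration.

```python
# Toy illustration of Goodhart's law using the Soviet nail story.
# All numbers are invented; the point is only that optimizing a single proxy
# metric pushes output away from the real (unmeasured) goal of useful nails.

nail_designs = {
    "tiny pin":    {"hours_per_nail": 0.001, "weight_kg": 0.001, "useful": False},
    "normal nail": {"hours_per_nail": 0.01,  "weight_kg": 0.01,  "useful": True},
    "giant spike": {"hours_per_nail": 1.0,   "weight_kg": 5.0,   "useful": False},
}

LABOR_HOURS = 1000  # invented monthly labor budget

def best_design(metric):
    """Design the factory would choose to maximize the given proxy metric."""
    def score(item):
        _, d = item
        nails = LABOR_HOURS / d["hours_per_nail"]
        return nails if metric == "count" else nails * d["weight_kg"]
    return max(nail_designs.items(), key=score)[0]

print(best_design("count"))   # tiny pin    -- target: number of nails
print(best_design("weight"))  # giant spike -- target: total weight
# Under either target the useful design ("normal nail") is never chosen.
```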

Beyond just reclassifying or forging numbers, and producing useless nails, incentives distorted by the emphasis on single metrics can have even scarier effects:

During British colonial rule of India, the government began to worry about the number of venomous cobras in Delhi, and so instituted a reward for every dead snake brought to officials. Indian citizens dutifully complied and began breeding venomous snakes to kill and bring to the British. By the time the experiment was over, the snake problem was worse than when it began. The Raj government had gotten exactly what it asked for.

To avoid the trap of Goodhart’s law or Campbell’s law, managers (and investment analysts) need to think deeply about what is measured, take multiple factors into consideration, and never rely too much on any individual metric. Failing to consider Goodhart’s law can be fatal for investments.

Backgammon and Life Philosophy

Backgammon: The Cruelest Game provides a guide to some key principles of backgammon, and contains analysis of several games between top players. It also gets philosophical about the vicissitudes of randomness that make backgammon so challenging and intriguing:

From the start there is a complicated interplay of possibilities, probabilities, good fortune and bad, which influences every facet of the game. In backgammon, to seek position is to take certain calculated risks, and because all players are ruled by the dictates of the dice – or by chance, which Karl von Clausewitz, the nineteenth-century military theorist, described as “an agency indifferent to the actor’s preference for the outcomes” – no player is ever in control of his particular destiny. One of the game’s chief tactics, then, is to shield oneself against the dice. The player with the strongest position can withstand a greater number of unfavorable rolls, or “bad luck,” than can the more weakly protected player, who, because he failed to protect himself, is more easily assaulted and overrun.

Nonetheless, no matter how cunningly you play, you are virtually always vulnerable. One unexpected horror roll can undermine the best positions, and derange the most sensible of plans; this is both the charm and the frustration of the game. The best players know they must employ the craftiest of tactics, not because of the dice, but in spite of them. It is the enormously high luck factor in backgammon that causes it to be a game of skill. Without luck or accident, the game would not only be monotonous, but infinitely less skillful.

In backgammon, to be skillful is to be self-protective. At any given point in the game, the better players are aware of Murphy’s Law, which states that if anything can go wrong, it will. Given the whimsical nature of the dice, all players have a chance in the game, but some players have more chances than others, because they have created an environment in which the more propitious result is more likely to occur.

In backgammon, an understanding of the correct percentage moves in specific situations qualifies as “inside information” and will enable you to win in the long run. But not every time, alas, and often not even in what you believe to be crucial games. This condition must be accepted philosophically, of course, and should not deter you from continuing a detailed study of the game.
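
To make "correct percentage moves" concrete, here is a small calculation of my own (not from the book): the chance that a lone checker at a given distance is hit, counted over the 36 equally likely rolls of two dice. It deliberately ignores the extra moves that doubles allow and any blocked intermediate points, so full backgammon tables run slightly higher for some distances.

```python
# A small shot-probability calculation (my own, not from the book).
# Simplification: a hit needs one die equal to the distance, or the two dice
# summing to it; doubles' extra moves and blocked points are ignored.

from itertools import product

def hit_chance(distance):
    hits = 0
    for d1, d2 in product(range(1, 7), repeat=2):
        if distance in (d1, d2, d1 + d2):
            hits += 1
    return hits

for dist in (1, 6, 7, 12):
    hits = hit_chance(dist)
    print(f"distance {dist}: {hits}/36 = {hits/36:.0%}")
# distance 1: 11/36 = 31%
# distance 6: 16/36 = 44%  (17/36 in full counting, adding double 2s)
# distance 7: 6/36 = 17%
# distance 12: 1/36 = 3%   (3/36 in full counting, adding 3-3 and 4-4)
```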

Quick Thoughts on The Signal and the Noise

Learning to think probabilistically is one of the most critical skills one can master. Nate Silver’s The Signal and the Noise: Why So Many Predictions Fail–but Some Don’t is a valuable book on thinking probabilistically and forecasting in an uncertain environment. It compares and contrasts examples across multiple disciplines, including weather forecasting, seismology, finance, and more.

This book pairs well with Against the Gods, Fortune’s Formula, and Superforecasting. Against the Gods is, in my opinion, the most important book on the development of probabilistic thinking. Early civilizations were good with geometry and logic, but helpless with uncertainty. Ironically, it was gamblers and heretics who moved mankind forward by developing the science of probability, statistics, and ultimately risk management. Fortune’s Formula shows the connection between information theory, gambling, and correct position sizing for investors. It helps answer the question: when you have a slight edge, how much should you bet? Nate Silver draws heavily on Philip Tetlock’s forecasting research, later popularized in Superforecasting. Particularly important is the idea of “foxes and hedgehogs”. Foxes are multidisciplinary, adaptable, self-critical, tolerant of complexity, cautious, and empirical. In contrast, hedgehogs are specialized, stalwart, stubborn, order-seeking, confident, and ideological. As you might expect, foxes make far better forecasters than hedgehogs, even though hedgehogs make for better television.
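
For the position-sizing question ("when you have a slight edge, how much should you bet?"), Fortune's Formula covers the Kelly criterion, which for a simple binary bet has the closed form f* = (bp − q)/b, where b is the net odds received, p the probability of winning, and q = 1 − p. A minimal sketch with illustrative numbers of my own:

```python
# Minimal sketch of the Kelly criterion for a simple binary bet (illustrative
# numbers, not from the book): f* = (b*p - q) / b is the fraction of bankroll
# to wager, where b = net odds received, p = win probability, q = 1 - p.

def kelly_fraction(p, b):
    """Fraction of bankroll to bet with win probability p and net odds b:1."""
    q = 1 - p
    return (b * p - q) / b

# Even-money bet (b = 1) with a slight 53% edge:
print(round(kelly_fraction(0.53, 1.0), 4))   # 0.06 -> bet 6% of bankroll
# No edge (fair coin at even money) -> bet nothing:
print(round(kelly_fraction(0.50, 1.0), 4))   # 0.0
```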

Anyway, here are a few key insights from my notes on The Signal and the Noise:

1) Data is useless without context.

There are always patterns to find in data, but it’s critical to understand the theory behind the system you are studying to avoid being fooled by noise. This is true in forecasting the weather, investing, betting on sports, or any other probabilistic endeavor. The ability to understand context is also a critical advantage humans have over computer programs.

“Statistical inferences are much stronger when backed up by theory or at least some deeper thinking about their root causes.”

The importance of understanding context comes to the forefront when you compare humans’ success with weather forecasting versus their relative failure with earthquake forecasting.

“Chaos theory is a demon that can be tamed – weather forecasters did so, at least in part. But weather forecasters have a much better theoretical understanding of the earth’s atmosphere than seismologists do of the earth’s crust. They know, more or less, how weather works, right down to the molecular level. Seismologists don’t have that advantage.”

The ability to understand context is what separates success from failure in all pursuits dealing with uncertainty. The profile of professional sports gambler Bob Voulgaris is highly instructive. Voulgaris focuses on NBA basketball. A key insight is that Voulgaris has powerful tools for analyzing data, and he makes good use of the data, but he also has a deep understanding of the qualitative subtleties of how NBA basketball works. Obvious statistical patterns are quickly incorporated into betting lines, whether they are signal or noise. Voulgaris looks deeper, and finds places where the line misprices true probabilities.

“Finding patterns is easy in any data-rich environment; that’s what mediocre gamblers do. The key is in determining whether the patterns represent noise or signal.”

2) Beware of overconfidence

“… the amount of confidence someone expresses in a prediction is not a good indication of its accuracy; to the contrary, these qualities are often inversely correlated.”

3) Think big, and think small. Mix the macro and the micro.

“Good innovators typically think very big, and they think very small. New ideas are sometimes found in the most granular of details where few others bother to look. And they are sometimes found when you are doing your most abstract and philosophical thinking, considering why the world is the way that it is and whether there might be an alternative to the dominant paradigm.”

This is reminiscent of the “global micro” approach used by several managers profiled in Inside the House of Money: Top Hedge Fund Traders on Profiting in the Global Markets.

4) Recognize the Value of Bayesian Thinking

The work of Thomas Bayes forms the framework underlying how good gamblers think.

Bayes was an English minister who argued in his theological work that admitting our own imperfections is a necessary step on the way to redemption. His most famous work, however, was “An Essay toward Solving a Problem in the Doctrine of Chances,” which was not published until after his death. One interpretation of the essay concerns a person who emerges into the world (i.e., Adam, or someone from Plato’s cave) and sees the sun rise for the first time:

“At first, he does not know whether this is typical or some sort of freak occurrence. However, each day that he survives and the sun rises again, his confidence increases that it is a permanent feature of nature. Gradually, through this purely statistical form of inference, the probability that he assigns to his prediction that the sun will rise again tomorrow approaches (although never exactly reaches) 100 percent.”

In essence, beliefs on probability are updated as new information comes in.

Ironically, Bayes’s philosophical work was extended by the mathematician and astronomer Pierre-Simon Laplace, who was likely an atheist. Although Laplace believed in scientific determinism, he was frustrated by the disconnect between what he believed to be the perfection of nature and human imperfections in understanding it, in particular with regard to astronomical observations. Consequently, he developed measuring techniques that relied on probabilistic inferences rather than exact measurements. “Laplace came to view probability as a waypoint between ignorance and knowledge.” The combined work of Laplace and Bayes led to a simple expression concerned with conditional probability. In essence, Bayesian math can be used to tell us the probability that a theory or hypothesis is true if some event has happened.
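
A minimal sketch of that conditional-probability expression, with illustrative numbers of my own (not Silver's): Bayes' rule updates a prior belief in a hypothesis after observing an event, and Laplace's rule of succession turns the sunrise intuition into a formula.

```python
# Minimal sketch (illustrative numbers, not from the book) of the two ideas above.

def bayes_posterior(prior, p_event_given_h, p_event_given_not_h):
    """P(H | event) via Bayes' rule."""
    p_event = prior * p_event_given_h + (1 - prior) * p_event_given_not_h
    return prior * p_event_given_h / p_event

# A 50% prior belief, updated after evidence three times as likely if the
# hypothesis is true than if it is false:
print(round(bayes_posterior(0.5, 0.75, 0.25), 2))   # 0.75

# Laplace's rule of succession: after n sunrises with no failures,
# P(sun rises tomorrow) = (n + 1) / (n + 2), approaching but never reaching 1.
def rule_of_succession(successes, trials):
    return (successes + 1) / (trials + 2)

for days in (1, 10, 10000):
    print(days, "->", round(rule_of_succession(days, days), 4))
# 1 -> 0.6667,  10 -> 0.9167,  10000 -> 0.9999
```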

5) The road to wisdom is to be less and less wrong.

Forecasting, or at least operating in an uncertain environment, is an iterative process.

Nate Silver titles one of the chapters “Less and Less Wrong,” as an homage to the Danish mathematician, scientist, inventor, and poet Piet Hein, author of Grooks:

The road to wisdom? — Well, it’s plain
and simple to express:
Err
and err
and err again
but less
and less
and less.

 

Disequilibrium Analysis

George Soros treats developments in financial markets as a historical process. In The Alchemy of Finance, he outlines his theory of reflexivity, discusses historical developments in markets, and describes a real time “experiment” he undertook while running the Quantum fund in the 1980s.

Markets are an ideal laboratory for testing theories: changes are expressed in quantitative terms, and the data are easily accessible.

Three of the key interrelated concepts in his framework are disequilibrium, imperfect knowledge, and reflexivity.

Disequilibrium

In markets, equilibrium is a very rare special case. Further, adjustments rarely lead to new equilibrium. The economy is always in adjustment.

According to George Soros:

If we want to understand the real world we must divert our gaze from a hypothetical final outcome, and concentrate our attention on the process of change that we observe all around us.

In trying to deal with macroeconomic developments, equilibrium analysis is totally inappropriate. Nothing could be further removed from reality than the assumption that participants base their decisions on perfect knowledge. People are groping to anticipate the future with the help of whatever guideposts they can establish. The outcome tends to diverge from expectations, leading to constantly changing expectations, and constantly changing outcomes. The process is reflexive.

The stock market is, of course, a perfect example:

The concept of an equilibrium seems irrelevant at best and misleading at worst. The evidence shows persistent fluctuations, whatever length of time is chosen as the period of observation. Admittedly, the underlying conditions that are supposed to be reflected in stock prices are also constantly changing, but it is difficult to establish any firm relationship between changes in stock prices and changes in underlying conditions. Whatever relationship can be established has to be imputed rather than observed.

So it’s better to focus on the nature and direction of ongoing adjustments, rather than trying to identify an equilibrium.

Imperfect Knowledge

Perhaps even more problematic than the exclusive focus on rarely occurring equilibrium conditions is the assumption of perfect knowledge. Perfect knowledge is impossible. Everything is a provisional hypothesis, subject to improvement. Soros makes the bias of market participants the centerpiece of his analysis.

Reflexivity

In the natural sciences, the thinking of participants and the events themselves can usually be separated. However, when people are involved, there is interplay between thoughts and actions. There is a partial link to Heisenberg’s uncertainty principle. The basic deductive-nomological approach of science is inadequate here; probabilistic generalizations, or some other novel scientific method, are preferable.

Thinking plays a dual role. On the one hand, participants seek to understand the situation in which they participate; on the other, their understanding serves as the basis of decisions which influence the course of the events. The two roles interfere with each other.

The influence of this idea is inseparable from the theory of imperfect knowledge.

The participants’ perceptions are inherently flawed, and there is a two-way connection between flawed perceptions and the actual course of events, which results in a lack of correspondence between the two.

This two-way connection is what Soros called “reflexivity.”

The thinking of participants, exactly because it is not governed by reality, is easily influenced by theories. In the field of natural phenomena, scientific method is effective only when its theories are valid; but in social, political, and economic matters, theories can be effective without being valid.

Effective here means having an impact. For example, in a bubble, the cost of capital for some companies drops to be absurdly low relative to the risk of their respective enterprises. Consequently, some businesses that would otherwise have died may go on to survive. (An example from two decades after The Alchemy of Finance was written: Peter Thiel mentions, when interviewed in Inside the House of Money, that PayPal did a massive capital raise right at the height of the tech bubble, even though it didn’t need the money at the time.) On the flip side, a depression can be self-fulfilling if businesses are unable to refinance.

This seems to be especially true in the credit markets:

Loans are based on the lender’s estimation of the borrower’s ability to service his debt. The valuation of the collateral is supposed to be independent of the act of lending; but in actual fact the act of lending can affect the value of the collateral. This is true of the individual case and of the economy as a whole. Credit expansion stimulates the economy and enhances collateral values; the repayment or contraction of credit has a depressing influence both on the economy and on the valuation of collateral. The connection between credit and economic activity is anything but constant – for instance, credit for building a new factory has quite a different effect from credit for a leveraged buyout. This makes it difficult to quantify the connection between credit and economic activity. Yet it is a mistake to ignore it.

This is reminiscent of Hyman Minsky’s Financial Instability Hypothesis.

In terms of the stock market, Soros asserts that (1) markets are always biased in one direction or another, and (2) markets can influence the events that they anticipate.