Serena crunched all the numbers and made the best estimates possible when she was preparing the coming quarter’s sales projections for the product she managed. She used lessons from her graduate studies in statistics and decision science. Informed by historical trends, economic forecasts, and market projections, she estimated a total sales volume of 1,000 units. In addition, she estimated a 15% probability that sales would fall below 900, and a 15% probability that sales would surpass 1,100. When she finished presenting her forecast, the first comment was from the CEO; she leaned back, scowled at Serena, and said, “I don’t pay you to be uncertain.”
Many of us, like Serena’s CEO, imagine we want perfect predictions made with absolute certainty. For people like that, the current economic moment has brought a particularly acute apprehension. The business press reports robust jobs numbers and low unemployment, but high inflation and anemic economic growth. The news is rife with speculation about whether recession looms, even while some government officials offer rosy forecasts and comforting words. It is a complex picture that leaves substantial uncertainty about the future. Should your company invest in hiring additional staff or scale back in case a recession brings a decline in sales?
If you are looking for a foolproof strategy for obtaining certainty, we have bad news for you: the world is complicated and markets are difficult to predict. But if you are looking for ideas on managing an uncertain future, we have good news. There are tools for thinking through uncertainty and using it to plan and make decisions. These tools are useful in everyday life and in every economic climate, regardless of whether the world is at war or at peace; whether the economy is growing or shrinking; and whether we are in a bull or a bear market. Here, we share five tools for thriving in an uncertain world.
Think in Expected Values
The essence of rationality is selecting the course of action with the highest expected value. Computing an expected value is as easy as multiplying each possible outcome by its probability and summing the results. For example, the expected value of a gamble that pays $20 with 50% probability (and nothing otherwise) is $10. If you could play this gamble every day of your life at a cost of $9, you would come out ahead in the long run. You should take the chance every day, even though half the time you would lose $9. On losing days, you may feel sad that you got unlucky, but you need not regret your choice to play; it was a good choice, given what you knew when you made it.
Jeff Bezos pitched early investment in Amazon.com using the logic of expected value. He saw a large potential upside to his online retail business, but he also acknowledged substantial risk. He warned early investors that there was a 70% chance he would fail and their investment would become worthless. But the potential rewards attached to that 30% chance of success, he argued, were enough to outweigh the 70% chance of failure. In fact, a dollar invested in Amazon.com when the company went public in 1997 would be worth $1,840 today. Let’s say that, at the time of the IPO, there was a 70% chance of failure and a 30% chance of a return of $1,840 on a dollar’s investment. That would give a dollar’s investment an expected value of $552 (which is $1,840 multiplied by 30%). That expected value makes investment a good idea.
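The arithmetic is simple enough to sketch in a few lines of Python. The payoffs and probabilities below are the stylized figures from the examples above, not real market data:

```python
def expected_value(outcomes):
    """Expected value: sum of each payoff multiplied by its probability."""
    return sum(payoff * prob for payoff, prob in outcomes)

# The coin-flip gamble: $20 with 50% probability, nothing otherwise.
gamble = [(20.0, 0.5), (0.0, 0.5)]
print(expected_value(gamble))  # 10.0 — worth playing at a $9 price

# The stylized Amazon IPO bet: 70% chance of $0, 30% chance of $1,840.
amazon = [(0.0, 0.7), (1840.0, 0.3)]
print(round(expected_value(amazon)))  # 552
```

The same function handles any number of outcomes, so it extends naturally from a two-outcome bet to a full probability distribution over sales volumes.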
The logic underlying expected values acknowledges that the future is uncertain and our decisions should reflect that. Some of the uncertainty in the world is simply irreducible. It is folly, for instance, to pretend you can predict the coin flip or the roulette wheel. Likewise, many of the social and economic systems in which we operate are sufficiently complex that it is functionally impossible to predict their operations perfectly.
History is replete with confident forecasts from smart people that, in retrospect, look ridiculous. Take, for example, Apple Computer co-founder Steve Wozniak’s pessimistic prediction in 1985: “the home computer may be going the way of video games, which are a dying fad. For most personal tasks…paper works just as well as a computer, and costs less.” Or consider Stanford Professor Paul Ehrlich’s gloomy forecast in his 1968 best-seller, The Population Bomb, that the world would run out of food and “hundreds of millions of people are going to starve to death” in the 1970s. In a complex world, we should forecast with humility. Give up on the pretense that you can anticipate precisely what will happen. Usually, though, the answer isn’t to just shrug your shoulders and say “I have no idea what will happen.” Instead, think about the range of possibilities and the likelihoods of each. Explicitly considering how you might be wrong can help you be more humble.
We often ask participants in our studies to report their confidence in different ways. One matches the way we are most often invited to forecast the future: They report a best guess and their confidence in it. For instance, we ask them to estimate the high temperature, one month out, in the city where they live. When asked this way, across studies, people on average claim to be about 70% confident that the actual temperature will be within 5 degrees of their guess, even though they are only right 30% of the time.
A second way to forecast is to estimate the likelihood of each of several possibilities. For instance, we can break the range of likely temperatures into a set of bins, each 10 degrees wide. When people estimate these likelihoods, the highest probability assigned to any 10-degree range is lower, typically a bit below 50%. That is still overconfident relative to their 30% hit rate, but it is a lot better.
Use the Wisdom of the Crowd
Even experts tend to have too much confidence in their estimates, and most of us have too much confidence that we can find the right expert. The Wall Street Journal asks expert economists to predict key economic outcomes for the upcoming year. There is huge variation in their predictions. How should you use the distribution of expert forecasts? Many would use the advice of the top expert. That’s basically what the ancient Greek philosopher Socrates advocated:
First of all, ask whether there is any one of us who has knowledge of that about which we are deliberating? If there is, let us take his advice, though he be one only, and not mind the rest.
A different approach relies on the wisdom of crowds. In his 2004 book popularizing that idea, The Wisdom of Crowds, James Surowiecki argued that simple rules for aggregating judgments within a group (taking the mean or median, or using a majority vote for yes/no decisions) typically outperform more complex decision-making strategies. Business professor Rick Larrick and his colleagues show the benefits of a “select-crowd” strategy, which consists of choosing a small number of expert individuals and averaging their opinions. Averaging the estimates of all of the economists in the WSJ survey is a better strategy than selecting the estimate of the best predictor from the previous year. But averaging the top five predictors from the previous year outperforms a simple average of all of the economists’ opinions.
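A minimal sketch of the select-crowd strategy: rank forecasters by their past accuracy and average the top few. All names and numbers below are invented for illustration:

```python
import statistics

def select_crowd(forecasts, past_abs_error, k=5):
    """Average the forecasts of the k judges with the best track
    records (here, the lowest absolute error on last year's forecast)."""
    ranked = sorted(forecasts, key=lambda name: past_abs_error[name])
    return statistics.mean(forecasts[name] for name in ranked[:k])

# Hypothetical GDP-growth forecasts (%) and each forecaster's
# absolute error last year (both made up for this example).
forecasts = {"A": 2.1, "B": 1.8, "C": 3.0, "D": 2.4,
             "E": 1.5, "F": 2.2, "G": 2.7}
last_year_error = {"A": 0.2, "B": 0.5, "C": 1.1, "D": 0.3,
                   "E": 0.9, "F": 0.4, "G": 1.4}

print(statistics.mean(forecasts.values()))       # the simple crowd average
print(select_crowd(forecasts, last_year_error))  # average of the best five
```

The design choice is deliberately simple: averaging smooths out individual errors, and restricting the average to forecasters with good track records trims the crowd’s least reliable members without betting everything on a single “best” expert.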
It is our craving for certainty that leads us to chase a single expert, the one who can make perfect predictions. That craving also makes us vulnerable to charlatans who lie to us and pretend to know, or, worse yet, to megalomaniacs so overconfident that they sincerely believe they know. Beware the leader, entrepreneur, or political candidate who claims certainty about an uncertain future. They reveal more arrogance than insight.
Calibrate Your Confidence
Many self-help and business books could leave you with the impression that your challenge in life is to maximize your confidence. Shouldn’t you want to be optimistic? “One of the most important qualities of a good leader is optimism,” Disney CEO Robert Iger wrote in his 2019 memoir, The Ride of a Lifetime. “People are not motivated or energized by pessimists.” Our advice to accept uncertainty could make you look indecisive or, worse yet, pessimistic. Good leaders should strive for confidence, right?
Wrong. Striving for maximum confidence can lead to all sorts of bad decisions. Overconfidence about your future earnings could lead you to spend more than you have. Overconfidence about your invincibility might lead you to take risks that could shorten your life expectancy. Overconfidence about your popularity can lead you to behave in annoying and offensive ways. Overconfidence about your success can undermine investment in the effort required to achieve it.
Good expected value calculations require accurate estimates of both the probability and the payoff of different options. That is not easy when wishful thinking leads you to overestimate the probability of desirable outcomes. Conversely, if you are a defensive pessimist, you may be tempted to overestimate the risk of disaster, so as to motivate yourself to avoid it. Both are biases you should try to banish from your expected value calculations. You want accuracy. Once you have calculated both value and probability as faithfully as possible, then you can consider your attitude toward risk. If you are risk averse, then you will require that uncertainty be offset by higher expected values. On the other hand, risk seekers will be willing to accept lower expected values in return for the chance at a jackpot.
Decision analyst and former professional poker player Annie Duke, in her book, Thinking in Bets, describes how gamblers help calibrate each other’s confidence by challenging implausible forecasts with the question, “Wanna bet?” This can be a fun game to play with your colleagues if you disagree about something. Instead of arguing, bet on your beliefs. Write down everyone’s forecasts and resolve the bets later.
This can be a useful way to get better at calibrating your confidence: keep track and keep score. Get in the habit of making probabilistic forecasts of uncertain events. Then go back and see how often you were right. When you claimed 90% confidence that you would meet a particular deadline, how often did you meet it? If your confidence is perfectly calibrated, you would have met that deadline 9 out of 10 times.
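Keeping score can be as simple as logging each forecast with its stated probability and later grouping the log by confidence level. A sketch, with an invented forecast log:

```python
from collections import defaultdict

def calibration_table(forecast_log):
    """forecast_log: list of (stated_probability, happened) pairs.
    Returns {stated_probability: observed hit rate}, so you can check
    whether your 90% claims come true about 9 times in 10."""
    buckets = defaultdict(list)
    for prob, happened in forecast_log:
        buckets[prob].append(happened)
    return {p: sum(hits) / len(hits) for p, hits in sorted(buckets.items())}

# Made-up deadline predictions: (confidence claimed, did it happen?)
log = [(0.9, True), (0.9, True), (0.9, False), (0.9, False),
       (0.5, True), (0.5, False)]
print(calibration_table(log))  # {0.5: 0.5, 0.9: 0.5}
```

In this invented log, the 50% claims are perfectly calibrated, while events given 90% confidence happened only half the time, exactly the kind of overconfidence that keeping score makes visible.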
Managers can help others in their organizations get better at calibrating their confidence by collecting predictions and scoring them later. Will a development project stay on schedule? Will the project stay on budget? Record everyone’s estimates of these probabilities and then score them and publicize them later. Share the results so that people are aware of their own accuracy. Encourage those who report to you to honestly report their uncertainty. Don’t be like Serena’s boss, who, by demanding certainty, encouraged inaccurate and overconfident forecasts.
Hedge Your Bets
Even those who, like Serena, are comfortable with uncertainty and with thinking of the future as a distribution of possible outcomes still have to make decisions. How many units should Serena’s division produce? She can’t produce a probability distribution; she has to commit to a single number. The simple answer is to take the mean of the distribution. But that simple answer assumes the costs of over- and under-estimating are symmetric. If Serena is providing ventilators for the sickest Covid patients, then producing too few could result in unnecessary deaths, while producing too many only requires storing them so they can be used later. In that case, Serena should err on the side of over-production.
To take another example, there is uncertainty around how long it will take to get to the airport and make your way through security. Since missing your flight by a minute is worse than waiting a minute at the gate, you err on the early side. The greater the uncertainty, the earlier you should head to the airport. And the greater the expected difficulty of finding a later flight, the earlier you should arrive. On the other hand, if you are always more than an hour early at the gate, perhaps you are being a bit too cautious.
It’s not always obvious whether erring in one direction is costlier than erring in the other. One classic example is the newsvendor problem. For a newsvendor whose unsold papers are worthless the next day, excess production is pure waste. In this case, given uncertainty around demand for newspapers, the newsvendor seeking to avoid waste should err on the side of printing fewer newspapers. On the other hand, if each paper sold is sufficiently profitable, then under-production that fails to meet customer demand represents lost profits. Deciding how many papers to print requires balancing these concerns.
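The textbook way to balance these concerns is the critical-fractile rule: choose the smallest quantity at which the cumulative probability of demand reaches the ratio of the cost of underproducing (profit lost per missed sale) to the sum of the under- and over-production costs. A sketch, with an invented demand distribution:

```python
def newsvendor_quantity(demand_probs, underage_cost, overage_cost):
    """Critical-fractile solution to the newsvendor problem.
    demand_probs: {demand: probability}, assumed to sum to 1.
    Returns the smallest quantity whose cumulative demand
    probability reaches underage / (underage + overage)."""
    critical_ratio = underage_cost / (underage_cost + overage_cost)
    cumulative = 0.0
    for demand in sorted(demand_probs):
        cumulative += demand_probs[demand]
        if cumulative >= critical_ratio:
            return demand
    return max(demand_probs)

# Illustrative demand distribution for daily paper sales.
demand = {80: 0.2, 100: 0.5, 120: 0.3}

# Each sale nets $1; each unsold copy wastes $0.25: print generously.
print(newsvendor_quantity(demand, underage_cost=1.00, overage_cost=0.25))  # 120

# Reverse the costs and the same logic says to print conservatively.
print(newsvendor_quantity(demand, underage_cost=0.25, overage_cost=1.00))  # 80
```

Note how the order quantity moves with the cost asymmetry, not with the mean of the distribution: cheap waste pushes you above the mean, expensive waste pushes you below it.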
An accurate and well-calibrated probability distribution can be enormously helpful for selecting the course of action with the highest expected value. For instance, accurate weather forecasts have been a colossal economic boon, not only to farmers but also to any other businesses that depend on the weather. Do I need a tent for my outdoor event? Well, if I have sold $10,000 in tickets that I would have to refund if the event were cancelled due to rain, and a tent would cost $5000, then I should rent a tent if the probability of rain is more than 50%. Weather forecasts are not perfectly accurate but evidence suggests that weather forecasts from the National Oceanic and Atmospheric Administration are well-calibrated: it rains on about half the days when NOAA forecasts a 50% probability of rain.
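The tent decision reduces to a one-line expected-value comparison: rent when the probability of rain times the revenue at risk exceeds the cost of the tent. Using the figures from the example above:

```python
def should_rent_tent(prob_rain, revenue_at_risk, tent_cost):
    """Rent when the expected loss from rain exceeds the tent's cost."""
    return prob_rain * revenue_at_risk > tent_cost

# $10,000 in refundable tickets at stake; the tent costs $5,000.
print(should_rent_tent(0.6, revenue_at_risk=10_000, tent_cost=5_000))  # True
print(should_rent_tent(0.4, revenue_at_risk=10_000, tent_cost=5_000))  # False
```

The 50% break-even point in the example is just where the two sides of this comparison are equal; a calibrated forecast is what makes plugging in `prob_rain` trustworthy.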
Serena’s boss may have wanted her executives to predict with certainty. But Serena’s calibrated confidence allows the organization to develop contingent plans for an uncertain future. Hedging its bets lets the company think through where it could quickly find additional staff in the event that sales start to rise, and, by the same logic, keep an eye on open positions elsewhere in the company in case sales fall and it needs fewer staff.
Communicate Uncertainty with Confidence
In our book, Decision Leadership, we advise leaders to distinguish the confidence with which they report what they know from the certainty of their forecasts. You do not need to pretend you can perfectly predict an uncertain future to come across as decisive. Research by Celia Gaertig and Joe Simmons shows how leaders can thread this needle. Gaertig and Simmons found that the most credible forecasts reported uncertainty with confidence: “I am confident the Golden State Warriors have a 60% chance of winning their next game.” By contrast, the researchers found that audiences were most skeptical of categorical predictions delivered with low confidence: “I’m not sure, but I think the Warriors will win their next game.”
The former advice is compelling because it acknowledges the actual uncertainty in future events while also signaling that the adviser has gathered enough relevant information to specify that uncertainty precisely. That is, in fact, what we should strive for, even though it might fail to satisfy our yearning for certainty and leaves us with the uncomfortable reality that we cannot predict the future perfectly. “Uncertainty is an uncomfortable position,” the philosopher Voltaire admitted, “but certainty is an absurd one.”
Too many leaders think that maintaining their credibility requires them to feign absurd levels of certainty. Not only is this untrue, but it also jeopardizes their reputations when their confident predictions turn out to be wrong. In fact, pretending to be certain about inherently uncertain outcomes ought to undermine leaders’ credibility even before their confident prediction is shown to be wrong. Instead, wise leaders will gather enough information to be able to report with confidence how much uncertainty remains.
Lessons for Leaders
The world is full of uncertainty. Ignoring that uncertainty and pretending you can make perfect predictions is either disingenuous or delusional. You will make better decisions when you accurately incorporate uncertainty into your own thinking and your calculations of expected value. Moreover, you will be a better leader when you help those around you to accurately understand uncertainty, help them quantify it, and help them make better expected value calculations. The result will be better decisions with higher expected values.
This can reduce the wild swings of the organizational pendulum that follow failed attempts at perfect prediction. When you incorrectly think you should have been able to predict what happened, you will be driven to change systems, processes, and staff to avoid the error. If you think you should have anticipated a spike in demand, you will ramp up production to meet that demand in the future. But if that demand spike was partially driven by chance events that will not recur, then you will have made the error of producing too much; we have all seen organizational pendulums swing this way. Instead, you should base your decisions and your production quantities on your best guess of the underlying demand distribution, including the relative costs of over- versus under-production.
The lesson is to learn as much as you can about the uncertainties of our complex world. Reflect honestly on the unpredictability of the future. Make the best probability estimates you can, and use them to inform the most accurate expected value calculations. You will never know for certain that they are right, and you will always wish you had more information to reduce your uncertainty. But if you keep track and keep score, you and your colleagues can improve your calibration and get better over time. We’re willing to bet it’ll be well worth it.