
2 - The Management of Risk and Uncertainty

from Part I - The Problem

Published online by Cambridge University Press:  18 March 2019

Alfred A. Marcus
Affiliation:
University of Minnesota

Summary

This chapter summarizes previous academic literature about the management of risk and uncertainty. Specifically, it deals with the decision-making limitations that research has found to exist when decision makers contend with important strategic issues.
Type: Chapter
Book: Strategies for Managing Uncertainty: Booms and Busts in the Energy Industry, pp. 28–41
Publisher: Cambridge University Press
Print publication year: 2019

Decision makers in the energy industry have no easy way to reduce the risk and uncertainty about what is likely to take place next, how it affects them, and how they should respond. Therefore, the return companies make on a project, if it materializes, may be short-lived as economic and political conditions shift; technologies change; and competitors, substitute products, and new entrants cut off the possibility of gain. In general, decision makers in businesses try to gain control of such uncertainties, coping in various ways. They try to anticipate the future and to take calculated risks. This chapter summarizes previous academic literature about the management of risk and uncertainty. Specifically, it analyzes the decision-making limitations that research has found to exist when decision makers contend with important strategic issues.

Anticipating the Future

The prior literature suggests that there are at least four ways to anticipate the future, each of which has limitations.1

  1. Trends. Though trends in one area often lead to developments in another, the trends must be analyzed with caution. Just because a trend is moving in a particular way does not mean it will continue. It can flatten out, become more pronounced, or reverse direction. Inflections take place. Tipping points occur. Simple extrapolation can be deceiving if it does not take into account different future conditions, the impact of trends on each other, and how human responses change their direction. Trend analysis leaves insufficient room for surprises and unexpected developments. For example, in The Rise and Fall of American Growth economist Robert Gordon argues that it is not certain that the remarkable improvements in people’s lives that took place in the century that started in 1870 will continue.2 He points out that since 1970 US productivity growth has stalled, technological progress has slowed, and the benefits of this growth have not been widely shared. The information technology revolution, he comments, is no match for indoor plumbing, autos, electricity, air travel, antibiotics, and air conditioning.

  2. Experts. Relying on expert opinion is not much better than relying on trends. Even the most trusted experts are fallible. An expert panel asserted that the analytical engine created by Charles Babbage, a forerunner of the modern computer, had no practical value. A committee of experts found no potential in Thomas Edison’s incandescent light bulb. Experts declared that intercontinental ballistic missiles could not accurately deliver their payload thousands of miles away. In Superforecasting: The Art and Science of Prediction, Philip Tetlock and Dan Gardner report that experts are right less than 50 percent of the time.3 Groups of laypeople typically make better predictions. Seers project the future, but their predictions do not necessarily foretell what is to come next. In the book Megamistakes, author Steven Schnaars lists many errors that experts have made in forecasting the future.4 He includes the supersonic transport (SST), which, flying at speeds of 2,000 miles per hour, forecasters suggested would be followed rapidly by hypersonic transport at even greater speeds of 4,000 miles per hour.

  3. Analogical reasoning. Decision makers may compare known aspects of current conditions with corresponding characteristics of past phenomena. If the match is good, they assume that prior cases can provide a good understanding of contemporary events. The closer the match, the stronger the indication that the analogy they are using is good. Teasing lessons from the past, however, is not easy. In using analogical reasoning, the analyst must be careful to highlight the relevant similarities and differences between previous and present events. The analyst cannot ignore unlikely Black Swan events, though people tend to disregard them as they did in the Great Financial Crisis of 2007–2008. Like Robert Gordon, Wall Street Journal writer Greg Ip argues that the past performance of the global economy is no indicator of its future performance.5 He suggests that the future might be very different from the past because the world has harvested the gains in science, medicine, and technology that are the easiest to achieve. Additional advances are costly and complex, and they could be prone to failure. Barriers to transforming ideas into commercially successful products have grown. Yet, Ip admits that countertrends do exist. For example, future innovations that affect the energy industry might come from many sources, including artificial intelligence, gene therapy, robotics, and software. Ip asks whether a jump as large as the one that occurred from oil and kerosene lamps to electric lighting is possible. He maintains that no amount of analogical reasoning can answer the question of whether a golden age is on the horizon because of technological breakthroughs.

  4. Bayesian judgment. As the situation changes, analysts can update estimates based on new information. With the new information, learning is possible. The Bayesian perspective thus offers room for constant adjustment, but it is not foolproof. For example, we know the probabilities of rolling dice. Even if they were not known, we could roll the dice and record the results. Thus, empirically, after enough tosses of the dice, we could arrive at a fairly good estimate of the odds, and with each observation, we could update and improve our estimate. Eventually, we would come very close to knowing the actual odds based on what we have observed. However, the dice exist as a fixed system. They do not exhibit any dynamism. The set of possible outcomes does not change. However, if the system shifts, then it is hard to learn from a series of observations in which we record the results, since the odds of what the outcomes will be are not stable. When a system exhibits fundamental shifts, it is very hard to learn from its past behavior. For example, what if, unbeknownst to the observer, someone slightly altered the weight of the dice after each roll? No matter how faithfully we observed the rolls of the dice and recorded the results, we would not have a good sense of the odds. It would not be possible to arrive at accurate probabilities based on observation and our Bayesian updating, as the sketch following this list illustrates. Thus, economist Hyman Minsky has argued that because every economic era is different it is nearly impossible to extract reproducible lessons from any of them.6
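
The following minimal sketch (in Python, with hypothetical numbers) illustrates the contrast drawn above: Bayesian updating converges on the true odds when the dice are a fixed system, but not when someone quietly re-weights them after every roll.

    # Estimate P(rolling a six) by Bayesian updating of a Beta prior.
    # With a fixed die the estimate converges on the truth; with a die
    # whose bias drifts after every roll, the estimate no longer
    # reflects the die's current odds, because there is no stable truth.
    import random

    def bayes_estimate(p_six_sequence, alpha=1.0, beta=1.0):
        """Update a Beta(alpha, beta) prior on P(six) against simulated rolls."""
        for p_six in p_six_sequence:
            if random.random() < p_six:   # the roll came up six
                alpha += 1
            else:
                beta += 1
        return alpha / (alpha + beta)     # posterior mean estimate of P(six)

    random.seed(0)
    n = 10_000

    # Fixed system: the die never changes, true P(six) = 1/6.
    fixed = bayes_estimate([1 / 6] * n)

    # Shifting system: someone slightly re-weights the die after each roll.
    p, drifting_sequence = 1 / 6, []
    for _ in range(n):
        drifting_sequence.append(p)
        p = min(max(p + random.uniform(-0.02, 0.02), 0.0), 1.0)
    drifting = bayes_estimate(drifting_sequence)

    print(f"fixed die estimate:    {fixed:.3f} (true value 0.167)")
    print(f"drifting die estimate: {drifting:.3f} (no stable true value)")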

Taking Calculated Risks

For these reasons, and others like them, decision makers cannot fully anticipate the future. Nonetheless, they must take calculated risks based on their expectations of future returns. How certain can they be that they will achieve such payoffs? As argued in the last chapter, animal spirits play a role. Decision makers rely on their guts as much as on their reason. That does not mean they leave reason entirely behind, but however good they are at reasoning, it only gets them so far, since fundamentally the future is unknown. Thus, to get proposals off the ground and projects funded, they have to exhibit optimism. They have to display confidence that they know what they are doing. They need to lay out the risks and uncertainties and indicate what their assumptions are. Then they can use various caveats and qualifiers to maintain their credibility, inserting vague phrases like reasonable, likely, possible, or hoped-for into the assertions they make, and, when pressed, they might have to attach hard numbers to their guesses.

For instance, Shlomo Maital provides the following example of taking a calculated risk.7 Let us say that decision makers agree that a big energy project provides a 10 percent chance of $10 billion returns, but there is a 30 percent chance it might lose $0.1 billion, and a 60 percent chance the same project might result in a $1 billion loss. The decision makers have based their estimates of the odds of the likely returns on reasonable guesses. They have examined the experience of their organization and others with similar projects. They have updated their understanding with advances in technology and economic forecasts. They operate in a reasonable way. They rely on their judgment and good sense as far as it will take them. Yet, they have made a series of estimates to arrive at these odds, which together can compound the error, so the ground on which they are operating is only as good as the assumptions they have made.

Let us take for granted for a moment that their assessments of the odds are reasonably sound. Then, in the example that Maital gives, the expected gain before subtracting the expected losses is $10 billion times 0.1, or $1 billion. The expected losses are $1 billion times 0.6, plus $0.1 billion times 0.3, or $0.63 billion. Putting it together, the likely payback from this project goes down, but it remains positive, as $1 billion minus $0.63 billion is $0.37 billion. Pending opportunity costs and expected interest rates, Maital asks, should decision makers approve a project of this nature? Recall that the possibility of the $10 billion return that generates the $0.37 billion payback is just 10 percent, while the possibility that the loss could be as low as $0.1 billion or as high as $1 billion is 90 percent. Maital suggests that since people tend to be loss averse, as the psychologists Amos Tversky and Daniel Kahneman demonstrate, the decision makers in the example are not likely to pursue this initiative.8 They will forego the expected payback of $0.37 billion because of fear of a loss, although, as a compromise, they might try to find an alternative where the loss is lower, but the expected gain is less.
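
To make the arithmetic concrete, here is a minimal sketch (in Python) of the expected-value calculation in Maital's example, using the probabilities and payoffs given above.

    # Expected-value arithmetic for Maital's hypothetical energy project.
    # Payoffs are in billions of dollars; probabilities are as given in the text.
    outcomes = [
        (0.10, 10.0),   # 10 percent chance of a $10 billion gain
        (0.30, -0.1),   # 30 percent chance of a $0.1 billion loss
        (0.60, -1.0),   # 60 percent chance of a $1 billion loss
    ]

    expected_gain = sum(p * v for p, v in outcomes if v > 0)    # 1.00
    expected_loss = -sum(p * v for p, v in outcomes if v < 0)   # 0.63
    expected_value = expected_gain - expected_loss              # 0.37

    print(f"expected gain:  ${expected_gain:.2f} billion")
    print(f"expected loss:  ${expected_loss:.2f} billion")
    print(f"expected value: ${expected_value:.2f} billion")

As the text notes, a positive expected value of $0.37 billion does not settle the decision; loss-averse decision makers weigh the 90 percent chance of some loss more heavily than the calculation alone suggests.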

This example suggests that even when decision makers quantify the risk, however imprecisely, the problem of what to do does not go away. To what extent are important organizational decisions actually made in this fashion? Do organizations always make their most important choices after similar quantifications? In most instances, they do not get to a point where they have enumerated the benefits and costs with such precision. Therefore, people in organizations depend on argument and counter-argument. Factions take up different sides. Only if they had perfect knowledge would they be fully rational. Lacking full knowledge, they make compromises. They hedge what they do, fearing to go out on a limb and get themselves and their organizations into excessive trouble.

Their hedging is the phenomenon this book later examines: how decision makers in major energy companies show caution and hedge their bets when they lack full knowledge of the future. Organizations have to hedge because they carry out few important strategic undertakings with full knowledge and hence total certainty. Their operational and technical decisions may be rooted in sound engineering knowledge that yields stable and reliable outcomes, but not their strategic choices. The essence of strategy is that decision makers face an unknown future. Limitations of time, money, and analytical power mean that residual uncertainty nearly always plays a role. Under conditions of perfect ignorance, the strategic choices decision makers would make would be entirely random. They would be unable to link outcomes to alternatives. If results conformed to expectations, it would be a matter of luck. Under residual uncertainty, the choices they make are generally better than mere luck; they are not entirely random, but they are not perfect. Understanding that they are not perfect, organizations protect themselves, and thus decision makers hedge the choices they make. Given the randomness, hunches play an important role. Maital reports that when Sony decided to license the rights to the transistor in the early 1950s, the value of the technology was entirely unclear.9 Company co-founder Masaru Ibuka made the decision based on a sleepless night’s hunch during a visit to Bell Labs in New Jersey. The history of the company on its website states “an idea flashed through his mind.” With hindsight, this hunch paid off, but many others do not.

Individual and Organizational Shortcomings

Not all strategic decisions are pure hunches, yet all contain some guesswork. The psychological and sociological literature has portrayed how decision making takes place when guesswork plays a role. It has documented the shortcomings that exist at individual and collective levels. What follows is a thumbnail sketch of some of the essential points this literature makes.

Individual Biases

Psychologists Kahneman and Tversky showed that people have unreasonable expectations of early results and their replicability.10 No matter what the implications of further evidence are, natural starting points or anchors act as aids in judgment. Humans display such perceptual biases as:

  • Anchoring – putting too much stock in initial information.

  • Recency – placing too much emphasis on information recently acquired.

  • Salience – placing too much emphasis on information that stands out.

  • Availability – placing too much emphasis on easily acquired information.

  • Clustering – seeing patterns in this information when there are none.

  • Overconfidence – being too certain that the patterns give accurate findings.

  • Confirmation – placing too much emphasis on information that supports the findings.

  • Conservatism – placing too high a burden of proof on disconfirming evidence.

Another common perceptual error is that people tend to be blind to events on their periphery. When psychologists carry out experiments in which they show a film of teams passing a ball and ask how many times the teams have passed the ball, most subjects provide the right answer.11 Nonetheless, more than 90 percent fail to notice if a gorilla walks through the scene and taunts the players. Among those who notice the gorilla, less than half get the count right. Subjects who repeat the experiment usually get the count right and spot the gorilla, but fail to notice changes like a person leaving the game or a change in the color of the background curtain. Thus, most people can see beyond the known knowns (the ball passing hands) and, once primed, spot the unknown knowns (the taunting gorilla), but it is hard for them to recognize the unknown unknowns (players leaving the game and the change in curtain color).

Group Limitations

Collective errors compound individual judgmental impediments.12 Groups working on decision-making problems may suppress doubts and develop illusions of invulnerability. Group consensus and groupthink reinforce such attitudes. They mean insulation from external criticism and promotion of a dominant view to the exclusion of others.

Moving from the group to the organization, organizations also are limited in the choices they make. Different organizations considering the same conditions reach different conclusions. Objectives, capacities, and interests lead the people in organizations to present facts in a biased way. They cope with indeterminacy by simplifying tasks and narrowing the range of information they take into account, limiting the information they analyze, and confining themselves to a few responses. They recall and emphasize congenial information and ignore or suppress critical information. They select a course of action before they consider all the plausible outcomes because they are constrained by time pressures. They also use information for its manipulative or propagandistic value in promoting acceptance of their points of view.

Even when they discover error, they do not promptly correct it. They may be reluctant to admit error because of fear of blame. They may recognize the error slowly because their training leads them to stick to a prevailing course of action even when they start to sense it is defective. Thus, they will fail to distinguish true signals from the noise and will not make sound use of knowledge they possess. Correcting errors that arise from inadequate previous assessments depends on more than recognizing them: people in organizations must also have the power to act on the updated information they obtain.

State, Effect, and Response

Frances Milliken argued that three types of uncertainty – state, effect, and response – affect the decisions organizations make.13 State uncertainty has to do with the conditions in the world, effect uncertainty concerns the impact of those conditions on a particular organization, and response uncertainty relates to what the organization should do next. In each instance, there are no clear answers, which compounds the problem. Below is a summary of some of the literature on how state, effect, and response uncertainty affect organizational decision making.

State Uncertainty

State uncertainty becomes a central problem for organizations because of complexity in their external environments. Complexity arises because of the number of factors decision makers in organizations must consider. They confront multiple external environments – technical, economic, political, and social – that change at different rates, conflict with each other, and impose inconsistent demands. Separate interacting technical, economic, political, and social logics create myriad possible futures. Technological change, for instance, takes place in a discontinuous pattern. A dominant technological design may come under threat and be disrupted; another may come under threat yet maintain itself and withstand the disruption. In the societal sphere, some societies are disproportionately young, while others are disproportionately old. They confront different types of economic stresses – on the one hand, nurturing the young and, on the other hand, caring for the elderly. Change in the social realm also regularly arises from fluctuations in fashion, tastes, attitudes, and values. Tipping points can occur in any realm, whether it be technical, economic, political, or social, and suddenly change the rules of the game and realign the relationships among realms. Related technical, economic, political, and social changes are not in harmony. Each moves at its own pace. Governments influence the course of events. Their actions affect the economy, society, and technology. Some governments are stable, but clearly not all are. Some are in the midst of regime change. Others have collapsed entirely and exist in a failed condition. For all these reasons, decision makers must contend with the question of whether the laws and public policies in place today will be the same in the future.

Effect Uncertainty

Changes in the technical, social, and political environments are dynamic and complex, patterns change, and decision makers in organizations may not have enough time to adjust. Even with the time, it is unclear what the effects will be on their organizations. What are the threats and what are the opportunities for them? Can they make sense of the impacts of what is taking place on them? Can they draw the relevant connections? Though they try to hedge their bets, there is no simple firewall they can erect to protect their organizations from the technical, economic, political, and social volatility. With disparate events converging, surprises can multiply. Even if more information were available, it would not be clear how to assess it. There may be too much information as well as too little of it. Thus, decision makers organize what they know based on what stands out and is noticeable, trying to impose meaning and order on events, perhaps framing what they observe relying on various shortcuts, mental models, heuristics, lenses, and protocols for the purposes of interpretation.

Response Uncertainty

Before decision makers fully know what is happening, they must act – this is why hunches and professional judgment play such an important role – but uncertainty makes them hesitate, unsure of what to do next. The uncertainty may instill in them anxiety, fear, dread, perhaps even panic. It activates emotions that function at odds with rationality. Ensuing doubt and unease can lead to procrastination and delay, paralysis, drift, or retreat. As is well known, within an organization, individuals with different objectives provide different and competing interpretations about what is taking place and what the organization should do. People with different interests engage in struggles for power and influence. They may build a consensus, but if the internal organizational conflicts escalate, the consensus may break down. Lack of a consensus may result in a myopic focus on the present with blind spots regarding the future. The instinctive responses of the decision makers may then lead them down well-trodden paths, with organizational imprinting, path dependence, and sunk costs adding to inertia.

Thus, decision makers may escalate commitment to the current course of action. They find justifications for their actions based on how their organizations have been doing in comparison to various reference points. As a reference point, they might compare how well their organizations did in the past to how well they are doing now. If current performance is not out of line with past performance, they will stick with what has worked in the past. They may also benchmark performance relative to their peers. If the performance of their organizations is not out of line with that of their peers, they will continue to match what the peers are doing. Only if performance is far out of line with the past and/or peers might decision makers initiate a search for new solutions. They also may then find that it is difficult to buffer their organizations from change – it is difficult to muddle through, and it is no longer possible to resist or stall. Being defensive and fighting change ceases to be an option.

Being a Leader

Rather than fighting change, decision makers may choose to adapt by taking a leadership position, but doing so is hazardous. Being a first mover can yield unanticipated consequences. Decision makers may conclude that it is far better for other organizations to test the waters. Thus, they might resist being a leader. The primary concern is to protect an organization’s existing assets – to hedge – and the logic is that there is no reason to go out on a limb if it means putting these assets in danger. Because of these factors, before entirely reconfiguring a prior business model, decision makers may aim for some type of compromise. They will mostly stick with what they have done thus far, but not entirely. They exploit the current business model, but undertake some exploring for new approaches just in case. On the margins, they open up and try to be flexible.

If they conclude they face some type of non-linear change, they try experiments. To cope, they may enter new markets; develop novel products or services; try out alliances, mergers, and acquisitions; globalize; and innovate with alternative business models. They take these steps and then evaluate and reassess them before going further. If they have exaggerated the threat and/or misperceived the opportunity, they may abandon the fresh approaches. Along with flexibility, they need the capacity to revert to their prior condition if they have made a mistake. Hesitating to move forward too vigorously in new directions, they flip-flop between new and old positions and approaches. Different organizations go through this process in different ways. These variations are a key element in the case studies found later in the book.

Profiting from Risk and Uncertainty

In his book Profiting from Uncertainty, Paul Schoemaker promises strategies for firm success no matter what the future brings.14 He argues an organization can profit from risk and uncertainty if it follows these recommended steps:

  • embrace uncertainty;

  • prepare the mind;

  • experience multiple futures;

  • build a robust strategic vision;

  • create flexible options;

  • engage in dynamic monitoring and adjustment;

  • implement, accepting the uncertainty;

  • then successfully navigate the future.

Though there is much merit in this type of advice, actual decision making in the face of risk and uncertainty is not likely to be either this orderly or successful.

Rather it is likely to be hedged. Incumbent companies in industries poised for change are likely to adapt to risk and uncertainty by exploring and pursuing a variety of different options at the same time, but their commitment to these options is likely to be provisional. Taking out options that decision makers can reverse is a fine balancing act. It is a stretch for organizations to keep re-orienting themselves in opposing directions simultaneously, to be ambidextrous, and to engage in the art of paradox and dialectics. Sequential investment, in which they raise and lower stakes based on changing perceptions of the gains and losses, is not easy. To assemble and reassemble capabilities, to structure and restructure organizations, and to configure and reconfigure business models is a daunting activity. For example, how far can organizations go in re-pivoting themselves from being comfortable specialists in a chosen field to amateur generalists in many domains? If they take on too many lightweight experiments, the chance of spreading themselves too thin grows. Beyond a certain level, variety does not increase their odds of survival. It decreases them. The consequences of a wrong decision can be profound. There are no guarantees. Outright failure can take place. Many examples exist of once-indomitable firms such as Xerox, Kodak, Lucent, Nokia, Motorola, and BlackBerry that did not adequately adjust to the mobile revolution. Just 12.2 percent of the Fortune 500 companies in 1955 continued to be on the list 59 years later. The rest were bankrupt, merged, or had fallen so much in revenue that they no longer were major companies. Many companies on the 1955 list are long forgotten.15

How Companies Adapt

This chapter has covered the decision-making limitations that exist when organizations confront the type of risk and uncertainty with which decision makers in the energy industry must contend. Incumbent companies in industries like the energy industry that seem poised and ready for change are likely to adapt to the risk and uncertainty by hedging. The following behaviors are likely to accompany their hedging:

  • Decision makers in more risky and uncertain environments (where the future is hard to predict) are likely to rely less on rational/analytical decision making because they do not know the chances of payoffs and losses or their magnitudes. They are more likely to rely on hunches and intuition or what Keynes calls animal spirits.

  • The less they rely on rational/analytical decision making, the more other factors such as organizational politics, imitation (what their rivals do), and what analysts say will affect the choices they make.

  • The less they rely on rational/analytical decision making, the more cognitive and collective errors they are likely to make, despite their best efforts.

  • The less they feel they are able to avoid these errors, the more cautious they may become – the more they will try to protect their organizations from harm by hedging their bets. Under these circumstances, they become defensive and try to delay and avoid departures from the status quo.

However, factors such as the following might mitigate their caution about departing from the status quo:

  • the degree to which they perceive external change as competence enhancing or detracting;

  • the extent to which they have sunk costs (stranded assets) wrapped up in the status quo;

  • the degree to which their organizations have been and are well diversified;

  • the extent to which their organizations are performing well in comparison with peers and other reference points;

  • the extent to which their organizations have the slack resources to experiment and make changes.

Under these conditions, their hedging might move from protecting their organizations from harm to exploiting for gain the situation they reckon is emerging. The case studies that come later in this book examine the slight, but significant, departures from the status quo that incumbents in the energy industry made in response to the uncertain changes taking place in their external environment.
