In the case of Barings Bank trader Nick Leeson and the managers who supervised his work, the cost turned out to be "one of the most spectacular collapses in banking history," the authors note.
In "Wharton on Making Decisions," to be published in March, editors Stephen Hoch and Howard Kunreuther bring together the decision-making research of 16 faculty members. Each contributor discusses the factors that influence how individuals and groups make decisions, why those factors can prove costly to the individual or organization involved, their effect on society and how the decision-making process might be improved.
The stakes can be high, as Hoch, chairman of Wharton's marketing department, and Kunreuther, chairman of Wharton's operations and information management department and co-director of the school's Risk Management and Decision Processes Center, note in their introductory chapter, titled "A Complex Web of Decisions." For example, when managers decided to launch the Challenger space shuttle, "the concerns about the O-rings that ultimately led to the explosion were buried in a vast sea of thousands of other decisions...leading up to the ill-fated launch."
Barings Bank is another prominent example, and Hoch and Kunreuther use that case to illustrate how a number of strategic errors in decision making--discussed in more detail in later chapters of the book--eventually led to accumulated trading losses for the bank of more than $1 billion.
The authors begin by recapping the key events of the story, starting with Leeson's decision on July 17, 1992, to cover up a mistake made by a new trader in the Barings Futures Singapore office. The trader had sold contracts instead of buying them, an error that would have cost the firm approximately $29,000 to cover.
"Should Leeson have revealed the error to his superiors or concealed it?" the authors ask. "He decided to hide the mistake. What he justified initially as a desire to protect one of his employees snowballed into a habitual hiding of his own trading errors in the derivatives market--deceptions that three years later brought down one of the world's oldest financial institutions. How did a back-office clerk in his 20s become responsible for bankrupting one of the world's oldest merchant banks? The answer: many bad decisions."
Leeson alone wasn't responsible for Barings' collapse, the authors point out. "There were decisions at multiple levels...that either encouraged his actions or created the holes through which he slipped."
Leeson, ironically, started out his career "by fixing the errors of others"--in this case, the errors of traders calling out orders to buy and sell in open-cry markets. These errors, the authors note, are usually caught and corrected within 24 hours in the settlements department. "Leeson had a knack for this type of dogged detail work and it helped him get his job at Barings Securities in 1989...Within a year he was sent to the Barings Jakarta branch to wade through the mountain of paperwork that lay idle in its settlement office."
In 1992 Leeson was offered the position of running Barings' new futures subsidiary in Singapore. Within three years, his own trading losses had reached more than $1 billion. He pleaded guilty to deceiving Barings' auditors and cheating the Singapore International Monetary Exchange (SIMEX), and was sentenced to six and a half years in a Singapore prison.
Greed, speed and other errors
How did Leeson get away with so much for so long? Citing such sources as Stephen Fay's book, "The Collapse of Barings," and Nick Leeson's own account written with Edward Whitley, titled "Rogue Trader: How I Brought Down Barings Bank and Shook the Financial World," Hoch and Kunreuther point to a series of strategic errors in decision-making that contributed to the collapse.
Blinded by emotions. Managers at Barings, the authors say, saw Leeson as a "golden boy...(who would help them) reap enormous profits in the emerging Southeast Asian markets." Because they liked him, and because he performed so well in the settlements department, Leeson's managers were willing to overlook some early warning signs. They ignored, for example, a notice from London's Securities and Futures Authority, sent just days after Leeson arrived in Singapore, about two outstanding debts that he had neglected to mention on his application for a trader's license. Later on, in 1994, internal auditors "failed to expose Leeson's hidden errors on trading Nikkei 225 contracts" because of one auditor's "admiration" for him. In short, "managers let emotions get in their way...and (consequently) made less effective decisions," the authors note.
Overreliance on intuition. That Leeson was able to hide his trading mistakes for so long was due to Barings' "managerial confusion," according to a Bank of England study. The authors point out that the futures business at Barings was originally and "essentially a one-man operation that relied on an instinctive style of management." Once that operation expanded to include others, including Leeson, the bank "failed to recognize that such an intuitive management style was no longer appropriate."
Emphasis on speed. "Barings executives appeared to make decisions quickly, racing to take advantage of market opportunities and failing to institute a sufficiently rigorous system of controls," the authors write. Such a "time-is-money attitude" encouraged the company to ignore clues that might have tipped it off to Leeson's activity at an earlier stage in his trading.
Failure to detect deception. Barings' managers overestimated their ability to detect deception, never even considering the possibility that Leeson was cheating them, according to the authors.
Underestimating risks until it's too late. Leeson was both the Jakarta branch's settlement clerk and Barings' floor manager on the SIMEX--"a breach of one of the basic rules of thumb in the securities industry," the authors note. These two positions are supposed to be checks on each other, but in the case of Leeson, the situation simply allowed him to fix or hide his own mistakes. By allowing this to happen, "Barings' managers significantly increased their risks, but no one apparently understood how significantly." As the authors point out, "decision makers have great difficulty in evaluating low-probability, high-risk events before disaster strikes so they tend to underprotect themselves beforehand and overprotect themselves afterward."
Insufficient information technology for decision support. Hoch and Kunreuther quote Fay's "The Collapse of Barings," in which the author writes: "Resources were not committed to developing global computer systems which would enable management in London to know the firm's position anywhere in the world; nor was information technology applied to risk management."
Insufficient monitoring and control. After Leeson's and Barings' downfall, British, American and Singaporean authorities launched investigations into the management structure at Barings in an attempt to learn how a trader like Leeson could "go haywire in the first place." The Bank of England concluded that the huge losses were caused by a "serious failure of controls and managerial confusion within Barings." As the authors put it: "In other words, many people made many unwise decisions, as individuals, managers, negotiators and perhaps as regulators," which together led to Barings' collapse.
Bad decisions, compounded
Many of the failures described in the Barings case are addressed in more detail by individual chapters in "Wharton on Making Decisions" at the personal, managerial, multi-person and societal levels. For example, the section on personal decision making includes a chapter on the role emotions play in business decisions. Another chapter in this section explores how humans make surprisingly effective decisions even when using shortcuts. And a third analyzes how the desire for variety can cloud decision makers' judgments.
In the managerial decision-making section of the book, one researcher looks at how human intuition and analytic models can be effectively combined to forecast future demand for products. Another chapter studies complexity in decision making by focusing on two examples: the restructuring of electric utilities under deregulation, and the insurance industry's management of catastrophic risks from natural disasters in the face of improved scientific data and computer support.
The section on multi-person decision making looks at different aspects of the negotiation process. One chapter explores how reputations affect the way partners approach negotiations; another chapter looks at deception in negotiations and the difficulties of detecting lies; and a third chapter analyzes the effect of new technology, such as e-mail and the Internet, on bargaining.
In the section of the book on societal decisions, one chapter explores why decision makers do not always use medical tests as expected based on analytic models; in another, the author explores why people tend to under-prepare for low-probability, high-risk events, failing to protect themselves against earthquakes or floods until after disaster strikes. Still another chapter looks at the inconsistencies between public and private decisions, as in the case where people call for stricter environmental policies but then refuse to pay for them in their own purchases.
Given the increased speed and complexity of today's business environment, Hoch and Kunreuther note, insights on decision making are more important than ever.
"Faster speed in business, as in automobiles, increases the risks that a small miscalculation can lead to a serious crash," they write. By examining how people should make decisions, how they actually behave, and how they can improve their decision making overall, the editors hope the book will make better drivers of us all.
To read more articles like this one, visit Knowledge@Wharton.
All materials copyright © 2001 of the Wharton School of the University of Pennsylvania.