In the immediate aftermath of Hurricane Florence, one question that has recurred in media reports – as it often has after other natural disasters – is why people stayed in their homes and neighborhoods and risked harm to themselves or their families, rather than flee to safety when they had time to do so. Television or newspaper accounts of a family clinging to a tree to avoid being swept away by floodwaters, or of a mother whose infant son was torn from her grasp when she tried to drive through a flooded street, often prompted reactions of sympathy, along with critical and judgmental questions along the lines of “What were they thinking?”
Compliance officers who are newly hired to build or rebuild corporate compliance programs after a major compliance failure might have similar reactions. When they learn that executives at their firm, over multiple years, paid tens of millions of dollars in bribes to secure business or to evade U.S. sanctions, or allowed accounts at the firm to be used to launder hundreds of millions of euros, their first thought might also be, “What were they thinking?”
In both cases, however, that question should be factual rather than rhetorical. Disaster experts and compliance experts alike need to understand the thought processes that prompt people in risky situations to make decisions that put them at greater risk. In fact, how people make decisions before and during disasters may be critically influenced by the same factors that can influence corporate employees who are tempted (or expected or pressured) to participate in improper or illegal activity.
Those factors, as risk expert Robert J. Meyer recently explained in a Washington Post essay, are “cognitive biases that lead people to underplay warnings and make poor decisions, even when they have the information they need.” A number of those biases that Meyer – a co-director of the Wharton Risk Management and Decision Processes Center at the University of Pennsylvania – identified can also be found in corporate settings:
Overconfidence Bias: This bias, simply defined, “is the tendency people have to be more confident in their own abilities, such as driving, teaching, or spelling, than is objectively reasonable. This overconfidence also involves matters of character.” Before Hurricane Sandy struck in 2012, Meyer wrote, East Coast residents “knew all too well that a storm was at their doorstep and that many people would be affected – they just thought it wouldn’t affect them.” Studies have shown “that, even when reliable information about probable danger is available, it is difficult to effectively warn large populations that cannot directly perceive the danger associated with a disaster. If a storm warning is at all vague, people will underestimate the threat and be less likely to heed evacuation orders.” In addition, “the longer people have lived in an area, the less likely it is that they will evacuate, in part because they have successfully ridden out past hurricanes.”
Overconfidence bias is widely prevalent in the business world, including in decisionmaking on matters of finance and investment. As Meyer noted, overconfidence bias “also involves matters of character.” For example, what Harvard Business School Dean Nitin Nohria calls “moral overconfidence” is evident when there is a gap “between how people believe they would behave and how they actually behave.” That gap, Dean Nohria wrote, “tends to be most evident in high-pressure situations, where there is some inherent ambiguity, when there are competing claims on our sense of right and wrong, and when our moral transgressions are incremental, taking us down a slippery slope.” Discussions of proposals to expand or retain business in higher-risk markets, especially when the company is suffering declining profits or other reversals, can involve all of those factors.
“Herd Thinking”/Social Proof: Meyer also noted the effects of “herd thinking” in compounding the problem. “Herd thinking” is a colloquial term for the cognitive bias that social psychologists term “social proof” or “conformity bias.” Social proof, as Professor Robert Cialdini wrote in his seminal work Influence: The Psychology of Persuasion, is “the tendency to see an action as more appropriate when others are doing it.” In the case of Hurricane Sandy, Meyer wrote that residents who looked around before the storm “and [saw] that few others were making preparations . . . felt no social pressure to do more.”
Social proof can also influence people in corporate settings. For example, if meetings are held to discuss and implement a proposal for an improper or illegal course of action, participants who have doubts about that course of action may remain silent when they see that no one else is speaking against it.
Inertia and Simplification/Normalcy Bias: Meyer also singled out inertia and simplification as enemies of sound decision-making:
“When we are unsure of what to do in the face of an incoming storm, we tend to stick to the status quo — doing nothing. If we are uncertain about when to evacuate, we tend not to evacuate at all. And we tend to simplify our course of action, selectively focusing on a few factors. . . . Before Hurricane Sandy, for example, 90 percent of residents secured supplies — but typically only enough to get them through a single day without power. Again, most failed to make evacuation plans.”
This “status quo” tendency has also been labeled “normalcy bias,” for situations in which people in imminent or immediate danger “freeze” or wait to consult with multiple other people rather than acting immediately to flee that danger. As journalist Amanda Ripley documented in her book The Unthinkable, the consequences of normalcy bias in those situations are often fatal.
Corporate executives and employees who are caught up in intracorporate misconduct as it becomes more severe may also display inertia or normalcy bias. Especially if they believe that their in-house mechanisms for reporting misconduct are untrustworthy or ineffective, they may default to acting as if there is no heightened or imminent risk to themselves or their company, and keep on with “business as usual.” Or they may simplify their responses by taking only half-hearted steps – perhaps talking to one or two colleagues or family members – rather than taking decisive action to separate themselves from the misconduct.
To overcome the effects of these biases and situational factors in disaster situations, Meyer maintained that “[t]he key to better preparedness is not to eliminate those biases – a hopeless task, since they’re part of who we are – but to design measures that anticipate biases.” Here are some possible approaches to anticipating biases in disaster or business scenarios:
Overconfidence Bias: Two techniques have successfully reduced overconfidence bias, according to Professor David Myers in his book Exploring Social Psychology. One “is prompt feedback on the accuracy of [people’s] judgments.” For impending natural disasters, that may mean communications at the town or neighborhood level from credible sources – local weather forecasters, emergency-management teams, or police – to convey to residents in specific terms that the danger for them is real. When a National Weather Service meteorologist, the day before Hurricane Katrina made landfall, issued a warning for the New Orleans area that described the probable dangers in highly specific and graphic terms, that warning was later deemed “the most dire—and effective—weather forecast ever issued by the National Weather Service.”
To address risky corporate situations effectively – say, a proposed entry into a new market in which bribery of national government officials is common – corporate compliance officers need to take two types of actions. First, they should make efforts to attend every meeting in which senior executives are discussing or preparing to implement a plan that could involve improper or illegal conduct, to ensure that compliance risks are neither ignored nor downplayed. Second, both in and outside those meetings, they need to engage with participating executives and refute, with specific examples from prior enforcement actions, any assumptions that the planned course of action presents little or no compliance risk.
The other, Professor Myers wrote, relates to the fact that “[w]hen people think about why an idea might be true, it begins to seem true . . . . Thus, another way to reduce overconfidence is to get people to think of one good reason why their judgments might be wrong, forcing them to consider why opposing ideas might be right . . . .” For impending disasters, that could mean using local media and community meetings with local officials to confront “we-can-ride-it-out” beliefs with information as specific as possible on the impending disaster’s likely impact in that community – perhaps even supplemented with examples of people in previous disasters who came to regret their “ride-it-out” decisions. For corporate situations, that could mean compliance officers talking with key participants in a risky course of action about what those participants believe to be non-risky decisions and actions, and pointing out information that would support opposing ideas and recommendations.
“Herd Thinking”/Social Proof: To combat “herd thinking” or social proof-based decisionmaking by people before disasters, officials should use public-service messages and community-meeting talks that call attention to that particular bias. One example would be, “Folks, don’t assume that just because others in your community are talking about staying, that’s the right decision for you and your families. Talk with your neighbors and friends all you want, but in the end make your decision based on the latest information, not assumptions about what others are doing and why.”
A similar approach can work in corporate environments. Compliance training, for instance, can include guidance to employees that says, “If you hear or see something that you feel in your gut is wrong, trust your first instincts and talk about it with someone – your manager or our ethics line. Don’t assume that because no one else is speaking up about it, no one shares your concerns.”
Inertia and Simplification/Normalcy Bias: To combat inertia, Meyer recommended that governments “work hard to persuade people to develop precise preparedness plans that include a shopping list of supplies and exact plans for when and where to evacuate, should that be necessary.” To combat simplification, he similarly urged officials to present people with short lists of the most important preparation measures they should take.
In corporate settings, compliance officers need to supplement in-house compliance training and messaging to employees in two ways. First, the training and messaging should convey that the need for employees to speak up or report misconduct is even greater when it appears that that misconduct is well underway. Second, it should set clear priorities for how employees should report when that misconduct is advanced (i.e., directing an employee to notify a senior compliance officer rather than consulting his or her immediate supervisor or reporting through conventional whistleblower reporting channels).
This discussion cannot do justice to all of the cognitive biases and influences that can affect business decisionmaking and compliance. It should indicate, however, why compliance officers need to pay closer attention to cognitive biases, and see that their compliance programs move beyond “check-the-box” policies and conventional internal controls to operationalizing measures that can counteract or reduce the influence of those biases.