One day in 1979, James Burke, the chief executive of Johnson & Johnson, summoned more than 20 of his key people into a room, jabbed his finger at an internal document, and proposed destroying it.
The document was hardly incriminating. Entitled “Our Credo,” its plainspoken list of principles—including a higher duty to “mothers, and all others who use our products”—had been a fixture on company walls since 1943. But Burke was worried that managers had come to regard it as something like the Magna Carta: an important historical document, but hardly a tool for modern decision making. “If we’re not going to live by it, let’s tear it off the wall,” Burke told the group, using the weight of his office to force a debate. And that is what he got: a room full of managers debating the role of moral duties in daily business, and then choosing to resuscitate the credo as a living document.
Three years later, after reports emerged of a deadly poisoning of Tylenol capsules in Chicago-area stores, Johnson & Johnson’s reaction became the gold standard of corporate crisis response. But the company’s swift decisions—to remove every bottle of Tylenol capsules from store shelves nationwide, publicly warn people not to consume its product, and take a $100 million loss—weren’t really decisions. They flowed more or less automatically from the signal sent three years earlier. Burke, in fact, was on a plane when news of the poisoning broke. By the time he landed, employees were already ordering Tylenol off store shelves.
On the face of it, you’d be hard-pressed to find an episode less relevant to the emissions-cheating scandal at Volkswagen—a company that, by contrast, seems intent on poisoning its own product, name, and future. But although the details behind VW’s installation of “defeat devices” in its vehicles are only beginning to trickle out, the decision process is very likely to resemble a bizarro version of Johnson & Johnson’s, with opposite choices every step of the way.
The sociologist Diane Vaughan coined the phrase “the normalization of deviance” to describe a cultural drift in which circumstances classified as “not okay” are slowly reclassified as “okay.” In the case of the Challenger space-shuttle disaster—the subject of a landmark study by Vaughan—damage to the crucial O‑rings had been observed after previous shuttle launches. Each observed instance of damage, she found, was followed by a sequence “in which the technical deviation of the [O‑rings] from performance predictions was redefined as an acceptable risk.” Repeated over time, this behavior became routinized into what organizational psychologists call a “script.” Engineers and managers “developed a definition of the situation that allowed them to carry on as if nothing was wrong.” To clarify: They were not merely acting as if nothing was wrong. They believed it, bringing to mind Orwell’s concept of doublethink, the method by which a bureaucracy conceals evil not only from the public but from itself.
If that comparison sounds overwrought, consider the words of Denny Gioia, a management professor at Penn State who, in the early 1970s, was the coordinator of product recalls at Ford. At the time, the Ford Pinto was showing a tendency to explode when hit from behind, incinerating passengers. Twice, Gioia and his team elected not to recall the car—a fact that, when revealed to his M.B.A. students, goes off like a bomb. “Before I went to Ford I would have argued strongly that Ford had an ethical obligation to recall,” he wrote in the Journal of Business Ethics some 17 years after he’d left the company. “I now argue and teach that Ford had an ethical obligation to recall. But, while I was there, I perceived no strong obligation to recall and I remember no strong ethical overtones to the case whatsoever.”
What, Gioia the professor belatedly asked, had Gioia the auto executive been thinking? The best answer, he concluded, is that he hadn’t been. Executives are bombarded with information. To ease the cognitive load, they rely on a set of unwritten scripts imported from the organization around them. You could even define corporate culture as a collection of scripts. Scripts are undoubtedly efficient. Managers don’t have to muddle through each new problem afresh, Gioia wrote, because “the mode of handling such problems has already been worked out in advance.” But therein lies the danger. Scripts can be flawed, and grow more so over time, yet they discourage active analysis. Based on the information Gioia had at the time, the Pinto didn’t fit the criteria for recall that his team had already agreed upon (a clearly documentable pattern of failure of a specific part). No further thought necessary.
Sometimes a jarring piece of evidence does intrude, forcing a conscious reassessment. For Gioia, it was the moment he saw the charred hulk of a Pinto at a company depot known internally as “The Chamber of Horrors.” The revulsion it evoked gave him pause. He called a meeting. But nothing changed. “After the usual round of discussion about criteria and justification for recall, everyone voted against recommending recall—including me.”
The most troubling thing, says Vaughan, is the way scripts “expand like an elastic waistband” to accommodate more and more divergence. Morton-Thiokol, the NASA contractor charged with engineering the O-rings, requested a teleconference on the eve of the fatal Challenger launch. After a previous launch, its engineers had noticed O-ring damage that looked different from damage they’d seen before. Suspecting that cold was a factor, the engineers saw the near-freezing forecast and made a “no launch” recommendation—something they had never done before. But the data they faxed to NASA to buttress their case were the same data they had earlier used to argue that the space shuttle was safe to fly. NASA pounced on the inconsistency. Embarrassed and unable to overturn the script they themselves had built in the preceding years, Morton-Thiokol’s brass buckled. The “no launch” recommendation was reversed to “launch.”
“It’s like losing your virginity,” a NASA teleconference participant later told Vaughan. “Once you’ve done it, you can’t go back.” If you try, you face a credibility spiral: Were you lying then or are you lying now?
by Jerry Useem, The Atlantic