In commencement addresses like this, people admonish us: take risks; be willing to fail. But this has always puzzled me. Do you want a surgeon whose motto is “I like taking risks”? We do in fact want people to take risks, to strive for difficult goals even when the possibility of failure looms. Progress cannot happen otherwise. But how they do it is what seems to matter.

The key to reducing death after surgery was the introduction of ways to reduce the risk of things going wrong—through specialization, better planning, and technology. They have produced a remarkable transformation in the field. Not that long ago, surgery was so inherently dangerous that you would consider it only as a last resort. Large numbers of patients afterward developed serious infections, bleeding, and other deadly problems we euphemistically called “complications.” Now surgery has become so safe and routine that most of it is day surgery—you go home right afterward.
But there continue to be huge differences between hospitals in the outcomes of their care. Some places still have far higher death rates than others. And an interesting line of research has opened up asking why.
Researchers at the University of Michigan discovered the answer recently, and it has a twist I didn’t expect. I thought that the best places simply did a better job at controlling and minimizing risks—that they did a better job of preventing things from going wrong. But, to my surprise, they didn’t. Their complication rates after surgery were almost the same as others. Instead, what they proved to be really great at was rescuing people when they had a complication, preventing failures from becoming a catastrophe.
Scientists have given a new name to the deaths that occur in surgery after something goes wrong—whether it is an infection or some bizarre twist of the stomach. They call them a “failure to rescue.” More than anything, this is what distinguished the great from the mediocre. They didn’t fail less. They rescued more.
This may in fact be the real story of human and societal improvement. We talk a lot about “risk management”—a nice hygienic phrase. But in the end, risk is necessary. Things can and will go wrong. Yet some have a better capacity to prepare for the possibility, to limit the damage, and to sometimes even retrieve success from failure.
When things go wrong, there seem to be three main pitfalls to avoid, three ways to fail to rescue. You could choose a wrong plan, an inadequate plan, or no plan at all. Say you’re cooking and you inadvertently set a grease pan on fire. Throwing gasoline on the fire would be a completely wrong plan. Trying to blow the fire out would be inadequate. And ignoring it—“Fire? What fire?”—would be no plan at all. (...)
There was, as I said, every type of error. But the key one was the delay in accepting that something serious was wrong. We see this in national policy, too. All policies court failure—our war in Iraq, for instance, or the effort to stimulate our struggling economy. But when you refuse to even acknowledge that things aren’t going as expected, failure can become a humanitarian disaster. The sooner you’re able to see clearly that your best hopes and intentions have gone awry, the better. You have more room to pivot and adjust. You have more of a chance to rescue.
But recognizing that your expectations are proving wrong—accepting that you need a new plan—is commonly the hardest thing to do. We have this problem called confidence. To take a risk, you must have confidence in yourself. In surgery, you learn early how essential that is. You are imperfect. Your knowledge is never complete. The science is never certain. Your skills are never infallible. Yet you must act. You cannot let yourself become paralyzed by fear.
by Atul Gawande, The New Yorker