Friday, July 4, 2014

The Fear Factor

In 2010, a respected international team published a study finding that old age generally arrives later than the dependency ratio assumes—if old age is defined as the point at which older people need permanent care, that is, when they are disabled. The demographers Warren C. Sanderson and Sergei Scherbov wrote in Science magazine, “Alternative measures that account for life-expectancy changes”—improvements in health and longevity—“show slower rates of aging than their conventional counterparts,” based on “fixed chronological ages.”

They wrote that chronological age is less useful than life expectancies in predicting national health costs, because “most of those costs occur in the last few years of life.” Sanderson and Scherbov developed a measure they called the adult disability dependency ratio, defined as the number of adults 20 and over with disabilities, divided by the number of adults 20 and over without them. In the United States, this measure will likely remain flat for the next generation, meaning that the cost of caring for the disabled is not likely to skyrocket as a result of a major increase in the number of disabled people.
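The ratio itself is simple arithmetic: the count of adults 20 and over with disabilities divided by the count of those without. The short sketch below merely restates that definition in code; the function name and the counts are illustrative placeholders, not figures from Sanderson and Scherbov's paper.

```python
def adult_disability_dependency_ratio(disabled_adults_20_plus, nondisabled_adults_20_plus):
    """Adults 20 and over with disabilities, divided by adults 20 and over without them."""
    return disabled_adults_20_plus / nondisabled_adults_20_plus

# Illustrative (made-up) counts, in millions of people:
print(adult_disability_dependency_ratio(20.0, 200.0))  # -> 0.1
```

It is the flat trajectory of this ratio, rather than the raw count of people past a fixed birthday, that underlies the projection that care costs will not skyrocket.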

John Shoven, a Stanford economist, takes that idea a step further: in a scholarly paper called “New Age Thinking,” he argues that age should be defined differently from the universal convention of years since birth. “The measurement of age with different measures is not like choosing between measuring temperature on a Fahrenheit or Centigrade scale,” he warned. The reason to change how age is measured is that the connection between the universal definition of age and the alternatives he proposes is constantly changing. Because of advances in nutrition, sanitation, and other factors, as well as health care, someone who has lived a long time is no longer as old as his or her numerical age once indicated.

A man born in 1900 was expected to live until he was 51 ½ and had less than a 50 percent chance of living until he reached 65. A man born in 2000 is expected to live until he is 80 and has an 86 percent chance of reaching 65. That dramatic advance in longevity indicates that knowing how many years a person has been alive tells only so much about the person’s risk of dying.

Shoven proposes that instead of measuring age backward, as in years since birth, we measure it forward, as in years until projected death. One option is to measure age by mortality risk. A 51-year-old man in 1970 had the same mortality risk (a one percent chance of dying within the year) as a 58-year-old man in 2000: in one generation, longevity advanced by seven years for that level of risk. Another option is to measure age by remaining life expectancy, a more accessible measure because it is computed in years rather than as a percentage. In 1900, a man who reached 65 had a remaining life expectancy of about 13 years. In 2000, a man who reached 65 had a remaining life expectancy of about 21 years.
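Both of Shoven's forward-looking measures amount to lookups in a mortality table: one asks what age in a later year carries the same annual risk of death as a given age in an earlier year; the other simply reports remaining life expectancy. The sketch below shows the first lookup with a deliberately tiny, made-up table; only the one-percent entries echo the 51-versus-58 example above, and the names and data are illustrative, not Shoven's.

```python
# Illustrative lookup: which age in a later year carries the same annual
# mortality risk as a given age in an earlier year? Values are made up,
# except that the 1% entries echo the 51 (1970) vs. 58 (2000) example above.
MORTALITY_RISK = {
    1970: {50: 0.009, 51: 0.010, 52: 0.011},
    2000: {57: 0.009, 58: 0.010, 59: 0.011},
}

def equivalent_age(age, from_year, to_year, table=MORTALITY_RISK):
    """Age in to_year whose annual mortality risk is closest to age's risk in from_year."""
    target = table[from_year][age]
    return min(table[to_year], key=lambda a: abs(table[to_year][a] - target))

print(equivalent_age(51, 1970, 2000))  # -> 58
```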

Measuring backward yields starkly different results from measuring forward. “Consider two alternative definitions of who is elderly in the population,” Shoven writes, “those who are currently 65 or older and those who have a mortality rate of 1.5 percent or worse.” In 2007, when he wrote the paper, the two definitions picked out essentially the same group, because the average mortality rate for 65-year-olds was about 1.5 percent. According to the U.S. Census, the population of those who are 65 or older will increase from about 12.5 percent of the population now to about 20.5 percent in 2050. But “the percent of the population with mortality risks higher than 1.5 percent (currently also 12.5 percent of the population) never gets above 16.5 percent,” because of what James Fries of the Stanford School of Medicine called “the compression of morbidity”—the tendency of illnesses to occur during a short period before death if the first serious illness can be postponed. That number “is projected to be just slightly below 15 percent and declining by 2050.”

By the conventional measure of years since birth, the population considered elderly is expected to grow by 64 percent. By Shoven’s measure, on the other hand, it is expected to grow by just 32 percent. “The point,” he says, “is the great aging of our society is partly a straightforward consequence of how we measure age.”
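The two growth figures are just arithmetic on the percentages quoted above; a quick check, assuming nothing beyond the rounded numbers in the text:

```python
# Share of the U.S. population counted as "elderly" under each definition,
# using the rounded figures quoted in the text.
share_65_plus_now, share_65_plus_2050 = 12.5, 20.5   # 65-or-older definition
share_risky_now, share_risky_peak = 12.5, 16.5       # mortality risk of 1.5% or worse

print(round((share_65_plus_2050 / share_65_plus_now - 1) * 100))  # -> 64 (percent growth)
print(round((share_risky_peak / share_risky_now - 1) * 100))      # -> 32 (percent growth)
```

Note that the 32 percent is growth to the risk-based measure's peak; by 2050 that share is projected to have fallen back toward 15 percent.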

To Laura Carstensen, a psychologist who directs the Stanford Center on Longevity, the striking advance in lifespan requires “us to answer a uniquely twenty-first-century question: What are we going to do with super-sized lives?” In her book A Long Bright Future, she envisions a transformation in American culture and society that would “expand youth and middle age” as well as old age, in “a new model for longer life” that would “harness the best of each stage at its natural peak.”

She proposes that young adults should ease into the work force, “working fewer hours during the years that they’re caring for young children, completing their educations, and trying to find the right careers.” Around 40, full-time work life would begin, when people “have developed the emotional stability that guides them as leaders.” Older workers, rather than “vaulting into full retirement on their sixty-fifth birthdays,” would continue to work for more years but for fewer hours, and retirement “could be the pinnacle of life, rather than its ‘leftovers.’ ”

Carstensen’s proposal rests on findings in her work about the capabilities of older workers. She learned that they are generally more stable emotionally than younger workers and better at dealing with stress, and that while younger workers, by and large, pick up new information faster, older workers often have wider knowledge and more expertise. One important study by a group at the Rush University Medical Center casts doubt even on the cognitive advantage of younger workers. The decline in cognitive processing speed found in older workers turns out to be negligible when people who later developed Alzheimer’s disease are removed from the group studied. That is no small exclusion: about one out of every nine people 65 and over has Alzheimer’s disease.

Carstensen and others are building on the work of the late Robert N. Butler, a psychiatrist whose biographer described him as a “visionary of healthy aging.” The founding director of the National Institute on Aging at the National Institutes of Health, Butler believed that the extension of American lives—especially the extension of the healthy years—requires new thinking about some of America’s basic institutions. “Many of our economic, political, ethical, health, and other institutions, such as education and work life, have been rendered obsolete by the added years of life for so many citizens,” Butler wrote in his 2008 book, The Longevity Revolution.

Butler was a realist about the discrimination that older Americans can face in addition to declines in physical capability, health, and cognitive ability. He coined the term ageism for this form of discrimination and catalogued how it can manifest itself in problems finding appropriate work, housing, transportation, and satisfying other basic needs. But Butler was an optimist, convinced that many healthy older Americans represent not a liability but a great asset of experience, skill, and drive that the country should learn how to exploit.

In a nation whose motto is E pluribus unum, a fundamental disagreement about social policy in recent decades has been about how policymakers should reinforce the mutual support called for in the motto. They could emphasize the value of older Americans working on behalf of children in education, for example, and younger Americans supporting older ones who need help. Or policymakers could strive to ease the allegedly large conflict between generations over the allocation of scarce resources. The shorthand for this difference of opinion in our splintered political culture is “warfare” versus “interdependence” between the boomer generation and the generations that follow. Our emphasis should be on generational interdependence.

by Lincoln Caplan, American Scholar |  Read more:
Image: David Herbick