Saturday, October 6, 2012

Who Destroyed the Economy? The Case Against the Baby Boomers


Crescent Lake, Ore.--My father taught me how to throw a baseball and divide big numbers in my head and build a life where I'd be home in time to eat dinner with my kid most nights. He and my mother put me through college and urged me to follow my dreams. He never complained when I entered a field even less respected than his. He lives across the country and still calls just to check in and say he loves me.

His name is Tom. He is 63, tall and lean, a contracts lawyer in a small Oregon town. A few wisps of hair still reach across his scalp. The moustache I have never seen him without has faded from deep brown to silver. The puns he tormented my younger brother and me with throughout our childhood have evolved, improbably, into the funniest jokes my 6-year-old son has ever heard. I love my dad fiercely, even though he's beaten me in every argument we've ever had except two, and even though he is, statistically and generationally speaking, a parasite.

This is the charge I've leveled against him on a summer day in our Pacific Northwest vision of paradise. I have asked my favorite attorney to represent a very troublesome client, the entire baby-boom generation, in what should be a slam-dunk trial--for me. On behalf of future generations, I am accusing him and all the other parasites his age of breaking the sacred bargain that every American generation will pass a better country on to its children than the one it inherited.

We are sitting on a beach in late afternoon on a sun-drizzled lake in the Cascade Mountains, two college-educated, upper-middle-class white men settling in for a week of generational warfare. My son, Max, splashes in the waves with his grandmother; sunbathers lounge in inner tubes around us; snow-capped peaks loom above the tree line. The breeze smells of Coppertone and wet dog. My father thinks back on the country that awaited him when he finished law school. "There seemed to be a lot of potential," he says, setting up the first of many evasions, "but there weren't a lot of jobs."

I'm mildly impressed that he's even bothering to mount a defense. The facts as I see them are clear and damning: Baby boomers took the economic equivalent of a king salmon from their parents and, before they passed it on, gobbled up everything but the bones.

by Jim Tankersley, The Atlantic |  Read more:
Photo: Rob Finch Visuals

War Between the Sexes

Darwinian sexual selection has not, in general, selected for particularly cosy relations between the sexes. The praying mantis female often cannibalizes her mate; she bites his head just as he is delivering his sperm and then completes her meal when he’s done. Aside from a hard-to-interpret wiggle, he seems not to protest the terms of the sexual bargain because he is solitary and unlikely to score again. By contrast, the male bedbug is a brutal bully; he has evolved a dagger-like projection with which to slash the female’s abdomen. The more graceful water strider has two precision antennae that serve no other purpose than to hold females down. As for the toxin-loaded scorpion, he has evolved a special toxin-lite to subdue the female of his species.

And so it goes. Sex on six legs, or eight, can be a decidedly sordid affair. As Darwin himself observed, one should not look for moral uplift in nature. For the economist and Darwinist Paul Seabright, insect sex nonetheless neatly illustrates the dialectical nature of sexual evolution. Male strategies for “scoring” escalate over time. In dialectic tandem, so do female counterstrategies for evading undesirables and exerting some choice – overt or covert – in their affairs. This is the “war” of his title.

Game theory enables evolutionary biologists and economists such as Seabright to think of the so-called war of the sexes as a strategic game. In general, the male evolves to “want” to score at all costs – whether that means being a bully, a martyr or something else entirely. The female, however, “knows” the real stakes are viable offspring. Of course, neither sex “knows” or “wants”, which would imply sentience or introspection; rather, they are unconscious vehicles for such behaviours. Insects and humans alike, we are the descendants of those who happened to play the game exceptionally well.

Cars, sports, human plumage and a great deal else, according to Seabright, all exploit the same basic principle. The wastefulness of the billion-dollar cosmetics industry clearly dismays him even while it enables him to point out the connection between vanity, fertility cues and marketability. Much of the first half of his book cleverly relates the essence of life to cocktail party dynamics. Seabright puts it this way: “Like a conversation at a party with someone who cannot restrain himself from looking over your shoulder to see who else there might be to talk to, sexual relations in almost all species are clouded by the possibility that either partner might be better off with someone else now or in the future”.  (...)

Regarding romance, Seabright argues that we have evolved to be “a socially monogamous species but surreptitiously promiscuous”. Sexual conflicts of interest need not compromise our long-term unions. “We are the species for whom life is about partnerships” – even if every partnership harbours sublimated conflicts of interest. For Seabright as for Freud, the sexual partnership is the template for all others. We can’t create, any more than we can procreate, without others. We exist only in the cocktail party-distracted gaze of the other. Charm is about monopolizing that gaze. Adolescent schoolgirls know this better than anyone. (Seabright reminds his reader that female “cunning” can be charming, even glamorous.) So do business tycoons. Being in the dumpster where no one wants to partner up or collaborate with us makes us physically ill. Fearing the dumpster makes us neurotic. Winning the Oscar or its equivalent gives us extra years of life (compared to also-rans), according to a now-famous study, because everyone wants to partner or cooperate with us. In other words, our emotions and health are intimately tied up with where we stand in the cooperative or partnering hierarchy – with our bargaining power in effect.

by Michele Pridmore-Brown, The Times Literary Supplement |  Read more:

Unfriending Someone, Before Facebook

I realized the other day that I had been quietly unfriended on Facebook and I could not help but think how much better things were 50 years ago, when a relationship went south and you knew why.

Let me give you an example of how the people in my family unfriended someone when I was growing up in the Catskills: It is summer and my favorite city cousin, whom we shall call Ravishing Rachel because of the delicacy of the situation, is in the mountains with a boyfriend. My mother gets a call that one of her brothers, Rachel’s father, has had a fatal heart attack. She is naturally distraught and, after calling a few motels, finally tracks down her niece and breaks it to her.

“Ravishing, you tramp!” I hear her holler. “Your father is dead and you killed him.”

See how much better that was than Facebook? No confusion, no wondering why or when it happened. Yes, yes, I know what you are thinking: My mother was obviously the Emily Post of the Catskills, how many of us could hope to attain her command of language, her diplomacy and tact? It is amazing, you are thinking, that Jackie Kennedy didn’t ditch Letitia Baldrige and hire my mother to be her social secretary at the White House. And I would have to give you that. In my mother, one found a moral clarity that I think can only be compared to John Wayne, who unfriended people by shooting them dead.

Consider the grace with which my mother unfriended Cousin Marvin when he appeared on a summer day at our house:

“Marvin, I plain can’t stand you and nobody in the family can stand you. Stay at a motel.”

But why stop with my mother’s generation?

My grandmother Gussie, who conversed primarily in Yiddish and was so hazy about American customs that she understood Halloween to be a national holiday in which you give the children money, was a genius at terminating relationships. When someone, say Cousin Marvin, who just seemed to have one problem after another, got a divorce, a scandalous event at the time, my grandmother took out a pair of nail scissors and removed the face of Marvin’s ex-wife from the family photos, leaving for some reason the hair – well, Marvin’s wife did have very nice hair. She did the same thing with someone’s prom photos, after he had broken up with the girl. I kept expecting to see it in The National Enquirer: “Upstate Boy Takes Faceless Girl to Prom.”

by Joyce Wadler, NY Times |  Read more:
Illustration: NY Times

Friday, October 5, 2012

Grateful Dead



Drinking is bad, feelings are worse.
via:

Beyoncé


The Things They Carried: At The National Wife-Carrying Championships

It's happened before—for Dave and for his wife Lacey, perennial contenders at the North American Wife-Carrying Championships, a raucous gathering attended by both fitness fiends and softies like me who think, wrongly, that Wife-Carrying is an easy kind of carnival game. It's not.

The basics on the most difficult nuptial sport around: fifty couples run in the Championships, two in each heat, and the two best times overall make the finals. The husband dangles his wife upside-down over his head and tries to traverse a hilly, sloppy, divot-filled, 278-yard obstacle course as fast as he can while gradually coming to realize, once and for all, that his hamstrings are useless. He thinks, Oh God, I'm about to tumble in the most emasculating fashion possible. That was my experience, at least.

But Dave Castro did much better than that. He came up only three feet short, and he hasn't stopped thinking about it since.

***

Well before these Wife-Carrying Championships came to Newry, Maine's Sunday River Ski Resort a decade ago, and well before Dave and Lacey Castro were a world-class pair, Wife-Carrying already had a long, half-nefarious history. While Wife-Carrying sounds like something an enterprising Jack might have invented in 1956 to amuse the local Rotarians, it's actually a centuries-old Scandinavian tradition.

Story goes that a Finnish robber named Herkko Rosvo-Ronkainen used women as training weights to prepare his marauding bands for their raids on nearby villages. That's the sanitized tale, at least; it's also possible that modern-day Wife-Carrying has its origins in Rosvo-Ronkainen's knack for women-napping. This is part of why I'd been a bit reluctant to haul my progressive wife on my back like she was some sort of nifty plunder.

“Some people say, 'Oh, Wife-Carrying is such a sexist sport,'" Lacey Castro says. "But I feel like I've been training just as long as my husband has. The women are forcing the husbands to do it just as much as the other way.”

The Castros have forced themselves into phenomenal shape for the one minute a year when they have a chance to be the absolute best at something. Silly as that something may seem to an outsider, they take the Championships seriously, and they're made for them. Lacey, a body-builder with about a thimble's worth of body fat, weighs in at around 108 pounds, which is the minimum for wives; Dave, compact and slim, has the bulging calves of a UPS driver. Which is because he is a UPS driver.

“He's basically training all the time,” Lacey told me. “Five days a week he's running around with 50-pound boxes from door to door.” (...)

Wife-Carrying is, honestly, mostly Estonian, if only because the Estonian Carry is the sport's answer to the Fosbury Flop: both stylistic signature and agreed-upon best practice. There is no official technique in Wife-Carrying—Piggybacks, Fireman's Carries, and Honeymoon Threshold Lifts are all legal. But every champion has used the Estonian Carry, most notably the Estonian Uusorg brothers, the most decorated spouse-hauling athletes and rivals of Miettenen's for international supremacy. At the opposite end of the prestige spectrum, I too would need to master this Baltic carry for my own turn on the Wife-Carrying Championship course.

Since the North American Championship was my first attempt at extreme sports with my wife—they'd given us a wild card entry—I wasn't sure I could convince her to dangle near my rear as I traversed a slippery ski hill. She'd agreed to participate because I'd made some claim about Immersion Journalism, but we'd yet to get to the touchy subject of head-ass-ground proximity. I turned to the Castros for advice.

“Tell her, 'Look, you might be a little dizzy, but just hold on',” Lacey said. “Go right into the Estonian.”

Dave added: “You'll put her on your back. She might not like it at first. But it'll be good.”

I was convinced, but I was not really the one who needed convincing. Megan and I needed to practice, so before the competition, we took to the field between our apartment and the public library. It was dark. I popped her up on my shoulders and chugged up a hill. Right then, the local book club let out and we were bathed in the accusing glare of a dozen pairs of headlights. Undaunted, I took 17 laps around an oak tree and put her down. She was flushed beneath her helmet, either from embarrassment or because half her blood supply had rushed straight to her cheeks, or both.

by David Wanczyk, The Classical |  Read more:

What Will Ice-Free Arctic Summers Bring?

On Sunday, September 16, the sun did not rise above the horizon in the Arctic. Nevertheless enough of the sun's heat had poured over the North Pole during the summer months to cause the largest loss of Arctic sea ice cover since satellite records began in the 1970s. The record low 3.41 million square kilometers of ice shattered the previous low—4.17 million square kilometers—set in 2007. All told, since 1979, the Arctic sea ice minimum extent has shrunk by more than 50 percent—and even greater amounts of ice have been lost in the corresponding thinning of the ice, according to the U.S. National Snow and Ice Data Center (NSIDC).

"There is much more open ocean than there used to be," says NSIDC research scientist Walt Meier. "The volume is decreasing even faster than the extent [of surface area] as best as we can tell," based on new satellite measurements and thickness estimates provided by submarines. Once sea ice becomes thin enough, most or all of it may melt in a single summer.

Some ice scientists have begun to think that the Arctic might be ice-free in summer as soon as the end of this decade—leaving darker, heat-absorbing ocean waters to replace the bright white heat-reflecting sea ice. The question is: Then what happens? Although the nature and extent of these rapid changes are not yet fully understood by researchers, the impacts could range from regional weather-pattern changes to global climate feedbacks that exacerbate overall warming. As Meier says: "We expect there will be some effect…but we can't say exactly what the impacts have been or will be in future."

On thin ice

Arctic ice influences atmospheric circulation and, hence, weather and climate. Take away the ice and impacts seem sure to follow. There's more warming to come, as well, particularly in the Arctic, which is warming faster than the rest of the globe. Given cumulative greenhouse gas emissions, there's likely at least as much warming to come as has occurred to date—a rise of 0.8 degree Celsius in global average temperatures, most of that in the past 30 years.

The warmer Arctic waters and land have also begun to release methane, a short-lived but potent greenhouse gas that is also the primary hydrocarbon in natural gas fuel. The Arctic Ocean alone contains more methane than the rest of the world's oceans combined—though when and even if such a thawing would contribute a massive methane release remains a "known unknown" in the words of former Defense Secretary Donald Rumsfeld and oceanographer Wieslaw Maslowski of the Naval Postgraduate School in Monterey. "If we release that methane, we will amplify global warming by an unknown amount," Maslowski says. "We have no idea."

by David Biello, Scientific American |  Read more:
Image: Courtesy of NASA

09-30-12 (by Lee Kaloidis)
via:

The Cup Holders Runneth Over

From the inside, the new electric BMW i3 is airy and light. This, says its designer, Benoit Jacob, will produce a peaceful environment that influences the driver's state of mind. A calming interior, together with natural materials (including wool, vegetable-dyed leather and eucalyptus wood sourced from sustainable forests in Europe), will coax us into behaving responsibly. I interrupt his musing on automobile psychology with a query: where are the cup holders?

"There will be cup holders," he sighs. "The world wants cup holders. Designers are crying, 'oh shit, another cup holder'." But Jacob is smiling. He knows that cup holders matter. The outside of a car gets all the attention—at first. Yet it is inside that we sit, often for long stretches, and nowadays most of us want to do more than just listen to the radio. (...)

Car design began on June 23rd 1927, when the executive committee of General Motors (GM) met in Detroit and approved the creation of a new department to "study the question of art and colour combinations". Harley Earl, who made customised car bodies for Hollywood stars, was hired as its leader. Over the coming decades, Earl developed the idea of making concept cars, both to get a better idea of what production vehicles would look like and to drum up interest in new models. He also came up with the annual model change and, brilliantly, put tail fins on Cadillacs. Earl was also the first to hire female designers, thus beginning the process of feminising the automobile.

Today GM has 1,900 designers in ten design centres in seven countries. Many are specialists in safety, aerodynamics, materials, colours or ergonomics: there is even a specialist looking after cup holders. Cup holders got going in the 1980s, although some were available well before. The idea came from the spread of drive-in restaurants and cinemas in America, as people parked up and wanted somewhere to put their drink. Now they take the drink for a drive.

Many European carmakers at first resisted fitting cup holders, only to capitulate in time. Some regional differences remain: an American car could have a dozen cup holders, a European one only half that. And the American cup holders are generally bigger (to hold those supersized Cokes). In Asia, the cup holders have to take drinks that come in square containers. (...)

by Paul Markillie, Intelligent Life |  Read more:
Illustration: Nick Hardcastle

Meet the High Priest of Runaway College Inflation (He Regrets Nothing)


Shortly after Stephen Trachtenberg announced in 2007 that he would step down as president of George Washington University, he strolled alone from his office to the school's library, where he often held informal office hours at the adjoining Starbucks. He bought a cup of coffee and sat at a table outside, contemplating his success.

The valedictory beverage felt well earned. Since Trachtenberg took over in 1988, he had boosted the school's endowment from $250 million to $1 billion and built many state-of-the-art facilities, such as computer and research labs. The profusion of comforts didn't just stimulate students' minds; it also fulfilled their every whim--a change that drew a more selective, more intelligent group of applicants and sent the admission rate plummeting from 75 percent to 37 percent. "It was a very soothing, very beautiful experience and gave me a great sense of satisfaction about my tenure as president," Trachtenberg wrote in his memoir, Big Man on Campus.

Trachtenberg's students funded this triumph. When he became president, they paid $25,000 (in today's dollars) in tuition, room, and board to attend; by the time he retired, they paid $51,000. Trachtenberg made George Washington the most expensive school in the nation. The burst of cash powered his agenda, but the freshmen who borrowed to enroll--46 percent of the class--during his final year graduated with an average of $28,000 of debt.

Trachtenberg also set a trend that other colleges--first his private competitors, then universities across the country--felt compelled to follow. Today, George Washington is only the 21st most expensive school, and the average American student accumulates $24,300 of debt earning her diploma. Collectively, Americans hold more student-loan debt than credit-card debt, and graduates enter a world where more than half of them are jobless or underemployed. (...)

The way Trachtenberg saw it, selling George Washington over the other schools was like selling one brand of vodka over another. Vodka, he points out, is a colorless, odorless liquid that varies little by maker. He realized the same was true among national private universities: It was as simple as raising the price and upgrading the packaging to create the illusion of quality. Trachtenberg gambled that prospective students would see costly tuition as a sign of quality, and he was right. "People equate price with the value of their education," he says.

Trachtenberg was hardly the first to reach this conclusion, but under his leadership, George Washington was peerless in following its logic. He didn't spend the tuition windfall to shift the professor-to-student ratio or overhaul the curriculum. Instead, he covered the campus in cafés, beautiful study spaces, and nicer dorms. Trachtenberg thought that construction on campus gave the appearance that the school was financially sound and was progressing toward a goal, so his policy was, "Never stop building." If he wanted to erect or renovate two buildings, he would stagger the projects so that jackhammers could be heard constantly around campus. He also introduced a three-day orientation, known as Colonial Inauguration, that featured ice-cream socials, casino nights, and a laser show that cost $2,500 per minute.

by Julia Edwards, The Atlantic |  Read more:
Photo: Richard A. Bloom

Thursday, October 4, 2012

Custom Namiki Falcon Resin Fountain Pen HD


[ed. Strangely hypnotic, and beautiful.]

Aimee Mann


A Great Pair of Bookshelf Speakers

Pioneer is known for many things in the home theater and A/V world: a substantial and well respected line of receivers, one of the best HDTVs ever made…laser discs. Affordable bookshelf speakers have never really made the list though. If the BS41s are any indication, that may be changing very soon.

These budget-minded speakers are some serious giant killers according to a number of well respected reviewers and people who own them. And if you want a fantastic pair of two-ways that perform as well as speakers costing 3-4 times as much, our research says these are the ones to get.

Designed by legendary audio engineer Andrew Jones–the same guy who came up with the TAD Reference One loudspeaker ($70,000/pair)–these more modestly priced $150 speakers still have many of the high-end details. These include things like radio frequency bonded, curved cabinets, improved multi-component crossovers, and gold-plated five-way binding posts.

Stereophile's Robert Reina was floored by their performance, saying that both the quantity and the quality of the Pioneer's bass reproduction was excellent. "One would expect the sound of a $149.99/pair bookshelf model to include some serious compromises and tradeoffs," he says, "but within its size limitations, the Pioneer SP-BS41-LR has none."

In his review, Reina compared the BS41s to a number of other well-known entry-level benchmark bookshelves, including the Paradigm Atom v.6 ($250/pair) and the Wharfedale Diamond 10.1 ($350/pair). According to Reina, the Pioneer bested the more expensive Paradigm in nearly all ways (with a richer, more detailed midrange and cleaner, more extended highs). In fact, the BS41s came very close to matching the Wharfedale Diamond 10.1, generally considered to be the reigning king of affordable loudspeakers.

by Brian Lam, Wirecutter |  Read more:

Michelin, Get Out of the Kitchen!

A little more than a hundred years ago, a pair of brothers invented the food guide. It was an inadvertent invention. What they thought they’d done was compile a directory of places in France where you could grab a baguette and a bed for the night while some rural blacksmith or farrier tried to mend your broken-down Boitel, Motobloc, Otto, or Lacoste & Battmann. The brothers, Édouard and André Michelin, made pneumatic tires and were staring down the road at the biggest blue-sky start-up industry of the new century.

The Michelin guide turned out to be prescient and inspired. This motoring thing wasn’t going to be about what you went in but where you went to. The guide quickly became not an emergency manual but a destination invitation. They added a star system—one, two, or three stars—and a hieroglyphic lexicon to show you where you could eat on a terrace, take your dog, or make a phone call.

The Michelin guide made kitchens as competitive as football teams, becoming the most successful and prestigious guidebook in the world, and along the way it killed the very thing it had set out to commend. It wasn’t the only assassin of the greatest national food ever conceived, but it’s not hyperbole to say Michelin was French haute cuisine’s Brutus. (...)

You think three-star food is expensive, but it’s nothing compared with compiling the world’s most famous guide. Michelin doesn’t say how many inspectors it has, what it pays them, how often they visit each establishment—they claim at least once a year—or what their expenses are like, but you do the math. Consider how many more restaurants there are now than there were 30 years ago. It’s a very, very expensive production. When the occasional ex-inspector goes public, there are stories of exhausting and unsustainable lives on the road, covering vast areas where the pleasure of food is made a relentless and lonely craft. There are admissions that many dining rooms are not revisited year after year.

But still, Michelin has launched in a number of foreign countries. And though it claims its standards are universal and unimpeachable, it proves how Francophile and bloated and snobbish the whole business really is and that, far from being a lingua franca, the food on our plate is as varied as any other aspect of a national culture. For instance, Italy has absurdly few three-star restaurants, apparently because the criteria of complexity and presentation aren’t up to Michelin—French—standards, and the marvelously rich and varied curries of India plainly seem to baffle the guide. The city with the most stars is Tokyo, but then, many of its restaurants have barely a handful of chairs, and most benefit from the Gallic reverence for O.C.D. saucing and solitary boy’s knife skills. In both London and New York, the guide appears to be wholly out of touch with the way people actually eat, still being most comfortable rewarding fat, conservative, fussy rooms that use expensive ingredients with ingratiating pomp to serve glossy plutocrats and their speechless rental dates.

The New York guide has also swapped the dry information of the original for short, purple reviews. Food writing is already the recidivist culprit of multiple sins against both language and digestion, but the little encomiums of the Michelin guide effortlessly lick the bottom of the descriptive swill bucket. Take this, for instance, but only if you have a paper bag close at hand: “Can something be too perfect? Can its focus be so singular, pleasure so complete, and technique so flawless that creativity suffers? Per Se proves that this fear is unfounded.” That was written in chocolate saliva. Or this: “Devout foodies are quieting their delirium of joy at having scored a reservation—everyone and everything here is living up to the honor of adoring this extraordinary restaurant … Uni with truffle-oil gelée and brioche expresses the regret that we have but three stars to give.” That’s not a review of Chef’s Table at Brooklyn Fare—it’s a handjob.

by A.A. Gill, Vanity Fair |  Read more:
Illustration: Chris Crisman

The Dementia Plague


Evelyn C. Granieri is that rarest of 21st-century doctors: she still makes house calls. On a warm Thursday morning toward the end of August, the New York–based geriatrician, outfitted in a tailored white suit and high heels, rang a doorbell at a seven-story red-brick apartment building in the Riverdale section of the Bronx and was buzzed in.

"You look gorgeous!" the doctor exclaimed when she greeted her patient, a 99-year-old woman with white hair and a wry smile, in the dining room of her apartment. In an hourlong conversation, Mrs. K (as we'll call her) recalled, in moving and sometimes mischievous detail, growing up in Poland, where soldiers on horseback took her brother away; coming to America on a ship and working in her parents' grocery store in Queens; and dealing with male colleagues in the real-estate business when they got "fresh." But when Granieri asked how old Mrs. K was when she got married, she looked puzzled.

"I can't remember," she said after a pause. A cloud passed over her face. "Was I married? To whom?" A framed photograph on a nearby table memorialized her 50th wedding anniversary.

Spirited and funny, her personality intact even as her memory deteriorates, Mrs. K is one of more than five million Americans with dementia. Far from the gleaming research centers where scientists parse the subtle biochemical changes associated with Alzheimer's disease and other forms of the condition, clinicians like Granieri, chief of the Division of Geriatric Medicine and Aging at Columbia University Medical Center, confront its devastating reality every day. And, often, they talk to relatives of patients. As Granieri and two interns probed Mrs. K's memory with small talk and measured her blood pressure, a niece called from Manhattan to see how her aunt was doing.

Almost every dementia patient has worried family members huddled in the background, and almost every story about dementia includes a moment when loved ones plead with the doctor for something—any medicine, any intervention, anything—to forestall a relentless process that strips away identity, personality, and ultimately the basic ability to think. Unfortunately, Evelyn Granieri is the wrong person to ask. In 2010 she served on a high-level panel of experts that assessed every possible dementia intervention, from expensive cholinesterase-­inhibiting drugs to cognitive exercises like crossword puzzles, for the National Institutes of Health; it found no evidence that any of the interventions could prevent the onslaught of Alzheimer's. She can—with immense compassion, but equally immense conviction—explain the reality for now and the immediate future: "There really is nothing." Dementia is a chronic, progressive, terminal disease, she says. "You don't get better, ever."

These conversations have always been difficult for doctors and families alike, but perhaps never more so than in the past year, when public reports about dementia research have bounced between optimism and gloom. In the fall of 2011, financial analysts were giddily projecting a global Alzheimer's market of $14 billion a year by 2020 and touting a new generation of drugs known as monoclonal antibodies that were in advanced human trials. A year later, the prospect for the drugs no longer looked so positive. This past August, the giant drug makers Pfizer and Johnson & Johnson suspended advanced clinical trials of one of the monoclonals because it showed no effect in patients with mild to moderate Alzheimer's. A few weeks later, another leading pharmaceutical manufacturer, Eli Lilly, announced inconclusive results for a monoclonal drug it too was testing against the protein deposits called amyloid plaques that are characteristically found in the brains of Alzheimer's patients. The disheartening results prompted some critics to start writing epitaphs for the prevailing hypothesis about the disease—that these amyloid deposits are causing the cognitive impairment.

"The field is in a precarious place right now," says Barry D. Greenberg, director of strategy for the Toronto Dementia Research Alliance, "because tens of billions of dollars have been invested in the development of new treatments, and nothing—not a single disease-modifying agent—has been identified." Granieri often sets off on her house calls from her second-floor office at Allen Hospital—literally the last building in Manhattan, on the northernmost tip of Broadway. That may sound like an out-of-the-way outpost in medicine's battle against dementia, but in reality it sits at ground zero for the looming medical and societal catastrophe. The hospital's catchment area includes upper Manhattan and parts of the Bronx, one of the three densest concentrations of nursing home facilities in the entire United States, according to Granieri. "Here we sit, right in an epicenter," she says.

The epicenter is a contentious place these days. Frontline clinicians like Granieri are increasingly frustrated with the narrowness of dementia research. In the patients they treat every day, they see a disease that is complicated and insidious, often with multiple causes and murky diagnostic distinctions. In contrast, they see a research enterprise focused on several favorite hypotheses, and they see a drug industry that has profited handsomely from expensive, marginally effective treatments sought by desperate families.

Academic and pharmaceutical researchers, meanwhile, continue to throw money at the dementia problem—but finally, they insist, with better aim and much shrewder treatment strategies. They have begun to assemble a list of diagnostic markers that they believe may reliably indicate the first signs of Alzheimer's disease 10 or 15 years before symptoms appear, and they are gearing up to test new drugs that can be given to healthy patients, in an attempt to block the buildup of amyloid long before dementia's onset. Indeed, to hear researchers tell it, this summer's highly publicized clinical-trial failures are already ancient history. They are finally doing the right kind of science and hope to get the right kinds of answers, the first glimpses of which may appear in the next several years.

As Granieri and other physicians who treat dementia patients know, the stakes could scarcely be higher.

by Stephen S. Hall, MIT Technology Review |  Read more:
Illustration: William Utermohlen—Blue Skies (detail), 1995