Thursday, January 17, 2013

A Modest Proposal

Can we all just finally admit that wine people are in desperate need of a reality check on Bordeaux? The sooner we do, the better off we will all be. Even Bordeaux itself — the entire region and its thousands of wine producers, not just the First Growths — will be better off. By focusing so much on the top end, Bordeaux has become almost entirely irrelevant to two generations of wine drinkers.

The Bordeaux backlash began to gain steam during all the hyperbolic critical attention for the 2009 vintage, and its record-setting prices. New York Times wine critic Eric Asimov wrote that “for a significant segment of the wine-drinking population in the United States, the raves heard around the world were not enough to elicit a response beyond, perhaps, a yawn.”

A few months later, the Wall Street Journal’s wine critic (and occasionally famous novelist) Jay McInerney bluntly asked: “Does Bordeaux still matter?” McInerney recounted boos at a fine wine auction when an offering of Bordeaux was announced. “For wine buffs with an indie sensibility,” he wrote, “Bordeaux is the equivalent of the Hollywood blockbuster, more about money than about art.” As sort of a hedge, he added: “Bordeaux bashing has become a new form of wine snobbery.”

A year later, the Journal’s other wine critic, Lettie Teague, wrote about how wine drinkers “shy away from Bordeaux, dismissing it as too expensive, too old-fashioned, too intimidating or simply too dull.”

Top sommeliers have weighed in, too. At Terroir, Paul Greico’s trend-setting New York wine bar, it is often noted that, despite offering over 50 wines by the glass, there is not one from Bordeaux. But perhaps the most damning rebuke of Bordeaux came last summer, from Pontus Elofsson, the sommelier at the cutting-edge Copenhagen restaurant Noma, voted “best restaurant in the world” three years running. Elofsson steadfastly refuses to carry Bordeaux on Noma’s wine list.

So it was with all this venom as backdrop that I made my first visit to Bordeaux last spring.

My friend was right. Even for someone who writes about wine, Bordeaux is totally intimidating. It hit me when I found myself sitting uneasily in the tasting parlor of Château La Mission Haut-Brion in the company of Prince Robert de Luxembourg, the chateau’s royal managing director.

Prince Robert told me that the big-time critics like Parker and James Suckling had visited here the week before. During our chit-chat, I mentioned this was my first trip to Bordeaux, and the Prince guffawed, incredulously. “Never been to Bordeaux? And you write about wine?”

“Um, well…yeah?” I said, backpedaling. “I guess I’ve just spent most of my time in places like Italy and Spain and Portugal. And other parts of France? I don’t know. Italy I guess is where most of my wine knowledge has come from.”

“Oh,” said the Prince, in a grand princely fashion, “so you are an expert in Italian wines? Ha. Well, we have an Italian wine expert here!” I haven’t felt so foolish since middle school when I forgot to wear shorts to a basketball game, and pulled down my sweatpants to reveal my tighty-whities to the crowd. The message from Prince Robert seemed to be: How the hell did you get an appointment to taste wines with me?

I looked around at the regal tasting room, with the heavy wood furniture and the bust of someone famous, and the high-seated chairs where the important wine critics swirl and spit and opine and move cases of thousand-dollar wine. And I decided to jump right in with a question that may have been impolite: “A lot of wine writers and sommeliers back in the States say that Bordeaux isn’t really relevant anymore. What do you say to those people?”

“The fact is,” said Prince Robert, “that people need to write about something. And Bordeaux is obviously so relevant that they need to write something about Bordeaux. It’s the tall poppy syndrome.”

Prince Robert clearly had answered this question many times before. “I would ask other winemakers around the world and they will tell you that Bordeaux would be the benchmark by which to judge all other wines,” he said. “There are no wines in the world that receive more excitement.”

“But wait,” I said. “Aren’t you worried that younger people aren’t drinking Bordeaux? That it’s not even on their radar? Aren’t you afraid that when this generation can finally afford your wines, they won’t care about them?”

“Yes, the young wine drinker likes the simplicity of New World wines. Wines that are easy to explain,” he said, and I’m not sure I can properly convey just how much contempt dripped from the Prince’s voice. “Anyway, I am confident that people will come back to the great wines of Bordeaux.”

“There has never been more demand for the top-end wines,” he added. This may be true, but we all know that the market is now being driven, in large part, by newer collectors in Asia. One might reasonably hypothesize that tastes will eventually change in China and India, too, just as they have in the United States in the decades since 1982, when Americans “discovered” Bordeaux (via Robert Parker). Surely by now there is a Chinese Robert Parker? And won’t there, in the not-so-distant future, be a backlash against Bordeaux led by young, tattooed, hipster Chinese sommeliers?

I didn’t get to ask these questions because, apparently, our conversation bored the Prince. He rose from his chair, bid me adieu and wished me a good first trip to Bordeaux. “Enjoy those Italian wines,” he said, with a smile and a wink.

I was then left to taste nine wines from the 2011 vintage with the public relations person. How were the wines? Amazing. No doubt about it. The flagship first label wine was more complex and dense and rich than just about anything else I’ve ever tasted. But at what price? Château Haut-Brion 2009 has been listed at $1,000 a bottle. I tasted the only ounce of the 2011 that I will likely ever taste, one ounce more than most of my friends and readers will likely ever taste. Will my description inspire you to drink Bordeaux? I mean, one of my friends drove a Ferrari once and another once had sex with an underwear model, but neither of their descriptions has exactly brought me closer to the same experience.

by Jason Wilson, The Smart Set |  Read more:
Photo: uncredited

When Pills Fail, There Are Other Options

The treatment may sound appalling, but it works.

Transplanting feces from a healthy person into the gut of one who is sick can quickly cure severe intestinal infections caused by a dangerous type of bacteria that antibiotics often cannot control.

A new study finds that such transplants cured 15 of 16 people who had recurring infections with Clostridium difficile bacteria, whereas antibiotics cured only 3 of 13 and 4 of 13 patients in two comparison groups. The treatment appears to work by restoring the gut’s normal balance of bacteria, which fight off C. difficile.

The study is the first to compare the transplants with standard antibiotic therapy. The research, conducted in the Netherlands, was published Wednesday in The New England Journal of Medicine.

Fecal transplants have been used sporadically for years as a last resort to fight this stubborn and debilitating infection, which kills 14,000 people a year in the United States. The infection is usually caused by antibiotics, which can predispose people to C. difficile by killing normal gut bacteria. If patients are then exposed to C. difficile, which is common in many hospitals, it can take hold.

The usual treatment involves more antibiotics, but about 20 percent of patients relapse, and many of them suffer repeated attacks, with severe diarrhea, vomiting and fever.

Researchers say that, worldwide, about 500 people with the infection have had fecal transplantation. It involves diluting stool with a liquid, like salt water, and then pumping it into the intestinal tract via an enema, a colonoscope or a tube run through the nose into the stomach or small intestine.

Stool can contain hundreds or even thousands of types of bacteria, and researchers do not yet know which ones have the curative powers. So for now, feces must be used pretty much intact.

Medical journals have reported high success rates and seemingly miraculous cures in patients who have suffered for months. But until now there was room for doubt, because no controlled experiments had compared the outlandish-sounding remedy with other treatments.

The new research is the first to provide the type of evidence that skeptics have demanded, and proponents say they hope the results will help bring fecal transplants into the medical mainstream, because for some patients nothing else works.

“Those of us who do fecal transplant know how effective it is,” said Dr. Colleen R. Kelly, a gastroenterologist with the Women’s Medicine Collaborative in Providence, R.I., who was not part of the Dutch study. “The tricky part has been convincing everybody else.”  (...)

Dr. Keller said that patients were so eager to receive transplants that they would not join the study unless the researchers promised that those assigned to antibiotics alone would get transplants later if the drugs failed.

Among the 16 who received transplants, 13 were cured after the first infusion. The other three were given repeat infusions from different donors, and two were also cured. In the two groups of patients who did not receive transplants, only 7 of 26 were cured.

Of the patients who did not receive transplants at first and who relapsed after receiving antibiotics only, 18 were subsequently given transplants, and 15 were cured.

The study was originally meant to include more patients, but it had to be cut short because the antibiotic groups were faring so poorly compared with the transplant patients that it was considered unethical to continue.

by Denise Grady, NY Times |  Read more:
Gretchen Ertl for The New York Times

Auguste Rodin, I am Beautiful 1882.

Wednesday, January 16, 2013

‘Survival of the Wrongest’


In late 2011, in a nearly 6,000-word article in The New York Times Magazine, health writer Tara Parker-Pope laid out the scientific evidence that maintaining weight loss is a nearly impossible task—something that, in the words of one obesity scientist she quotes, only “rare individuals” can accomplish. Parker-Pope cites a number of studies that reveal the various biological mechanisms that align against people who’ve lost weight, ensuring that the weight comes back. These findings, she notes, produce a consistent and compelling picture by “adding to a growing body of evidence that challenges conventional thinking about obesity, weight loss, and willpower. For years, the advice to the overweight and obese has been that we simply need to eat less and exercise more. While there is truth to this guidance, it fails to take into account that the human body continues to fight against weight loss long after dieting has stopped. This translates into a sobering reality: once we become fat, most of us, despite our best efforts, will probably stay fat.”

But does this mean the obese should stop trying so hard to lose weight? Maybe. Parker-Pope makes sure to include the disclaimer that “nobody is saying” obese people should give up on weight loss, but after spending so much time explaining how the science “proves” it’s a wasted effort, her assurance sounds a little hollow.

The article is crammed with detailed scientific evidence and quotes from highly credentialed researchers. It’s also a compelling read, thanks to anecdotal accounts of the endless travails of would-be weight-losers, including Parker-Pope’s own frustrating failures to remove and keep off the extra 60 pounds or so she says she carries.

In short, it’s a well-reported, well-written, highly readable, and convincing piece of personal-health-science journalism that is careful to pin its claims to published research.

There’s really just one problem with Parker-Pope’s piece: Many, if not most, researchers and experts who work closely with the overweight and obese would pronounce its main thesis—that sustaining weight loss is nearly impossible—dead wrong, and misleading in a way that could seriously, if indirectly, damage the health of millions of people.

Many readers—including a number of physicians, nutritionists, and mental-health professionals—took to the blogs in the days after the article appeared to note its major omissions and flaws. These included the fact that the research Parker-Pope most prominently cites, featuring it in a long lead, was a tiny study that required its subjects to go on a near-starvation diet, a strategy that has long been known to produce intense food cravings and rebound weight gain; the fact that many programs and studies routinely record sustained weight-loss success rates in the 30-percent range; and Parker-Pope’s focus on willpower-driven, intense diet-and-exercise regimens as the main method of weight loss, when most experts have insisted for some time now that successful, long-term weight loss requires permanent, sustainable, satisfying lifestyle changes, bolstered by enlisting social support and reducing the temptations and triggers in our environments—the so-called “behavioral modification” approach typified by Weight Watchers, and backed by research studies again and again.  (...)

The Times has run into similar trouble with other prominent articles purporting to cut through the supposed mystery of why the world keeps getting dangerously fatter. One such piece pointed the finger at sugar and high-fructose corn syrup, another at bacteria. But perhaps the most controversial of the Times’s solution-to-the-obesity-crisis articles was the magazine’s cover story in 2002, by science writer Gary Taubes, that made the case that high-fat diets are perfectly slimming—as long as one cuts out all carbohydrates. His article’s implicit claim that copious quantities of bacon are good for weight loss, while oatmeal, whole wheat, and fruit will inevitably fatten you up, had an enormous impact on the public’s efforts to lose weight, and to this day many people still turn to Atkins and other ultra-low-carb, eat-all-the-fat-you-want diets to try to shed excess pounds. Unfortunately, it’s an approach that leaves the vast majority of frontline obesity experts gritting their teeth, because while the strategy sometimes appears to hold up in studies, in the real world such dieters are rarely able to keep the weight off—to say nothing of the potential health risks of eating too much fat. And of course, the argument Taubes laid out stands in direct opposition to the claims of the Parker-Pope article. Indeed, most major Times articles on obesity contradict one another, and they all gainsay the longstanding consensus of the field.

The problem isn’t unique to the Times, or to the subject of weight loss. In all areas of personal health, we see prominent media reports that directly oppose well-established knowledge in the field, or that make it sound as if scientifically unresolved questions have been resolved. The media, for instance, have variously supported and shot down the notion that vitamin D supplements can protect against cancer, and that taking daily low doses of aspirin extends life by protecting against heart attacks. Some reports have argued that frequent consumption of even modest amounts of alcohol leads to serious health risks, while others have reported that daily moderate alcohol consumption can be a healthy substitute for exercise. Articles sang the praises of new drugs like Avastin and Avandia before other articles deemed them dangerous, ineffective, or both.

What’s going on? The problem is not, as many would reflexively assume, the sloppiness of poorly trained science writers looking for sensational headlines, and ignoring scientific evidence in the process. Many of these articles were written by celebrated health-science journalists and published in respected magazines and newspapers; their arguments were backed up with what appears to be solid, balanced reporting and the careful citing of published scientific findings.

But personal-health journalists have fallen into a trap. Even while following what are considered the guidelines of good science reporting, they still manage to write articles that grossly mislead the public, often in ways that can lead to poor health decisions with catastrophic consequences. Blame a combination of the special nature of health advice, serious challenges in medical research, and the failure of science journalism to scrutinize the research it covers.

by David H. Freedman, Columbia Journalism Review |  Read more:



Mike Carroll, Bumper Crop, Board and Batten and Setting Sun.

[ed. I just spent a delightful half hour with Mike and Kathy Carroll, both very nice people. Kathy started the Lanai Animal Rescue Center and showed me a video of the fenced cat sanctuary she developed for abandoned and feral cats here on Lanai. It's about a quarter-acre in size, is home to a couple hundred cats, and is maintained by donations and fundraising. If you check out this video you'll see it's quite an innovative solution to caring for stray cats (all neutered) and protecting the island's native bird populations. The cats certainly look happy.]

Hypochondria: An Inside Look


When The New York Times called, inquiring if I might pen a few words “from the horse’s mouth” about hypochondria, I confess I was taken aback. What light could I possibly shed on this type of crackpot behavior since, contrary to popular belief, I am not a hypochondriac but a totally different genus of crackpot?

What I am is an alarmist, which is in the same ballpark as the hypochondriac or, should I say, the same emergency room. Still, there is a fundamental difference. I don’t experience imaginary maladies — my maladies are real.

What distinguishes my hysteria is that at the appearance of the mildest symptom, let’s say chapped lips, I instantly leap to the conclusion that the chapped lips indicate a brain tumor. Or maybe lung cancer. In one instance I thought it was Mad Cow.

The point is, I am always certain I’ve come down with something life threatening. It matters little that few people are ever found dead of chapped lips. Every minor ache or pain sends me to a doctor’s office in need of reassurance that my latest allergy will not require a heart transplant, or that I have misdiagnosed my hives and it’s not possible for a human being to contract elm blight.

Unfortunately, my wife bears the brunt of these pathological dramas. Like the time I awoke at 3 a.m. with a spot on my neck that to me clearly had the earmarks of a melanoma. That it turned out to be a hickey was confirmed only later at the hospital after much wailing and gnashing of teeth. Sitting at an ungodly hour in the emergency room where my wife tried to talk me down, I was making my way through the five stages of grief and was up to either “denial” or “bargaining” when a young resident fixed me with a rather supercilious eye and said sarcastically, “Your hickey is benign.”

by Woody Allen, NY Times |  Read more:
Illustration: Maumont

The Science of Sex Abuse

Is it right to imprison people for heinous crimes they have not yet committed?

On a Saturday night in the summer of 1998, an undercover officer logged in to a child-pornography chat room using the screen name Indy-Girl. Within minutes, a user named John introduced himself and asked her, “Are you into real life or just fantasy?” Indy-Girl said that because of the “legality of it” she had never acted on her fantasies. But she soon revealed an adventurous spirit. She was a bisexual college sophomore, she said, and had learned about sex at an early age. “My mother is very European,” she explained.

John, a thirty-one-year-old soldier stationed in Fort Campbell, Kentucky, had been using the Internet for less than a year. He began downloading child pornography after watching a television special about how Internet child porn had become epidemic. He hadn’t realized that it existed. In the five months since he’d seen the show, he had downloaded more than two thousand images from child-pornography news groups. In the anonymous chat rooms, he felt free to adopt a persona repugnant to society. He told Indy-Girl that he was a “real-life pedophile,” adding, “At least here I can come out and admit it.”

“What’s the kinkiest you’ve done?” Indy-Girl asked. John said he’d had sex with a ten-year-old while her parents were skiing, and with a fourteen-year-old at a night club in Germany. Indy-Girl recognized that she was too old for him, which was “depressing,” but she offered that her little sister liked older men. “Maybe you could intro me,” John wrote. “We could meet somewhere discreet.”

John had been in the Army for eight years, serving in Desert Storm and Bosnia, and had graduated from Penn State with a degree in history. He was thinking of leaving the service, in part because he felt picked on by other soldiers. He had been commended for having a memory for technical details, but he was also nervous, nerdy, and eager to please. At all stages of his life, he had been afflicted with the sense that he was just a “wannabe.”

Unlike other people John met online, Indy-Girl seemed to like him. After a week of conversations, she asked John if he was “r/l” (real life) about the meeting, and when he said that he was she sent him a soft-focus digital image of a girl who she said was her fourteen-year-old sister. “Now don’t be mean when you see it,” she warned. “She still has some of her baby fat, she’s kinda embarrassed.” Undeterred, John described how the three of them would enjoy one another’s company: they could have sex in the shower or in a field of flowers. He encouraged Indy-Girl to “talk dirty” and “let your imagination go wild,” but she cut him off, explaining, “I’m not the cyber type.”

She preferred to discuss the logistics of their meeting, a subject that John approached hesitantly. During the following week, Indy-Girl repeatedly expressed concern that John was avoiding her: “You’re usually so fun to chat with . . . and now . . . I feel like just . . . blaaaahhh.” She apologized for getting “a bit too gabby” and for “being so weird” and “reading into things.” John said it wasn’t her—he worked long hours and was tired. He also admitted that he wanted a relationship more than he wanted sex. He hoped to find someone who “could accept me the way I am.” “Give it a chance,” Indy-Girl encouraged. “If you like her . . . and she likes you . . . things will work out.” She added, “It’s not like she’s gonna die if you don’t.”

They decided to meet at a park in Elizabethtown, Kentucky, where they could have a picnic or go boating on the lake. Two weeks after their first conversation, John drove three hours to the appointed meeting spot. He brought lacy undergarments in his briefcase. The Military Police Investigations unit, working with the F.B.I., had recruited two young officers to play the roles of the two sisters. They arrived early, spread a blanket on the grass, and waved at John, who was sitting at a picnic table, writing in his journal.

An athletic man with light-brown hair and green eyes, John slowly walked over to the girls, who were playing with a beach ball. He offered them sodas, and they chatted about what they liked to drink—Indy-Girl said she preferred beer—and about how long the drive had taken. It was a “normal conversation,” one of the cops later wrote, until John “saw the agents approaching him, and he began backing away.” A plainclothes officer whom John had seen standing by the lake, holding a fishing pole and a tackle box, shouted at him to put his hands behind his back.

John waived his right to a lawyer, hoping to end the humiliation quickly. (His mother, for the sake of John’s two younger brothers, has asked that I not use the family’s last name.) In an interview with the agents, John confessed that he frequently downloaded child pornography, storing it on his hard drive in a folder labelled “2Young.” He was sexually attracted to the girls in the photographs, he admitted, but he had never had sexual contact with anyone below the age of eighteen. He insisted that he had invented his sexual exploits to impress Indy-Girl. According to an F.B.I. report summarizing the interview, “Everything that he said on the Internet was a lie.”

by Rachel Aviv, New Yorker |  Read more:
Illustration by Noma Bar/Dutch Uncle.

Tuesday, January 15, 2013


Fernandez Arman (1928-2005) - Guitar, 1995.
via:

Man Ray, Seguidia 1970.
via:

Grant Wood, Young Corn 1931.
via:

The Violent Femmes of Football

"Hey! HEY! Lemme get a picture!" A stumpy man in a Darren McFadden jersey, whose cheeks are glopped with two generous Rorschach blots of eye black, is shouting over to the two women next to me, Metal Cindy and Dre of the Dead. This keeps happening. It's taken us about thirty minutes to walk just sixty yards from our tailgate in the southwest corner of Lot C toward the entrance of the Oakland coliseum, home of the Oakland Raiders. And not because a crowd has formed to get inside. It's just past noon, an hour before kickoff, and there's time for one more beer before the tailgaters are required to pack up their grills. No, we're traveling at turtles-on-benzos pace because Cindy, 27, and her 17-year-old protégée, Dre, a quiet, raven-haired beauty, who, in her striped leggings and black tutu, recalls a 1990s Winona Ryder, are celebrities here. Navigating the lot with them, as I've been doing for the past four hours, is like walking around a junior high school with Justin Bieber. Only in this junior high, the kids wear spikes and chains and generally look like groupies who got tossed out of a Kiss concert for freaking out the band.

Here's a sampling of who Cindy and Dre have posed with so far this morning: three teenage girls openly smoking a pipe of weed; a pack of smallish Mexican men who speak no English; two lady police officers on bicycles; a pair of incongruously well-dressed European gentlemen; a big group of rowdy drunks; a phocomelic young man in a wheelchair; a bulldog with an eye patch; a middle-aged couple wearing Raiders jerseys and khakis who look like the parents from the movie Pleasantville; and two shy Latino teen boys, both built like vending machines, who trade Hey, fuck you's in Vito Corleone voices.

The guy in the McFadden jersey finagles himself between Cindy and Dre, his head barely rising above Cindy's toothsome cleavage, which is cinched and pushed up to her clavicle by a Raiders bustier top with maybe a thousand decorative belt buckles. Cindy is an aspiring model who's done some "very dark fetish" work for pinup calendars and music ads. She looks like a goth JWoww, or a honeypot T-800 whose face has been half blasted off to its metal core; the right side of her face is painted like Skeletor, and one of her big brown eyes is concealed by a ghoulish white Raiders contact lens. Just before the photo is snapped, McFadden guy turns and grins directly into Cindy's chest flesh, throwing a lecherous thumbs-up to the camera.

In all of Raider nation, there are about fifty or so "superfans," and Metal Cindy and Dre of the Dead are two of them. Along with other "characters"—including Gorilla Rilla, a dude who shows up every game day in a full ape suit, plus a jersey and sunglasses over the ape suit, and who, according to Metal Cindy, got married in that getup—Cindy and Dre never miss a Sunday. They're like walking and waving Disney World mascots for the drunk-at-10-A.M. set. Hunter S. Thompson once described them as "the sleaziest and rudest and most sinister mob of thugs and whackos ever assembled." They represent NFL obsession at its most fervent—or unhinged, depending on your viewpoint.

Which is why I've come here, of all NFL cities, to find women like Metal Cindy and Dre: because I have no earthly idea why any woman would want to be part of this scene. It's not that I hate football—I reserve that word for Nazis and tuna-fish salad—but it's safe to say I don't particularly enjoy the game. If you say "football," I think: "pack mentality," "day drinking," "pissing on sidewalks," "brain damage," "homoerotic pile-ups," and "Dad ignoring me." Basically, rock-bottom male behavior, quintupled. I realize you're probably not on my side here. And I'm probably not being fair. But it's a gut reaction—like how you might see a gaggle of girls watching The Real Housewives of Atlanta together and think: Ugh, women.

Apparently, though, I'm in the minority when it comes to football. Because with every given Sunday, more and more women are being drawn into its cult. And the NFL has gotten wise to its big new demo: In 2010 the league launched a new ad campaign for its line of women's apparel featuring, among others, a grinning Condoleezza Rice wearing a slim-cut Browns jersey. And according to the most recent league stats, females make up 44 percent of the NFL's fan base. Most of these women, I have to assume, are just casual fans. Watch-from-the-couch-on-Sunday types. But there's a small sliver of them who freak out over football just as hard as the men. And I wanted to see how these women—football's female diehards—assimilate into such a uniquely macho culture. Especially since I've always gotten the impression that guys value the notion that football is something for them. (Case in point: "man caves.") It's as if these ladies are pledging the biggest frat in America. So what are the hazing rituals? What are the privileges of membership? And how do these women have to shape (or reshape) themselves to fit in with Phi Kappa Football?

So I flew to Oakland, home of fans who apparently could out-weird Hunter S. Thompson, to spend game day with women who willingly get up at 4 A.M. to drink and grill and celebrate football, so that I could experience the sport through their eyes. Could it be that I, the lamest cliché of the lady football hater, am missing out on something awesome?

by Lauren Bans, GQ |  Read more:
Photo: Ture Lillegraven

What Should We Be Worried About?

[ed. If you'd like to spend a very productive half-hour, check out the essays mentioned in this article at: Edge: 2013: What *Should* We Be Worried About?]

Each December for the past fifteen years, the literary agent John Brockman has pulled out his Rolodex and asked a legion of top scientists and writers to ponder a single question: What scientific concept would improve everybody’s cognitive tool kit? (Or: What have you changed your mind about?) This year, Brockman’s panelists (myself included) agreed to take on the subject of what we should fear. There’s the fiscal cliff, the continued European economic crisis, the perpetual tensions in the Middle East. But what about the things that may happen in twenty, fifty, or a hundred years? The premise, as the science historian George Dyson put it, is that “people tend to worry too much about things that it doesn’t do any good to worry about, and not to worry enough about things we should be worrying about.” A hundred fifty contributors wrote essays for the project. The result is a recently published collection, “What *Should* We Be Worried About?” available without charge at John Brockman’s edge.org.

A few of the essays are too glib; it may sound comforting to say that “the only thing we need to worry about is worry itself” (as several contributors suggested), but anybody who has lived through Chernobyl or Fukushima knows otherwise. Surviving disasters requires contingency plans, and so does avoiding them in the first place. But many of the essays are insightful, and bring attention to a wide range of challenges for which society is not yet adequately prepared.

One set of essays focusses on disasters that could happen now, or in the not-too-distant future. Consider, for example, our ever-growing dependence on the Internet. As the philosopher Daniel Dennett puts it:
We really don’t have to worry much about an impoverished teenager making a nuclear weapon in his slum; it would cost millions of dollars and be hard to do inconspicuously, given the exotic materials required. But such a teenager with a laptop and an Internet connection can explore the world’s electronic weak spots for hours every day, almost undetectably at almost no cost and very slight risk of being caught and punished.
As most Internet experts realize, the Internet is pretty safe from natural disasters because of its redundant infrastructure (meaning that there are many pathways by which any given packet of data can reach its destination) but deeply vulnerable to a wide range of deliberate attacks, either by censoring governments or by rogue hackers. (Writing on the same point, George Dyson makes the excellent suggestion of creating a kind of emergency backup Internet, “assembled from existing cell phones and laptop computers,” which would allow the transmission of text messages in the event that the Internet itself were brought down.)

We might also worry about demographic shifts. Some are manifest, like the graying of the population (mentioned in Rodney Brooks’s essay) and the decline in the global birth rate (highlighted by Matt Ridley, Laurence Smith, and Kevin Kelly). Others are less obvious. The evolutionary psychologist Robert Kurzban, for example, argues that the rising gender imbalance in China (due to the combination of early-in-pregnancy sex-determination, abortion, the one-child policy, and a preference for boys) is a growing problem that we should all be concerned about. As Kurzban puts it, by some estimates, by 2020 “there will be 30 million more men than women on the mating market in China, leaving perhaps up to 15% of young men without mates.” He also notes that “cross-national research shows a consistent relationship between imbalanced sex ratios and rates of violent crime. The higher the fraction of unmarried men in a population, the greater the frequency of theft, fraud, rape, and murder.” This in turn tends to lead to a lower G.D.P., and, potentially, considerable social unrest that could ripple around the world. (The same of course could happen in any country in which prospective parents systematically impose a preference for boys.)

Another theme throughout the collection is what Stanford psychologist Brian Knutson called “metaworry”: the question of whether we are psychologically and politically constituted to worry about what we most need to worry about.

In my own essay, I suggested that there is good reason to think that we are not inclined that way, both because of an inherent cognitive bias that makes us focus on immediate concerns (like getting our dishwasher fixed) at the expense of long-term issues (like getting enough exercise to maintain our cardiovascular fitness) and because of a chronic bias toward optimism known as the “just-world fallacy” (the comforting but unrealistic idea that moral actions will invariably lead to just rewards). In a similar vein, the anthropologist Mary Catherine Bateson argues that “knowledgeable people expected an eventual collapse of the Shah’s regime in Iran, but did nothing because there was no pending date. In contrast, many prepared for Y2K because the time frame was so specific.” Furthermore, as the historian of ideas Noga Arikha puts it, “our world is geared at keeping up with a furiously paced present with no time for the complex past,” leading to a cognitive bias that she calls “presentism.”

As a result, we often move toward the future with our eyes too tightly focussed on the immediate to care much about what might happen in the coming century or two—despite potentially huge consequences for our descendants. As Knutson says, his metaworry is that “actual threats [to our species] are changing much more rapidly than they have in the ancestral past. Humans have created much of this environment with our mechanisms, computers, and algorithms that induce rapid, ‘disruptive,’ and even global change. Both financial and environmental examples easily spring to mind.… Our worry engines [may] not retune their direction to focus on these rapidly changing threats fast enough to take preventative action.”

by Gary Marcus, New Yorker |  Read more:
Illustration by Lou Brooks

The Searchers


[ed. I guess I naively assumed that 'Search' was pretty much a static issue these days with only incremental refinements as more big data became available -- Google won, end of story. But Facebook is making quite a splash today, and search seems to be evolving in very significant ways. FB has a lot of data to work with, but as one source noted, it could also be pretty "noisy data".] 

When we talk about “searching” these days, we’re almost always talking about using Google to find something online. That’s quite a twist for a word that has long carried existential connotations, that has been bound up in our sense of what it means to be conscious and alive. We don’t just search for car keys or missing socks. We search for truth and meaning, for love, for transcendence, for peace, for ourselves. To be human is to be a searcher.

In its highest form, a search has no well-defined object. It’s open-ended, an act of exploration that takes us out into the world, beyond the self, in order to know the world, and the self, more fully. T. S. Eliot expressed this sense of searching in his famously eloquent lines from “Little Gidding”:
We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time.
Google searches have always been more cut and dried, keyed as they are to particular words or phrases. But in its original conception, the Google search engine did transport us into a messy and confusing world—the world of the web—with the intent of helping us make some sense of it. It pushed us outward, away from ourselves. It was a means of exploration. That’s much less the case now. Google’s conception of searching has changed markedly since those early days, and that means our own idea of what it means to search is changing as well.

Google’s goal is no longer to read the web. It’s to read us. Ray Kurzweil, the inventor and AI speculator, recently joined the company as its director of research. His general focus will be on machine learning and natural language processing. But his particular concern, as he said in a recent interview, will entail reconfiguring the company’s search engine to focus not outwardly on the world but inwardly on the user:
“I envision some years from now that the majority of search queries will be answered without you actually asking. It’ll just know this is something that you’re going to want to see.” While it may take some years to develop this technology, Kurzweil added that he personally thinks it will be embedded into what Google offers currently, rather than necessarily offered as a stand-alone product.
This has actually been Google’s great aspiration for a while now. We’ve already begun to see its consequences in the customized search results the company serves up by tracking and analyzing our behavior. But such “personalization” is only the start. Back in 2006, Eric Schmidt, then the company’s CEO, said that Google’s “ultimate product” would be a service that would “tell me what I should be typing.” It would give you an answer before you asked a question, obviating the need for searching entirely. This service is beginning to take shape, at least embryonically, in the form of Google Now, which delivers useful information, through your smartphone, before you ask for it. Kurzweil’s brief is to accelerate the development of personalized, preemptive information delivery: search without searching.

In its new design, Google’s search engine doesn’t push us outward; it turns us inward. It gives us information that fits the behavior and needs and biases we have displayed in the past, as meticulously interpreted by Google’s algorithms. Because it reinforces the existing state of the self rather than challenging it, it subverts the act of searching. We find out little about anything, least of all ourselves, through self-absorption.

by Nicholas Carr, Rough Type |  Read more:
Photo from John Ford’s “The Searchers.”

Zao Wou-Ki
via:

Gamifying Beauty

A few months ago, I stumbled across a website that promised a “virtual makeover.” You’d upload a photo of yourself, then apply various “looks” with all manner of makeup colors and hairstyles; you could even “borrow” a celebrity’s entire look, pasting her makeup and hair onto your image.

I’d seen similar tools before, of course, but they were always comically bad—more along the lines of my friend Lindsay Goldwert’s awesome collection of horror-makeover images than anything you’d actually use to evaluate whether you’d look good in, say, coral lipstick. On a whim, though, I decided to give it a try, figuring that the technology must have changed since I’d last given them a whirl.

I was right. Though the results were obviously computerized, the tech had developed so that you could align your face more precisely in the application frame, meaning that lipstick actually landed on your lips instead of where the computer wanted your lips to be. More important, it was actually useful. I was surprised to find that I actually might look good in coral lipstick; I confirmed that, sadly, the mod look makes me look just wrong; I found a half-up, half-down hairstyle that looked great on me, and when I tried it out on terra firma, it was indeed flattering.

The site linked out to other sites that had features besides makeovers—you could digitally slim yourself down, or plump yourself up. You could get a breast lift, breast augmentation, or both, which served as a complement to the rhinoplasty and face-lift features on the makeover site.

Do I even need to tell you what happened? I went down the rabbit hole. Making adjustment after adjustment, I manipulated my face and body—just to see, of course. Learning what I’d look like with Gwen Stefani’s hair (absurd) led to seeing what I’d look like with Penelope Cruz’s hair (not bad), which led to me trying on dozens of brunette celebrity styles to see which might suit me best (Ginnifer Goodwin?). I plumped my body out 20 pounds to see if it would resemble how my body actually looked when I was 20 pounds heavier (it did), then trimmed myself down 10 pounds to see if it echoed my erstwhile 10-pounds-lighter frame (it didn’t, which didn’t stop me from going on to drop another 15 virtual pounds, because, hey, this is just a game, right?). I narrowed my nose, went up three cup sizes, rid myself of my deep nasolabial folds, and alternated between digitally tanning and digitally “brightening” until I realized I was aiming for pretty much the skin tone I actually have. And then, a good two hours after I’d sat down to try on Gwen Stefani’s hair for a lark, I went to bed.

Now, there’s plenty to say here about the nature of that rabbit hole, and how it relates to self-esteem and dissatisfaction. (Is it any surprise that after inflating my breasts three cup sizes, clicking back to the photo of myself au naturel left me feeling deflated?) But in truth, after spending an evening creating a slimmer, bustier, better-made-up version of myself, the most pervasive feeling I had was not of self-abasement but of extraordinary fatigue. It was like I’d spent 12 hours proofreading a dissertation on, I don’t know, dirt, printed out in 7-point font. I felt the brain-drain not only of sitting in front of the computer for too long, but of doing crap I don’t actually feel like doing. Which is to say: I felt like I’d been working.

In fact, I sort of was working, even if I tricked myself into thinking I was doing it just for fun. It made me think of gamification, the use of game elements and digital gaming techniques in non-game situations. The idea, in part, is that by lending the benefits of gaming to more tedious tasks (like work), the tedium is lessened because it feels more like play. Perhaps you’ll be more likely to, say, complete online training courses if you earn “points” or “badges” for each segment you finish. It seems silly that something essentially imaginary would motivate people—but one peek at the popularity of programs like Foursquare that allow you to gamify your own life shows that it works. The term extends more broadly to any sort of game thinking applied in non-game situations—like interactive features (that annoying Microsoft Word pop-up dude) and simulation (think 3-D modeling à la SimCity), though most of the critiques of gamification that I’ve read focus on its reward aspects.

by Autumn Whitefield-Madrano, The New Inquiry |  Read more:

Yozo Hamaguchi
via:

Someone’s Knocking at My Door

I’ve been living in complete silence for months, I might say for years, with just the usual dull sounds you hear at the outskirts of town, the occasional echo of steps in the corridor and, further off, in the stairwell, someone dragging a sack, a carpet, a package, or a corpse, God knows what, along the ground; or the sound of the elevator as it slows, stops, opens, then closes and starts to rise or descend. Every so often a dog barks briefly, someone laughs or shouts. But everything dies away, soon lost in the constant low-level murmur of the street outside. That is what complete silence is like round here.

There are of course times I put on a Zelenka mass or listen to one of Schiff’s “Wohltemperiertes Klavier” interpretations, or take out Spoon, Karen Dalton or Vic Chesnutt, but after a few bars I turn it off so it may be quiet again, because I want to be ready and I don’t want anything disturbing going on when he arrives and finds me.

To be honest I wouldn’t have been surprised if he hadn’t knocked but beat at the door, or simply kicked the door in, but now that I hear the knocking, it’s clear there is no difference between his knocking and beating or kicking the door in, I mean really no difference, the point being that I am dead certain it is him, who else; he of whom I knew, and have always known would come.

The most tragic figure in history is the one in whom two terrible conditions meet. The two conditions that meet and combine in him are bottomless idiocy and unbounded aggression. Someone—a self-exiled Hungarian writer in San Diego—once said that this kind of person inevitably crawls from the gutter during one of those historical lulls. I don’t agree, there is never a sufficiently long lull in history. If he did ever live in one of those filthy historical sewer systems, he has been at liberty for many a long year now, for decades, ready to raise flags, discover kindred spirits, move about in groups and organize secret meetings. He is rarely alone but is always to be found in one of those indeterminate military uniforms, his ideas nonsensical or non-existent, since these are simply obligatory forms of hatred, hatred being his raison d’être, his guiding principle, a hatred whose object is usually only hinted at, though hatred never lacks an object, an object being very much the point and I should know since I am that object.

Say I am sitting in a bar and he steps in. I can immediately see that he has immediately singled me out. My eyes are light blue, I am thin and don’t stand straight, that’s all. I have no idea how this tells him, makes him so certain that I am the one but there’s no denying he has an instinct for picking us out, picking out the weak — I say weak because weakness, I suspect, is the thing in me that irritates him — so he stands beside me, and everyone near us feels the tension, and both he and I know what must follow. It doesn’t in fact matter where I am, whether I’m at a railway station where he picks me out in the waiting room, or in a store I happen to be shopping in, our eyes will lock and then it’s too late, too late for me that is, to look away, because I always know what is coming and am simply incapable of making an escape. I know it would be in vain.

If he could find the words to articulate his hatred he would say he is defending himself, that he feels threatened, by me as it happens, though I wouldn’t hurt a fly. He goes to the gym, does martial arts, and trains day and night so that after a while his body is, as they say, pure muscle, nothing spare, his skin merely an ornament to his physique, no superfluous hair, eyes, nose or ears, needing nothing but this pure muscle, because he had better be prepared, as the others tell him, I mean the pack he goes to the gym with, to shoot with, and to train with, prepared because the enemy is all but invisible. The enemy can be named and is everywhere, but as soon as you put your hand out to grab him—at least in his own experience—the enemy slips through those pure-muscle fingers, wriggles free, slips away and pretty soon disappears so there’s nothing left in the pure muscle fist and he has to start all over again, searching, fencing him in, and pounding him with his fist again and again.

When asked to give his name he prefers to remain silent because even if he has a name as such he doesn’t really have one because he has no need of one; he is entirely subsumed in his function, his hatred, the hatred that should be his proper name, that is if he has to have a name, though what he loves best is having no name, anonymity being his natural condition, his desire to become of sufficient weight to kill, to deliver a fatal blow, a single terminal blow that has accurately located its object.

He dreams a lot. But not of that single blow, rather that, should he find the person he is looking for, he might grind him between his fingers and make mincemeat of him, not the way the slaughterhouse man deals with the pig in the abattoir, that is to say quickly, but the way the butcher deals with his meat, with a certain languorous pleasure, so the enemy should feel, really feel what he himself had suffered down there in that dark, filthy labyrinth of tunnels until he emerged to crush this, his object. Most of his dreams end like this: he keeps punching the face which by now is a bloody pulp, but he keeps hitting it, beating and beating it, unable to stop, and he wakes in a cold sweat, his mouth dry, his knuckles so painful it might not have been a dream at all.

by Laszlo Krasznahorkai, NY Times | Read more:
Illustration: Balint Zsako