Wednesday, April 25, 2012

Narcissism in Pink and Blue

I’m typically a year or two behind any cultural trend, so you probably already know about gender-reveal parties. I first heard of them over the weekend: a couple, strangers to me, had invited friends and relatives over to bite into cupcakes at the same instant and share the moment when the blue or pink custard inside would inform them all of the sex of the baby. (The sonogram result had gone from lab to baker without being seen by anyone else, including the parents-to-be.) Other couples choose different methods of revelation: grip the knife together and cut into a cake with blue or pink filling. Open a sealed box that releases pink or blue helium balloons. Then put the scene on the Web so that everyone not invited can participate vicariously.

These events are becoming more and more popular. The first video of a gender-reveal party was posted on YouTube in 2008, but in just the last six months almost two thousand have been uploaded. You can watch one from last month. (Spoiler alert: it’s a girl.)

Maybe it was the context—I happened to hear about the gender-reveal party in a rundown inner-city cafĂ© full of ex-felons who were having a very hard time finding jobs—but my initial take was incredulity trending negative. These parties seem to marry the oversharing of Facebook and Instagram with the contrived ceremonies that modern people in search of meaning impose on normal life events: food journaling, birthday parties for grownups, workout diaries, birth-experience planning. (One birth-planning center offers a “baby gender selection kit” involving three safe and natural steps that turn sex itself into a gender-reveal party.)

In the case of gender-reveal parties, couples take a private moment made possible by science and oblige others to join in, with the result—as in so many invented rituals of our day—that the focus turns from where it ought to be (in this case, the baby) to the self. At a bris or christening, the emotional emphasis falls on the arrival of a new life in the embrace of family and community. At a gender-reveal party, the camera is on the expectant father tearing up at the sight of pink cake.

by George Packer, The New Yorker |  Read more:

Tuesday, April 24, 2012

Has Physics Made Philosophy and Religion Obsolete?

It is hard to know how our future descendants will regard the little sliver of history that we live in. It is hard to know what events will seem important to them, what the narrative of now will look like to the twenty-fifth century mind. We tend to think of our time as one uniquely shaped by the advance of technology, but more and more I suspect that this will be remembered as an age of cosmology—as the moment when the human mind first internalized the cosmos that gave rise to it. Over the past century, since the discovery that our universe is expanding, science has quietly begun to sketch the structure of the entire cosmos, extending its explanatory powers across a hundred billion galaxies, to the dawn of space and time itself. It is breathtaking to consider how quickly we have come to understand the basics of everything from star formation to galaxy formation to universe formation. And now, equipped with the predictive power of quantum physics, theoretical physicists are beginning to push even further, into new universes and new physics, into controversies once thought to be squarely within the domain of theology or philosophy.

In January, Lawrence Krauss, a theoretical physicist and Director of the Origins Institute at Arizona State University, published A Universe From Nothing: Why There Is Something Rather Than Nothing, a book that, as its title suggests, purports to explain how something—and not just any something, but the entire universe—could have emerged from nothing, the kind of nothing implicated by quantum field theory. But before attempting to do so, the book first tells the story of modern cosmology, whipping its way through the big bang to microwave background radiation and the discovery of dark energy. It's a story that Krauss is well positioned to tell; in recent years he has emerged as an unusually gifted explainer of astrophysics. One of his lectures has been viewed over a million times on YouTube and his cultural reach extends to some unlikely places—last year Miley Cyrus came under fire when she tweeted a quote from Krauss that some Christians found offensive. Krauss' book quickly became a bestseller, drawing raves from popular atheists like Sam Harris and Richard Dawkins, the latter of whom even compared it to The Origin of Species for the way its final chapters were supposed to finally upend the "last trump card of the theologian."

By early spring, media coverage of "A Universe From Nothing" seemed to have run its course, but then on March 23rd the New York Times ran a blistering review of the book, written by David Albert, a philosopher of physics from Columbia University. Albert, who has a PhD in theoretical physics, argued that Krauss' "nothing" was in fact a something and did so in uncompromising terms: 

"The particular, eternally persisting, elementary physical stuff of the world, according to the standard presentations of relativistic quantum field theories, consists (unsurprisingly) of relativistic quantum fields... they have nothing whatsoever to say on the subject of where those fields came from, or of why the world should have consisted of the particular kinds of fields it does, or of why it should have consisted of fields at all, or of why there should have been a world in the first place. Period. Case closed. End of story."

Because the story of modern cosmology has such deep implications for the way that we humans see ourselves and the universe, it must be told correctly and without exaggeration—in the classroom, in the press and in works of popular science. To see two academics, both versed in theoretical physics, disagreeing so intensely on such a fundamental point is troubling. Not because scientists shouldn't disagree with each other, but because here they're disagreeing about a claim being disseminated to the public as a legitimate scientific discovery. Readers of popular science often assume that what they're reading is backed by a strong consensus. Having recently interviewed Krauss for a different project, I reached out to him to see if he was interested in discussing Albert's criticisms with me. He said that he was, and mentioned that he would be traveling to New York on April 20th to speak at a memorial service for Christopher Hitchens. As it happened, I was also due to be in New York that weekend and so, last Friday, we were able to sit down for the extensive, and at times contentious, conversation that follows.

by Ross Andersen, The Atlantic |  Read more:

Thursday, April 19, 2012


Bob Carlos Clarke - Untitled (Cutlery Series), 1999
via:

Levon Helm (May 26, 1940 – April 19, 2012)

How to Mend a Broken Heart


In an act of transformation worthy of any magician, scientists have converted scar tissue in the hearts of living mice into beating heart cells. If the same trick works in humans (and we’re still several years away from a trial), it could lead us to a long-sought prize of medicine – a way to mend a broken heart.

Our hearts are made of several different types of cell. These include muscle cells called cardiomyocytes, which contract together to give hearts their beats, and connective cells called cardiac fibroblasts, which provide support. The fibroblasts make up half of a heart, but they become even more common after a heart attack. If hearts are injured, they replace lost cardiomyocytes with scar tissue, consisting of fibroblasts. In the short-term, this provides support for damaged tissue. In the long-term, it weakens the heart and increases the risk of even further problems.

Hearts can’t reverse this scarring. Despite their vital nature, they are terrible at healing themselves. But Deepak Srivastava from the Gladstone Institute of Cardiovascular Disease can persuade them to do so with the right chemical cocktail. In 2010, he showed that just three genes – Gata4, Mef2c and Tbx5 (or GMT) – could transform fibroblasts into new cardiomyocytes.

This only worked in cells growing in a laboratory dish, but it was a start. Srivastava’s team have now taken the next step. By injecting living mice with GMT, they turned some of the rodents’ fibroblasts into cardiomyocytes. Since hearts are already loaded with fibroblasts, Srivastava’s technique simply conscripts them into muscle duty. Best of all, the technique worked even better in the animals than in isolated cells. No transplants. No surgeries. No stem cells. Just add three genes, and watch sick hearts turn into healthier ones.

“This is a permanent fix,” says Benoit Bruneau, a heart specialist who works at the same institute but was not involved in this study. “The net result is a much smaller scar and restored cardiac function. Honestly, I would have thought a few years ago that this was science fiction.”

by Ed Yong, Discover Magazine |  Read more:
Image: by 20after4

Did Humans Invent Music?


Music is everywhere, but it remains an evolutionary enigma. In recent years, archaeologists have dug up prehistoric instruments, neuroscientists have uncovered brain areas that are involved in improvisation, and geneticists have identified genes that might help in the learning of music. Yet basic questions persist: Is music a deep biological adaptation in its own right, or is it a cultural invention based mostly on our other capacities for language, learning, and emotion? And if music is an adaptation, did it really evolve to promote mating success as Darwin thought, or for other benefits such as group cooperation or mother-infant bonding?

Here, scientists Gary Marcus and Geoffrey Miller debate these issues. Marcus, a professor of psychology at New York University and the author of Guitar Zero: The New Musician and The Science of Learning and Kluge: The Haphazard Evolution of The Human Mind, argues that music is best seen as a cultural invention. Miller, a professor of psychology at the University of New Mexico and the author of The Mating Mind: How Sexual Choice Shaped the Evolution of Human Nature and Spent: Sex, Evolution, and Consumer Behavior, makes the case that music is the product of sexual selection and an adaptation that's been with humans for millennia.

Gary Marcus: We both love music and think it's important in modern human life, but we have different views about how music came to be. In Guitar Zero, I argued that music is a cultural technology, something that human beings have crafted over the millennia, rather than something directly wired into our genomes. Why do you think music is a biological adaptation?

Geoffrey Miller: Music's got some key features of an evolved adaptation: It's universal across cultures, it's ancient in prehistory, and kids learn it early and spontaneously.

Marcus: "Ancient" seems like a bit of stretch to me. The oldest known musical artifacts are some bone flutes that are only 35,000 years old, a blink in an evolutionary time. And although kids are drawn to music early, they still prefer language when given a choice, and it takes years before children learn something as basic as the fact that minor chords are sad. Of course, music is universal now, but so are mobile phones, and we know that mobile phones aren't evolved adaptations. When we think about music, it's important to remember that an awful lot of features that we take for granted in Western music—like harmony and 12-bar blues structure, to say nothing of pianos or synthesizers, simply didn't exist 1,000 years ago.

Miller: Sure, and other things like the pentatonic scale and the verse-chorus-bridge structure of pop songs aren't as universal as most people think.  

Marcus: I think it's deeper than that. Pentatonic scales are fairly common, but what we think of as music isn't what our ancestors thought of as music. Virtually every modern song revolves around harmony, but harmony is an invention that is only a thousand years old. Even if you ignore electric guitars and synthesizers, there are still some fairly significant differences between virtually all contemporary music and the music that people listened to a few thousand years ago.

by Gary Marcus and Geoffrey Miller, The Atlantic |  Read more: 
Photo: Keith Richards via:

Wednesday, April 18, 2012


Hiroko Imada, Waterfall in Summer, 1998
via:

50 Amazing Numbers About Today's Economy

In no particular order, here are 50 things about our economy that blow my mind:

50. The S&P 500 is down 3% from 2000. But a version of the index that holds all 500 companies in equal amounts (rather than skewed by market cap) is up nearly 90%.

49. According to economist Tyler Cowen, "Thirty years ago, college graduates made 40 percent more than high school graduates, but now the gap is about 83 percent."

48. Of all non-farm jobs created since June 2009, 88% have gone to men. "The share of men saying the economy was improving jumped to 41 percent in March, compared with 26 percent of women," reports Bloomberg.

47. A record $6 billion will be spent on the 2012 elections, according to the Center for Responsive Politics. Adjusted for inflation, that's 60% more than the 2000 elections.

46. In 2010, nearly half of Americans lived in a household that received direct government benefits. That's up from 37.7% in 1998.

45. Adjusted for inflation, federal tax revenue was the same in 2009 as it was in 1997, even though the U.S. population grew by 37 million during that period.

by Morgan Housel, Motley Fool |  Read more:

Battleground America

There are nearly three hundred million privately owned firearms in the United States: a hundred and six million handguns, a hundred and five million rifles, and eighty-three million shotguns. That works out to about one gun for every American. The gun that T. J. Lane brought to Chardon High School belonged to his uncle, who had bought it in 2010, at a gun shop. Both of Lane’s parents had been arrested on charges of domestic violence over the years. Lane found the gun in his grandfather’s barn.

The United States is the country with the highest rate of civilian gun ownership in the world. (The second highest is Yemen, where the rate is nevertheless only half that of the U.S.) No civilian population is more powerfully armed. Most Americans do not, however, own guns, because three-quarters of people with guns own two or more. According to the General Social Survey, conducted by the National Opinion Research Center at the University of Chicago, the prevalence of gun ownership has declined steadily in the past few decades. In 1973, there were guns in roughly one in two households in the United States; in 2010, one in three. In 1980, nearly one in three Americans owned a gun; in 2010, that figure had dropped to one in five.

Men are far more likely to own guns than women are, but the rate of gun ownership among men fell from one in two in 1980 to one in three in 2010, while, in that same stretch of time, the rate among women remained one in ten. What may have held that rate steady in an age of decline was the aggressive marketing of handguns to women for self-defense, which is how a great many guns are marketed. Gun ownership is higher among whites than among blacks, higher in the country than in the city, and higher among older people than among younger people. One reason that gun ownership is declining, nationwide, might be that high-school shooting clubs and rifle ranges at summer camps are no longer common.

Although rates of gun ownership, like rates of violent crime, are falling, the power of the gun lobby is not. Since 1980, forty-four states have passed some form of law that allows gun owners to carry concealed weapons outside their homes for personal protection. (Five additional states had these laws before 1980. Illinois is the sole holdout.) A federal ban on the possession, transfer, or manufacture of semiautomatic assault weapons, passed in 1994, was allowed to expire in 2004. In 2005, Florida passed the Stand Your Ground law, an extension of the so-called castle doctrine, exonerating from prosecution citizens who use deadly force when confronted by an assailant, even if they could have retreated safely; Stand Your Ground laws expand that protection outside the home to any place that an individual “has a right to be.” Twenty-four states have passed similar laws.

The day before T. J. Lane shot five high-school students in Ohio, another high-school student was shot in Florida. The Orlando Sentinel ran a three-paragraph story. On February 26th, seventeen-year-old Trayvon Martin left a house in a town outside Orlando and walked to a store. He was seen by a twenty-eight-year-old man named George Zimmerman, who called 911 to report that Martin, who was black, was “a real suspicious guy.” Zimmerman got out of his truck. Zimmerman was carrying a 9-mm. pistol; Martin was unarmed. What happened next has not been established, and is much disputed. Zimmerman told the police that Martin attacked him. Martin’s family has said that the boy, heard over a cell phone, begged for his life.

Zimmerman shot Martin in the chest. Martin did not survive. Zimmerman was not charged. Outside Orlando, the story was not reported.

by Jill Lepore, New Yorker |  Read more:
Photograph by Christopher Griffith.

Facebook: Like?

[ed. I know, another Facebook post. But this is a fine essay and Facebook itself is becoming so pervasive that it's hard to keep up with all its implications.]

Wooooh! Yeah! Hoots, hollers, and dance music played on a full-volume boombox assail the conference room where I am quizzing a data scientist at Facebook. It takes 20 seconds for the noise to die down enough for us to continue talking.

This is what Facebook’s offices are like, embracing at least the idea of “creative destruction”, violence to the establishment. Facebook will soon float its shares on the stock market, making several billionaires and many millionaires out of its staff and backers. But the sprawling new Menlo Park office complex is designed—perhaps a bit too designed—to look as if the kids just took over in a revolution. Walls are extensively, if rather meticulously, graffiti’d; the graffiti artist, who was paid in shares, will be among the new millionaires. Chalkboards line many of the remaining surfaces, so Facebook’s wandering young employees can doodle almost anywhere. There are blocks of conference rooms with whimsical names: one here based on Star Wars characters mixed with drinks (Darth Jager, The Empire Strikes Bacardi), one over there echoing Ben & Jerry’s ice-cream (Americone Dream, Half Baked). Signs abound reading “Move Fast and Break Things”.

But these kids are not really breaking things. They are relentlessly building things, one after the other after the other, and adding them to the vastly ambitious mega-thing called Facebook. With its initial public offering (IPO) approaching, the company is in a “quiet period” during which it must avoid making new public predictions, but it is expected that Facebook’s 850m users will grow to a clean billion by July.

So for all the capricious decor and talk of breaking things, Facebook is very well aware that the eyes of the world are on it as an incumbent giant, not an insurgent. Besides “Move Fast and Break Things” there are signs telling employees to “Stay Focused and Keep Shipping”. Visitors are greeted warmly, but also presented with the standard Silicon Valley non-disclosure agreement before they can proceed past security. A billion people connected as never before in history. But Facebook also engenders anxiety on levels from the personal to the political, worries about a world in which private lives are always on display. What is 24-hour social networking doing to our self-expression, our self-image, our sense of decorum? Have we finally landed in the “global village” coined by Marshall McLuhan in the 1960s? What if you don’t like it there? Is there anywhere else to live?

And what is Facebook, anyway? The most obvious point of historical comparison is the social networks that preceded it. First there was Friendster, the flirt-and-forget site of the first half of the 2000s. Then everyone dumped Friendster for MySpace, and MySpace was bought by News Corp for $580m. Its value soared to $12 billion, and the received wisdom was that MySpace would take over the world. Then it didn’t, and News Corp sold it for $35m, because someone else had finally got social networking right. Started by Mark Zuckerberg in 2004, Facebook went from a Harvard dorm room to the rest of teenage America’s bedrooms to hundreds of millions of people all around the world—even parents and grandparents. Along the way, Facebook has fuelled revolutions in the Middle East, and inspired an Oscar-winning movie. Other social networks can only try to build out from the few niches it hasn’t already filled. Facebook is the undisputed champion of the world.

But the real comparison is not with other social networks. To give real credit to its achievement today and its ambitions for the future, it can only be said that Facebook’s true competitor is the rest of the entire internet.

The internet allows three things, broadly speaking: access to content (video, music, things to read), self-expression (blogs, Twitter) and communication (e-mail, chat, Skype). Facebook competes with it on all these fronts. By one estimate, one minute in every seven spent online, anywhere in the world, is spent on Facebook. To express themselves, users have Status Updates. For content, they can find photos, videos, music, news stories, recipes, book reviews and much more. And for communication, of course, there are your friends and Friends. More and more, the point of Facebook is to do almost anything you would do anyway, but with your friends, online. Facebook is an internet within the internet, so dominant that both it and other technology companies are realising that it is far easier to join forces than to fight.

by Robert Lane Greene, Intelligent Life |  Read more:

Gary Hume
Green, Pink, Pink and Green. 2004
Enamel on aluminum
91 3/4 x 63 3/4 inches; 233 x 162 cm
via:

Too Gentle to Live Among Wolves

“I am one of the searchers. There are, I believe, millions of us. We are not unhappy, but neither are we really content. We continue to explore life, hoping to uncover its ultimate secret. We continue to explore ourselves, hoping to understand. We like to walk along the beach, we are drawn by the ocean, taken by its power, its unceasing motion, its mystery and unspeakable beauty. We like forests and mountains, deserts and hidden rivers, and the lonely cities as well. Our sadness is as much a part of our lives as is our laughter. To share our sadness with one we love is perhaps as great a joy as we can know - unless it be to share our laughter.

We searchers are ambitious only for life itself, for everything beautiful it can provide. Most of all we love and want to be loved. We want to live in a relationship that will not impede our wandering, nor prevent our search, nor lock us in prison walls; that will take us for what little we have to give. We do not want to prove ourselves to another or compete for love.

For wanderers, dreamers, and lovers, for lonely men and women who dare to ask of life everything good and beautiful. It is for those who are too gentle to live among wolves.”

James Kavanaugh, There Are Men Too Gentle to Live Among Wolves 

Death by Treacle

Sentiment surfaces fast and runs hot in public life, dumbing it down and crippling intimacy in private life

When I was a child, I knew national flags by the color and design alone; today I could know diseases the same way. This occurs to me on my morning commute as I note the abundance of magnetic awareness ribbons adhering to cars. A ribbon inventory on the Internet turns up 84 solid colors, color combinations, and color patterns, although there are certainly more. The most popular colors must multitask to raise awareness of several afflictions and disasters at once. Blue is a particularly hard-working color, the new black of misfortunes; 43 things jockey to be the thing that the blue ribbon makes us aware of.

Awareness-raising and fundraising 5K races augment the work of the ribbons. Maryland, where I live, had 28 5K races in one recent two-month period. I think it might be possible to chart a transcontinental route cobbled together entirely by annual 5K charity and awareness runs. Some memorialize a deceased loved one or raise funds for an affliction in the family (“Miles for Megan,” for example, or “Bita’s Run for Wellness”); others raise awareness of problems ranging from world health to Haiti to brain injury. A friend of mine who works in fundraising and development once observed, and lamented, that some medical problems were more popular than others and easier to solicit money for. Conditions with sentimental clout elicit more research donations, and cute endangered animals such as the giant panda, the World Wildlife Fund’s mascot, lure more donations than noncuddly ones.

On some days you’ll see makeshift shrines for victims of car accidents or violence by the side of the road, placed next to a mangled guardrail or wrapped around a lamppost. As more people hear of the tragedy, teddy bears, flowers, and notes accumulate. Princess Diana’s was the biggest of such shrines, a mountain of hundreds of thousands of plastic-sheathed bouquets outside her residence. Queen Elizabeth resisted the presumptuous momentum of all the grief but finally relented and went to inspect the flower shrine and its handwritten messages, a concession to sentiment depicted in the movie The Queen. Maybe I was the only one in the theater who thought the Queen was right; I rooted for her propriety over Tony Blair’s dubious advice that she drag the monarchy into the modern age by publicly displaying a sentiment she probably didn’t feel. The mourners didn’t even know Diana, the queen reasoned by an obsolete logic of restrained stoicism, and the palace flag didn’t fly at half-mast even for more illustrious figures. But she caved in the end. We most always do.

Sentiment surfaces fast and runs hot in public life, and it compels our attention. On good days I dimly register this makeshift iconography of people’s sorrows, losses, and challenges. Some of them have been my own, too, but I don’t have ribbons. On my dark days I believe that pink ribbons and 5K runs and temporary shrines and teddy bears and emails exclamation-pointed into a frenzy—the sentimental public culture—is malicious to civil society and impedes in one elegant motion our capacities for deliberation in public life and intimacy in private life. On the days I’m feeling melodramatic I suspect that we are in the grips of death by treacle.

by Pamela Haag, American Scholar |  Read more:

Monday, April 16, 2012

Robert Caro’s Big Dig


Robert Caro probably knows more about power, political power especially, than anyone who has never had some. He has never run for any sort of office himself and would probably have lost if he had. He’s a shy, soft-spoken man with old-fashioned manners and an old-fashioned New York accent (he says “toime” instead of “time” and “foine” instead of “fine”), so self-conscious that talking about himself makes him squint a little. The idea of power, or of powerful people, seems to repel him as much as it fascinates. And yet Caro has spent virtually his whole adult life studying power and what can be done with it, first in the case of Robert Moses, the great developer and urban planner, and then in the case of Lyndon Johnson, whose biography he has been writing for close to 40 years. Caro can tell you exactly how Moses heedlessly rammed the Cross Bronx Expressway through a middle-class neighborhood, displacing thousands of families, and exactly how Johnson stole the Texas Senate election of 1948, winning by 87 spurious votes. These stories still fill him with outrage but also with something like wonder, the two emotions that sustain him in what amounts to a solitary, Dickensian occupation with long hours and few holidays.

Caro is the last of the 19th-century biographers, the kind who believe that the life of a great or powerful man deserves not just a slim volume, or even a fat one, but a whole shelf full. He dresses every day in a jacket and tie and reports to a 22nd-floor office in a nondescript building near Columbus Circle, where his neighbors are lawyers or investment firms. His office looks as if it belongs to the kind of C.P.A. who still uses ledgers and a hand-cranked adding machine. There are an old wooden desk, wooden file cabinets and a maroon leather couch that never gets sat on. Here Caro writes the old-fashioned way: in longhand, on large legal pads.

Caro began “The Years of Lyndon Johnson,” his multivolume biography of the 36th president, in 1976, not long after finishing “The Power Broker,” his immense, Pulitzer Prize-winning biography of Moses, and figured he could do Johnson’s life in three volumes, which would take him six years or so. Next month, a fourth installment, “The Passage of Power,” will appear 10 years after the last, “Master of the Senate,” which came out 12 years after its predecessor, “Means of Ascent,” which in turn was published 8 years after the first book, “The Path to Power.” These are not ordinary-size volumes, either. “Means of Ascent,” at 500 pages or so, is the comparative shrimp of the bunch. “The Path to Power” is almost 900 pages long; “Master of the Senate” is close to 1,200, or nearly as long as the previous two combined. If you try to read or reread them all in just a couple weeks, as I foolishly did not long ago, you find yourself reluctant to put them down but also worried that your eyeballs may fall out. 

The new book, an excerpt of which recently ran in The New Yorker, is 736 pages long and covers only about six years. It begins in 1958, with Johnson, so famously decisive and a man of action, dithering as he decides whether or not to run in the 1960 presidential election. The book then describes his loss to Kennedy on the first ballot at the Democratic convention and takes him through the miserable, humiliating years of his vice presidency before devoting almost half its length to the 47 days between Kennedy’s assassination in November 1963 (Caro’s account, told from Johnson’s point of view, is the most riveting ever) and the State of the Union address the following January — a period during which Johnson seizes the reins of power and, in breathtakingly short order, sets in motion much of the Great Society legislation.

In other words, Caro’s pace has slowed so that he is now spending more time writing the years of Lyndon Johnson than Johnson spent living them, and he isn’t close to being done yet. We have still to read about the election of 1964, the Bobby Baker and Walter Jenkins scandals, Vietnam and the decision not to run for a second term. The Johnson whom most of us remember (and many of us marched in the streets against) — the stubborn, scowling Johnson, with the big jowls, the drooping elephant ears and the gallbladder scar — is only just coming into view.

by Charles McGrath, NY Times |  Read more:
Photo: Martine Fougeron/Getty, for The New York Times

The Great Kobe Beef Scam

Think you’ve tasted the famous Japanese Kobe beef?

Think again.

Of course, there are a small number of you out there who have tried it – I did, in Tokyo, and it is delicious. If you ever go to Japan I heartily recommend you splurge, because while it is expensive, it is unique, and you cannot get it in the United States. Not as steaks, not as burgers, certainly not as the ubiquitous “Kobe sliders” at your trendy neighborhood “bistro.”

That’s right. You heard me. I did not misspeak. I am not confused like most of the American food media.

I will state this as clearly as possible:

You cannot buy Japanese Kobe beef in this country. Not in stores, not by mail, and certainly not in restaurants. No matter how much you have spent, how fancy a steakhouse you went to, or which of the many celebrity chefs who regularly feature “Kobe beef” on their menus you believed, you were duped. I’m really sorry to have to be the one telling you this, but no matter how much you would like to believe you have tasted it, if it wasn’t in Asia you almost certainly have never had Japan’s famous Kobe beef.  (...)

“How is this possible?” you ask, when you see the virtues of Kobe being touted on television food shows, by famous chefs, and on menus all over the country? A dozen burger joints in Las Vegas alone offer Kobe burgers. Google it and you will find dozens of online vendors happy to take your money and ship you very pricey steaks. Restaurant reviews in the New York Times have repeatedly praised the “Kobe beef” served at high-end Manhattan restaurants. Not an issue of any major food magazine goes by without reinforcing the great fat Kobe beef lie. So how could I possibly be right?

The answer is sadly simple: Despite the fact that Kobe Beef, as well as Kobe Meat and Kobe Cattle, are registered trademarks in Japan, these trademarks are neither recognized nor protected by U.S. law. As far as regulators here are concerned, Kobe beef, unlike, say, Florida Orange Juice, means almost nothing (the “beef” part should still come from cows). Like the recent surge in the use of the unregulated label term “natural,” it is an adjective used mainly to confuse consumers and profit from that confusion.

by Larry Olmstead, Forbes |  Read more:
Photo: Wikipedia

Friday, April 13, 2012


Carol Carter, Magnolia

Tokyo, Japan

Is Facebook Making Us Lonely?

Social media—from Facebook to Twitter—have made us more densely networked than ever. Yet for all this connectivity, new research suggests that we have never been lonelier (or more narcissistic)—and that this loneliness is making us mentally and physically ill. A report on what the epidemic of loneliness is doing to our souls and our society.

Yvette Vickers, a former Playboy playmate and B-movie star, best known for her role in Attack of the 50 Foot Woman, would have been 83 last August, but nobody knows exactly how old she was when she died. According to the Los Angeles coroner’s report, she lay dead for the better part of a year before a neighbor and fellow actress, a woman named Susan Savage, noticed cobwebs and yellowing letters in her mailbox, reached through a broken window to unlock the door, and pushed her way through the piles of junk mail and mounds of clothing that barricaded the house. Upstairs, she found Vickers’s body, mummified, near a heater that was still running. Her computer was on too, its glow permeating the empty space.

The Los Angeles Times posted a story headlined “Mummified Body of Former Playboy Playmate Yvette Vickers Found in Her Benedict Canyon Home,” which quickly went viral. Within two weeks, by Technorati’s count, Vickers’s lonesome death was already the subject of 16,057 Facebook posts and 881 tweets. She had long been a horror-movie icon, a symbol of Hollywood’s capacity to exploit our most basic fears in the silliest ways; now she was an icon of a new and different kind of horror: our growing fear of loneliness. Certainly she received much more attention in death than she did in the final years of her life. With no children, no religious group, and no immediate social circle of any kind, she had begun, as an elderly woman, to look elsewhere for companionship. Savage later told Los Angeles magazine that she had searched Vickers’s phone bills for clues about the life that led to such an end. In the months before her grotesque death, Vickers had made calls not to friends or family but to distant fans who had found her through fan conventions and Internet sites.

Vickers’s web of connections had grown broader but shallower, as has happened for many of us. We are living in an isolation that would have been unimaginable to our ancestors, and yet we have never been more accessible. Over the past three decades, technology has delivered to us a world in which we need not be out of contact for a fraction of a moment. In 2010, at a cost of $300 million, 800 miles of fiber-optic cable was laid between the Chicago Mercantile Exchange and the New York Stock Exchange to shave three milliseconds off trading times. Yet within this world of instant and absolute communication, unbounded by limits of time or space, we suffer from unprecedented alienation. We have never been more detached from one another, or lonelier. In a world consumed by ever more novel modes of socializing, we have less and less actual society. We live in an accelerating contradiction: the more connected we become, the lonelier we are. We were promised a global village; instead we inhabit the drab cul-de-sacs and endless freeways of a vast suburb of information.

At the forefront of all this unexpectedly lonely interactivity is Facebook, with 845 million users and $3.7 billion in revenue last year. The company hopes to raise $5 billion in an initial public offering later this spring, which will make it by far the largest Internet IPO in history. Some recent estimates put the company’s potential value at $100 billion, which would make it larger than the global coffee industry—one addiction preparing to surpass the other. Facebook’s scale and reach are hard to comprehend: last summer, Facebook became, by some counts, the first Web site to receive 1 trillion page views in a month. In the last three months of 2011, users generated an average of 2.7 billion “likes” and comments every day. On whatever scale you care to judge Facebook—as a company, as a culture, as a country—it is vast beyond imagination.

Despite its immense popularity, or more likely because of it, Facebook has, from the beginning, been under something of a cloud of suspicion. The depiction of Mark Zuckerberg, in The Social Network, as a bastard with symptoms of Asperger’s syndrome, was nonsense. But it felt true. It felt true to Facebook, if not to Zuckerberg. The film’s most indelible scene, the one that may well have earned it an Oscar, was the final, silent shot of an anomic Zuckerberg sending out a friend request to his ex-girlfriend, then waiting and clicking and waiting and clicking—a moment of superconnected loneliness preserved in amber. We have all been in that scene: transfixed by the glare of a screen, hungering for response.

When you sign up for Google+ and set up your Friends circle, the program specifies that you should include only “your real friends, the ones you feel comfortable sharing private details with.” That one little phrase, Your real friends—so quaint, so charmingly mothering—perfectly encapsulates the anxieties that social media have produced: the fears that Facebook is interfering with our real friendships, distancing us from each other, making us lonelier; and that social networking might be spreading the very isolation it seemed designed to conquer.

by Stephen Marche, The Atlantic |  Read more:
Illustration: Facebook.com