Monday, July 25, 2011

The Problem With Memoirs

by Neil Genzlinger

A moment of silence, please, for the lost art of shutting up.


There was a time when you had to earn the right to draft a memoir, by accomplishing something noteworthy or having an extremely unusual experience or being such a brilliant writer that you could turn relatively ordinary occurrences into a snapshot of a broader historical moment. Anyone who didn’t fit one of those categories was obliged to keep quiet. Unremarkable lives went unremarked upon, the way God intended.

But then came our current age of oversharing, and all heck broke loose. These days, if you’re planning to browse the “memoir” listings on Amazon, make sure you’re in a comfortable chair, because that search term produces about 40,000 hits, or 60,000, or 160,000, depending on how you execute it.

Sure, the resulting list has authors who would be memoir-eligible under the old rules. But they are lost in a sea of people you’ve never heard of, writing uninterestingly about the unexceptional, apparently not realizing how commonplace their little wrinkle is or how many other people have already written about it. Memoirs have been disgorged by virtually everyone who has ever had cancer, been anorexic, battled depression, lost weight. By anyone who has ever taught an underprivileged child, adopted an underprivileged child or been an underprivileged child. By anyone who was raised in the ’60s, ’70s or ’80s, not to mention the ’50s, ’40s or ’30s. Owned a dog. Run a marathon. Found religion. Held a job.

So in a possibly futile effort to restore some standards to this absurdly bloated genre, here are a few guidelines for would-be memoirists, arrived at after reading four new memoirs. Three of the four did not need to be written, a ratio that probably applies to all memoirs published over the last two decades. Sorry to be so harsh, but this flood just has to be stopped. We don’t have that many trees left.

Read more:

Wrong Again

by Barry Ritholtz

The recession is well behind us now, and Wall Street seems to think this recovery should be all wrapped up.

Consider this: The federal non-farm jobs report for June was pretty awful. The private sector created 57,000 jobs. Federal, state and local governments cut 39,000 positions (the eighth straight monthly decrease in government employment). We picked up a mere 18,000 net new jobs.

Not a single forecaster in Bloomberg’s monthly survey of 85 Wall Street economists got it anywhere close to right. The most common reaction was “surprise.” That any professional can sincerely claim to be surprised by continued weakness — in employment, GDP or retail sales — was the only revelation.

Let’s put the number into context: In a nation of 307 million people with about 145 million workers, we have to gain about 150,000 new hires a month to maintain steady employment rates. So 18,000 new monthly jobs misses the mark by a wide margin.
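The gap Ritholtz describes is simple arithmetic; a quick sketch (the ~150,000-a-month breakeven figure is his estimate of how many new jobs are needed just to absorb labor-force growth):

```python
# Back-of-the-envelope check of the jobs math in the column above.
breakeven = 150_000    # jobs needed per month to hold employment rates steady
private = 57_000       # private-sector jobs created in June
government = -39_000   # federal, state and local positions cut

net_new_jobs = private + government
shortfall = breakeven - net_new_jobs

print(net_new_jobs)  # 18000 -- the "mere 18,000 net new jobs"
print(shortfall)     # 132000 -- how far June missed the breakeven mark
```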

Why have analysts and economists on Wall Street gotten this so wrong? In a word: context. Most are looking at the wrong data set, using the post-World War II recession recoveries as their frame of reference.

History suggests the correct frame of reference is not the usual contraction-expansion cycles, but rather credit-crisis collapse and recovery. These are not your run-of-the-mill recessions. They are far rarer, more protracted and much more painful.

Fortunately, a few economists have figured this out and provide some insight into what we should expect. Among the most prescient are professors Carmen M. Reinhart and Kenneth S. Rogoff. Back in January 2008 (!), they published a paper warning that the U.S. subprime mortgage debacle was turning into a full-blown credit crisis. Looking at five previous financial crises — Japan (1992), Finland (1991), Sweden (1991), Norway (1987) and Spain (1977) — the professors warned that we should expect a prolonged slump. These other crises had a number of surprisingly consistent elements:

First, asset market collapses were prolonged and deep. Real housing prices declined an average of 35 percent over six years, while equity prices collapsed an average of 55 percent. Those numbers were stunningly close to what occurred in the U.S. crisis of 2007-09.

Second, they noted that the aftermaths of banking crises “are associated with profound declines in employment.” They found that following a crisis, the average increase in the unemployment rate was 7 percentage points over four years. U.S. unemployment climbed 6 percentage points (from about 4 percent to about 10 percent), while the broadest measure of joblessness gained over 7 percentage points (from about 9 percent to about 16 percent). Again, they were right on the money.

Third, the professors warned that “government debt tends to explode, rising an average of 86 percent.” Surprisingly, the primary cause is not the costs of bailing out the banking system, but the “inevitable collapse in tax revenues that governments suffer in the wake of deep and prolonged contractions.” They also warned that “ambitious countercyclical fiscal policies aimed at mitigating the downturn” also tend to be costly.

Hmmm, plummeting tax revenues just as the government tries to stimulate the economy . . . does any of this sound familiar? It should.

Read more:
image credit: Rhett Maxwell, Creative Commons

Smash the Ceiling

by James Surowiecki

In the past few years, the U.S. economy has been beset by the subprime meltdown, skyrocketing oil prices, the Eurozone debt crisis, and even the Tohoku earthquake. Now it’s staring at a new problem—a failure to raise the debt ceiling, which would almost certainly throw the economy back into recession. Unlike those other problems, however, this one would be wholly of our own making. If the economy suffers as a result, it’ll be what a soccer fan might call the biggest own goal in history.

The truth is that the United States doesn’t need, and shouldn’t have, a debt ceiling. Every other democratic country, with the exception of Denmark, does fine without one. There’s no debt limit in the Constitution. And, if Congress really wants to hold down government debt, it already has a way to do so that doesn’t risk economic chaos—namely, the annual budgeting process. The only reason we need to lift the debt ceiling, after all, is to pay for spending that Congress has already authorized. If the debt ceiling isn’t raised, we’ll face an absurd scenario in which Congress will have ordered the President to execute two laws that are flatly at odds with each other. If he obeys the debt ceiling, he cannot spend the money that Congress has told him to spend, which is why most government functions will be shut down. Yet if he spends the money as Congress has authorized him to, he’ll end up violating the debt ceiling.

As it happens, the debt ceiling, which was adopted in 1917, did have a purpose once—it was a way for Congress to keep the President accountable. Congress used to exercise only loose control over the government budget, and the President was able to borrow money and spend money with little legislative oversight. But this hasn’t been the case since 1974; Congress now passes comprehensive budget resolutions that detail exactly how the government will tax and spend, and the Treasury Department borrows only the money that Congress allows it to. (It’s why TARP, for instance, required Congress to pass a law authorizing the Treasury to act.) This makes the debt ceiling an anachronism. These days, the debt limit actually makes the President less accountable to Congress, not more: if the ceiling isn’t raised, it’s President Obama who will be deciding which bills get paid and which don’t, with no say from Congress.

Read more:

Sunday, July 24, 2011

Harold Melvin & the Blue Notes


[with Teddy Pendergrass]

Raking In Hip-Hop Millions and Snorting Your Way To Ruin

by Gus Garcia-Roberts

When Scott Storch was 8 years old, he was dizzied by a soccer cleat to the head. His mom did not take such injuries in stride. She had been apoplectic when Scott lost his baby teeth in a living-room dive five years earlier, leaving him with a Leon Spinks grin. "I was an overly worrisome mother," admits Joyce Yolanda Storch, who goes mainly by her middle name. "I was overbearing to a fault."

Mom banned Scotty from participating in sports. Instead, she enrolled him in piano classes at Candil Jacaranda Montessori in Plantation, about 15 minutes from their Sunrise home. An old jazz pianist named Jack Keller taught him. A singer herself, Yolanda stopped taking weekday gigs so she could drive Scott to the lessons and scraped together enough cash to buy him a baby grand.

The scrawny, creative kid wasn't much of an athlete anyway. But it turns out he was a virtuoso on the keys. By age 12, he was landing paid gigs. As an adult, he parlayed that ability into studio production, eventually becoming one of hip-hop's elite beatmakers. He laid backdrops for nearly every rap or R&B superstar of the past decade, including Jay-Z, Beyoncé, Dr. Dre, Lil Wayne, and 50 Cent.

At age 33, in 2006, his fee hit six figures per beat, which he could produce in 15 minutes. The money turned the Sunrise kid into a Palm Island Lothario. Hip-hop's blinged-out white boy lived in an expansive villa in the Miami Beach enclave, kept more than a dozen exotic vehicles — including a $1.7 million sports car — and docked a $20 million yacht.

So Yolanda, who raised Scott and his brother Matthew after she divorced their father in 1983, has reason to cling to the fact that she introduced Scott to the piano. It's the consolation prize of her life. "It's not that I want to toot my own horn, but I was always very supportive of his music," she says. "It's just too bad that everything went sour."

She perches gingerly on a bottomed-out wooden patio chair outside the modest two-bedroom red-brick home she shares with her 88-year-old father, Julius. The years have battered Yolanda's former starlet looks, but she's still a handsome woman, instantly identifiable as Scott's mother by her ghostly fair skin, blue eyes, and prominent jaw. Keeping large eyeglasses atop a nest of bleached hair, she wears pink slippers, gray sweatpants, and a T-shirt bearing a cartoon bird saying, "How about a Christmas goose?" A burned-out Doral Ultra Light 100 is wedged between her fingers.

Yolanda is, to put it one way, quirky. A Catholic convert of Lithuanian-Jewish descent, she's obsessed with all things Italian. Especially Al Pacino. She calls the abstract prospect of meeting the actor "the reason I get up in the morning."

For her and her gifted son, nothing has turned out the way it should have. She watched Scott blow his fortune in spectacular, infamous fashion, giving millions of dollars in diamonds and cars to his girlfriends, which included America's holy trinity of floozydom: Paris Hilton, Lindsay Lohan, and Kim Kardashian.

In the meantime, Yolanda, who cares full-time for her partially blind father, waited in this $81,000 house for her son to remember her. Instead, Scott descended into a cocaine binge that crashed his career, propelled him into massive financial litigation and bankruptcy, and sent him to rehab.

Read more:
Clare Woods

The Meaninglessness of "Terrorism"

by Glenn Greenwald

For much of the day yesterday, the featured headline on The New York Times online front page strongly suggested that Muslims were responsible for the attacks on Oslo; that led to definitive statements on the BBC and elsewhere that Muslims were the culprits. The Washington Post's Jennifer Rubin wrote a whole column based on the assertion that Muslims were responsible, one that, as James Fallows notes, remains at the Post with no corrections or updates. The morning statement issued by President Obama -- "It's a reminder that the entire international community holds a stake in preventing this kind of terror from occurring" and "we have to work cooperatively together both on intelligence and in terms of prevention of these kinds of horrible attacks" -- appeared to assume, though (to its credit) did not overtly state, that the perpetrator was an international terrorist group.

But now it turns out that the alleged perpetrator wasn't from an international Muslim extremist group at all, but was rather a right-wing Norwegian nationalist with a history of anti-Muslim commentary and an affection for Muslim-hating blogs such as Pam Geller's Atlas Shrugs, Daniel Pipes, and Robert Spencer's Jihad Watch. Despite that, The New York Times is still working hard to pin some form of blame, even ultimate blame, on Muslim radicals (h/t sysprog):

Terrorism specialists said that even if the authorities ultimately ruled out Islamic terrorism as the cause of Friday’s assaults, other kinds of groups or individuals were mimicking Al Qaeda's brutality and multiple attacks.

"If it does turn out to be someone with more political motivations, it shows these groups are learning from what they see from Al Qaeda," said Brian Fishman, a counterterrorism researcher at the New America Foundation in Washington.


Al Qaeda is always to blame, even when it isn't, even when it's allegedly the work of a Nordic, Muslim-hating, right-wing European nationalist. Of course, before Al Qaeda, nobody ever thought to detonate bombs in government buildings or go on indiscriminate, politically motivated shooting rampages. The NYT speculates that ammonium nitrate fertilizer may have been used to make the bomb because the suspect, Anders Behring Breivik, owned a farming-related business and thus could have access to that material; of course nobody would have ever thought of using that substance to make a massive bomb had it not been for Al Qaeda. So all this proves once again what a menacing threat radical Islam is.

Read more:
Boats and birds by Alicque

George Michael


A Whiff of History

by Courtney Humphries

Think of some of your most powerful memories, and there’s likely a smell attached: the aroma of suntan lotion at the beach, the sharpness of freshly mown grass, the floral trail of your mother’s perfume. “Scents are very much linked to memory,” says perfumer Christophe Laudamiel. “They are linked to remembering the past but also learning from experiences.”

But despite its primacy in our lives, our sense of smell is often overlooked when we record our history. We tend to connect with the past visually - we look at objects displayed in a museum, photographs in a documentary, the writing in a manuscript. Sometimes we might hear a vintage speech, or touch an ancient artifact and imagine what it was like to use it. But our knowledge of the past is almost completely deodorized.

“It seems remarkable to me that we live in the world where we have all the senses to navigate it, yet somehow we assume that the past was scrubbed of smells,” says sensory historian Mark Smith.

It seems far-fetched to think we could actually start to smell the past - or somehow preserve a whiff of our daily lives. But increasingly, technology is making it possible, and historians, scientists, and perfumers are now taking the idea of smells as historical artifacts more seriously. They argue that it’s time to delve into our olfactory past, trying harder to understand how people experienced the world with their noses - and even save scents for posterity. Their efforts have already made it possible to smell fragrances worn a century ago, to re-create the smell of a rare flower even if it goes extinct, and to better understand the smells that ancient cultures appreciated or detested.

Read more:

Astronomers Find Largest, Most Distant Reservoir of Water

Two teams of astronomers have discovered the largest and farthest reservoir of water ever detected in the universe. The water, equivalent to 140 trillion times all the water in the world's ocean, surrounds a huge, feeding black hole, called a quasar, more than 12 billion light-years away.

"The environment around this quasar is very unique in that it's producing this huge mass of water," said Matt Bradford, a scientist at NASA's Jet Propulsion Laboratory in Pasadena, Calif. "It's another demonstration that water is pervasive throughout the universe, even at the very earliest times." Bradford leads one of the teams that made the discovery. His team's research is partially funded by NASA and appears in the Astrophysical Journal Letters.

A quasar is powered by an enormous black hole that steadily consumes a surrounding disk of gas and dust. As it eats, the quasar spews out huge amounts of energy. Both groups of astronomers studied a particular quasar called APM 08279+5255, which harbors a black hole 20 billion times more massive than the sun and produces as much energy as a thousand trillion suns.

Chris Langstroth

Karsh Kale


A Conversation with the Dalai Lama

by Melissa Mathison

The sun is shining on Tsuglakhang temple, in the foothills of the Indian Himalayas, and hundreds of Tibetans have gathered in the courtyard for a feast. As Buddhist monks ladle out white rice and stewed vegetables, horns blow and cymbals crash. Such celebrations are common here — the monks often feed local villagers as an act of service to earn karmic merit — but the festive air seems to capture the mood of the man who lives next to the temple. The Dalai Lama, despite many heartfelt petitions by his constituents, has finally been granted his wish for official retirement from government duties.

The Tibetan Parliament had twice urged His Holiness to reconsider, but he had declined even to read a message from them or meet with legislators. His mind was made up. On May 29th, the papers were signed and the Tibetan charter amended. The act marks a remarkable and voluntary separation of church and state: For the first time in more than 350 years, the Dalai Lama is no longer the secular as well as the spiritual leader of the Tibetan people.

Although the Tibetan government-in-exile has been largely democratic for decades, the Dalai Lama still had the final say in every major political decision within the diaspora. He appointed foreign envoys, determined the scope and timing of negotiations with China, had the power to sign or veto bills and could even dismiss Parliament. Now, with his signature, his formal title has changed from "Head of Nation" to "Protector and Symbol of Tibet and Tibetan People." Many of his political responsibilities will rest on the shoulders of Lobsang Sangay, a 43-year-old Harvard legal scholar who was elected in April to the post of prime minister.

China, dismissing the transfer of power as a "trick," has refused to meet with Sangay. The Communist government believes that the struggle for Tibetan autonomy will die with the Dalai Lama; all they have to do is wait him out. But by turning the reins of government over to the governed, His Holiness is banking on democracy's ability to serve as an effective bulwark against Chinese oppression. At 76, he knows he won't be around to steer the ship of state forever. Tibetans, he believes, must learn to steer it for themselves.

Tenzin Gyatso, the 14th Dalai Lama, was born in 1935, the son of a farmer in a small Tibetan village. In accordance with ancient tradition, the dreams and visions of high lamas and oracles eventually led a search party to the boy. At age two, he successfully identified people and possessions from his past life and was officially recognized as the reincarnation of the 13th Dalai Lama. At four, he entered the capital of Lhasa and was named the spiritual leader of his people. At 15, he became head of state. In 1959, as tensions with the Chinese army reached a flash point, he fled to India, where he has led the Tibetan diaspora ever since.

Looking back over his 60 years of leadership, he has much to be proud of. He has established a successful and stable government in exile and stood firm against a brutal regime. As the first Dalai Lama to travel to the West, he has also extolled the virtues of nonviolence to millions, a lifelong effort that earned him a Nobel Peace Prize. As the spiritual leader of Tibet, he remains the personification of his nation's struggle.

I have known His Holiness since 1990, when I wrote Kundun, a movie about his childhood directed by Martin Scorsese. Since then, we have developed a lasting friendship. I continue to work as an activist for Tibetan autonomy and serve on the board of the International Campaign for Tibet. Every day I pray for Tenzin Gyatso's long life.

When we meet on June 2nd in his reception area behind the busy main temple in the dusty Indian hill town of McLeod Ganj, he asks if he still looks as healthy as the last time we met. Yes, I tell him — even younger, if possible. But, I add, his eyes look older. "That's right," he says. He wishes to inform me, however, that he hasn't needed to increase his eyeglass prescription — in part because he doesn't use a computer. "I never even tried," he says, breaking into his distinct, ebullient laugh. "I don't know how!"

Read more:

She Looks Too Much Like Me

by Shauna Miller

I joked about our age difference the first time we hung out. When Kurt Cobain died, I was in a pub in Germany. She was in the second grade. I made some crack about watching MTV News and feeling old. She was pretty cocky about not knowing who Kurt Loder was.

She was 23, opinionated, and emotional, with lots of orange hair on top. “Fiery” is the word I think I assigned the overall package. I liked arguing with her. She made me nervous. She had complicated hobbies, like making her own beer and playing archaic musical instruments. She had big, passionate ideas about what was wrong with the world and how to save it. We met while volunteering, because that's how every lesbian meets every other lesbian in Washington, D.C.

She also had my haircut. To be fair, I had her haircut, too. Doppelbanger Syndrome—banging one’s clone—is a scourge of the lesbian community, and we had a critical case: same Bieber haircut, same thick-framed glasses. “You guys sisters?” everyone wondered, from pervy guys to sweet old ladies. D.C. doesn’t really do butch-femme, so there we were, left to haggle out the gray areas in the same dressing room at H&M.

Read more:

Notorious Ph.D.


Adam J. Ruben spent seven years working on a Ph.D. in molecular biology at Johns Hopkins. On the side, he performed at open mikes and wrote a book that didn’t count toward his publish-or-perish tally. “Surviving Your Stupid, Stupid Decision to Go to Grad School” was published by Broadway Books last year. Now Dr. Ruben teaches an undergraduate class on the stand-up comic in society at Johns Hopkins, when he’s not at his day job at Sanaria, working on a vaccine for malaria. Another hobby: rapping.

Personal DNA Sequencing Machine One Step Closer

A new, low-cost semiconductor-based gene sequencing machine has been developed that may unlock the door to advanced medicines and a deeper understanding of life itself.

A team led by Jonathan Rothberg of Ion Torrent in Guilford, Conn., is working on a system that uses semiconductors to decode DNA, dramatically reducing costs and taking them closer to the goal of a $1,000 human genome test.

"DNA sequencing and, more recently, massively parallel DNA sequencing has had a profound impact on research and medicine," the study reads.

Typical DNA sequencing machines use optical technology instead of semiconductors. While fast, optical technology is expensive and complex.

The current optics-based system costs around $49,000 and is already on the market, in use in more than 40 countries.

The team hopes that the new development will allow them to tap into ever growing computing power that gets cheaper and more powerful over time, essentially riding the backs of the $50 billion chip industry.

"The reductions in cost and time for generating DNA sequence have resulted in a range of new sequencing applications in cancer, human genetics, infectious diseases, and the study of personal genomes, as well as in fields as diverse as ecology and the study of ancient DNA," the team said.

Read more:

Saturday, July 23, 2011

Jonte' Moaning


Chain World

by Jason Fagone

Jason Rohrer is known as much for his eccentric lifestyle as for the brilliant, unusual games he designs. He lives mostly off the grid in the desert town of Las Cruces, New Mexico. He doesn’t own a car or believe in vaccination. The 33-year-old works out of a home office, typing code in a duct-taped chair. He takes his son Mez to gymnastics and acting class on his lime-green recumbent bicycle, and on weekends he paints with his son Ayza. (He got Mez’s name from a license plate, and Ayza’s by mixing up Scrabble tiles.)

On the morning of February 24, Rohrer took a break from coding and pedaled to the local Best Buy. He paid $19.99 for a 4-gigabyte USB memory stick sheathed in black plastic. The next day he sanded off the memory stick’s logos, giving it a brushed-metal texture that reminded him of something out of Mad Max. Then, using his kids’ acrylics, he painted a unique pattern on both sides, a chain of dots that resembled a piece of Aboriginal art he had seen.

The stick would soon hold a videogame unlike any other ever created. It would exist on the memory stick and nowhere else. According to a set of rules defined by Rohrer, only one person on earth could play the game at a time. The player would modify the game’s environment as they moved through it. Then, after the player died in the game, they would pass the memory stick to the next person, who would play in the digital terrain altered by their predecessor—and on and on for years, decades, generations, epochs. In Rohrer’s mind, his game would share many qualities with religion—a holy ark, a set of commandments, a sense of secrecy and mortality and mystical anticipation. This was the idea, anyway, before things started to get weird. Before Chain World, like religion itself, mutated out of control.

Read more:

Sneaky Sound System