Sunday, February 12, 2012

Obama, Explained


In office as during his campaign—indeed, through the entirety of his seven-plus years as a national figure since his keynote speech at the Democratic Convention in the summer of 2004—Obama has maintained his stoic, unflapped, “no drama” air. During the fall and winter of 2007, his campaign seemed to be getting nowhere against Hillary Clinton, who was then, to knowledgeable observers, the “inevitable” nominee. In 2008, John McCain’s selection of Sarah Palin as his running mate seemed to energize his campaign so much that, despite gathering signs of financial disaster under the incumbent Republicans, just after Labor Day the McCain-Palin team had opened up a lead over Obama and Joe Biden in several national polls. CBS News and an ABC–Washington Post poll had McCain up by 2 percentage points in early September, a week before the Lehman Brothers bankruptcy; a USA Today–Gallup poll that same week had him ahead by a shocking 10 points. But Obama and Biden stayed unrattled and on message, and two months later they won with a two-to-one landslide in the Electoral College and a 7-point margin in the popular vote. The earnestly devotional HOPE poster by Shepard Fairey was the official icon of the Obama campaign. But its edgier, unofficial counterpart, a Photoshopped Internet image that appeared as an antidote to the panic over polls and Palin, perfectly captured the candidate’s air of icy assurance. It showed a no-nonsense Obama looking straight at the camera, with the caption EVERYONE CHILL THE FUCK OUT, I GOT THIS!

The history is relevant because it shows how quickly impressions of strength or weakness can evaporate and become almost impossible to reimagine. Try to think back to when sophisticated people thought that Sarah Palin was the key to Republican victory, or when Obama’s every political instinct seemed inspired. I can attest personally to a now-startling fact behind Jimmy Carter’s rise to the presidency. When he met privately with editorial-board members and veteran political figures across the country in the early days of his campaign—people who had seen contenders come and go and were merciless in spotting frailties—the majority of them went away feeling that in Carter they had encountered a person of truly exceptional political insight and depth. (You might not believe me; I have the notes.) Is this how the Nobel Peace Prize committee’s choice of Obama as its laureate within nine months of his taking office will look as the years pass—the symbol of a “market top” in the world’s romanticism about Obama?

Whether things seem to be going very well or very badly around him—whether he is announcing the death of Osama bin Laden or his latest compromise in the face of Republican opposition in Congress—Obama always presents the same dispassionate face. Has he been so calm because he has understood so much about the path ahead of him, and has been so clever in the traps he has set for his rivals? Or has he been so calm because, like the high-school kid on the plane, he has been so innocently unaware of how dire the situation has truly been?

This is the central mystery of his performance as a candidate and a president. Has Obama in office been anything like the chess master he seemed in the campaign, whose placid veneer masked an ability to think 10 moves ahead, at which point his adversaries would belatedly recognize that they had lost long ago? Or has he been revealed as just a pawn—a guy who got lucky as a campaigner but is now pushed around by political opponents who outwit him and economic trends that overwhelm him?

The end of a president’s first term is an important time to ask these questions, and not just because of the obvious bearing on his fitness for reelection. Hard as it is to have any dispassionate discussion of a president’s performance during an election year, it will be even harder once the election is over. If a year from now Obama is settling in for a second term, a halo effect will extend back to everything he did during his first four years. His programs will be more effective in reality, since he will get that many more years to cement them in with follow-up measures, supportive appointments to federal agencies and the courts, and possible vetoes of any attempts at repeal. And, through the lens of history, they will seem more effective, since whatever he did in his first term will appear to have been part of an overall plan that was ratified through reelection. Yet if a year from now a just-beaten former President Obama is thinking about his memoirs and watching his former appointees blame one another, and him, for the loss, the very same combination of missteps and achievements will be viewed as a narrative leading inexorably to defeat. By saying, after a year in office, that he would rather be “a really good one-term” president than a “mediocre” president who served two terms, Obama was playing to the popular conceit that presidents should rise above such petty concerns as reelection. The reality, though, is that our judgment about “really good” and “mediocre” presidents is colored by how long they serve. A failure to win reelection places a “one-term loser” asterisk on even genuine accomplishments. Ask George H. W. Bush, victor in the Gulf War; ask Jimmy Carter, architect of the Camp David agreement.

by James Fallows, The Atlantic |  Read more:
Photo: Carolyn Kaster/Associated Press/Corbis Images

Saturday, February 11, 2012


ahhh, that feels good...right there...
via:

The Ahh-ness of Things


[ed. Cherry blossoms are thought to be particularly symbolic of the concept of mono no aware. The transience of the blossoms, the extreme beauty and quick death, has often been associated with mortality and the ephemeral nature of life.]

The word is derived from the Japanese word mono, which means "thing", and aware, which was a Heian period expression of measured surprise (similar to "ah" or "oh"), translating roughly as "pathos", "poignancy", "deep feeling", or "sensitivity". Thus, mono no aware has frequently been translated as "the 'ahh-ness' of things", life, and love. Awareness of the transience of all things heightens appreciation of their beauty, and evokes a gentle sadness at their passing.

via: Wikipedia

Four Seaweeds for Health


A staple in Asian diets since ancient times, seaweeds are among the healthiest foods on the planet, packed with vitamins, minerals, and antioxidants. And now we know they’re great for the waistline, too: A 2010 study found that the algae can reduce our rate of fat absorption by 75 percent, thanks to an inhibitory effect on a digestive enzyme called lipase. (Scientists at Newcastle University are about to begin clinical trials on a “wonder bread” made with alginate fibers and designed to speed weight loss.) Here are four briny plants to sample.

Wakame (Undaria pinnatifida)
Pappardelle-like leaves with a salty-sweet zest

Nutrition Perks
Nutritionist Gillian McKeith, PhD, author of the You Are What You Eat Cookbook, calls wakame the woman’s seaweed because it is loaded with osteoporosis-preventing calcium and magnesium and acts as a diuretic (which helps reduce bloating). Wakame’s pigment, fucoxanthin, is known to reduce insulin resistance, and a 2010 animal study found that fucoxanthin burns fatty tissue.

Kitchen Prep
Soak the leaves in cold water until tender, then enjoy them in a cucumber salad, dressed with rice vinegar, sesame oil, and soy sauce. To make miso soup, add wakame, tofu, and a few tablespoons of miso paste to a kombu stock (see below).

Nori (Porphyra species)
Papery sheets with a mild earthy taste

Nutrition Perks
Among the marine flora, nori is one of the richest in protein (up to 50 percent of the plant’s dry weight), and one sheet has as much fiber as a cup of raw spinach and more omega-3 fatty acids than a cup of avocado. Nori contains vitamins C (a potent antioxidant) and B12 (crucial for cognitive function) and the compound taurine, which helps control cholesterol.

Kitchen Prep
For a snack, toast strips of nori in the oven at low heat. Or cover a sheet with cooked brown rice; add a layer of sliced carrots, celery, or avocado, and a dash of wasabi. Roll it up and dip in a sauce of tamari, toasted-sesame oil, ginger, and rice vinegar.

By Tova Gelfond, Oprah |  Read more:
Photo: Dan Saelinger

Letters to Alice (by wjosna)
via:

Friday, February 10, 2012

What Remains: Conversations With America's Funeral Directors

The small film company I sometimes work for was planning a feature on new trends in funerals, and I had flown out for the weekend to try to meet some of the younger, hipper funeral directors at the conference. One of these was Ryan, a round man with a wide smile and an impeccable hair-part, whose car, he told me, has a bumper sticker that says "Let's Put The 'Fun' Back In Funeral." He started his career as a funeral director, but had since moved into the lucrative field of “death care industry” consultation, where he works with funeral directors on ways to expand their businesses. In one of our conversations he told me, “The worst thing I’ve heard a funeral director say is ‘we’ve always done it this way.’" Later, I told him my plan to attend that evening's “Funeral Directors Under 40: A Night on the Town” event. Without missing a beat, he lowered his voice and said, “Funeral directors are notoriously heavy drinkers. There will definitely be some hook-ups.”

The funeral industry is in the midst of a transition of titanic proportions. America is secularizing at a rapid pace, with almost 25% of the country describing itself as unchurched. Americans, embracing a less religious view of the afterlife, are now asking for a "spiritual" funeral instead of a religious one. And cremation numbers are up. Way up. In liberal, secular states, specifically in the Pacific Northwest, cremation rates have steadily increased to more than half of disposals, up from the low single digits in 1990. The rest of the nation has also experienced steady gains in cremation since 2000 (except in the Bible Belt, where cremation rates remain relatively low). The rate of cremation has skyrocketed as Americans back away from the idea that Jesus will be resurrecting them straight from the grave. And so in the past twenty years, funeral directors have had to transform from presenters of a failed organism, where the sensation of closure is manifest in the presence of the deceased body, to the arbiters of the meaning of a secular life that has just been reduced to ash. Reflecting this trend, this year's National Funeral Directors Association (NFDA) conference was, for the first time in its history, held jointly with the Cremation Association of North America (CANA).

Talking with funeral directors at the conference, I began to realize the scope of the crisis spurred by the rise of cremation and its new importance. As one former funeral director said, “If the family wanted a cremation, we’d say ‘That’ll be $595,’ hand them the urn and show them the door. Not anymore though.” The industry is scrambling to offer value-added cremation services in order to remain solvent.

This tension about how best to innovate was in evidence at the first presentation I attended, titled “How To Step Up Your Game.” The presenter worked for a consulting firm that specialized in business strategies and management—the funeral industry was his particular subject of expertise. He launched into his talk with a story about a recent trip to Disney World with his daughter. While walking through the park, he realized how much the funeral industry could learn from the attraction. At Disney World, every interaction had been scripted and rehearsed, down to the greetings from the custodians. Experiences are controlled. Likewise, he said, every funeral should offer the same experience for everyone, whether cremated or open-casket. If, say, the customer is having an open-casket service with a priest and an organist, there should also be a corresponding service for someone, possibly secular, who has just been cremated. If priests are no longer always present to say platitudes over the dead, funeral directors will have to develop a basic, secular service to stand in as a reverent farewell. Thus they’d take a much larger role in the memorial, acting more like mainstream event planners and offering such amenities as video tributes, music arrangements, and other elements of the new-age burial.

by Max Rivlin-Nadler, The Awl |  Read more:

The Beatles


Much Ado About Nothing


Nick Hornby knew better, but he didn’t care. Because suddenly there was that face—the upturned nose, the lupine grin, the wary expression barely softened by the passage of, what, three decades now? Everyone else in the London club that December night was flittering around Colin Firth, set aglow by the Oscar buzz for his performance in The King’s Speech. Hornby let them flit. For here stood … Kevin Bacon. Undisturbed. That knowing smirk may have derailed him as a leading man, but it has allowed for a career of darker, richer roles—and allows him still to cruise a cocktail party longer than most boldfaced names without some fanboy rushing up to say how wonderful he is.

God knows, Hornby had seen that too often: an actor friend, eyes darting, cornered by a gushing stranger. This belated celebration of Firth’s 50th birthday was a private bash where artists and actors, people like Firth and Bacon—and, well, Hornby—could expect to relax. After all, between best-selling books such as About a Boy and a 2010 Academy Award nod earlier in the year for his screenplay for An Education, he had been cornered plenty himself.

Yet when he saw Bacon, Hornby couldn’t help it. He edged closer. It was like that scene from Diner when Bacon’s buddy sees a boyhood enemy in a crowd and breaks his nose: Hornby had no choice. In 1983 a girlfriend had brought home a tape of director Barry Levinson’s pitch-perfect comedy about twentysomething men, their nocturnal ramblings in 1959 Baltimore, their confused stumble to adulthood. Hornby was 26, a soccer fanatic, a writer searching for a subject. Diner dissected the male animal’s squirrelly devotion to sports, movies, music, and gambling. Diner had one man give his fiancée a football-trivia test and had another stick his penis through the bottom of a popcorn box. Hornby declared it, then and there, “a work of great genius.”

Midway through the movie, the ladies’ man Boogie, played by Mickey Rourke, is driving in the Maryland countryside with Bacon’s character, the perpetually tipsy Fenwick. They see a beautiful woman riding a horse. Boogie waves the woman down.

“What’s your name?,” Boogie asks.

“Jane Chisholm—as in the Chisholm Trail,” she says, and rides off.

Rourke throws up his hands and utters the words that Hornby, to this day, uses as an all-purpose response to life’s absurdities: “What fuckin’ Chisholm Trail?” And Fenwick responds with the line that, for Diner-lovers, best captures male befuddlement over women and the world: “You ever get the feeling there’s something going on that we don’t know about?”

In all, the scene encompasses only 13 lines of dialogue—an eternity if you’re Bacon at a party and a stranger knows them all. But Hornby wouldn’t be stopped. “I pinned that guy to the wall, and I quoted line after line,” Hornby recalls. “I thought, I don’t care. I’m never going to meet Kevin Bacon again. I need to get ‘What fuckin’ Chisholm Trail?’ off my chest.”

Hornby could not have planned a more apt tribute: Diner introduced to movies a character who compulsively recites lines from his favorite movie—and nothing else. And Hornby’s subsequent books about a fan obsessed with Arsenal football (Fever Pitch) and another obsessed with pop music (High Fidelity)—two postmodern London slackers who could easily have slid into a booth at the Fells Point Diner—are only the most obvious branches of the movie’s family tree.

Made for $5 million and first released in March 1982, Diner earned less than $15 million and lost out on the only Academy Award—best original screenplay—for which it was nominated. Critics did love it; indeed, a gang of New York writers, led by Pauline Kael, saved the movie from oblivion. But Diner has suffered the fate of the small-bore sleeper, its relevance these days hinging more on eyebrow-raising news like Barry Levinson’s plan to stage a musical version—with songwriter Sheryl Crow—on Broadway next fall, or reports romantically linking star Ellen Barkin with Levinson’s son Sam, also a director. The film itself, though, is rarely accorded its actual due.

Yet no movie from the 1980s has proved more influential. Diner has had far more impact on pop culture than the stylistic masterpiece Blade Runner, the indie darling Sex, Lies, and Videotape, or the academic favorites Raging Bull and Blue Velvet. Leave aside the fact that Diner served as the launching pad for the astonishingly durable careers of Barkin, Paul Reiser, Steve Guttenberg, Daniel Stern, and Timothy Daly, plus Rourke and Bacon—not to mention Levinson, whose résumé includes Rain Man, Bugsy, and Al Pacino’s recent career reviver, You Don’t Know Jack. Diner’s groundbreaking evocation of male friendship changed the way men interact, not just in comedies and buddy movies, but in fictional Mob settings, in fictional police and fire stations, in commercials, on the radio. In 2009, The New Yorker’s TV critic Nancy Franklin, speaking about the TNT series Men of a Certain Age, observed that “Levinson should get royalties any time two or more men sit together in a coffee shop.” She got it only half right. They have to talk too.

What Franklin really meant is that, more than any other production, Diner invented … nothing. Or, to put it in quotes: Levinson invented the concept of “nothing” that was popularized eight years later with the premiere of Seinfeld. In Diner (as well as in Tin Men, his 1987 movie about older diner mavens), Levinson took the stuff that usually fills time between the car chase, the fiery kiss, the dramatic reveal—the seemingly meaningless banter (“Who do you make out to, Sinatra or Mathis?”) tossed about by men over drinks, behind the wheel, in front of a cooling plate of French fries—and made it central.

by S.L. Price, Vanity Fair |  Read more:
Photograph: Paul Reiser

Open Your Mouth and You're Dead


Junko Kitahama's face is pale blue, her mouth agape, her head craned back like a dead bird’s. Through her swim mask, her eyes are wide and unblinking, staring at the sun. She isn’t breathing.

“Blow on her face!” yells a man swimming next to her. Another man grabs her head from behind and pushes her chin out of the water. “Breathe!” he yells. Someone from the deck of a boat yells for oxygen. “Breathe!” the man repeats. But Kitahama, who just surfaced from a breath-hold dive 180 feet below the surface of the ocean, doesn’t breathe. She doesn’t move. Kitahama looks dead.

Moments later, she coughs, jerks, twitches her shoulders, flutters her lips. Her face softens as she comes to. “I was swimming and…” She laughs and continues. “Then I just started dreaming!” Two men slowly float her over to an oxygen tank sitting on a raft. While she recovers behind a surgical mask, another freediver takes her place and prepares to plunge even deeper.

Kitahama, a female competitor from Japan, is one of more than 130 freedivers from 31 countries who have gathered here—one mile off the coast of Kalamata, Greece, in the deep, mouthwash blue waters of Messinian Bay—for the 2011 Individual Freediving Depth World Championships, the largest competition ever held for the sport. Over the next week, in an event organized by the International Association for the Development of Apnea (AIDA), they’ll test themselves and each other to see who can swim the deepest on a single lungful of air without passing out, losing muscle control, or drowning. The winners get a medal.

How deep can they go? Nobody knows. Competitive freediving is a relatively new sport, and since the first world championships were held in 1996, records have been broken every year, sometimes every few months. Fifty years ago, scientists believed that the deepest a human could freedive was about 160 feet. Recently, freedivers have routinely doubled and tripled that mark. In 2007, Herbert Nitsch, a 41-year-old Austrian, dove more than 700 feet—assisted by a watersled on the way down and an air bladder to pull him to the surface—to claim a new world record for absolute depth. Nitsch, who didn’t compete in Greece, plans to dive 800 feet in June, deeper than two football fields are long.

Nobody has ever drowned at an organized freediving event, but enough people have died outside of competition that freediving ranks as the second-most-dangerous adventure sport, right after BASE jumping. The statistics are a bit murky: some deaths go unreported, and the numbers that are kept include people who freedive as part of other activities, like spearfishing. But one estimate of worldwide freediving-related fatalities revealed a nearly threefold increase, from 21 deaths in 2005 to 60 in 2008.

by James Nestor, Outside |  Read more:
Photo: Igor Liberti

20 Common Grammar Mistakes That (Almost) Everyone Makes


I’ve edited a monthly magazine for more than six years, and it’s a job that’s come with more frustration than reward. If there’s one thing I am grateful for — and it sure isn’t the pay — it’s that my work has allowed endless time to hone my craft to Lewis Skolnick levels of grammar geekery.

As someone who slings red ink for a living, let me tell you: grammar is an ultra-micro component in the larger picture; it lies somewhere in the final steps of the editing trail; and as such it’s an overrated quasi-irrelevancy in the creative process, perpetuated into importance primarily by bitter nerds who accumulate tweed jackets and crippling inferiority complexes. But experience has also taught me that readers, for better or worse, will approach your work with a jaundiced eye and an itch to judge. While your grammar shouldn’t be a reflection of your creative powers or writing abilities, let’s face it — it usually is.

Below are 20 common grammar mistakes I see routinely, not only in editorial queries and submissions, but in print: in HR manuals, blogs, magazines, newspapers, trade journals, and even best-selling novels. If it makes you feel any better, I’ve made each of these mistakes a hundred times, and I know some of the best authors in history have lived to see these very toadstools appear in print. Let's hope you can learn from some of their more famous mistakes.

Who and Whom

This one opens a big can of worms. “Who” is a subjective — or nominative — pronoun, along with "he," "she," "it," "we," and "they." It’s used when the pronoun acts as the subject of a clause. “Whom” is an objective pronoun, along with "him," "her," "it," "us," and "them." It’s used when the pronoun acts as the object of a clause. Using “who” or “whom” depends on whether you’re referring to the subject or object of a sentence. When in doubt, substitute “who” with the subjective pronouns “he” or “she,” e.g., Who loves you? cf., He loves me. Similarly, you can also substitute “whom” with the objective pronouns “him” or “her.” e.g., I consulted an attorney whom I met in New York. cf., I consulted him.

Which and That

This is one of the most common mistakes out there, and understandably so. “That” is a restrictive pronoun. It’s vital to the noun to which it’s referring. e.g., I don’t trust fruits and vegetables that aren’t organic. Here, I’m referring to all non-organic fruits or vegetables. In other words, I only trust fruits and vegetables that are organic. “Which” introduces a relative clause. It allows qualifiers that may not be essential. e.g., I recommend you eat only organic fruits and vegetables, which are available in area grocery stores. In this case, you don’t have to go to a specific grocery store to obtain organic fruits and vegetables. “Which” qualifies, “that” restricts. “Which” is more ambiguous, however, and by virtue of its meaning is flexible enough to be used in many restrictive clauses. e.g., The house, which is burning, is mine. e.g., The house that is burning is mine.

by Jon Gingerich, Lit Reactor |  Read more:

Thursday, February 9, 2012


Mark Davis, Icarus, 2011. Aluminum, brass, steel, acrylic and oil paint.
via:

The End of Wall Street As They Knew It

On Wall Street, the misery index is as high as it’s been since brokers were on window ledges back in 1929. But sentiments like that, accompanied by a full orchestra of the world’s tiniest violins, are only part of the conversation in Wall Street offices and trading desks. Along with the complaint is something that might be called soul-searching—which is, in itself, a surprising development. Since the crash, and especially since the occupation of Zuccotti Park last September (which does appear to have rattled a lot of nerves), there has been a growing recognition on Wall Street that the system that had provided those million-dollar bonuses was built on a highly unstable foundation. Disagreeable as it may be, goes this thinking, bankers have to go back to first principles, assess their value in the economy, and take their part in its rebuilding. No one on Wall Street liked to be scapegoated either by the Obama administration or by the Occupiers. But many acknowledge that the bubble-bust-bubble seesaw of the past decades isn’t the natural order of capitalism—and that the compensation arrangements just may have been a bit out of whack. “There’s no other industry where you could get paid so much for doing so little,” a former Lehman trader said. Paul Volcker, whose eponymous rule is at the core of the changes, echoes an idea that more bankers than you’d think would agree with. “Finance became a self-justification,” he told me recently. “They made a lot of money trading with each other with doubtful public benefit.”

The questions of how to fix Wall Street–style capitalism—from taxes to regulation—are being intensely argued and will undergird much of the economic debate during this presidential election. And many on Wall Street are still making the argument that the consequences of hobbling Wall Street could be severe. “These are sweeping secular changes taking place that won’t just impact the guys who won’t get their bonuses this year,” the bank analyst Dick Bove told me. “We’ve made a decision as a nation to shrink the growth of the financial system under the theory that it won’t impact the growth of the nation’s economy.”

And yet, the complaining has settled to a low murmur. Even as bonuses have withered, Wall Street as a political issue is gaining force. Bankers are aware that populism has a foothold, even in the Republican Party, and that these forces are liable to accelerate the process already taking place. “There’s a real sense the world is changing,” says a private-equity executive with deep ties to the GOP. “People are becoming aware there’s real anger out there. It’s not just some kids camping out in some park. The Romney attacks caught everyone by surprise. We have prepared for this to come from the Democrats in the fall, but not now. You could run an entire campaign if you’re Barack Obama with ads using nothing but Republicans saying things about finance that you’d never hear two months ago. It’s an amazing thing.” (...)

Wall Street as Wal-Mart? A few years ago, the Masters of the Universe never could have imagined their industry being compared to big-box retailing. And yet, the model that had fueled bank profits has finally broken, as markets sputtered and new regulation kicked in. “Compensation is never really going to come back,” a Wall Street headhunter told me. “That is something entirely new.”

What is even more startling about this reversal is that few thought the much-vilified Dodd-Frank act would have much effect at all. From the moment it was proposed in 2009, the bill was tarred from all sides. Critics from the left, who wanted a return of Glass-Steagall, which had kept investment banks and commercial banks separate until it was repealed during the Clinton years, howled that Dodd-Frank wouldn’t go far enough to break up the too-big-to-fail banks. “Dodd-Frank was an attempt to preserve the status quo,” Harvard economist Ken Rogoff told me. The too-big-to-fail banks, for their part, argued that the 2,300-page bill would create an overly complex morass of overlapping regulators that risked killing their ability to compete against foreign rivals. “We joke that Dodd-Frank was designed to deal with too-big-to-fail but it became too-big-to-read,” said a Citigroup executive.

By the time the bill passed, in July 2010, the legislation hadn’t found many new friends. Banks were especially upset by the inclusion of the Volcker Rule, which banned proprietary trading and virtually all hedge-fund investing by banks. Banks also complained about an amendment that slashed lucrative debit-card fees. They capitulated mainly because the alternative—breaking them up—was worse.

Part of the perception that the financial crisis changed nothing is that, in the immediate wake of the crash, the banks, buoyed by bailout dollars, whipsawed back to profitability. Goldman earned a record profit of $13.4 billion in 2009, as markets roared back from their post-Lehman lows. This dead-cat bounce was central to the formation of Occupy Wall Street and the neopopulist political currents that first erupted when the Treasury Department appointed Ken Feinberg to regulate bonuses for several TARP recipients. “The statute creating my authority was populist retribution,” Feinberg told me recently. “The feeling was, if you’re going to bail everyone out with the taxpayers, it has to come with a price.”

And yet, from the moment Dodd-Frank passed, the banks’ financial results have tended to slide downward, in significant part because of measures taken in anticipation of its future effect. Since July 2010, Bank of America nosed down 42 percent, Morgan Stanley fell 25 percent, Goldman fell 21 percent, and Citigroup fell 16 percent—in a period when the Dow rose 25 percent. Partly, this is a function of the economic headwinds. But the bill’s major provisions—forcing banks to reduce leverage, imposing a ban on proprietary trading, making derivatives markets more transparent, and ending abusive debit-card practices—have taken a pickax to the Wall Street business model even though the act won’t be completely in effect till the Volcker Rule kicks in this July (other aspects of the bill took force in December; capital requirements and many other elements of the bill will be phased in gradually between now and 2016). “If you landed on Earth from Mars and looked at the banks, you’d see that these are institutions that need to build up capital and that they’re becoming lower-margin businesses,” a senior banker told me. “So that means it will be hard, nearly impossible, to sustain their size and compensation structure.” In the past year, the financial industry has laid off some 200,000 workers.

Nobody on either side would say that Dodd-Frank perfectly accomplished its aims. But while critics lament that no bank executives have gone to jail and have argued for a law that would have effectively blown up the banking system, Dodd-Frank is imposing a painful form of punishment. “Since 2008, what the financial community has done is kick the can down the road,” the senior banker added. “ ‘Let’s just buy us one more quarter and hope it gets better.’ Well, we’re now seeing cracks in that ability to continue operating with the structures that had been built up.”

by Gabriel Sherman, New York |  Read more:
Photo: Howard Schatz

Are You with the Right Mate?


Romance itself seeds the eventual belief that we have chosen the wrong partner. The early stage of a relationship, most marked by intense attraction and infatuation, is in many ways akin to cocaine intoxication, observes Christine Meinecke, a clinical psychologist in Des Moines, Iowa. It's orchestrated, in part, by the neurochemicals associated with intense pleasure. Like a cocaine high, it's not sustainable.

But for the duration—and experts give it nine months to four years—infatuation has one overwhelming effect: Research shows that it makes partners overestimate their similarities and idealize each other. We're thrilled that he loves Thai food, travel, and classic movies, just like us. And we overlook his avid interest in old cars and online poker.

Eventually, reality rears its head. "Infatuation fades for everyone," says Meinecke, author of Everybody Marries the Wrong Person. That's when you discover your psychological incompatibility, and disenchantment sets in. Suddenly, a switch is flipped, and now all you can see are your differences. "You're focusing on what's wrong with them. They need to get the message about what they need to change."

You conclude you've married the wrong person—but that's because you're accustomed to thinking, Cinderella-like, that there is only one right person. The consequences of such a pervasive belief are harsh. We engage in destructive behaviors, like blaming our partner for our unhappiness or searching for someone outside the relationship.

Along with many other researchers and clinicians, Meinecke espouses a new marital paradigm—what she calls "the self-responsible spouse." When you start focusing on what isn't so great, it's time to shift focus. "Rather than look at the other person, you need to look at yourself and ask, 'Why am I suddenly so unhappy and what do I need to do?'" It's not likely a defect in your partner.

In mature love, says Meinecke, "we do not look to our partner to provide our happiness, and we don't blame them for our unhappiness. We take responsibility for the expectations that we carry, for our own negative emotional reactions, for our own insecurities, and for our own dark moods."

But instead of looking at ourselves, or understanding the fantasies that bring us to such a pass, we engage in a thought process that makes our differences tragic and intolerable, says William Doherty, professor of psychology and head of the marriage and family therapy program at the University of Minnesota. It's one thing to say, "I wish my spouse were more into the arts, like I am." Or, "I wish my partner was not just watching TV every night but interested in getting out more with me." That's something you can fix.

It's quite another to say, "This is intolerable. I need and deserve somebody who shares my core interests." The two thought processes are likely to trigger differing actions. It's possible to ask someone to go out more. It's not going to be well received to ask someone for a personality overhaul, notes Doherty, author of Take Back Your Marriage.

No one is going to get all their needs met in a relationship, he insists. He urges fundamental acceptance of the person we choose and the one who chooses us. "We're all flawed. With parenting, we know that comes with the territory. With spouses, we say 'This is terrible.'"

The culture, however, pushes us in the direction of discontent. "Some disillusionment and feelings of discouragement are normal in the love-based matches in our culture," explains Doherty. "But consumer culture tells us we should not settle for anything that is not ideal for us."

by Rebecca Webber, Psychology Today |  Read more:

How Your Cat Is Making You Crazy

No one would accuse Jaroslav Flegr of being a conformist. A self-described “sloppy dresser,” the 63-year-old Czech scientist has the contemplative air of someone habitually lost in thought, and his still-youthful, square-jawed face is framed by frizzy red hair that encircles his head like a ring of fire.

Certainly Flegr’s thinking is jarringly unconventional. Starting in the early 1990s, he began to suspect that a single-celled parasite in the protozoan family was subtly manipulating his personality, causing him to behave in strange, often self-destructive ways. And if it was messing with his mind, he reasoned, it was probably doing the same to others.

The parasite, which is excreted by cats in their feces, is called Toxoplasma gondii (T. gondii or Toxo for short) and is the microbe that causes toxoplasmosis—the reason pregnant women are told to avoid cats’ litter boxes. Since the 1920s, doctors have recognized that a woman who becomes infected during pregnancy can transmit the disease to the fetus, in some cases resulting in severe brain damage or death. T. gondii is also a major threat to people with weakened immunity: in the early days of the AIDS epidemic, before good antiretroviral drugs were developed, it was to blame for the dementia that afflicted many patients at the disease’s end stage. Healthy children and adults, however, usually experience nothing worse than brief flu-like symptoms before quickly fighting off the protozoan, which thereafter lies dormant inside brain cells—or at least that’s the standard medical wisdom.

But if Flegr is right, the “latent” parasite may be quietly tweaking the connections between our neurons, changing our response to frightening situations, our trust in others, how outgoing we are, and even our preference for certain scents. And that’s not all. He also believes that the organism contributes to car crashes, suicides, and mental disorders such as schizophrenia. When you add up all the different ways it can harm us, says Flegr, “Toxoplasma might even kill as many people as malaria, or at least a million people a year.”

An evolutionary biologist at Charles University in Prague, Flegr has pursued this theory for decades in relative obscurity. Because he struggles with English and is not much of a conversationalist even in his native tongue, he rarely travels to scientific conferences. That “may be one of the reasons my theory is not better known,” he says. And, he believes, his views may invite deep-seated opposition. “There is strong psychological resistance to the possibility that human behavior can be influenced by some stupid parasite,” he says. “Nobody likes to feel like a puppet. Reviewers [of my scientific papers] may have been offended.” Another more obvious reason for resistance, of course, is that Flegr’s notions sound an awful lot like fringe science, right up there with UFO sightings and claims of dolphins telepathically communicating with humans.

But after years of being ignored or discounted, Flegr is starting to gain respectability. Psychedelic as his claims may sound, many researchers, including such big names in neuroscience as Stanford’s Robert Sapolsky, think he could well be onto something. Flegr’s “studies are well conducted, and I can see no reason to doubt them,” Sapolsky tells me. Indeed, recent findings from Sapolsky’s lab and British groups suggest that the parasite is capable of extraordinary shenanigans. T. gondii, reports Sapolsky, can turn a rat’s strong innate aversion to cats into an attraction, luring it into the jaws of its No. 1 predator. Even more amazing is how it does this: the organism rewires circuits in parts of the brain that deal with such primal emotions as fear, anxiety, and sexual arousal. “Overall,” says Sapolsky, “this is wild, bizarre neurobiology.” Another academic heavyweight who takes Flegr seriously is the schizophrenia expert E. Fuller Torrey, director of the Stanley Medical Research Institute, in Maryland. “I admire Jaroslav for doing [this research],” he says. “It’s obviously not politically correct, in the sense that not many labs are doing it. He’s done it mostly on his own, with very little support. I think it bears looking at. I find it completely credible.”

What’s more, many experts think T. gondii may be far from the only microscopic puppeteer capable of pulling our strings. “My guess is that there are scads more examples of this going on in mammals, with parasites we’ve never even heard of,” says Sapolsky.

by Kathleen McAuliffe, The Atlantic |  Read more:
Photo: Dennis Kunkel Microscopy, Inc./Visuals Unlimited/Corbis Images

Wednesday, February 8, 2012

Tumblr: Tumbling on Success


Tumblr launched in February 2007, with the tagline "Blogging made easy". Its first accounts specialised in art, media and porn (14 of the top 20 search keywords containing the word "tumblr" are still associated with adult blogs, according to SEOBook.com). Around 42 per cent of all original posts are photos.

But Tumblr is growing up, fast: the site expanded its user base by 900 per cent in the year to June 2011. In 2010, it served under two billion monthly page views; now, it generates about 14 billion, more than Wikipedia or Twitter. Its 36 million users create some 42 million posts each day -- 13.5 billion in total so far. According to Nielsen, it was the UK’s second most popular social network or blog in the third quarter of 2011, with 229.6 million page views, trailing only Facebook. In September 2011, the company raised $85 million (£55m) from investors -- a round that valued Tumblr at $800 million (£500m).

If Facebook is the social network for online identification and authentication, and Twitter is for communication, Tumblr fulfils a different role: self-expression. Users can upload seven types of media -- text, photos, quotes, links, dialogue, audio, video -- from one button on their dashboard and push it to their public-facing tumblelog. These blogs can be designed however a user wants, or dressed in a "theme" (the most popular theme, Redux, has three million users). Tumblr is extremely easy to use as a free-form blogging platform, but has also developed into its own social network. Users follow other tumblelogs, whose content appears in their dashboards, not unlike Facebook's newsfeed; hitting the "reblog" button publishes that post to their own blogs, a feature Tumblr put out two years before Twitter introduced its own retweet button. "The social network that emerges out of Tumblr is interesting because it's driven by content, not by the social graph that these other networks are building around," says John Maloney, the company's president. And that content spreads quickly: on average, a Tumblr post gets reblogged nine times.

As Tumblr matures, it's attracting powerful fans. In October 2011, President Obama launched his 2012 re-election campaign on Tumblr, encouraging user submissions as a part of a "huge collaborative storytelling effort". Six months earlier, the US State Department launched the "official US Department of State presence on Tumblr", with video posts and article links. Major media outlets such as Newsweek, the New York Times and the BBC have tumblrs, along with fashion brands such as Alexander McQueen and Oscar de la Renta. Tech is represented by high-profile companies including IBM and Olympus.

David Karp was 19 when he founded Tumblr -- "still a dippy, nerdy kid," as he puts it. The New Yorker learned to code for the web at 11, was home-schooled from 15 and lived in Japan by himself for a year at 18. He's now 25 and has grown up with the site. "I was always self-conscious about my age," he says. "I still don't have that much faith in me." But his goal is ambitious: Karp sees Tumblr not as a network, but as a product he's designing. "We're striving towards perfection," he says. "We're trying to build the iPod."

by Tom Cheshire, Wired UK |  Read more:
Photo: Chris Crisman

Al Rodente

There are people around who remember the days when squirrel was a more commonly served meat on the American table than chicken. The Kentucky Long Rifle, with its long barrel and small caliber, was designed for squirrel hunting (the smaller the caliber, the more squirrel left to take home after shooting one).

The ideal shot was aimed not at the squirrel, but at the tree branch directly below it, so that the animal would be killed by the concussion of the bullet instead of the bullet itself. Historians say that this is what won the Revolutionary War; even the most highly trained British soldiers were no match for squirrel killers trained by hunger.

Until recent decades, Americans ate squirrel meat because it was cheap, plentiful, and there, according to Hank Shaw, author of Hunt, Gather, Cook: Finding the Forgotten Feast. Domesticated animals may have been easier to catch, but, in the days before the industrialization of farming, they were expensive to raise and feed. “When Herbert Hoover promised a chicken in every pot, that was a big deal,” Shaw adds. The first edition of The Joy of Cooking, published in 1931, was heavy on the squirrel. As it moved into later and later editions, Hoover’s promise was fulfilled (by other politicians, if not Hoover himself) and chicken gradually replaced squirrel.

Shaw shot his first squirrel when he was working as a reporter for a daily paper in Minnesota. He’d made it through an underpaid stint as a cub reporter in Long Island by catching and eating his own fish. When he arrived in Minnesota, though, he could not help but take note of the squirrels. The state has such a vibrant squirrel scene that a cottage industry has grown up around trapping and removing ones that have moved into people’s homes. Shaw bought a few books about squirrel hunting off the internet, applied for a license to hunt them, and got to it.

In doing so, he placed himself on the vanguard of the re-squirreling of the American diet. Squirrel-eating has been trendy in Great Britain for half a decade now — spurred by a nationalistic fervor to kill as many as possible of the invasive American gray squirrel, which is outcompeting the domestic red squirrel (the latter had the good fortune to star in a Beatrix Potter book, one of the best ways to cement your status as charismatic megafauna). (...)

The shift has left squirrel hunting to immigrant populations like the Hmong, who hunt squirrels in America because they’re the closest thing to the ones they hunted in the mountains of Southeast Asia. And it’s left it to people like Shaw — idealists who believe that, if you’re going to eat meat, it’s more noble (and thrifty) to kill whatever protein happens to be closest to home.

It’s hard to imagine more sustainable local game — squirrels are abundant, far from endangered, and don’t even require refrigeration the way that big game does. The standard rule of thumb is that one squirrel = enough meat for one dinner for one person. The squirrel is road food — the kind of prey that fed cross-country hikers, in the days before MREs and freeze-dried lentils. Squirrel is like the drive-through cheeseburger of the forest — albeit a cheeseburger that needs to be gutted first.

by Heather Smith, Grist |  Read more:
Photo by Chrissy Wainwright

The New You

[ed. Two excellent articles with similar themes: how your personal information and history are being collected, analyzed, sold, and manipulated in ways you can't imagine. The implications are both profound and scary.]

From the NY Times (Facebook is Using You):

Facebook makes money by selling ad space to companies that want to reach us. Advertisers choose key words or details — like relationship status, location, activities, favorite books and employment — and then Facebook runs the ads for the targeted subset of its 845 million users. If you indicate that you like cupcakes, live in a certain neighborhood and have invited friends over, expect an ad from a nearby bakery to appear on your page. The magnitude of online information Facebook has available about each of us for targeted marketing is stunning. In Europe, laws give people the right to know what data companies have about them, but that is not the case in the United States.

Facebook made $3.2 billion in advertising revenue last year, 85 percent of its total revenue. Yet Facebook’s inventory of data and its revenue from advertising are small potatoes compared to some others. Google took in more than 10 times as much, with an estimated $36.5 billion in advertising revenue in 2011, by analyzing what people sent over Gmail and what they searched on the Web, and then using that data to sell ads. Hundreds of other companies have also staked claims on people’s online data by depositing software called cookies or other tracking mechanisms on people’s computers and in their browsers. If you’ve mentioned anxiety in an e-mail, done a Google search for “stress” or started using an online medical diary that lets you monitor your mood, expect ads for medications and services to treat your anxiety.

by Lori Andrews |  Read more:

From The Atlantic (A Guide to the Digital Advertising Industry That's Watching Your Every Click):

At the start of the 21st century, the advertising industry is guiding one of history's most massive stealth efforts in social profiling. At this point you may hardly notice the results of this trend. You may find you're getting better or worse discounts on products than your friends. You may notice that some ads seem to follow you around the internet. Every once in a while a website may ask you if you like a particular ad you just received. Or perhaps your cell phone has told you that you will be rewarded if you eat in a nearby restaurant where, by the way, two of your friends are hanging out this very minute.

You may actually like some of these intrusions. You may feel that they pale before the digital power you now have. After all, your ability to create blogs, collaborate with others to distribute videos online, and say what you want on Facebook (carefully using its privacy settings) seems only to confirm what marketers and even many academics are telling us: that consumers are captains of their own new-media ships.

But look beneath the surface, and a different picture emerges. We're at the start of a revolution in the ways marketers and media intrude in -- and shape -- our lives. Every day, most if not all Americans who use the internet, along with hundreds of millions of other users from all over the planet, are being quietly peeked at, poked, analyzed and tagged as they move through the online world. Governments undoubtedly conduct a good deal of snooping, more in some parts of the world than in others. But in North America, Europe, and many other places, companies that work for marketers have taken the lead in secretly slicing and dicing the actions and backgrounds of huge populations on a virtually minute-by-minute basis. Their goal is to find out how to activate individuals' buying impulses so they can sell us stuff more efficiently than ever before. But their work has broader social and cultural consequences as well. It is destroying traditional publishing ethics by forcing media outlets to adapt their editorial content to advertisers' public-relations needs and slice-and-dice demands. And it is performing a highly controversial form of social profiling and discrimination by customizing our media content on the basis of marketing reputations we don't even know we have.

by Joseph Turow |  Read more:

Illustration: Joon Mo Kang, NY Times

Why Is It So Hard for New Musical Instruments to Catch On?


For the musically daring, it's hard to beat the Guthman Musical Instrument Competition, which takes place later this month at Georgia Institute of Technology. One previous winning entry turned whisks and garlic presses into music makers. Another, the Double Slide Controller, borrowed the trombone's slide mechanism—a 15th-century innovation—to shape digitally produced tones into an otherworldly drone.

Events like these would seem to signal a golden age for the adventurous musician. New instruments have come to market at a steady clip in recent years, offering novel and occasionally fanciful ways to perform music. Maybe you've heard of the Eigenharp, the Tenori-on, or the Harpejji? Or maybe not. Good luck hearing any of these contraptions on the recordings of prominent modern artists. You're more likely to come across Tibetan singing bowls (Fleet Foxes), 17th-century Indonesian angklung (Okkervil River), or the zither (P.J. Harvey). In other words, established pop and rock musicians seem more inclined to try just about any instrument other than a new one. The turntable might be the last new implement to break into pop music; there's even debate over whether that qualifies as an instrument, despite having its own form of notation and a course at Berklee College of Music. According to hip-hop lore, Grand Wizzard Theodore invented scratching 36 years ago. Suddenly, the turntable became a device used not just for listening to music, but performing it. And like the guitar, it turned into a focal point in live performances.

Now consider some of the instrumental developments in the 36 years prior: the solid-body electric guitar, the pedal-steel guitar, the steel drum, the electric bass, the synthesizer, and the drum machine.

Music technology in general has charged forward, and computers, digital sampling and MIDI have dramatically shaped music. But no one mimes to music on the "air sampler" and the idea of a "Software Hero" video game, with its own simulated laptop, is a little glum. Will a brand-new instrument ever capture hearts, minds, and speaker systems again?

THE PROBLEM WITH NEWNESS

It's hard to overstate the importance of new musical instruments in history. The piano's dynamic range allowed for a subtlety in composition previously unimagined. The modern drum set paved the way for jazz. Rock and roll would not have happened without the electric guitar. As composer Edgard Varese put it in 1936, "It is because new instruments have been constantly added to the old ones that Western music has such a rich and varied patrimony."

So what happened? Why has there been such a drought of new instruments—especially in rock and pop, which thrive on novelty?

by William Weir, The Atlantic |  Read more:
Photo: AP