Monday, February 13, 2012

Spirit of a Racer in a Siberian Husky's Blood

Winnie’s breed does not have royal roots, but her lineage is fierce. It dates to what some consider the finest feat in dog-and-human history, a 1925 race to deliver lifesaving diphtheria serum to icebound Nome, Alaska. The event gripped the nation and later became an inspiration for the Iditarod race.

But after the headlines ceased, what happened to two of the lead dogs — Winnie’s forebear Togo and Balto, whose statue stands in Central Park — is a tale that reflects Americans’ quick creation and destruction of celebrities, involving Hollywood, a 10-cent circus, a Cleveland zoo, a ruined friendship and a sports controversy that, almost 90 years later, still raises the hackles of sled-dog drivers everywhere.

“It’s still very much in the mind of mushers,” said Bob Thomas, a Siberian musher and a historian for the International Siberian Husky Club.

In January 1925, an outbreak of diphtheria had killed two children and was spreading quickly in Nome, a town of about 1,400 that was icebound seven months a year.

A local doctor telegraphed Washington, urgently requesting serum to treat the diphtheria, and public health officials found a supply in Anchorage, according to Gay and Laney Salisbury’s riveting book, “The Cruelest Miles.” Officials determined that dog sleds were the best way to transport the serum from Nenana, a northern railroad stop, to Nome, 674 miles west. A group of top mushers and sled-dog racers would hand off the serum at roadhouses along the route. That distance usually took a few weeks to cover. By then, public health officials feared, much of Nome would be dead.

As the dog-sled teams raced west, roadhouse owners provided near-real-time updates over telephone and telegraph lines. Front-page headlines from The New York Times included “Nome Relief Dogs Speed 192 Miles,” “Serum Relief Near for Stricken Nome,” and “Blizzard Delays Nome Relief Dogs in the Final Dash.”

“It came right down to just the spirit of men and dogs against nature,” Gay Salisbury said.

A noted racer and mining-company dog driver named Leonhard Seppala was originally assigned half of the Nenana-Nome distance. Seppala’s lead dog, a gray and brown Siberian husky named Togo, had covered 4,000 miles in one year alone, guided a famed polar explorer around Alaska, and won major races. Togo had been Seppala’s lead dog since he was 8 months old; now, at age 12, Togo would have one of his final Alaska outings with his driver.

Seppala, Togo and the team set out at high speeds, running a total of 261 miles — they carried the serum for almost double the length any other team did. Twice, to save time, they violated warnings to avoid Norton Sound, a dangerous inlet of the Bering Sea, and instead went straight over the frozen sea, where ice often separated from shore, stranding travelers on floes. In the dark, in 85-below temperatures with wind chill, Seppala could not see or hear the cracking ice, and was dependent on Togo, the Salisburys wrote.

Meanwhile, worried that Seppala’s dogs would get too tired, Alaska’s governor called in additional drivers for the final portion. Just five and a half days after the serum left Nenana, a driver named Gunnar Kaasen and a lead dog named Balto pulled into Nome, serum in hand.

“It was Balto who led the way,” Kaasen told a reporter. “The credit is his.”

Kaasen and Balto, a handsome black Siberian with white paws, became instant heroes. There were front-page articles; commendations from the president; tributes from the Senate; newspapers (including The Times) printing a report that Balto had died from frozen lungs, then quickly rescinding it; wishful editorials proposing that Balto appear at Westminster; a national tour; a Hollywood contract.

But as Kaasen, Balto and that team of dogs were becoming celebrities, the other mushers from the relay straggled into Nome with a different story. Kaasen was assigned the next-to-last leg. But, in an account that some mushers still doubt, Kaasen said the lights were off in the cabin where he was to hand off the serum, so he headed for Nome himself.

Seppala was already broken when he arrived — he had lost Togo when the dog ran off after a reindeer. Then he found that not only were Kaasen and Balto on their way to Hollywood, but the newspapers had attributed Togo’s lifetime feats to Balto, a dog he had not considered decent enough to put on his 20-dog team.

by Stephanie Clifford, NY Times |  Read more:
Photo: Underwood & Underwood/Corbis

Keeping Consumers on the Digital Plantation

In the old days, you listened to music on your iPod while exercising. During an idle moment at the office you might use Google on your Microsoft Windows PC to search for the latest celebrity implosion. Maybe you would post an update on Facebook. After dinner, you could watch a DVD from Netflix or sink into a new page-turner that had arrived that day from Amazon.

That vision, where every company and every device had its separate role, is so 2011.

The biggest tech companies are no longer content simply to enhance part of your day. They want to erase the boundaries, do what the other big tech companies are doing and own every waking moment. The new strategy is to build a device, sell it to consumers and then sell them the content to play on it. And maybe some ads, too.

Last week’s news that Google is preparing its first Google-branded home entertainment device — a system for streaming music in the house — might seem far afield for an Internet search and advertising company, but fits solidly into an industrywide goal in which each tech company would like to be all things to all people all day long.

“It’s not about brands or devices or platforms anymore,” said Michael Gartenberg, an analyst at Gartner. “It’s about the ecosystem. The idea is to get consumers tied into that ecosystem as tightly as possible so they and their content are locked into one system.” 

by David Streitfeld, NY Times |  Read more:

Sunday, February 12, 2012

No Doubt



Pianissimo
Source: unknown

Jeremy Lin’s Social Media Fast Break


We live in fickle times, but this is ridiculous. New York, suddenly, has gone nuts over Jeremy Lin, an Asian-American, Harvard-educated point guard who has played only two good games for the NBA’s hapless Knicks. And that’s just the beginning: In China, Lin’s name was among the top-10 search terms on Monday on Sina Weibo, the Chinese equivalent to Twitter. Last Friday, most of the world hadn’t heard of him. Today, you could make a case he’s the most famous Asian-American athlete since Tiger Woods. Which is just kooky. No question, Lin played really, really well against the New Jersey Nets and Utah Jazz over the weekend, but that hardly makes him the second coming of Oscar Robertson.

Now, don’t get me wrong. I’ve got nothing against Jeremy Lin. He was a high school phenom in Palo Alto, Calif., and I know some Asian-American kids out here in Berkeley who worship the ground he walks on. Lin didn’t make the NBA because he’s freakishly tall, like the 7-foot-4 Yao Ming (Lin is “only” 6′3″). He’s there because he can play ball, because he has a wicked fast first step when he drives to the basket, and he knows how to deliver the rock to the big guys (a skill a surprising number of “legitimate” NBA guards show little interest in mastering). He’s a triumph of will over genetic endowment, a fact that makes him inspiring to an entire generation of Californian kids restless with their model minority shackles.

But you can like Lin, and you can root for him, and yet still find his instantaneous, Tim Tebow-like ascent (in more ways than one!) to pop-cultural phenom — LINSANITY! — to be more than a little disorienting. Jeremy Lin is the latest example of how our socially-mediated, always-on world can churn any data point, any outrage, any act of heroism or moment of despair into a full-scale world-wide frenzy in less time than it took me to write this sentence.

We’ve seen this before. The same forces — social media, digital publishing tools, smartphone ubiquity — that are giving us Linsanity just blitzkrieged the Susan G. Komen Race for the Cure Foundation. They torpedoed Hollywood’s attempt to force SOPA and PIPA through Congress and blew up Bank of America’s plan to charge a $5 fee for debit card use. They fueled the Occupy Wall Street movement, magnified every Tebow prostration before God into a worldwide religious orgy and are ever-more ready to pounce on any misstep by a Mitt Romney or a Newt Gingrich and explode it into an instant political crisis.

And the crazy thing is, we’re figuring this out as we go along — and giving the phenomenon more power. As we understand this new world, and submerge ourselves in it, we are beginning to take our cues from it.
The mainstream media now seems to be adapting its coverage of events as much on the basis of whether something blows up in social media as on the perceived newsworthiness of the event itself. It’s startling, but also natural: When you see a fire start to blaze, you run to cover it. And so Linsanity breeds more Linsanity.

by Andrew Leonard, Salon |  Read more:
Photo: AP/Kathy Kmonicek

OK Go


The new music video from OK Go, made in partnership with Chevrolet. OK Go set up more than 1,000 instruments across two miles of desert outside Los Angeles. A Chevy Sonic was outfitted with retractable pneumatic arms designed to play the instruments, and the band recorded this version of Needing/Getting, singing as they played the instrument array with the car. The video took 4 months of preparation and 4 days of shooting and recording. There are no ringers or stand-ins; Damian took stunt driving lessons. Each piano had the lowest octaves tuned to the same note so that they'd play the right note no matter where they were struck. Many thanks to Chevy for believing in and supporting such an insane and ambitious project, and to Gretsch for providing the guitars and amps.

[ed. Thanks, Nate!]

“Early Trouble” by J. Scott Pike 1964
via:

Obama, Explained


In office as during his campaign—indeed, through the entirety of his seven-plus years as a national figure since his keynote speech at the Democratic Convention in the summer of 2004—Obama has maintained his stoic, unflapped, “no drama” air. During the fall and winter of 2007, his campaign seemed to be getting nowhere against Hillary Clinton, who was then, to knowledgeable observers, the “inevitable” nominee. In 2008, John McCain’s selection of Sarah Palin as his running mate seemed to energize his campaign so much that, despite gathering signs of financial disaster under the incumbent Republicans, just after Labor Day the McCain-Palin team had opened up a lead over Obama and Joe Biden in several national polls. CBS News and an ABC–Washington Post poll had McCain up by 2 percentage points in early September, a week before the Lehman Brothers bankruptcy; a USA Today–Gallup poll that same week had him ahead by a shocking 10 points. But Obama and Biden stayed unrattled and on message, and two months later they won with a two-to-one landslide in the Electoral College and a 7-point margin in the popular vote. The earnestly devotional HOPE poster by Shepard Fairey was the official icon of the Obama campaign. But its edgier, unofficial counterpart, a Photoshopped Internet image that appeared as an antidote to the panic over polls and Palin, perfectly captured the candidate’s air of icy assurance. It showed a no-nonsense Obama looking straight at the camera, with the caption EVERYONE CHILL THE FUCK OUT, I GOT THIS!

The history is relevant because it shows how quickly impressions of strength or weakness can evaporate and become almost impossible to reimagine. Try to think back to when sophisticated people thought that Sarah Palin was the key to Republican victory, or when Obama’s every political instinct seemed inspired. I can attest personally to a now-startling fact behind Jimmy Carter’s rise to the presidency. When he met privately with editorial-board members and veteran political figures across the country in the early days of his campaign—people who had seen contenders come and go and were merciless in spotting frailties—the majority of them went away feeling that in Carter they had encountered a person of truly exceptional political insight and depth. (You might not believe me; I have the notes.) Is this how the Nobel Peace Prize committee’s choice of Obama as its laureate within nine months of his taking office will look as the years pass—the symbol of a “market top” in the world’s romanticism about Obama?

Whether things seem to be going very well or very badly around him—whether he is announcing the death of Osama bin Laden or his latest compromise in the face of Republican opposition in Congress—Obama always presents the same dispassionate face. Has he been so calm because he has understood so much about the path ahead of him, and has been so clever in the traps he has set for his rivals? Or has he been so calm because, like the high-school kid on the plane, he has been so innocently unaware of how dire the situation has truly been?

This is the central mystery of his performance as a candidate and a president. Has Obama in office been anything like the chess master he seemed in the campaign, whose placid veneer masked an ability to think 10 moves ahead, at which point his adversaries would belatedly recognize that they had lost long ago? Or has he been revealed as just a pawn—a guy who got lucky as a campaigner but is now pushed around by political opponents who outwit him and economic trends that overwhelm him?

The end of a president’s first term is an important time to ask these questions, and not just because of the obvious bearing on his fitness for reelection. Hard as it is to have any dispassionate discussion of a president’s performance during an election year, it will be even harder once the election is over. If a year from now Obama is settling in for a second term, a halo effect will extend back to everything he did during his first four years. His programs will be more effective in reality, since he will get that many more years to cement them in with follow-up measures, supportive appointments to federal agencies and the courts, and possible vetoes of any attempts at repeal. And, through the lens of history, they will seem more effective, since whatever he did in his first term will appear to have been part of an overall plan that was ratified through reelection. Yet if a year from now a just-beaten former President Obama is thinking about his memoirs and watching his former appointees blame one another, and him, for the loss, the very same combination of missteps and achievements will be viewed as a narrative leading inexorably to defeat. By saying, after a year in office, that he would rather be “a really good one-term” president than a “mediocre” president who served two terms, Obama was playing to the popular conceit that presidents should rise above such petty concerns as reelection. The reality, though, is that our judgment about “really good” and “mediocre” presidents is colored by how long they serve. A failure to win reelection places a “one-term loser” asterisk on even genuine accomplishments. Ask George H. W. Bush, victor in the Gulf War; ask Jimmy Carter, architect of the Camp David agreement.

by James Fallows, The Atlantic |  Read more:
Photo: Carolyn Kaster/Associated Press/Corbis Images

Saturday, February 11, 2012


ahhh, that feels good...right there...
via:

The Ahh-ness of Things


[ed. Cherry blossoms are thought to be particularly symbolic of the concept of mono no aware. The transience of the blossoms, the extreme beauty and quick death, has often been associated with mortality and the ephemeral nature of life.]

The phrase is derived from the Japanese word mono, which means "thing", and aware, which was a Heian-period expression of measured surprise (similar to "ah" or "oh"), translating roughly as "pathos", "poignancy", "deep feeling", or "sensitivity". Thus, mono no aware has frequently been translated as "the 'ahh-ness' of things", life, and love. Awareness of the transience of all things heightens appreciation of their beauty, and evokes a gentle sadness at their passing.

via: Wikipedia

Four Seaweeds for Health


A staple in Asian diets since ancient times, seaweeds are among the healthiest foods on the planet, packed with vitamins, minerals, and antioxidants. And now we know they’re great for the waistline, too: A 2010 study found the algae can reduce our rate of fat absorption by 75 percent, thanks to their inhibitory effect on a digestive enzyme called lipase. (Scientists at Newcastle University are about to begin clinical trials on a “wonder bread” made with alginate fibers and designed to speed weight loss.) Here are four briny plants to sample.

Wakame (Undaria pinnatifida)
Pappardelle-like leaves with a salty-sweet zest

Nutrition Perks
Nutritionist Gillian McKeith, PhD, author of the You Are What You Eat Cookbook, calls wakame the woman’s seaweed because it is loaded with osteoporosis-preventing calcium and magnesium and acts as a diuretic (which helps reduce bloating). Wakame’s pigment, fucoxanthin, is known to improve insulin resistance, and a 2010 animal study found that fucoxanthin burns fatty tissue.

Kitchen Prep
Soak the leaves in cold water until tender, then enjoy them in a cucumber salad, dressed with rice vinegar, sesame oil, and soy sauce. To make miso soup, add wakame, tofu, and a few tablespoons of miso paste to a kombu stock (see below).

Nori (Porphyra species)
Papery sheets with a mild earthy taste

Nutrition Perks
Among the marine flora, nori is one of the richest in protein (up to 50 percent of the plant’s dry weight), and one sheet has as much fiber as a cup of raw spinach and more omega-3 fatty acids than a cup of avocado. Nori contains vitamins C (a potent antioxidant) and B12 (crucial for cognitive function) and the compound taurine, which helps control cholesterol.

Kitchen Prep
For a snack, toast strips of nori in the oven at low heat. Or cover a sheet with cooked brown rice; add a layer of sliced carrots, celery, or avocado, and a dash of wasabi. Roll it up and dip in a sauce of tamari, toasted-sesame oil, ginger, and rice vinegar.

by Tova Gelfond, Oprah |  Read more:
Photo: Dan Saelinger

Letters to Alice (by wjosna)
via:

Friday, February 10, 2012

What Remains: Conversations With America's Funeral Directors

The small film company I sometimes work for was planning a feature on new trends in funerals, and I had flown out for the weekend to try to meet some of the younger, hipper funeral directors at the conference. One of these was Ryan, a round man with a wide smile and an impeccable hair-part, whose car, he told me, has a bumper sticker that says “Let’s Put The ‘Fun’ Back In Funeral.” He started his career as a funeral director, but has since moved into the lucrative field of “death care industry” consultation, where he works with funeral directors on ways to expand their businesses. In one of our conversations he told me, “The worst thing I’ve heard a funeral director say is ‘we’ve always done it this way.’” Later, I told him my plan to attend that evening’s “Funeral Directors Under 40: A Night on the Town” event. Without missing a beat, he lowered his voice and said, “Funeral directors are notoriously heavy drinkers. There will definitely be some hook-ups.”

The funeral industry is in the midst of a transition of titanic proportions. America is secularizing at a rapid pace, with almost 25% of the country describing itself as unchurched. Americans, embracing a less religious view of the afterlife, are now asking for a "spiritual" funeral instead of a religious one. And cremation numbers are up. Way up. In liberal, secular states, specifically in the Pacific Northwest, cremation rates have steadily increased to more than half of disposals, up from the low single digits in 1990. The rest of the nation has also experienced steady gains in cremation since 2000 (except in the Bible Belt, where cremation rates remain relatively low). The rate of cremation has skyrocketed as Americans back away from the idea that Jesus will be resurrecting them straight from the grave. And so in the past twenty years, funeral directors have had to transform from presenters of a failed organism, where the sensation of closure is manifest in the presence of the deceased body, into arbiters of the meaning of a secular life that has just been reduced to ash. Reflecting this trend, this year's NFDA conference was, for the first time in its history, held jointly with the Cremation Association of North America (CANA).

Talking with funeral directors at the conference, I began to realize the scope of the crisis spurred by the rise of cremation and its new importance. As one former funeral director said, “If the family wanted a cremation, we’d say ‘That’ll be $595,’ hand them the urn and show them the door. Not anymore though.” The industry is scrambling to offer value-added cremation services in order to remain solvent.

This tension about how best to innovate was in evidence at the first presentation I attended, titled “How To Step Up Your Game.” The presenter worked for a consulting firm that specialized in business strategies and management—the funeral industry was his particular subject of expertise. He launched into his talk with a story about a recent trip to Disney World with his daughter. While walking through the park, he realized how much the funeral industry could learn from the attraction. At Disney World, every interaction had been scripted and rehearsed, down to the greetings from the custodians. Experiences are controlled. Likewise, he said, every funeral should offer the same experience for everyone, whether cremated or open-casket. If, say, the customer was having an open-casket service with a priest and an organist, there should also be a corresponding service for someone, possibly secular, who has just been cremated. If priests are no longer always present to say platitudes over the dead, funeral directors will have to develop a corresponding basic, secular service to stand in as a reverent farewell. Thus they’d take a much larger role in the memorial, acting more like mainstream event planners and offering such amenities as video tributes, music arrangements, and other touches of the new-age burial.

by Max Rivlin-Nadler, The Awl |  Read more:

The Beatles


Much Ado About Nothing


Nick Hornby knew better, but he didn’t care. Because suddenly there was that face—the upturned nose, the lupine grin, the wary expression barely softened by the passage of, what, three decades now? Everyone else in the London club that December night was flittering around Colin Firth, set aglow by the Oscar buzz for his performance in The King’s Speech. Hornby let them flit. For here stood … Kevin Bacon. Undisturbed. That knowing smirk may have derailed him as a leading man, but it has allowed for a career of darker, richer roles—and allows him still to cruise a cocktail party longer than most boldfaced names without some fanboy rushing up to say how wonderful he is.

God knows, Hornby had seen that too often: an actor friend, eyes darting, cornered by a gushing stranger. This belated celebration of Firth’s 50th birthday was a private bash where artists and actors, people like Firth and Bacon—and, well, Hornby—could expect to relax. After all, between best-selling books such as About a Boy and a 2010 Academy Award nod earlier in the year for his screenplay for An Education, he had been cornered plenty himself.

Yet when he saw Bacon, Hornby couldn’t help it. He edged closer. It was like that scene from Diner when Bacon’s buddy sees a boyhood enemy in a crowd and breaks his nose: Hornby had no choice. In 1983 a girlfriend had brought home a tape of director Barry Levinson’s pitch-perfect comedy about twentysomething men, their nocturnal ramblings in 1959 Baltimore, their confused stumble to adulthood. Hornby was 26, a soccer fanatic, a writer searching for a subject. Diner dissected the male animal’s squirrelly devotion to sports, movies, music, and gambling. Diner had one man give his fiancée a football-trivia test and had another stick his penis through the bottom of a popcorn box. Hornby declared it, then and there, “a work of great genius.”

Midway through the movie, the ladies’ man Boogie, played by Mickey Rourke, is driving in the Maryland countryside with Bacon’s character, the perpetually tipsy Fenwick. They see a beautiful woman riding a horse. Boogie waves the woman down.

“What’s your name?,” Boogie asks.

“Jane Chisholm—as in the Chisholm Trail,” she says, and rides off.

Rourke throws up his hands and utters the words that Hornby, to this day, uses as an all-purpose response to life’s absurdities: “What fuckin’ Chisholm Trail?” And Fenwick responds with the line that, for Diner-lovers, best captures male befuddlement over women and the world: “You ever get the feeling there’s something going on that we don’t know about?”

In all, the scene encompasses only 13 lines of dialogue—an eternity if you’re Bacon at a party and a stranger knows them all. But Hornby wouldn’t be stopped. “I pinned that guy to the wall, and I quoted line after line,” Hornby recalls. “I thought, I don’t care. I’m never going to meet Kevin Bacon again. I need to get ‘What fuckin’ Chisholm Trail?’ off my chest.”

Hornby could not have planned a more apt tribute: Diner introduced to movies a character who compulsively recites lines from his favorite movie—and nothing else. And Hornby’s subsequent books about a fan obsessed with Arsenal football (Fever Pitch) and another obsessed with pop music (High Fidelity)—two postmodern London slackers who could easily have slid into a booth at the Fells Point Diner—are only the most obvious branches of the movie’s family tree.

Made for $5 million and first released in March 1982, Diner earned less than $15 million and lost out on the only Academy Award—best original screenplay—for which it was nominated. Critics did love it; indeed, a gang of New York writers, led by Pauline Kael, saved the movie from oblivion. But Diner has suffered the fate of the small-bore sleeper, its relevance these days hinging more on eyebrow-raising news like Barry Levinson’s plan to stage a musical version—with songwriter Sheryl Crow—on Broadway next fall, or reports romantically linking star Ellen Barkin with Levinson’s son Sam, also a director. The film itself, though, is rarely accorded its actual due.

Yet no movie from the 1980s has proved more influential. Diner has had far more impact on pop culture than the stylistic masterpiece Blade Runner, the indie darling Sex, Lies, and Videotape, or the academic favorites Raging Bull and Blue Velvet. Leave aside the fact that Diner served as the launching pad for the astonishingly durable careers of Barkin, Paul Reiser, Steve Guttenberg, Daniel Stern, and Timothy Daly, plus Rourke and Bacon—not to mention Levinson, whose résumé includes Rain Man, Bugsy, and Al Pacino’s recent career reviver, You Don’t Know Jack. Diner’s groundbreaking evocation of male friendship changed the way men interact, not just in comedies and buddy movies, but in fictional Mob settings, in fictional police and fire stations, in commercials, on the radio. In 2009, The New Yorker’s TV critic Nancy Franklin, speaking about the TNT series Men of a Certain Age, observed that “Levinson should get royalties any time two or more men sit together in a coffee shop.” She got it only half right. They have to talk too.

What Franklin really meant is that, more than any other production, Diner invented … nothing. Or, to put it in quotes: Levinson invented the concept of “nothing” that was popularized eight years later with the premiere of Seinfeld. In Diner (as well as in Tin Men, his 1987 movie about older diner mavens), Levinson took the stuff that usually fills time between the car chase, the fiery kiss, the dramatic reveal—the seemingly meaningless banter (“Who do you make out to, Sinatra or Mathis?”) tossed about by men over drinks, behind the wheel, in front of a cooling plate of French fries—and made it central.

by S.L. Price, Vanity Fair |  Read more:
Photo: Paul Reiser

Open Your Mouth and You're Dead


Junko Kitahama's face is pale blue, her mouth agape, her head craned back like a dead bird’s. Through her swim mask, her eyes are wide and unblinking, staring at the sun. She isn’t breathing.

“Blow on her face!” yells a man swimming next to her. Another man grabs her head from behind and pushes her chin out of the water. “Breathe!” he yells. Someone from the deck of a boat yells for oxygen. “Breathe!” the man repeats. But Kitahama, who just surfaced from a breath-hold dive 180 feet below the surface of the ocean, doesn’t breathe. She doesn’t move. Kitahama looks dead.

Moments later, she coughs, jerks, twitches her shoulders, flutters her lips. Her face softens as she comes to. “I was swimming and…” She laughs and continues. “Then I just started dreaming!” Two men slowly float her over to an oxygen tank sitting on a raft. While she recovers behind a surgical mask, another freediver takes her place and prepares to plunge even deeper.

Kitahama, a female competitor from Japan, is one of more than 130 freedivers from 31 countries who have gathered here—one mile off the coast of Kalamata, Greece, in the deep, mouthwash blue waters of Messinian Bay—for the 2011 Individual Freediving Depth World Championships, the largest competition ever held for the sport. Over the next week, in an event organized by the International Association for the Development of Apnea (AIDA), they’ll test themselves and each other to see who can swim the deepest on a single lungful of air without passing out, losing muscle control, or drowning. The winners get a medal.

How deep can they go? Nobody knows. Competitive freediving is a relatively new sport, and since the first world championships were held in 1996, records have been broken every year, sometimes every few months. Fifty years ago, scientists believed that the deepest a human could freedive was about 160 feet. Recently, freedivers have routinely doubled and tripled that mark. In 2007, Herbert Nitsch, a 41-year-old Austrian, dove more than 700 feet—assisted by a watersled on the way down and an air bladder to pull him to the surface—to claim a new world record for absolute depth. Nitsch, who didn’t compete in Greece, plans to dive 800 feet in June, deeper than two football fields are long.

Nobody has ever drowned at an organized freediving event, but enough people have died outside of competition that freediving ranks as the second-most-dangerous adventure sport, right after BASE jumping. The statistics are a bit murky: some deaths go unreported, and the numbers that are kept include people who freedive as part of other activities, like spearfishing. But one estimate of worldwide freediving-related fatalities revealed a nearly threefold increase, from 21 deaths in 2005 to 60 in 2008.

by James Nestor, Outside |  Read more:
Photo: Igor Liberti

20 Common Grammar Mistakes That (Almost) Everyone Makes


I’ve edited a monthly magazine for more than six years, and it’s a job that’s come with more frustration than reward. If there’s one thing I am grateful for — and it sure isn’t the pay — it’s that my work has allowed endless time to hone my craft to Lewis Skolnick levels of grammar geekery.

As someone who slings red ink for a living, let me tell you: grammar is an ultra-micro component in the larger picture; it lies somewhere in the final steps of the editing trail; and as such it’s an overrated quasi-irrelevancy in the creative process, perpetuated into importance primarily by bitter nerds who accumulate tweed jackets and crippling inferiority complexes. But experience has also taught me that readers, for better or worse, will approach your work with a jaundiced eye and an itch to judge. While your grammar shouldn’t be a reflection of your creative powers or writing abilities, let’s face it — it usually is.

Below are 20 common grammar mistakes I see routinely, not only in editorial queries and submissions, but in print: in HR manuals, blogs, magazines, newspapers, trade journals, and even best-selling novels. If it makes you feel any better, I’ve made each of these mistakes a hundred times, and I know some of the best authors in history have lived to see these very toadstools appear in print. Let's hope you can learn from some of their more famous mistakes.

Who and Whom

This one opens a big can of worms. “Who” is a subjective — or nominative — pronoun, along with "he," "she," "it," "we," and "they." It’s used when the pronoun acts as the subject of a clause. “Whom” is an objective pronoun, along with "him," "her," "it," "us," and "them." It’s used when the pronoun acts as the object of a clause. Using “who” or “whom” depends on whether you’re referring to the subject or object of a sentence. When in doubt, substitute “who” with the subjective pronouns “he” or “she,” e.g., Who loves you? cf., He loves me. Similarly, you can also substitute “whom” with the objective pronouns “him” or “her,” e.g., I consulted an attorney whom I met in New York. cf., I consulted him.

Which and That

This is one of the most common mistakes out there, and understandably so. “That” is a restrictive pronoun. It’s vital to the noun to which it’s referring. e.g., I don’t trust fruits and vegetables that aren’t organic. Here, I’m referring to all non-organic fruits or vegetables. In other words, I only trust fruits and vegetables that are organic. “Which” introduces a relative clause. It allows qualifiers that may not be essential. e.g., I recommend you eat only organic fruits and vegetables, which are available in area grocery stores. In this case, you don’t have to go to a specific grocery store to obtain organic fruits and vegetables. “Which” qualifies, “that” restricts. “Which” is more ambiguous, however, and by virtue of its meaning is flexible enough to be used in many restrictive clauses. e.g., The house, which is burning, is mine. e.g., The house that is burning is mine.

by Jon Gingerich, Lit Reactor |  Read more:

Thursday, February 9, 2012


Mark Davis, Icarus, 2011. Aluminum, brass, steel, acrylic and oil paint.
via:

The End of Wall Street As They Knew It

On Wall Street, the misery index is as high as it’s been since brokers were on window ledges back in 1929. But sentiments like that, accompanied by a full orchestra of the world’s tiniest violins, are only part of the conversation in Wall Street offices and trading desks. Along with the complaint is something that might be called soul-searching—which is, in itself, a surprising development. Since the crash, and especially since the occupation of Zuccotti Park last September (which does appear to have rattled a lot of nerves), there has been a growing recognition on Wall Street that the system that had provided those million-dollar bonuses was built on a highly unstable foundation. Disagreeable as it may be, goes this thinking, bankers have to go back to first principles, assess their value in the economy, and take their part in its rebuilding. No one on Wall Street liked to be scapegoated either by the Obama administration or by the Occupiers. But many acknowledge that the bubble-bust-bubble seesaw of the past decades isn’t the natural order of capitalism—and that the compensation arrangements just may have been a bit out of whack. “There’s no other industry where you could get paid so much for doing so little,” a former Lehman trader said. Paul Volcker, whose eponymous rule is at the core of the changes, echoes an idea that more bankers than you’d think would agree with. “Finance became a self-justification,” he told me recently. “They made a lot of money trading with each other with doubtful public benefit.”

The questions of how to fix Wall Street–style capitalism—from taxes to regulation—are being intensely argued and will undergird much of the economic debate during this presidential election. And many on Wall Street are still making the argument that the consequences of hobbling Wall Street could be severe. “These are sweeping secular changes taking place that won’t just impact the guys who won’t get their bonuses this year,” Bove told me. “We’ve made a decision as a nation to shrink the growth of the financial system under the theory that it won’t impact the growth of the nation’s economy.”

And yet, the complaining has settled to a low murmur. Even as bonuses have withered, Wall Street as a political issue is gaining force. Bankers are aware that populism has a foothold, even in the Republican Party, and that these forces are liable to accelerate the process already taking place. “There’s a real sense the world is changing,” says a private-equity executive with deep ties to the GOP. “People are becoming aware there’s real anger out there. It’s not just some kids camping out in some park. The Romney attacks caught everyone by surprise. We have prepared for this to come from the Democrats in the fall, but not now. You could run an entire campaign if you’re Barack Obama with ads using nothing but Republicans saying things about finance that you’d never hear two months ago. It’s an amazing thing.” (...)

Wall Street as Wal-Mart? A few years ago, the Masters of the Universe never could have imagined their industry being compared to big-box retailing. And yet, the model that had fueled bank profits has finally broken, as markets sputtered and new regulation kicked in. “Compensation is never really going to come back,” a Wall Street headhunter told me. “That is something entirely new.”

What is even more startling about this reversal is that few thought the much-vilified Dodd-Frank act would have much effect at all. From the moment it was proposed in 2009, the bill was tarred from all sides. Critics from the left, who wanted a return of Glass-Steagall, which had kept investment banks and commercial banks separate until it was repealed during the Clinton years, howled that Dodd-Frank wouldn’t go far enough to break up the too-big-to-fail banks. “Dodd-Frank was an attempt to preserve the status quo,” Harvard economist Ken Rogoff told me. The too-big-to-fail banks, for their part, argued that the 2,300-page bill would create an overly complex morass of overlapping regulators that risked killing their ability to compete against foreign rivals. “We joke that Dodd-Frank was designed to deal with too-big-to-fail but it became too-big-to-read,” said the Citigroup executive.

By the time the bill passed, in July 2010, the legislation hadn’t found many new friends. Banks were especially upset by the inclusion of the Volcker Rule, which banned proprietary trading and virtually all hedge-fund investing by banks. Banks also complained about an amendment that slashed lucrative debit-card fees. They capitulated mainly because the alternative—breaking them up—was worse.

Part of the perception that the financial crisis changed nothing is that, in the immediate wake of the crash, the banks, buoyed by bailout dollars, whipsawed back to profitability. Goldman earned a record profit of $13.4 billion in 2009, as markets roared back from their post-Lehman lows. This dead-cat bounce was central to the formation of Occupy Wall Street and the neopopulist political currents that first erupted when the Treasury Department appointed Ken Feinberg to regulate bonuses for several TARP recipients. “The statute creating my authority was populist retribution,” Feinberg told me recently. “The feeling was, if you’re going to bail everyone out with the taxpayers, it has to come with a price.”

And yet, from the moment Dodd-Frank passed, the banks’ financial results have tended to slide downward, in significant part because of measures taken in anticipation of its future effect. Since July 2010, Bank of America nosed down 42 percent, Morgan Stanley fell 25 percent, Goldman fell 21 percent, and Citigroup fell 16 percent—in a period when the Dow rose 25 percent. Partly, this is a function of the economic headwinds. But the bill’s major provisions—forcing banks to reduce leverage, imposing a ban on proprietary trading, making derivatives markets more transparent, and ending abusive debit-card practices—have taken a pickax to the Wall Street business model even though the act won’t be completely in effect till the Volcker Rule kicks in this July (other aspects of the bill took force in December; capital requirements and many other elements of the bill will be phased in gradually between now and 2016). “If you landed on Earth from Mars and looked at the banks, you’d see that these are institutions that need to build up capital and that they’re becoming lower-margin businesses,” a senior banker told me. “So that means it will be hard, nearly impossible, to sustain their size and compensation structure.” In the past year, the financial industry has laid off some 200,000 workers.

Nobody on either side would say that Dodd-Frank perfectly accomplished its aims. But while critics lament that no bank executives have gone to jail and have argued for a law that would have effectively blown up the banking system, Dodd-Frank is imposing a painful form of punishment. “Since 2008, what the financial community has done is kick the can down the road,” the senior banker added. “ ‘Let’s just buy us one more quarter and hope it gets better.’ Well, we’re now seeing cracks in that ability to continue operating with the structures that had been built up.”

by Gabriel Sherman, New York |  Read more:
Photo: Howard Schatz