Friday, October 12, 2012


玉山 / Jade Mountain, 台灣 / Taiwan.
via:

Disappearance on Mount Marathon

You’ve heard of Tough Mudder and the Spartan Race, but every Fourth of July, the town of Seward, Alaska, puts on a competition that rivals both. The Mount Marathon Race doesn’t look like much on paper: it’s short (just three miles long) and starts at sea level. But the race, which has been run as an organized event since 1915, is a beast, with a 3,022-foot vertical gain and loss over icy and treacherous mountain terrain, where racers pick their own way up—and down—the average 38-degree slope.

While injuries aren't uncommon at Mount Marathon, three unusually serious accidents marred this year's race. Iraq war veteran and Blackhawk pilot Penny Assman slid over a cliff, lacerating her liver and breaking several ribs; an Alaskan runner named Matthew Kenney fell at the same spot and suffered broken legs and brain damage. The most puzzling casualty, however, was 66-year-old Anchorage resident Michael LeMaitre, who vanished without a trace on his way up the mountain and is believed to be the first fatality in the race's history. The incidents left Seward struggling to understand how a runner could disappear on a course that's only a few miles long. We spoke separately to Karol Fink, a Mount Marathon organizer and 19-year race veteran who ran this year, and to MaryAnne LeMaitre, whose father has yet to be found.

THE MOTIVATION
The record time, set in 1981 by Bill Spencer, is 43:21. The average speed uphill is 2 mph. Downhill is 12 mph.


Karol Fink: It’s a bit mysterious—the feel of it. It’s dangerous, so you’re not only competing against other people on the mountain, but you’re competing to get across the finish line healthy and safe. I’m never going to be a professional runner, and I’m never going to be a racecar driver or a rodeo person, but this is my chance to push that edge.

What happened this year—it was quite devastating for the community. It was the talk of the town: Two people falling off the waterfall. A missing runner. People were out searching, helicopter going constantly. It consumed the energy of the town.

MaryAnne LeMaitre: My dad—he has always had an adventurous spirit: He did the Iditaski several times; he had a lot of exciting adventures on the water. He was definitely physically fit. But he hadn’t been on the mountain before. He won the lottery for getting into the race, and that was it.

THE SETTING
Mount Marathon, in the Kenai mountain range of southern Alaska, looms over the town of Seward, an active fishing port with a population of about 3,000.


Fink: Distance in this race isn’t as critical as vertical. When the competitive men come down from the top of the mountain to the bottom, they’re descending over 3,000 feet in about five minutes. They’re not running, they’re free-falling.

This year, it had rained the night before, so it made the race really slick and muddy for the juniors. Next was the women’s race, and it was a pleasant temperature, in the low 60s, with a little bit of fog blowing in and out in strips. The men’s race was also foggy, but the trail had dried some.

LeMaitre: I was told it was overcast. Clouds were moving in and out, but you could still see Seward at times. It was rainy, but it didn’t sound like it was raining all the time.

by Caty Enders, Outside Magazine |  Read more:
Photo: jseattle/Flickr

Andy Warhol, Mick Jagger, 1975
via:

Levon Helm's Midnight Rambles


Leave it to Lucinda Williams, whose astringent lyrics have proven her capable of leaving no romantic notion unexamined, to ask the question of the hour. Sitting backstage at the Izod Center, in East Rutherford, before doing her turn at “Love for Levon,” the all-star tribute concert for Levon Helm, who died last April at seventy-one, Williams confessed to some confusion about the honoree. Helm was most famous for singing and playing drums in the Band, whose golden era ended in 1976. “Everybody is asking why he was important, and I don’t know what to say, because I never thought of most of these songs as Levon’s music—they were the Band’s. Did they do all this after Rick Danko died?”

The answer, obviously, is that there was no such reaction in 1999, after Danko, the Band’s bassist and singer, died in his sleep, at the age of fifty-six. Why, then, the outpouring of feeling for Helm? Oddly, Don Was, the esteemed musician and producer who served as co-musical director for the concert, had been pondering that very question. “Maybe it comes from reaching sixty,” speculated Was. “I’ve seen it before—Kris Kristofferson, Willie Nelson. All the jokes that people make about the I.R.S. or whatever. Suddenly you turn sixty, and that’s the stuff you’ve survived.”

Helm could serve as Exhibit A in Was’s argument, for, by the time he turned sixty, he was a veritable curator of stuff to survive. There was an abundance of garden-variety career disappointments, money setbacks, and legal problems maintaining a steady background behind larger crises. In 1986, after playing some club in Florida, longtime friend and bandmate Richard Manuel left Helm’s hotel room and hanged himself in his shower. In 1991, Helm’s beloved home and recording studio in Woodstock burned to the ground in an electrical fire. In 1998, he was diagnosed with throat cancer and advised to get a laryngectomy; instead, he took twenty-seven radiation treatments that killed the tumors but robbed him of his voice. He faced each insult not only stoically—deprived of singing, he resolved to improve his drumming—but with an affability that was his signature. “I don’t care what kind of shitty mood you were in,” says Steve Berlin, the keyboard player and saxophonist for Los Lobos, “seeing Levon just brightened your day.”

Unfortunately, amiable perseverance alone is not enough, and by the end of 2003, Helm was facing foreclosure on his rebuilt home. With the help of his new manager, Barbara O’Brien, he began holding rent parties. These shindigs proved doubly useful: the earnings showed that he had regular income, which enabled him to refinance, and, more importantly, their popularity gave Helm the idea of turning the parties into a regular show. Calling it the “Midnight Ramble,” after the almost-anything-goes medicine shows that would come through Helena, Arkansas, his boyhood home, Helm tapped Larry Campbell, who had distinguished himself as the guitar-, mandolin-, and fiddle-playing sideman for Bob Dylan and others. Together with guitarist Jimmy Vivino, they built a band, one that eventually came to include Levon’s daughter, Amy, and Larry’s wife, Teresa Williams. They played songs from the Band’s repertoire, but also country, blues, gospel, Cajun, and rockabilly.

Soon, word began to circulate that Levon had a hot band that played at his house on Saturday nights, and before very long, Helm’s musical friends began sitting in: Emmylou Harris, Elvis Costello, Dr. John, Phil Lesh, Norah Jones, and many more began making the trip to Woodstock, and the audience followed. “It was one of those ‘If you build it, they will come’ things,” says Campbell. “And they did.” Standing on Helm’s outdoor deck, he pointed into the woods that front the eighteen-acre property. “On Saturday evenings, you could stand here and see the headlights stretching all the way back to the highway.” Capacity was two hundred, tops, and the audience would pack around the stage, sitting with their backs to his fireplace, or peer down from the loft space under the great peaked ceiling. To help foster the down-home feel of things, the guests, who had paid a hundred bucks apiece, were asked to bring a dish to the potluck supper that preceded every show.

“There’s a lot of bullshit that surrounds the music industry,” says Jim James of the group My Morning Jacket, who played the Ramble twice, expressing a widely held sentiment. “But Levon always conveyed a great sense of community and spirit, and put the music first.” Then the real miracle occurred: Helm’s voice returned. Gone was his strong tenor, replaced by something raspy and ornery, the voice of an old man who has something to say. In 2007, he went back into the studio and recorded an album called “Dirt Farmer,” co-produced by Campbell and Amy Helm, which won a Grammy. For an album recorded with friends and family, in the home he rebuilt after an inferno, in the studio rescued from the jaws of foreclosure, with the voice reborn after cancer, the honor must have been like gilding the lily.

by Jamie Malanowski, New Yorker |  Read more:

Why Is This Man Running For President Of The Internet?


It’s an unseasonably cold early October evening in Lincoln, Nebraska, and Reddit co-founder Alexis Ohanian is giving his elevator pitch to a flustered but rapt woman behind the counter of a fast-food joint, Runza, where he’s picking up 45 servings of the eponymous Nebraskan meat and bread dish to bring back to a party at a local sports startup. “Have you heard of Hudl?” he asks, explaining in unbroken paragraphs how the power of the internet is changing high school and college sports. The woman laughs, an is this guy for real? kind of nervous giggle. But he’s dead serious.

It’s the off-the-cuff version of a stump speech I’d seen him give a few hours earlier, to a crowd of about a hundred at the University of Nebraska-Lincoln. ("We need to be good stewards of technology.") I’d watched him tell the crowd about Geek Day, a digital march on Washington that he had set into motion the day before, and tout his site's success in fighting the anti-piracy bills SOPA and PIPA — despite the $94 million in lobbying, largely from the entertainment industry, that had pushed them to the brink of passing. He also tells a story about a trucker he met in Colorado who didn’t even know he was an “Internet Freedom” supporter until Alexis explained what that meant, and then — from the stage — he calls Nebraska Rep. Jeff Fortenberry, a Republican who supported SOPA, to ask him why (he gets voicemail). He refers to the President of the United States, without hesitation, as POTUS.

The whole time, though, he’s subtly code-switching to speak to one of the invisible constituencies he knows is present: Redditors. These, after all, are his people, his true believers. So he references bacon. He uses the word “epic.” He acknowledges memes, like Advice Animals.

When he leaves the stage, he shakes hands and poses for pictures; had a baby been there, he might’ve kissed it. Then he’s off to the next stop, a high school football game, in a tour bus that had at one time been leased by the McCain campaign and converted into the “Straight Talk Express.” Now, it’s been painted over — half red, half blue — and along with Ohanian and a few other Reddit staffers, is also carrying a small press corps (BuzzFeed included), a documentary crew, representatives from farming startup AgLocal, and a staffer at the newly formed Internet Association lobbying group. It is trailed, loudly, by an impressive car-buggy designed by open source automotive startup Local Motors.

If you didn’t know any better, you might think that Alexis Ohanian — the insistently goofy, imposingly tall, never-off 29-year-old cofounder of what is arguably the largest cohesive community on the internet — is running for office. And in fact, he kind of is — but for a position that doesn’t yet exist.

Alexis Ohanian wants to be the President of the Internet. And he’s pretty sure he knows what he needs to do to get there.

by John Herrman, BuzzFeed |  Read more:
Illustration by John Gara

Let's Start the Foodie Backlash

On a crisp autumn evening in a north London street, a rôtisserie trailer is parked outside a garden flat, green fairy lights blinking on and off, warm chickens perfuming the air. A thirtyish hipster wanders out to where I'm standing with a friend on the pavement and drawls his unimpressed judgment of what is going on inside. "I think the arancinis are not quite spicy enough," he informs us, with an eaten-it-all-before air. "Could have more flavour, not really exotic." Right now I haven't the faintest idea what "arancinis" are (or that arancini, like panini, is already an Italian plural), but I nod knowingly while typing his thoughts into my phone, and my friend keeps him talking. "I thought the Korean burger was quite good," the hipster goes on, without much kimchi-fired enthusiasm, "but I think a lot of people don't make their food with enough shbang … They kind of cater to the middle of the road." Twenty-five years ago, he could have been an indie-rock fan bemoaning the blandness of chart music. Now he's a social-smoking, foodier-than-thou critic at a "Food Rave".

The name of the Food Rave is entirely appropriate for a modern culture in which food is the last ingestible substance you can indulge in with obsessiveness without being frowned on by society. Alex James, the Blur bassist turned gentleman cheese farmer and Sun food columnist, has said: "My 20th birthday party was all about booze, my 30th birthday was about drugs, and now I realise that my 40s are about food." And he is not alone. Food replaces drugs in the gently ageing food-fancier's pantheon of pleasure, and brings along with it traces of the old pharmaceutical vocabulary. You hear talk of taking a "hit" of a dish or its sauce, as though from a spliff or bong; and a food-obsessive in hunter-gatherer mode is thrilled to "score" a few chanterelle mushrooms, as though he has had to buy them from a dodgy-looking gent on a murky Camden street corner. Food is valued for its psychotropic "rush"; Nigella Lawson refers to salted caramel as "this Class A foodstuff". Yes, food is the new drugs for former Britpoppers and the Ecstasy generation, a safer and more respectable hedonic tool, the key to a comfortingly domesticated high.

Western industrial civilisation is eating itself stupid. We are living in the Age of Food. Cookery programmes bloat the television schedules, cookbooks strain the bookshop tables, celebrity chefs hawk their own brands of weird mince pies (Heston Blumenthal) or bronze-moulded pasta (Jamie Oliver) in the supermarkets, and cooks in super-expensive restaurants from Chicago to Copenhagen are the subject of hagiographic profiles in serious magazines and newspapers. Food festivals (or, if you will, "Feastivals") are the new rock festivals, featuring thrilling live stage performances of, er, cooking. As one dumbfounded witness of a stage appearance by Jamie Oliver observed: "The girls at the front – it's an overwhelmingly female crowd – are already holding up their iPhones […] A group in front of me are saying, 'Ohmigodohmigodohmigod' on a loop […] 'I love you, Jamie,' yells a girl on the brink of fainting." The new series of The Great British Bake-Off trounced Parade's End in the ratings, and canny karaoke-contest supremo Simon Cowell is getting in on the act with a new series in development called Food, Glorious Food! – or, as it's known among production wags, The Eggs Factor.

If you can't watch cooking on TV or in front of your face, you can at least read about it. Vast swaths of the internet have been taken over by food bloggers who post photographs of what they have eaten from an edgy street stall or at an aspirational restaurant, and compose endlessly scrollable pseudo-erotic paeans to its stimulating effects. Right now, five of the 10 bestselling books on amazon.co.uk are food books, with Nigellissima outselling Fifty Shades of Grey. According to the spring 2011 Bookscan data, British sales of books in nearly all literary genres were down, except for the categories of "food and drink" (up 26.2%), followed by "religion" (up 13%). (Before 1990, the bibliographic category of "food and drink" didn't even exist.) That food and religion alone should buck the negative trend is no coincidence, for modern food books are there to answer metaphysical or "lifestyle" rather than culinary aspirations, and celebrity chefs themselves are the gurus of the age.

It is not in our day considered a sign of serious emotional derangement to announce publicly that "chocolate mousse remains the thing I feel most strongly about", or to boast that dining with celebrities on the last night of Ferran Adrià's restaurant elBulli, in Spain, "made me cry". It is, rather, the mark of a Yahoo not to be able and ready at any social gathering to converse in excruciating detail and at interminable length about food. Food is not only a safe "passion" (in the tellingly etiolated modern sense of "passion" that just means liking something a lot); it has become an obligatory one. The unexamined meal, as a pair of pioneer modern "foodies" wrote in the 1980s, is not worth eating. Most cannily, the department of philosophy at the University of North Texas announced in 2011 its "Philosophy of Food Project", no doubt having noticed which way the wind was blowing, and presumably hoping that it would be able to trick food-obsessives into hard thinking about other topics. One can of course think philosophically about food, as about anything at all, but that is not what is going on in our mainstream gastroculture.

Where will it all end? Is there any communication or entertainment or social format that has not yet been commandeered by the ravenous gastrimarge for his own gluttonous purpose? Does our cultural "food madness", as the New York Times columnist Frank Rich suggests, tip into "food psychosis"? Might it not, after all, be a good idea to worry more about what we put into our minds than what we put into our mouths?

by Steven Poole, The Guardian |  Read more:
Photo: Alamy

Thursday, October 11, 2012

The Neuroscience of Stage Fright

Public speaking is one of our most common fears, topping flying, financial ruin, sickness, and even death. The fear can get so bad that people become physically ill before getting on stage. But this fear — often called performance anxiety or stage fright — extends beyond the pressure to perform in the moment. It's about the underlying social psychology of exposing oneself to an audience. It's this vulnerability that sets off an entire cascade of physiological processes throughout the body in a defense mechanism that at one time served an important evolutionary purpose.

Understanding the science of stage fright can also help ease the fear.

A common fear

Back in 2007, I gave a talk at a futurist conference in Chicago that featured such speakers as Ray Kurzweil, William Shatner, and Peter Diamandis of XPrize fame. If this wasn't pressure enough, the day before my presentation I learned that one of my longtime heroes, cognitive scientist Marvin Minsky, was going to be in the audience. It was at this point that my body started to rebel against me; I broke out into a nasty rash, began vomiting, and contracted a rather unpleasant case of diarrhea. The next day, I stood on the stage and desperately fought back the urge to panic, delivering a presentation that was stilted, awkward, and completely uninspiring.

Sadly, this was typical for me back then. But this experience finally made me snap out of my denial: I have stage fright — and I have it bad. And I am hardly alone.

Celebrities with stage fright include Rod Stewart, Barbra Streisand, Mel Gibson, and Carol Burnett (who reportedly threw up before many of her performances). Many prominent athletes tend to suffer from it as well, including nervous hockey goalies and boxers who just can't seem to perform when everything's on the line.

Generalized anxiety

Stage fright is an emotional and physical response that is triggered in some people when they need to perform in front of an audience — or even an anticipated or perceived audience (such as standing in front of a camera).

While feelings of stress and anxiety are present during the actual performances themselves, individuals with stage fright often start to experience its effects days or weeks in advance (something that was particularly bad in my case). Consequently, stage fright is more than just a fear that's elicited during a performance — it's also very much about the lead-up. And in fact, for some, the performance itself can be a kind of cathartic release from the tension. (...)

Like most phobias, stage fright is a perfectly normal and even natural response to situations that are perceived to be dangerous or somehow detrimental. Psychologists who work with stage fright patients describe how their inner chatter tends to focus on those things that could go wrong during the performance and in the immediate aftermath of a potential failure. For people who have it quite bad, this can amount to a kind of neuroticism in which fears are exaggerated completely out of context — what psychologists call chronic catastrophizing.

And in fact, studies have shown that these fears can be driven by any number of personality traits, including perfectionism, an ongoing desire for personal control, fear of failure and success, and an intense anxiety about not being able to perform properly when the time comes (which can often serve as a self-fulfilling prophecy). Psychologists have also observed that people with stage fright tend to place a high value on being liked and regarded with high esteem.

Moreover, during the performance itself, individuals with stage fright tend to form a mental representation of their external appearance and behavior as they presume it's being seen by the audience. Consequently, they turn their focus onto themselves and interpret the audience's attention as a perceived threat.

by George Dvorsky, io9 |  Read more:
Photo: Clover/Shutterstock.com.

Frogs


[ed. Mo Yan, winner of the 2012 Nobel Prize for Literature.]

...Aunty said she staggered out of the restaurant, headed to the hospital dormitory, but wound up in a marshy area on a narrow, winding path bordered on both sides by head-high reeds. Moonlight reflected on the water around her shimmered like glass. The croaks of toads and frogs sounded first on one side and then on the other, back and forth, like an antiphonal chorus. Then the croaks came at her from all sides at the same time, waves and waves of them merging to fill the sky. Suddenly, there was total silence, broken only by the chirping of insects. Aunty said that in all her years as a medical provider, traveling up and down remote paths late at night, she’d never once felt afraid. But that night she was terror-stricken. The croaking of frogs is often described in terms of drumbeats. But that night it sounded to her like human cries, almost as if thousands of newborn infants were crying. That had always been one of her favorite sounds, she said. For an obstetrician, no sound in the world approaches the soul-stirring music of a newborn baby’s cries. The liquor she’d drunk that night, she said, left her body as cold sweat. ‘Don’t assume I was drunk and hallucinating, because as soon as the liquor oozed out through my pores, leaving me with a slight headache, my mind was clear.’ As she walked down the muddy path, all she wanted was to escape that croaking. But how? No matter how hard she tried to get away, the chilling croak – croak – croak – sounds of aggrieved crying ensnared her from all sides. She tried to run, but couldn’t; the gummy surface of the path stuck to the soles of her shoes, and it was a chore even to lift up a foot, snapping the silvery threads that held her shoes to the surface of the path. But as soon as she stepped down, more threads were formed. So she took off her shoes to walk in her bare feet, but that actually increased the grip of the mud. Aunty said she got down on her hands and knees, like an enormous frog, and began to crawl. Now the mud stuck to her knees and calves and hands, but she didn’t care, she just kept crawling. It was at that moment, she said, when an incalculable number of frogs hopped out of the dense curtain of reeds and from lily pads that shimmered in the moonlight. Some were jade green, others were golden yellow; some were as big as an electric iron, others as small as dates. The eyes of some were like nuggets of gold, those of others, red beans. They came upon her like ocean waves, enshrouding her with their angry croaks, and it felt as if all those mouths were pecking at her skin, that they had grown nails to scrape her skin. When they hopped onto her back, her neck, and her head, their weight sent her sprawling onto the muddy path. Her greatest fear, she said, came not from the constant pecking and scratching, but from the disgusting, unbearable sensation of their cold, slimy skin brushing against hers. ‘They covered me with urine, or maybe it was semen.’ She said she was suddenly reminded of a legend her grandmother had told her about a seducing frog: a maiden cooling herself on a riverbank one night fell asleep and dreamed of a liaison with a young man dressed in green. When she awoke she was pregnant and eventually gave birth to a nest of frogs.
Given an explosion of energy by that terrifying image, she jumped to her feet and shed the frogs on her body like mud clods. But not all – some clung to her clothes and to her hair; two even hung by their mouths from the lobes of her ears, a pair of horrific earrings. As she took off running, she sensed that somehow the mud was losing its sucking power, and as she ran she shook her body and tore at her clothes and skin with both hands. She shrieked each time she caught one of the frogs, which she flung away. The two attached to her ears like suckling infants took some of the skin with them when she pulled them off.

by Mo Yan, Granta | Read more:
Photo: Polarjez

Felice Casorati. Italian (1883 - 1963)
via:

Marcin Maciejowski. Clothes, 2009. Oil on canvas, 140 x 160 cm.
via:

First in Line


They fill the sidewalks with tents and sleeping bags, transforming once pristine city blocks with their very presence, sharing thermoses of coffee and small hot meals.

They don’t care about the evening chill, or the stares of passersby, or the police. And the police don’t care about them. Because on that bright morning when the Apple store opens, they’ll roll up their blankets, strike their tents, and go home with a shiny new iPhone 5, as happy as clams and just as stupid.

To liberals of the 90s, Bill Gates was the symbol of both wealth and malevolence incarnate. Not only was he the richest man in the world, but his monolithic and monopolistic enterprise was based on a mediocre product with built-in buggy obsolescence. He didn’t innovate; instead he partnered with IBM, purchased DOS, and then exploited both. And through ruthless business savvy, the narrative goes, Microsoft strong-armed the market despite a middling product, terrible customer service, and relentless cost cutting.

But one man, one company, made a career (and cult) out of this “critique” of Bill Gates. (...)

Apple surpassed Microsoft as the most valuable tech company in 2010, but Jobs had long before eclipsed Bill Gates in the consumer’s CEO-aspirant imaginary. Benevolent Jobs, who died merely the 42nd wealthiest American, was worshipped by liberals with the same intensity that Gates was hated.  (...)


What’s really going on in these ads? It’s not exactly the classic hip/square dichotomy: John Hodgman is funny and charismatic, and there is some amount of mutual respect here. Although Hodgman is clearly a square, ‘Mac’ is not primarily a cool guy who rejects Hodgman’s identity. Instead, ‘Mac’ is unshaven, informally dressed, kinda average. The difference is not between the square who sells out and the cool guy who opts out, but rather the technocratic office worker and the precarious creative. Mac admits that PC is good at getting business done, but business is boring, and he’d rather be drinking a latte at the co-work space he shares with an industrial designer and a start-up architect. Wouldn’t you?

The ascension myth of Jobs over Gates and of Apple over Microsoft is a spectacular reflection and reenactment of the rise of post-Fordist precarious labor over the sort of middle class white-collardom historicized by C. Wright Mills. Gates was a man who rationalized the computer business. He ensured his software was packaged with outside manufacturers, who would then do the messy work of race-to-the-bottom competition for him, but would all carry the same (and same priced) Microsoft OS.

Microsoft made money on every shitty Dell or pimped out IBM, requiring almost no hardware overhead of their own. Once Windows had achieved a certain level of dominance it was impossible to make a cheap computer without either it or a high level of technical savvy (e.g., Linux). Microsoft won, as Žižek put it in a London Review of Books essay, when it had “imposed itself as an almost universal standard, practically monopolizing the field, as one embodiment of what Marx called the ‘general intellect,’ meaning collective knowledge in all its forms … Gates effectively privatized part of the general intellect.”

Since Apple couldn’t fight on the grounds of cost, it would compete on ease-of-use and, more fundamentally, “lifestyle”: it would fight for its patch of general intellect. Rather than infinite customizability, Apple would come bundled with iTunes, iMovie and GarageBand. Apple made the hardware, the software, and ultimately even the store you bought it in. Microsoft sold a product that was ubiquitous and fundamental, but Apple sold a whole line of products, an experience, a way of life. The narrative is well rehearsed.

Microsoft predominantly (and unabashedly) produces work tools: the Microsoft Office Suite remains its flagship product. The PC has always been clunky when it comes to media production and consumption (though less so than Apple and its legions want you to believe), and the graceful handling of these functions is what sets Apple apart. Of course, these “creative” fields of production are just as much work tools as Office; it’s just that your work is fun. You make music! You make movies! You’re not a slave!

And though gamers have always used PCs, they did so because you could upgrade video cards and processing speed and power as you needed — keeping up with the latest generation of games without replacing your computer — so that gamers ended up with machines whose internal functioning little resembled their office counterparts. A Mac, however, is a Mac, its functions largely black-box and proprietary. You don’t hack it and you don’t upgrade it, you just buy a new one.

Of course, it is in design and packaging, not computing, that Apple has really excelled. Other than its innovations in touch-screen technology and battery life (significant but outsourced achievements), Apple has offered little in the way of technical invention. What Apple does best is user interface and visual design, which, if you’re feeling generous, you can call a kind of beautiful craft: sewing the glove to exactly fit the hand while also grabbing the eye. But design, especially when it comes to the mass-produced consumer object, is really just the arty end of the marketing spectrum.

by Willie Osterweil, Jacobin |  Read more:

Everyone Eats There

I left Los Angeles at 4 in the morning, long before first light, and made it to Bakersfield — the land of oil derricks, lowriders and truck stops with Punjabi food — by 6. Ten minutes later, I was in the land of carrots.

You know that huge pile of cello-wrapped carrots in your supermarket? Now imagine that the pile filled the entire supermarket. That’s how many carrots I saw upon my arrival at Bolthouse Farms. Something like 50 industrial trucks were filled to the top with carrots, all ready for processing. Bolthouse, along with another large producer, supplies an estimated 85 percent of the carrots eaten by Americans. There are many ways to put this in perspective, and they’re all pretty mind-blowing: Bolthouse processes six million pounds of carrots a day. If you took its yield from one week and stacked each carrot from end to end, you could circle the earth. If you took all the carrots the company grows in a year, they would double the weight of the Empire State Building.

At Bolthouse’s complex, carrots whirl around on conveyor belts at up to 50 miles an hour en route to their future as juliennes, coins and stubs, or baby carrots, which the company popularized and which aren’t babies. Other carrots become freezer fare, concentrate, salad dressings and beverages. Fiber is separated for tomato sauce and hot dogs. Whatever’s left becomes cattle feed.

Bolthouse is just one of the many massive operations of California’s expansive Central Valley, which is really two valleys: the San Joaquin to the south and Sacramento to the north. All told, the Central Valley is about 450 miles long, from Bakersfield up to Redding, and is 60 miles at its widest, between the Sierra Nevada to the east and the Coast Ranges to the west. It’s larger than nine different states, but size is only one of its defining characteristics: the valley is the world’s largest patch of Class 1 soil, the best there is. The 25-degree (or so) temperature swing from day to night is an ideal growing range for plants. The sun shines nearly 300 days a year. The eastern half of the valley (and the western, to some extent) uses ice melt from the Sierra as its water source, which means it doesn’t have the same drought and flood problems as the Midwest. The winters are cool, which offers a whole different growing season for plants that cannot take the summer heat. There’s no snow.

The valley became widely known in the 1920s and 1930s, when farmers arrived from Virginia or Armenia or Italy or (like Tom Joad) Oklahoma and wrote home about the clean air, plentiful water and cheap land. Now the valley yields a third of all the produce grown in the United States. Unlike the Midwest, which concentrates (devastatingly) on corn and soybeans, more than 230 crops are grown in the valley, including those indigenous to South Asia, Southeast Asia and Mexico, some of which have no names in English. At another large farm, I saw melons, lettuce, asparagus, cabbage, broccoli, chard, collards, prickly pears, almonds, pistachios, grapes and more tomatoes than anyone could conceive of in one place. (The valley is the largest supplier of canned tomatoes in the world too.) Whether you’re in Modesto or Montpelier, there’s a good chance that the produce you’re eating came from the valley.

I came to the valley both by choice and by mandate. In preparation for the magazine’s Food and Drink Issue, I asked readers to suggest my assignment. They could send me anywhere they wanted, within limitations of climate and jet lag. After reviewing the suggestions, it became clear that readers wanted an article that incorporated big farming, small farming, sustainability, politics, poverty and, of course, truly delicious food — and in the United States, if possible. So I decided to head to the Central Valley, where all of this was already happening. This also happened to satisfy a curiosity of mine. From a desk in New York, it’s impossible to fathom 50 m.p.h. carrots, hills of almonds, acres of basil and millions of tomatoes all ripening at once. How can all of this possibly work?

But I was also inclined to head to the valley because I know that, for the last century or so, we’ve been exploiting — almost without limitation — its water, mineral resources, land, air, people and animals. Mark Arax, a writer who lives in Fresno and has chronicled the region’s past and present, offered his opinion while serving me and a dozen others marinated lamb, a terrific recipe from his Armenian family: “This land and its water have gone mostly to the proposition of making a few men very wealthy and consigning generations of others, especially farmworkers, to lives in the dust.” I’d already seen an example of how wealth has been concentrated and captured in the valley: this summer, Campbell’s bought Bolthouse Farms for $1.55 billion. Meanwhile, there are thousands of valley farmworkers who are often victims of wage theft and (illegally) required to supply their own tools.

So for five days I drove through the southern half of the valley. I wanted to learn as much as I could about the agriculture in America’s produce factory; where thoughtful farmers were leading it; and how — if at all — it might become sustainable.

by Mark Bittman, NY Times |  Read more:
Photo: Vincent Laforet

Wednesday, October 10, 2012

Un Homme Qui Dort (1974)


The title of Bernard Queysanne and Georges Perec's haunting 1974 masterpiece, Un homme qui dort (based on Perec's novel of the same name), translates roughly to "A Man Asleep", and that is an accurate description of the sole character in this beautifully fractured tale of alienation and isolation. The character is a man 25 years of age (Jacques Spiesser), living alone in a cramped, lonesome Parisian apartment. We never learn his name, and we never hear him utter even a single word of dialogue. We simply view him going about various chores and activities that seem to take on a ritualistic importance: making a cup of Nescafe, reading, brushing his teeth, playing solitaire, attempting to fall asleep, piling dirty laundry into a basin of murky water, avoiding contact with friends and family so that a pile of crumpled letters accumulates near his door, and on occasion, trudging outdoors for a trip to the cinema or the diner. The film's opening segments contrast this unnamed man indulging in these repetitive routines with exterior shots of inhabitants in the city systematically going through the motions of everyday life, and it becomes quickly clear that we are observing a human all but completely removed from the rhythms of society, marching to the beat of his own drum. The only words spoken in the film come from a female voice-over, who reads strikingly poetic passages from Perec's novel that convey the various emotional turmoils and anxieties felt by the nameless protagonist, as the character continues to avoid all contact with family and friends, and interaction with society in general as he slips further and further into this solipsistic void.

Un homme qui dort is shot in gorgeous black and white by cinematographer Bernard Zitzerman, and the film more than once recalls the work of Alain Resnais, with its classy compositions and gliding camera and enigmatic voice-over, and especially with its use of high-contrast black and white in the latter part of the film, which is used to further give the outdoor scenes an alien quality, so foreign does the nameless hero feel walking the streets of his very own block. It is also fascinating how the camera methodically pushes in and out on the man in moments of contemplation, as though he were a specimen of loneliness under some giant existential microscope, whose very existence is on the verge of dissipating at any moment. The music in the film is sparse but used effectively, alternating between a high-pitched ambient tone that crescendos arbitrarily without warning, and an urgent clicking gallop, punctuated by harsh bangs on a piano. This disconcerting and distressing soundtrack only heightens the overwhelming sense of angst and disquietude that accompanies the continuous shots of the young man and his vacant, lifeless stare, as he embarks on one lonesome, meaningless endeavor after the next.

There is no traditional narrative here, no backstory, no indications as to what could have possibly gone wrong in this person's life, or if anything ever went wrong at all. There is only the shell of a man, withdrawn, cut off, sitting around and waiting until there's nothing left to wait for. By the end of the movie, the character is indulging himself in various delusions and launching into venomous, misanthropic speeches comparing humans to monsters, before a final bleak voice-over seems to suggest that nothing has been learned here, and that the character may never find peace, may never find a compromise, a possible means of actually living his life, as opposed to sleepwalking through it, as long as he gives himself over to these conditions. Un homme qui dort is a powerful experience for anyone who's ever felt like cutting themselves off from the world completely, for anyone who's just wanted to totally disappear from everything. It's a terrifying yet beautiful glimpse into a sad, sick life not led, and a piercing call to arms against neutrality and indifference. To disappear from the world is not difficult; to disappear from yourself is an entirely different matter, and this is a film that recognizes that with a deep, aching conviction.

Does Biology Make Us Liars?

Aristotle was a cynic. Sure, the Bible exhorts us to “Love thy neighbor as thyself,” but he knew better. “The friendly feelings that we bear for another,” instructed his Ethics, “have arisen from the friendly feelings that we bear for ourselves.”

Two thousand years later, in 1739, Hume spelled out what the pagan thinker intuited: “I learn to do service to another, without bearing him any real kindness; because I foresee, that he will return my service, in expectation of another of the same kind.” Hume’s Edinburgh neighbor, Adam Smith, penned an often quoted phrase in this vein in The Wealth of Nations: “It is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own interest. Nobody but a beggar chooses to depend chiefly on the benevolence of his fellow citizens.”

Self-love makes the world go round. But, alongside cooperation, could self-love give birth to deception? Could the imperative of self-regard be so great, in fact, as to lead to self-deceit? In his new book, Robert Trivers, a master of evolutionary thought, roams from stick insects and brain magnets to plane crashes and Israeli-Palestinian wars in service of a corollary to Aristotle’s hard-boiled thesis. We humans deceive ourselves, Trivers argues. We do so often, and almost always the better to deceive others for our own personal gain. From misguided estimates of self-worth to false historical narratives of nations, the self-love that spins the world is itself fueled by self-deceit. And the price can be substantial.  (...)

Deception is rife in humans for the same reason it is in nature: there are inbuilt clashes of interest, whether it be sexual strategy when it comes to females and males, parental investment when it comes to mothers and fathers, or resource allocation when it comes to parents and offspring. An expert in detecting conflict where others see harmony, Trivers worked out the evolutionary logic behind such relationships in the early 1970s, spawning entire fields in behavioral studies and genetics and giving rise to a number of predictions. One of the starkest of these was the idea that because fathers and mothers have different interests when it comes to the fetus (dad wants the baby bigger than mom does), identical genes on the chromosomes that they have each bequeathed will battle each other over control of embryonic growth. Sure enough, in the 1980s, biologists began to discover genes whose expression levels depended on from which parent they had come. And the gene knows where it came from, following the basic logic of genetic conflict Trivers described years before genomic imprinting was discovered.

Deception, to be truthful, is less of a mind-twister than self-deceit. Like Hume and Smith before him, Trivers understood that giving could serve one’s interests if the rewards of cooperation outweighed its costs. Using the logic of game theory, he showed that the principle of “you scratch my back and I’ll scratch yours” made evolutionary sense. Soon “reciprocal altruism” helped explain otherwise beguiling sacrificial behavior. But benevolence requires a strong sense of justice because a sense of justice is necessary to appreciate dishonesty: after all, in games of trust, especially with lag time, cheaters can wreak havoc. And so, over evolutionary time, an arms race honed in social mammals a growing intelligence. Trivers finds it ironic that “dishonesty has often been the file against which intellectual tools for truth have been sharpened.” But one of the outcomes of this Darwinian dynamic may have also been a genuine instinct for fairness, born of the need to distinguish trustworthy partners from charlatans.

But if evolution has done such a grand job of fine-tuning our senses in the service of detecting deceit, why does all the hard-won information that we extract from the world through our senses often become muddled and deformed in our brains? Why do we project our own traits onto others, repress true memories and invent false ones, lie to ourselves, rationalize immoral behavior, and generally deny inconvenient truths? Seventy percent of people rank themselves better-looking than average, according to a study cited by Trivers; 94 percent of academics (shocking!) think they are better than average, too. Why is this? The answer, Trivers would have us believe, is that the possibility of deceit raises the probability of ever more subtle mechanisms for spotting deceit, which in turn raises the probability of mechanisms for self-deceit. Trick yourself to trick another: what better way to conceal the truth? Self-deception is not a defensive measure meant only to make us feel better; it is a weapon instilled in us by natural selection to help deceive others for our own good.

by Oren Harman, TNR |  Read more:
Photo: Evan-Amos/Creative Commons

Dan Bern



[ed. I remember a hard freeze one year with hardly a wisp of snow, right through December -- the ice nearly three feet thick. I took my truck out on Big Lake, spinning brodies for at least a quarter of a mile. Without snow covering the ice you could see air bubbles like this everywhere, in all kinds of frozen, fantastic patterns.]

Photographs of water & ice by Douglas Capron

Melissa Mitchell, Kliff Hopson


[ed. Feeling a little homesick today.]