Friday, October 12, 2012

Why Is This Man Running For President Of The Internet?


It’s an unseasonably cold early October evening in Lincoln, Nebraska, and Reddit co-founder Alexis Ohanian is giving his elevator pitch to a flustered but rapt woman behind the counter of a fast-food joint, Runza, where he’s picking up 45 servings of the eponymous Nebraskan meat and bread dish to bring back to a party at a local sports startup. “Have you heard of Hudl?” he asks, explaining in unbroken paragraphs how the power of the internet is changing high school and college sports. The woman laughs, an is this guy for real? kind of nervous giggle. But he’s dead serious.

It’s the off-the-cuff version of a stump speech I’d seen him give a few hours earlier, to a crowd of about a hundred at the University of Nebraska-Lincoln. ("We need to be good stewards of technology.") I’d watched him tell the crowd about Geek Day, a digital march on Washington that he had set into motion the day before, and tout his site's success in fighting the anti-piracy bills SOPA and PIPA — including the $94 million in lobbying, largely from the entertainment industry, that had pushed them to the brink of passing. He also tells a story about a trucker he met in Colorado who didn’t even know he was an “Internet Freedom” supporter until Alexis explained what that meant, and then — from the stage — he calls Nebraska Rep. Jeff Fortenberry, a Republican who supported SOPA, to ask him why (he gets voicemail). He refers to the President of the United States, without hesitation, as POTUS.

The whole time, though, he’s subtly code-switching to speak to one of the invisible constituencies he knows is present: Redditors. These, after all, are his people, his true believers. So he references bacon. He uses the word “epic.” He acknowledges memes, like Advice Animals.

When he leaves the stage, he shakes hands and poses for pictures; had a baby been there, he might’ve kissed it. Then he’s off to the next stop, a high school football game, in a tour bus that had at one time been leased by the McCain campaign and converted into the “Straight Talk Express.” Now it’s been painted over — half red, half blue — and, along with Ohanian and a few other Reddit staffers, it carries a small press corps (BuzzFeed included), a documentary crew, representatives from farming startup AgLocal, and a staffer at the newly formed Internet Association lobbying group. It is trailed, loudly, by an impressive car-buggy designed by open-source automotive startup Local Motors.

If you didn’t know any better, you might think that Alexis Ohanian — the insistently goofy, imposingly tall, never-off 29-year-old cofounder of what is arguably the largest cohesive community on the internet — is running for office. And in fact, he kind of is — but for a position that doesn’t yet exist.

Alexis Ohanian wants to be the President of the Internet. And he’s pretty sure he knows what he needs to do to get there.

by John Herrman, BuzzFeed |  Read more:
Illustration by John Gara

Let's Start the Foodie Backlash

On a crisp autumn evening in a north London street, a rôtisserie trailer is parked outside a garden flat, green fairy lights blinking on and off, warm chickens perfuming the air. A thirtyish hipster wanders out to where I'm standing with a friend on the pavement and drawls his unimpressed judgment of what is going on inside. "I think the arancinis are not quite spicy enough," he informs us, with an eaten-it-all-before air. "Could have more flavour, not really exotic." Right now I haven't the faintest idea what "arancinis" are (or that arancini, like panini, is already an Italian plural), but I nod knowingly while typing his thoughts into my phone, and my friend keeps him talking. "I thought the Korean burger was quite good," the hipster goes on, without much kimchi-fired enthusiasm, "but I think a lot of people don't make their food with enough shbang … They kind of cater to the middle of the road." Twenty-five years ago, he could have been an indie-rock fan bemoaning the blandness of chart music. Now he's a social-smoking, foodier-than-thou critic at a "Food Rave".

The name of the Food Rave is entirely appropriate for a modern culture in which food is the last ingestible substance you can indulge in with obsessiveness without being frowned on by society. Alex James, the Blur bassist turned gentleman cheese farmer and Sun food columnist, has said: "My 20th birthday party was all about booze, my 30th birthday was about drugs, and now I realise that my 40s are about food." And he is not alone. Food replaces drugs in the gently ageing food-fancier's pantheon of pleasure, and brings along with it traces of the old pharmaceutical vocabulary. You hear talk of taking a "hit" of a dish or its sauce, as though from a spliff or bong; and a food-obsessive in hunter-gatherer mode is thrilled to "score" a few chanterelle mushrooms, as though he has had to buy them from a dodgy-looking gent on a murky Camden street corner. Food is valued for its psychotropic "rush"; Nigella Lawson refers to salted caramel as "this Class A foodstuff". Yes, food is the new drugs for former Britpoppers and the Ecstasy generation, a safer and more respectable hedonic tool, the key to a comfortingly domesticated high.

Western industrial civilisation is eating itself stupid. We are living in the Age of Food. Cookery programmes bloat the television schedules, cookbooks strain the bookshop tables, celebrity chefs hawk their own brands of weird mince pies (Heston Blumenthal) or bronze-moulded pasta (Jamie Oliver) in the supermarkets, and cooks in super-expensive restaurants from Chicago to Copenhagen are the subject of hagiographic profiles in serious magazines and newspapers. Food festivals (or, if you will, "Feastivals") are the new rock festivals, featuring thrilling live stage performances of, er, cooking. As one dumbfounded witness of a stage appearance by Jamie Oliver observed: "The girls at the front – it's an overwhelmingly female crowd – are already holding up their iPhones […] A group in front of me are saying, 'Ohmigodohmigodohmigod' on a loop […] 'I love you, Jamie,' yells a girl on the brink of fainting." The new series of The Great British Bake-Off trounced Parade's End in the ratings, and canny karaoke-contest supremo Simon Cowell is getting in on the act with a new series in development called Food, Glorious Food! – or, as it's known among production wags, The Eggs Factor.

If you can't watch cooking on TV or in front of your face, you can at least read about it. Vast swaths of the internet have been taken over by food bloggers who post photographs of what they have eaten from an edgy street stall or at an aspirational restaurant, and compose endlessly scrollable pseudo-erotic paeans to its stimulating effects. Right now, five of the 10 bestselling books on amazon.co.uk are food books, with Nigellissima outselling Fifty Shades of Grey. According to the spring 2011 Bookscan data, British sales of books in nearly all literary genres were down, except for the categories of "food and drink" (up 26.2%), followed by "religion" (up 13%). (Before 1990, the bibliographic category of "food and drink" didn't even exist.) That food and religion alone should buck the negative trend is no coincidence, for modern food books are there to answer metaphysical or "lifestyle" rather than culinary aspirations, and celebrity chefs themselves are the gurus of the age.

It is not in our day considered a sign of serious emotional derangement to announce publicly that "chocolate mousse remains the thing I feel most strongly about", or to boast that dining with celebrities on the last night of Ferran Adrià's restaurant elBulli, in Spain, "made me cry". It is, rather, the mark of a Yahoo not to be able and ready at any social gathering to converse in excruciating detail and at interminable length about food. Food is not only a safe "passion" (in the tellingly etiolated modern sense of "passion" that just means liking something a lot); it has become an obligatory one. The unexamined meal, as a pair of pioneer modern "foodies" wrote in the 1980s, is not worth eating. Most cannily, the department of philosophy at the University of North Texas announced in 2011 its "Philosophy of Food Project", no doubt having noticed which way the wind was blowing, and presumably hoping that it would be able to trick food-obsessives into hard thinking about other topics. One can of course think philosophically about food, as about anything at all, but that is not what is going on in our mainstream gastroculture.

Where will it all end? Is there any communication or entertainment or social format that has not yet been commandeered by the ravenous gastrimarge for his own gluttonous purpose? Does our cultural "food madness", as the New York Times columnist Frank Rich suggests, tip into "food psychosis"? Might it not, after all, be a good idea to worry more about what we put into our minds than what we put into our mouths?

by Steven Poole, The Guardian |  Read more:
Photo: Alamy

Thursday, October 11, 2012

The Neuroscience of Stage Fright

Public speaking is one of our most common fears, topping flying, financial ruin, sickness, and even death. The fear can get so bad that people become physically ill before getting on stage. But this fear — often called performance anxiety or stage fright — extends beyond the pressure to perform in the moment. It's about the underlying social psychology of exposing oneself to an audience. It's this vulnerability that sets off an entire cascade of physiological processes throughout the body in a defense mechanism that at one time served an important evolutionary purpose.

Understanding the science of stage fright can also help ease the fear.

A common fear

Back in 2007, I gave a talk at a futurist conference in Chicago that featured such speakers as Ray Kurzweil, William Shatner, and Peter Diamandis of XPrize fame. If this wasn't pressure enough, the day before my presentation I learned that one of my longtime heroes, cognitive scientist Marvin Minsky, was going to be in the audience. It was at this point that my body started to rebel against me; I broke out into a nasty rash, began vomiting, and contracted a rather unpleasant case of diarrhea. The next day, I stood on the stage and desperately fought back the urge to panic, delivering a presentation that was stilted, awkward, and completely uninspiring.

Sadly, this was typical for me back then. But this experience finally made me snap out of my denial: I have stage fright — and I have it bad. And I am hardly alone.

Celebrities with stage fright include Rod Stewart, Barbra Streisand, Mel Gibson, and Carol Burnett (who reportedly threw up before many of her performances). Many prominent athletes tend to suffer from it as well, including nervous hockey goalies and boxers who just can't seem to perform when everything's on the line.

Generalized anxiety

Stage fright is an emotional and physical response that is triggered in some people when they need to perform in front of an audience — or even an anticipated or perceived audience (such as standing in front of a camera).

While feelings of stress and anxiety are present during the actual performances themselves, individuals with stage fright often start to experience its effects days or weeks in advance (something that was particularly bad in my case). Consequently, stage fright is more than just a fear that's elicited during a performance — it's also very much about the lead-up. And in fact, for some, the performance itself can be a kind of cathartic release from the tension. (...)

Like most phobias, stage fright is a perfectly normal and even natural response to situations that are perceived to be dangerous or somehow detrimental. Psychologists who work with stage fright patients describe how their inner chatter tends to focus on those things that could go wrong during the performance and in the immediate aftermath of a potential failure. For people who have it quite bad, this can amount to a kind of neuroticism in which fears are exaggerated completely out of context — what psychologists call chronic catastrophizing.

And in fact, studies have shown that these fears can be driven by any number of personality traits, including perfectionism, an ongoing desire for personal control, fear of failure and success, and an intense anxiety about not being able to perform properly when the time comes (which can often serve as a self-fulfilling prophecy). Psychologists have also observed that people with stage fright tend to place a high value on being liked and regarded with high esteem.

Moreover, during the performance itself, individuals with stage fright tend to form a mental representation of their external appearance and behavior as they presume it's being seen by the audience. Consequently, they turn their focus onto themselves and interpret the audience's attention as a perceived threat.

by George Dvorsky, io9 |  Read more:
Photo: Clover/Shutterstock.com.

Frogs


[ed. Mo Yan, winner of the 2012 Nobel Prize for Literature.]

...Aunty said she staggered out of the restaurant, headed to the hospital dormitory, but wound up in a marshy area on a narrow, winding path bordered on both sides by head-high reeds. Moonlight reflected on the water around her shimmered like glass. The croaks of toads and frogs sounded first on one side and then on the other, back and forth, like an antiphonal chorus. Then the croaks came at her from all sides at the same time, waves and waves of them merging to fill the sky. Suddenly, there was total silence, broken only by the chirping of insects. Aunty said that in all her years as a medical provider, traveling up and down remote paths late at night, she’d never once felt afraid. But that night she was terror-stricken. The croaking of frogs is often described in terms of drumbeats. But that night it sounded to her like human cries, almost as if thousands of newborn infants were crying. That had always been one of her favorite sounds, she said. For an obstetrician, no sound in the world approaches the soul-stirring music of a newborn baby’s cries. The liquor she’d drunk that night, she said, left her body as cold sweat. ‘Don’t assume I was drunk and hallucinating, because as soon as the liquor oozed out through my pores, leaving me with a slight headache, my mind was clear.’ As she walked down the muddy path, all she wanted was to escape that croaking. But how? No matter how hard she tried to get away, the chilling croak – croak – croak – sounds of aggrieved crying ensnared her from all sides. She tried to run, but couldn’t; the gummy surface of the path stuck to the soles of her shoes, and it was a chore even to lift up a foot, snapping the silvery threads that held her shoes to the surface of the path. But as soon as she stepped down, more threads were formed. So she took off her shoes to walk in her bare feet, but that actually increased the grip of the mud. Aunty said she got down on her hands and knees, like an enormous frog, and began to crawl. Now the mud stuck to her knees and calves and hands, but she didn’t care, she just kept crawling. It was at that moment, she said, when an incalculable number of frogs hopped out of the dense curtain of reeds and from lily pads that shimmered in the moonlight. Some were jade green, others were golden yellow; some were as big as an electric iron, others as small as dates. The eyes of some were like nuggets of gold, those of others, red beans. They came upon her like ocean waves, enshrouding her with their angry croaks, and it felt as if all those mouths were pecking at her skin, that they had grown nails to scrape her skin. When they hopped onto her back, her neck, and her head, their weight sent her sprawling onto the muddy path. Her greatest fear, she said, came not from the constant pecking and scratching, but from the disgusting, unbearable sensation of their cold, slimy skin brushing against hers. ‘They covered me with urine, or maybe it was semen.’ She said she was suddenly reminded of a legend her grandmother had told her about a seducing frog: a maiden cooling herself on a riverbank one night fell asleep and dreamed of a liaison with a young man dressed in green. When she awoke she was pregnant and eventually gave birth to a nest of frogs.
Given an explosion of energy by that terrifying image, she jumped to her feet and shed the frogs on her body like mud clods. But not all – some clung to her clothes and to her hair; two even hung by their mouths from the lobes of her ears, a pair of horrific earrings. As she took off running, she sensed that somehow the mud was losing its sucking power, and as she ran she shook her body and tore at her clothes and skin with both hands. She shrieked each time she caught one of the frogs, which she flung away. The two attached to her ears like suckling infants took some of the skin with them when she pulled them off.

by Mo Yan, Granta | Read more:
Photo: Polarjez

Felice Casorati. Italian (1883 - 1963)
via:

Marcin Maciejowski. Clothes, 2009. Oil on canvas, 140 x 160 cm.
via:

First in Line


They fill the sidewalks with tents and sleeping bags, transforming once pristine city blocks with their very presence, sharing thermoses of coffee and small hot meals.

They don’t care about the evening chill, or the stares of passersby, or the police. And the police don’t care about them. Because on that bright morning when the Apple store opens, they’ll roll up their blankets, strike their tents, and go home with a shiny new iPhone 5, as happy as clams and just as stupid.

To liberals of the 90s, Bill Gates was the symbol of both wealth and malevolence incarnate. Not only was he the richest man in the world, but his monolithic and monopolistic enterprise was based on a mediocre product with built-in buggy obsolescence. He didn’t innovate; instead he partnered with IBM, purchased DOS, and then exploited both. And through ruthless business savvy, the narrative goes, Microsoft strong-armed the market despite a middling product, terrible customer service, and ruthless cost cutting.

But one man, one company, made a career (and cult) out of this “critique” of Bill Gates. (...)

Apple surpassed Microsoft as the most valuable tech company in 2010, but Jobs had long before eclipsed Bill Gates in the consumer’s CEO-aspirant imaginary. Benevolent Jobs, who died merely the 42nd wealthiest American, was worshipped by liberals with the same intensity that Gates was hated.  (...)


What’s really going on in these ads? It’s not exactly the classic hip/square dichotomy: John Hodgman is funny and charismatic, and there is some amount of mutual respect here. Although Hodgman is clearly a square, ‘Mac’ is not primarily a cool guy who rejects Hodgman’s identity. Instead, ‘Mac’ is unshaven, informally dressed, kinda average. The difference is not between the square who sells out and the cool guy who opts out, but rather the technocratic office worker and the precarious creative. Mac admits that PC is good at getting business done, but business is boring, and he’d rather be drinking a latte at the co-work space he shares with an industrial designer and a start-up architect. Wouldn’t you?

The ascension myth of Jobs over Gates and of Apple over Microsoft is a spectacular reflection and reenactment of the rise of post-Fordist precarious labor over the sort of middle class white-collardom historicized by C. Wright Mills. Gates was a man who rationalized the computer business. He ensured his software was packaged with outside manufacturers, who would then do the messy work of race-to-the-bottom competition for him, but would all carry the same (and same priced) Microsoft OS.

Microsoft made money on every shitty Dell or pimped out IBM, requiring almost no hardware overhead of their own. Once Windows had achieved a certain level of dominance it was impossible to make a cheap computer without either it or a high level of technical savvy (e.g., Linux). Microsoft won, as Žižek put it in a London Review of Books essay, when it had “imposed itself as an almost universal standard, practically monopolizing the field, as one embodiment of what Marx called the ‘general intellect,’ meaning collective knowledge in all its forms … Gates effectively privatized part of the general intellect.”

Since Apple couldn’t fight on the grounds of cost, it would compete on ease-of-use and, more fundamentally, “lifestyle”: it would fight for its patch of general intellect. Rather than infinite customizability, Apple would come bundled with iTunes, iMovie and Garage Band. Apple made the hardware, the software, and ultimately even the store you bought it in. Microsoft sold a product that was ubiquitous and fundamental, but Apple sold a whole line of products, an experience, a way of life. The narrative is well rehearsed.

Microsoft predominantly (and unabashedly) produces work tools: the Microsoft Office Suite remains its flagship product. The PC has always been clunky when it comes to media production and consumption (though less so than Apple and its legions want you to believe), and the graceful handling of these functions is what sets Apple apart. Of course, these “creative” fields of production are just as much work tools as Office; it’s just that your work is fun. You make music! You make movies! You’re not a slave!

And though gamers have always used PCs, they did so because you could upgrade video cards and processing speed and power as you needed — keeping up with the latest generation of games without replacing your computer — so that gamers ended up with machines whose internal functioning little resembled their office counterparts. A Mac, however, is a Mac, its functions largely black-box and proprietary. You don’t hack it and you don’t upgrade it, you just buy a new one.

Of course, it is in design and packaging, not computing, that Apple has really excelled. Other than its innovations in touch-screen technology and battery life (significant but outsourced achievements), Apple has offered little in the way of technical invention. What Apple does best is user interface and visual design, which, if you’re feeling generous, you can call a kind of beautiful craft: sewing the glove to exactly fit the hand while also grabbing the eye. But design, especially when it comes to the mass-produced consumer object, is really just the arty end of the marketing spectrum.

by Willie Osterweil, Jacobin |  Read more:

Everyone Eats There

I left Los Angeles at 4 in the morning, long before first light, and made it to Bakersfield — the land of oil derricks, lowriders and truck stops with Punjabi food — by 6. Ten minutes later, I was in the land of carrots.

You know that huge pile of cello-wrapped carrots in your supermarket? Now imagine that the pile filled the entire supermarket. That’s how many carrots I saw upon my arrival at Bolthouse Farms. Something like 50 industrial trucks were filled to the top with carrots, all ready for processing. Bolthouse, along with another large producer, supplies an estimated 85 percent of the carrots eaten by Americans. There are many ways to put this in perspective, and they’re all pretty mind-blowing: Bolthouse processes six million pounds of carrots a day. If you took its yield from one week and laid each carrot end to end, you could circle the earth. If you took all the carrots the company grows in a year, they would double the weight of the Empire State Building.

At Bolthouse’s complex, carrots whirl around on conveyor belts at up to 50 miles an hour en route to their future as juliennes, coins and stubs, or baby carrots, which the company popularized and which aren’t babies. Other carrots become freezer fare, concentrate, salad dressings and beverages. Fiber is separated for tomato sauce and hot dogs. Whatever’s left becomes cattle feed.

Bolthouse is just one of the many massive operations of California’s expansive Central Valley, which is really two valleys: the San Joaquin to the south and Sacramento to the north. All told, the Central Valley is about 450 miles long, from Bakersfield up to Redding, and is 60 miles at its widest, between the Sierra Nevada to the east and the Coast Ranges to the west. It’s larger than nine different states, but size is only one of its defining characteristics: the valley is the world’s largest patch of Class 1 soil, the best there is. The 25-degree (or so) temperature swing from day to night is an ideal growing range for plants. The sun shines nearly 300 days a year. The eastern half of the valley (and the western, to some extent) uses ice melt from the Sierra as its water source, which means it doesn’t have the same drought and flood problems as the Midwest. The winters are cool, which offers a whole different growing season for plants that cannot take the summer heat. There’s no snow.

The valley became widely known in the 1920s and 1930s, when farmers arrived from Virginia or Armenia or Italy or (like Tom Joad) Oklahoma and wrote home about the clean air, plentiful water and cheap land. Now the valley yields a third of all the produce grown in the United States. Unlike the Midwest, which concentrates (devastatingly) on corn and soybeans, more than 230 crops are grown in the valley, including those indigenous to South Asia, Southeast Asia and Mexico, some of which have no names in English. At another large farm, I saw melons, lettuce, asparagus, cabbage, broccoli, chard, collards, prickly pears, almonds, pistachios, grapes and more tomatoes than anyone could conceive of in one place. (The valley is the largest supplier of canned tomatoes in the world too.) Whether you’re in Modesto or Montpelier, there’s a good chance that the produce you’re eating came from the valley.

I came to the valley both by choice and by mandate. In preparation for the magazine’s Food and Drink Issue, I asked readers to suggest my assignment. They could send me anywhere they wanted, within limitations of climate and jet lag. After reviewing the suggestions, it became clear that readers wanted an article that incorporated big farming, small farming, sustainability, politics, poverty and, of course, truly delicious food — and in the United States, if possible. So I decided to head to the Central Valley, where all of this was already happening. This also happened to satisfy a curiosity of mine. From a desk in New York, it’s impossible to fathom 50 m.p.h. carrots, hills of almonds, acres of basil and millions of tomatoes all ripening at once. How can all of this possibly work?

But I was also inclined to head to the valley because I know that, for the last century or so, we’ve been exploiting — almost without limitation — its water, mineral resources, land, air, people and animals. Mark Arax, a writer who lives in Fresno and has chronicled the region’s past and present, offered his opinion while serving me and a dozen others marinated lamb, a terrific recipe from his Armenian family: “This land and its water have gone mostly to the proposition of making a few men very wealthy and consigning generations of others, especially farmworkers, to lives in the dust.” I’d already seen an example of how wealth has been concentrated and captured in the valley: this summer, Campbell’s bought Bolthouse Farms for $1.55 billion. Meanwhile, there are thousands of valley farmworkers who are often victims of wage theft and (illegally) required to supply their own tools.

So for five days I drove through the southern half of the valley. I wanted to learn as much as I could about the agriculture in America’s produce factory; where thoughtful farmers were leading it; and how — if at all — it might become sustainable.

by Mark Bittman, NY Times |  Read more:
Photo: Vincent Laforet

Wednesday, October 10, 2012

Un Homme Qui Dort (1974)


The title of Bernard Queysanne and Georges Perec's 1974 haunting masterpiece, Un homme qui dort (based on Perec's novel of the same name), translates roughly to "A Man Asleep", and that is an accurate description of the sole character in this beautifully fractured tale of alienation and isolation. The character is a man 25 years of age (Jacques Spiesser), living alone in a cramped, lonesome Parisian apartment. We never learn his name, and we never hear him utter even a single word of dialogue. We simply view him going about various chores and activities that seem to take on a ritualistic importance: making a cup of Nescafe, reading, brushing his teeth, playing solitaire, attempting to fall asleep, piling dirty laundry into a basin of murky water, avoiding contact with friends and family so that a pile of crumpled letters accumulates near his door, and on occasion, trudging outdoors for a trip to the cinema or the diner. The film's opening segments contrast this unnamed man indulging in these repetitive routines with exterior shots of inhabitants in the city systematically going through the motions of everyday life, and it quickly becomes clear that we are observing a human all but completely removed from the rhythms of society, marching to the beat of his own drum. The only words spoken in the film are delivered by a female voice-over, who reads strikingly poetic passages from Perec's novel that convey the various emotional turmoils and anxieties felt by the nameless protagonist, as the character continues to avoid all contact with family and friends, and interaction with society in general, as he slips further and further into this solipsistic void.

Un homme qui dort is shot in gorgeous black and white by cinematographer Bernard Zitzerman, and the film more than once recalls the work of Alain Resnais, with its classy compositions and gliding camera and enigmatic voice-over, and especially with its use of high-contrast black and white in the latter part of the film, which further gives the outdoor scenes an alien quality, so foreign does the nameless hero feel walking the streets of his very own block. It is also fascinating how the camera methodically pushes in and out on the man in moments of contemplation, as though he were a specimen of loneliness under some giant existential microscope, whose very existence is on the verge of dissipating at any moment. The music in the film is sparse but used effectively, alternating between a high-pitched ambient tone that crescendos arbitrarily without warning, and an urgent clicking gallop, punctuated by harsh bangs on a piano. This disconcerting and distressing soundtrack only heightens the overwhelming sense of angst and disquietude that accompanies the continuous shots of the young man and his vacant, lifeless stare, as he embarks on one lonesome, meaningless endeavor after the next.

There is no traditional narrative here, no backstory, no indication as to what could have possibly gone wrong in this person's life, or if anything ever went wrong at all. There is only the shell of a man, withdrawn, cut off, sitting around and waiting until there's nothing left to wait for. By the end of the movie, the character is indulging himself in various delusions and launching into venomous, misanthropic speeches comparing humans to monsters, before a final bleak voice-over seems to suggest that nothing has been learned here, and that the character may never find peace, may never find a compromise, a possible means of actually living his life, as opposed to sleepwalking through it, as long as he gives himself over to these conditions. Un homme qui dort is a powerful experience for anyone who's ever felt like cutting themselves off from the world completely, for anyone who's just wanted to totally disappear from everything. It's a terrifying yet beautiful glimpse into a sad, sick life not led, and a piercing call to arms against neutrality and indifference. To disappear from the world is not difficult; to disappear from yourself is an entirely different matter, and this is a film that recognizes that with a deep, aching conviction.

Does Biology Make Us Liars?

Aristotle was a cynic. Sure, the Bible exhorts us to “Love thy neighbor as thyself,” but he knew better. “The friendly feelings that we bear for another,” instructed his Ethics, “have arisen from the friendly feelings that we bear for ourselves.”

Two thousand years later, in 1739, Hume spelled out what the pagan thinker intuited: “I learn to do service to another, without bearing him any real kindness; because I foresee, that he will return my service, in expectation of another of the same kind.” Hume’s Edinburgh neighbor, Adam Smith, penned an often quoted phrase in this vein in The Wealth of Nations: “It is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own interest. Nobody but a beggar chooses to depend chiefly on the benevolence of his fellow citizens.”

Self-love makes the world go round. But, alongside cooperation, could self-love give birth to deception? Could the imperative of self-regard be so great, in fact, as to lead to self-deceit? In his new book, Robert Trivers, a master of evolutionary thought, roams from stick insects and brain magnets to plane crashes and Israeli-Palestinian wars in service of a corollary to Aristotle’s hard-boiled thesis. We humans deceive ourselves, Trivers argues. We do so often, and almost always the better to deceive others for our own personal gain. From misguided estimates of self-worth to false historical narratives of nations, the self-love that spins the world is itself fueled by self-deceit. And the price can be substantial.  (...)

Deception is rife in humans for the same reason it is in nature: there are inbuilt clashes of interest, whether it be sexual strategy when it comes to females and males, parental investment when it comes to mothers and fathers, or resource allocation when it comes to parents and offspring. An expert in detecting conflict where others see harmony, Trivers worked out the evolutionary logic behind such relationships in the early 1970s, spawning entire fields in behavioral studies and genetics and giving rise to a number of predictions. One of the starkest of these was the idea that because fathers and mothers have different interests when it comes to the fetus (dad wants the baby bigger than mom does), identical genes on the chromosomes that they have each bequeathed will battle each other over control of embryonic growth. Sure enough, in the 1980s, biologists began to discover genes whose expression levels depended on from which parent they had come. And the gene knows where it came from, following the basic logic of genetic conflict Trivers described years before genomic imprinting was discovered.

Deception, to be truthful, is less of a mind-twister than self-deceit. Like Hume and Smith before him, Trivers understood that giving could serve one’s interests if the rewards of cooperation outweighed its costs. Using the logic of game theory, he showed that the principle of “you scratch my back and I’ll scratch yours” made evolutionary sense. Soon “reciprocal altruism” helped explain otherwise beguiling sacrificial behavior. But benevolence requires a strong sense of justice because a sense of justice is necessary to appreciate dishonesty: after all, in games of trust, especially with lag time, cheaters can wreak havoc. And so, over evolutionary time, an arms race honed in social mammals a growing intelligence. Trivers finds it ironic that “dishonesty has often been the file against which intellectual tools for truth have been sharpened.” But one of the outcomes of this Darwinian dynamic may have also been a genuine instinct for fairness, born of the need to distinguish trustworthy partners from charlatans.

But if evolution has done such a grand job of fine-tuning our senses in the service of detecting deceit, why does all the hard-won information that we extract from the world through our senses often become muddled and deformed in our brains? Why do we project our own traits onto others, repress true memories and invent false ones, lie to ourselves, rationalize immoral behavior, and generally deny inconvenient truths? Seventy percent of people rank themselves better-looking than average, according to a study cited by Trivers; 94 percent of academics (shocking!) think they are better than average, too. Why is this? The answer, Trivers would have us believe, is that the possibility of deceit raises the probability of ever more subtle mechanisms for spotting deceit, which in turn raises the probability of mechanisms for self-deceit. Trick yourself to trick another: what better way to conceal the truth? Self-deception is not a defensive measure meant only to make us feel better; it is a weapon instilled in us by natural selection to help deceive others for our own good.

by Oren Harman, TNR |  Read more:
Photo: Evan-Amos/Creative Commons

Dan Bern



[ed. I remember a hard freeze one year with hardly a wisp of snow, right through December -- the ice nearly three feet thick. I took my truck out on Big Lake, spinning brodies for at least a quarter of a mile. Without snow covering the ice you could see air bubbles like this everywhere, in all kinds of frozen, fantastic patterns.]

Photographs of water & ice by Douglas Capron

Melissa Mitchell, Kliff Hopson


[ed. Feeling a little homesick today.]

Amanda Hocking and the 99-Cent Kindle Millionaires

As Amanda Hocking said herself, "I don't understand why the internet suddenly picked up on me this past week, but it definitely did."

And how.

The writing world is abuzz about Amanda Hocking, the 26-year-old self-published author who sold over 450,000 copies of her e-books in January alone, mostly priced between 99 cents and $2.99. She's now a millionaire. The writing world has been abuzz for a while about J.A. Konrath, who has very publicly blogged about the significant amount of money he has made selling inexpensive e-books.

Many people in the last week have sent me links about these authors, wondering...

What exactly is going on here? How in the heck are these self-published authors making so much money? Is this the future? And does this mean the end of the publishing industry as we know it?

The News That's Fit to Print

Before we delve into what this means for the world of books, I feel like it's important to take a deep breath and splash some cold water on our faces.

The reality: This is still a print world and probably will be for at least the next several years. Even as some publishers report e-book sales jumping to between 25% and 35% in January, the significant majority of sales are still in print. As I wrote in my recent post about record stores, over a decade after the rise of the mp3, the majority of revenue in music is still in CDs.

So let's not get out of hand (yet) about the scale of this e-book self-publishing revolution, if it is indeed one. Yes, this is real money we're talking about. Yes, these authors deserve all the credit in the world. And yes, these authors are also making money in print as well.

But we're still a ways away from self-published Kindle bestsellers making Dan Brown, James Patterson, Stephenie Meyer, J.K. Rowling kind of money, the old-fashioned way, through paper books in bookstores. It's not as exciting a story to remember that traditionally published franchise James Patterson made $70 million between June '09 and June '10, but it's still worth keeping in perspective.

Let's also not forget that Hocking, Konrath and a couple of others are the tip of a very large iceberg of self-published authors, the overwhelming majority of whom are selling the merest handful of copies. As Hocking herself writes:
I guess what I'm saying is that just because I sell a million books self-publishing, it doesn't mean everybody will. In fact, more people will sell less than 100 copies of their books self-publishing than will sell 10,000 books. I don't mean that to be mean, and just because a book doesn't sell well doesn't mean it's a bad book. It's just the nature of the business.
Yes, it's new, it's a big deal, it's seriously awesome for Hocking, who seems like a super nice and humble person. But let's not also lose our perspective about the scale of the shift taking place. The book world is changing in a big way, but it still ain't done changed just yet.

by Nathan Bransford |  Read more:

Zhao Ji (Emperor Huizong, 1082 - 1135): Auspicious Cranes
via:

Those Summers, These Days

On a warm afternoon in August, almost all of the fifty or so members of my extended family gather at my grandma’s farm to celebrate Grandma Fugman’s 80th birthday, and concurrently, my son Elvis’s second birthday. Picnic tables and chairs dot the front lawn, burgers and hot dogs roast on a grill, a slight breeze rustles the century-old trees bordering the street. It is warm but not sweltering, cool enough to sit comfortably in the shade. Two of my cousins recline on a blanket with their six-month-old babies beneath the lane of maple trees along the south side of the yard. My dad and his brother sit at the picnic table, each with a Miller Lite in his hand. Some uncles and nephews kick a soccer ball around. While it’s a special occasion that we’re gathered for on this Sunday in August, one could expect to see a half dozen or so kids in the yard at Grandma’s house on any given day. All of the family members on my dad’s side live within 30 minutes of each other in Northeast Ohio, except for me, my husband, and my kids. Elvis and my daughter, Lydia, with my cousins and cousins’ kids, push tractors and bulldozers in the same sand pile that my brothers and I played in twenty years ago, and my dad and his siblings twenty plus years before that. If they dig deep enough, they will probably unearth a Matchbox car from 1970. Beneath the shade of a maple tree, the cousins and second cousins and first cousins twice removed, or whatever they might be, get the same grit of the family farm beneath their fingernails.

I spent my childhood romping around the farm with my cousins, and begged my dad to take me with him in the mornings to traverse the cool, wet terrain of the cornfield, dew heavy before the sun rose over the tree line. My cousins and I were taught the way to pull an ear of corn away from the stalk with a swift twist in order to make a clean break. After we filled the bushel baskets lining the dirt lane, Dad, or Frank or June or Connie or Rich or Pat or one of the other aunts and uncles, would lift the baskets over the edge of the pickup. We challenged each other to see who could launch themselves up into the truck bed the fastest. Our bony legs dangled over the tailgate, prune-y feet in wet shoes swinging back and forth as we bounced through the field to the house.

When we weren’t trying to help pick corn or vegetables in the field, my older cousins and I would play a dozen different versions of tag, hide and seek, SPUD, ghost in the graveyard, and baseball, employing “ghost runners” when there weren’t enough of us to run the bases, pitch, catch, and field. We jumped from the wooden bench swing into a mountain of maple leaves each fall. The swing’s rope rubbed our palms until they stung as we spun each other around. We barrel rolled each other down the slope from the house to the trees, the whole world spinning. We picked red raspberries and black raspberries and didn’t notice until much later the scratches on our legs from the bushes.

When we tired of playing in the yard, we walked through the corn and hay, down the hill, and into the woods. The trails wound randomly, looped around an ancient tree and backed up to a creek, but it was more fun to ignore the trail and plot out our own way, stepping on branches and startling at the sudden rustle of leaves nearby. The woods were never quiet, even when we would shush each other into silence and freeze, our breathing heavy as we eyed the forest for deer or fiercer wildlife we imagined into existence. The birds would chirrup, frogs ribbit, bees hum, chipmunks and squirrels rummage, leaves crackle. Cars could be heard coming down Stafford Road, spraying up limestone and tar as they sped along. When it was hot, we navigated skunk cabbage and may apples to the creek, waded in the cold, knee-high waters hunting for crawfish and minnows, challenged each other to walk through the culvert pipe underneath the road. As the pond my dad dug in the woods filled with rain water and run-off from the fields, I imagined all of us in speed boats, hanging out on a sandy beach, fishing and picnicking by the lake. It didn’t matter that you could skip a rock from one end of the pond to another or that the mud bottom and snapping turtles prevented anyone except our black lab from swimming in it. We roamed around the pond hunting for tadpoles, wary of the higher weeds, afraid there might be snakes.

Our parents were elsewhere—working at a job, sitting in the living room with Grandma, weeding in the garden. We came back for lunch and for dinner, but no one scolded us for being gone so long, at least not that I remember. We were free to wander.

It is hard for me to imagine a childhood without the farm or a definition of home without the farm in it. The summer I turned ten, my parents bought the century home across the street from the farm and next door to my other set of grandparents. Home extended beyond the four walls of my parents’ house and was defined by natural boundaries; it stretched through the field and woods all the way to the creek and then south to the lane, across the road and down to another creek, then back up through the rows of field corn to my mom’s parents’ yard, bordered by towering blue spruce trees. My brothers and I were more at home outdoors than in. No matter the day or season, someone was always around to play with, all I needed to do was cross the street, hop the ditch, and walk down the field. If there weren’t cousins there yet, they’d be there soon, I was sure of it.

by Sarah Wells, Ascent |  Read more:

FiveBooks Interviews > Renata Salecl on Modern Misery

The Slovenian philosophy professor decries the tyranny of choice and says we now expect long life, a beautiful body, sexual and job satisfaction. But the idea that we can perfect ourselves dooms us to failure and misery

Why misery? 

I wanted to go against the presumption that happiness is the theme of today’s life. This ideology of happiness has actually produced more unhappiness than needed, since we’ve constantly been measuring our lives with regard to success, or self-fulfilment, or enjoyment. From a psychoanalytic point of view it’s been known for a long time that total satisfaction is completely impossible to attain.

The books I chose describe the most prevalent forms of unhappiness linked to the expectations that we have in today’s post-industrial capitalism. Those expectations are: long life, beautiful body, a sexually satisfied life, creatively satisfied life, an ideal of self-making. The idea is that we have the power to create this ideal life – and exactly these books reverse this presumption.

Tell me about The New Black.

Darian Leader is a British psychoanalyst who in a great way undermines today’s ideas about depression. He starts with the premise that we live in a society of hyped optimism, where depression appears as a danger that goes against optimism – it’s something for people who gave up the fight for success or whatever. Today we use the terms depression and stress too much – they dominate psychiatric and self-help discourse.

They’re debased terms; you might be ‘depressed’ if you miss the bus.

Absolutely. Or just the common boredom of children can be described as depression. But what Darian does is to return to the difference between melancholy and mourning, and he makes a great distinction between them. It’s very good to return to these different roots of depression, and to stay with them – not as traumatic things, but as something pretty normal which has been forgotten.

So we should re-codify depression?

Not perceive it as a unified term, but to see it as various different things, which is why he is using the old terms of melancholy, mourning and loss.

Darian is also critical of the pharmaceutical industry: depression appears as something universal that can be quickly dealt with using pretty much universal types of drugs. But, as he points out, this denies the fact that the symptom is connected to some cause beyond the depression. He shows that in depression everyone has a different logic and a different individual story, which can be linked to loss – of another human being, of identity, of a job, of health or love. It can also be linked to being stuck in circulating around some lack.

He particularly shows that what we cannot deal with easily today is mourning. Not only mourning people we have lost, where it is quite easy to see why someone is sad; but whenever we lose someone we mourn not only that person, but also who we were for that person – and that’s much more difficult to overcome. Our society has a hard time dealing with that: previous societies (he uses a lot of anthropology) had long rituals related to mourning for the beloved and so on.

Contemporary society has a problem with death altogether.

Absolutely. There’s a denial of death, almost a prohibition of thinking about it, and an attempt to prolong life unrealistically. We try to stay away from a person who is mourning. And he shows that mourning and loss can be very much helped with various social rituals, as it was in the past. For example, children today are almost not allowed to attend funerals. But funerals have a huge symbolic role, especially for small children if their grandparent dies, to be part of saying goodbye and be part of the ritual.

He also deals with the important difference between lack and loss. He points out the problem that we have with loss as such: sometimes we lose something and can pinpoint what we have lost. But sometimes actually we are stuck in a particular deadly enjoyment of circulating around a lack in our life, and that can be a completely different melancholic state from a mourning state. In the conclusion he sees a way of overcoming this stuckness via artistic production. Some artists, for example, have been very good at circulating around the pain of lack, and have created great works of art. So not only has melancholy been the cornerstone of a lot of cultural production in the past and today, but it can even be its engine.

If you were to read this looking for some prescription for your own life, wouldn’t that set up unrealistic expectations? Not everybody has a creative spark.

That’s a great point. Unfortunately life is perceived as a work of art; everyone is supposed to work on life as a special project or a business plan. I would think that he’s not planning to push everyone into creation; what he is doing is basically unpacking and destigmatising depression and showing that it’s a really dangerous term to use universally. That’s where I think he helps everyone.

Interview of Renata Salecl by Tom Dannet, The Browser |  Read more: