Saturday, October 13, 2012

Dark Social: We Have the Whole History of the Web Wrong

[ed. This is why I find it curious that content providers are sometimes so aggressive in wielding DMCA takedown requests when a variety of sources (ahem....like this blog) are responsible for generating such a large proportion of their traffic.]

Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'

But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.

To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.

There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
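The mechanics are simple enough to sketch. Here is a minimal illustration (in Python, with a hypothetical headers dict — not The Atlantic's actual analytics code) of what a server can and can't learn from the referrer header:

```python
def traffic_source(headers):
    """Classify a request by its Referer header (a dict of HTTP headers).

    Note the header is spelled "Referer" in HTTP, a long-standing
    typo preserved in the spec itself.
    """
    referrer = headers.get("Referer", "")
    if not referrer:
        # Email programs, instant messages, some mobile apps, and
        # https -> http transitions all arrive with no referrer at all.
        return "unknown (no referrer data)"
    if "facebook.com" in referrer:
        return "facebook"
    return referrer

# A visit arriving from Facebook announces itself:
print(traffic_source({"Referer": "https://www.facebook.com/"}))  # facebook
# A link pasted into Gchat or email shows up empty-handed:
print(traffic_source({}))  # unknown (no referrer data)
```

The point of the sketch is the asymmetry: when the referrer is present, attribution is trivial; when it's absent, the server simply cannot distinguish a shared link from a typed URL.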

This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.

Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong, or at least if what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.

Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")

And then one day, we had a meeting with the real-time web analytics firm, Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
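Chartbeat's accounting change amounts to a two-line heuristic. A sketch of it in Python (the landing-page set and function name are my own illustration, not Chartbeat's code):

```python
from urllib.parse import urlparse

# Illustrative set of homepage and section landing pages.
LANDING_PAGES = {"/", "/politics", "/technology"}

def classify_visit(referrer, url):
    """Split no-referrer traffic the way the article describes Chartbeat doing it."""
    # Normalize the path: strip a trailing slash, treat empty as the homepage.
    path = urlparse(url).path.rstrip("/") or "/"
    if referrer:
        return "referred"      # ordinary, measurable traffic with a source
    if path in LANDING_PAGES:
        return "direct"        # plausibly typed or bookmarked
    return "direct social"     # nobody types a 100-character article URL

print(classify_visit("", "http://www.theatlantic.com/technology/archive/2012/10/long-article/263409/"))
# -> direct social
```

The logic encodes exactly the intuition above: a visitor with no referrer landing deep on an article page almost certainly followed a link from somewhere the analytics can't see.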

The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!

On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.


Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.

Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.

by Alexis Madrigal, The Atlantic |  Read more:

Nara Leão


Friday, October 12, 2012


William Dyce - The Meeting of Jacob and Rachel (1850)
via:

Rabbit Island, 2009 by Norman Engel
via:

John Prine, Iris DeMent



The Master


People weren’t walking out in droves from the suburban cinema in which we saw Paul Thomas Anderson’s new film The Master because there weren’t droves there: just perhaps eight or nine people. But they did all walk out, in a slow trickle that started about halfway through the film. Staying there made us feel like loyalists in a lost cause – the future of film perhaps – but I’m not sure we were less bewildered or even less bored than those who had left. The Master does not tell a gripping story. It doesn’t tell a story at all.

It’s not that there is nothing going on in the movie, and we weren’t only bewildered and bored. We were also, intermittently, intrigued, daunted, amused, troubled. The very metaphor of the cause that I have just used is borrowed from the film, where something called the Cause (and distinctly resembling Scientology) is one of its elusive subjects. The questions it asks, again and again (in a way it doesn’t do anything else), are whether charisma can be portrayed and/or inspected and whether a charismatic leader can help anyone who seriously needs help.

The answers seem to be yes, no and no. Philip Seymour Hoffman’s portrayal of the Master is remarkable because it makes charisma seem so unsteady and so complicated. He is lordly, mischievous, scoundrelly, conniving, bullying, petulant, perfectly in control, half out of control, always slithering from the posture of the sage into that of the snake-oil salesman. But then this brilliant picture is not an inspection or exploration of the person or the type, it’s a fascinated trailing after him. When he abruptly loses his temper two-thirds of the way through the film, thoroughly shocking one of his devoted rich acolytes played by Laura Dern, it’s the same temper and the same loss we have seen twice already. We’re not shocked, this is not a revelation to us as it is to Dern. The repetition has its eerie force, though. We have long realised that the Master is his own permanent self-invention, and we finally see that there is no limit to the vast contentment with which he can keep putting together the self he so enjoys and admires. Nothing unsettles him for long: defections, hostile questions, doubt, imprisonment, all small bumps in the long right road. At the end of the film, still smiling with an air of benevolence that would scare the life out of anyone not seeking to grovel, he says: ‘Everyone needs a master.’ He doesn’t mean he needs one. He means everyone needs him, and if they imagine they don’t they are doomed.

The Master’s counterpart, disciple and victim is Freddie Quell, played by Joaquin Phoenix with a lurid tormented charm which spreads all over the large bright screen and makes you long for the days of lower definition and poorer visibility. He hunches his shoulders, twists his mouth, often talks unintelligibly, as if knowing what he is actually saying were a privilege he jealously wants to keep to himself. We first see him in the navy at the end of World War Two, frustrated, lonely, willing to fall in a frenzy on a woman made of sand as long as she has breasts that stick up sharply enough. The memory of this effigy recurs throughout the movie, and at the end, having found and lost and found the Master again, Freddie picks up a girl who has the right sort of breasts and is much more relaxed and amusing than any piece of Pacific beach could be. Perhaps he’s cured.

But cured of what? After the war, identified as a potential misfit in any arrangement where fitting might be required, Freddie becomes a photographer in a department store, cuts cabbages in a California field, and runs away to sea when the liquor he has become an expert at mixing – from paint thinner and various engine fuels – almost kills a man. The boat he hops onto belongs to, or is rented by, the Master, named Lancaster Dodd, who also has a taste for paint thinner. And so begins the weird relationship that so statically dominates the rest of the movie.

by Michael Wood, LRB |  Read more:

玉山 / Jade Mountain, 台灣 / Taiwan.
via:

Disappearance on Mount Marathon

You’ve heard of Tough Mudder and the Spartan Race, but every Fourth of July, the town of Seward, Alaska, puts on a competition that rivals both. The Mount Marathon Race doesn’t look like much on paper: it’s short (just three miles long) and starts at sea level. But the race, which has been run as an organized event since 1915, is a beast, with a 3,022-foot vertical gain and loss over icy and treacherous mountain terrain, where racers pick their own way up—and down—the average 38-degree slope.

While injuries aren't uncommon at Mount Marathon, three unusually serious accidents marred this year's race. Iraq war veteran and Blackhawk pilot Penny Assman slid over a cliff, lacerating her liver and breaking several ribs; an Alaskan runner named Matthew Kenney fell at the same spot and suffered broken legs and brain damage. The most puzzling casualty, however, was 66-year-old Anchorage resident Michael LeMaitre, who vanished without a trace on his way up the mountain and is believed to be the first fatality in the race's history. The incidents left Seward struggling to understand how a runner could disappear on a course that's only a few miles long. We spoke separately to Karol Fink, a Mount Marathon organizer and 19-year race veteran who ran this year, and to MaryAnne LeMaitre, whose father has yet to be found.

THE MOTIVATION
The record time, set in 1981 by Bill Spencer, is 43:21. The average speed uphill is 2 mph. Downhill is 12 mph.


Karol Fink: It’s a bit mysterious—the feel of it. It’s dangerous, so you’re not only competing against other people on the mountain, but you’re competing to get across the finish line healthy and safe. I’m never going to be a professional runner, and I’m never going to be a racecar driver or a rodeo person, but this is my chance to push that edge.

What happened this year—it was quite devastating for the community. It was the talk of the town: Two people falling off the waterfall. A missing runner. People were out searching, helicopter going constantly. It consumed the energy of the town.

MaryAnne LeMaitre: My dad—he has always had an adventurous spirit: He did the Iditaski several times; he had a lot of exciting adventures on the water. He was definitely physically fit. But he hadn’t been on the mountain before. He won the lottery for getting into the race, and that was it.

THE SETTING
Mount Marathon, in the Kenai Mountain range in southern Alaska, looms over the town of Seward, an active fishing port with a population of about 3,000.


Fink: Distance in this race isn’t as critical as vertical. When the competitive men come down from the top of the mountain to the bottom, they’re descending over 3,000 feet in about five minutes. They’re not running, they’re free-falling.

This year, it had rained the night before, so it made the race really slick and muddy for the juniors. Next was the women’s race, and it was a pleasant temperature, in the low 60s, with a little bit of fog blowing in and out in strips. The men’s race was also foggy, but the trail had dried some.

LeMaitre: I was told it was overcast. Clouds were moving in and out, but you could still see Seward at times. It was rainy, but it didn’t sound like it was raining all the time.

by Caty Enders, Outside Magazine |  Read more:
Photo: jseattle/Flickr

Andy Warhol, Mick Jagger, 1975
via:

Levon Helm's Midnight Rambles


Leave it to Lucinda Williams, whose astringent lyrics have proven her capable of leaving no romantic notion unexamined, to ask the question of the hour. Sitting backstage at the Izod Center, in East Rutherford, before doing her turn at “Love for Levon,” the all-star tribute concert for Levon Helm, who died last April at seventy-one, Williams confessed to some confusion about the honoree. Helm was most famous for singing and playing drums in the Band, whose golden era ended in 1976. “Everybody is asking why he was important, and I don’t know what to say, because I never thought of most of these songs as Levon’s music—they were the Band’s. Did they do all this after Rick Danko died?”

The answer, obviously, is that there was no such reaction in 1999, after Danko, the Band’s bassist and singer, died in his sleep, at the age of fifty-six. Why, then, the outpouring of feeling for Helm? Oddly, Don Was, the esteemed musician and producer who served as co-musical director for the concert, had been pondering that very question. “Maybe it comes from reaching sixty,” speculated Was. “I’ve seen it before—Kris Kristofferson, Willie Nelson. All the jokes that people make about the I.R.S. or whatever. Suddenly you turn sixty, and that’s the stuff you’ve survived.”

Helm could serve as Exhibit A in Was’s argument, for, by the time he turned sixty, he was a veritable curator of stuff to survive. There was an abundance of garden-variety career disappointments, money setbacks, and legal problems maintaining a steady background behind larger crises. In 1986, after playing some club in Florida, longtime friend and bandmate Richard Manuel left Helm’s hotel room and hanged himself in his shower. In 1991, Helm’s beloved home and recording studio in Woodstock burned to the ground in an electrical fire. In 1998, he was diagnosed with throat cancer and advised to get a laryngectomy; instead, he took twenty-seven radiation treatments that killed the tumors but robbed him of his voice. He faced each insult not only stoically—deprived of singing, he resolved to improve his drumming—but with an affability that was his signature. “I don’t care what kind of shitty mood you were in,” says Steve Berlin, the keyboard player and saxophonist for Los Lobos, “seeing Levon just brightened your day.”

Unfortunately, amiable perseverance alone is not enough, and by the end of 2003, Helm was facing foreclosure on his rebuilt home. With the help of his new manager, Barbara O’Brien, he began holding rent parties. These shindigs proved doubly useful: the earnings showed that he had regular income, which enabled him to refinance, and more importantly, their popularity gave Helm the idea of turning the parties into a regular show. Calling it the “Midnight Ramble,” after the almost-anything-goes medicine shows that would come through his Helena, Arkansas boyhood home, Helm tapped Larry Campbell, who had distinguished himself as the guitar-, mandolin-, and fiddle-playing sideman for Bob Dylan and others. Together with guitarist Jimmy Vivino, they built a band, one that eventually came to include Levon’s daughter, Amy, and Larry’s wife, Teresa Williams. They played songs from the Band’s repertoire, but also country, blues, gospel, Cajun, and rockabilly.

Soon, word began to circulate that Levon had a hot band that played at his house on Saturday nights, and before very long, Helm’s musical friends began sitting in: Emmylou Harris, Elvis Costello, Dr. John, Phil Lesh, Norah Jones, and many more began making the trip to Woodstock, and the audience followed. “It was one of those ‘If you build it, they will come’ things,” says Campbell. “And they did.” Standing on Helm’s outdoor deck, he pointed into the woods that front the eighteen-acre property. “On Saturday evenings, you could stand here and see the headlights stretching all the way back to the highway.” Capacity was two hundred, tops, and the audience would pack around the stage, sitting with their backs to his fireplace, or peer down from the loft space under the great peaked ceiling. To help foster the down-home feel of things, the guests, who had paid a hundred bucks apiece, were asked to bring a dish to the potluck supper that preceded every show.

“There’s a lot of bullshit that surrounds the music industry,” says Jim James of the group My Morning Jacket, who played the Ramble twice, expressing a widely held sentiment. “But Levon always conveyed a great sense of community and spirit, and put the music first.” Then the real miracle occurred: Helm’s voice returned. Gone was his strong tenor, replaced by something raspy and ornery, the voice of an old man who has something to say. In 2007, he went back into the studio and recorded an album called “Dirt Farmer,” co-produced by Campbell and Amy Helm, which won a Grammy. For an album recorded with friends and family, in the home he rebuilt after an inferno, in the studio seized from the jaws of foreclosure, with the voice reborn after cancer, the honor must have been like gilding the lily.

by Jamie Malanowski, New Yorker |  Read more:

Why Is This Man Running For President Of The Internet?


It’s an unseasonably cold early October evening in Lincoln, Nebraska, and Reddit co-founder Alexis Ohanian is giving his elevator pitch to a flustered but rapt woman behind the counter of a fast-food joint, Runza, where he’s picking up 45 servings of the eponymous Nebraskan meat and bread dish to bring back to a party at a local sports startup. “Have you heard of Hudl?” he asks, explaining in unbroken paragraphs how the power of the internet is changing high school and college sports. The woman laughs, an is this guy for real? kind of nervous giggle. But he’s dead serious.

It’s the off-the-cuff version of a stump speech I’d seen him give a few hours earlier, to a crowd of about a hundred at the University of Nebraska-Lincoln. ("We need to be good stewards of technology.") I’d watched him tell the crowd about Geek Day, a digital march on Washington that he had set into motion the day before, and tout his site's success in fighting the anti-piracy bills SOPA and PIPA — including the $94 million in lobbying, largely from the entertainment industry, that had pushed them to the brink of passing. He also tells a story about a trucker he met in Colorado who didn’t even know he was an “Internet Freedom” supporter until Alexis explained what that meant, and then — from the stage — he calls Nebraska Rep. Jeff Fortenberry, a Republican who supported SOPA, to ask him why (he gets voicemail). He refers to the President of the United States, without hesitation, as POTUS.

The whole time, though, he’s subtly code-switching to speak to one of the invisible constituencies he knows is present: Redditors. These, after all, are his people, his true believers. So he references bacon. He uses the word “epic.” He acknowledges memes, like Advice Animals.

When he leaves the stage, he shakes hands and poses for pictures; had a baby been there, he might’ve kissed it. Then he’s off to the next stop, a high school football game, in a tour bus that had at one time been leased by the McCain campaign and converted into the “Straight Talk Express.” Now, it’s been painted over — half red, half blue — and along with Ohanian and a few other Reddit staffers, is also carrying a small press corps (BuzzFeed included), a documentary crew, representatives from farming startup AgLocal, and a staffer at the newly formed Internet Association lobbying group. It is trailed, loudly, by an impressive car-buggy designed by open source automotive startup Local Motors.

If you didn’t know any better, you might think that Alexis Ohanian — the insistently goofy, imposingly tall, never-off 29-year-old cofounder of what is arguably the largest cohesive community on the internet — is running for office. And in fact, he kind of is — but for a position that doesn’t yet exist.

Alexis Ohanian wants to be the President of the Internet. And he’s pretty sure he knows what he needs to do to get there.

by John Herrman, BuzzFeed |  Read more:
Illustration by John Gara

Let's Start the Foodie Backlash

On a crisp autumn evening in a north London street, a rôtisserie trailer is parked outside a garden flat, green fairy lights blinking on and off, warm chickens perfuming the air. A thirtyish hipster wanders out to where I'm standing with a friend on the pavement and drawls his unimpressed judgment of what is going on inside. "I think the arancinis are not quite spicy enough," he informs us, with an eaten-it-all-before air. "Could have more flavour, not really exotic." Right now I haven't the faintest idea what "arancinis" are (or that arancini, like panini, is already an Italian plural), but I nod knowingly while typing his thoughts into my phone, and my friend keeps him talking. "I thought the Korean burger was quite good," the hipster goes on, without much kimchi-fired enthusiasm, "but I think a lot of people don't make their food with enough shbang … They kind of cater to the middle of the road." Twenty-five years ago, he could have been an indie-rock fan bemoaning the blandness of chart music. Now he's a social-smoking, foodier-than-thou critic at a "Food Rave".

The name of the Food Rave is entirely appropriate for a modern culture in which food is the last ingestible substance you can indulge in with obsessiveness without being frowned on by society. Alex James, the Blur bassist turned gentleman cheese farmer and Sun food columnist, has said: "My 20th birthday party was all about booze, my 30th birthday was about drugs, and now I realise that my 40s are about food." And he is not alone. Food replaces drugs in the gently ageing food-fancier's pantheon of pleasure, and brings along with it traces of the old pharmaceutical vocabulary. You hear talk of taking a "hit" of a dish or its sauce, as though from a spliff or bong; and a food-obsessive in hunter-gatherer mode is thrilled to "score" a few chanterelle mushrooms, as though he has had to buy them from a dodgy-looking gent on a murky Camden street corner. Food is valued for its psychotropic "rush"; Nigella Lawson refers to salted caramel as "this Class A foodstuff". Yes, food is the new drugs for former Britpoppers and the Ecstasy generation, a safer and more respectable hedonic tool, the key to a comfortingly domesticated high.

Western industrial civilisation is eating itself stupid. We are living in the Age of Food. Cookery programmes bloat the television schedules, cookbooks strain the bookshop tables, celebrity chefs hawk their own brands of weird mince pies (Heston Blumenthal) or bronze-moulded pasta (Jamie Oliver) in the supermarkets, and cooks in super-expensive restaurants from Chicago to Copenhagen are the subject of hagiographic profiles in serious magazines and newspapers. Food festivals (or, if you will, "Feastivals") are the new rock festivals, featuring thrilling live stage performances of, er, cooking. As one dumbfounded witness of a stage appearance by Jamie Oliver observed: "The girls at the front – it's an overwhelmingly female crowd – are already holding up their iPhones […] A group in front of me are saying, 'Ohmigodohmigodohmigod' on a loop […] 'I love you, Jamie,' yells a girl on the brink of fainting." The new series of The Great British Bake-Off trounced Parade's End in the ratings, and canny karaoke-contest supremo Simon Cowell is getting in on the act with a new series in development called Food, Glorious Food! – or, as it's known among production wags, The Eggs Factor.

If you can't watch cooking on TV or in front of your face, you can at least read about it. Vast swaths of the internet have been taken over by food bloggers who post photographs of what they have eaten from an edgy street stall or at an aspirational restaurant, and compose endlessly scrollable pseudo-erotic paeans to its stimulating effects. Right now, five of the 10 bestselling books on amazon.co.uk are food books, with Nigellissima outselling Fifty Shades of Grey. According to the spring 2011 Bookscan data, British sales of books in nearly all literary genres were down, except for the categories of "food and drink" (up 26.2%), followed by "religion" (up 13%). (Before 1990, the bibliographic category of "food and drink" didn't even exist.) That food and religion alone should buck the negative trend is no coincidence, for modern food books are there to answer metaphysical or "lifestyle" rather than culinary aspirations, and celebrity chefs themselves are the gurus of the age.

It is not in our day considered a sign of serious emotional derangement to announce publicly that "chocolate mousse remains the thing I feel most strongly about", or to boast that dining with celebrities on the last night of Ferran Adrià's restaurant elBulli, in Spain, "made me cry". It is, rather, the mark of a Yahoo not to be able and ready at any social gathering to converse in excruciating detail and at interminable length about food. Food is not only a safe "passion" (in the tellingly etiolated modern sense of "passion" that just means liking something a lot); it has become an obligatory one. The unexamined meal, as a pair of pioneer modern "foodies" wrote in the 1980s, is not worth eating. Most cannily, the department of philosophy at the University of North Texas announced in 2011 its "Philosophy of Food Project", no doubt having noticed which way the wind was blowing, and presumably hoping that it would be able to trick food-obsessives into hard thinking about other topics. One can of course think philosophically about food, as about anything at all, but that is not what is going on in our mainstream gastroculture.

Where will it all end? Is there any communication or entertainment or social format that has not yet been commandeered by the ravenous gastrimarge for his own gluttonous purpose? Does our cultural "food madness", as the New York Times columnist Frank Rich suggests, tip into "food psychosis"? Might it not, after all, be a good idea to worry more about what we put into our minds than what we put into our mouths?

by Steven Poole, The Guardian |  Read more:
Photo: Alamy

Thursday, October 11, 2012

The Neuroscience of Stage Fright

Public speaking is one of our most common fears, topping flying, financial ruin, sickness, and even death. The fear can get so bad that people become physically ill before getting on stage. But this fear — often called performance anxiety or stage fright — extends beyond the pressure to perform in the moment. It's about the underlying social psychology of exposing oneself to an audience. It's this vulnerability that sets off an entire cascade of physiological processes throughout the body in a defense mechanism that at one time served an important evolutionary purpose.

Understanding the science of stage fright can also help ease the fear.

A common fear

Back in 2007, I gave a talk at a futurist conference in Chicago that featured such speakers as Ray Kurzweil, William Shatner, and Peter Diamandis of XPrize fame. If this wasn't pressure enough, the day before my presentation I learned that one of my longtime heroes, cognitive scientist Marvin Minsky, was going to be in the audience. It was at this point that my body started to rebel against me; I broke out into a nasty rash, began vomiting, and contracted a rather unpleasant case of diarrhea. The next day, I stood on the stage and desperately fought back the urge to panic, delivering a presentation that was stilted, awkward, and completely uninspiring.

Sadly, this was typical for me back then. But this experience finally made me snap out of my denial: I have stage fright — and I have it bad. And I am hardly alone.

Celebrities with stage fright include Rod Stewart, Barbra Streisand, Mel Gibson, and Carol Burnett (who reportedly threw up before many of her performances). Many prominent athletes tend to suffer from it as well, including nervous hockey goalies and boxers who just can't seem to perform when everything's on the line.

Generalized anxiety

Stage fright is an emotional and physical response that is triggered in some people when they need to perform in front of an audience — or even an anticipated or perceived audience (such as standing in front of a camera).

While feelings of stress and anxiety are present during the actual performances themselves, individuals with stage fright often start to experience its effects days or weeks in advance (something that was particularly bad in my case). Consequently, stage fright is more than just a fear that's elicited during a performance — it's also very much about the lead-up. And in fact, for some, the performance itself can be a kind of cathartic release from the tension. (...)

Like most phobias, stage fright is a perfectly normal and even natural response to situations that are perceived to be dangerous or somehow detrimental. Psychologists who work with stage fright patients describe how their inner chatter tends to focus on those things that could go wrong during the performance and in the immediate aftermath of a potential failure. For people who have it quite bad, this can amount to a kind of neuroticism in which fears are exaggerated completely out of context — what psychologists call chronic catastrophizing.

And in fact, studies have shown that these fears can be driven by any number of personality traits, including perfectionism, an ongoing desire for personal control, fear of failure and success, and an intense anxiety about not being able to perform properly when the time comes (which can often serve as a self-fulfilling prophecy). Psychologists have also observed that people with stage fright tend to place a high value on being liked and regarded with high esteem.

Moreover, during the performance itself, individuals with stage fright tend to form a mental representation of their external appearance and behavior as they presume it's being seen by the audience. Consequently, they turn their focus onto themselves and interpret the audience's attention as a perceived threat.

by George Dvorsky, io9 |  Read more:
Photo: Clover/Shutterstock.com.

Frogs


[ed. Mo Yan winner of the 2012 Nobel Prize for Literature.]

...Aunty said she staggered out of the restaurant, headed to the hospital dormitory, but wound up in a marshy area on a narrow, winding path bordered on both sides by head-high reeds. Moonlight reflected on the water around her shimmered like glass. The croaks of toads and frogs sounded first on one side and then on the other, back and forth, like an antiphonal chorus. Then the croaks came at her from all sides at the same time, waves and waves of them merging to fill the sky. Suddenly, there was total silence, broken only by the chirping of insects. Aunty said that in all her years as a medical provider, traveling up and down remote paths late at night, she’d never once felt afraid. But that night she was terror-stricken. The croaking of frogs is often described in terms of drumbeats. But that night it sounded to her like human cries, almost as if thousands of newborn infants were crying. That had always been one of her favorite sounds, she said. For an obstetrician, no sound in the world approaches the soul-stirring music of a newborn baby’s cries. The liquor she’d drunk that night, she said, left her body as cold sweat. ‘Don’t assume I was drunk and hallucinating, because as soon as the liquor oozed out through my pores, leaving me with a slight headache, my mind was clear.’ As she walked down the muddy path, all she wanted was to escape that croaking. But how? No matter how hard she tried to get away, the chilling croak – croak – croak – sounds of aggrieved crying ensnared her from all sides. She tried to run, but couldn’t; the gummy surface of the path stuck to the soles of her shoes, and it was a chore even to lift up a foot, snapping the silvery threads that held her shoes to the surface of the path. But as soon as she stepped down, more threads were formed.

So she took off her shoes to walk in her bare feet, but that actually increased the grip of the mud. Aunty said she got down on her hands and knees, like an enormous frog, and began to crawl. Now the mud stuck to her knees and calves and hands, but she didn’t care, she just kept crawling. It was at that moment, she said, when an incalculable number of frogs hopped out of the dense curtain of reeds and from lily pads that shimmered in the moonlight. Some were jade green, others were golden yellow; some were as big as an electric iron, others as small as dates. The eyes of some were like nuggets of gold, those of others, red beans. They came upon her like ocean waves, enshrouding her with their angry croaks, and it felt as if all those mouths were pecking at her skin, that they had grown nails to scrape her skin. When they hopped onto her back, her neck, and her head, their weight sent her sprawling onto the muddy path. Her greatest fear, she said, came not from the constant pecking and scratching, but from the disgusting, unbearable sensation of their cold, slimy skin brushing against hers. ‘They covered me with urine, or maybe it was semen.’ She said she was suddenly reminded of a legend her grandmother had told her about a seducing frog: a maiden cooling herself on a riverbank one night fell asleep and dreamed of a liaison with a young man dressed in green. When she awoke she was pregnant and eventually gave birth to a nest of frogs. Given an explosion of energy by that terrifying image, she jumped to her feet and shed the frogs on her body like mud clods. But not all – some clung to her clothes and to her hair; two even hung by their mouths from the lobes of her ears, a pair of horrific earrings. As she took off running, she sensed that somehow the mud was losing its sucking power, and as she ran she shook her body and tore at her clothes and skin with both hands. She shrieked each time she caught one of the frogs, which she flung away. The two attached to her ears like suckling infants took some of the skin with them when she pulled them off.

by Mo Yan, Granta | Read more:
Photo: Polarjez

Felice Casorati. Italian (1883 - 1963)
via:

Marcin Maciejowski. Clothes, 2009. Oil on canvas, 140 x 160 cm.
via:

First in Line


They fill the sidewalks with tents and sleeping bags, transforming once pristine city blocks with their very presence, sharing thermoses of coffee and small hot meals.

They don’t care about the evening chill, or the stares of passersby, or the police. And the police don’t care about them. Because on that bright morning when the Apple store opens, they’ll roll up their blankets, strike their tents, and go home with a shiny new iPhone 5, as happy as clams and just as stupid.

To liberals of the 90s, Bill Gates was the symbol of both wealth and malevolence incarnate. Not only was he the richest man in the world, but his monolithic and monopolistic enterprise was based on a mediocre product with built-in buggy obsolescence. He didn’t innovate; instead he partnered with IBM, purchased DOS, and then exploited both. And through sheer business savvy, the narrative goes, Microsoft strong-armed the market despite a middling product, terrible customer service, and ruthless cost cutting.

But one man, one company, made a career (and cult) out of this “critique” of Bill Gates. (...)

Apple surpassed Microsoft as the most valuable tech company in 2010, but Jobs had long before eclipsed Bill Gates in the consumer’s CEO-aspirant imaginary. Benevolent Jobs, who died merely the 42nd wealthiest American, was worshipped by liberals with the same intensity that Gates was hated.  (...)


What’s really going on in these ads? It’s not exactly the classic hip/square dichotomy: John Hodgman is funny and charismatic, and there is some amount of mutual respect here. Although Hodgman is clearly a square, ‘Mac’ is not primarily a cool guy who rejects Hodgman’s identity. Instead, ‘Mac’ is unshaven, informally dressed, kinda average. The difference is not between the square who sells out and the cool guy who opts out, but rather the technocratic office worker and the precarious creative. Mac admits that PC is good at getting business done, but business is boring, and he’d rather be drinking a latte at the co-work space he shares with an industrial designer and a start-up architect. Wouldn’t you?

The ascension myth of Jobs over Gates and of Apple over Microsoft is a spectacular reflection and reenactment of the rise of post-Fordist precarious labor over the sort of middle class white-collardom historicized by C. Wright Mills. Gates was a man who rationalized the computer business. He ensured his software was packaged with outside manufacturers, who would then do the messy work of race-to-the-bottom competition for him, but would all carry the same (and same priced) Microsoft OS.

Microsoft made money on every shitty Dell or pimped out IBM, requiring almost no hardware overhead of their own. Once Windows had achieved a certain level of dominance it was impossible to make a cheap computer without either it or a high level of technical savvy (e.g., Linux). Microsoft won, as Žižek put it in a London Review of Books essay, when it had “imposed itself as an almost universal standard, practically monopolizing the field, as one embodiment of what Marx called the ‘general intellect,’ meaning collective knowledge in all its forms … Gates effectively privatized part of the general intellect.”

Since Apple couldn’t fight on the grounds of cost, it would compete on ease-of-use and, more fundamentally, “lifestyle”: it would fight for its patch of general intellect. Rather than infinite customizability, Apple would come bundled with iTunes, iMovie and Garage Band. Apple made the hardware, the software, and ultimately even the store you bought it in. Microsoft sold a product that was ubiquitous and fundamental, but Apple sold a whole line of products, an experience, a way of life. The narrative is well rehearsed.

Microsoft predominantly (and unabashedly) produces work tools: the Microsoft Office Suite remains its flagship product. The PC has always been clunky when it comes to media production and consumption (though less so than Apple and its legions want you to believe), and the graceful handling of these functions is what sets Apple apart. Of course, these “creative” fields of production are just as much work tools as Office; it’s just that your work is fun. You make music! You make movies! You’re not a slave!

And though gamers have always used PCs, they did so because you could upgrade video cards and processing speed and power as you needed — keeping up with the latest generation of games without replacing your computer — so that gamers ended up with machines whose internal functioning little resembled their office counterparts. A Mac, however, is a Mac, its functions largely black-box and proprietary. You don’t hack it and you don’t upgrade it, you just buy a new one.

Of course, it is in design and packaging, not computing, that Apple has really excelled. Other than its innovations in touch-screen technology and battery life (significant but outsourced achievements), Apple has offered little in the way of technical invention. What Apple does best is user interface and visual design, which, if you’re feeling generous, you can call a kind of beautiful craft: sewing the glove to exactly fit the hand while also grabbing the eye. But design, especially when it comes to the mass-produced consumer object, is really just the arty end of the marketing spectrum.

by Willie Osterweil, Jacobin |  Read more: