Friday, May 27, 2011

Welcome to the Summer Drinking Season

By Alex Balk

Remember how, back at the outset of last summer, you promised yourself that this time you weren't going to let the season go to waste? How you had such ambitious hopes and schemes? And how, seemingly seconds later, Labor Day rolled around and you were all, "Wait! What? I... d'oh!"

Well, that's alright. Summers are meant to be wasted. They are the disposable months of the year during which expectations are low and performance follows accordingly. You're sweaty, you're listless, you're not trying very hard... and it's okay. It's summer! Relax! And you know what? If you're going to waste your summer, you might as well spend your summer wasted. May I suggest a cocktail?

While some opt for the Dark and Stormy, and others plump for the Negroni (with unfortunate consequences), we take a different libation where I'm from. The Official Balk Family Drink of the Summer is, at heart, a gin and tonic, but with one important addition: a splash of Campari.

Too simple, you say? Ah, my friend, you obviously have not had The Official Balk Family Drink of the Summer. While there's nothing wrong with your basic gin and tonic, the splash of Campari... oh, the difference it makes. It is also an excellent test of manliness (if you are a man), since you have to be comfortable enough with yourself to be holding a pink drink throughout the afternoon and into the evening. I probably do not need to do this, but just in case your brain is as fried as mine is right now, here's how you make a gin and tonic with a splash of Campari:

Cowboys and Pit Crews

Atul Gawande delivered this year’s commencement address at Harvard Medical School.

In his book “The Youngest Science,” the great physician-writer Lewis Thomas described his internship at Boston City Hospital in pre-penicillin 1937. Hospital work, he observed, was mainly custodial. “If being in a hospital bed made a difference,” he said, “it was mostly the difference produced by warmth, shelter, and food, and attentive, friendly care, and the matchless skill of the nurses in providing these things. Whether you survived or not depended on the natural history of the disease itself. Medicine made little or no difference.”

That didn’t stop the interns from being, as he put it, “frantically busy.” He learned to focus on diagnosis—insuring nothing was missed, especially an illness with an actual, effective treatment. There were only a few. Lobar pneumonia could be treated with antiserum, an injection of rabbit antibodies against the pneumococcus, if the intern identified the subtype correctly. Patients in diabetic coma responded dramatically to animal-extracted insulin and intravenous fluid. Acute heart failure patients could be saved by bleeding away a pint of blood from an arm vein, administering a leaf-preparation of digitalis, and delivering oxygen by tent. Early syphilitic paresis sometimes responded to a mix of mercury, bismuth, and arsenic. Surgery could treat certain tumors and infections. Beyond that, medical capabilities didn’t extend much further.

The distance medicine has travelled in the couple of generations since is almost unfathomable for us today. We now have treatments for nearly all of the tens of thousands of diagnoses and conditions that afflict human beings. We have more than six thousand drugs and four thousand medical and surgical procedures, and you, the clinicians graduating today, will be legally permitted to provide them. Such capabilities cannot guarantee everyone a long and healthy life, but they can make it possible for most.

People worldwide want and deserve the benefits of your capabilities. Many fear they will be denied them, however, whether because of cost, availability, or incompetence of caregivers. We are now witnessing a global societal struggle to assure universal delivery of our know-how. We in medicine, however, have been slow to grasp why this is such a struggle, or how the volume of discovery has changed our work and responsibilities.

Read more:

Physics and the Immortality of the Soul

The topic of "life after death" raises disreputable connotations of past-life regression and haunted houses, but there are a large number of people in the world who believe in some form of persistence of the individual soul after life ends. Clearly this is an important question, one of the most important ones we can possibly think of in terms of relevance to human life. If science has something to say about it, we should all be interested in hearing it.

Adam Frank thinks that science has nothing to say about it. He advocates being “firmly agnostic” on the question. (His coblogger Alva Noë resolutely disagrees.) I have enormous respect for Adam; he’s a smart guy and a careful thinker. When we disagree, it’s with the kind of respectful dialogue that should be a model for disagreeing with non-crazy people. But here he couldn’t be more wrong.

Adam claims that "[there] simply is no controlled, experimental[ly] verifiable information" regarding life after death. By these standards, there is no controlled, experimentally verifiable information regarding whether the Moon is made of green cheese. Sure, we can take spectra of light reflecting from the Moon, and even send astronauts up there and bring samples back for analysis. But that’s only scratching the surface, as it were. What if the Moon is almost all green cheese, but is covered with a layer of dust a few meters thick? Can you really say that you know this isn’t true? Until you have actually examined every single cubic centimeter of the Moon’s interior, you don’t really have experimentally verifiable information, do you? So maybe agnosticism on the green-cheese issue is warranted. (Come up with all the information we actually do have about the Moon; I promise you I can fit it into the green-cheese hypothesis.)

Obviously this is completely crazy. Our conviction that green cheese makes up a negligible fraction of the Moon’s interior comes not from direct observation, but from the gross incompatibility of that idea with other things we think we know. Given what we do understand about rocks and planets and dairy products and the Solar System, it’s absurd to imagine that the Moon is made of green cheese. We know better.

We also know better for life after death, although people are much more reluctant to admit it. Admittedly, “direct” evidence one way or the other is hard to come by — all we have are a few legends and sketchy claims from unreliable witnesses with near-death experiences, plus a bucketload of wishful thinking. But surely it’s okay to take account of indirect evidence — namely, compatibility of the idea that some form of our individual soul survives death with other things we know about how the world works.

Claims that some form of consciousness persists after our bodies die and decay into their constituent atoms face one huge, insuperable obstacle: the laws of physics underlying everyday life are completely understood, and there’s no way within those laws to allow for the information stored in our brains to persist after we die. If you claim that some form of soul persists beyond death, what particles is that soul made of? What forces are holding it together? How does it interact with ordinary matter?
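
To give a flavor of what "the laws of physics underlying everyday life" means here, one can write down the piece most relevant to atoms, brains, and bodies: the Dirac equation for the electron field minimally coupled to electromagnetism. (This is a textbook form in natural units, a sketch of my own rather than a quotation from the post; sign conventions for the charge vary.)

\[
\left( i \gamma^\mu D_\mu - m_e \right) \psi_e = 0,
\qquad
D_\mu \equiv \partial_\mu + i e A_\mu ,
\]

where \(\psi_e\) is the electron field, \(A_\mu\) the electromagnetic field, \(m_e\) the electron's mass, and \(e\) its charge. The force of the argument is that a soul capable of pushing electrons around would have to appear as an extra, so-far-unobserved term in an equation like this.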

Read more: 

image credit:

Thursday, May 26, 2011

 
via:

Franz Kafka, Party Animal

by Joanna Kavenna

Fiction writing is hardly a glamorous profession. True, novelists avoid the timetables of office work and can cultivate eccentric habits. But if they are going to get anything done, they still have to spend hours of each day hunched over a desk. The tedium of the writer's life means biographers either have to bore their readers senseless or fashion a 'myth' - exaggerating picturesque elements of the writer's personality, embroidering anecdotes and, in the end, rendering the writer as a fictional character. For every major literary figure, there are dozens of myths flying around.

In Excavating Kafka, James Hawes tackles some of the myths that have built up around the writer. He suggests that Kafka is generally touted - both in 'popular culture' and in the worthy avenues of academe - as a gaunt, melancholy, saint-like type, staring out of blurred black-and-white photographs with anguished eyes. He was a man who ordered in his will that his works should be destroyed, who languished in obscurity throughout his lifetime, who was 'crushed by a dead-end bureaucratic job' and, equally, by a tyrannical father. This Kafka was an all-round seer who had no interest in the reception of his work, so preoccupied was he by his 'Kafkaesque' imagination. 'These are the building blocks of the K-myth,' writes Hawes in his introduction. 'Unfortunately, they are all rubbish.'

Hawes, a former academic who spent 10 years studying and teaching Kafka, insists that he was not a 'lonely Middle European Nostradamus'. Rather, he lived with his parents and was set up with a relatively cushy job (six hours a day for the equivalent of £58,000 today), leaving him plenty of time to write. Thanks to his literary connections, he won a major literary prize in his early thirties before even publishing a book. He was not tragically unrequited in his love affairs; nor was he virtually unknown in his lifetime ('we see him named three times in two entirely different articles in a single edition of the Prague Daily News in 1918'). Hawes even proposes that Kafka didn't really want his work to be burned after his death and knew full well that the loyal Max Brod would never do it.

Hawes's Kafka is a canny, funny, worldly man who liked to relax by socialising with his many friends, visiting the occasional prostitute - and reading porn. The fact that Kafka subscribed to two erotic journals is presented as a grand revelation: 'No one has ever shown his readers what we are about to see: Kafka's porn.' There follow some pretty weird pen-and-ink drawings, fin de siècle in style, although Hawes also admits that 'Kafka's porn is no real secret. The mystery is that it should seem like one.' This aspect of the book has caused a furious row to erupt among German-speaking Kafka scholars, with several accusing Hawes of sensationalisation, prudishness and even anti-Semitism.

Read more: 

Ankle Biters

by Pascal-Emmanuel Gobry

Network effects. Perhaps no other phrase can get a VC's pulse higher.

They're the holy grail of online business.

What is a network effect? It's what happens when the value of a product to one user depends on how many other users there are, as economists Carl Shapiro and Hal Varian put it.

Examples include Microsoft Windows and the phone network. Windows is valuable because most other software is made for Windows, which makes more people buy Windows, which makes more developers build their apps for Windows, and so on and so forth in a virtuous circle.
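
To make the "virtuous circle" concrete, here is a toy sketch (an illustration of the idea, not something from the article): one common way to formalize a network effect is Metcalfe-style growth, where a product's total value scales with the number of possible user-to-user connections, n(n-1)/2, so the value each individual user sees keeps climbing as the network grows.

    # Toy model of a Metcalfe-style network effect (illustrative assumption only;
    # real products don't price value this neatly).
    def network_value(users, value_per_connection=0.01):
        """Total value if every pair of users could potentially interact."""
        connections = users * (users - 1) // 2
        return connections * value_per_connection

    for n in (10, 100, 1_000, 10_000):
        total = network_value(n)
        print(f"{n:>6} users -> total value {total:>12,.1f}, value per user {total / n:.2f}")

Double the users and the value per user roughly doubles too, which is why incumbents with big networks look so hard to attack head-on.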

Network economics gets people's hearts racing online for two reasons: first, because the internet is at its base a communications network, network effects tend to arise there more often; and second, a network effect is one of the few strong barriers to entry in a market that otherwise has so few. Or is it?

It's taken as a given that network effects online are a magical barrier to entry, but are they?

One reason Facebook is so valuable, we're told, is because of its huge network effects which make it unstoppable and undefeatable.

But Friendster had network effects. So did MySpace. Founding Facebook President Sean Parker says MySpace lost to Facebook because of MySpace's own gross incompetence. Fair enough.

Except that's not the only way network effects businesses can lose. One way that network effects can be defeated is through what we'll call "verticalization."

Craigslist is perhaps one of the best network effects businesses: everyone goes there because everyone is already there. Plenty of people have pointed out how awful Craigslist's design can be and how many things are wrong with it, and yet the well-funded startups that have tried to take Craigslist on frontally with slicker offerings have foundered.

And yet... And yet, Craigslist's traffic seems to be plateauing. Why? This graphic by VC Andrew Parker shows why:

While no service has been able to defeat Craigslist head-on, plenty have built "niches" in specific verticals, with a more tailored offering, and now Craigslist seems to be stalling.

And some of these "niches" are big: Etsy, AirBnB and Ashley Madison are huge businesses.

Could the same thing happen to Facebook? We would argue it already is.

Steves

Steve Jobs

Steve Martin

Steve Spielberg

via: here, here and here

Friday Book Club - Readers of the Pack: American Best-Selling

by Ruth Franklin

In "Making the List," his 2001 book about best sellers, former Simon & Schuster editor in chief Michael Korda recalls that the publishing house once commissioned a study of which books made the most money. After a detailed presentation, the consultant said to the editors, "Do you guys realize how much money the company would make if you only published best sellers?" He might as well have told them that they'd do better playing the lottery if they picked the right numbers. Trends come and go, but the best seller remains essentially serendipitous. An editor can be no more certain of finding the next one than a writer can be assured of writing it. "As a rule of thumb," writes John Sutherland, an English scholar who has studied the phenomenon, "what defines the bestseller is bestselling. Nothing else."

The term best seller has always been a misnomer. Fast seller would be more appropriate, since the pace of sales matters as much as the quantity. The first list of books "in order of demand" was created in 1895 by Harry Thurston Peck, editor of the trade magazine The Bookman. Publishers Weekly started its own list in 1912, but others were slow to follow: The New York Times did not create its best-seller list until 1942. Now, the Wall Street Journal and USA Today also compile national lists, and each of the major regional papers has its own—all generated in slightly different ways. The Times bases its list on sales reports from around four thousand booksellers, which it declines to name (a column by the paper's public editor a few years ago said only that they change constantly). The Wall Street Journal used to track only sales in major chain stores but now bases its rankings on data from Nielsen BookScan, an authoritative industry source that includes as many as three-quarters of the nation's bookstores, around eleven thousand. IndieBound surveys only independent bookstores. Amazon.com offers its own list, updated every hour, but—like all the others—it is based on orders, not actual sales (since returns are not taken into account). Thus a writer with a carefully timed marketing blitz can push his book to a relatively high Amazon ranking for a day or so, allowing him to claim that it was, say, a "top ten Amazon best seller." The system's vulnerability to manipulation has resulted in the perception that, as Eliza Truitt wrote in Slate, the term best seller on the cover of a book means "about as much as the phrase 'original recipe' does on a jar of spaghetti sauce."

From the start, Peck seems to have had mixed feelings about the arbitrariness of the mechanism he had chosen to anoint books. "The period during which a popular novel enjoys favor is growing shorter all the time nowadays," he wrote in 1902, lamenting "the flood of fiction that is being placed upon the market and vigorously promoted practically every month in the year." While there has never been a defined threshold for making it onto the list—there is no guarantee that a book will be a top ten best seller if it sells fifty thousand copies, one hundred thousand, or even five hundred thousand—both the level and the pace of sales have increased exponentially. (For the sake of simplicity, the statistics in this essay are drawn mainly from the annual ranking of hardcover fiction by Publishers Weekly, which is the most comprehensive historical source.) During the list's first few decades, No. 1 best sellers typically sold about a quarter million copies in the first year after their release. The first superseller, the picaresque novel Anthony Adverse by Hervey Allen (1933), sold six hundred thousand copies over its first four years. Its record was promptly beaten by Margaret Mitchell's Gone with the Wind (1936), the first book to sell one million copies in a single year. In 1956, Peyton Place by Grace Metalious—still one of the best-selling novels of all time—sold sixty thousand copies within ten days of its publication: It was at the top of the New York Times best-seller list for fifty-nine weeks. Now, each of the top five novels easily sells one million copies in hardcover. The best-selling novel of 2010, The Girl Who Kicked the Hornet's Nest by Swedish crime writer Stieg Larsson, sold nearly two million copies last year.

No possible generalization can be made regarding the 1,150 books that have appeared in the top ten of the fiction best-seller list since its inception. There are literary novels by Virginia Woolf, Ernest Hemingway, Simone de Beauvoir, J. D. Salinger, Saul Bellow, and John Updike. There are social-problem novels, such as Upton Sinclair's The Jungle (1906) and John Steinbeck's The Grapes of Wrath (1939). There are war novels: Erich Maria Remarque's All Quiet on the Western Front (one of the few German novels ever to make the list, in 1929), The Naked and the Dead (Norman Mailer, 1948), From Here to Eternity (James Jones, 1951). There are religious novels ranging from Lloyd C. Douglas's The Robe (1942) and Leon Uris's Exodus (1959) to Jonathan Livingston Seagull, Richard Bach's 1970 allegory about a bird who yearns for a higher plane of existence. There are westerns by Owen Wister (The Virginian, 1902) and Zane Grey (who published nearly a novel a year from 1915 to 1924). There are sex novels: Kathleen Winsor's Forever Amber (1944), with the tag line "Adultery's no crime—it's an amusement"; Peyton Place, which graphically depicts rape and teenage sex; and Jacqueline Susann's Valley of the Dolls (1966), in which sex comes in second to tranquilizers as a source of pleasure. There are horror novels, with Rosemary's Baby (Ira Levin, 1967) and The Exorcist (William P. Blatty, 1971) paving the way for Stephen King's current domination of the field. There is spy fiction and science fiction and—currently the most popular genre—crime fiction. "The bestseller list, from day one, has always represented a reliable mixture of the good and the bad, of quality and trash," Korda writes.

Read more:

Fantastic Voyage

Wars, Tax Cuts and Bankers


As of May 6, America's total national debt stood at $14.32 trillion. That somewhat scary number has lots of voters nervous about America's apparent fiscal irresponsibility, and that in turn has resulted in the farcical game of chicken being played over the debt ceiling.

In that context, it's important to realize how we got into this hole in the first place. The chart above, making its rounds on the political blogs this week, is pretty clear. The debt-financed Bush-era tax cuts were the biggest single contributor to our current shortfall. The wars in Afghanistan and Iraq account for a hefty chunk, too.

Despite that reality, you still have the people who claim to be most worried about America's money-management problem arguing for extending tax cuts for the highest earners as the solution because, you know, some loose change will eventually fall off those mountains of money they're making.

via:

What Makes a Great Album Cover

by Molly Tuttle

When I met Simone Rubi in 1999, she was living in Oakland, Calif., singing in a popular band and working as a graphic designer for ESPRIT. Simone immediately won me over with her delight and appreciation for the design of the simple things in life -- a redwood tree, a tiny mushroom, a perfect wave, hand-knit slippers -- and her ability to ignite others with her enthusiasm and heartfelt propaganda. Over the past decade, I have watched with a smile as my friend has traveled the world, arriving in each town like a magnetic Pied Piper, luring together musicians and artists to participate in her never-ending lifestyle of artistic collaboration and celebration of good times.

Simone Rubi. Photo by Mary Rozzi

In 2007, Simone designed the cover for Feist's Grammy-nominated masterpiece, "The Reminder."


For me, this is one of the greatest album covers of all time. In the same spirit as Joni Mitchell's cover for "Ladies of the Canyon," the image captures a woman at a particular point in her life, without hitting you over the head with a glamorous beauty shot. The elegant silhouette (shot by Mary Rozzi), the hand-crafted typeface, the sparse yet perfectly executed use of color. The sum of all those parts is one single image that visually exudes the soul of the brilliant collection of songs on the album.

Joni Mitchell's cover art for "Ladies of the Canyon"

In the few years since "The Reminder" was released, I have noticed other album covers with a similar feel, and I can't help thinking that other designers have been influenced by the work of these inspired ladies. See below.

Read more:

Save vs. Save As


by Jonah Lehrer

My episodic memory stinks. All my birthday parties are a blur of cake and presents. I’m notorious within my family for confusing the events of my own childhood with those of my siblings. I’m like the anti-Proust.

And yet, I have this one cinematic memory from high school. I’m sitting at a Friday night football game (which, somewhat mysteriously, has come to resemble the Texas set of Friday Night Lights), watching the North Hollywood Huskies lose yet another game. I’m up in the last row of the bleachers with a bunch of friends, laughing, gossiping, dishing on AP tests. You know, the usual banter of freaks and geeks. But here is the crucial detail: In my autobiographical memory, we are all drinking from those slender glass bottles of Coca-Cola (the vintage kind), enjoying our swigs of sugary caffeine. Although I can’t remember much else about the night, I can vividly remember those sodas: the feel of the drink, the tang of the cola, the constant need to suppress burps.

It’s an admittedly odd detail for an otherwise logo-free scene, as if Coke had paid for product placement in my brain. What makes it even more puzzling is that I know it didn’t happen, that there is no way we could have been drinking soda from glass bottles. Why not? Because the school banned glass containers. Unless I was willing to brazenly break the rules — and I was way too nerdy for that — I would have almost certainly been guzzling Coke from a big white styrofoam container, purchased for a dollar from the concession stand. It’s a less romantic image, for sure.

So where did this sentimental scene starring soda come from? My guess is a Coca-Cola ad, one of those lavishly produced clips in which the entire town is at the big football game and everyone is clean cut, good looking and holding a tasty Coke product. (You can find these stirring clips on YouTube.) The soda maker has long focused on such ads, in which the marketing message is less about the virtues of the product (who cares if Coke tastes better than Pepsi?) and more about associating the drink with a set of intensely pleasurable memories.

A new study, published in The Journal of Consumer Research, helps explain both the success of this marketing strategy and my flawed nostalgia for Coke. It turns out that vivid commercials are incredibly good at tricking the hippocampus (a center of long-term memory in the brain) into believing that the scene we just watched on television actually happened. And it happened to us.

Read more:

Falling Comet

by Michael Hall

In the last desperate months of his life, he would come into the restaurant at all hours of the day and take a seat, sometimes at the counter and other times in one of the back booths. He was always alone. He wore a scruffy ball cap, and behind his large, square glasses there was something odd about his eyes. They didn’t always move together. Barbara Billnitzer, one of the waitresses, would bring him a menu and ask how he was doing. “Just fine,” he’d say, and they would chat about the traffic and the weather, which was always warm in South Texas, even in January. He’d order coffee—black—and sometimes a sandwich, maybe turkey with mayo. Then he’d light up a Pall Mall and look out the window or stare off into space. Soon he was lost in thought, looking like any other 55-year-old man passing the time in a Sambo’s on Tyler Street in downtown Harlingen. He had moved there with his family five years before, in 1976. It was a perfect place for a guy who wanted to get away from it all. And he had a lot to get away from. Twenty-five years before, just about everyone in the Western world had known his face. In fact, for a period of time in the mid-fifties, he had been the most popular entertainer on the planet. He had sold tens of millions of records. He had caused riots. He had headlined shows with a young opening act named Elvis Presley and had inspired John Lennon to pick up the guitar. He had changed the world.

After ten minutes or so Billnitzer would bring him his food. But usually he was thinking about something, so he ignored it. After a while, though, he’d start to shift in his seat and look around. And then he’d start to hum. Billnitzer, refilling his coffee cup, knew the tune—everybody knew that tune. It was “(We’re Gonna) Rock Around the Clock,” the best-selling rock song of all time. She smiled, because she knew what he was doing. He was giving people around him clues. He wanted people to hear him and say, “You’re Bill Haley, aren’t you?”

But they rarely did. His ball cap covered his famous spit curl, and his glasses covered much of his face. So eventually he would turn to the person next to him or even rise and walk over to a nearby table. The patrons would look up at the tall stranger looming over them. “You know who I am?” he’d ask. “I’m Bill Haley.” Then he’d take off the cap and they’d see the curl, and he’d pull out his driver’s license and they’d see his name. Sure enough, there it was: William John Clifton Haley.

He wouldn’t say much beyond that. Some of the customers tried to get to know him, asking simple coffee shop questions such as “How are you doing?” But Haley didn’t seem to be listening. He’d respond in a rambling fashion. Maybe he’d talk about a show he’d done in London back in the sixties or about Rudy Pompilli, his longtime sax player and best friend, who’d died in 1976. He missed Rudy.

Haley appreciated the company in Sambo’s—one time he left a $100 tip for a quiet waitress who could barely speak English. But usually he slipped out without saying a word of goodbye. And though he was mostly a genial customer, he could be volatile. “Once,” remembers Billnitzer, “our busboy Woody said something to him like, ‘Hey, Mr. Haley, how are you?’ and Bill got real upset, threw down his money, and stomped out.”

Haley would get in his Lincoln Continental and drive off. Sometimes he went to the Hop Shop, a bar on South Seventh Street, or Richard’s, a restaurant and bar on south Highway 77, to drink. He liked Scotch—Johnnie Walker Red was his brand. Sometimes he’d drink too much and get back in his car. Occasionally the police, who knew him well, would stop him and take him to jail. If he made it home, he’d stumble to the little pool house out back while his wife and three children slept in the main house. He’d pick up the phone and start calling people he knew from long ago: ex-wives, sons, producers, promoters, band members. He’d tell stories. He’d cry. He’d ramble. Then he’d hang up and call someone else. He felt so isolated out in that room, millions of miles from his past.

Read more:

Less Than Zero

by Roger D. Hodge

The most telling characteristics of a society are often those that pass unnoticed. No one pays much attention to interns, for instance, yet the simple fact that at any given time hundreds of thousands of jobs are being performed for little or no pay is surely an important development in our political economy. Perhaps it says something about the value we place on work. According to Ross Perlin, the author of Intern Nation, the rise of this relatively new employment category, which is taken for granted by everyone from the antiunion governor of Wisconsin to the managers of Barack Obama’s reelection campaign, is a clear indication of the decline of labor rights in the United States.

Definitions of what exactly constitutes an internship vary widely. Are interns trainees, temps, apprentices, servants? Since the rules are vague or at least unenforced, employers simply fill in the blank with whatever tasks need doing, and interns often end up stuffing envelopes, fetching coffee, answering the phone, or collecting the boss’s dry cleaning. Not all their work is trivial, of course, and some internships offer useful training, but it is safe to say that vast numbers of interns are condemned to performing the mundane, vaguely humiliating chores that are the necessary if despised conditions of life in the white-collar world of work to which so many young people aspire. Far from providing an educational benefit or vocational training, internships have simply become, for many businesses, a convenient means of minimizing labor costs.

The College Employment Research Institute estimates that 75 percent of college students do at least one internship before graduation. The summer and part-time jobs that once occupied our otherwise idle youth have gone the way of the typewriter; nowadays, interns are everywhere, in publishing, merchandising, insurance, finance, consulting, law, engineering, and the defense industry. It seems that most large corporations pay their interns, but the number of unpaid jobs in the economy is booming. A recent article in Fortune magazine suggests that working for free might even be the “new normal,” a “wave of the future in human resources.” Based on his reporting, Perlin estimates that one to two million Americans work as interns every year, though he suspects that this number might be on the low end. Most interns are students or recent graduates, and large numbers, perhaps 50 percent overall, work for free. Worse, many actually pay tuition for the privilege of working, as a result of the common misconception on the part of both universities and employers that the bestowal of academic credit somehow nullifies the strictures of the Fair Labor Standards Act (FLSA) of 1938, which prohibits uncompensated labor except under carefully defined circumstances. Academic programs, both undergraduate and graduate, have increasingly adopted the internship as a degree requirement. Such requirements foster an economy of scarcity among the most prestigious internship programs, which like everything else in our capitalist democracy increasingly resemble commodities. Highly coveted internships at places like Vogue magazine have recently been auctioned off for as much as $42,500; Perlin notes the irony that this obscene sum was raised for the benefit of the Robert F. Kennedy Center for Justice and Human Rights. Apparently, no one was troubled by the contradiction.

Read more:

That Which Does Not Kill Me Makes Me Stranger

by Daniel Coyle

Jure Robic, the Slovene soldier who might be the world’s best ultra-endurance athlete, lives in a small fifth-floor apartment near the railroad tracks in the town of Koroska Bela. By nature and vocation, Robic is a sober-minded person, but when he appears at his doorway, he is smiling. Not a standard-issue smile, but a wild and fidgety grin, as if he were trying to contain some huge and mysterious secret.

Robic catches himself, strides inside and proceeds to lead a swift tour of his spare, well-kept apartment. Here is his kitchen. Here is his bike. Here are his wife, Petra, and year-old son, Nal. Here, on the coffee table, are whiskey, Jägermeister, bread, chocolate, prosciutto and an inky, vegetable-based soft drink he calls Communist Coca-Cola, left over from the old days. And here, outside the window, veiled by the nightly ice fog, stand the Alps and the Austrian border. Robic shows everything, then settles onto the couch. It’s only then that the smile reappears, more nervous this time, as he pulls out a DVD and prepares to reveal the unique talent that sets him apart from the rest of the world: his insanity.

Tonight, Robic’s insanity exists only in digitally recorded form, but the rest of the time it swirls moodily around him, his personal batch of ice fog. Citizens of Slovenia, a tiny, sports-happy country that was part of the former Yugoslavia until 1991, might glow with beatific pride at the success of their ski jumpers and handballers, but they tend to become a touch unsettled when discussing Robic, who for the past two years has dominated ultracycling’s hardest, longest races. They are proud of their man, certainly, and the way he can ride thousands of miles with barely a rest. But they’re also a little, well, concerned. Friends and colleagues tend to sidle together out of Robic’s earshot and whisper in urgent, hospital-corridor tones.

“He pushes himself into madness,” says Tomaz Kovsca, a journalist for Slovene television. “He pushes too far.” Rajko Petek, a 35-year-old fellow soldier and friend who is on Robic’s support crew, says: “What Jure does is frightening. Sometimes during races he gets off his bike and walks toward us in the follow car, very angry.”

What do you do then?

Petek glances carefully at Robic, standing a few yards off. “We lock the doors,” he whispers.

Read more:

Wednesday, May 25, 2011

Building Blocks by Kumi Yamashita