Monday, August 6, 2012

Living, Thinking Houses

In the sizzling summer heat I’ve been thinking about igloos. To chill out in, of course, but also because I admire their elemental simplicity. The Inuit traditionally used bone knives to carve bricks from quarries of hardened snow. A short, low tunnel led to the front door, trapping heat in and keeping out fierce cold and critters. Mortar wasn’t needed, because the snow bricks were shaved to fit, and at night the dome ossified into a glistening ice fort. The human warmth inside melted the ice just enough to seal the seams.

The idea behind such homes was refuge from elements and predators, based on a watchful understanding of both. The igloo was really an extension of the self — shoulder blades of snow and backbone of ice, beneath which a family slept, swathed in thick animal fur, beside one or two small lamps burning blubber. All the building materials lay at hand, perpetually recycled, costing nothing but effort.

Picture most of our houses and apartment buildings today — full of sharp angles, lighted by bulbs and colors one doesn’t find in nature, built from plywood, linoleum, iron, cement and glass. Despite their style, efficiency and maybe good location, they don’t always offer us a sense of sanctuary, rest or well-being. Because we can’t escape our ancient hunger to live close to nature, we encircle the house with lawns and gardens, install picture windows, adopt pets and Boston ferns, and scent everything that touches our lives.

This tradition of doing and undoing doesn’t really make sense or promote healthy living or a sustainable planet, so there’s an impassioned trend worldwide toward building green cities with living walls and roofs and urban farms in skyscrapers. Referring to “the north 40” would mean crops 40 floors up. In such a cityscape, the line blurs between indoor and outdoor.

Vertical gardens and living roofs are sprouting up everywhere. Mexico City’s three eco-sculptures, carpeted in over 50,000 plants, tower above car-clogged avenues. A blooming tapestry of plants adorns the exterior walls of the Quai Branly Museum in Paris. Inside Lisbon’s Dolce Vita shopping center, a plush vertical meadow undulates. In Milan’s Café Trussardi, diners and flâneurs sit in a glass-box courtyard beneath a hint of heaven: a vibrant cloud of frizzy greens, cascading vines and flowers. The Plant, an old meatpacking building in Chicago, has morphed into an eco farm, home to tilapia fish breeders, mushroom gardeners and hydroponically grown vegetables. Xero Flor America, based in North Carolina, has already sold 1.2 million square feet of living roofs.

by Diane Ackerman, NY Times | Read more:
Photo: Rodrigo Cruz

Sunday, August 5, 2012

Sean Costello


[ed. Repost. Check out the music archives.]

How To Do What You Love

To do something well you have to like it. That idea is not exactly novel. We've got it down to four words: "Do what you love." But it's not enough just to tell people that. Doing what you love is complicated.

The very idea is foreign to what most of us learn as kids. When I was a kid, it seemed as if work and fun were opposites by definition. Life had two states: some of the time adults were making you do things, and that was called work; the rest of the time you could do what you wanted, and that was called playing. Occasionally the things adults made you do were fun, just as, occasionally, playing wasn't—for example, if you fell and hurt yourself. But except for these few anomalous cases, work was pretty much defined as not-fun.

And it did not seem to be an accident. School, it was implied, was tedious because it was preparation for grownup work.

The world then was divided into two groups, grownups and kids. Grownups, like some kind of cursed race, had to work. Kids didn't, but they did have to go to school, which was a dilute version of work meant to prepare us for the real thing. Much as we disliked school, the grownups all agreed that grownup work was worse, and that we had it easy.

Teachers in particular all seemed to believe implicitly that work was not fun. Which is not surprising: work wasn't fun for most of them. Why did we have to memorize state capitals instead of playing dodgeball? For the same reason they had to watch over a bunch of kids instead of lying on a beach. You couldn't just do what you wanted.

I'm not saying we should let little kids do whatever they want. They may have to be made to work on certain things. But if we make kids work on dull stuff, it might be wise to tell them that tediousness is not the defining quality of work, and indeed that the reason they have to work on dull stuff now is so they can work on more interesting stuff later. [1]

Once, when I was about 9 or 10, my father told me I could be whatever I wanted when I grew up, so long as I enjoyed it. I remember that precisely because it seemed so anomalous. It was like being told to use dry water. Whatever I thought he meant, I didn't think he meant work could literally be fun—fun like playing. It took me years to grasp that.

Jobs

By high school, the prospect of an actual job was on the horizon. Adults would sometimes come to speak to us about their work, or we would go to see them at work. It was always understood that they enjoyed what they did. In retrospect I think one may have: the private jet pilot. But I don't think the bank manager really did.

The main reason they all acted as if they enjoyed their work was presumably the upper-middle class convention that you're supposed to. It would not merely be bad for your career to say that you despised your job, but a social faux pas.

Why is it conventional to pretend to like what you do? The first sentence of this essay explains that. If you have to like something to do it well, then the most successful people will all like what they do. That's where the upper-middle class tradition comes from. Just as houses all over America are full of chairs that are, without the owners even knowing it, nth-degree imitations of chairs designed 250 years ago for French kings, conventional attitudes about work are, without the owners even knowing it, nth-degree imitations of the attitudes of people who've done great things.

What a recipe for alienation. By the time they reach an age to think about what they'd like to do, most kids have been thoroughly misled about the idea of loving one's work. School has trained them to regard work as an unpleasant duty. Having a job is said to be even more onerous than schoolwork. And yet all the adults claim to like what they do. You can't blame kids for thinking "I am not like these people; I am not suited to this world."

Actually they've been told three lies: the stuff they've been taught to regard as work in school is not real work; grownup work is not (necessarily) worse than schoolwork; and many of the adults around them are lying when they say they like what they do.

The most dangerous liars can be the kids' own parents. If you take a boring job to give your family a high standard of living, as so many people do, you risk infecting your kids with the idea that work is boring. [2] Maybe it would be better for kids in this one case if parents were not so unselfish. A parent who set an example of loving their work might help their kids more than an expensive house. [3]

It was not till I was in college that the idea of work finally broke free from the idea of making a living. Then the important question became not how to make money, but what to work on. Ideally these coincided, but some spectacular boundary cases (like Einstein in the patent office) proved they weren't identical.

The definition of work was now to make some original contribution to the world, and in the process not to starve. But after the habit of so many years my idea of work still included a large component of pain. Work still seemed to require discipline, because only hard problems yielded grand results, and hard problems couldn't literally be fun. Surely one had to force oneself to work on them.

If you think something's supposed to hurt, you're less likely to notice if you're doing it wrong. That about sums up my experience of graduate school.

Bounds

How much are you supposed to like what you do? Unless you know that, you don't know when to stop searching. And if, like most people, you underestimate it, you'll tend to stop searching too early. You'll end up doing something chosen for you by your parents, or the desire to make money, or prestige—or sheer inertia.

Here's an upper bound: Do what you love doesn't mean, do what you would like to do most this second. Even Einstein probably had moments when he wanted to have a cup of coffee, but told himself he ought to finish what he was working on first.

It used to perplex me when I read about people who liked what they did so much that there was nothing they'd rather do. There didn't seem to be any sort of work I liked that much. If I had a choice of (a) spending the next hour working on something or (b) being teleported to Rome and spending the next hour wandering about, was there any sort of work I'd prefer? Honestly, no.

But the fact is, almost anyone would rather, at any given moment, float about in the Caribbean, or have sex, or eat some delicious food, than work on hard problems. The rule about doing what you love assumes a certain length of time. It doesn't mean, do what will make you happiest this second, but what will make you happiest over some longer period, like a week or a month.

Unproductive pleasures pall eventually. After a while you get tired of lying on the beach. If you want to stay happy, you have to do something.

As a lower bound, you have to like your work more than any unproductive pleasure. You have to like what you do enough that the concept of "spare time" seems mistaken. Which is not to say you have to spend all your time working. You can only work so much before you get tired and start to screw up. Then you want to do something else—even something mindless. But you don't regard this time as the prize and the time you spend working as the pain you endure to earn it.

I put the lower bound there for practical reasons. If your work is not your favorite thing to do, you'll have terrible problems with procrastination. You'll have to force yourself to work, and when you resort to that the results are distinctly inferior.

To be happy I think you have to be doing something you not only enjoy, but admire. You have to be able to say, at the end, wow, that's pretty cool. This doesn't mean you have to make something. If you learn how to hang glide, or to speak a foreign language fluently, that will be enough to make you say, for a while at least, wow, that's pretty cool. What there has to be is a test.

So one thing that falls just short of the standard, I think, is reading books. Except for some books in math and the hard sciences, there's no test of how well you've read a book, and that's why merely reading books doesn't quite feel like work. You have to do something with what you've read to feel productive.


Passion, vigor, disappointments…
Tango, more than a dance, is a way of life.

- Pillole di Tango
via:

Grey Area: How ‘Fifty Shades’ Dominated the Market

By late May, more than ten million copies of E.L. James’s Fifty Shades trilogy, an erotic romance series about the sexual exploits of a domineering billionaire and an inexperienced coed, had been sold in the United States, all within six weeks of the books’ publication here. This apparently unprecedented achievement occurred without the benefit of a publicity campaign, formal reviews, or Oprah’s blessing, owing to a reputation established, as one industry analyst put it, “totally through word of mouth.”

It’s not news that “word of mouth” has become a business model in the book industry. But E.L. James, a forty-nine-year-old former television executive from West London whose real name is Erika Leonard, has exceeded the sales feats of previous reader-discovered authors by such a staggering magnitude that she is in a category of her own. Last year’s breakout success, Amanda Hocking, sold merely a million copies of her self-published young-adult novels over the course of eleven months before signing a four-book deal with St. Martin’s Press.

The crucial difference may have less to do with talent, content, or luck than with a peculiarity of Leonard’s early readership: her work originated as fan fiction, a genre that operates outside the bounds of literary commerce, in online networks of enthusiasts of popular books and movies, brought together by a desire to write and read stories inspired by those works. Leonard’s excursion in the genre provided her with a captive audience of thousands of positively disposed readers, creating a market for her books before they ever carried price tags. But fan fiction is inherently collaborative and by convention resolutely anti-commercial, attributes which make its role in the evolution of her work both highly unusual and ethically fraught.

Beginning in 2009, Leonard posted, under a different title, a version of the Fifty Shades trilogy on a well-trafficked fan-fiction forum devoted to the Twilight series, the vampire-themed romance blockbusters by Stephenie Meyer. Leonard’s “TwiFic” shed Meyer’s supernatural story line and transposed the largely chaste love story of her protagonists, Edward and Bella, into a sexually explicit register. Like many fan-fiction writers, Leonard uploaded her work in serial installments, a method that enables readers to weigh in as the story progresses and allows writers to incorporate feedback as they go. Writers also read one another’s fan fictions and can infer, from the number and tenor of reader responses, what kinds of stories are popular. Leonard’s story reportedly received more than 37,000 reviews, and was read by untold thousands more who did not post reviews.

Early in 2011, after amending the work and expunging all traces of its connection to Twilight, she contracted with a small Australian press to publish it as the Fifty Shades trilogy, in ebook and print-on-demand paperback formats. By March of this year, when Vintage acquired the rights to the trilogy for more than a million dollars, all three books were at or near the top of The New York Times’ combined print and ebook bestseller list.

The vast majority of Fifty Shades’s readers are presumed to be women, as are the vast majority of fan-fiction producers and consumers, and anecdotal evidence suggests that, at least initially, there was much overlap between these groups. At Goodreads.com, a book-recommendation site with more than nine million members, readers began reviewing Fifty Shades of Grey, the first book in the trilogy, in the spring of 2011, many noting that they had first encountered the story in its fan-fiction incarnation. “I loved this story as a fanfic and the characters have stolen my heart all over again!” wrote a reader named Ashley, who, along with more than 55,000 other Goodreads members, gave Fifty Shades of Grey the site’s highest rating, five stars.

Critics, by contrast, have found much to abhor about the work. Many have lamented Leonard’s “stilted,” “cliché-ridden” prose (a typical line: “my very small inner goddess sways in a gentle victorious samba”) and decried as retrograde the sexual mores of her (now renamed) protagonists: Anastasia (Ana) Steele, a willing but inexplicably chaste college senior at Washington State University, and Christian Grey, a buff but troubled Seattle mogul who seeks to enlist her as his sexual slave. Christian is partial to BDSM—an umbrella term encompassing the erotic practices of bondage and discipline, dominance and submission, sadism and masochism—and his penthouse includes a “playroom” kitted out with chains, shackles, whips, and other kinky toys. But before Ana can experience its exquisite tortures, she must sign a contract, devised by Christian’s lawyer, consenting to become his submissive (or “sub”), ceding control over her body, diet, hygiene, sleep, and wardrobe, and stipulating her tolerance for various sexual acts and accessories (manacles, hot wax, genital clamps, etc.).

Together, the Fifty Shades books run to more than fifteen hundred pages, many if not most of them sexually explicit. (One of Leonard’s few obvious talents as a writer is maintaining variety in the frequent couplings.) What Ana, who narrates the series, refers to as Christian’s “predilection” lends to the strenuous antics a transgressive frisson. Yet critics have noted that the erotic content breaks no new ground, citing equally explicit, better-written fare, in particular The Story of O, Anne Desclos’s indubitably literary portrayal of female sexual slavery, published in 1954. In a cover story for Newsweek, Katie Roiphe concluded that what’s “most alarming about the Fifty Shades of Grey phenomena, what gives it its true edge of desperation, and end-of-the-world ambiance, is that millions of otherwise intelligent women are willing to tolerate prose on this level.”

by Emily Eakin, New York Review of Books |  Read more:
Photo: Getty Images

Subway Stories by Jaya Suberg
I go to my lover…
via:

The Perfect Milk Machine: How Big Data Transformed the Dairy Industry


While there are more than 8 million Holstein dairy cows in the United States, there is exactly one bull that has been scientifically calculated to be the very best in the land. He goes by the name of Badger-Bluff Fanny Freddie.

Already, Badger-Bluff Fanny Freddie has 346 daughters who are on the books and thousands more that will be added to his progeny count when they start producing milk. This is quite a career for a young animal: He was only born in 2004.

There is a reason, of course, that the semen that Badger-Bluff Fanny Freddie produces has become such a hot commodity in what one artificial-insemination company calls "today's fast paced cattle semen market." In January of 2009, before he had a single daughter producing milk, the United States Department of Agriculture took a look at his lineage and more than 50,000 markers on his genome and declared him the best bull in the land. And, three years and 346 milk- and data-providing daughters later, it turns out that they were right.

"When Freddie [as he is known] had no daughter records our equations predicted from his DNA that he would be the best bull," USDA research geneticist Paul VanRaden emailed me with a detectable hint of pride. "Now he is the best progeny tested bull (as predicted)."

Data-driven predictions are responsible for a massive transformation of America's dairy cows. While other industries are just catching on to this whole "big data" thing, the animal sciences -- and dairy breeding in particular -- have been using large amounts of data since long before VanRaden was calculating the outsized genetic impact of the most sought-after bulls with a pencil and paper in the 1980s.
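The genomic evaluation the article describes, predicting a young bull's value from tens of thousands of DNA markers before any daughter records exist, is in essence a heavily regularized regression of a trait on genotype markers. Here is a minimal, self-contained sketch of that idea using ridge regression on synthetic data; all names, sizes, and numbers are hypothetical, and the USDA's actual models are far more sophisticated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: n bulls genotyped at m SNP markers (0, 1, or 2
# copies of an allele), with trait values derived from daughter records.
n_bulls, n_markers = 500, 2000
X = rng.integers(0, 3, size=(n_bulls, n_markers)).astype(float)
true_effects = rng.normal(0, 0.05, n_markers)        # small effect per marker
y = X @ true_effects + rng.normal(0, 1.0, n_bulls)   # observed trait values

# Ridge regression: with far more markers than bulls, shrinking each
# marker's estimated effect toward zero is what makes the fit usable.
lam = 10.0
beta = np.linalg.solve(X.T @ X + lam * np.eye(n_markers), X.T @ y)

# A "young bull" with no daughter records yet: predict from genotype alone.
young = rng.integers(0, 3, size=n_markers).astype(float)
predicted_value = young @ beta
print(predicted_value)
```

The point of the sketch is the workflow, not the model: estimate marker effects once from animals with records, then rank new animals the day they are genotyped, years before any milk is produced.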

by Alexis Madrigal, The Atlantic | Read more:
Photo via Reuters, h/t 3 Quarks Daily

Marcus Samuelsson, a Chef, a Brand and Then Some

Marcus Samuelsson, dapper in a Ralph Lauren tuxedo and patterned scarf, is working the celebrity-couture crowd at the Metropolitan Museum of Art.

It is a Monday evening, just around 7, and Mr. Samuelsson — hotshot chef, food impresario and kinetic force behind Red Rooster Harlem, one of Manhattan’s restaurants of the moment — is displaying his usual verve.

On the red carpet, he snaps a picture of his glamorous wife, the model and philanthropist Maya Haile, with Beyoncé. In the European sculpture gallery, he is chatting with Kanye West and several of the New York Knicks. At the Temple of Dendur, he is dining with André Balazs, the hotel owner, and Chelsea Handler.

The next morning at 10, Mr. Samuelsson, in a fresh shirt and tux trousers, is sitting in a sound studio some 60 blocks downtown, painstakingly recording the audio version of his new memoir, “Yes, Chef.” Six hours later, in a vintage, red velvet tuxedo jacket, he is overseeing an intimate dinner for 350 at Gotham Hall on behalf of Queen Silvia and Princess Madeleine of Sweden.

And the morning after that, Mr. Samuelsson is back uptown to work the lunch rush at Red Rooster, that culinary mosaic of Southern, Swedish and Ethiopian comforts on Lenox Avenue in Harlem.

Such is the wild and frenetic life of the modern celebrity chef, that strange amalgam of food savvy, marketing acumen and business skill all wrapped, in Mr. Samuelsson’s case, into a media-ready package. At 41, he has exploded not only onto New York’s food scene but also onto its cutthroat food business scene.

Yes, there is Red Rooster and five other restaurants. But there is more — much more. A forthcoming cookware collection for Macy’s. A new line of teas. Deals with American Airlines and MasterCard. Appearances on “Top Chef Masters,” “Chopped All-Stars” and “The Next Iron Chef.” Two Web sites, FoodRepublic.com and marcussamuelsson.com, not to mention four cookbooks and the memoir. His growing, multimillion-dollar enterprise stretches from New York to Chicago to California to Stockholm, and employs more than 700 people.

It is a time-tested recipe. Mr. Samuelsson is the figurative heir of Julia Child and Wolfgang Puck, but he is hardly the only one. Mario Batali, Todd English, Tom Colicchio, Alain Ducasse, Bobby Flay, Lidia Bastianich, David Burke — the list of marquee names in celebrity chefdom is long. Many of these chefs have built sprawling empires of restaurants — Mr. Batali has 25 — and have captured the popular imagination with TV shows, iPhone apps and assorted products.

The question for Mr. Samuelsson and other rising stars is how far and how fast they can push a personal brand. The more business ventures they start, the less they can personally control the quality. It is a quandary that any successful entrepreneur faces as a business grows.

“We constantly have to edit, curate, sift through our brand,” Mr. Samuelsson says. “Where is the stretch? Where is the perfect fit? Where does it make sense? You have to be a Baryshnikov.”

It is a challenge, but the financial rewards can be big. Successful restaurants in major cities can bring in $10 million a year, more for hot spots like Red Rooster. But chefs can easily double their income with endorsements, books, consulting jobs and just about anything emblazoned with their names.

Mr. Puck, who got the initial idea for a line of frozen foods from Johnny Carson, a regular at his Spago restaurant in the 1980s, today oversees a $400 million-a-year company. Some $30 million of that comes from consumer products like soups and sauces. His line of appliances and cookware generates $50 million. “Cooking is an evolution,” he says. “If you don’t change, you fall behind.”

by Adrienne Carter, NY Times |  Read more:
Photo: Tony Cenicola

Air-Pumped


As he rounded a corner of the display floor, Bowen saw the booth that would change his life—and eventually the lives of many others. It belonged to a company that made tube signs, gaudy 3- by 21-foot inflatable banners with peel-and-stick lettering. He imagined them swaying in the Nebraska breeze above a new appliance store or a spruced-up coffee shop, visible for blocks. Bowen immediately signed up to be a local distributor. He’d found a new gig—one that would have his kids climbing onto Omaha rooftops to install the signs over the next few years. But that was only the first weird twist waiting for him in the inflatable sign business.

When the parent company folded, he bought the leftover vinyl and some cheap fans, and contracted with a tarp-making company to sew up more inventory. By the early 1990s Bowen was shipping hundreds of signs around the country. He had to contend with different zoning restrictions in every jurisdiction, which was a pain, but whatever—the best part happened at home, when Bowen would hand his kids a rag and let them zip into the blow-ups to clean them. With a person horsing around inside them, the signs would contort and flex into all sorts of funny shapes.

Wait a minute, Bowen thought. Why not put people inside them? Converting the inflatable banners into walking blow-up signs that people wore like a costume would mean that he could dodge all those code restrictions. In 1991, Bowen hand-cut a prototype, a giant panda, for an existing client, a Chinese restaurant in Japan. The operator would wear a lightweight nylon belt with a shoulder strap that supported a 12-volt battery and fan. Bowen reinforced the fan’s housing with steel pins to make sure it wouldn’t crack or shift speeds if jostled, and he vented an intake coil out the leg of the costume. Now the contraption inflated from the operator’s thigh, at the fan, which helped maintain air pressure. The suit was virtually sag-proof—excess air leaked through the seams, just like in the old tube signs. The method turned out to have an added bonus: “We are able to push a lot of body heat out,” Bowen says. It was hot inside, but the pilot wouldn’t roast to death. Sales took off immediately—not just to stores, but to anyone who wanted a giant walking advertisement. Bowen found himself in the mascot business.

The costumes turned out to have a major problem—one Bowen didn’t see until he sold his first major sports commission to the University of Nebraska a couple of years later. When the grinning, kid-like cherub called Lil’ Red debuted on Husker sidelines in 1993, he was big, bulky, and, well, just sort of stood there. “If I’m standing on the sidelines and fans can’t see the game, it pisses them off,” says Brad Post, one of the first Lil’ Red operators. The suit got booed.

But Post, a pre-engineering student with no formal mascot experience, provided the solution. He’d been curious about the new suit from the start. “Mostly, I just wanted to get inside it and see how it worked,” Post says. He realized that while operators couldn’t do the same gymnastic maneuvers as classical, furry-suited mascots, diving and mugging up and down the sidelines, they also had fewer physical limitations. It didn’t matter what happened inside the suit, as long as the action outside looked cool. Post came up with a new move: He would lie on his stomach, pull his feet out of the legs, do a somersault, and stand back up with his feet in the costume’s head. Voilà! Lil’ Red seemed to have somersaulted into a headstand. Post also pulled the character’s head and feet together—it looked like he was shrinking. The boos turned to cheers. “It was really sort of a blank slate,” he says. “I just did what I thought would get a reaction.” (Post’s special genius was eventually rewarded—today he’s the mascot coordinator for the Denver Broncos.)

Bowen took what Post had figured out and ran with it. Back at his workshop, he added internal handles so operators could spin and jump at all angles inside the suits, straps to twist facial expressions, and swiveling couplers that allowed for flexible intake coils. By 1999, pro teams and major brands were calling with their own crazy design ideas. The Florida Marlin—the baseball team’s mascot—now spits on fans of the opposing team at baseball games, thanks to a pressure washer built into his mouthpiece. The Philadelphia Eagle shoots fireworks from the top of his head. Bowen’s outfits—his company, Signs & Shapes International, sells them as “WalkArounds”—appear in everything from Disney’s Toy Story 3 on Ice (think: super-expandable Slinky Dog) to Spider-Man: Turn Off the Dark on Broadway (pop-up, supersize supervillains) to in-store promos for Purina cat food and Tyson chicken. “If something looks cool,” Bowen says, “we’ll do it.”

The field of sports mascotting would never be the same. “Lee is very good at digesting these pie-in-the-sky ideas and then making them a reality with some simple engineering,” says Robert Boudwin, mascot operator for the Houston Rockets. (In other words, he’s Clutch the Bear.) Boudwin was instrumental in coming up with Air Head Clutch, an extra-rotund costume that can “swallow” a cheerleader by unlatching a hidden mouth compartment. It’s a special power lots of other mascots have bought since then. (The cheerleader is usually regurgitated unharmed.)

by Ben Paynter, Wired |  Read more:

Saturday, August 4, 2012

Paco de Lucía



Marlo Pascual, Untitled, 2011
via:

ines-montenegro
via:

StumbleUpon Fights to Stay Relevant

StumbleUpon, founded in 2001, was one of the first sites to become wildly popular by helping people find amusing, weird, and useful things on the internet. Bored at work? Idle at home? By repeatedly clicking StumbleUpon’s simple "stumble" button, it’s possible to scan through hundreds of websites in one sitting, as its algorithm takes stabs at what you might like. Here’s a recipe for pasta pie. Here’s a video of a robot walking up some stairs. Here’s Anderson Cooper’s blog. It’s mindless, and it's brilliant.

According to StatCounter, StumbleUpon drove more traffic in 2009 than any other social media site in the US, including Facebook, YouTube, Twitter, Digg, Reddit, and Pinterest. But like other content discovery sites, StumbleUpon is having trouble turning diversion into a business — especially now that traffic has plummeted following a big redesign.

Restarting

StumbleUpon has an interesting history. EBay swallowed up the company in 2007 when it was just a Firefox extension, albeit with 7.4 million users and some advertising revenue. But the auction house couldn’t figure out what to do with its acquisition, so in 2009 cofounders Garrett Camp and Geoff Smith bought the site back with the help of investors for a reported $75 million. StumbleUpon ballooned from about 30 to 130 employees, about two-thirds of whom are developers. After spinning out, the company launched plug-ins for all the major browsers and started amassing traffic straight to StumbleUpon.com. It launched apps for iPhone and iPad, Android phones, and the Nook and Kindle e-readers.

In April, after three years as a "restartup," Camp proudly announced two milestones. StumbleUpon hit 25 million users, more than triple the number it had when it was owned by eBay. What’s more, those users were clicking the "stumble" button 1.2 billion times a month.

All seemed rosy. But behind the scenes, StumbleUpon's traffic was way down in the wake of its biggest redesign ever. StumbleUpon was always a big dumb firehose of traffic, which is why publishers loved it. But starting at the end of last year, StumbleUpon has been driving noticeably fewer hits — reminiscent of the fallout at rival aggregator Digg, where a big redesign drove users away.

In May, Camp stepped down as CEO. "After 10 years leading StumbleUpon, it’s time for a change," he wrote on the company blog. Camp, who also cofounded the car service Uber, now chairs StumbleUpon’s board. The company is still looking for a CEO. In the interim, three executives are making decisions as a committee.

Traffic Turbulence

As part of a major overhaul in December, StumbleUpon tweaked its homepage to look more modern and "fresh," and tried to make the navigation more visual. At the same time, big changes were made to the algorithm that had the effect of distributing stumbles — pageviews originating from StumbleUpon — more widely across the web, so that users weren’t being sent to the same few sites over and over again. Around the same time, StumbleUpon started pushing its mobile apps, which now account for 25 percent of stumbles. Unfortunately, mobile users tend to spend less time on the site.

by Adrianne Jeffries, The Verge |  Read more: 
Image via: My Social Agency

Kill or Capture

On September 30, 2011, in a northern province of Yemen, Anwar al-Awlaki, an American citizen and a senior figure in Al Qaeda in the Arabian Peninsula, finished his breakfast and walked with several companions to vehicles parked nearby. Before he could drive away, a missile fired from a drone operated by the Central Intelligence Agency struck the group and killed Awlaki, as well as a second American citizen, of Pakistani origin, who the drone operators did not realize was present.

President Barack Obama had personally authorized the killing. “I want Awlaki,” he is said to have told his advisers at one point. “Don’t let up on him.” The President’s bracing words about a fellow American are reported in “Kill or Capture,” a recent and important book on the Obama Administration’s detention and targeted-killing programs, by Daniel Klaidman, a former deputy editor of Newsweek.

With those words attributed to Obama, Klaidman has reported what would appear to be the first instance in American history of a sitting President speaking of his intent to kill a particular U.S. citizen without that citizen having been charged formally with a crime or convicted at trial.

The due-process clause of the Fifth Amendment prohibits “any person” from being deprived of “life, liberty, or property without due process of law.” Obama authorized the termination of Awlaki’s life after he concluded that the boastful, mass-murder-plotting cleric had, in effect, forfeited constitutional protection by waging war against the United States and actively planning to kill Americans. Obama also believed that the Administration’s secret process establishing Awlaki’s guilt provided adequate safeguards against mistake or abuse—all in all, enough “due process of law” to take his life.

Awlaki was certainly a murderous character; his YouTube videos alone would likely convict him at a jury trial. Yet the case of Awlaki’s killing by drone strike is to the due-process clause what the proposed march of neo-Nazis through a community that included many Holocaust survivors in Skokie, Illinois, was to the First Amendment when that case arose, in 1977. It is an instance where the most onerous facts imaginable should lead to the durable affirmation of constitutional principle, as Skokie did. Instead, President Obama and his advisers have opened the door to violent action against American citizens by future Presidents when the facts may be much less compelling.

by Steve Coll, The New Yorker |  Read more:
Photograph by Tracy Woodward/The Washington Post/Getty Images

Friday, August 3, 2012

Fargo – A Documentary


Fargo is the Coen brothers' movie that had everyone going around saying "dontcha know" and "you betcha!" for a while way back in the mid-90s, and it's so highly regarded that it was inducted into the U.S. National Film Registry for being "culturally significant."

It’s one of those rare multi-genre movies that actually works, and watching this making-of documentary made me love Fargo even more than before.

via: Neatorama

Kingdom Come


“…Of comfort no man speak:
Let’s talk of graves, of worms, and epitaphs;
Make dust our paper, and with rainy eyes
Write sorrow on the bosom of the earth.
Let’s choose executors, and talk of wills…
For God’s sake, let us sit upon the ground,
And tell sad stories of the death of kings.

—William Shakespeare, Richard II

Although I’ve yet to see sandwich-board men on the steps of the nation’s Capitol declaring that the end of the world is nigh, I expect that it won’t be long before the Department of Homeland Security advises the country’s Chinese restaurants to embed the alert in the fortune cookies. President Obama appears before the congregations of the Democratic faithful as a man of sorrows acquainted with grief, cherishing the wounds of the American body politic as if they were the stigmata of the murdered Christ. The daily newscasts update the approaches of weird storms, bring reports of missing forests and lost polar bears, number the dead and dying in Africa and the Middle East, gauge the level of America’s fast-disappearing wealth. Hollywood stages nostalgic remakes of the Book of Revelation; video games mount the battle of Armageddon on the bosom of the iPad. Nor does any week pass by without a word of warning from the oracles at the Council on Foreign Relations, Fox News, and the New York Times. Their peerings into the abyss of what the Washington politicians know as “the out years” never fail to discover a soon forthcoming catastrophe (default on the national debt, double-dip recession, global warming, nuclear proliferation, war in Iran) deserving the close attention of their fellow travelers aboard the bus to Kingdom Come.

If the fear of the future is the story line that for the last ten years has made it easy to confuse the instruments of the American media with the trumpets of doom, the cloud of evil omens is not without a silver lining. The tears on King Richard’s dusty paper, like the handwriting on King Belshazzar’s fiery wall, protect the profit margins of the banks and the insurance companies, serve the interests of the drug and weapons industries, allow the season’s political candidates to clothe themselves in the raiment of a messiah come to cleanse the electorate of its impurities, take America back to where it belongs, risk-free and tax-exempt, in the little house on the prairie. Adapted to the service of the Church or the ambition of the state, the fear of the future is the blessing that extorts the payment of the protection money. For the Taliban and the Tea Party it’s a useful means of crowd control, but for a democratic republic, crouching in the shadow of what might happen tomorrow tends to restrict the freedom of thought as well as the freedoms of movement, and leads eventually to a death by drowning in the bathtub of self-pity. (...)

Over the last fifty years, the picture of the future has changed often enough to become recognizable as a fashion statement. I’m old enough to remember a future that was merry and bright, everything coming up roses, men on the way to the moon, and the rain in Camelot falling only after sundown. President Kennedy in 1961 extended Tom Paine’s birthday message to every other country in the world, so sure of America’s holdings in and on the future that it could afford “to pay any price, bear any burden, meet any hardship, support any friend, or oppose any foe to assure the survival and success of liberty.” I’m also old enough to remember, a year later, New York City schoolchildren being advised to hide in broom closets and under desks in the event of the arrival, said to be imminent, of Soviet nuclear missiles on their way north from Cuba.

Under the administrations of nine American presidents in the years since, I’ve heard the future described in the language of both the sales pitch and the sermon, seen it advertised as sunny beach resort and lifeless desert, as equal-opportunity employer and private club. President Reagan’s new morning in America in the 1980s followed Alvin Toffler’s bestseller Future Shock, as well as the Hollywood production of Tom Wolfe’s The Right Stuff. Toffler populated an American garden of technological Eden with kindly computers in California, as well-meaning as J. R. R. Tolkien’s industrious dwarves, spinning the golden threads of fiber optics and mining the jewels of microchips. The Right Stuff forged the American hero as titanium tubing impervious to reentry speeds and the heat of the sun, American power likened to a Promethean pillar of fire lifting its disciples out of the well of death. By way of balancing the market, the next decade produced multiple narratives of American decline, furnished abundant premonitions of doom in the form of popular books (The End of Science, The Death of Meaning, The End of Nature, The Death of Economics, and The End of History) drifting across the American sky well before the arrivals in New York of American Airlines Flight 11 and United Airlines Flight 175 from Boston.

The collapse of the World Trade Center in the fall of 2001 destroyed the last trace elements of the American future conceived as a nostalgic rerun of the way things were in the good old days when John Wayne was securing the nation’s frontiers and Franklin D. Roosevelt was watching over its soul. The loss of the utopian romance that had once supported both the ambition of the state and the strength of the economy was terrible to behold. So terrible that it has been replaced by an apparition—Gorgon-headed and dragon-winged—that reduces its beholders to paralyzed stone. Much of the effect I attribute to the Bush administration’s war on terror, which was lost on the day it was declared. Lost because, to wage the war, the Bush administration was obliged to manufacture, distribute, and magnify the reflection of its own ignorance and fear. Nobody’s cell phone to be left untapped, a jihadist in every rose garden.

In the years since the invasions of Iraq and Afghanistan, the palsied dysfunction has become more pronounced. The foreign wars haven’t been going according to plan; the domestic financial markets have suffered calamitous reversals of fortune; the sum of the national debt goes nowhere but up. The public parks bloom with the installations of surveillance cameras; the inspections at the airports maintain the national quota of patriotic dread, introduce the frequent flyer to the game of playing dead.

Among the country’s stupefied elites, the bad news induces the wish to make time stand still, to punish the presumption of a future that presents itself as a bill collector. As self-pitying as Shakespeare’s melancholy king, they sit upon the ground and tell sad stories of the death of money. Without it the future doesn’t bear contemplating, doesn’t include their presence in it and therefore doesn’t exist. How then can the banks be expected to lend money, the government to build hospitals and schools, the rich to pay taxes for comforts not their own? The suggestion is outrageous, an intolerable effrontery, out of line with the all-American revelation that the name of the game is selfishness. The surplus of resentment affords the excuses to do nothing and bids up the market in transcendence. Politicians in Congress stand around like trees in a petrified forest, or, if allied with the zeal of the Tea Party, console themselves with notions of biblical vengeance, the wrecking of any such thing as a common good a consummation devoutly to be wished. Secure in the knowledge that only the wicked shall perish, they press forward to the Day of Judgment when the host of the damned—variously identified over the course of the centuries as false priests, proud barons, profiteering capitalists, vile communists, and godless democrats—shall fall into the hands of an angry god and gnaw their tongues in anguish.

by Lewis Lapham, Lapham's Quarterly |  Read more:

Our Perfect Summer

My mother and I were at the dry cleaner’s, standing behind a woman we had never seen. “A nice-looking woman,” my mother would later say. “Well put together. Classy.” The woman was dressed for the season in a light cotton shift patterned with oversize daisies. Her shoes matched the petals and her purse, which was black-and-yellow striped, hung over her shoulder, buzzing the flowers like a lazy bumblebee. She handed in her claim check, accepted her garments, and then expressed gratitude for what she considered to be fast and efficient service. “You know,” she said, “people talk about Raleigh but it isn’t really true, is it?”

The Korean man nodded, the way you do when you’re a foreigner and understand that someone has finished a sentence. He wasn’t the owner, just a helper who’d stepped in from the back, and it was clear he had no idea what she was saying.

“My sister and I are visiting from out of town,” the woman said, a little louder now, and again the man nodded. “I’d love to stay awhile longer and explore, but my home, well, one of my homes is on the garden tour, so I’ve got to get back to Williamsburg.”

I was eleven years old, yet still the statement seemed strange to me. If she’d hoped to impress the Korean, the woman had obviously wasted her breath, so who was this information for?

“My home, well, one of my homes”; by the end of the day my mother and I had repeated this line no less than fifty times. The garden tour was unimportant, but the first part of her sentence brought us great pleasure. There was, as indicated by the comma, a pause between the words “home” and “well,” a brief moment in which she’d decided, Oh, why not? The following word— “one”—had blown from her mouth as if propelled by a gentle breeze, and this was the difficult part. You had to get it just right or else the sentence lost its power. Falling somewhere between a self-conscious laugh and a sigh of happy confusion, the “one” afforded her statement a double meaning. To her peers it meant, “Look at me, I catch myself coming and going!” and to the less fortunate it was a way of saying, “Don’t kid yourself, it’s a lot of work having more than one house.”

The first dozen times we tried it our voices sounded pinched and snobbish, but by midafternoon they had softened. We wanted what this woman had. Mocking her made it seem hopelessly unobtainable, and so we reverted to our natural selves.

“My home, well, one of my homes . . .” My mother said it in a rush, as if she were under pressure to be more specific. It was the same way she said, “My daughter, well, one of my daughters,” but a second home was more prestigious than a second daughter, and so it didn’t really work. I went in the opposite direction, exaggerating the word “one” in a way that was guaranteed to alienate my listener.

“Say it like that and people are going to be jealous,” my mother said.

“Well, isn’t that what we want?”

“Sort of,” she said. “But mainly we want them to be happy for us.”

“But why should you be happy for someone who has more than you do?”

“I guess it all depends on the person,” she said. “Anyway, I suppose it doesn’t matter. We’ll get it right eventually. When the day arrives I’m sure it’ll just come to us.”

And so we waited.

At some point in the mid- to late nineteen-sixties, North Carolina began referring to itself as “Variety Vacationland.” The words were stamped onto license plates, and a series of television commercials reminded us that, unlike certain of our neighbors, we had both the beach and the mountains. There were those who bounced back and forth between one and the other, but most people tended to choose a landscape and stick to it. We ourselves were Beach People, Emerald Isle People, but that was mainly my mother’s doing. I don’t think our father would have cared whether he took a vacation or not. Being away from home left him anxious and crabby, but our mother loved the ocean. She couldn’t swim, but enjoyed standing at the water’s edge with a pole in her hand. It wasn’t exactly what you’d call fishing, as she caught nothing and expressed neither hope nor disappointment in regard to her efforts. What she thought about while looking at the waves was a complete mystery, yet you could tell that these thoughts pleased her, and that she liked herself better while thinking them.

One year our father waited too late to make our reservations, and we were forced to take something on the sound. It wasn’t a cottage but a run-down house, the sort of place where poor people lived. The yard was enclosed by a chain-link fence and the air was thick with the flies and mosquitoes normally blown away by the ocean breezes. Midway through the vacation a hideous woolly caterpillar fell from a tree and bit my sister Amy on the cheek. Her face swelled and discolored, and within an hour, were it not for her arms and legs, it would have been difficult to recognize her as a human. My mother drove her to the hospital, and when they returned she employed my sister as Exhibit A, pointing as if this were not her daughter but some ugly stranger forced to share our quarters. “This is what you get for waiting until the last minute,” she said to our father. “No dunes, no waves, just this.”

From that year on, our mother handled the reservations. We went to Emerald Isle for a week every September and were always oceanfront, a word that suggested a certain degree of entitlement. The oceanfront cottages were on stilts, which made them appear if not large, then at least imposing. Some were painted, some were sided, “Cape Cod style,” with wooden shingles, and all of them had names, the cleverest being “Loafer’s Paradise.” The owners had cut their sign in the shape of two moccasins resting side by side. The shoes were realistically painted and the letters were bloated and listless, loitering like drunks against the soft faux leather.

“Now that’s a sign,” our father would say, and we would agree. There was The Skinny Dipper, Pelican’s Perch, Lazy Daze, The Scotch Bonnet, Loony Dunes, the name of each house followed by the name and home town of the owner. “The Duncan Clan—Charlotte,” “The Graftons—Rocky Mount,” “Hal and Jean Starling of Pinehurst”: signs that essentially said, “My home, well, one of my homes.”

by David Sedaris, The New Yorker |  Read more: