Wednesday, September 18, 2013

Dating App Tinder Catches Fire

Miranda Levitt was gushing about her new guy. His name was Todd, she told a girlfriend one day this summer, and he was so great—a director, older, established. The 26-year-old New York actress kept enthusing until her friend, with a dawning sense of recognition, cut her off: What’s his name again? The same “great guy” had been asking her out for a week on Tinder.

“My first reaction is like, ‘What the f--- is Tinder?’ ” Levitt says. “So of course I downloaded it and proceeded to play on it like it was a video game for weeks.”

Tinder, as Levitt learned, is not a website. It’s a pathologically addictive flirting-dating-hookup app. The first step in using it is to sign in with your Facebook ID, which gives Tinder your name, age, photos, and sexual orientation. There is no second step. You’re immediately shown the face of a person of your preferred sex, and, again, there’s only one thing to do: Swipe right if you like what you see, swipe left if you don’t. Another face instantly appears for appraisal, and then another.

Tinder feels like a game until you remember that the people behind those faces are swiping you back. If, and only if, both parties like each other, a private chat box appears. You could conceivably have a conversation. You could make a date. Or you could simply meet for sex, minutes after Tinder’s algorithms matched your profiles. One year after launching, Tinder’s hordes have swipe-rated each other 13 billion times—3 billion in August alone—and 2 million matches happen each day. It’s the fastest-growing free dating app in the U.S.
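
[ed. For readers who think in code, the mechanic described above reduces to a single rule: record every swipe, and open a private chat only when two people have each swiped right on the other. Below is a minimal sketch of that rule in Python; the names and structure are invented for illustration and are not Tinder's actual implementation.]

```python
# Minimal sketch of the mutual-like ("match") rule described above.
# Illustrative only; not Tinder's actual code or data model.

class SwipeBoard:
    def __init__(self):
        self.likes = set()     # (liker, liked) pairs from right swipes
        self.matches = set()   # frozensets of mutually-liked pairs

    def swipe(self, user, candidate, liked_it):
        """Record a swipe; return True if it completes a match."""
        if not liked_it:                      # swipe left: nothing happens
            return False
        self.likes.add((user, candidate))     # swipe right: remember the like
        if (candidate, user) in self.likes:   # mutual like -> open a chat
            self.matches.add(frozenset((user, candidate)))
            return True
        return False

board = SwipeBoard()
board.swipe("miranda", "todd", True)          # one-sided like: no match yet
print(board.swipe("todd", "miranda", True))   # True: both swiped right
```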

The average Tinderer checks the app 11 times per day, seven minutes at a time. The company says it knows of 50 marriage proposals to date. Levitt cannot escape it. “Last night I was out with a friend,” she says. At the bar there was a guy, and things were going well. “I go to the bathroom, and when I come back I look over at his phone and Tinder is up! I was like, ‘Are you kidding?!’ And he was like, ‘No, I mean, someone matched me, and I’m checking it!’ I was like, ‘OK, dude.’ ”

Levitt makes an exasperated noise. “It’s being integrated into my life as a twentysomething a lot more than I thought it would be,” she says.

Like the monster in Alien, Tinder may be a perfectly evolved organism, a predator for your attention built on the DNA of its social networking predecessors. The faces you see on Tinder seem real because they’re tied to Facebook accounts, the gold standard of authenticity. Tinder takes the gay app Grindr’s location function, which pinpoints eager men down to the foot, and tames it for a female audience, rounding distance to the nearest mile. You can chat with Tinder matches, but you can’t send photos or video, so the app avoids Chatroulette’s fate of being overrun by aspiring Anthony Weiners.

What makes Tinder truly killer, though, is that it was designed exclusively for smartphones and the hypersocial millennials who wield them. Although online dating has long since lost its stigma, OkCupid and EHarmony remain sites you browse alone at home, with a fortifying glass of wine and a spreadsheet to track interactions. Tinder is an app you pull up at a bar with friends, passing the iPhone around.

by Nick Summers, Bloomberg Businessweek |  Read more:
Image: Gallery Stock

Overpopulation Is Not the Problem

Many scientists believe that by transforming the earth’s natural landscapes, we are undermining the very life support systems that sustain us. Like bacteria in a petri dish, our exploding numbers are reaching the limits of a finite planet, with dire consequences. Disaster looms as humans exceed the earth’s natural carrying capacity. Clearly, this could not be sustainable.

This is nonsense. Even today, I hear some of my scientific colleagues repeat these and similar claims — often unchallenged. And once, I too believed them. Yet these claims demonstrate a profound misunderstanding of the ecology of human systems. The conditions that sustain humanity are not natural and never have been. Since prehistory, human populations have used technologies and engineered ecosystems to sustain populations well beyond the capabilities of unaltered “natural” ecosystems.

The evidence from archaeology is clear. Our predecessors in the genus Homo used social hunting strategies and tools of stone and fire to extract more sustenance from landscapes than would otherwise be possible. And, of course, Homo sapiens went much further, learning over generations, once their preferred big game became rare or extinct, to make use of a far broader spectrum of species. They did this by extracting more nutrients from these species by cooking and grinding them, by propagating the most useful species and by burning woodlands to enhance hunting and foraging success.

Even before the last ice age had ended, thousands of years before agriculture, hunter-gatherer societies were well established across the earth and depended increasingly on sophisticated technological strategies to sustain growing populations in landscapes long ago transformed by their ancestors.

The planet’s carrying capacity for prehistoric human hunter-gatherers was probably no more than 100 million. But without their Paleolithic technologies and ways of life, the number would be far less — perhaps a few tens of millions. The rise of agriculture enabled even greater population growth, requiring ever more intensive land-use practices to gain more sustenance from the same old land. At their peak, those agricultural systems might have sustained as many as three billion people in poverty on near-vegetarian diets.

The world population is now estimated at 7.2 billion. But with current industrial technologies, the Food and Agriculture Organization of the United Nations has estimated that the more than nine billion people expected by 2050 as the population nears its peak could be supported as long as necessary investments in infrastructure and conducive trade, anti-poverty and food security policies are in place. Who knows what will be possible with the technologies of the future? The important message from these rough numbers should be clear. There really is no such thing as a human carrying capacity. We are nothing at all like bacteria in a petri dish.

Why is it that highly trained natural scientists don’t understand this? My experience is likely to be illustrative. Trained as a biologist, I learned the classic mathematics of population growth — that populations must have their limits and must ultimately reach a balance with their environments. Not to think so would be to misunderstand physics: there is only one earth, of course!

It was only after years of research into the ecology of agriculture in China that I reached the point where my observations forced me to see beyond my biologist’s blinders. Unable to explain how populations grew for millenniums while increasing the productivity of the same land, I discovered the agricultural economist Ester Boserup, the antidote to the demographer and economist Thomas Malthus and his theory that population growth tends to outrun the food supply. Her theories of population growth as a driver of land productivity explained the data I was gathering in ways that Malthus could never do. While remaining an ecologist, I became a fellow traveler with those who directly study long-term human-environment relationships — archaeologists, geographers, environmental historians and agricultural economists.

The science of human sustenance is inherently a social science. Neither physics nor chemistry nor even biology is adequate to understand how it has been possible for one species to reshape both its own future and the destiny of an entire planet. This is the science of the Anthropocene. The idea that humans must live within the natural environmental limits of our planet denies the realities of our entire history, and most likely the future. Humans are niche creators. We transform ecosystems to sustain ourselves. This is what we do and have always done. Our planet’s human-carrying capacity emerges from the capabilities of our social systems and our technologies more than from any environmental limits.

Two hundred thousand years ago we started down this path. The planet will never be the same. It is time for all of us to wake up to the limits we really face: the social and technological systems that sustain us need improvement.

by Erle C. Ellis, NY Times |  Read more:
Image: Katherine Streeter

Foolproof Pan Pizza

I've got a confession to make: I love pan pizza.

I'm not talking deep-dish Chicago-style with its crisp crust and rivers of cheese and sauce; I'm talking thick-crusted, fried-on-the-bottom, puffy, cheesy, focaccia-esque pan pizza of the kind that you might remember Pizza Hut having when you were a kid, though in reality, that pizza most likely never really existed—as they say, pizzas past always look better through pepperoni-tinted glasses.

It would arrive at the table in a jet-black, well-worn pan, its edges browned and crisped where the cheese had melted into the gap between the crust and the pan. You'd lift up a slice and long threads of mozzarella would pull out, stretching all the way across the table, a signpost saying "hey everyone, it's this kid's birthday!" You'd reach out your fingers—almost involuntarily—grasping at those cheese strings, plucking at them like guitar strings, wrapping them around your fingers so you could suck them off before diving into the slice itself.

That perfect pan pizza had an open, airy, chewy crumb in the center that slowly transformed into a crisp, golden-brown, fried crust at the very bottom and a soft, thin, doughy layer at the top right at the crust-sauce interface. It was thick and robust enough to support a heavy load of toppings, though even a plain cheese or pepperoni slice would do.

It's been years since I've gone to an actual Pizza Hut (they don't even exist in New York aside from those crappy "Pizza Hut Express" joints with the pre-fab, lukewarm individual pizzas), but I've spent a good deal of time working on my own pan pizza recipe to the point that it finally lives up to that perfect image of my childhood pan pizza that still lives on in my mind.

If only pizza that good were also easy to make. Well, here's the good news: It is. This is the easiest pizza you will ever make. Seriously. All it takes is a few basic kitchen essentials, some simple ingredients, and a bit of patience.

The way I see it, there are three basic difficulties most folks have with pizza:
  • Problem 1: Kneading. How long is enough? What motion do I use? And is it really worth the doggone effort?
  • Problem 2: Stretching. Once I've got that disk of dough, how do I get it into the shape of an actual pizza, ready to be topped?
  • Problem 3: Transferring. Ok, let's say I've got my dough made and perfectly stretched onto my pizza peel. How do I get it onto that stone in the oven without disturbing the toppings or having it turn into a misshapen blob?
This recipe avoids all three of those common pitfalls, making it pretty much foolproof. To be perfectly honest, every single one of these steps has been done before, and none of it is rocket science. All I'm doing is combining them all into a single recipe.

You can jump straight into a full step-by-step slideshow of the process or find the exact measurements and instructions in the recipe here, or read on for a few more details on what to expect and how we got there.

By now, everybody and their baker's heard about no-knead dough. It's a technique that was developed by Jim Lahey of Sullivan Street Bakery and popularized by Mark Bittman of the New York Times. The basic premise is simple: mix together your dough ingredients in a bowl just until they're combined, cover it, and let time take care of the rest. That's it.

So how does it work? Well, the goal of kneading in a traditional dough is to create gluten, a web-like network of interconnected proteins that forms when flour is mixed with water. All wheat flour contains some amount of protein (usually around 10 to 15%, depending on the variety of wheat). In their normal state, these proteins resemble tiny, crumpled-up balls of wire. With kneading, your goal is to first work these proteins until they untangle a bit, then to rub them against each other until they link up, forming a solid chain-link fence.

It's this gluten matrix that allows your dough to be stretched without breaking, and what allows it to hold nice big air bubbles inside. Ever have a dense under-risen pizza crust? It's because whoever made it didn't properly form their gluten in the process.

Now you can see how this can take a lot of work. Kneading, aligning, folding, linking. That's why most pizza dough recipes take a good ten to twenty minutes of elbow grease or time in a stand mixer.

But there's another way.

See, flour naturally contains enzymes that will break down large proteins into smaller ones. Imagine them as teeny-tiny wire cutters that snip those jumbled-up balls of wire into shorter pieces. The shorter the pieces are, the easier it is to untangle them, and the easier it is to then align them and link them up into a good, strong network. No-knead dough recipes take advantage of this fact.

Over the course of an overnight sit at room temperature, those enzymes get to work breaking down proteins. Meanwhile, yeast starts to consume sugars in the flour, releasing carbon dioxide gas in the process. These bubbles of gas will cause the dough to start stretching, and, in the process, will jostle and align the enzyme-primed proteins, thereby creating gluten.

Simply allowing the dough to sit overnight will create a gluten network at least as strong as (if not stronger than!) that of a dough kneaded in a mixer or by hand, all with pretty much zero effort. Indeed, the flavor produced by letting the yeast do its thing overnight will also be superior to that of any same-day dough. Win-win!

Other than time, the only real key to a successful no-knead dough is high hydration. Specifically, the water content should be at least 60% of the weight of the flour you use. Luckily, high hydration also leads to superior hole structure upon baking. I go for about 65%.
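
[ed. The baker's-percentage math here is simple: hydration is water weight divided by flour weight. A quick sketch, with an illustrative 500 g of flour (an example quantity, not the recipe's exact measurement):]

```python
# Baker's percentage: hydration = water weight / flour weight.
# The 500 g of flour is an illustrative amount, not the recipe's spec.

def water_for_hydration(flour_grams, hydration):
    """Return the water weight (in grams) for a given hydration level."""
    return flour_grams * hydration

flour = 500                              # grams (example)
print(water_for_hydration(flour, 0.60))  # 300.0 g -> the 60% minimum
print(water_for_hydration(flour, 0.65))  # 325.0 g -> the ~65% the author targets
```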

by J. Kenji López-Alt, Serious Eats |  Read more:
Image: J. Kenji López-Alt

Kingfisher atop snowladen reeds - Eisho Narazaki
via:

runswithscissors, Summer morning streetscape. Madrid, Spain - 8/11/11
via:

From Mars

This spring, Jenny Hollander, a twenty-three-year-old Columbia Journalism School student, sent out her résumé for summer internships. “Where didn’t I apply?” Hollander, who is from the U.K., said recently. “BuzzFeed, Mashable, the Fiscal Times, a lot of very small county papers all over the U.S.; California Watch, which is an investigative thing in California; the L.A. Times; the Huffington Post—twice.” She was either rejected or ignored by all of them. Then she came across a notice, on a Columbia Listserv, for a “writing internship” at an unnamed startup. The job paid fifty dollars a day. “It was all a little bit cloak-and-dagger,” Hollander said. She knew nothing about the company, but she applied anyway, and was delighted when she was hired.

On her first day of work, instead of going to an office, Hollander arrived at a newly renovated four-story town house in Williamsburg, Brooklyn. It had two kitchens, two living rooms, and a roof deck—all decorated in a funky flea-market style. The house was the headquarters of Bustle, a new online publication for women. There were four editors in their mid-twenties, and a gaggle of interns—college students or recent graduates, all women—sat around, typing on MacBooks. Many students have summer jobs that involve little more than fetching coffee and maintaining Twitter feeds, so Hollander was surprised when she was told to take out her laptop and start writing blog posts. “I called my housemate and was, like, ‘So I’m doing this job, and all I’m doing is sitting on sofas in this gorgeous house with a bunch of other girls, and we’re all writing together!’ ”

If you go to Bustle.com, you will find a sleekly designed Web site, with headlines that read like the result of a one-night stand between Us Weekly and U.S. News & World Report. Its loosely female-oriented articles cover topics ranging from evergreen style tips (“Eight Modern Ways to Wear a Hair Scarf”) to celebrity gossip (“Why We’re Concerned for Simon Cowell’s Unborn Son”), with a prominent dash of hard news (the top stories last week were about Syria). To a large degree, the articles consist of aggregation: a Bustle writer finds a piece of news that interests her—from the Times, or from a blog she likes—and summarizes it for Bustle’s readers, perhaps making its contents into a list, or collecting some related tweets. Bustle’s house style—to the extent that one exists—is brisk and easily digestible, if a little thin. Soon after she started writing for Bustle, Hollander developed a recurring feature called “This Week in Studies,” in which she recaps the results of scientific research, in slide-show form: “A stunning new study reveals that a quarter of people regret something that we posted on social media at some point: a drunk Tweet, a melancholy Facebook post. . . . Seriously: only a quarter regret these things?”

Bustle’s articles are modest, but the ambitions of its founder, a young Silicon Valley entrepreneur named Bryan Goldberg, are not. When I first spoke to him, early in the summer, he referred to Bustle as “the next great women’s publication.” He was in the process of raising an unusually large amount of pre-launch money—$6.5 million—from investors such as Time Warner Investments and 500 Startups. In six years, Goldberg told me, he hopes that Bustle will attract fifty million visitors each month and earn more than a hundred million dollars a year in advertising revenue, making it the “biggest and the most powerful women’s publication in the world.”

Goldberg, who is thirty, is not a traditional publisher: he speaks more admiringly of Elon Musk than of any Pulitzer Prize-winner. But he is not all bluff. Six years ago, at the age of twenty-four, he and a few friends started Bleacher Report, a sports Web site that, in 2012, they sold to Turner Broadcasting for more than two hundred million dollars. Bleacher Report’s success was a striking example of the new economics of media: when it began, its articles were written by a network of two thousand unpaid sports fans (critics have described the site as an example of “loser-generated content”), yet today it attracts twenty-two million unique visitors each month, putting it behind only Yahoo U.S. Sports and ESPN.com among non-league sports Web sites. Bleacher Report’s high traffic and low production costs have made it extremely profitable. Soon after acquiring Bleacher Report, Turner made it the source of sports news at CNN.com, where it replaced Sports Illustrated. This changing of the guard was a reminder of how quickly, in the Internet age, a cost-effective business plan can overtake one built on a reputation for quality. Goldberg points out that Bleacher Report is now likely worth more than the two hundred and fifty million dollars that Jeff Bezos recently paid for the Washington Post.

by Lizzie Widdicombe, New Yorker |  Read more:
Image: Pari Dukovic

Sea Change


[ed. If you read anything this week, read this.]

Imagine every person on Earth tossing a hunk of CO2 as heavy as a bowling ball into the sea. That’s what we do to the oceans every day.

Burning fossil fuels, such as coal, oil and natural gas, belches carbon dioxide into the air. But a quarter of that CO2 then gets absorbed by the seas — eight pounds per person per day, about 20 trillion pounds a year.
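
[ed. Those numbers hang together as simple arithmetic: eight pounds per person per day, times the world's population, times 365 days, lands right around the stated annual total. A back-of-envelope check (the ~7.2 billion population figure is borrowed from the Ellis piece above; this is a sanity check, not the Times' methodology):]

```python
# Back-of-envelope check of the article's CO2 figures.
# The ~7.2 billion population is an assumption, not from this article.

pounds_per_person_per_day = 8
population = 7.2e9
days_per_year = 365

daily_total = pounds_per_person_per_day * population    # ~5.8e10 lb/day
annual_total = daily_total * days_per_year              # ~2.1e13 lb/year

print(f"{annual_total:.1e} pounds per year")  # ~2.1e13, i.e. about 20 trillion
```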

Scientists once considered that entirely good news, since it removed CO2 from the sky. Some even proposed piping more emissions to the sea.

But all that CO2 is changing the chemistry of the ocean faster than at any time in human history. Now the phenomenon known as ocean acidification — the lesser-known twin of climate change — is helping push the seas toward a great unraveling that threatens to scramble marine life on a scale almost too big to fathom, and far faster than first expected.

Here’s why: When CO2 mixes with water it takes on a corrosive power that erodes some animals’ shells or skeletons. It lowers the pH, making oceans more acidic and sour, and robs the water of ingredients animals use to grow shells in the first place.

Acidification wasn’t supposed to start doing its damage until much later this century.

Instead, changing sea chemistry already has killed billions of oysters along the Washington coast and at a hatchery that draws water from Hood Canal. It’s helping destroy mussels on some Northwest shores. It is a suspect in the softening of clam shells and in the death of baby scallops. It is dissolving a tiny plankton species eaten by many ocean creatures, from auklets and puffins to fish and whales — and that had not been expected for another 25 years.

And this is just the beginning.

by Craig Welch, Seattle Times |  Read more:
Image: Steve Ringman

Anglo American Withdraws from Pebble Mine

[ed. Sounds like the end might be near but big projects like this never really seem to die (in Alaska, anyway), they just cycle through a couple generations (or less) and reappear in new packaging. For additional background see: Gold Fish]

Anglo American, one of the key backers of the controversial Pebble mine in Alaska's Bristol Bay region, announced Monday that it is withdrawing from the Pebble Partnership -- and will take a $300 million hit for doing so. The London-based Anglo American has a 50 percent share of the Pebble venture, with Northern Dynasty Minerals out of Vancouver, Canada controlling the other half. The company said that Northern Dynasty will assume sole responsibility for the project.

In a statement, Anglo American CEO Mark Cutifani said that the company was seeking other investment opportunities.

"Despite our belief that Pebble is a deposit of rare magnitude and quality, we have taken the decision to withdraw following a thorough assessment of Anglo American’s extensive pipeline of long-dated project options," Cutifani said. "Our focus has been to prioritize capital to projects with the highest value and lowest risks within our portfolio, and reduce the capital required to sustain such projects during the pre-approval phases of development as part of a more effective, value-driven capital allocation model."

John Shively, CEO of the Pebble Partnership, insisted that reports of Pebble’s death are premature. “Obviously we’re disappointed, but we still have a great project,” he said. “Anglo American was reviewing all of their assets. When they got to us, we didn’t make the cut.”

Shively, who learned of the pullout this weekend in phone calls from the owner companies, said he expects that Northern Dynasty will decide in the next two or three weeks what its next steps should be. He said the “partnership has to be unraveled,” and Northern Dynasty has to consider its options.

Pebble has received intense scrutiny during the exploratory phase of the project. Critics say the mine’s proposed location could present a risk to the Bristol Bay watershed and salmon fishery, one of the most lucrative fisheries in the world. Supporters have accused the Environmental Protection Agency of playing politics with the project after the EPA released an assessment of the potential impacts of a large open-pit mine on Bristol Bay fisheries last year. That report said that even barring a major mishap, damage to salmon runs was a likely side effect of mine development.

Meanwhile, the Pebble Mine prospect is also a high-value proposition: Northern Dynasty estimates that the proposed mining area could contain as much as 81 billion pounds of copper, 5.6 billion pounds of molybdenum and 107 million ounces of gold. Estimates have put the value of the resources at up to $300 billion.

by Ben Anderson, Alaska Dispatch |  Read more:
Image: EPA

Bascove, Pershing Square Bridge, 1993, oil on canvas, 26" x 42".Collection of the Museum of the City of New York.

Tuesday, September 17, 2013

The Plot to Kill Obamacare

The Republican Party voted unanimously against the Affordable Care Act, first in the Senate and then in the House of Representatives; it has since voted some 40 times to repeal or cripple the law; it has mounted a nearly successful campaign to nullify it through the courts and a failed presidential campaign that promised to repeal it; and it has used its control of state governments to block the law’s implementation across vast swaths of the country, at enormous economic cost to those states. Yet somehow, in the wake of all this, the party is consumed with the question: Have we done enough to stop Obamacare?

This peculiar subject of introspection, as if Joe Francis were lying awake at night cursing himself for his prudery, reflects the deepening mix of terror and rage with which conservatives await the enrollment of millions of uninsured Americans beginning in October. On the substantive merits of the law, only the subtlest variations can be detected in the GOP’s evaluation. Mitch McConnell calls it the “single worst piece of legislation passed in the last 50 years in the country.” Representative John Fleming of Louisiana calls it “the most dangerous piece of legislation ever passed by a Congress” and “the most existential threat to our economy … since the Great Depression.” Virginia gubernatorial candidate Ken Cuccinelli harks back to the Fugitive Slave Acts for a comparative affront to liberty.

Having achieved near consensus on the policy, the party has fallen into intramural squabbling over which extraordinary threats to deploy. Shut down the government? Default on the national debt? (House leaders have wriggled out of demands to do the former by promising to do the latter.) Conservative activists have turned on their leaders as traitors for hesitating to employ the most obviously suicidal methods, affixing John Boehner’s name to the hated program (“Boehnercare”) or accusing McConnell of “empty rhetoric … about ending Obamacare.” These recriminations reprise the hallucinatory attacks by Cold War conservatives like Joe McCarthy and the John Birch Society, which over time migrated from their original targets onto such figures as President Eisenhower and the Army.

The historical echo is fitting in the sense that Obamacare has come to fill the place in the conservative psyche once occupied by communism and later by taxes: the main point of doctrinal agreement. (In constituent meetings, “this is the overriding issue that is being discussed,” one Republican member of Congress explained late last month. “Way more than immigration, way more than the debt.”) The transformation of Obamacare from a close relative of Republicans’ own health-care ideas to the locus of evil in modern life is owing to several things, including the almost tautological political fact that its success would be Obama’s: Permanent health-care reform would define Obama as a Reaganesque transformative figure, rather than the failure conservatives still hope him to be remembered as. The law’s slow rollout has made it a live issue, unlike the already-expired stimulus, and thus the main receptacle for simmering concerns over unemployment and the tepid economic recovery.

Most important, the law has, in its direct impact, opened a fissure over the role of government deeper than any since the New Deal. Obamacare threatens America’s unique status among advanced economies as a country where access to regular medical care is a privilege that generally must be earned. In a few weeks, the United States government, like those of France, or Australia, or Israel, will begin to regard health insurance as something to be handed out to one and all, however poor, lazy, or otherwise undeserving each recipient may be. “We can’t afford everything we do now, let alone provide free medical care to able-bodied adults,” as Missouri Republican Rob Schaaf, author of the state’s harsh anti-Obamacare initiative, put it. “I have a philosophical problem with doing that.”

The Obamacare wars have progressed from the legislative to the judicial to the electoral fronts, gaining intensity at every step. Now they move to a new battleground to secure the law and all it represents, or provoke its collapse. That an implementation battle is taking place at all is a highly unusual circumstance. Major new laws often stagger into operation with glitches, confusion, and hasty revisions, but not sabotage. Obamacare will come online in the midst of an unprecedented quasi-campaign atmosphere, with Republicans waging a desperate political and cultural war to destroy it.

by Jonathan Chait, New York Magazine |  Read more:
Image: Kristian Hammerstad

A Good Angle Is Hard to Find

About 10 years ago, I was driving along the Pacific Coast Highway, one of the most glorious stretches of asphalt in the country, when I decided to take a picture of myself. I had a Polaroid camera then, which I carried in the front seat of my Honda Accord along with a high-zoom Nikon bought at a specialty store for more money than I’d ever dropped in one place. I must have looked bizarre, pulling my sedan to the shoulder of the road and perching in the wildflowers with that big, boxy plastic eye held in front of me, as picture after picture spit out like an angry tongue. Polaroids are expensive to screw up, by the way. About a buck a misfire.

But I am a short girl, and a vain one, and I could never get my arms far enough away to find a flattering angle. Pictures I took of myself were often blasted with flash, or marred by the funny mistakes of a photo aimed blind: Here is your left eyeball. Behold, your forehead. Still, it was worth all the effort to have some souvenir of the moment—an instant image!—which I could tuck into an envelope and slide into the trusty rabbit tunnel that was the U.S. Postal Service, where it would wind 1500 miles back to my parents’ place in Dallas and find a new home underneath a magnet on the kitchen fridge.

The word “selfies” didn’t exist then. It would take at least another three years—and the advent of digital cameras—for the word to become necessary. In 2005, Jim Krause used the term in his manual “Photo Idea Index” to describe the kind of on-the-fly snapshots he and his friends were taking, unburdened by the cost and labor of traditional film processing. The “selfies” tag grew on Flickr, and later flourished on social media sites, where #selfies and #me became an ever-trending topic. In 2010, the iPhone introduced its flip-camera feature, allowing users to see and frame a shot of themselves. In the selfie origin story, this was the eureka moment.

These days, the sight of someone pulling over to the side of the road—or standing at a bar, or flashing a peace sign in front of a building, or waiting at the drive-thru in the front seat of the car—and taking a picture of themselves is not bizarre at all. We live in the endlessly documented moment, and the arm outstretched with that small, omnipotent rectangle held aloft is one of the defining postures of our time. We’ve had selfie scandals, from Weiner’s weiner to Amanda Bynes’ meltdown. We’ve had a million billion cautionary tales about sending erotic selfies, though it doesn’t seem to stop anyone. Criminals take selfies and so do cops. The presidential selfie surely could not be far behind. (On this, Hillary was first.)

But people are also worried about the selfie. Well, worried and irritated. Several trend stories have pondered the psychological damage on a generation that would rather take a picture of their life than actually live it. A recent study found that posting too many selfies annoys people (for this, they needed science?). Last month, the word made its way into the Oxford Dictionaries Online, but it has also become something of a smear, another tacky emblem of a culture that has directed all possible spotlights toward its own sucked-in cheeks. “Are you going to take a selfie?” a friend asked with mock derision when I pulled out my phone at dinner to check the time. And it was clearly a joke, but I wasn’t sure if he was making fun of people who do such things, or the fact that I was one of them.

by Sarah Hepola, TMN |  Read more:
Image: Danielle Julian Norton, Everything is Fine, 2012. Image credit Shannon Benine.

Thinking Out Loud

Every day, we collectively produce millions of books’ worth of writing. Globally we send 154.6 billion emails and more than 400 million tweets, and publish over 1 million blog posts and around 2 million blog comments on WordPress. On Facebook, we post about 16 billion words. Altogether, we compose some 52 trillion words every day on email and social media — the equivalent of 520 million books. (The entire US Library of Congress, by comparison, holds around 23 million books.)
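
[ed. The book equivalence is easy to check: 52 trillion words divided by 520 million books implies about 100,000 words per book, a reasonable book length. The per-book figure is inferred from the article's own numbers, not stated in it:]

```python
# Sanity check of the "520 million books" equivalence.
# The implied ~100,000 words per book is inferred, not stated in the article.

words_per_day = 52e12    # 52 trillion words on email and social media
books_per_day = 520e6    # the article's book-equivalent figure

print(words_per_day / books_per_day)  # 100000.0 -> about 100,000 words per book
```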

And what makes this explosion truly remarkable is what came before: comparatively little. Before the Internet, most people rarely wrote for pleasure or intellectual satisfaction after graduating from high school or college.

Is any of this writing any good? Certainly, measured against the prose of an Austen, Orwell, or Tolstoy, the majority of online publishing pales. This isn’t surprising. The science fiction writer Theodore Sturgeon famously said something like, “Ninety percent of everything is crap,” a formulation that geeks now refer to as Sturgeon’s Law. Anyone who has spent time slogging through the swamp of books, journalism, TV, and movies knows that this holds pretty well even for edited and curated culture. So a global eruption of unedited, everyday self-expression is even more likely to produce this 90-10 split — an ocean of dreck, dotted sporadically by islands of genius.

But focusing on the individual writers and thinkers misses the point. The fact that so many of us are writing — sharing our ideas, good and bad, for the world to see — has changed the way we think. Just as we now live in public, so do we think in public. And that is accelerating the creation of new ideas and the advancement of global knowledge.

Literacy in North America has historically been focused mainly on reading, not writing; consumption, not production. While many parents worked hard to ensure their children were regular readers, they rarely pushed them to become regular writers. But according to Deborah Brandt, a scholar who has researched American literacy in the 20th and 21st centuries, the advent of digital communications has helped change that notion.

We are now a global culture of avid writers, one almost always writing for an audience. When you write something online—whether it’s a one-sentence status update, a comment on someone’s photo, or a thousand-word post—you’re doing it with the expectation that someone might read it, even if you’re doing it anonymously.

Having an audience can clarify thinking. It’s easy to win an argument inside your head. But when you face a real audience, you have to be truly convincing. (...)

Interestingly, the audience effect doesn’t necessarily require a big audience. This seems particularly true online.

Many people have told me that they feel the dynamic kick in with even a tiny handful of viewers. I’d argue that the cognitive shift in going from an audience of zero (talking to yourself) to an audience of 10 (a few friends or random strangers checking out your online post) is so big that it’s actually huger than going from 10 people to a million. [ed. I would agree with this.]

This is something that traditional thinkers of the pre-Internet age—particularly print and broadcast journalists — have trouble grasping. For them, an audience doesn’t mean anything unless it’s massive. If you’re writing specifically to make money, you need to draw a large crowd. This is part of the thinking that causes traditional media executives to scoff at the spectacle of the “guy sitting in his living room in his pajamas writing what he thinks.” But for the rest of the people in the world, who probably never did much nonwork writing in the first place—and who almost never did it for an audience—even a handful of readers can have a vertiginous, catalytic impact.

by Clive Thompson, Wired |  Read more:
Image: Simon C. Page

Gordon Parks, Frustrated, Chicago, IL, 1957
via:

Julian Opie, I dreamt I was driving my car (motorway corner), 2002.
via:

Go Ask Alice

One pill makes you larger
And one pill makes you small
And the ones that Mother gives you
Don’t do anything at all
Go ask Alice, when she’s ten feet tall

— Jefferson Airplane, “White Rabbit”

“Life’s a box of chocolates, Forrest. You never know what you’re gonna get.”

— Forrest Gump



Well, Children, it’s silly season again. Yes, that’s right: Twitter just filed an initial registration statement (or S-1) for its long-awaited initial public offering. Confidentially. And commemorated it with a tweet on its own social media platform, of course:

Tools.

* * *

This of course means every numbnuts and his dog are currently crawling out of the woodwork and regaling us with their carefully considered twaffle about what Twitter is doing, what it should do, and how much money we’re all going to make buying and selling Twitter’s IPO shares when and if they ever come to market. A particularly amusing sub-genre of said twaffle consists of various pundits of varying credibility and credulousness pontificating on what Twitter is actually worth, as if that is a concrete piece of information embedded in the wave function of quantum mechanics or the cosmic background radiation, rather than a market consensus which does not exist yet because, well, there is no public market for Twitter’s shares.

But there seems to be something about IPOs that renders even the most gimlet-eyed, levelheaded market observers (like Joe Nocera, John Hempton, and... well, just those two) a little goofy and soft in the head. Perhaps they just can’t understand why such an obvious and persistent arbitrage anomaly as the standard 10 to 15% IPO discount on newly public shares—which everybody seems to know about even though they can’t explain it—persists as it does. Or why, given how many simoleons the evil Svengalis of Wall Street get paid to underwrite IPOs, there are so many offerings that end up trading substantially higher (e.g., LinkedIn) or substantially lower (e.g., Facebook) than the offer price they set once shares are released for trading.

So, out of the bottomless goodness of my heart—and a heartfelt wish to nip some of the more ludicrous twitterpating I expect from the assembled financial media and punditry in the bud—I will share here in clear and simple terms some of the explanations I have offered in the past.

by The Epicurean Dealmaker |  Read more:
Image: uncredited