Monday, May 6, 2013

The Economics of Social Status

In economics, a good is anything that “satisfies human wants and provides utility.” This includes not just tangible goods like gold, grain, and real estate, but also services (housecleaning, dentistry, etc.) as well as abstract goods like love, health, and social status.

As an economic good, social status is a lot like health. They’re both intangible and highly personal. In proper economic terms, they are private goods: rivalrous and mostly excludable. And the fact that they’re hard to measure doesn’t make them any less valuable — in fact we spend trillions of dollars a year in their pursuit (though they often elude us).

But status differs from health in one very important respect: It can be transacted – spent as well as earned. It’s not a terminal good, but rather an intermediate good that helps us acquire other things of value. For example, I can trade some of my status for money, favors, sex, or information — and vice versa.

Health, if it’s possible to spend at all (e.g. in pursuit of career success), is extremely illiquid. But as I will argue today, status is so liquid — so easy to transact, and in real time — that it plays a fundamental economic role in our day-to-day lives.

Before we dig into the transactional nature of social status, let’s ground ourselves, briefly, in its biology and sociology.

The biology of status

No one plays status games in Heaven. Why bother? Souls have no want for food, sex, or smartphones — and thanks to His omnipresence, God even takes the fun out of competing for an audience with Him.

Meanwhile, here on Earth, we (embodied primates) engage in all manner of status games. It’s one of the ways we compete over access to scarce resources like food and mates. And it’s something we share with a lot of other social animals — chickens, dogs, chimps, etc.

Here are some of the concepts that govern the day-to-day biology of social status:
  • Prestige vs. dominance. Joseph Henrich (of WEIRD fame) distinguishes two types of status. Prestige is the kind of status we get from being an impressive human specimen (think Meryl Streep), and it’s governed by our ‘approach’ instincts. Dominance, on the other hand, is the kind of status we get from being able to intimidate others (think Joseph Stalin), and is governed by fear and other ‘avoid’ instincts. Of course these two types of status aren’t mutually exclusive, but they’re analytically distinct strategies with different biological expressions.
  • Fitness displays. In The Mating Mind, Geoffrey Miller argues that many of our most prized, socially-desirable qualities — athleticism, artistic skill, eloquence, intelligence, physical beauty — serve as fitness displays, i.e., advertisements for the quality of our genes. We are attracted, socially and sexually, to people with high skill and beauty, largely because these traits are honest signals of good genes. [1]
  • Hormones. There are at least two hormones involved in processing social status: testosterone and cortisol. To grossly oversimplify, testosterone is the ‘aggression hormone’ while cortisol is the ‘stress hormone.’ In a recent paper (and also a great TED talk), Amy Cuddy et al. asked participants to adopt either a high-status pose or a low-status pose for two minutes. The researchers then measured participants’ hormone levels and their willingness to take risks on games of chance (a behavior associated with feelings of power). Participants who took high-status poses showed increased testosterone and reduced cortisol levels, and took greater risks, relative to their counterparts who were asked to adopt low-status poses.
  • Body language. Cuddy’s experiment also illustrates the role played by our bodies in mediating status. Specifically, we’re wired to interpret people’s use of space in terms of status — the more space you take up, the higher your status. Also relevant are postures of intimidation, submission, and vulnerability.
The point I’m trying to make here is that social status is not arbitrary. Instead, it’s grounded, very concretely, in the biology of honest signals – and as such, it’s subject to very real constraints. Wild swings of status are possible, but they’re mostly the stuff of stories. Our daily lives are governed by much smaller — and more predictable — gains and losses.

by Kevin Simler, Ribbonfarm |  Read more:
Image: uncredited

Remote Control: Our Drone Delusion

The Fifth Amendment asserts that no “person” shall be “deprived of life, liberty, or property, without due process of law,” a statement that the Supreme Court has usually interpreted as requiring, among other things, that American citizens receive a fair trial and the right of appeal. The Obama Administration has never made clear why it thought that capturing Awlaki and bringing him to trial was infeasible. Nor has it described the specific standards it used to approve Awlaki’s execution. As things stand, Obama will bequeath to his successors a worrisome precedent: without trial, the President has the right to kill any U.S. citizen who is judged, on the basis of unpublished criteria, to have become an enemy combatant.

But Awlaki’s case, troubling as it may be, raises a broader issue: the Administration’s refusal to disclose the criteria by which it condemns anyone, American or otherwise, to death. The information used in such cases is intelligence data rather than evidence; it is not subject to cross-examination or judicial review. Unanswered questions abound. Does the President require that intelligence used to convict a terror suspect in absentia be based on multiple sources, or is one sufficient? Must intercepts, photographs, or credible firsthand testimony be obtained, or can people be executed on the basis of hearsay from paid informants? How directly involved in violence must an individual be to receive a death sentence? At what point does a preacher’s hate speech warrant his being killed?

Mazzetti describes how the imperative to protect American troops in Afghanistan from cross-border attacks originating in Pakistan led to a slackening of the standards used to mark terror suspects for assassination. After 2008, the C.I.A. won approval for a category of drone attacks known as “signature strikes,” in which, even without a specific target, an attack is justified by a pattern of behavior—young men of military age test-firing mortars at a training camp in South Waziristan, say, or riding under arms in a truck toward the Afghan border.

Under the laws of war, strikes of that kind are typically legal on a formal battlefield like that in Afghanistan—in war, if an enemy camp is discovered, it is not necessary to know the names of the fighters inside in order to attack. In secret, Obama unilaterally extended such permission to Pakistan’s border region, where the United States had never declared war. The President put the C.I.A., not the Pentagon, in charge of these attacks, in order to maintain deniability.

Without judicial review or informed public debate, the potential for abuse and overreach is vast. In one of the most disquieting passages in his book, Mazzetti notes that, as the death toll in Pakistan mounted, Obama Administration officials at one point claimed that the increased drone strikes in Pakistan had not led to any civilian deaths. “It was something of a trick of logic,” Mazzetti writes. “In an area of known militant activity, all military-age males were considered to be enemy fighters. Therefore, anyone who was killed in a drone strike there was categorized as a combatant.”

by Steve Coll, New Yorker |  Read more:
Illustration by Noma Bar

The Coming Bold Transformation of the American City


In 40 years, 2.7 billion more people will live in the world’s cities than do now, according to the United Nations Department of Economic and Social Affairs. Urban growth in China, India, and most of the developing world will be massive. But what is less known is that population growth will also be enormous in the United States.

The U.S. population will grow 36 percent to 438 million in 2050 from 322 million today. At today’s average of 2.58 persons per household, such growth would require 44.9 million new homes. However, American households are getting smaller. If one were to estimate 2.2 persons per household—the household size in Germany today and the likely U.S. size by 2050—the United States would need 74.3 million new homes, not including secondary vacation homes. This means that over the next 40 years, the United States will build more homes than all those existing today in the United Kingdom, France, and Canada combined. Urban planner and theorist Peter Calthorpe predicts that California alone will add 20 million people and 7 million households by 2050.
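As a quick sanity check, the household arithmetic above can be reproduced directly (a sketch in Python; all figures are the article’s own estimates, not independent projections):

```python
# Back-of-the-envelope check of the article's housing figures.
pop_2050 = 438e6   # projected U.S. population in 2050
pop_today = 322e6  # U.S. population today

# Population growth: about 36 percent.
growth = pop_2050 / pop_today - 1

# New homes needed if household size stays at 2.58 persons: ~45 million.
new_homes_at_258 = (pop_2050 - pop_today) / 2.58

# If households shrink to 2.2 persons, total 2050 households rise, so far
# more new homes are needed than population growth alone implies: ~74.3 million.
new_homes_at_22 = pop_2050 / 2.2 - pop_today / 2.58

print(f"{growth:.0%}, {new_homes_at_258 / 1e6:.1f}M, {new_homes_at_22 / 1e6:.1f}M")
```

The small gap between the article’s 44.9 million and the 45.0 million this yields is just rounding in the published inputs.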

To meet this demand, completely new urban environments will have to be created in the United States. Where and how will the new American homes be built? What urban structures are to be created?

It is unlikely that city building on the scale to be seen through 2050 will happen ever again. Cities are a means to a way of life: the kind of urban structures created over the next few decades will have profound consequences in terms of quality of life, environmental sustainability, economic well-being, and even happiness and civilization for hundreds of years to come. If we consider the influence American cities will exert on the rest of the world, the way they are built will determine, as well, much of the world’s sustainability and well-being.

Until now, the United States’ main legacy for the urban world has been low-density suburbs, which, most agree, have many shortcomings in terms of the environment and quality of life. The inadequacies of the suburbs are well known. They are high-energy-use environments: homes are large and thus consume much energy for cooling and heating; occupants’ mobility is dependent on the automobile; distances to reach jobs, shops, and recreation areas are long; and low-cost and high-frequency public transport is not viable in such a low-density environment. Suburbs severely restrict the mobility of vulnerable citizens—youngsters, the poor, and the very old—who usually lack access to a car. Because most destinations are unreachable on foot, suburban public spaces tend to be devoid of people—making them boring in their almost eerie silence interrupted only by the sound of cars that sporadically zoom by or lawnmowers with their maddening engines. Suburbs are not propitious for diversity: Russian literature courses or Afghan restaurants require high concentrations of people nearby from which to draw the small percentage who are interested.

Despite the ills of the suburbs, most Americans do not want to live in a Manhattan-like environment either. So, what should the third-millennium American city be like?

by Enrique Penalosa, Atlantic Cities |  Read more:
Photo: Carlos Barria/Reuters

HTM Studios, Somewhere !..

Why Cable Companies Should Love a Free Internet

Tom Wheeler, Obama’s nominee to run the Federal Communications Commission, surely has much he hopes to get done. Perhaps it’s freeing up some more wireless spectrum or bringing cell-phone service to Mars—who knows. But chances are (assuming his confirmation goes smoothly) that he’ll end up spending time on different challenges, and a chief candidate is a resurgence of the net-neutrality wars.

The outgoing chairman, Julius Genachowski, made many very good and important decisions, but he also made a rather terrible one that may darken Wheeler’s term. Genachowski spent years and much political capital negotiating net-neutrality rules that everyone could live with, only to enact them in a way that is highly vulnerable to a court challenge. That challenge (brought, cynically, by Verizon after it negotiated the rules it wanted) may soon invalidate years of work and create industry chaos.

The net-neutrality rules now in place reinforce the Internet’s original design principle: that all traffic is carried equally and without any special charges beyond those of transmission. Among other things, the rules are a pricing truce for the Internet; without them, we can expect a fight that will serve no one’s interests and will ultimately stick consumers with Internet bills that rise with the same speed as cable television’s.

Unfortunately, like American Presidents who hope to avoid the politics of the Middle East, the F.C.C. may ultimately have no choice but to get involved in this fight. But one very important thing has changed since last time. Cable operators like Time Warner and Comcast, if they think carefully, should come to understand that they now need a net-neutrality rule more than anyone.

Ask a cable operator what makes its life miserable, and the answer is immediate and obvious: programming fees. Such fees have roughly doubled over the past decade during a period of near-flat inflation and economic stagnation. Sports is the most outrageous example: what ESPN charges cable operators keeps growing, and is now approaching five dollars per customer. The actual cost of providing the entire Internet to cable customers, which is something like a few dollars a month, is less than that. It is a lose-lose situation for nearly everyone (except athletes). The real victims are consumers, especially low-income consumers, who ultimately foot all the bills but cannot control the costs.

If programming costs are the worst thing in cable, the best part of the business is selling broadband. Cable broadband, which costs almost nothing to provide once the infrastructure is built, has little real competition, and operators can charge between forty and sixty dollars for the product, yielding margins that analyst Craig Moffett describes as “comically profitable.” Margins greater than ninety per cent are a sweet business no matter what you’re doing, and what cable operators have to realize is how crucial net neutrality is to making those margins possible.
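Those margin numbers follow from simple arithmetic; here is a minimal sketch, assuming a marginal cost of four dollars a month as a stand-in for the “few dollars” the piece mentions (an illustrative figure, not reported data):

```python
def gross_margin(price: float, cost: float) -> float:
    """Fraction of the monthly price retained after marginal cost."""
    return (price - cost) / price

# The article's $40-60/month price range against an assumed $4 marginal cost.
assumed_cost = 4
for price in (40, 60):
    print(f"${price}/mo -> {gross_margin(price, assumed_cost):.0%} margin")
```

Even at the low end of the price range, the assumed cost leaves a margin of about ninety per cent, consistent with the “comically profitable” characterization.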

An important aspect of the Internet’s original design is that many prices were set at zero—what have been called zero-price rules. The price to join the network is zero. The price that users and sites pay to reach others is zero: a blogger doesn’t need to pay to reach Comcast’s customers. And the price that big Web sites charge broadband operators to carry their content is also zero. It’s a subtle point, but these three zeros are a large part of what makes the Internet what it is. If net neutrality goes away, so does the agreement to freeze prices at zero.

by Tim Wu, New Yorker |  Read more:
Photograph by Andrew Harrer/Bloomberg/Getty

Sunday, May 5, 2013

Vince Lombardi Accepted Gay Players

The ongoing debate about how a gay NFL player would be treated in the locker room has largely focused on the idea that times are changing, and that acceptance of a gay player would be a modern development. But it’s often overlooked that the ultimate example of the old-school football coach was also perfectly fine with having gay players on his team.

Multiple players who played for Vince Lombardi, the legendary former Packers and Redskins coach, say that he knew some of his players were gay, and that not only did he not have a problem with it, but he went out of his way to make sure no one else on his team would make it a problem.

In 1969, Lombardi’s Redskins included a running back named Ray McDonald, who in 1968 had been arrested for having sex with another man in public. In the Lombardi biography When Pride Still Mattered, author David Maraniss writes that Lombardi told his assistants he wanted them to work with McDonald to help him make the team, “And if I hear one of you people make reference to his manhood, you’ll be out of here before your ass hits the ground.”

Lombardi’s daughter Susan told Ian O’Connor of ESPNNewYork.com that her father would have been thrilled to have a player like Jason Collins, the NBA center who publicly revealed this week that he is gay.

“My father was way ahead of his time,” Susan Lombardi said. “He was discriminated against as a dark-skinned Italian American when he was younger, when he felt he was passed up for coaching jobs that he deserved. He felt the pain of discrimination, and so he raised his family to accept everybody, no matter what color they were or whatever their sexual orientation was. I think it’s great what Jason Collins did, because it’s going to open a lot of doors for people. Without a doubt my father would’ve embraced him, and would’ve been very proud of him for coming out.”

Dave Kopay, the first former NFL player to come out, also played on those 1969 Redskins, and he says that while he never told Lombardi, he believes Lombardi knew not only that Kopay was gay, but that Kopay and another Redskins player, Jerry Smith, were in a romantic relationship.

“Lombardi protected and loved Jerry,” Kopay told O’Connor.

by Michael David Smith, NBC News |  Read more:

Perminterns: Another Sign of a Broken Economy

Reminder #1,271,689 that the economy is still broken: the permintern.

In so many ways, Kate, who was born in 1987, is a perfect reflection of the opportunities and hardships of being young today. She’s smart and motivated and has a degree from an Ivy League school, yet at 25 she worries she’ll never attain the status or lifestyle of her boomer parents. She majored in political science and has a burnished social conscience, something she honed teaching creative writing in a women’s prison. But Kate’s most salient—and at this point, defining—generational trait might be that she doesn’t have a full-time job. Instead, she has been an intern for a year and a half.

Kate moved to DC after dropping out of her first year of law school. She has cycled through one internship at a political organization and another at a media company and is now biding her time as an unpaid intern at a lobbying firm. To make ends meet, she works as a hostess in Adams Morgan three or four nights a week, which means she often clocks 15-hour days.

“I don’t mean to sound like I have an ego, but I am an intelligent, hard-working person,” Kate says. “Someone would be happy they hired me.”

It’s a refrain heard many times from the millions of twentysomething Kates who are scrambling to find jobs with a steady paycheck and benefits. Mostly, though, they want to find a way out of the low-paying—or nonpaying—apprenticeship track. For Kate, it feels more like an internship vortex.

After all, who wants to still be an intern at an age when you should have a 401(k) and a modicum of job security, or at least be earning more than you did at your summer job during high school? “People my age expect to start at the bottom,” Kate says, “but in this economy the bottom keeps getting lower and lower.”

Welcome to the slow, sputtering economic recovery, Generation Y. Keep in mind that this is the situation for Ivy League grads in D.C. Nor is this an aberration: it's also a major problem in the entertainment industry on the West Coast. And these are the well-to-do kids doing everything they can to climb the social ladder, do what they're supposed to, and get ahead. Many of them will give up and take a low-paying, insecure dead-end job, or get sucked into the vortex of debt and unemployment that is much of graduate school. Matters are worse for most ordinary people.

For those twenty-somethings who do manage to find a decent job, buying a decent home is usually still well out of reach due to home prices that skyrocketed far beyond stagnant wages. Older Americans who bought their homes decades ago are still doing fine due to policies that have prioritized asset growth over wage growth. But for those who were children or yet unborn when housing started to shoot upwards, things are far more difficult. Of course, as bad as Millennials have it, it's even worse for the middle-aged who have been forced out of jobs and whose homes are underwater.

The economy is still broken. It will stay broken until wages rise to meet productivity growth, and until the middle class reclaims much of the wealth that has been stolen by the very wealthy.

by David Atkins, Hullabaloo

Famous Riders

A rider is a contractual proviso that outlines a series of stipulations or requests between at least two parties. While they can be attached to leases and other legal documents, they’re most famously used by musicians or bands to outline how they need their equipment to be set up and arranged, how they like their dressing room organized, and what types of food and beverages they require. Anyone who’s seen Spinal Tap knows these requests can be extremely outrageous and unreasonable. (And, in the case of Iggy Pop's, unexpectedly hilarious.)

I was inspired to create this series after reviewing a few riders from some of the biggest acts in the world, all of which were ridiculous. But what I found most interesting about them is that they offered a glimpse into their larger-than-life personalities.

I initially thought I would try and shoot all of the items listed on the catering riders but quickly realized that this would become an exercise in wasting money. So I decided to focus on the quirkiest requests and shoot them in a Flemish Baroque still-life style because I felt that there was a direct connection between the themes in these types of paintings and the riders: the idea of time passing and the ultimate mortality of a musician’s career as the limelight inevitably fades—they only have a short time in which they are able to make these demands and have them fulfilled.


Frank Sinatra
One bottle each: Absolut, Jack Daniel’s, Chivas Regal, Courvoisier, Beefeater Gin, white wine, red wine. Twenty-four chilled jumbo shrimp, Life Savers, cough drops.


Prince
Coffee and tea setup, including honey, lemon, sugar, cream, fresh ginger root. Physician will be used to administer a B-12 injection.


Britney Spears
Fish and chips, McDonald’s cheeseburgers without the buns, 100 prunes and figs, a framed photo of Princess Diana.

by Henry Hargreaves, Vice | Read more:
Photography and Direction: Henry Hargreaves
Prop Styling: Caitlin Levin

The Kentucky Derby is Decadent and Depraved

[ed. In honor of Derby weekend, Hunter S. Thompson's classic.]

"The Kentucky Derby Is Decadent and Depraved"
By Hunter S. Thompson
From Scanlan's, June 1970

Welcome to Derbytown

I GOT OFF the plane around midnight and no one spoke as I crossed the dark runway to the terminal. The air was thick and hot, like wandering into a steam bath. Inside, people hugged each other and shook hands … big grins and a whoop here and there: "By God! You old bastard! Good to see you, boy! Damn good … and I mean it!"

In the air-conditioned lounge I met a man from Houston who said his name was something or other — "but just call me Jimbo" — and he was here to get it on. "I'm ready for anything, by God! Anything at all. Yeah, what are you drinkin?" I ordered a Margarita with ice, but he wouldn't hear of it: "Naw, naw … what the hell kind of drink is that for Kentucky Derby time? What's wrong with you, boy?" He grinned and winked at the bartender. "Goddam, we gotta educate this boy. Get him some good whiskey … "

I shrugged. "Okay, a double Old Fitz on ice." Jimbo nodded his approval.

"Look." He tapped me on the arm to make sure I was listening. "I know this Derby crowd, I come here every year, and let me tell you one thing I've learned — this is no town to be giving people the impression you're some kind of faggot. Not in public, anyway. Shit, they'll roll you in a minute, knock you in the head and take every goddam cent you have."

I thanked him and fitted a Marlboro into my cigarette holder. "Say," he said, "you look like you might be in the horse business … am I right?"

"No," I said. "I'm a photographer."

"Oh yeah?" He eyed my ragged leather bag with new interest. "Is that what you got there — cameras? Who you work for?"

"Playboy," I said.

He laughed. "Well goddam! What are you gonna take pictures of — nekkid horses? Haw! I guess you'll be workin' pretty hard when they run the Kentucky Oaks. That's a race just for fillies." He was laughing wildly. "Hell yes! And they'll all be nekkid too!"

I shook my head and said nothing; just stared at him for a moment, trying to look grim. "There's going to be trouble," I said. "My assignment is to take pictures of the riot."

"What riot?"

by Michael MacCambridge, Grantland |  Read more:
Image: Ralph Steadman

Saturday, May 4, 2013



Anders Gjennestad aka Strøk
via: StreetArtNews

Your Body Does Not Want to Be an Interface


The first real-world demo of Google Glass’s user interface made me laugh out loud. Forget the tiny touchpad on your temples you’ll be fussing with, or the constant “OK Glass” utterances-to-nobody: the supposedly subtle “gestural” interaction they came up with–snapping your chin upwards to activate the glasses, in a kind of twitchy, tech-augmented version of the “bro nod”–made the guy look like he was operating his own body like a crude marionette. The most “intuitive” thing we know how to do–move our own bodies–reduced to an awkward, device-mediated pantomime: this is “getting technology out of the way”?

Don’t worry, though–in a couple years, we’ll apparently be able to use future iterations of Glass much less weirdly. A Redditor discovered some code implying that we’ll be able to snap photos merely by winking. What could be more natural and effortless than that? Designers at Fjord speculate that these kinds of body-based micro-interactions are the future of interface design. “Why swipe your arm when you can just rub your fingers together?” they write. “What could be more natural than staring at something to select it, nodding to approve something?… For privacy, you’ll be able to use imperceptible movements, or even hidden ones such as flicking your tongue across your teeth.”

These designers think that the difference between effortless tongue-flicking and Glass’s crude chin-snapping is simply one of refinement. I’m not so sure. To me they both seem equally alienating–I don’t think we want our bodies to be UIs.

The assumption driving these kinds of design speculations is that if you embed the interface–the control surface for a technology–into our own bodily envelope, that interface will “disappear”: the technology will cease to be a separate “thing” and simply become part of that envelope. The trouble is that unlike technology, your body isn’t something you “interface” with in the first place. You’re not a little homunculus “in” your body, “driving” it around, looking out Terminator-style “through” your eyes. Your body isn’t a tool for delivering your experience: it is your experience. Merging the body with a technological control surface doesn’t magically transform the act of manipulating that surface into bodily experience. I’m not a cyborg (yet) so I can’t be sure, but I suspect the effect is more the opposite: alienating you from the direct bodily experiences you already have by turning them into technological interfaces to be manipulated. (...)

If this is starting to sound like philosophy, don’t blame me. In his book Where The Action Is, computer scientist Paul Dourish invokes Martin Heidegger (yikes!) to explain the difference between technology that “gets out of the way” and technology that becomes an object of attention unto itself. Heidegger’s concept of “ready to hand” describes a tool that, when used, feels like an extension of yourself that you “act through”. When you drive a nail with a hammer, you feel as though you are acting directly on the nail, not “asking” the hammer to do something for you. In contrast, “present at hand” describes a tool that, in use, causes you to “bump up against some aspect of its nature that makes you focus on it as an entity,” as Matt Webb of BERG writes. Most technological “interfaces”–models that represent abstract information and mediate our manipulation of it–are “present at hand” almost by definition, at least at first. As Webb notes, most of us are familiar enough with a computer mouse by now that it is more like a hammer–“ready to hand”–than an interface standing “between” us and our actions. Still, a mouse is also like a hammer in that it is something separate-from-you that you can pick up and set down with your hands. What if the “mouse” wasn’t a thing at all, but rather–as in the Fjord example of “staring to select”–an integrated aspect of your embodied, phenomenal experience?

by John Pavlus, MIT Technology Review |  Read more:
Image by Jason Brush

Call of the Sea


[ed. Beautiful web site, and there are marine snails around here like the ones pictured so I may give this recipe a try.]

Sometimes on Sundays I get the call of the sea, especially when the sun shines through my bedroom window as I wake up in the morning. The ocean is only a few minutes away and it’s as if the rays of the sun bring la mer closer to my home. On beautiful days like that, I love going to the market in Soulac-sur-Mer, a timeless belle époque sea-side village in Médoc.



For starters, I served lovely amandes de mer sautéed with garlic, olive oil and parsley with a dash of piment d’espelette. The poetic amandes de mer, in other words ‘sea almonds’, are called dog cockles in English. They have an almond-like flavor, and are cooked just like clams. While I was preparing lunch I couldn’t resist a few bulots, French marine snails that are so delightful dipped in a freshly whipped mayonnaise with a glass of crisp white wine. (...)

Daurade with herbs (serves 2)

1 daurade/ sea bream fish, approx 800-900 g, scaled and gutted
1 small onion, finely diced
1 tbsp Dijon mustard
2 spring onions, sliced
2 garlic cloves, finely sliced
4 tbsp fresh lemon juice
Grated lemon zest from 1 lemon
8 sprigs of thyme
6 bay leaves
A handful of parsley
1 tbsp mustard
Sea-salt & black pepper

Preheat the oven to 210°C / 410°F.

Place a large piece of aluminium foil onto a clean surface. Add an equal-sized layer of foil on top. Fold over the edges so they are secured together.
Place the fish onto the foil. Spoon one tablespoon of mustard and rub inside the fish cavity. Stuff the diced onion inside (keep 1 tbsp to scatter later on fish). Sprinkle fish with lemon zest, chopped parsley, thyme, diced onion, garlic, sliced spring onions and lemon juice. Drizzle olive oil all over fish, and add the bay leaves, inserting one in the fish cavity. Season with sea-salt and black pepper. Add another sheet of aluminium foil and carefully seal all edges of the foil to form an enclosed parcel. It should be tightly sealed so that the fish steams as it cooks without any steam escaping.
Place the fish in a roasting tray and transfer to the preheated oven for 20-25 minutes, depending on oven strength. When cooked, remove from the oven and place onto a large serving plate. Carefully undo the foil.
Sprinkle chopped fresh parsley and squeeze fresh lemon juice. Serve immediately with steamed vegetables.

by Mimi Thorisson, Manger |  Read more:
Photos: Mimi Thorisson

A History of Like


If you blog, run a university home page, do e-commerce, write news articles for a local paper, have a local government site, or do nearly anything with the Internet, you’re pretty much required to have users “like” your pages. Otherwise, you’re going to be left out of the new economy of quantified affect. We live in what Carolin Gerlitz and Anne Helmond call a Like Economy, a distributed centralized Web of binary switches allowing us to signal if we like something or not, all powered by the now ubiquitous Like button.

But why “Like”? Why not “Love,” or “I agree,” or “This is awesome”? At first it seems like one of those accidents of popular culture, where an arbitrary boardroom decision eventually dictates our everyday language. In fact, one history of Facebook’s Like button presents it in these very terms: Facebook engineer Andrew Bosworth noted that the button began as an Awesome button but was later changed to Like because like is more universal. If it had stayed Awesome, perhaps we’d be talking about an economy of Awesomes binding together the social Web and we would sound more like Teenage Mutant Ninja Turtles than Valley Girls.

There’s a deeper history to “like,” though, that is far older than Facebook. The marketing subfield of Liking Studies, which began before Internet use became mainstream, is key to understanding how this somewhat bland, reductive signal of affect became central to the larger consumer economy we live in. It also explains why Facebook will never install a Dislike button. (...)

For a largely empirical, positivist field such as marketing – which has pretensions of being a science, not an art – independent variables such as likability have value because of their perceived universal predictive power. With globalization, marketing is in greater need of just such a universal measure capable of predicting the success of global branding campaigns across cultural contexts. Cultural variations might change how marketers go about getting us to like brands, but the goal is always likability. (...)

So what is new about Facebook and the Like button? Oddly enough, it reveals too much. The great sin of Facebook is that it made “like” far too important and too obvious. Marketing is in part the practice of eliding the underlying complexity, messiness, and wastefulness of capitalist production with neat abstractions. Every ad, every customer service interaction, every display, and every package contributes to the commodity fetish, covering up the conditions of production with desire and fantasy. As such, Facebook may reveal too much of the underlying architecture of emotional capitalism. The Like button tears aside this veil to reveal the cloying, pathetic, Willy Lomanesque need of marketers to have their brands be well-liked. Keep liking, keep buying. Like us! Like us! Like us!

Liking in marketing was always meant to be a metonym for many other complex processes — persuasion, affect, cognition, recall — but it wasn’t meant to be exposed to the public as such. In Facebook, however, the “Like” button further reduces this reduction and makes it visible, making the whole process somewhat cartoonish and tiresome. The consequences can be seen in “Like us on Facebook to enter to win!” promotions and the obsession with Like counts among businesses large and small (not to mention the would-be “personally branded”).

by Robert W. Gehl, TNI |  Read more:
Image: uncredited

You Are What You Buy

The most widely read takedown of foodie-ism is probably B.R. Myers’ “The Moral Crusade Against Foodies,” which was published in the Atlantic a little over two years ago. Myers’ essay is an entertaining, even thrilling, bit of rhetoric: He cherry-picks several tone-deaf, unwittingly callous exaltations of overeating and indifference to suffering from the likes of Anthony Bourdain and Jeffrey Steingarten and then minces each citation into pulp with a well-tuned food processor of moral outrage. The scope of Myers’ argument against foodies is fairly narrow—he abhors their glorification of butchering and eating meat—but it’s little wonder his piece found an audience beyond vegans. The foodie—like his ubiquitous but hard-to-define cultural cousin, the hipster—is a figure many love to hate. But whereas hipsters irritate because they’re seen as being politically apathetic, devoid of any values to speak of, foodies are annoying for their air of moral superiority.

Alison Pearlman’s new book, Smart Casual: The Transformation of Gourmet Restaurant Style in America, is Myers’ stylistic opposite: dry, academic, factual, judicious. As such, it’s unlikely to find half the readership of Myers’ Atlantic piece. But the conclusions Pearlman draws in Smart Casual, bolstered by painstaking research, make it a much harsher commentary on foodie culture than Myers’ tour de force.

At first glance, Smart Casual is a limited, lopsided little book, just four chapters and 145 pages long. Pearlman’s self-appointed task is to examine the rise of omnivorous taste in fine dining—“omnivorous,” in this case, referring to the consumption of both traditionally highbrow and traditionally lowbrow culture. Just 50 years ago, fancy restaurants adhered to a strict formula: white tablecloths, French food (described on French-language menus), maître d’s who built careful reputations on their ability to cater to the right kind of people. Today, chefs are celebrated rather than maître d’s, and they build reputations on their creativity, not their adherence to tradition. What’s more, though chandeliers and complicated place settings persist in certain upscale eateries, expensive restaurants are characterized as often as not by eclectic interior design or even aggressively casual atmospheres. (Think of Mission Chinese Food’s original outpost in San Francisco, sharing a seating area with a dive-y Chinese takeout joint.) Pearlman, an art historian at Cal Poly Pomona, seeks to tease apart the causes and meanings of this sea change in the way the gourmet set eats. (...)

It used to be that human ingenuity was valued in the kitchen. Now, what matters more is chefs’ knowing the right producers and buying the right products. Culinary excellence can no longer be achieved simply by learning the right technique; it can be acquired only by knowing the right things to buy—and by, it needs hardly be said, shelling out however much money it takes to buy them. In this way, modern foodies’ materialistic definition of refinement is more exclusive than that of yesteryear’s dogmatic French cooking. What appears to be a celebration of the natural and the simple is in fact more constrictive and less attainable, because it depends not on talent but on means and access. (In this way, the evolution of culinary refinement reminds me of the concurrent evolution of women’s fashions, which used to let women hide imperfections by wearing girdles but now require women to maintain lithe frames without any artifice—an even more oppressive requirement.)

Materialism and agricultural name-dropping have not snuffed out all appreciation for skill — indeed, as Pearlman chronicles, the ascendance of ingredient worship has paralleled a polar-opposite trend, that of modernist cuisine. Born in Ferran Adrià’s elBulli in Catalonia, Spain, and raised in American outposts like WD-50 in New York and Alinea in Chicago, modernism utilizes laboratory chemicals and equipment to give foods surprising appearances and textures. Modernist chefs are often hailed as avant-gardists, but the pieces Pearlman highlights in Smart Casual reveal a troublingly reactionary attitude. Deconstructed, disguised, minimized reinterpretations of Heath bars, doughnuts, cheesesteaks, and burgers simultaneously mock anyone unhip enough to prefer the original version and applaud their eater’s advanced palate and dainty appetite.

by L.V. Anderson, Slate |  Read more:
Illustration by Lisa Hanawalt