Saturday, February 18, 2017

The Ultimate Pursuit in Hunting: Sheep

For the herd of bighorn sheep, the rocky cliffs were a safe place, with 360-degree views and plenty of nooks to blend into the gray rocks. The ground was sprinkled with scat, and the air carried a scent like a barnyard. Thousands of feet below, the landscape unfurled into a smooth checkerboard of ranch land that stretched to the horizon. The only threat up here would be to newborn lambs, susceptible to being plucked away by eagles.

Crouched behind a stand of rocks last spring, Brendan Burns, a 38-year-old with a growing reputation as a sheep hunter and guide, peered over the edge, careful not to be seen or heard. Wild sheep have acute senses, and when they spook, they bolt as one, like a flock of birds. But the sheep were not home. Amid the panorama below, Burns spotted a constellation of tiny dots in a faraway meadow. The horns gave them away.

“There aren’t a lot of circles in the wild,” Burns whispered. “When you see something curved — and they kind of shine, they have this kind of glow to them — you learn to pick them up. You just train your eye to it.”

He pulled a high-powered Swarovski scope from his pack and aimed it downhill. Eight years before, there were no sheep here. Then 21 ewes and five juvenile rams were transplanted to the Rocky Boy’s Reservation of the Chippewa Cree, which straddles part of the Bears Paw Mountains, an islandlike rise on the plains.

The herd quickly grew to 100, and 40 were relocated to South Dakota. It has again grown to over 100, and another 40 are likely to be transplanted this spring, part of broad attempts to rebuild sheep populations that are a fraction of what they once were in the West.

“There’s obviously no coyotes around, for them to be that low and feel comfortable,” Burns whispered. “This is a nice day to be a sheep.”

There were more sheep on a closer ridge, but in this group, Burns counted 38, including 11 rams.

“That gray one in the middle is the oldest one,” he said. “We’ll probably come back and hunt him in the fall.”

A man from Michigan had paid $100,000 for the year’s only chance to hunt one sheep in the herd on the Rocky Boy’s Reservation. Burns brought him there in October, and the men traipsed through the steep and rocky terrain for days before getting themselves in position for a clean shot. The ram was 10 years old, with a scar on its forehead, a cloudy eye and several missing teeth.

Its massive horns and about 80 pounds of meat were hauled back to Michigan. In exchange, the Chippewa Cree tribe at Rocky Boy’s received the $100,000, which was used to fund two tribal game wardens overseeing wildlife on the reservation.

It is a paradox of hunting, rarely so conspicuous as with wild sheep: The hunters are often the primary conservationists. In 2013, a permit in Montana sold for $480,000, still a record. Burns assisted on that hunt, too, over 18 days in the Upper Missouri River Breaks. The result was a large ram, and hundreds of thousands of dollars that went into the budget of Montana Fish, Wildlife & Parks.

“As far as sheep-hunting being a rich man’s sport, that’s absolutely true,” said Vance Corrigan, 84, who lives along the Yellowstone River in Livingston, Mont., and is one of the most accomplished big-game hunters in the world. “But if it weren’t for the rich man, those sheep wouldn’t be there.” (...)

“Some rich people are into yachts or floor tickets to the Lakers,” Burns said. “Some sheep-hunt.”

What they are not buying is an easy trophy. Sheep live in steep and treeless terrain, above the timberline in the mountains or in the rugged hills of the desert. Sheep hunts can take hunters into places few humans have gone, and can include weeks of trekking and stalking.

“For the true hunter, you can’t buy them behind the fence,” Kronberger said. “You have to climb the mountain. The fat, rich guy is going to have a much harder time. Anybody can kill a bear if they sit on the beach or along the stream long enough. I could take a guy in a wheelchair and get him a bear. You can go and get your deer, get your elk. You can’t do that with sheep. You have to go and get it.”

All that can be hard for non-hunters to understand. Those who have trophy rooms filled with a wide selection of mounts, like Corrigan and Kronberger, said that guests are rarely attracted to the sheep at first, instead taken by the more glamorous and fearsome animals. It is like a litmus test for hunting credibility.

“If I brought 1,000 people into my trophy room, almost all of them would go to the bears and say, ‘Wow, look at the bears,’” Kronberger said. “Only a few know to go to the sheep — the other sheep hunters. Half the time, people call the sheep a goat.”

by John Branch, NY Times |  Read more:
Image: Leah Nash

Broiled Fish With Lemon Curry Butter

Broiled fish fillets topped with a little butter and a squirt of lemon is a quick, easy weeknight staple. But when the butter is spiked with plenty of garlic, a jolt of curry powder and piquant fresh ginger, then brightened with fresh herbs, it becomes a superb, company-worthy dish that still cooks in under 10 minutes. Use your favorite fish here; any mild fillet will allow the buttery sauce to shine.

by Melissa Clark, NY Times |  Read more:
Image: Alec Cohen

Ingredients

4 tablespoons unsalted butter
4 garlic cloves, finely grated or minced
1 ½ tablespoons minced thyme leaves
1 ½ teaspoons curry powder
1 ½ teaspoons grated ginger
¼ teaspoon fine sea salt, more as needed
¾ teaspoon finely grated lemon zest
Ground black pepper, to taste
4 (6-ounce) blackfish, flounder or hake fillets
Fresh lemon juice, for serving
Dill fronds or fresh parsley, for serving

Preparation

Heat the broiler. In a small saucepan over medium heat, melt butter. Stir in garlic, thyme, curry powder, ginger and ¼ teaspoon salt; heat until fragrant, about 1 minute. Stir in lemon zest.

Season fish with salt and pepper and place on a rimmed baking sheet. Pour sauce over fish and broil until fish is flaky and cooked through, about 5 minutes. Top with a squeeze of lemon juice and fresh dill, and serve.

Friday, February 17, 2017

American Carnage

“What happened, John?” asks Viggo Tarasov toward the end of 2014’s John Wick. “We were professionals. Civilized.”

This is around the time Tarasov gets a knife lodged in his chest — courtesy of the man in question, John Wick, whom he and other underworld criminals call Baba Yaga: “the boogeyman.” Not the kind of guy you want to piss off, you’d think. But at the start of the movie, Tarasov’s manchild of a son, Iosef, breaks into Baba Yaga’s home, kills his puppy, and steals his 1969 Boss 429 Mustang, a car as sleek and dangerous as a silver bullet. To stop him from seeking revenge on his son, Tarasov sets assassins on Wick’s tail. By the time of this little speech, Wick has already successfully killed or evaded them all. “Do I look civilized to you?” asks Wick before he finishes him off.

Wick is a maniac — but a relatable one. Iosef’s violation was no mere break-in or puppy murder, after all, but a call to war. Bonnie had Clyde, Thelma had Louise, and John Wick had a dog: a partner in crime, sure, but more importantly, a companion that allowed Wick, who’d left the killing business behind for love, to imagine a better future for himself. The puppy, Daisy, was a gift from his recently buried wife. “Now that I have found my peace,” read her note with the dog, “find yours.”

In the world of John Wick, which continues Friday with the release of John Wick: Chapter 2, there’s no such thing as peace. The violence in the Wick films favors hand-to-hand, close-up battles of the will, precise kill shots, and a knife or two. It deservedly inspires comparisons to scenes by Hong Kong action geniuses like John Woo. But the sense of a distinct social world that’s closing in on itself, from which Wick cannot escape, is the Wick movies’ unique shtick. It’s heightened in Chapter 2, which expands outward from the first film’s premise to show us the hidden world order to which Wick belongs, a miniature universe of criminals and their henchmen. There’s an entire service industry of cleaners, doctors, bankers, and weapon sellers aiding men like Wick. They have their own currency and their own code. Hierarchies of hidden power and influence preside over it all.

That greater sense of world-building is what keeps the carnage in these movies from feeling arbitrary. It’s part of what sets the Wick films apart from other action films. The other part is the action itself. The directors of the original, Chad Stahelski and David Leitch, are both veteran stuntmen. For Chapter 2, Stahelski directs alone. The fight scenes he dreams up in Chapter 2 have a rare sense of danger and spontaneity. More impressively, they heighten our sense of Wick’s character: his focus, skill, and utter singularity. The plot of Chapter 2 is essentially an excuse to draw Wick back into the business and turn everyone against him. It’s a way of showing us how big this criminal underworld is, such that entire fight scenes seemingly play out in an alternate universe happening invisibly alongside our own. It all gives the violence an irresistible grandeur. Wick isn’t merely a man on a revenge mission; by the end, he’s a man trying to fight his way through an entire social order. The John Wick universe is entirely defined by social bonds, be they friendships, debts, or, in Wick’s case, blood oaths. Violence breaks these bonds — hence the intimacy of it. Wick shoots his enemies from such a short distance that his gun seemingly cuts through them like it’s a sword. It feels personal. His fights play out like negotiations, as if he’s asking, Just when is it that you plan to give up and die?

Stahelski sends Wick cascading down the stairs of Roman ruins mid-fight. He has him shoot his way out of a literal hall of mirrors, and fire into a crowd at Lincoln Center. It’s surreal — and not. The filmmaking is knowingly romantic — Wick is an astonishing force to watch — but the director doesn’t let us forget the implications of shooting or stabbing someone. That’s no romance. The violence Wick enacts on others feels unusually visceral; you can tell the filmmaker has done some thinking about what a gunshot does to a body. We feel every kill — even as we indulge the fantasy, laughing along with an admiring foe as he says, “I can assure you that the stories about this man — if nothing else — have been watered down.” Wick is such a marvel that even his enemies talk about him with smiles on their faces. He’s a legend, as abstract as air. But his kill shots are concrete reminders of his existence.

You can say the heavy backstory — the dying wife; the puppy as a last, lost symbol of hope; the criminal looking to “get out of the game” — is maudlin, even painfully familiar from every movie of its kind. But in truth these facts of Wick’s character are mere scaffolding for a story about a man we can only understand through genre tropes and myths. They’re not a way in: They’re a way around. As played with cold aplomb by Keanu Reeves, Wick is a man with unexpressed torrents of rage and sadness bubbling beneath that slick exterior. His face is calm, inexpressive, practically bulletproof. Reeves, an actor whose greatest talent might be lulling us into thinking he has no range and then pulling it out of nowhere, is perfectly cast. He growls and grimaces through his beard like a Yung Eastwood in the making — but quietly, with splashes of Alain Delon’s coolness circa Le Samourai.

by K. Austin Collins, The Ringer | Read more:
Image: Lionsgate/Ringer illustration

Obama's Lost Army

[ed. See also: Dems: New Obama Activist Army is B.S.]

On July 20, 2008, Mitch Kapor, the creator of Lotus 1-2-3 and a longtime denizen of Silicon Valley’s intellectual elite, dialed in to a conference call hosted by Christopher Edley Jr., a senior policy adviser to Barack Obama’s presidential campaign. Joining them on the line were some of the world’s top experts in crowdsourcing and online engagement, including Reid Hoffman, the billionaire co-founder of LinkedIn, and Mitchell Baker, the chairman of Mozilla. Drawing on Kapor’s influence, Edley had invited them to join a “Movement 2.0 Brainstorming Group.” Together, they would ponder a crucial question: how to “sustain the movement” should Obama, who was still a month away from accepting the Democratic nomination, go on to win the White House.

Edley had been a personal friend of Obama’s since his days teaching him at Harvard Law School. Their kinship had been underscored the previous summer, when Obama had invited Edley to the Chicago apartment of Valerie Jarrett, the candidate’s closest confidant, to deliver a stern lecture to the seasoned political operatives who were running his underdog bid for the presidency. The campaign team had Obama on a relentless pace of town halls and donor calls, and Hillary Clinton had been besting him in the early primary debates. Both Barack and Michelle Obama were unhappy. According to John Heilemann and Mark Halperin’s account in Game Change, Edley urged Obama’s campaign managers to schedule fewer rallies and fund-raisers, and allow the candidate more time to think and develop innovative policy ideas.

The intervention, delivered with a full-blown harangue telling the troika managing the campaign—David Axelrod, David Plouffe, and Robert Gibbs—to “get over yourselves,” was deeply resented by the political professionals; in his memoir, Believer, Axelrod would later call Edley “systematically antagonizing.” But Jarrett and Michelle Obama, who was also in the meeting, hung on Edley’s every word. “He’s channeling Barack,” Jarrett thought, according to Game Change. Jarrett told Axelrod she thought Edley’s fiery presentation had been “brilliant.”

Now, a year later, Edley had been moved over to Obama’s still-secret transition team, helping to map out policy and personnel on education, immigration, and health care. It was a better fit for Edley, a dapper and soft-spoken law professor with a salt-and-pepper beard, who had served in senior policy-making roles under Jimmy Carter and Bill Clinton. “Although I have worked in five presidential campaigns,” he told me recently, “I hate them because there is never enough emphasis on policy.” But Edley found himself newly motivated by a single big political idea, born in part from his past experience trying to win policy fights. What if Barack Obama could become not only the first black man elected president, but the first president in history to organize an enduring grassroots movement that could last beyond his years in office?

By that point in the race, there was every reason to think that Obama could build a lasting grassroots operation. His political machine had already amassed more than 800,000 registered users on My.BarackObama, its innovative social networking platform. “MyBO,” as it was known, gave supporters the ability—unthinkable in a traditional, top-down political campaign—to organize their own local groups, campaign events, and fund-raising efforts. Its potential for large-scale organizing after the election was vast—and completely without precedent in American politics. By Election Day, Obama’s campaign would have 13 million email addresses, three million donors, and two million active members of MyBO, including 70,000 people with their own fund-raising pages. This wasn’t just some passive list of campaign supporters, Edley realized—it was an army of foot soldiers, seasoned at rallying support for Obama’s vision of change.

“As the primary season wound down, it struck me that the campaign’s broad-based engagement via the internet could evolve into a powerful tool to shape progressive politics at the national, state, and local levels,” Edley recalls. “One goal would be to support an Obama presidency. But the agenda would be far broader.”

After discussing his idea with his wife, Maria Echaveste, who had served as White House deputy chief of staff under Bill Clinton, Edley turned to his friend Kapor, a digital pioneer and progressive activist who was widely seen as a folk hero of the computer revolution. “I knew that Mitch would be an indispensable partner to judge the merits of the general idea and help figure out some details,” Edley says. “I also realized, quite quickly, that Mitch had amazing contacts in that world whom we could enlist for the project.”

Opening the July brainstorming session, Edley framed the stakes sharply, according to notes he prepared for the meeting and a summary he wrote afterward. “On the morning of November 5,” he told the assembled tech leaders, “imagine saying to millions of donors, new voters, volunteers: ‘Thanks for everything; so long.’” Instead, he urged, “Imagine a way to transfer/transmute all of that involvement into a new mechanism or set of instrumentalities through which people can feel a heightened and more powerful kind of civic engagement with each other and with Obama and other leaders. And vice versa.”

Edley echoed what many progressives were beginning to believe was possible with a President Obama: “There is a rare opportunity to have a citizen movement heading in the same progressive direction as an incumbent president.” According to his notes, the Silicon Valley luminaries on the call agreed. “Most felt it would be an unacceptable loss not to take advantage of the rare alignment of an incumbent President with a progressive agenda, and an online constituency of donors and supporters who can press for change against the inevitable upsurge of entrenched special interests which will resist it.”

As we now know, that grand vision for a postcampaign movement never came to fruition. Instead of mobilizing his unprecedented grassroots machine to pressure obstructionist lawmakers, support state and local candidates who shared his vision, and counter the Tea Party, Obama mothballed his campaign operation, bottling it up inside the Democratic National Committee. It was the seminal mistake of his presidency—one that set the tone for the next eight years of dashed hopes, and helped pave the way for Donald Trump to harness the pent-up demand for change Obama had unleashed.

“We lost this election eight years ago,” concludes Michael Slaby, the campaign’s chief technology officer. “Our party became a national movement focused on general elections, and we lost touch with nonurban, noncoastal communities. There is a straight line between our failure to address the culture and systemic failures of Washington and this election result.”

The question of why—why the president and his team failed to activate the most powerful political weapon in their arsenal—has long been one of the great mysteries of the Obama era. Now, thanks to previously unpublished emails and memos obtained by the New Republic—some from the John Podesta archive released by WikiLeaks, and others made available by Obama insiders—it’s possible for the first time to see the full contours of why Movement 2.0 failed, and what could have been.

by Micah L. Sifry, TNR |  Read more:
Image: Matt Mallams/Aurora

Thursday, February 16, 2017


Ai Weiwei
via:

Every Successful Relationship is Successful for the Same Exact Reasons

Hey, guess what? I got married two weeks ago. And like most people, I asked some of the older and wiser folks around me for a couple quick words of advice from their own marriages to make sure my wife and I didn’t shit the (same) bed. I think most newlyweds do this, especially after a few cocktails from the open bar they just paid way too much money for.

But, of course, not being satisfied with just a few wise words, I had to take it a step further.

See, I have access to hundreds of thousands of smart, amazing people through my site. So why not consult them? Why not ask them for their best relationship/marriage advice? Why not synthesize all of their wisdom and experience into something straightforward and immediately applicable to any relationship, no matter who you are?

Why not crowdsource THE ULTIMATE RELATIONSHIP GUIDE TO END ALL RELATIONSHIP GUIDES™ from the sea of smart and savvy partners and lovers here?

So, that’s what I did. I sent out the call the week before my wedding: anyone who has been married for 10+ years and is still happy in their relationship, what lessons would you pass down to others if you could? What is working for you and your partner? And if you’re divorced, what didn’t work previously?

The response was overwhelming. Almost 1,500 people replied, many of whom sent in responses measured in pages, not paragraphs. It took almost two weeks to comb through them all, but I did. And what I found stunned me…

They were incredibly repetitive.

That’s not an insult or anything. Actually, it’s kind of the opposite. These were all smart and well-spoken people from all walks of life, from all around the world, all with their own histories, tragedies, mistakes, and triumphs…

And yet they were all saying pretty much the same dozen things.

Which means that those dozen or so things must be pretty damn important… and more importantly, they work.

Here’s what they are:

by Mark Manson, Quartz |  Read more:
Image: Reuters/Lucy Nicholson

Why Craigslist is Unbeatable

Reham Fagiri’s eureka moment was the result of a deal gone wrong.

It was spring 2012, and the recent Wharton graduate was trying to sell her television on Craigslist. A prospective buyer—an older, gray-haired gentleman—came to her Philadelphia apartment to take a look. When he realized Fagiri had accidentally listed the wrong television model, he was irate.

“He got really upset about that: ‘You made me drive all the way here, blah, blah, blah,’” she recalls. Fagiri asked $200 for the TV. He offered $50. When she balked at the deal, the man announced that he was simply going to take the television. He started carrying it away.

“I’m like ‘Well, I’d rather save my life than have to argue about $150,’” she tells me. “So I was like ‘I don’t even want your money. Just take the TV.’”

This is the basic flaw of Craigslist. The site facilitates peer-to-peer interactions, but does little to ensure that those transactions go off seamlessly. After her harrowing encounter, Fagiri began trading Craigslist stories with friends and classmates, many of whom were similarly frustrated with the site. “That was kind of the second step, like, ‘OK, well, clearly it’s beyond me, and it’s my classmates too,’” she remembers.

Months later, that Craigslist experience still on her mind, Fagiri started outlining a business idea: an online used-furniture marketplace dedicated to the proposition that sometimes consumers want a middleman around to shield them from irrational strangers. She called the site AptDeco, and, like Craigslist, it would allow users to list and view ads for used furniture. Unlike Craigslist, it would also process payments, coordinate pickup and delivery, and serve as a buffer between buyer and seller.

“I’m an engineer, so I started playing around with the idea in my free time,” says Fagiri. “And then I built a small site.” On launch day she sold a West Elm headboard. That’s when Fagiri knew she was on to something. “‘Oh, OK!’” she recalls thinking. “‘I guess this is a real business!’”

More than three years after it ushered that headboard to new ownership, AptDeco is thriving and pedigreed — it was part of Y Combinator’s Winter 2014 class. According to Fagiri, the site is also profitable. (“Obviously there’s fluctuations. Some months are better. But overall we’re at the break-even profitability mark.”) For its services, AptDeco takes a 23 percent cut of the sale price and charges a flat delivery fee of either $35, $95, or $145, depending on the size of the item purchased; the site also lets you hire people to remove unwanted furniture or assemble new purchases.

AptDeco’s functional business model earns it a place of honor amongst the many startups that are vying to disrupt the “moving used crap around” space. There is Chairish, founded in 2013, which focuses on designer furniture and has raised almost $9 million in venture funding, according to Crunchbase; Viyet, founded in 2012, which specializes in high-end consignment; Trove, also known as Trove Market, a mobile-focused used-furniture service that, according to Crunchbase, has raised almost $1 million in seed funding; others include Krrb, MarketSquare, and 1stDibs.

All of these startups are jostling to dethrone the unlikeliest market leader in the history of online retail: Craigslist. The site commands vast loyalty despite doing very little to actively court its users. At times, it seems to dominate through sheer inertia. And yet Craigslist abides, and thrives, as its would-be competitors struggle to establish themselves. Which raises the question: Why is it so hard to compete with a site that is only begrudgingly a business?

by Justin Peters, Backchannel |  Read more:
Image: Li-Anne Dias

Wednesday, February 15, 2017



Day's catch (opihi)
photo: markk

Politics 101

‘The Kids Think I’m a Shoe’

Stan Smith the man & Stan Smith the sneaker.

The island of Hilton Head in South Carolina is shaped like a sneaker, and Stan Smith lives on the laces, right off the river. Inside his house, the six-foot-four retired tennis player with the straightest back I’ve ever seen walks out of the second of his two closets and into the living room carrying five pairs of Stan Smiths, the sneaker, but he still can’t find the one he’s looking for. He has 40 pairs in 30 different styles, more or less.

The sneaker’s fame — and its longevity — takes even its namesake by surprise. You see, the Stan Smith is really the most basic of all possible sneakers. Its narrow white leather body is cushioned at the front with an almost-orthopedic round toe. Its three understated Adidas stripes are nearly missable perforations, as if they don’t care to be recognized, and it has just two spots of color, most classically in green: a tab on the back of the ankle and Smith’s face printed on the tongue. They are essentially anonymous, the saltine cracker of tennis shoes. They were endorsed by Stan Smith just after he won his first Grand Slam singles title in the summer of 1971 and just before he won his second, and last, the next year. He was, in other words, no Serena Williams, not even a Rod Laver.

Nothing about Smith or the simple design of the sneaker itself — neither has changed much since 1971 — explains how Adidas was able to sell 7 million pairs by 1985. Or how that number had grown to 22 million pairs by 1988. Or why Footwear News named it the first-ever Shoe of the Year in 2014. Or how it surpassed 50 million shoes sold as of 2016. Or how the sneaker grew far beyond its start as a technical athletic shoe and became a fashion brand, its basic blank slate evolving and taking on new meaning and purpose. (...)

With his Adidas contract, Smith became one of the first American tennis players to receive an endorsement deal. It was the very beginning of the modern brand-athlete pairings that would, a little over a decade later, lead to Michael Jordan’s very own Air Jordan line, and three decades after that, to LeBron James’s reported $1 billion lifetime endorsement deal with Nike. But when Smith was playing, none of that existed yet. If you made it to the Roland Garros main draw, you would get “six shirts, a vest sweater, a regular sweater, socks, and that’s about it,” he says, counting the items off on his fingers. “You wanted to get in the main draw, so you could get the full set of clothes.”

His agent, Donald Dell, negotiated the picture of Smith’s face on the tongue, a savvy move that made the man inseparable from the sneaker, but Haillet’s name remained on the shoe until 1978, when Smith took over for good. It was by then the premier tennis sneaker. Smith remembers being beaten by opponents wearing his face on their feet. “I didn’t think it was appropriate,” he says. There was an Argentine player named Ricardo Cano, Smith recalls, who was signed to another brand but wore Stan Smiths anyway and drew the other company’s logo on the side of the shoes. The Stans were just that much better.

Smith retired from tennis in 1985. How the sneakers, 43 years after their creation, became suddenly ubiquitous is a case study in how “cool” is created and disseminated from image-makers to mainstream consciousness. In the mid-’90s, while Nike consumed the American sneaker market, a small circle of offbeat celebrities and influential marketing professionals latched onto the shoe as a sort of anti-fashion fashion statement, part of a Waspy, but not too Waspy, vintage style they helped pioneer: tucked-in Brooks Brothers shirts with ill-fitting corduroys or khakis. It helped if you drove a vintage Mercedes.

Stan Smiths fit perfectly with this aesthetic. Here was a shoe that you could buy new, but it looked the same as it had in 1971. The skateboarder Rick Howard wore Stan Smiths in a 1993 skate video sponsored by Girl Skateboards, a company co-founded by Spike Jonze. Mike Mills, who recently directed 20th Century Women, but back then designed album covers for the Beastie Boys and Sonic Youth, was more into Rod Lavers, another Adidas tennis sneaker from the ’70s, but his friend Roman Coppola, who founded the ad agency the Directors Bureau and who later wrote Moonrise Kingdom with Wes Anderson, preferred Stan Smiths. “I’ve owned a few pairs over the years, but don’t remember any specific movement or discussion around it,” he says. His sister Sofia Coppola wore them, too. By the early aughts, branding experts such as Andy Spade, who had launched and popularized his wife Kate Spade’s company, were starting to reinterpret the retro-nostalgia look for the likes of J.Crew, Warby Parker, and Shinola, to great financial success.

Then came Phoebe Philo, the creative director of Céline. In March 2011, Philo took her bow on the Céline runway at the end of the fall-winter ready-to-wear show in Stan Smiths along with low-slung black trousers and a gray turtleneck, hair tucked in. The timing could not have been better. Philo was at the peak of her influence and power. Every editor and professional fashion woman from New York to London to Paris was shopping at Céline between the shows. Kanye West had just name-dropped her in his comeback album, My Beautiful Dark Twisted Fantasy, and was so completely bewitched by her ideas that he performed wearing women’s Céline. It was right after the label came out with the luggage bag but just before it became the only bag that seemed to matter. At this height, Phoebe Philo on the runway wearing Stan Smiths was like a gift. Here was something Philo did that everyone could copy for only $75; you could even buy a pair on Amazon. The shoes took on a new meaning. J.Crew started carrying them. The Stan Smith became fashion’s most important sneaker.

At the time, though, Adidas saw things a bit differently. While the sneaker was becoming popular in the fashion world, it was still sold almost exclusively in sporting-goods stores and often at a discount. “We weren’t really happy with how it was seen and where it was found,” says Torben Schumacher, Adidas’ vice-president of product. Adidas wanted to recalibrate how the shoe was presented.

To do that, Schumacher and Adidas decided to take the sneaker entirely off the market. “The idea of not having the model wasn’t really something that went down well,” says Schumacher, especially since it was just starting to get recognized by this new trendsetting crowd. (Smith’s first thought: “That’s interesting. I don’t really like that too much.”) Still, Schumacher and his team at Adidas spent a year and a half convincing the rest of the company of the merits of the plan. Adidas couldn’t truly reintroduce it to a new higher-end clientele, Schumacher argued, if it was still readily available in the bargain bin. “We wanted it to get the respect it deserved and the conversation about it that it deserved and for it to be seen as a commodity item,” he says. “We thought it needed something bold and drastic to prepare everyone for the story again.” By the time Adidas stopped selling the Stan Smith to places like Foot Locker, the company already had a plan of how and with whom it was going to bring it back. In 2012, the sneakers disappeared.

They began reemerging, subtly but purposefully, the next year — notably in the November 2013 issue of Vogue Paris, for which Gisele Bündchen posed naked, apart from white socks and Stan Smiths (“One of our sons saw that, we had no idea,” Margie says. “It was funny”). On January 15, 2014, they went back on sale in higher-end, fashion-focused stores like Barneys New York and the Parisian boutique Colette, still for under $100. They were instantly devoured. Later that year, Philo formally announced the Stan Smith’s return, once again wearing them while taking her runway bow, this time with wide-leg pants and a camel sweater.

The trickle-down was immediate. In 2015, Adidas sold 8 million pairs of Stan Smiths. Adidas won’t confirm how many it sold in 2016, but some industry experts throw around numbers like 15 million — more than double what it moved in the shoes’ first decade of existence — the same side part and crooked smile leading them wherever they go.

Smith is the first to recognize Philo’s importance. He brings her up on two different occasions over the course of our time together. “She was one of the first to start wearing the shoe,” he remembers. “And then Pharrell Williams,” who basically bowed to Smith when they met at the U.S. Open this summer and now regularly designs his own versions of the sneaker, as does Raf Simons. “Those cost like $400 or something, and it’s the same shoe! It’s really weird, actually.”

For Smith, the sneakers are far more successful, monetarily, than he ever was in his tennis career, during which he made “$1.7 million, or something like that. I read it once,” he says. “The shoe has certainly been more than that.” In the beginning he collected an annual sum for his endorsement. These days, though, he’s paid in royalties.

Smith’s contract with Adidas expires about every five years (he’ll sign next in 2018). So why does Adidas keep Stan Smith around? Why does it need him when it has Phoebe and Gisele and Pharrell and Raf and Kanye? Turns out this 70-year-old former tennis player, who was really more of a doubles star, who has eyebrows like the flailing blowup guy at car dealerships, is the only thing that makes its shoe the original. Which is especially valuable when everybody else in the business is trying to knock off its success.

by Lauren Schwartzberg, The Cut |  Read more:
Image: João Canziani

Tuesday, February 14, 2017

E Unibus Pluram: Television and U.S. Fiction

Act Natural

Fiction writers as a species tend to be oglers. They tend to lurk and to stare. The minute fiction writers stop moving, they start lurking, and stare. They are born watchers. They are viewers. They are the ones on the subway about whose nonchalant stare there is something creepy, somehow. Almost predatory. This is because human situations are writers' food. Fiction writers watch other humans sort of the way gapers slow down for car wrecks: they covet a vision of themselves as witnesses.

But fiction writers as a species also tend to be terribly self-conscious. Even by U.S. standards. Devoting lots of productive time to studying closely how people come across to them, fiction writers also spend lots of less productive time wondering nervously how they come across to other people. How they appear, how they seem, whether their shirttail might be hanging out their fly, whether there's maybe lipstick on their teeth, whether the people they're ogling can maybe size them up as somehow creepy, lurkers and starers.

The result is that a surprising majority of fiction writers, born watchers, tend to dislike being objects of people's attention. Being watched. The exceptions to this rule - Mailer, McInerney, Janowitz - create the misleading impression that lots of belles-lettres types like people's attention. Most don't. The few who like attention just naturally get more attention. The rest of us get less, and ogle.

Most of the fiction writers I know are Americans under forty. I don't know whether fiction writers under forty watch more television than other American species. Statisticians report that television is watched over six hours a day in the average American household. I don't know any fiction writers who live in average American households. I suspect Louise Erdrich might. Actually I have never seen an average American household. Except on TV.

So right away you can see a couple of things that look potentially great, for U.S. fiction writers, about U.S. television. First, television does a lot of our predatory human research for us. American human beings are a slippery and protean bunch, in real life, as hard to get any kind of univocal handle on as a literary territory that's gone from Darwinianly naturalistic to cybernetically post-postmodern in eighty years. But television comes equipped with just such a syncretic handle. If we want to know what American normality is - what Americans want to regard as normal - we can trust television. For television's whole raison is reflecting what people want to see. It's a mirror. Not the Stendhalian mirror reflecting the blue sky and mud puddle. More like the overlit bathroom mirror before which the teenager monitors his biceps and determines his better profile. This kind of window on nervous American self-perception is just invaluable, fictionwise. And writers can have faith in television. There is a lot of money at stake, after all; and television retains the best demographers applied social science has to offer, and these researchers can determine precisely what Americans in 1990 are, want, see: what we as Audience want to see ourselves as. Television, from the surface on down, is about desire. Fictionally speaking, desire is the sugar in human food.

The second great thing is that television looks to be an absolute godsend for a human subspecies that loves to watch people but hates to be watched itself. For the television screen affords access only one way. A psychic ball-check valve. We can see Them; They can't see Us. We can relax, unobserved, as we ogle. I happen to believe this is why television also appeals so much to lonely people. To voluntary shut-ins. Every lonely human I know watches way more than the average U.S. six hours a day. The lonely, like the fictional, love one-way watching. For lonely people are usually lonely not because of hideous deformity or odor or obnoxiousness - in fact there exist today social and support groups for persons with precisely these features. Lonely people tend rather to be lonely because they decline to bear the emotional costs associated with being around other humans. They are allergic to people. People affect them too strongly. Let's call the average U.S. lonely person Joe Briefcase. Joe Briefcase just loathes the strain of the self-consciousness which so oddly seems to appear only when other real human beings are around, staring, their human sense-antennae abristle. Joe B. fears how he might appear to watchers. He sits out the stressful U.S. game of appearance poker.

But lonely people, home, alone, still crave sights and scenes. Hence television. Joe can stare at Them, on the screen; They remain blind to Joe. It's almost like voyeurism. I happen to know lonely people who regard television as a veritable deus ex machina for voyeurs. And a lot of the criticism, the really rabid criticism less leveled than sprayed at networks, advertisers, and audiences alike, has to do with the charge that television has turned us into a nation of sweaty, slack-jawed voyeurs. This charge turns out to be untrue, but for weird reasons.

What classic voyeurism is is espial: watching people who don't know you're there as they go about the mundane but erotically charged little businesses of private life. It's interesting that so much classic voyeurism involves media of framed glass - windows, telescopes, etc. Maybe the framed glass is why the analogy to television is so tempting. But TV-watching is a different animal from Peeping Tourism. Because the people we're watching through TV's framed-glass screen are not really ignorant of the fact that somebody is watching them. In fact a whole lot of somebodies. In fact the people on television know that it is in virtue of this truly huge crowd of ogling somebodies that they are on the screen, engaging in broad non-mundane gestures, at all. Television does not afford true espial because television is performance, spectacle, which by definition requires watchers. We're not voyeurs here at all. We're just viewers. We are the Audience, megametrically many, though most often we watch alone. E unibus pluram. (...)

Not that realities about actors and phosphenes and furniture are unknown to us. We simply choose to ignore them. For six hours a day. They are part of the belief we suspend. But we're asked to hoist such a heavy load aloft. Illusions of voyeurism and privileged access require real complicity from viewers. How can we be made so willingly to acquiesce for hours daily to the illusion that the people on the TV don't know they're being looked at, to the fantasy that we're transcending privacy and feeding on unself-conscious human activity? There might be lots of reasons why these unrealities are so swallowable, but a big one is that the performers behind the two layers of glass are - varying degrees of Thespian talent aside - absolute geniuses at seeming unwatched. Now, seeming unwatched in front of a TV camera is a genuine art. Take a look at how civilians act when a TV camera is pointed at them: they simply spaz out, or else go all rigor mortis. Even PR people and politicians are, camera-wise, civilians. And we love to laugh at how stiff and false non-professionals appear, on television. How unnatural. But if you've ever once been the object of that terrible blank round glass stare, you know all too well how self-conscious it makes you. A harried guy with earphones and a clipboard tells you to "act natural" as your face begins to leap around on your skull, struggling for a seemingly unwatched expression that feels impossible because "seeming unwatched" is, like the "act natural" which fathered it, oxymoronic. Try driving a golf ball as someone asks you whether you in- or exhale on your backswing, or getting promised lavish rewards if you can avoid thinking of a rhinoceros for ten seconds, and you'll get some idea of the truly heroic contortions of body and mind that must be required for Don Johnson to act unwatched as he's watched by a lens that's an overwhelming emblem of what Emerson, years before TV, called "the gaze of millions."

Only a certain very rare species of person, for Emerson, is "fit to stand the gaze of millions." It is not your normal, hard-working, quietly desperate species of American. The man who can stand the megagaze is a walking imago, a certain type of transcendent freak who, for Emerson, "carries the holiday in his eye."(2) The Emersonian holiday television actors' eyes carry is the potent illusion of a vacation from self-consciousness. Not worrying about how you come across. A total unallergy to gazes. It is contemporarily heroic. It is frightening and strong. It is also, of course, an act, a counterfeit impression - for you have to be just abnormally self-conscious and self-controlling to appear unwatched before lenses. The self-conscious appearance of unself-consciousness is the grand illusion behind TV's mirror-hall of illusions; and for us, the Audience, it is both medicine and poison.

For we gaze at these rare, highly trained, seemingly unwatched people for six hours daily. And we love these people. In terms of attributing to them true supernatural assets and desiring to emulate them, we sort of worship them. In a real Joe Briefcase-type world that shifts ever more starkly from some community of relationships to networks of strangers connected by self-interest and contest and image, the people we espy on TV offer us familiarity, community. Intimate friendship. But we split what we see. The characters are our "close friends"; but the performers are beyond strangers, they're images, demigods, and they move in a different sphere, hang out with and marry only each other, seem even as actors accessible to Audience only via the mediation of tabloids, talk show, EM signal. And yet both actors and characters, so terribly removed and filtered, seem so natural, when we watch.

Given how much we watch and what watching means, it's inevitable - but toxic - for those of us fictionists or Joe Briefcases who wish to be voyeurs to get the idea that these persons behind the glass, persons who are often the most colorful, attractive, animated, alive people in our daily experience, are also people who are oblivious to the fact that they are watched. It's toxic for allergic people because it sets up an alienating cycle, and also for writers because it replaces fiction research with a weird kind of fiction consumption. We self-conscious Americans' oversensitivity to real humans fixes us before the television and its ball-check valve in an attitude of rapt, relaxed reception. We watch various actors play various characters, etc. For 360 minutes per diem, we receive unconscious reinforcement of the deep thesis that the most significant feature of truly alive persons is watchableness, and that genuine human worth is not just identical with but rooted in the phenomenon of watching. And that the single biggest part of real watchableness is seeming to be unaware that there's any watching going on. Acting natural.

by David Foster Wallace, The Free Library |  Read more:
Image: Naldz Graphics, TV MAN

Stanford Students Recreate 5,000-year-old Chinese Beer Recipe

For a hands-on view into the ancient world, students brewed beer from a 5,000-year-old recipe as part of an archaeology course with Professor Li Liu.

On a recent afternoon, a small group of students gathered around a large table in one of the rooms at the Stanford Archaeology Center.

A collection of plastic-covered glass beakers and water bottles filled with yellow, foamy liquid stood in front of them on the table, at the end of which sat Li Liu, a professor in Chinese archaeology at Stanford.

White mold-like layers floated on top of the liquids. As the students removed the plastic covers, they crinkled their noses at the smell and sour taste of the odd-looking concoctions, which were the results of their final project for Liu’s course Archaeology of Food: Production, Consumption and Ritual.

The mixtures were homemade beer students made using ancient brewing techniques of early human civilizations. One of the experiments imitated a 5,000-year-old beer recipe Liu and her team revealed as part of published research last spring.

“Archaeology is not just about reading books and analyzing artifacts,” said Liu, the Sir Robert Ho Tung Professor in Chinese Archaeology. “Trying to imitate ancient behavior and make things with the ancient method helps students really put themselves into the past and understand why people did what they did.”

The ancient recipe

Liu, together with doctoral candidate Jiajing Wang and a group of other experts, discovered the 5,000-year-old beer recipe by studying the residue on the inner walls of pottery vessels found in an excavated site in northeast China. The research, which was published in Proceedings of the National Academy of Sciences, provided the earliest evidence of beer production in China so far.

The ancient Chinese made beer mainly with cereal grains, including millet and barley, as well as with Job’s tears, a type of grass in Asia, according to the research. Traces of yam and lily root parts also appeared in the concoction.

Liu said she was particularly surprised to find barley – which is used to make beer today – in the recipe, because the earliest evidence to date of barley seeds in China is only 4,000 years old. The 5,000-year-old residue thus pushes barley’s presence in China back a millennium, and suggests why the crop, first domesticated in western Asia, spread there.

“Our results suggest the purpose of barley’s introduction in China could have been related to making alcohol rather than as a staple food,” Liu said.

The ancient Chinese beer looked more like porridge and likely tasted sweeter and fruitier than the clear, bitter beers of today. The ingredients used for fermentation were not filtered out, and straws were commonly used for drinking, Liu said.

Recreating the recipe

At the end of Liu’s class, each student tried to imitate the ancient Chinese beer using either wheat, millet or barley seeds.

The students first covered their grain with water and let it sprout, in a process called malting. After the grain sprouted, the students crushed the seeds and put them in water again. The container with the mixture was then placed in the oven and heated to 65 degrees Celsius (149 F) for an hour, in a process called mashing. Afterward, the students sealed the container with plastic and let it stand at room temperature for about a week to ferment.
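
[ed. The schedule above, restated as a rough Python sketch for anyone who wants to follow along. The parameters are the article's own; the sprouting step has no stated duration, so it's left as a note.]

# The class's brewing schedule, restated as data. The malting time is not
# given in the article, so it is flagged rather than invented.

BREWING_STEPS = [
    ("malt", "Cover the grain with water and let it sprout (time not given)."),
    ("crush", "Crush the sprouted seeds and cover them with water again."),
    ("mash", "Heat the covered mixture in an oven at 65 C (149 F) for 1 hour."),
    ("ferment", "Seal with plastic and hold at room temperature for about a week."),
]

for step, instruction in BREWING_STEPS:
    print(f"{step:>8}: {instruction}")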

Alongside that experiment, the students tried to replicate making beer with a vegetable root called manioc. That type of beer-making, which is indigenous to many cultures in South America where the brew is referred to as “chicha,” involves chewing and spitting manioc, then boiling and fermenting the mixture.

Madeleine Ota, an undergraduate student who took Liu’s course, said she knew nothing about the process of making beer before taking the class and was skeptical that her experiments would work. The mastication part of the experiment was especially foreign to her, she said.

“It was a strange process,” Ota said. “People looked at me weird when they saw the ‘spit beer’ I was making for class. I remember thinking, ‘How could this possibly turn into something alcoholic?’ But it was really rewarding to see that both experiments actually yielded results.”

Ota used red wheat for brewing her ancient Chinese beer. Despite the mold, the mixture had a pleasant fruity smell and a citrus taste, similar to a cider, Ota said. Her manioc beer, however, smelled like funky cheese, and Ota had no desire to check how it tasted.

The results of the students’ experiments are going to be used in further research on ancient alcohol-making that Liu and Wang are working on.

“The beer that students made and analyzed will be incorporated into our final research findings,” Wang said. “In that way, the class gives students an opportunity to not only experience what the daily work of some archaeologists looks like but also contribute to our ongoing research.”

Getting a glimpse of the ancient world

For decades, archaeologists have yearned to understand the origin of agriculture and what actions may have sparked humans to transition from hunting and gathering to settling and farming, a period historians call the Neolithic Revolution.

Studying the evolution of alcohol and food production provides a window into understanding ancient human behavior, said Liu, who has been teaching Archaeology of Food for several years after coming to Stanford in 2010.

But it can be difficult to figure out precisely how the ancient people made alcohol and food from just examining artifacts because organic molecules easily break down with time. That’s why experiential archaeology is so important, Liu said.

by Alex Shashkevich, Stanford News |  Read more:
Image: Stanford News

Considerations on Cost Disease


[ed. This and other analyses of Cost Disease here:]
via: Slate Star Codex

Michael Franti & Spearhead

Air Pollution Masks – Fashion's Next Statement?

The intersection between fashion and practicality is not always the most compelling. But given that air pollution is the world’s largest single environmental health risk, it seems inevitable the two will come to influence one another.

Yesterday saw the launch of M90, an “urban breathing mask” created by the Swedish company Airinum and sold in more than 50 countries. Face masks are already a common sight in Asian countries, although the cheap washable cotton rectangles rarely perform well in tests. Surgical masks, the type usually worn by doctors, have tended to fare better – but are still largely ineffectual.

The market for pricier, more attractive masks has been growing steadily in the past few years. Sales are still modest, but Freka, a British brand, had the monopoly for a while. And rightly so, given that they tapped into the trend for minimal sportswear, almost Céline-like in design, seeking to become more of a background accessory than anything stand-out.

Which sets Airinum apart. While the design is typically Scandinavian, these face masks are neon camo.

They aren’t the first luxe masks to have forayed into fashion. In the last few years, these have regularly appeared on the catwalk at Beijing fashion week, arguably being awarded the same gravitas as an It bag. (...)

Masks covered in the Burberry check (although they are not Burberry products) remain a common sight in Asian cities, in a bid to marry style with sensibility, even if they don’t work well. As to whether they’ll take off, affordability remains an issue. But if the aim is to market them in Europe and the US, where athleisure is king and vanity is key, perhaps this is the answer.

As for the fashion appraisal, trad camo is having a moment, particularly in menswear. But neon camo, nothing short of an eyesore, is uncharted territory. It’s also oxymoronic. But that’s the point: if the aim is to raise awareness of the problem, then it’s unlikely you’ll miss one of these on the street.

by Morwenna Ferrier, The Guardian | Read more:
Image: Airinum

Monday, February 13, 2017

Is Apple Over?

I started personal computing on an Apple II circa 1977. It was a big step up from the Heathkit and Radio Shack DIY projects I tinkered with in grade school. When IBM introduced the IBM-PC circa 1981, I semi-defected, and in 1984 I became bi-computeral (you know why).

My company functioned in a computer multiverse for some time. Macs were for art, music and publishing; PCs were for business; DEC minicomputers were for science, math and engineering. The minicomputers went away by 2000, and then we were just Mac and PC. In 2006, shortly after Macs became Intel inside and Parallels Desktop (a utility that enabled users to run Windows programs on a Mac) debuted, we became a 100% Apple shop, and we never looked back.

For more than a decade, if Apple manufactured it, we purchased it – in bulk. There was no reason to hyper-evaluate the new specifications; we just sent a purchase order to Tekserve (now T2 Computing) for as many of the new Apple devices as we needed (and maybe a few we didn't need). There are so many Apple devices in our offices, someone once said, "It looks like Steve Jobs threw up in here."

That was then.

What malevolent force could entice me to seriously consider a PC? What wickedness could tempt me to contemplate a transitioning back to Windows? What could possibly lure me to the dark side? Only Apple itself has such power.

My iPhone 7 Plus Chronicle

On September 7, 2016, I stood on line for an hour to pick up my brand new iPhone 7 Plus. I had made an appointment to be one of the first to pick one up because I was still a blind-faith follower of the cult of Apple. There was going to be an issue with the headphone jack (well documented in my first treatise of dissent, "Apple iPhone 7: Are You F#$king Kidding Me"). But being one of the faithful means putting aside common sense.

The moment I started to transfer information from iCloud, I was in trouble. Some apps worked, others were greyed out, and certain features were hit or miss.

Two factory resets and four hours later, I called Apple Care. After 30 minutes on hold, I was told that my iPhone must be defective and needed to be replaced.

What?

"OK, I'll just go to the Genius Bar and have it replaced." "No, sorry," said the Apple Care person, "we don't have any extra iPhones at the stores; you'll have to send it back to us." "But because of the 'new phone every year' plan you sold me last time, you took my iPhone 6 Plus back. What will I do for a phone for the five to seven days you're telling me it will take for me to get the replacement?"

(Note: Because I review technology as part of my job, I had plenty of other smartphones, but if this happened to most people, they'd be offline for a week.)

It took two tries for Apple to send me a new phone. The first replacement was lost in shipping, and the second is the one I'm carrying now. I was without an iPhone for about two weeks. To make matters worse, Apple charged my credit card $950 for each phone, so although I had no iPhones, Apple put $2,850 of charges on my credit card, saying it would refund the difference when the missing phone and the bad phone were returned (which it ultimately did).

How could Apple not have replacement phones available for the defective units it would inevitably sell? Here's a better question: Did Apple sell too many defective phones for its supply of replacements?

With the number of iPhones Apple sells, some are bound to be defective – but this was not an isolated incident.

My MacBook Pro Chronicle

I wrote my second treatise of dissent, "Apple MacBook Pro 2016: WTF?," about the all-singing, all-dancing 15-inch MacBook Pro before I received my unit. Here are two videos you may enjoy about unboxing my second MacBook Pro and its battery life. Second? Yes, second. I'm writing this article on my third 15-inch MacBook Pro because the first two were defective.

by Shelly Palmer, Ad Age | Read more:
Image: Apple

Tell Me A Story

‘Data-Driven’ Campaigns Are Killing the Democratic Party

There’s a Southern proverb often attributed to Sam Rayburn: “There’s no education in the second kick of a mule.” One month into the Trump presidency, and it’s still unclear whether the Democratic Party will learn anything from a fourth kick.

For four straight election cycles, Democrats have ignored research from the fields of cognitive linguistics and psychology showing that the most effective way to communicate with other humans is by telling emotional stories. Instead, the Democratic Party's affiliates and allied organizations in Washington have increasingly mandated "data-driven" campaigns rather than ones that are message-driven and data-informed. And over four straight cycles, Democrats have suffered historic losses.

After the 2008 election, Democrats learned all the wrong lessons from President Obama’s victory, ascribing his success to his having better data. He did have better data, and it helped, but I believe he won because he was the better candidate and had a better message, presented through better storytelling.

I’m not a Luddite. I did my graduate work in political science at MIT, and as a longtime Democratic strategist, I appreciate the role that data can play in winning campaigns. But I also know that data isn’t a replacement for a message; it’s a tool to focus and direct one.

We Democrats have allowed microtargeting to become microthinking. Each cycle, we speak to fewer and fewer people and have less and less to say. We all know the results: the loss of 63 seats and control of the House, the loss of 11 seats and control of the Senate, the loss of 13 governorships, the loss of over 900 state legislative seats and control of 27 state legislative chambers.

Yet despite losses on top of losses, we have continued to double down on data-driven campaigns at the expense of narrative framing and emotional storytelling.

Consider the lot of Bill Clinton. It has been widely reported that in 2016, Bill Clinton urged Hillary Clinton’s campaign to message on the economy to white working-class voters as well as to the “Rising American Electorate” (young voters, communities of color and single white women), but couldn’t get anyone to listen to him in Brooklyn. They had an algorithm that answered all questions. Theirs was a data-driven campaign. The campaign considered Bill to be old school—a storyteller, not data driven.

I feel his pain. And unless Democrats start to change things quickly, we’ll be feeling pain in elections yet to come.

Though the problem for Democrats is urgent, the challenge is not new. Before the clamor for a “data-driven” approach, the “best practices” embraced by much of the Democratic Party apparatus encouraged campaigns that were predominantly driven by issue bullet points. In 2000, for example, the Gore presidential campaign had no shortage of position papers, but it would be challenging (at best) to say what the campaign’s message was. In contrast, in Obama’s 2008 campaign, “Hope and Change” was not only a slogan, but a message frame through which all issues were presented.

Years ago, my political mentor taught me the problem with this approach, using a memorable metaphor: issues are to a campaign message what ornaments are to a Christmas tree, he said. Ornaments make the tree more festive, but without the tree, you don’t have a Christmas tree, no matter how many ornaments you have or how beautiful they are. Issues can advance the campaign’s story, but without a narrative frame, your campaign doesn’t have a message, no matter how many issue ads or position papers it puts forward.

Storytelling has been the most effective form of communication throughout the entirety of human history. And that is unlikely to change, given that experts in neurophysiology affirm that the neural pathway for stories is central to the way the human brain functions (“The human mind is a story processor, not a logic processor,” as social psychologist Jonathan Haidt has written).

The scientific evidence of the effectiveness of storytelling is extensive. Consider the 2004 book, Don’t Think of an Elephant, in which Berkeley linguistics professor George Lakoff applied the analytic techniques from his field to politics, explaining that “all of what we know is physically embodied in our brains,” which process language through frames: “mental structures that shape the way we see the world.”

Convincing a voter—challenging an existing frame—is no small task. “When you hear a word, its frame (or collection of frames) is activated in your brain,” writes Lakoff. As a result, “if a strongly held frame doesn’t fit the facts, the facts will be ignored and the frame will be kept.” How then to persuade voters? How can we get them to change the way they see the world? Tell a story.

Further evidence was put forward in 2007’s The Political Brain, by Emory University psychologist Drew Westen. “The political brain is an emotional brain,” Westen wrote, and the choice between electoral campaigns that run on an issue-by-issue debate versus those that embrace storytelling is stark: “You can slog it out for those few millimeters of cerebral turf that process facts, figures and policy statements. Or you can take your campaign to the broader neural electorate collecting delegates throughout the brain and targeting different emotional states with messages designed to maximize their appeal.”

For Democrats, a useful metaphor to frame our storytelling is that while conservatives believe we are each in our own small boat and it is up to each of us to make it on our own, progressive morality holds that we are all on a large boat and unless we maintain that boat properly, we will all sink together. That metaphor could serve as our narrative frame, and addressing issues within this frame—rather than as separate, unrelated bullet points—would allow us to present emotional stories using language that speaks to voters’ values.

by Dave Gold, Politico | Read more:
Image: uncredited

Saturday, February 11, 2017

Carmen Cartiness Johnson, I can see China (2015)
via: