Thursday, February 16, 2017
Every Successful Relationship is Successful for the Same Exact Reasons
Hey, guess what? I got married two weeks ago. And like most people, I asked some of the older and wiser folks around me for a couple quick words of advice from their own marriages to make sure my wife and I didn’t shit the (same) bed. I think most newlyweds do this, especially after a few cocktails from the open bar they just paid way too much money for.
But, of course, not being satisfied with just a few wise words, I had to take it a step further.
See, I have access to hundreds of thousands of smart, amazing people through my site. So why not consult them? Why not ask them for their best relationship/marriage advice? Why not synthesize all of their wisdom and experience into something straightforward and immediately applicable to any relationship, no matter who you are?
Why not crowdsource THE ULTIMATE RELATIONSHIP GUIDE TO END ALL RELATIONSHIP GUIDES™ from the sea of smart and savvy partners and lovers here?
So, that’s what I did. I sent out the call the week before my wedding: anyone who has been married for 10+ years and is still happy in their relationship, what lessons would you pass down to others if you could? What is working for you and your partner? And if you’re divorced, what didn’t work previously?
The response was overwhelming. Almost 1,500 people replied, many of whom sent in responses measured in pages, not paragraphs. It took almost two weeks to comb through them all, but I did. And what I found stunned me…
They were incredibly repetitive.
That’s not an insult or anything. Actually, it’s kind of the opposite. These were all smart and well-spoken people from all walks of life, from all around the world, all with their own histories, tragedies, mistakes, and triumphs…
And yet they were all saying pretty much the same dozen things.
Which means that those dozen or so things must be pretty damn important… and more importantly, they work.
Here’s what they are:
by Mark Manson, Quartz | Read more:
Image: Reuters/Lucy Nicholson

Why Craigslist is Unbeatable
Reham Fagiri’s eureka moment was the result of a deal gone wrong.
It was spring 2012, and the recent Wharton graduate was trying to sell her television on Craigslist. A prospective buyer—an older, gray-haired gentleman—came to her Philadelphia apartment to take a look. When he realized Fagiri had accidentally listed the wrong television model, he was irate.
“He got really upset about that: ‘You made me drive all the way here, blah, blah, blah,’” she recalls. Fagiri asked $200 for the TV. He offered $50. When she balked at the deal, the man announced that he was simply going to take the television. He started carrying it away.
“I’m like ‘Well, I’d rather save my life than have to argue about $150,’” she tells me. “So I was like ‘I don’t even want your money. Just take the TV.’”
This is the basic flaw of Craigslist. The site facilitates peer-to-peer interactions, but does little to ensure that those transactions go off seamlessly. After her harrowing encounter, Fagiri began trading Craigslist stories with friends and classmates, many of whom were similarly frustrated with the site. “That was kind of the second step, like, ‘OK, well, clearly it’s beyond me, and it’s my classmates too,’” she remembers.
Months later, that Craigslist experience still on her mind, Fagiri started outlining a business idea: an online used-furniture marketplace dedicated to the proposition that sometimes consumers want a middleman around to shield them from irrational strangers. She called the site AptDeco, and, like Craigslist, it would allow users to list and view ads for used furniture. Unlike Craigslist, it would also process payments, coordinate pickup and delivery, and serve as a buffer between buyer and seller.
“I’m an engineer, so I started playing around with the idea in my free time,” says Fagiri. “And then I built a small site.” On launch day she sold a West Elm headboard. That’s when Fagiri knew she was on to something. “‘Oh, OK!’” she recalls thinking. “‘I guess this is a real business!’”
More than three years after it ushered that headboard to new ownership, AptDeco is thriving and pedigreed — it was part of Y Combinator’s Winter 2014 class. According to Fagiri, the site is also profitable. (“Obviously there’s fluctuations. Some months are better. But overall we’re at the break-even profitability mark.”) For its services, AptDeco takes a 23 percent cut of the sale price and charges a flat delivery fee of either $35, $95, or $145, depending on the size of the item purchased; the site also lets you hire people to remove unwanted furniture or assemble new purchases.
AptDeco’s functional business model earns it a place of honor amongst the many startups that are vying to disrupt the “moving used crap around” space. There is Chairish, founded in 2013, which focuses on designer furniture and has raised almost $9 million in venture funding, according to Crunchbase; Viyet, founded in 2012, which specializes in high-end consignment; Trove, also known as Trove Market, a mobile-focused used-furniture service that, according to Crunchbase, has raised almost $1 million in seed funding; others include Krrb, MarketSquare, and 1stDibs.
All of these startups are jostling to dethrone the unlikeliest market leader in the history of online retail: Craigslist. The site commands vast loyalty despite doing very little to actively court its users. At times, it seems to dominate through sheer inertia. And yet Craigslist abides, and thrives, as its would-be competitors struggle to establish themselves. Which raises the question: Why is it so hard to compete with a site that is only begrudgingly a business?
by Justin Peters, Backchannel | Read more:
Image: Li-Anne Dias

Wednesday, February 15, 2017
Politics 101
Hard to keep up, isn't it?
Russia Deploys Missile, Violating Treaty and Challenging Trump
‘Unbelievable Turmoil’
Beneath the Deepening Chaos, the Trump Business is Doing Just Fine
Pictures of 'Swooning' Ivanka Trump and Justin Trudeau Go Viral
After Flynn, Will Republicans Finally Stand Up to Trump?
Have TV Media Had Their Fill of Kellyanne?
Trump Is Violating the Constitution
The Nine Most Insane Moments From Donald Trump’s Reality-Challenged Press Conference
The Spectacle of Trump Acting Presidential
‘The Kids Think I’m a Shoe’
Stan Smith the man & Stan Smith the sneaker.
The island of Hilton Head in South Carolina is shaped like a sneaker, and Stan Smith lives on the laces, right off the river. Inside his house, the six-foot-four retired tennis player with the straightest back I’ve ever seen walks out of the second of his two closets and into the living room carrying five pairs of Stan Smiths, the sneaker, but he still can’t find the one he’s looking for. He has 40 pairs in 30 different styles, more or less.
The sneaker’s fame — and its longevity — takes even its namesake by surprise. You see, the Stan Smith is really the most basic of all possible sneakers. Its narrow white leather body is cushioned at the front with an almost-orthopedic round toe. Its three understated Adidas stripes are nearly missable perforations, as if they don’t care to be recognized, and it has just two spots of color, most classically in green: a tab on the back of the ankle and Smith’s face printed on the tongue. They are essentially anonymous, the saltine cracker of tennis shoes. They were endorsed by Stan Smith just after he won his first Grand Slam singles title in the summer of 1971 and just before he won his second, and last, the next year. He was, in other words, no Serena Williams, not even a Rod Laver.
Nothing about Smith or the simple design of the sneaker itself — neither has changed much since 1971 — explains how Adidas was able to sell 7 million pairs by 1985. Or how that number had grown to 22 million pairs by 1988. Or why Footwear News named it the first-ever Shoe of the Year in 2014. Or how it surpassed 50 million shoes sold as of 2016. Or how the sneaker grew far beyond its start as a technical athletic shoe and became a fashion brand, its basic blank slate evolving and taking on new meaning and purpose. (...)
With his Adidas contract, Smith became one of the first American tennis players to receive an endorsement deal. It was the very beginning of the modern brand-athlete pairings that would, a little over a decade later, lead to Michael Jordan’s very own Air Jordan line, and three decades after that, to LeBron James’s reported $1 billion lifetime endorsement deal with Nike. But when Smith was playing, none of that existed yet. If you made it to the Roland Garros main draw, you would get “six shirts, a vest sweater, a regular sweater, socks, and that’s about it,” he says, counting the items off on his fingers. “You wanted to get in the main draw, so you could get the full set of clothes.”
His agent, Donald Dell, negotiated the picture of Smith’s face on the tongue, a savvy move that made the man inseparable from the sneaker, but the name of Robert Haillet, the French player who had originally endorsed the shoe, remained on it until 1978, when Smith took over for good. It was by then the premier tennis sneaker. Smith remembers being beaten by opponents wearing his face on their feet. “I didn’t think it was appropriate,” he says. There was an Argentine player named Ricardo Cano, Smith recalls, who was signed to another brand but wore Stan Smiths anyway and drew the other company’s logo on the side of the shoes. The Stans were just that much better.
Smith retired from tennis in 1985. How the sneakers, 43 years after their creation, became suddenly ubiquitous is a case study in how “cool” is created and disseminated from image-makers to mainstream consciousness. In the mid-’90s, while Nike consumed the American sneaker market, a small circle of offbeat celebrities and influential marketing professionals latched onto the shoe as a sort of anti-fashion fashion statement, part of a Waspy, but not too Waspy, vintage style they helped pioneer: tucked-in Brooks Brothers shirts with ill-fitting corduroys or khakis. It helped if you drove a vintage Mercedes.
Stan Smiths fit perfectly with this aesthetic. Here was a shoe that you could buy new, but it looked the same as it had in 1971. The skateboarder Rick Howard wore Stan Smiths in a 1993 skate video sponsored by Girl Skateboards, a company co-founded by Spike Jonze. Mike Mills, who recently directed 20th Century Women, but back then designed album covers for the Beastie Boys and Sonic Youth, was more into Rod Lavers, another Adidas tennis sneaker from the ’70s, but his friend Roman Coppola, who founded the ad agency the Directors Bureau and who later wrote Moonrise Kingdom with Wes Anderson, preferred Stan Smiths. “I’ve owned a few pairs over the years, but don’t remember any specific movement or discussion around it,” he says. His sister Sofia Coppola wore them, too. By the early aughts, branding experts such as Andy Spade, who had launched and popularized his wife Kate Spade’s company, were starting to reinterpret the retro-nostalgia look for the likes of J.Crew, Warby Parker, and Shinola, to great financial success.
Then came Phoebe Philo, the creative director of Céline. In March 2011, Philo took her bow on the Céline runway at the end of the fall-winter ready-to-wear show in Stan Smiths along with low-slung black trousers and a gray turtleneck, hair tucked in. The timing could not have been better. Philo was at the peak of her influence and power. Every editor and professional fashion woman from New York to London to Paris was shopping at Céline between the shows. Kanye West had just name-dropped her in his comeback album, My Beautiful Dark Twisted Fantasy, and was so completely bewitched by her ideas that he performed wearing women’s Céline. It was right after the label came out with the luggage bag but just before it became the only bag that seemed to matter. At this height, Phoebe Philo on the runway wearing Stan Smiths was like a gift. Here was something Philo did that everyone could copy for only $75; you could even buy a pair on Amazon. The shoes took on a new meaning. J.Crew started carrying them. The Stan Smith became fashion’s most important sneaker.
At the time, though, Adidas saw things a bit differently. While the sneaker was becoming popular in the fashion world, it was still sold almost exclusively in sporting-goods stores and often at a discount. “We weren’t really happy with how it was seen and where it was found,” says Torben Schumacher, Adidas’ vice-president of product. Adidas wanted to recalibrate how the shoe was presented.
To do that, Schumacher and Adidas decided to take the sneaker entirely off the market. “The idea of not having the model wasn’t really something that went down well,” says Schumacher, especially since it was just starting to get recognized by this new trendsetting crowd. (Smith’s first thought: “That’s interesting. I don’t really like that too much.”) Still, Schumacher and his team at Adidas spent a year and a half convincing the rest of the company of the merits of the plan. Adidas couldn’t truly reintroduce it to a new higher-end clientele, Schumacher argued, if it was still readily available in the bargain bin. “We wanted it to get the respect it deserved and the conversation about it that it deserved and for it to be seen as a commodity item,” he says. “We thought it needed something bold and drastic to prepare everyone for the story again.” By the time Adidas stopped selling the Stan Smith to places like Foot Locker, the company already had a plan of how and with whom it was going to bring it back. In 2012, the sneakers disappeared.
They began reemerging, subtly but purposefully, the next year — notably in the November 2013 issue of Vogue Paris, for which Gisele Bündchen posed naked, apart from white socks and Stan Smiths (“One of our sons saw that, we had no idea,” says Margie, Smith’s wife. “It was funny”). On January 15, 2014, they went back on sale in higher-end, fashion-focused stores like Barneys New York and the Parisian boutique Colette, still for under $100. They were instantly devoured. Later that year, Philo formally announced the Stan Smith’s return, once again wearing them while taking her runway bow, this time with wide-leg pants and a camel sweater.
The trickle-down was immediate. In 2015, Adidas sold 8 million pairs of Stan Smiths. Adidas won’t confirm how many it sold in 2016, but some industry experts throw around numbers like 15 million — more than double what it moved in the shoes’ first decade of existence — the same side part and crooked smile leading them wherever they go.
Smith is the first to recognize Philo’s importance. He brings her up on two different occasions over the course of our time together. “She was one of the first to start wearing the shoe,” he remembers. “And then Pharrell Williams,” who basically bowed to Smith when they met at the U.S. Open this summer and now regularly designs his own versions of the sneaker, as does Raf Simons. “Those cost like $400 or something, and it’s the same shoe! It’s really weird, actually.”
For Smith, the sneakers are far more successful, monetarily, than he ever was in his tennis career, during which he made “$1.7 million, or something like that. I read it once,” he says. “The shoe has certainly been more than that.” In the beginning he collected an annual sum for his endorsement. These days, though, he’s paid in royalties.
Smith’s contract with Adidas expires about every five years (he’ll sign next in 2018). So why does Adidas keep Stan Smith around? Why does it need him when it has Phoebe and Gisele and Pharrell and Raf and Kanye? Turns out this 70-year-old former tennis player, who was really more of a doubles star, who has eyebrows like the flailing blowup guy at car dealerships, is the only thing that makes its shoe the original. Which is especially valuable when everybody else in the business is trying to knock off its success.
by Lauren Schwartzberg, The Cut | Read more:
Image: João Canziani

Tuesday, February 14, 2017
E Unibus Pluram: Television and U.S. Fiction
Act Natural
Fiction writers as a species tend to be oglers. They tend to lurk and to stare. The minute fiction writers stop moving, they start lurking, and stare. They are born watchers. They are viewers. They are the ones on the subway about whose nonchalant stare there is something creepy, somehow. Almost predatory. This is because human situations are writers' food. Fiction writers watch other humans sort of the way gapers slow down for car wrecks: they covet a vision of themselves as witnesses.
But fiction writers as a species also tend to be terribly self-conscious. Even by U.S. standards. Devoting lots of productive time to studying closely how people come across to them, fiction writers also spend lots of less productive time wondering nervously how they come across to other people. How they appear, how they seem, whether their shirttail might be hanging out their fly, whether there's maybe lipstick on their teeth, whether the people they're ogling can maybe size them up as somehow creepy, lurkers and starers.
The result is that a surprising majority of fiction writers, born watchers, tend to dislike being objects of people's attention. Being watched. The exceptions to this rule - Mailer, McInerney, Janowitz - create the misleading impression that lots of belles-lettres types like people's attention. Most don't. The few who like attention just naturally get more attention. The rest of us get less, and ogle.
Most of the fiction writers I know are Americans under forty. I don't know whether fiction writers under forty watch more television than other American species. Statisticians report that television is watched over six hours a day in the average American household. I don't know any fiction writers who live in average American households. I suspect Louise Erdrich might. Actually I have never seen an average American household. Except on TV.
So right away you can see a couple of things that look potentially great, for U.S. fiction writers, about U.S. television. First, television does a lot of our predatory human research for us. American human beings are a slippery and protean bunch, in real life, as hard to get any kind of univocal handle on as a literary territory that's gone from Darwinianly naturalistic to cybernetically post-postmodern in eighty years. But television comes equipped with just such a syncretic handle. If we want to know what American normality is - what Americans want to regard as normal - we can trust television. For television's whole raison is reflecting what people want to see. It's a mirror. Not the Stendhalian mirror reflecting the blue sky and mud puddle. More like the overlit bathroom mirror before which the teenager monitors his biceps and determines his better profile. This kind of window on nervous American self-perception is just invaluable, fictionwise. And writers can have faith in television. There is a lot of money at stake, after all; and television retains the best demographers applied social science has to offer, and these researchers can determine precisely what Americans in 1990 are, want, see: what we as Audience want to see ourselves as. Television, from the surface on down, is about desire. Fictionally speaking, desire is the sugar in human food.
The second great thing is that television looks to be an absolute godsend for a human subspecies that loves to watch people but hates to be watched itself. For the television screen affords access only one way. A psychic ball-check valve. We can see Them; They can't see Us. We can relax, unobserved, as we ogle. I happen to believe this is why television also appeals so much to lonely people. To voluntary shut-ins. Every lonely human I know watches way more than the average U.S. six hours a day. The lonely, like the fictional, love one-way watching. For lonely people are usually lonely not because of hideous deformity or odor or obnoxiousness - in fact there exist today social and support groups for persons with precisely these features. Lonely people tend rather to be lonely because they decline to bear the emotional costs associated with being around other humans. They are allergic to people. People affect them too strongly. Let's call the average U.S. lonely person Joe Briefcase. Joe Briefcase just loathes the strain of the self-consciousness which so oddly seems to appear only when other real human beings are around, staring, their human sense-antennae abristle. Joe B. fears how he might appear to watchers. He sits out the stressful U.S. game of appearance poker.
But lonely people, home, alone, still crave sights and scenes. Hence television. Joe can stare at Them, on the screen; They remain blind to Joe. It's almost like voyeurism. I happen to know lonely people who regard television as a veritable deus ex machina for voyeurs. And a lot of the criticism, the really rabid criticism less leveled than sprayed at networks, advertisers, and audiences alike, has to do with the charge that television has turned us into a nation of sweaty, slack-jawed voyeurs. This charge turns out to be untrue, but for weird reasons.
What classic voyeurism is is espial: watching people who don't know you're there as they go about the mundane but erotically charged little businesses of private life. It's interesting that so much classic voyeurism involves media of framed glass: windows, telescopes, etc. Maybe the framed glass is why the analogy to television is so tempting. But TV-watching is a different animal from Peeping Tourism. Because the people we're watching through TV's framed-glass screen are not really ignorant of the fact that somebody is watching them. In fact a whole lot of somebodies. In fact the people on television know that it is in virtue of this truly huge crowd of ogling somebodies that they are on the screen, engaging in broad non-mundane gestures, at all. Television does not afford true espial because television is performance, spectacle, which by definition requires watchers. We're not voyeurs here at all. We're just viewers. We are the Audience, megametrically many, though most often we watch alone. E unibus pluram. (...)
Not that realities about actors and phosphenes and furniture are unknown to us. We simply choose to ignore them. For six hours a day. They are part of the belief we suspend. But we're asked to hoist such a heavy load aloft. Illusions of voyeurism and privileged access require real complicity from viewers. How can we be made so willingly to acquiesce for hours daily to the illusion that the people on the TV don't know they're being looked at, to the fantasy that we're transcending privacy and feeding on unself-conscious human activity? There might be lots of reasons why these unrealities are so swallowable, but a big one is that the performers behind the two layers of glass are - varying degrees of Thespian talent aside - absolute geniuses at seeming unwatched. Now, seeming unwatched in front of a TV camera is a genuine art. Take a look at how civilians act when a TV camera is pointed at them: they simply spaz out, or else go all rigor mortis. Even PR people and politicians are, camera-wise, civilians. And we love to laugh at how stiff and false non-professionals appear, on television. How unnatural. But if you've ever once been the object of that terrible blank round glass stare, you know all too well how self-conscious it makes you. A harried guy with earphones and a clipboard tells you to "act natural" as your face begins to leap around on your skull, struggling for a seemingly unwatched expression that feels impossible because "seeming unwatched" is, like the "act natural" which fathered it, oxymoronic. Try driving a golf ball as someone asks you whether you in- or exhale on your backswing, or getting promised lavish rewards if you can avoid thinking of a rhinoceros for ten seconds, and you'll get some idea of the truly heroic contortions of body and mind that must be required for Don Johnson to act unwatched as he's watched by a lens that's an overwhelming emblem of what Emerson, years before TV, called "the gaze of millions."
Only a certain very rare species of person, for Emerson, is "fit to stand the gaze of millions." It is not your normal, hard-working, quietly desperate species of American. The man who can stand the megagaze is a walking imago, a certain type of transcendent freak who, for Emerson, "carries the holiday in his eye."(2) The Emersonian holiday television actors' eyes carry is the potent illusion of a vacation from self-consciousness. Not worrying about how you come across. A total unallergy to gazes. It is contemporarily heroic. It is frightening and strong. It is also, of course, an act, a counterfeit impression - for you have to be just abnormally self-conscious and self-controlling to appear unwatched before lenses. The self-conscious appearance of unself-consciousness is the grand illusion behind TV's mirror-hall of illusions; and for us, the Audience, it is both medicine and poison.
For we gaze at these rare, highly trained, seemingly unwatched people for six hours daily. And we love these people. In terms of attributing to them true supernatural assets and desiring to emulate them, we sort of worship them. In a real Joe Briefcase-type world that shifts ever more starkly from some community of relationships to networks of strangers connected by self-interest and contest and image, the people we espy on TV offer us familiarity, community. Intimate friendship. But we split what we see. The characters are our "close friends"; but the performers are beyond strangers, they're images, demigods, and they move in a different sphere, hang out with and marry only each other, seem even as actors accessible to Audience only via the mediation of tabloids, talk show, EM signal. And yet both actors and characters, so terribly removed and filtered, seem so natural, when we watch.
Given how much we watch and what watching means, it's inevitable - but toxic - for those of us fictionists or Joe Briefcases who wish to be voyeurs to get the idea that these persons behind the glass, persons who are often the most colorful, attractive, animated, alive people in our daily experience, are also people who are oblivious to the fact that they are watched. It's toxic for allergic people because it sets up an alienating cycle, and also for writers because it replaces fiction research with a weird kind of fiction consumption. We self-conscious Americans' oversensitivity to real humans fixes us before the television and its ball-check valve in an attitude of rapt, relaxed reception. We watch various actors play various characters, etc. For 360 minutes per diem, we receive unconscious reinforcement of the deep thesis that the most significant feature of truly alive persons is watchableness, and that genuine human worth is not just identical with but rooted in the phenomenon of watching. And that the single biggest part of real watchableness is seeming to be unaware that there's any watching going on. Acting natural.
by David Foster Wallace, The Free Library | Read more:
Image: Naldz Graphics, TV MAN
Stanford Students Recreate 5,000-year-old Chinese Beer Recipe
On a recent afternoon, a small group of students gathered around a large table in one of the rooms at the Stanford Archaeology Center.
For a hands-on view into the ancient world, students brewed beer from a 5,000-year-old recipe as part of an archaeology course with Professor Li Liu.
A collection of plastic-covered glass beakers and water bottles filled with yellow, foamy liquid stood in front of them on the table, at the end of which sat Li Liu, a professor in Chinese archaeology at Stanford.
White mold-like layers floated on top of the liquids. As the students removed the plastic covers, they crinkled their noses at the smell and sour taste of the odd-looking concoctions, which were the results of their final project for Liu’s course Archaeology of Food: Production, Consumption and Ritual.
The mixtures were homemade beer students made using ancient brewing techniques of early human civilizations. One of the experiments imitated a 5,000-year-old beer recipe Liu and her team revealed as part of published research last spring.
“Archaeology is not just about reading books and analyzing artifacts,” said Liu, the Sir Robert Ho Tung Professor in Chinese Archaeology. “Trying to imitate ancient behavior and make things with the ancient method helps students really put themselves into the past and understand why people did what they did.”
The ancient recipe
Liu, together with doctoral candidate Jiajing Wang and a group of other experts, discovered the 5,000-year-old beer recipe by studying the residue on the inner walls of pottery vessels found in an excavated site in northeast China. The research, which was published in Proceedings of the National Academy of Sciences, provided the earliest evidence of beer production in China so far.
The ancient Chinese made beer mainly with cereal grains, including millet and barley, as well as with Job’s tears, a type of grass in Asia, according to the research. Traces of yam and lily root parts also appeared in the concoction.
Liu said she was particularly surprised to find barley – which is used to make beer today – in the recipe because the earliest evidence of barley seeds in China dates to 4,000 years ago. The finding hints at why barley, which was first domesticated in western Asia, spread to China.
“Our results suggest the purpose of barley’s introduction in China could have been related to making alcohol rather than as a staple food,” Liu said.
The ancient Chinese beer looked more like porridge and likely tasted sweeter and fruitier than the clear, bitter beers of today. The ingredients used for fermentation were not filtered out, and straws were commonly used for drinking, Liu said.
Recreating the recipe
At the end of Liu’s class, each student tried to imitate the ancient Chinese beer using either wheat, millet or barley seeds.
The students first covered their grain with water and let it sprout, in a process called malting. After the grain sprouted, the students crushed the seeds and put them in water again. The container with the mixture was then placed in the oven and heated to 65 degrees Celsius (149 F) for an hour, in a process called mashing. Afterward, the students sealed the container with plastic and let it stand at room temperature for about a week to ferment.
Alongside that experiment, the students tried to replicate making beer with a vegetable root called manioc. That type of beer-making, which is indigenous to many cultures in South America where the brew is referred to as “chicha,” involves chewing and spitting manioc, then boiling and fermenting the mixture.
Madeleine Ota, an undergraduate student who took Liu’s course, said she knew nothing about the process of making beer before taking the class and was skeptical that her experiments would work. The mastication part of the experiment was especially foreign to her, she said.
“It was a strange process,” Ota said. “People looked at me weird when they saw the ‘spit beer’ I was making for class. I remember thinking, ‘How could this possibly turn into something alcoholic?’ But it was really rewarding to see that both experiments actually yielded results.”
Ota used red wheat for brewing her ancient Chinese beer. Despite the mold, the mixture had a pleasant fruity smell and a citrus taste, similar to a cider, Ota said. Her manioc beer, however, smelled like funky cheese, and Ota had no desire to check how it tasted.
The results of the students’ experiments are going to be used in further research on ancient alcohol-making that Liu and Wang are working on.
“The beer that students made and analyzed will be incorporated into our final research findings,” Wang said. “In that way, the class gives students an opportunity to not only experience what the daily work of some archaeologists looks like but also contribute to our ongoing research.”
Getting a glimpse of the ancient world
For decades, archaeologists have yearned to understand the origin of agriculture and what actions may have sparked humans to transition from hunting and gathering to settling and farming, a period historians call the Neolithic Revolution.
Studying the evolution of alcohol and food production provides a window into understanding ancient human behavior, said Liu, who has been teaching Archaeology of Food for several years after coming to Stanford in 2010.
But it can be difficult to figure out precisely how the ancient people made alcohol and food from just examining artifacts because organic molecules easily break down with time. That’s why experiential archaeology is so important, Liu said.
by Alex Shashkevich, Stanford News | Read more:
Image: Stanford News
Air Pollution Masks – Fashion's Next Statement?
The intersection between fashion and practicality is not always the most compelling. But given that air pollution is the world’s largest single environmental health risk, it seems inevitable the two will come to influence each other.
Yesterday saw the launch of M90, an “urban breathing mask” created by the Swedish company Airinum and sold in more than 50 countries. Face masks are already a common sight in Asian countries, although the cheap washable cotton rectangles rarely perform well in tests. Surgical masks, the type usually worn by doctors, have tended to fare better – but are still largely ineffectual.
The market for pricier, more attractive masks has been growing steadily in the past few years. Sales are not notable but Freka, a British brand, had the monopoly for a while. And rightly so, given that they tapped into the trend for minimal sportswear, almost Céline-like in design, seeking to become more of a background accessory than anything stand-out.
Which sets Airinum apart. While the design is typically Scandinavian, these face masks are neon camo.
They aren’t the first luxe masks to have forayed into fashion. In the last few years, these have regularly appeared on the catwalk at Beijing fashion week, arguably being awarded the same gravitas as an It bag. (...)
Masks covered in the Burberry check (although they are not Burberry products) remain a common sight in Asian cities, in a bid to marry style with sensibility, even if they don’t work well. As to whether they’ll take off, affordability remains an issue. But if the aim is to market them in Europe and the US, where athleisure is king and vanity is key, perhaps this is the answer.
As for the fashion appraisal, trad camo is having a moment, particularly in menswear. But neon camo, nothing short of an eyesore, is uncharted territory. It’s also oxymoronic. But that’s the point: if the aim is to raise awareness of the problem, it’s unlikely you’ll miss one of these on the street.
by Morwenna Ferrier, The Guardian | Read more:
Image: Airinum
Monday, February 13, 2017
Is Apple Over?
I started personal computing on an Apple II circa 1977. It was a big step up from the Heathkit and Radio Shack DIY projects I tinkered with in grade school. When IBM introduced the IBM-PC circa 1981, I semi-defected, and in 1984 I became bi-computeral (you know why).
My company functioned in a computer multiverse for some time. Macs were for art, music and publishing; PCs were for business; DEC minicomputers were for science, math and engineering. The minicomputers went away by 2000, and then we were just Mac and PC. In 2006, shortly after Macs became Intel inside and Parallels Desktop (a utility that enabled users to run Windows programs on a Mac) debuted, we became a 100% Apple shop, and we never looked back.
For more than a decade, if Apple manufactured it, we purchased it – in bulk. There was no reason to hyper-evaluate the new specifications; we just sent a purchase order to Tekserve (now T2 Computing) for as many of the new Apple devices as we needed (and maybe a few we didn't need). There are so many Apple devices in our offices, someone once said, "It looks like Steve Jobs threw up in here."
That was then.
What malevolent force could entice me to seriously consider a PC? What wickedness could tempt me to contemplate transitioning back to Windows? What could possibly lure me to the dark side? Only Apple itself has such power.
My iPhone 7 Plus Chronicle
On September 7, 2016, I stood on line for an hour to pick up my brand new iPhone 7 Plus. I had made an appointment to be one of the first to pick one up because I was still a blind-faith follower of the cult of Apple. There was going to be an issue with the headphone jack (well documented in my first treatise of dissent, "Apple iPhone 7: Are You F#$king Kidding Me"). But being one of the faithful means putting aside common sense.
The moment I started to transfer information from iCloud, I was in trouble. Some apps worked, others were greyed out, and certain features were hit or miss.
Two factory resets and four hours later, I called Apple Care. After 30 minutes on hold, I was told that my iPhone must be defective and needed to be replaced.
What? "OK, I'll just go to the Genius Bar and have it replaced." "No, sorry," said the Apple Care person, "we don't have any extra iPhones at the stores; you'll have to send it back to us." "But because of the 'new phone every year' plan you sold me last time, you took my iPhone 6 Plus back. What will I do for a phone for the five to seven days you're telling me it will take for me to get the replacement?"
(Note: Because I review technology as part of my job, I had plenty of other smartphones, but if this happened to most people, they'd be offline for a week.)
It took two tries for Apple to send me a new phone. The first replacement was lost in shipping, and the second is the one I'm carrying now. I was without an iPhone for about two weeks. To make matters worse, Apple charged my credit card $950 for each phone, so although I had no iPhones, Apple put $2,850 of charges on my credit card, saying it would refund the difference when the missing phone and the bad phone were returned (which it ultimately did).
How could Apple not have replacement phones available for the inevitable number of defective phones it might sell? Here's a better question: Did Apple sell too many defective phones for its supply of replacements?
With the number of iPhones Apple sells, some are bound to be defective – but this was not an isolated incident.
My MacBook Pro Chronicle
I wrote my second treatise of dissent, "Apple MacBook Pro 2016: WTF?," about the all-singing, all-dancing 15-inch MacBook Pro before I received my unit. Here are two videos you may enjoy about unboxing my second MacBook Pro and its battery life. Second? Yes, second. I'm writing this article on my third 15-inch MacBook Pro because the first two were defective.
by Shelly Palmer, Ad Age | Read more:
Image: Apple
Tell Me A Story
‘Data-Driven’ Campaigns Are Killing the Democratic Party
There’s a Southern proverb often attributed to Sam Rayburn: “There’s no education in the second kick of a mule.” One month into the Trump presidency, and it’s still unclear whether the Democratic Party will learn anything from a fourth kick.
For four straight election cycles, Democrats have ignored research from cognitive linguistics and psychology showing that the most effective way to communicate with other humans is by telling emotional stories. Instead, the Democratic Party’s affiliates and allied organizations in Washington have increasingly mandated “data-driven” campaigns rather than ones that are message-driven and data-informed. And over those four cycles, Democrats have suffered historic losses.
After the 2008 election, Democrats learned all the wrong lessons from President Obama’s victory, ascribing his success to his having better data. He did have better data, and it helped, but I believe he won because he was the better candidate and had a better message, presented through better storytelling.
I’m not a Luddite. I did my graduate work in political science at MIT, and as a longtime Democratic strategist, I appreciate the role that data can play in winning campaigns. But I also know that data isn’t a replacement for a message; it’s a tool to focus and direct one.
We Democrats have allowed microtargeting to become microthinking. Each cycle, we speak to fewer and fewer people and have less and less to say. We all know the results: the loss of 63 seats and control of the House, the loss of 11 seats and control of the Senate, the loss of 13 governorships, the loss of over 900 state legislative seats and control of 27 state legislative chambers.
Yet despite losses on top of losses, we have continued to double down on data-driven campaigns at the expense of narrative framing and emotional storytelling.
Consider the lot of Bill Clinton. It has been widely reported that in 2016, Bill Clinton urged Hillary Clinton’s campaign to message on the economy to white working-class voters as well as to the “Rising American Electorate” (young voters, communities of color and single white women), but couldn’t get anyone to listen to him in Brooklyn. They had an algorithm that answered all questions. Theirs was a data-driven campaign. The campaign considered Bill to be old school—a storyteller, not data driven.
I feel his pain. And unless Democrats start to change things quickly, we’ll be feeling pain in elections yet to come.
Though the problem for Democrats is urgent, the challenge is not new. Before the clamor for a “data-driven” approach, the “best practices” embraced by much of the Democratic Party apparatus encouraged campaigns that were predominantly driven by issue bullet points. In 2000, for example, the Gore presidential campaign had no shortage of position papers, but it would be challenging (at best) to say what the campaign’s message was. In contrast, in Obama’s 2008 campaign, “Hope and Change” was not only a slogan, but a message frame through which all issues were presented.
Years ago, my political mentor taught me the problem with this approach, using a memorable metaphor: issues are to a campaign message what ornaments are to a Christmas tree, he said. Ornaments make the tree more festive, but without the tree, you don’t have a Christmas tree, no matter how many ornaments you have or how beautiful they are. Issues can advance the campaign’s story, but without a narrative frame, your campaign doesn’t have a message, no matter how many issue ads or position papers it puts forward.
Storytelling has been the most effective form of communication throughout the entirety of human history. And that is unlikely to change, given that experts in neurophysiology affirm that the neural pathway for stories is central to the way the human brain functions (“The human mind is a story processor, not a logic processor,” as social psychologist Jonathan Haidt has written).
The scientific evidence of the effectiveness of storytelling is extensive. Consider the 2004 book, Don’t Think of an Elephant, in which Berkeley linguistics professor George Lakoff applied the analytic techniques from his field to politics, explaining that “all of what we know is physically embodied in our brains,” which process language through frames: “mental structures that shape the way we see the world.”
Convincing a voter—challenging an existing frame—is no small task. “When you hear a word, its frame (or collection of frames) is activated in your brain,” writes Lakoff. As a result, “if a strongly held frame doesn’t fit the facts, the facts will be ignored and the frame will be kept.” How then to persuade voters? How can we get them to change the way they see the world? Tell a story.
Further evidence was put forward in 2007’s The Political Brain, by Emory University psychologist Drew Westen. “The political brain is an emotional brain,” Westen wrote, and the choice between electoral campaigns that run on an issue-by-issue debate versus those that embrace storytelling is stark: “You can slog it out for those few millimeters of cerebral turf that process facts, figures and policy statements. Or you can take your campaign to the broader neural electorate collecting delegates throughout the brain and targeting different emotional states with messages designed to maximize their appeal.”
For Democrats, a useful metaphor to frame our storytelling is that while conservatives believe we are each in our own small boat and it is up to each of us to make it on our own, progressive morality holds that we are all on a large boat and unless we maintain that boat properly, we will all sink together. That metaphor could serve as our narrative frame, and addressing issues within this frame—rather than as separate, unrelated bullet points—would allow us to present emotional stories using language that speaks to voters’ values.
by Dave Gold, Politico | Read more:
Image: uncredited
Saturday, February 11, 2017
A Resort for the Apocalypse
Rising S Bunkers, one of several companies that specialize in high-end shelters – its Presidential model includes a gym, a workshop, a rec room, a greenhouse, and a car depot – says sales of its $500,000-plus units increased 700 percent last year. (This compares with a more modest 150 percent increase across other Rising S units.) Bunker companies won’t disclose customers’ names, but Gary Lynch, Rising S’s CEO, told me his clients include Hollywood actors and “highly recognizable sports stars.” Other luxury shelters are marketed to businesspeople, from bankers to Bill Gates, who is rumored to have bunkers beneath his houses in Washington State and California.
Whereas Cold War shelters, by design, were near the home and easy to get to, a handful of bunker companies are building entire survival communities in remote locations. Some of them share literal foundations with Cold War buildings: One project, Vivos XPoint, involves refurbishing 575 munitions-storage bunkers in South Dakota; Vivos Europa One, in Germany, is a Soviet armory turned luxury community with a subterranean swimming pool.
By contrast, Trident Lakes, a 700-acre, $330 million development in Ector, Texas, an hour and a half north of Dallas, is being built from scratch. Marketed as a “5-star playground, equipped with defcon 1 preparedness,” it is the project of a group of investors who incorporated as Vintuary Holdings. According to James O’Connor, the CEO, Trident Lakes “is designed for enjoyment like any other resort.” (This pitch is rather different from its Cold War–era counterparts: A 1963 bunker advertisement from the Kelsey-Hayes company shows a family tucked under its home, with just rocking chairs for comfort.)
In some regards, the plans for Trident Lakes do resemble those for a resort. Amenities will include a hotel, an athletic center, a golf course, and polo fields. The community is slated to have 600 condominiums, ranging in price from $500,000 to $1.5 million, each with a waterfront view (to which end, three lakes and 10 beaches will be carved out of farmland). Other features are more unusual: 90 percent of each unit will be underground, armed security personnel will guard a wall surrounding the community, and there will be helipads for coming and going.
by Ben Rowan, The Atlantic | Read more:
Image: Chris Philpot
Labels: Architecture, Business, Culture, Design, Security
The Glorious Exit of Jeffrey Loria, the Worst Owner in Sports

Whatever frustration percolated over a rich man getting even richer paled compared to the ding-dong-the-witch-is-dead giddiness expressed by Marlins players and executives past and present in texts and calls to one another. Presuming the deal goes through – plenty of pitfalls remain, a source familiar with the agreement confirmed to Yahoo Sports, and Loria would like to bask in the glow of the All-Star Game at Marlins Park in July, so the timing of any sale remains unclear – it will bring to an end an ownership reign that stained the sport for more than a decade.
To understand the treachery of Loria and David Samson, the team president and son of Loria’s ex-wife, one need only understand a single number: $1.2 billion. That’s how much a $91 million note from J.P. Morgan to help finance the team’s new stadium, which opened in 2012, is going to cost Miami-area taxpayers. That’s 13 times the original loan. In all, $409 million worth of loans will balloon to $2.4 billion.
And here’s the thing: That’s not even the worst part. For years, the Marlins cried poor to local politicians, saying they needed a stadium to make money. Never would they open up their financials, of course, because they would have shown the Marlins had cleared nearly $50 million in profits the two years before Miami-Dade County approved the stadium funding. Ultimately, the government cowed, and the Marlins got perhaps the most sweetheart of sweetheart stadium deals, which is saying something. They covered only a quarter of construction costs. They keep all of the stadium revenues: tickets, parking, concessions. They pay $2.3 million annually in rent – money that goes to pay off a county loan.
by Jeff Passan, Yahoo Sports | Read more:
Image: via:
Why Whole Foods is Now Struggling
Organic food has never been so popular among American consumers. Ironically, that’s bad news for the brand that made organic a household name — namely, the Austin-based Whole Foods.
On Wednesday, Whole Foods reported what is arguably its worst performance in a decade, announcing its sixth consecutive quarter of falling same-store sales and cutting its outlook for the year. The company is closing nine stores, the most it has ever closed at one time. A mere 16 months ago, Whole Foods predicted it would grow its 470 U.S. locations to more than 1,200.
The problem is one that chief executive John Mackey probably didn’t predict when he first opened Whole Foods as a neighborhood natural foods store 36 years ago: Organics, then a fringe interest, have become so thoroughly mainstream that organic chains now have to face conventional big-box competitors. Mass-market retailers were responsible for 53.3 percent of organic food sales in 2015, according to the Organic Trade Association; natural retailers clocked in just north of 37 percent.
And Whole Foods is hardly the only store feeling the squeeze: Sprouts and Fresh Market, the second- and third-largest publicly traded organic stores, have also seen falling stock prices.
“Whole Foods created this space and had it all to themselves for years,” said Brian Yarbrough, an analyst at Edward Jones. “But in the past five years, a lot of people started piling in. And now there's a lot of competition.”
In many ways, the story of Whole Foods's decline is also the story of how the organic movement took over the United States. Between 2005 and 2015, sales of organic food increased 209 percent, according to the Organic Trade Association. Last year, organic sales topped $43.3 billion.
The driving force behind this growth, most analysts agree, is none other than millennials: Consumers aged 18 to 34 are the largest buyers of organics, and they’re the most likely to consider themselves “knowledgeable” about their food. As they came of age, mainstream grocery chains have been forced to adapt, too.
Walmart ramped up its organics selection in 2006. Kroger introduced its Simple Truth brand in 2012 — the store’s chief executive, Mike Ellis, later said it was the store’s “most successful brand launch ever.” Earlier this week, Aldi announced plans for a $1.6 billion U.S. expansion, with much of that growth aimed at offering “a wider range of organic and gluten-free products.”
By volume, the largest organic retailer in the United States is believed to be Costco, which in 2015 sold $4 billion of organic produce and packaged foods. Like Walmart, Kroger and Aldi, Costco sells organic produce for considerably less than do natural food stores, farmers markets or Whole Foods. In fact, lowering prices has been one of Whole Foods’ primary strategies for dealing with competitors.
Apart from shuttering stores and stalling expansion plans, the company is continuing to focus on 365 by Whole Foods, a two-year-old division aimed at launching stores for “value-conscious” consumers. It’s also been dropping prices at its regular locations and mailing out national discount circulars, something it had not previously done. Speaking to investors Wednesday, Mackey indicated that he did not want to see “too big of a gap” between the prices at Whole Foods and those at stores like Costco and Kroger.
But some organic advocates are concerned that lowering the prices of organic foods — an apparent prerequisite for mainstream popularity — can only happen at the expense of the movement’s early principles. This fear is not entirely new: Michael Pollan fretted about it in the pages of the New York Times when Walmart began selling organic Rice Krispie treats 11 years ago. But with results like Whole Foods's, it is becoming more urgent, said Ronnie Cummins, the co-founder of the Organic Consumers Association.
by Caitlin Dewey, WP | Read more:
Image: Ty Wright/Bloomberg News
Friday, February 10, 2017