Saturday, March 18, 2017

Ignore the Snobs, Drink the Cheap, Delicious Wine

So-called natural wines have recently supplanted kale as the “it” staple of trendy tables — the “latest in holier-than-thou drinking,” according to The Financial Times. Farmed organically and made with minimal intervention, the wine in these special bottles is not to be confused with what one natural wine festival called “industrialized, big-brand, manufactured, nothing-but-alcoholic-grape-juice wines.” In other words, what most of us drink.

The mania for natural wine has puzzled many: How can wine, presumably a simple mix of grapes and yeast, be unnatural? Yet when it comes to sub-$40 wines — the sweet spot for American drinkers, who spend an average of $9.89 per bottle — the winemaking process can be surprisingly high-tech. Like the Swedish Fish Oreos or Dinamita Doritos engineered by flavor experts at snack food companies, many mass-market wines are designed by sensory scientists with the help of data-driven focus groups and dozens of additives that can, say, enhance a wine’s purple hue or add a mocha taste. The goal is to turn wine into an everyday beverage with the broad appeal of beer or soda.

Connoisseurs consider processed wines the enological equivalent of processed foods, if not worse. The natural winemaker Anselme Selosse maintains that chemical futzing “lobotomizes the wine.”

But they are wrong. These maligned bottles have a place. The time has come to learn to love unnatural wines.

As a trained sommelier, I never expected to say that. I spent long days studying the farming practices that distinguish the Grand Crus of Burgundy and learning to savor the delicate aromas of aged Barolos from organic growers in Piedmont. Yellow Tail, that cheap staple of grocery stores and bodegas, was my sworn enemy.

When Treasury Wine Estates, one of the world’s largest wine conglomerates, invited me to California for a rare view into how its inexpensive offerings are — in industry parlance — “created from the consumer backwards,” I was prepared to be appalled. Researchers who’d worked with Treasury spoke of wine “development” as if it were software or face cream. That seemed like a bad sign.

Then I learned Treasury had parted from the tried-and-true method of making wine, in which expert vintners create bottles that satisfy their vision of quality. Instead, amateurs’ tastes were shaping the flavors.

I watched this process unfold in a cramped conference room where Lei Mikawa, the head of Treasury’s sensory insights lab, had assembled nearly a dozen employees from across the company. First, Ms. Mikawa had the tasters calibrate their palates, so they shared a consistent definition of “earthy” or “jammy.” In a few days, they would blind taste 14 red wines and rate the flavors of each. (The samples usually include a mix of existing Treasury offerings, unreleased prototypes and hit wines that the company may hope to emulate.) Next, approximately 100 amateurs from the general public would score the samples they liked best. By comparing the sensory profile of the wines with the ones consumers most enjoyed, Ms. Mikawa could tell Treasury what its target buyers crave.

Maybe they’d want purplish wines with blackberry aromas, or low-alcohol wines in a pink shade. Whatever it was, there was no feature winemakers couldn’t engineer.

Wine too full of astringent, mouth-puckering tannins? Add Ovo-Pure (powdered egg whites), isinglass (fish bladder granulate) or gelatin. Not tannic enough? Replace $1,000 oak barrels with stainless steel and a bag of oak chips (toasted for flavor), tank planks (oak staves), oak dust (what it sounds like) or a few drops of liquid oak tannin (pick between “mocha” and “vanilla”). Cut acidity with calcium carbonate. Crank it up with tartaric acid. When it’s all over, wines still missing that something special can get a dose of Mega Purple, a grape-juice concentrate that has been called a “magic potion” for its ability to deepen color and fruit flavors.

More than 60 additives can legally be added to wine, and aside from the preservative sulfur dioxide, winemakers aren’t required to disclose any of them.

This should have been the ultimate turnoff. Where was the artistry? The mystery? But the more I learned, the more I accepted these unnatural wines as one more way to satisfy drinkers and even create new connoisseurs.

For one thing, winemaking has long fused art with science, even if that’s not the story drinkers are told. Ancient Romans doctored their wines with pig’s blood, marble dust, lead and sulfur dioxide. Bordelaise winemakers have been treating their wines with egg whites for centuries. And though the chemicals dosed into wine can sound alarming, some, like tartaric acid, already occur naturally in grapes. The only difference is that today’s winemakers can manage the process with more precision.

by Bianca Bosker, NY Times |  Read more:
Image: Sébastien Plassard

Friday, March 17, 2017

Bare Necessities

Fun, in Prudhoe Bay, Alaska, is a calendar event. Out here, on the largest and most remote oil field in the United States, thousands of workers rise each morning in endless summer, eternal darkness, mosquitoes, and snow, to begin twelve-hour shifts, which on the drilling rigs require a discipline that is taken seriously: a mistake, however small, could cause this entire place to explode, as it did in West Texas two years ago, or in Texas City twelve years ago. For a change of landscape, one can board a bus with elderly tourists to the edge of the Arctic Ocean, point out the artificial palm tree, suggest a dip, and laugh—the water is 28 degrees—but even that route grows dull: the single gravel lane that traces tundra abuts miles of pipeline. For the oil workers, there is little to look forward to before the end of a two-week shift except for scheduled socialization. Each summer, such fun goes by the name Deadhorse Dash, a 5K race traced across nearby Holly Lake.

“Last year, someone dressed up as a dancing polar bear,” Casey Pfeifer, a cafeteria attendant, tells me when I arrive at the Prudhoe Bay Hotel for lunch on the afternoon of the race. Casey is wearing purple eyeliner and a sweatshirt that reads MICHIGAN in looping gold-glitter cursive. Every two months Casey travels between Idaho and Prudhoe Bay—for her, life in Alaska is synonymous with adventure—to work in the service industry at places like the Hotel, which is not actually a hotel at all but a work-camp lodge, with hundreds of tiny rooms housing twin-size cots and lockers. Casey smiles at me from behind her warming tray and I feel cozy, despite the dirt and dust clinging to my skin. The fluorescent lights illuminate her golden hair, which is tucked into a sock bun, and she tongs a sliver of battered cod. “Picture it,” Casey says. She sways her butt to the sound of nothing. “This giant bear, and he is grooving.”

I picture an enormous mascot gyrating to the Backstreet Boys. It is not my idea of fun, but I am an outsider. I had arrived on the North Slope only the day before, seeking a week in the most isolated community in America and what I hoped would be storybook Alaska: purple arching Coho salmon, caribou, moose, air that belongs in a breath-mint commercial. Instead I found square buildings like so many others, and a cafeteria just like that of a high school, with wheels of cheesecake and racks of chips. How normal everything felt. At an empty table, I watch workers lay playing cards out in front of them. Behind them, mounted televisions loop the Steve Harvey Show and Maury, The Price Is Right and Dr. Phil. Workers in heavy coveralls spoon cubes of honeydew onto their plates, consider the merits of the cacciatore, and pile their bowls with limp linguini. They puff their cheeks like chipmunks, gearing up, they joke, for what would no doubt prove a feat of monumental athleticism.

“The calories aren’t expended in the walking,” one worker tells me, reaching into a basket of Little Debbie Swiss Cake Rolls. I watch as his hands, the largest I have ever seen, raise the cakes to his mouth. He consumes them whole, parting his lips dramatically—wet pink petals, upon which the skin blisters, burned by Arctic sunlight. His name, he says, is Jeff Snow, but he goes by Snowman. He earned the nickname in the dead of winter, because up here, he comes alive: a redneck, forklift-driving Frosty the Snowman, made animate by extremes.

“The real work tonight is swatting the mosquitoes,” Snowman says. He rolls his eyes, he laughs.

“The Deadhorse Dash is mostly bullshit. But it’s the sort of bullshit you look forward to.”

According to posters fixed to the cafeteria’s white-painted cinderblock walls, participants are to meet at six o’clock by the biggest warehouse in the stretch, owned by Carlisle Transport. The evening would start with a few minutes of mingling, during which men with binoculars would scan the horizon for polar bears. “They rarely come in this far in summer,” Snowman says, “but better alive than dead and sorry.” Once our safety is assured, we would set out across the tundra, tracing a two-mile stretch from one edge of Holly Lake to another across a landscape normally restricted to oil-field employees and suppliers who hold the highest level of security clearance. At the halfway checkpoint, marked by a folding table, we would collect a token, redeemable at the finish line for a burger, a handful of chips, a chocolate-chip cookie wrapped in thin plastic, and our choice of apple or banana.

An Arctic picnic in eternal summer.

“It’s a privilege, really,” Casey says. There is a home-away-from-home feeling here, she explains. But still, one passes most days as if a zombie. You rise, you work, you eat, you go to bed, repeat. Mostly, life on the North Slope is spent waiting to return to life in the Lower 48 and, with it, a return to children’s birthday parties, dinners with the spouse, backyard barbecues, and the simplicities of normal life: a fishing line, unfurled and bobbing red above a riverbank in Idaho.

“Put it to you this way,” Snowman says. “We don’t do anything up here but work and sleep and eat. So shit like this means an awful lot.”

by Amy Butcher, Harper's |  Read more:
Image: Amy Butcher

Thursday, March 16, 2017

The Lessons of Obamacare

What Republicans should have learned, but haven't.

On January 6, President Barack Obama sat down with us for one of his final interviews before leaving the White House. The subject was the Affordable Care Act — the legislation that has come to carry his name and define his legacy.

These were strange circumstances for Obama to find himself in. He was leaving office an unusually popular president, with approval numbers nearing 60 percent. But his most important domestic achievement was imperiled. Republicans had spent years slamming Obamacare for high premiums, high deductibles, high copays, and daunting complexity. Donald Trump had won the White House in part by promising to repeal the ACA and replace it with “something terrific.” Both houses of Congress would be controlled by Republicans who appeared set to carry out his plan.

But over the course of the next 70 minutes, it became clear that Obama didn’t think they would get the job done. If he sounded unexpectedly confident, it’s because he believed the wicked problems of health reform — problems that bedeviled him and his administration for eight years — would turn on the GOP with equal force.

“Now is the time when Republicans have to go ahead and show their cards,” he said. “If in fact they have a program that would genuinely work better, and they want to call it whatever they want — they can call it Trumpcare or McConnellcare or Ryancare — if it actually works, I will be the first one to say, ‘Great; you should have told me that in 2009. I asked.’”

Two months later, the release of House Republicans’ replacement plan — the American Health Care Act — has made Obama look prescient. The bill quickly placed Republicans under siege from both the left, which has found more to like in Obamacare as its survival has become threatened, and the right, which attacked the replacement as unrealistic and ill-considered, and, most damning of all, as “Obamacare 2.0.”

The biggest problem Republicans face, though, isn’t from activists in either party. It’s from the tens of millions of Americans who now depend on Obamacare, and their friends, families, co-workers, and neighbors. They have been promised a replacement that costs less and covers more, and the GOP’s plan does neither.

According to the Congressional Budget Office, the AHCA would throw 24 million people off health insurance over the next 10 years and leave those who remain in plans with higher deductibles, higher copays, and less coverage. The law would let insurers charge older Americans five times as much as younger Americans, and its sparer subsidies wouldn’t adjust to the local cost of insurance coverage, and thus would be insufficient in many areas. This is not the “something terrific” Trump promised, nor the kind of health care that polling shows Americans want.

We are reporters who have covered health care, and the legislative ideas that became the Affordable Care Act, since before Obama’s election. In the course of that reporting, including recent conversations with Obama and dozens of elected officials and staffers responsible for the Affordable Care Act’s design, passage, and implementation, we have unearthed several lessons from the law, which current and future health reformers should heed.

At the moment, Republicans are ignoring most of them.

Lesson 1: Everything in health care is a painful trade-off. Own it.

Obama had a habit, back in meetings during the Affordable Care Act’s drafting, his former advisers recall. He would start twisting an invisible Rubik’s cube in the air, working his hands around to try to make the pieces fit together just right.

This was what health policy felt like: trying to slot together competing priorities in a way that was just as maddening as trying to get the colored sides of a Rubik’s cube to line up.

Any government health coverage expansion involves a series of trade-offs, decisions that will inevitably anger one constituency or another. Provide robust health insurance plans, for example, and you need to spend more money — if you don’t, you must decide to cover fewer people. Provide skimpier coverage, and the price tag of a health insurance expansion goes down, but people get frustrated with their high deductibles and copays.

Change the system so one group pays less, and another group, inevitably, has to pay more.

“Those trade-offs have bedeviled efforts to expand health insurance coverage for decades,” says Doug Elmendorf, who directed the Congressional Budget Office during the health law debate. “It is very hard to maximize health coverage while minimizing the cost to the government and disruptions to current insurance arrangements.”

The most important part of writing health policy isn’t figuring out a way around those trade-offs, although many legislators have tried. It’s making the trade-offs that will lead to the best outcomes, and explaining those clearly to constituents. The Obama administration knew from the start that it wanted to make health insurance more accessible to those who had traditionally struggled to get covered: people who are sicker, older, and poorer — and did not have access to employer-sponsored coverage. Democrats didn’t just want to get millions covered. They had specific demographics in mind they wanted to benefit.

“If you replace a 60-year-old with a 20-year-old, that doesn’t change the number of people covered, but it changes the value of the coverage and of the program,” says Jonathan Gruber, an MIT economist who helped the White House model the economic effects of Obamacare.

Democrats had to make very clear trade-offs to advantage this older, sicker population.

For example, the law limits the premiums that insurers could charge their oldest consumers to just three times whatever they billed the youngest enrollees. The Affordable Care Act mandated that insurers must cover 10 “essential health benefit” categories. These included medical care that plans in the individual market have historically left out, such as mental health services and maternity care.

These changes were great for those who were older and required significant medical care. But bringing unhealthy people into the market is difficult, “because it requires the healthy people who had a sweet deal in the past to pay higher rates,” says former Health and Human Services Secretary Kathleen Sebelius. “There is no question that some people’s rates went up, but the old market didn’t work very well for the majority of people who needed coverage.”

Democrats’ trade-off brought consequences, one of them being that the health care law has long struggled to attract as many young people as the White House would like. Back in 2012, administration officials told us they wanted one-third of the marketplace enrollees to be between 18 and 34. The number has never gotten there, hovering around one-quarter for the past four years.

Zeke Emanuel, who worked as one of President Obama’s health care advisers, says the administration tilted the playing field too far in favor of the sick and elderly, making it difficult for young people to sign up. He says the administration should have let insurers charge older people more, perhaps four times as much as the youngest consumers.

“We made the wrong trade-off,” he says. “The consequence is costs for old people are higher because we don’t have enough young people in the pool.”

Veterans of the 2009 health care fight have dozens of stories about different trade-offs they had to make, ones that would anger different constituencies. The administration was constantly trying to balance the desire to expand coverage to as many people as possible against the commitment to keeping the package revenue-neutral. It faced outside pressure from hospitals and insurers, who some thought might turn their backs on the effort if it didn’t bring tens of millions of Americans into the health insurance system.

“Almost every aspect of the bill was inextricably linked,” says Nancy-Ann DeParle, one of Obama’s top health care advisers. “Every time we tweaked the subsidies or the individual mandate penalties, CBO had to re-estimate the bill to see how it affected coverage. If CBO said that coverage decreased, that was a big problem, because the hospitals’ support for the bill was contingent on getting a high percentage of the uninsured covered.”

Trump’s own ideas about health policy do not seem to grapple seriously with these trade-offs. He repeatedly talks about covering more people at a lower cost but has offered no plan to do so.

The American Health Care Act, however, lays these issues bare. It makes different trade-offs than the ones that Democrats made. The bill would change the rules of the individual market to advantage people who are younger, healthier, and higher-income — but disadvantage people who are older, sicker, and poorer.

AHCA, for example, would allow insurers to charge the oldest enrollees five times as much as the youngest enrollees. It would allow insurers to sell less robust health insurance plans that cover a smaller percentage of enrollees’ costs.

The results are particularly grim for older, poorer enrollees — many of whom vote Republican. According to the CBO’s analysis of the plan, a 64-year-old making $26,500 would see his premiums rise by 750 percent under the AHCA. But not only are Republicans refusing to own that trade-off — they’re refusing to own any trade-offs.

“Nobody will be worse off financially,” promised Health and Human Services Secretary Tom Price in a recent Meet the Press appearance. That’s a promise no plan could keep, but one that Republicans have now made, in public, and that will be played back in ad after ad after ad.

Democrats learned, over months of hard work, that there was no free lunch in health policy. Republicans are now beginning to run into the same difficult truth: Every new winner in health care comes with a new loser.

by Sarah Kliff and Ezra Klein, Vox | Read more:
Image: uncredited

Crocodile Tears: The Logo Battle Between Lacoste, Izod & Crocodile Garments

Image: Lacoste S.A.

The Revolution Will Not Be Curated

“The era of the curator has begun,” declared the prominent art critic Michael Brenson in 1998. The figures who assembled artworks into galleries, he reasoned, were now “as essential” to exhibits as the artists themselves. Curators were a species of universal genius who “must be at once aestheticians, diplomats, economists, critics, historians, politicians, audience developers, and promoters,” Brenson wrote. “They must be able to communicate not only with artists but also with community leaders, business executives, and heads of state.” And what a curator “welcomes or excludes” is what makes all the difference.

Whatever else we might think of this assertion, it was certainly prescient. Today the era of the curator is in full flower. The contemporary literature about the heroic organizer of exhibitions is large and enthusiastic, with adulatory new installments added all the time. In 2006, a prominent art writer saw a generation of bold young curators “armed with a vision of possibility and an image of the curator as a free agent, capable of almost anything.” In 2012, the New York Times marveled at the growing number of “programs in curating studies” and at how certain curators established themselves as “star names” in the art world. Brightest among these stars, without a doubt, is one Hans Ulrich Obrist, a curator at the Serpentine Gallery in London, the author of Ways of Curating and A Brief History of Curating, and the closest thing there is to an art-world superstar these days, his every taste-quirk fawned over by the press. (...)

And, of course, “curating” describes something that websites are supposed to do. It is the new and more benign word for what a short while ago was called “aggregating,” or what a less pretentious person might call “editing” or “sifting.” The web is a vast, chaotic, onrushing thing, the idea goes, and “curators” promise to sort it all out for us, welcoming and excluding as they see fit. That’s why what goes on at Pinterest and Tumblr and Instagram and Digg is often called “curating.” Above all, curating is what takes place at Facebook, where busily sifting “news curators” used to choose stories to be included in the hotly desirable “trending” category. (...)

What is a curator, and why is it the admired cultural position of the moment? Why is this the word that springs to our tongues today when once we would have said “DJ,” or “blogger,” or “expert,” or just “snob”? And why is it persistently associated with liberals?

Consider the most basic aspect of the word as we use it today. A curator is an arbiter, someone who distinguishes between what is good and what is bad. Curators tell us what to welcome and what to exclude, what to keep and what to toss. They make judgments. They define what is legitimate and what is not.

But curators don’t make these judgments subjectively or out of the blue, as would chefs or gourmands or other sorts of fussy people. No, curators are professional arbiters of taste and judgment, handing down their verdicts on news stories or pot roasts from a position of dignity and certified authority.

The word is deeply associated with academic achievement. Gallery curators are often people with advanced degrees, and “curation” and its variants are sometimes used to describe certain kinds of university officials. The highest officers of the University of Missouri, for example, are called curators, and at Bennington College, even prospective students are encouraged to think of themselves as curators—curators, that is, of their applications to associate with this illustrious institution. As Bennington’s magazine puts it, they are invited to “curate their submissions and engage in the admissions process as a learning experience.”

It’s all about social status, in other words, and the eternal desire of Americans to claw their way upward by means of some fancy-sounding euphemism. Back in the 1980s, English professor Paul Fussell set down a list of occupations that had contrived to class themselves up by adopting longer names that sounded more professional. “In many universities,” he wrote,
what used to be the bursar is now the disbursement officer, just the way what used to be an undertaker (already sufficient as a euphemism, one would think) is now a funeral director, an advance of two whole syllables. . . . Selling is raised to retailing or marketing, or even better, to merchandising, an act that exactly doubles its syllables, while sales manager in its turn is doubled by being raised to Vice-President, Merchandising. The person on the telephone who used to provide Information now gives . . . Directory Assistance, which is two syllables grander.
And so experts of every kind have in our time been promoted to curators, which is not just a longer word but one that carries grand professional implications.

Curatolatry also imparts a certain smiling friendliness to expertise. Long ago, a “curator” was a medical worker of some indeterminate sort—someone charged with curing, basically. And even today we can see that curators do what they do not because they are greedy or snobbish but because they want to nurture the public. This is especially important as scandals ripple through profession after profession: accounting, appraising, investment banking, medicine, and so on. A curator would never use monopoly power to gouge users of some prescription medicine, for example. They care about us too much. They are not dictators; they are explainers, to mention another occupation that is much in vogue nowadays. They just want to help you to learn and understand. They are authority figures, yes, but they are lovable and benevolent ones.

And they are infinitely adaptable. Curatorial authority can be counted on to fine-tune your taste in food, your news consumption—even, in all likelihood, your ideological worldview. And with every sphere of American experience so promiscuously aestheticized, the curatorial reflex can be understood as something that a benevolent class of tastemakers and enlightened celebrities is selflessly undertaking for your own good. That’s why, for example, self-appointed celebrity pundits such as Lena Dunham and Alec Baldwin claim improbable perches in the protest culture of liberalism—and why Meryl Streep, who laid into Trump on the press’s behalf at the Golden Globe awards, has acquired the status of a twenty-first-century Edward R. Murrow.

by Thomas Frank, The Baffler |  Read more:
Image: Lindsay Ballant

Malcolm Gladwell Wants to Make the World Safe for Mediocrity

On black identity in the Caribbean and the United States

COWEN: There’s a discussion that Sylvia Wynter, the Jamaican intellectual, offered in year 2000, and I’d like your opinion on this. She said there’s something special about the United States: that in Jamaica, or in many parts of the Caribbean more broadly, that being middle class can in some way counter the fact of blackness socially, and serve as a kind of offset. But she said about the United States, and here I quote, “The US itself is based on the insistent negation of black identity, the obsessive hypervaluation of being white.” Do you think that’s an accurate perspective?

GLADWELL: Well, yeah, there is something . . . well, I hesitate to say under-theorized, but there is something under-theorized about the differences between West Indian and American black culture, the psychological difference between what it means to come from those two places. I think only when you look very closely at that difference do you understand the heavy weight that particular American heritage places on African-Americans. What’s funny about West Indians is, they can always spot another West Indian. And at a certain point you wonder, “How do they always know?” It’s because after a while you get good at spotting the absence of that weight.

And it explains as well the well-known phenomenon of how disproportionately successful West Indians are when they come to the United States because they seem to be better equipped to deal with the particular pathologies attached to race in this country — my mother being a very good example. But of course there are a million examples.

I was just reading for one of my podcasts; I’ve been reading all these oral history transcripts from the civil rights movement. I was reading one today and I’m halfway through. And I had that completely unbidden thing, “Oh, this guy’s a West Indian.” He was an African-American attorney and a civil rights lawyer in Virginia in the ’60s. I got a 30-page transcript. I got to page 15, I’m like, “He’s West Indian.” And then, literally page 16, “My father came from Trinidad and Tobago with my mother and me.”

COWEN: [laughs]

GLADWELL: There is something very, very real there that’s not, I feel, fully appreciated.

COWEN: Another difference that struck me — tell me what you think of this — is that the notion of freedom for much of the Caribbean, it’s in some way more celebratory, and it’s more rooted in history, and it may be because these are mostly majority black societies. History is in a sense controlled; it’s much more commemorative. Does that make sense to you? It’s not a struggle to control the narration of history at a national level.

GLADWELL: Yes. You’re in charge of the narrative —

COWEN: Yes.

GLADWELL: . . . which is huge. I thought of this because I wanted to do — sorry, my podcast is on my mind — I wanted to do and I haven’t managed to figure out how to do it, but there’s a Jamaican poet called Louise Bennett. If you are Jamaican, you know exactly who this person is. She’s probably the most important colloquial poet. I think that’s the wrong word. Popular poet. And she wrote poetry in dialect. So for a generation of Jamaicans, she was an assertion of Jamaican identity and culture. My mother was a scholarship student at a predominantly white boarding school in Jamaica. She and the other black students of the school, as an act of protest, read Louise Bennett poetry at the school function when she was 12 years old.

If you read Louise Bennett’s poetry, much of it is about race. It’s about race where the Jamaican, the black Jamaican often has the upper hand. The black Jamaican is always telling some sly joke at the expense of the white minority. So it’s poetry that doesn’t make the same kind of sense in a society where you’re a relatively powerless minority. It’s the kind of thing that makes sense if you’re not in control of major institutions and such, but you are 95 percent of the population and you feel like you’re going to win pretty soon.

My mother used to read this poem to me as a child where Louise Bennett is . . . the poem is all about sitting in a beauty parlor, getting her hair straightened, sitting next to a white woman who’s getting her hair curled.

[laughter]

GLADWELL: And the joke is that the white woman’s paying a lot more to get her hair curled than Louise Bennett is to get her hair straightened. That’s the point. It’s all this subtle one-upmanship. But that’s very Jamaican.

On the subject of Revisionist History season two

COWEN: Now, to ask about your podcasts. I know some of them in the second season, they’ll be about the civil rights movement — in particular, the 1950s, which are a somewhat neglected time. I’ll throw out just a few possible forces that led America to start to become more integrated in the ’50s, and you tell me which you think are neglected or underrated.

One would be professional sports and Jackie Robinson starting to play baseball in the late ’40s. Another would be entertainers, a move toward having more black leads in movies and also music, say Chuck Berry or even James Brown. Harry Truman integrating the military, or the desire, for purposes of Cold War propaganda, to actually show this country is making some progress on civil rights issues. Which of those or which other factors do you feel are the ones we’re missing in understanding this history?

GLADWELL: If I had to rank those, army one. And I would say that the entertainment and sports . . . I would say that it was either neutral or worse than neutral.

COWEN: Why worse than neutral?

GLADWELL: Because I actually think if we were to take the long view, and we would look at this from a hundred years from now, we would say that . . . it is not unusual for minorities to first make their mark in sports and entertainment. You see it with Jews, you see it with Italians, you see it with Irish. But the thing that’s striking to me about those movements is they move in and out of those worlds pretty quickly. So the Jewish moment in sports is really quite short.

COWEN: Sure.

[laughter]

GLADWELL: Which is in retrospect not that surprising.

COWEN: Boxing especially.

GLADWELL: It’s like that long. The African-American moment in those transitional fields is really long; it continues to this day. And it’s almost to the point where you feel that what happens is, they move into those worlds and get stalled there. And their presence in that world accentuates and aggravates existing prejudice about their community as opposed to serving as a way station to a better place.

So, if your problem is that you’re facing a series of stereotypes about how you are intellectually inferior, how you have a broken culture, how you have . . . I could go on and on and on with all of the stereotypes that exist. Then how does playing brutally violent sports help you? How is an association, almost an overrepresentation in these various kinds of public entertainments advance your cause? I’m for those things when they’re transitional, and I’m against them when they seem like dead ends.

COWEN: How important a factor was the research of Mamie and Kenneth Clark? That’s some work that, had there been a Malcolm Gladwell at the time, would have been written up even more — the notion that when there’s segregation, people may value themselves or their race less. It seems that had a big impact on the Warren Court, on other thinking. What’s your take on their influence?

GLADWELL: Well, the great book on this is Daryl Scott’s Contempt and Pity. He’s a very good black historian at Howard [University], I believe. Yes, he’s the chair of history at Howard. And he has much to say, so I got quite taken when I was doing this season of my podcast with the black critique of Brown v. Board of Education. And the black critique of Brown starts with some of that psychological research because the psychological research is profoundly problematic on many levels.

So what Clark was showing, and what so moved the court in the Warren decision, was this research where you would take the black and the white doll, and you show that to the black kid. And you would say, “Which is the good doll?” And the black kid points to the white doll. “And which doll do you associate with yourself?” And they don’t want to answer the question. And the court said, “This is the damage done by segregation.”

Scott points out that if you actually look at the research that Clark did, the black children who were most likely to have these deeply problematic responses in the doll test were those from the North, who were in integrated schools. The southern kids in segregated schools did not regard the black doll as problematic. They were like, “That’s me. Fine.”

That result, that it was black kids, minority kids from integrated schools, who had the most adverse reactions to their own representation in a doll, is consistent with all of the previous literature on self-hatred, which starts with Jews. That literature begins with, where does Jewish self-hatred come from? Jewish self-hatred does not come from Eastern Europe and the ghettos. It comes from when Jewish immigrants confront and come into close conflict and contact with majority white culture. That’s when self-hatred starts, when you start measuring yourself at close quarters against the other, and the other seems so much more free and glamorous and what have you.

So, in other words, the Warren Court picks the wrong research. There are all kinds of problems caused by segregation. This happens to be not one of them. So why does the Warren Court do that? Because they are trafficking — this is Scott’s argument — they are trafficking in an uncomfortable and unfortunate trope about black Americans, which is that black American culture is psychologically damaged. That the problem with black people is not that they’re denied power, or that doors are closed to them, or that . . . no, it’s because that something at their core, their family life and their psyches, have, in some way, been crushed or distorted or harmed by their history.

It personalizes the struggle. By personalizing the struggle, what the Warren Court is trying to do is to manufacture an argument against segregation that will be acceptable to white people, particularly Southern white people. And so, what they’re saying is, “Look, it’s not you that’s the problem. It’s black people. They’re harmed in their hearts, and we have to usher them into the mainstream.”

They’re not making the correct argument, which was, “You guys have been messing with these people for 200 years! Stop!” They can’t make that argument because Warren desperately wants a majority. He wants a nine-nothing majority on the court. So, instead, they construct this, in retrospect, deeply offensive argument, about how it’s all about black people carrying this . . . and using social science in a way that’s actually quite deeply problematic. It’s not what the social science said.

by Tyler Cowen and Malcolm Gladwell, Medium |  Read more:
Image: Caren Louise Photographs

Wednesday, March 15, 2017

Breaking Faith

Over the past decade, pollsters charted something remarkable: Americans—long known for their piety—were fleeing organized religion in increasing numbers. The vast majority still believed in God. But the share that rejected any religious affiliation was growing fast, rising from 6 percent in 1992 to 22 percent in 2014. Among Millennials, the figure was 35 percent.

Some observers predicted that this new secularism would ease cultural conflict, as the country settled into a near-consensus on issues such as gay marriage. After Barack Obama took office, a Center for American Progress report declared that “demographic change,” led by secular, tolerant young people, was “undermining the culture wars.” In 2015, the conservative writer David Brooks, noting Americans’ growing detachment from religious institutions, urged social conservatives to “put aside a culture war that has alienated large parts of three generations.”

That was naive. Secularism is indeed correlated with greater tolerance of gay marriage and pot legalization. But it’s also making America’s partisan clashes more brutal. And it has contributed to the rise of both Donald Trump and the so-called alt-right movement, whose members see themselves as proponents of white nationalism. As Americans have left organized religion, they haven’t stopped viewing politics as a struggle between “us” and “them.” Many have come to define us and them in even more primal and irreconcilable ways.

When pundits describe the Americans who sleep in on Sundays, they often conjure left-leaning hipsters. But religious attendance is down among Republicans, too. According to data assembled for me by the Public Religion Research Institute (PRRI), the percentage of white Republicans with no religious affiliation has nearly tripled since 1990. This shift helped Trump win the GOP nomination. During the campaign, commentators had a hard time reconciling Trump’s apparent ignorance of Christianity and his history of pro-choice and pro-gay-rights statements with his support from evangelicals. But as Notre Dame’s Geoffrey Layman noted, “Trump does best among evangelicals with one key trait: They don’t really go to church.” A Pew Research Center poll last March found that Trump trailed Ted Cruz by 15 points among Republicans who attended religious services every week. But he led Cruz by a whopping 27 points among those who did not.

Why did these religiously unaffiliated Republicans embrace Trump’s bleak view of America more readily than their churchgoing peers? Has the absence of church made their lives worse? Or are people with troubled lives more likely to stop attending services in the first place? Establishing causation is difficult, but we know that culturally conservative white Americans who are disengaged from church experience less economic success and more family breakdown than those who remain connected, and they grow more pessimistic and resentful. Since the early 1970s, according to W. Bradford Wilcox, a sociologist at the University of Virginia, rates of religious attendance have fallen more than twice as much among whites without a college degree as among those who graduated college. And even within the white working class, those who don’t regularly attend church are more likely to suffer from divorce, addiction, and financial distress. As Wilcox explains, “Many conservative, Protestant white men who are only nominally attached to a church struggle in today’s world. They have traditional aspirations but often have difficulty holding down a job, getting and staying married, and otherwise forging real and abiding ties in their community. The culture and economy have shifted in ways that have marooned them with traditional aspirations unrealized in their real-world lives.”

The worse Americans fare in their own lives, the darker their view of the country. According to PRRI, white Republicans who seldom or never attend religious services are 19 points less likely than white Republicans who attend at least once a week to say that the American dream “still holds true.”

But non-churchgoing conservatives didn’t flock to Trump only because he articulated their despair. He also articulated their resentments. For decades, liberals have called the Christian right intolerant. When conservatives disengage from organized religion, however, they don’t become more tolerant. They become intolerant in different ways. Research shows that evangelicals who don’t regularly attend church are less hostile to gay people than those who do. But they’re more hostile to African Americans, Latinos, and Muslims. In 2008, the University of Iowa’s Benjamin Knoll noted that among Catholics, mainline Protestants, and born-again Protestants, the less you attended church, the more anti-immigration you were. (This may be true in Europe as well. A recent thesis at Sweden’s Uppsala University, by an undergraduate named Ludvig Bromé, compared supporters of the far-right Swedish Democrats with people who voted for mainstream candidates. The former were less likely to attend church, or belong to any other community organization.)

How might religious nonattendance lead to intolerance? Although American churches are heavily segregated, it’s possible that the modest level of integration they provide promotes cross-racial bonds. In their book, Religion and Politics in the United States, Kenneth D. Wald and Allison Calhoun-Brown reference a different theory: that the most-committed members of a church are more likely than those who are casually involved to let its message of universal love erode their prejudices.

Whatever the reason, when cultural conservatives disengage from organized religion, they tend to redraw the boundaries of identity, de-emphasizing morality and religion and emphasizing race and nation. Trump is both a beneficiary and a driver of that shift.

by Peter Beinart, The Atlantic |  Read more:
Image: Edmon De Haro

Tuesday, March 14, 2017

Roy Orbison and Friends


[ed. 2:30 to 5:20. James Burton and Bruce Springsteen. Some pretty awesome guitar playing.]

President Trump's Press Secretary, Sean Spicer, at the 2008 annual Easter Egg Roll.
via:

How Gonzaga Became the Central Hope for the Struggling City of Spokane

[ed. I like Spokane. Clean, wide city streets, friendly people, reasonable traffic, active city center, nice restaurants, tidy neighborhoods. Certainly didn't appear as down-trodden as this article suggests. I've spent some time on the Gonzaga campus too, and it's beautiful.]

I doubt he remembers, but the first time I met Mark Few was when he was with his wife and children, looking for a seat inside a mega-church on the outskirts of Spokane, Washington. I was home from university for the winter holiday and had tagged along with my father on that Sunday morning. Upon traversing a thousand-car parking lot, we were greeted by six video screens, a handful of professional cameramen, a 12-person band, and hundreds of Protestants, gathered together to sing contemporary Christian rock.

Few, walking quickly and in lockstep with his wife, hustled past me, no doubt looking to find a seat before a crazed fan could accost him. “Great work,” I said, as he zipped past. Few, whose face is compressed and tanned like a Florida retiree, was wearing a yellow, wool sweater. He nodded towards me. “Thanks,” he said, before disappearing into a mass of singing white people.

As the men’s basketball head coach at Gonzaga University, Few is an extremely tough man to pin down. I bumped into the world-famous art dealer Larry Gagosian at the Hemingway Bar in Paris not long after that, and even he had time for a couple of words. But in the deeply conservative, largely rural, college-basketball-obsessed town of Spokane, Coach Few is the famous equivalent of about nine Larry Gagosians. He is always getting approached in Spokane’s restaurants, stores, parking lots, even churches. As the coach of the Gonzaga men’s basketball team, he is the central – perhaps the only – source of hope for a struggling city.

In 1881, Spokane was incorporated as a lumber and mining town, with thousands of men coming by way of the newly established Northern Pacific Railway through Montana and Idaho in search of gold, silver, and mill jobs. Jesuits founded Gonzaga shortly thereafter, in 1887, offering classes in theology and Latin. Surrounded by open country and pine trees, Spokane sits on a tiny lump of a hill. The air smells of Ponderosa bark, and the city experiences all four seasons: temperatures soar over a hundred degrees in May and drop below zero in December.

Spokane (pronounced spoh-kan) has changed a good deal since its founding, and as is typical of cities whose central industry is no longer needed, its quality of life began to slide once the demand for milling and mining fell in the early 20th century. In the 1940s, with the second world war spurring the economy, aluminum plants became Spokane’s central industry; but in the postwar period, Spokane saw few newcomers. All of its job industries had dried up.

In 1974, there was a world expo that brought a trolley system, a gondola ride, and an expanded downtown, replete with carousel and Ferris wheel, but the carousel is now closed most of the year, the Ferris wheel now rusted. As far as economics and quality of life go, Spokane has stayed essentially the same since its postwar slump: still poor, still dangerous.

Last year, Spokane ranked as the 22nd most dangerous city in the United States, up from 26th the year before. Last year alone there were 10 murders, 1,100 violent crimes, and 12,000 property crimes. President Trump’s message of gloom and doom resonated acutely with Spokane, and the deeply conservative US congresswoman Cathy McMorris Rodgers has represented Spokane County since 2005. Spokane’s unemployment rate is stalled at about 7%, the highest for a medium- or large-sized city in Washington and double the rate of Seattle. Over 17% of Spokane’s population lives below the poverty line. Spokane, in short, is a town in desperate need of success, vicarious or otherwise. (...)

When Mark Few was named head coach of the Gonzaga men’s basketball team in 1999, he put Spokane on the map. Every year that Few has been head coach, the Gonzaga Bulldogs have gained entrance to the NCAA tournament, making it to the Sweet 16 five times and once to the Elite Eight. Yet, they have never been to the Final Four, and a championship has always seemed unlikely, no matter their ranking or early hype. (...)

For most Spokanites, the Zags have become like a close friend. For my father and me, they are a team that we look towards during times of success, but also – perhaps especially – during times of difficulty. My father listens to every Zags game on the radio, while eating his dinner alone. My mother passed a few years ago, so whether they’re running up the score against Santa Clara or losing to BYU, my father listens in, extending his invisible support for the team just as they return their type of invisible support to him.

In the same way that a hometown team provides emotional support to its residents, it provides a common social currency as well. Personally, I could not be living a more different type of life than my father – or than many of my childhood friends who stayed back in Spokane – but a quick mention of Karnowski’s high-scoring season or Williams-Goss’s rebounding prowess immediately levels the conversational playing field. Perhaps this common currency and invisible mutual support helps reconcile the lack of logic inherent in turning over your feelings of self-worth and happiness to strangers dribbling and shooting an orange ball.

Zooming out even more, the reason a struggling town ascribes emotional significance to a constantly rotating group of 18- to 21-year-old boys is a slippery phenomenon. What is it, exactly, that my father is hoping for when he’s eating his pasta and listening to the game on the radio? What about the yelling fans who paid five dollars for upper-level seats? The bus driver with his “Go Zags” sticker on his ticket machine? The residents of the crumbling house who post a “Gonzaga Bulldogs” pennant in their window?

The aforementioned psychology of connection explains much of it, but not all of it. For someone who, growing up, was generally more interested in reading a book than watching the Zags play, the answer has long eluded me. But I believe those activities have more in common than I’d previously given them credit for. The janitor who is among the 17% of people living below the poverty line, returning home to north Spokane late at night to catch the Zags, isn’t thinking about the “mirror neurons” that are firing or the common social currency he is establishing. He isn’t thinking about the economic possibilities of a strong sports team, Few’s $1.37m per year salary, or the soaring number of applications to Gonzaga after they made it to the Elite Eight. He’s excited that his team is favored to win the national championship this year, that his team is in its best form in the school’s history.

by Cody Delistraty, The Guardian |  Read more:
Images: markk

Why Porn Has Gotten So Rough

“It’s become more rough. It’s become generally more… humiliating,” offers Julianna, a top porn agent. “Anyone can open the internet and find anything they want, and when you watch this, you go, ok, what’s the next step? You’re always curious about going deeper and deeper and deeper.”

Julianna is the co-founder of Julmodels, an agency for porn performers based in Hungary. She is also one of the subjects of Pornocracy, an eye-opening documentary about the state of porn playing at SXSW. The crux of the film, directed by the French porn veteran turned director Ovidie, is that free XXX tube sites have not only left the adult industry in tatters, but are a pox on society: a danger to sex workers, forcing them into extreme acts of degradation due to dwindling demand, and to our youth, allowing them unfettered access to hardcore pornography.

The latter issue looms large in Ovidie’s harrowing film, a stygian exploration of porn’s white-collar underbelly that likens its hoodied, pierced creator to a Lisbeth Salander-esque hacktivist truth-bombing the system. In one particularly cringe-inducing scene, Pierre Woodman, a renowned DIY porn filmmaker, captures the corrupting influence of tube sites.

“The root of it all is that internet piracy is killing adult movies, streaming content that should only be for adults but that is now unfortunately available to young people as well,” he says. “And I’m fed up with hearing every day during casting sessions a girl who says, ‘Oh I’ve known you since I was eight years old.’ That’s just too much.”

After navigating her way past performers, handlers, and producers, Ovidie reaches the kingpin: MindGeek, a multinational corporation with a near-monopoly on free streaming porn. The conglomerate owns all the sites in the Pornhub network, including YouPorn, RedTube, GayTube, Tube8, and Pornhub; as well as the porno studios Brazzers, Digital Playground, Reality Kings, Twistys, and the bulk of Playboy’s digital and TV operations. But the sprawling company, which previously operated under the names Mansef and Manwin, has run afoul of the law on numerous occasions. In 2009, the Secret Service seized $6.4 million from two Fidelity bank accounts controlled by Mansef, with the Feds accusing the syndicate of money laundering; and in 2012, its then-owner Fabian Thylmann, a young German programmer once hailed as the Mark Zuckerberg of porn, was arrested on charges of tax evasion.

What Pornocracy does is raise plenty of questions concerning MindGeek’s operations. Why is it headquartered in Luxembourg, a notorious tax haven, when most of its operations appear to be run out of Canada? Do Wall Street hedge funds have a controlling interest in the company? Who is actually pulling the strings? How are these sites not violating copyright laws? And why is the money allegedly being routed through various countries to performers?

“They’re a fishy, weird company,” says Stoya, a Digital Playground contract girl from 2007-2013, in the film. “My Fleshlight royalties, when the wire transfers come in, go through banks in places like South Africa. They have offices in Ireland. It’s a bunch of men with Greek last names and thick Greek accents claiming to be Quebecois.”

by Marlow Stern, The Daily Beast |  Read more:
Image: Magneto Presse

You Can Now Send and Request Money in Gmail on Android

Google Wallet has been integrated into Gmail on the web since 2013, but today Google is rolling out a new integration on mobile. Starting today, users of the Gmail app on Android will be able to send or request money with anyone, including those who don’t have a Gmail address, with just a tap.

The user experience has been designed to make exchanging money as easy as attaching a file, Google explains in its announcement. To access the new feature, you tap the attachment icon (the paperclip), then choose to send or request money, depending on your needs. A pop-up window appears where you can enter the amount, add a note, and send.

The entire process takes place in the Gmail app – you don’t have to have Google Wallet installed. In addition, recipients can configure it so the money they receive through Gmail goes directly into their bank account. There are no fees involved, notes Google.

The goal, seemingly, is to take on quick payment apps like PayPal, Venmo or Square Cash by offering a way to move money right within Gmail’s app. This could be useful for those times when money is already the topic of an email conversation – like when you’re planning a trip with friends, or getting the family to go in together on a gift for your parents, for example.

But whether or not people would think to turn to Gmail for other uses, like splitting the dinner bill or paying friends back for drinks, is another matter.

by Sarah Perez, TechCrunch |  Read more:
Image: uncredited

Daylight Saving Hell

I shouldn’t be obsessed with daylight saving time, but I am. Like a pregnancy due date, a college graduation, or an income-tax payment, I have DST circled in red on my calendar and amplified with exclamation marks.

A few years ago, it meant nothing to me. I work at home—I can sleep or rise anytime I want, and I don’t get melancholy when the days get shorter. But here’s what I’ve come to anticipate with dread: changing the time on the clock in my car.

It’s nothing fancy: a 2015 Subaru Forester that I bought used. Although I don’t consider myself a dimwit, I absolutely cannot figure out how to set the clock. Twice a year, when the time changes, I find myself sitting in the car reading the Forester manual or at my desk watching YouTube videos on this subject and still, setting the clock is unfathomable.

My Forester has a large central display and, to the left of the wheel, three imposing black levers. To set the time, you have to do an intricate dance with these levers, along with corresponding icons that look like amoebas or the rings of Saturn. Even to get to the “clock function screen,” you must first navigate past the main information screen, which provides access to a dozen other functions. If you’re persistent enough, you’ll eventually land on one that says DATE, which means, in classical Subaru, TIME. From there, you go back to the three levers and start playing with them, looking for a way to enter the correct hour of day. If you mess up—by taking one hand off the lever too quickly or depressing the wrong one out of sequence—the entire screen goes dark and you’re back to square one.

There was a time when the clock in my car was correct. After the battery died (because I forgetfully left the headlights on), it was five hours off. I tried every way I could to adjust it, and then, in desperation, I duct-taped a small travel clock to the dashboard and used that.

When the guy from AAA came to charge the battery, I asked him if he could set the time. This is a man who spends every day under the hood of a car. He took one look at the three levers and snorted, “Sorry, can’t help you.”

I made an appointment at the Subaru dealership. A jumpsuited technician got into the driver’s seat and started to manipulate the three levers. He kept a blank face, but I could tell he was having problems. After about fifteen minutes, the screen said it was January 4035 and the time was 00.15 A.M.

He walked into the garage and another technician came out. I did not like this at all; it felt like when you’re suffering from a condition so anomalous that your doctor needs an immediate consultation with a superior. The new guy settled into the driver’s seat, where he, too, became frustrated. I felt like I had to say something, to excuse myself for bringing a car into the dealership to have the clock set and to make them more at ease with the problem.

“This is so unusual,” I said. “I mean, I’m not that stupid.” I blabbered on: “I actually went to Yale, where I did my graduate work, and I still can’t figure out how to set this damn clock!” My bona fides did nothing to help them set the clock, relax, or commiserate with me, so I shut up. After half an hour, the clock was set. I asked the guys to show me how to do it and they said, “Too complicated.” I left it at that.

Following this professional intervention, I took the travel clock off the dashboard and glanced happily at the official car clock to check the time. It felt so luxurious. But then it hit me: soon it would be daylight saving time and the clock would again be wrong.

by Jane Stern, Paris Review |  Read more:
Image: uncredited

Monday, March 13, 2017


Elles
via:

Lisa Breslow, Spring and Green, 2016
via:

Tom Gauld
via:

Meet the Companies Literally Dropping ‘Irish’ Pubs in Cities Across the World

[ed. I used to dream of going to Ireland or Scotland and experiencing the historic pub culture (along the lines of Ballykissangel or Local Hero), but not much anymore. I get the impression everything's been too westernized and Disneyfied (like many other cities and countries of the world).]

The walls of the bar are covered in old art, photographs of Ireland, and yellowing posters in frames. A pair of hurleys, the flat ash sticks used in the Gaelic game of hurling, are tacked above the door frame. The bar is otherwise full of dust-coated bottles of bygone whiskeys and stouts, musical instruments, and familiar ridged glass partitions that gracefully generate several spaces where there might have been just one.

Christy Moore, beloved grandfather of contemporary Irish folk music, hums over the speakers. The manager — who, pleasingly, shares a first name with Moore — flits easily from bar to table, genially asking, in a Donegal accent, after the general well-being of diners and drinkers. Notably, there are few shamrocks, in any form or medium — they, along with leprechauns, are generally derided as emblematic of a very loose grip on Ireland and “Irishness.”

The Auld Dubliner — small, dark, and convincing, with a flat, matte, unassuming facade (red and yellow lettering over black paint, on wood) — rests between a heavily illuminated branch of T-Mobile and a “dueling piano café” on a street approximately 5,000 miles from the place invoked in its name. Almost every part of the bar the eye falls on — from the stocky tables and the upholstered chairs to the floor tiling and the mock oil lamps dangling from the ceiling — was railed into the unit in Long Beach, California, from a 40-foot container that spent between three and five weeks at sea.

The bar’s trappings belie its location — a retail complex — and the year of its opening: 2003. Like thousands elsewhere, it was designed and prefabricated in Ireland: an export not cultural or theoretical, but actual. The assiduous export and installation of these pre-made Irish bars has been going on for more than 30 years, resulting in a global network of establishments that are interrelated but unrelated. A loose confederation. A franchise without a name.

In the late 1970s, Dublin architecture student Mel McNally and some classmates were tasked with analyzing a piece of local architecture. They decided to make their subject the city’s pubs. A dim view was taken of their proposal, but in the end, the project was such a success that it became a months-long public exhibition. Much of the work went missing in the final days, as McNally tells it, so emotive and sought-after were the drawings and renderings.

McNally went on to research the whole of Ireland to establish a definitive playbook of pub varieties, which led to the foundation of a design and manufacturing specialist, the Irish Pub Company (IPC), in 1990. The ambition was to design and build complete interiors of pubs, first domestically, then for foreign markets, assembling huge shipments of flooring, decorative glass, mirrors, ceiling tiles, light fixtures, furniture, signage, and bric-a-brac, as well as the obvious centerpiece: the bar itself.

The group now sells bars in six “styles” that can be selected from a company catalog: Shop, Gastro, Victorian, Brewery, Country, and Celtic. At a glance, the variations may seem slight. Upon closer inspection, though, the Victorian option makes distinctively liberal use of brass accents and plummy tones. “Country” is a simpler affair: woody, closer to a kitchen, and liable to feature wall-mounted crockery and/or an open fire. “Gastro” would appear to be the hipster iteration, the furniture sleek and the setting more contemporary, one conducive to nu-Irish pursuits like craft beer and artisanal gin tasting.

The Celtic style, on the other hand, plays up ancient folklore and mythology. “Brewery” uses related paraphernalia, cobblestone, and slate to get at the historical version of its name. “Shop” riffs on the rural pubs that doubled as general stores — or the general stores that doubled as pubs — a special configuration still found in Ireland.

Asked about essential components of an Irish bar, McNally offers, “I think everybody recognizes that good stained glass makes a difference,” delivering the line with total solemnity. Also important: spaces. “When I talk about spaces in pubs, very few clients know what I’m talking about,” he says, naming the Long Hall — a revered, beguiling Dublin pub, popular and relied upon for generations — as emblematic. Part of a protected structure, the pub has a jaunty red exterior and is a deep red within, like a heart, warm and compact, with chambers that inform the natural flow of patrons. “You know when you walk in how you wind up gathering up with people.”

Guinness, faced with flagging sales internationally, partnered with IPC shortly after the company’s 1990 launch. McNally’s model was a highly effective conduit for sales of stout, and financial backing offered by Guinness enabled McNally’s expansion into continental Europe, subsidizing new operators and investing heavily in marketing.

The companies worked together to promote the flatpack Irish bar, made to order, as a marketable commodity. Introductory workshops were hosted. Country managers were appointed to handle particularly interested markets. Later on, assistance reportedly took the shape of a five-day class on all operational aspects and extended to the recruitment of Irish people to staff new openings.

by Siobhán Brett, Eater |  Read more:
Image: Irish Pub Company

The New Party of No

[ed. I'd vote for Elizabeth in a New York minute. She's spent her whole career supporting working families, eviscerating Wall Street, and fighting inequality. Expect her to be demonized, Clinton-style, as soon as the midterm elections roll around.]

On the morning after Election Day, Chuck Schumer’s phone rang. It was Donald Trump. Trump has repeatedly described Schumer as his friend — which, the New York senator was at pains to clarify when we first spoke in mid-February, “isn’t quite true.” There had been the occasional favor; at Schumer’s request, Trump hosted a fund-raiser for the Democratic Senatorial Campaign Committee at Mar-a-Lago in 2008, and Schumer made a cameo on “The Apprentice” in 2006. Beyond that, Schumer told me: “I bump into him at meetings here and there. We never went out to dinner once. We never played golf together. I sort of knew him.”

On election night, Schumer was with Hillary Clinton at the Javits Convention Center in Midtown Manhattan when, around 8 p.m., he saw some troubling exit polls coming out of Florida and North Carolina. They showed that college-educated women in both states — a demographic that everyone assumed would be a lock — were underperforming for Clinton. Schumer called one of her top campaign advisers, who tried to reassure him. “He says, ‘Don’t worry, our firewalls in Wisconsin and Pennsylvania and Michigan are strong,’ ” Schumer recalls. “ ‘There’s no way Trump can win.’ ”

Schumer kept up appearances. He tweeted a photo of himself in front of a catering table with Kate McKinnon, who plays Clinton on “Saturday Night Live” (“I got to congratulate Hillary Clinton — oops, wrong Hillary!”), and then took the stage, leading the crowd in a chant of “I believe that she will win!” But by shortly after 11 p.m., Trump had taken Ohio and North Carolina. The probability dashboards on the data-journalism websites had lurched Trumpward, and an unthinkable future was lumbering into view.

Schumer, who was in line to succeed Harry Reid as the top-ranking Democrat in the Senate after Reid’s retirement in December, had spent roughly $8 million of his own campaign funds on Democratic senatorial campaigns in other states in hopes of retaking control of the upper chamber, which the Democrats lost in 2014, and of making himself the majority leader. On his bookshelf he kept a copy of “Master of the Senate,” the historian Robert Caro’s exhaustive chronicle of Lyndon Johnson’s years as the Democratic majority leader, which Caro had inscribed to Schumer: “Whose career I have been following for years with real admiration, so that I have no doubt that he will be a great leader of senators.” But by the early hours of Wednesday it was clear that the Democrats would not take the Senate and that Schumer would not be Lyndon Johnson.

“We’ll work together,” Trump said on the phone call. He said he wanted to repeal the Affordable Care Act — “that A.C.A. is terrible,” he told Schumer — which was an obvious nonstarter for the incoming minority leader. He also said he wanted a trillion-dollar infrastructure plan. “I said, ‘Well, a trillion dollars sounds good to me,’ ” Schumer told me. But to get Democrats on board, he warned, three conditions had to be met. “You can’t do it with these tax breaks,” he said. Second, he could not “cut the programs we care about — Medicare, education, scientific research — to pay for this. It’s got to be new spending.” Finally, the bill had to preserve existing environmental and labor protections. “I said, ‘To do that, you’re going to have to get half your party really mad at you.’ ”

Schumer, as he saw it, was calling Trump’s bluff. “Donald Trump ran as an anti-establishment populist — against both the Democratic and Republican establishments,” he told me. Whether or not he had meant it, the Democrats could try to hold him to it. On the several occasions that Trump called Schumer in the weeks after the election, Schumer argued that he could try to govern as a hard-right conservative, but “America is not a hard-right country,” and there would be electoral consequences.

It might not have been the strongest card to play, but Schumer did not have a strong hand. The election in November left the Democrats stripped of power at every level of federal authority. Schumer would now possess the only means they had of exerting even limited influence over Trump’s agenda: a Senate Democratic caucus that, while several seats shy of a majority, was large enough to make life complicated for Senate Republicans. But that could happen only if the Democrats formed a united front — and it was unclear whether they could, or even wanted to.

The 2016 election was not just an electoral crisis for the party but also an existential one, more severe than any that the Democrats had experienced in decades. The party had glided through the campaign with a sense of destiny: In July, Schumer breezily remarked that “For every blue-collar Democrat we will lose in western Pennsylvania, we will pick up two, three moderate Republicans in the suburbs in Philadelphia, and you can repeat that in Ohio and Illinois and Wisconsin.” Then Hillary Clinton lost to a candidate who revived a strain of nativist, nationalist politics that had been dormant in the Republican Party for at least a generation, and who won in part on the ballots of Barack Obama voters in traditionally Democratic strongholds like Michigan and Wisconsin. “I sleep like a baby,” Chris Murphy, the Democratic senator from Connecticut, told me. “And I can sleep anywhere — on the road, on the floor, in my kid’s bed jammed up against the wall.” But on election night, he says: “It’s so cliché, but I stayed up all night. I was, mentally, totally unprepared. At some level, you do have this — ” he trailed off and was silent for a moment. “You do start to question whether you know the country as well as you thought.”

The Democratic primaries and caucuses, meanwhile, had left the party sharply divided. Clinton lost 22 states to Bernie Sanders, Vermont’s independent and self-identified socialist senator, whose out-of-nowhere challenge had stirred a grass-roots excitement that Clinton’s campaign conspicuously lacked and acrimoniously split the Democratic Party’s centrist and left-leaning contingents — the latter of which viewed the Obama years as a missed opportunity to fight economic inequality, reorient trade policy and rein in Wall Street. Clinton might have won the popular vote, but in a way, this only amplified the confusion: about whether the party needed to transform or simply tinker; whether it needed to move to the right or to the left; whether the voters who were willing to vote for a candidate who said the things Trump said could be won back at all.

These were problems for anyone trying to chart a course for the Democrats, but in a particularly acute way they were problems for Schumer, a politician who was better known as a dealmaker, a student of consensus, than as a pathfinder. As majority leader, the next two years might have been the pinnacle of his career: bill signings, valedictory news conferences (and few politicians visibly delight in news conferences the way Schumer does), the sorts of late-night negotiations that historians like Caro write books about. Instead, Schumer found himself with a job that The Times Union of Albany, two days after the election, called “something of a booby prize.” The Democrats, who spent Obama’s presidency railing against Republican obstructionism, would soon be facing a president who, in his stated ambitions to unmake much of Obama’s legacy, was all but inviting them to try the same. Whether this was in Schumer’s DNA was one question. Whether it was in the party’s was another.

The Democrats have never been a natural opposition party, or a particularly effective one. Republicans from Reagan to the Tea Party broadly believed in reducing government, as the anti-tax activist Grover Norquist famously put it, “to the size where I can drag it into the bathroom and drown it in the bathtub.” Cutting budgets and eliminating programs might require a Republican president and a congressional majority, but lacking this, a disciplined minority party could gum up the works, starving existing initiatives and blocking attempts to expand them.

Democrats, by contrast, have generally been united by a belief in government that tries to do big things, in the manner of Franklin D. Roosevelt’s New Deal or Johnson’s Great Society or, later, Barack Obama’s Affordable Care Act — a belief that, practically speaking, requires either landslide majorities or a willingness to compromise. Several public-opinion polls in recent years have found that this difference is reflected in the parties’ electorates, which have increasingly come to view the political process in starkly different ways. In a 2014 Pew survey, 82 percent of people who identified as “consistently liberal” said they liked politicians who were willing to make compromises; just 32 percent of “consistently conservative” respondents agreed.

The week of Trump’s inauguration, David Brock — the onetime conservative journalist turned liberal gadfly — hosted a private gathering of Democratic luminaries at Turnberry Isle, a golf resort outside Miami. One speaker Brock invited was Ronald Klain, the former chief of staff to Vice Presidents Al Gore and Joe Biden, who previously worked for Tom Daschle, the Senate minority leader, in the aftermath of the 1994 midterms, when the Republicans took control of both houses of Congress for the first time in 40 years. Addressing Brock’s crowd, Klain called for Democrats to embrace what he dubbed the Hundred-Day Fight Club. As Klain learned from working with Daschle, “You have to take on a lot of fights to win any fights,” he told me recently. “When you’re in the minority, you can’t be too choosy. I advocated a strategy of more comprehensive opposition.” But plenty of the Democrats present — among them the Chicago mayor and former White House chief of staff Rahm Emanuel, who was sharing the stage with Klain — disagreed. “At the time, there was more of a divide,” Klain told me. “The ‘we gotta pick our spots’ philosophy: ‘He’s a new president — we don’t want to look like McConnell looked in 2009.’ ” Mitch McConnell, the Republican senator from Kentucky who was then the minority leader and is now majority leader, reportedly mapped out a program of near-total resistance shortly before Obama took office — a strategy that Democrats spent subsequent years attacking as cynical and irresponsible. “We’re Democrats,” Klain said. “We like to govern.”

Among the Democrats who appeared to share Klain’s postelection view was Harry Reid, Schumer’s predecessor as minority leader, who would be retiring in December. Reid and Schumer, then Reid’s deputy, were an effective team for years in the Senate, partners in a long-running good-cop-bad-cop act. Schumer was known as a sharp-elbowed partisan during his 18 years in the House of Representatives, but in the Senate he had become an avatar of the gabby aisle-crossing bonhomie that had historically characterized the upper chamber. “You know, I get along,” Schumer told me. “I’m in the gym in the morning, I’m talking with Thune, and Lamar, and Cornyn all the time,” he said, referring to the Republican Senators John Thune, Lamar Alexander and John Cornyn. “I’m friends with them. They attack me, I attack them. We understand that.”

This was the old way of the Senate — one that began to fade in the late 1970s as the ideological consolidation of the parties accelerated, reducing their need and taste for compromise. By the time Schumer arrived in 1999, four years after the Newt Gingrich-helmed Republican revolution, it was fast becoming an anachronism, with meals in the senators’ bipartisan dining room giving way to one-party caucus lunches. By the time Reid became minority leader, amid the scorched-earth polarization of George W. Bush’s second term, it was all but gone.

Reid, like his Republican counterpart, McConnell, was one of the rare politicians who seemed to genuinely not care if people liked him or not. He was known for his blunt-instrument floor speeches, sandbags of verbiage delivered with minimal theatrics and less ambiguity of purpose. It was in this mode that he took to the Senate floor, a week after the election, and drew a line.

Senate Democrats had tried for years to pass the kind of infrastructure bill Trump had suggested, Reid reminded his colleagues, only to run up against Republican opposition. “If Trump wants to pursue policies that will help working people, Democrats will take a pragmatic approach,” he said. “But we also have other responsibilities.” He made clear that the price of Democratic cooperation should be Trump’s dropping of Stephen K. Bannon, the former Breitbart News executive chairman who ran his campaign and whom Trump named as his chief White House strategist two days earlier.

“In his first official act, Trump appointed a man who is seen as a champion of white supremacy as the No. 1 strategist in his White House,” Reid said. “As long as a champion of racial division is a step away from the Oval Office,” he added, “it will be impossible to take Trump’s efforts to heal the nation seriously.”

Reid and Schumer might have differed temperamentally, but they were both thinking about the 2018 midterm elections, in which 25 Democratic senators would be defending their seats. Lose eight seats, and the Republicans would have 60, enough to break a Democratic filibuster — at which point the Democrats’ debates about what they stood for or against would be academic. Five of those senators — “the Big Five,” Schumer called them — were moderate Democrats in states in the Midwest, the Mountain West and Appalachia that Trump had just won handily. The abiding question was what, exactly, the voters who cast ballots for both Trump and, say, North Dakota’s Democratic senator, Heidi Heitkamp, were voting for in 2016; the party was still far from a clear answer.

Schumer, who began holding weekly dinners with the Big Five after the election, believed it was best to allow these senators to cooperate with Trump as necessary. But according to members of Reid’s staff at the time, Reid (who declined to comment) worried that, given Trump’s lack of interest in policy detail and disregard for ethical conflicts, even well-intentioned legislative compromises could prove to be politically costly — that an eventual backlash against the president would also fall upon Democrats who gestured toward working with him. “Reid didn’t want to validate the assumption that this incompetent blowhard could get a bill to the floor in the first place, which has proved to be a struggle for Trump so far,” Adam Jentleson, at the time Reid’s deputy chief of staff, told me.

Democrats were also still deeply divided over whether it was even possible to navigate 2017 without resolving the ideological and policy differences that fractured the party in 2016. At a closed-door meeting of the Democracy Alliance — a network of high-rolling Democratic campaign donors — at the Mandarin Oriental hotel in Washington the week after the election, Senator Elizabeth Warren delivered an emotional address, excoriating the party for losing its way. One attendee paraphrased her speech to me: “People want someone to fight for them — that’s why they voted for Donald Trump. He might not actually do it, but he said he would fight for them. On trade, in American politics, we’ve gotten where we either look like we’re all about free trade without any empathy for people who have lost their jobs, or we’re rabid nationalist-protectionists. We need to build a policy in between. In 2016, we did not come out clear. When we are clear about what we believe, when we fight for people, we’ll win.” To beat pugilistic right-wing populism, maybe you needed pugilistic left-wing populism.

Reid brought Warren onto the Democratic Senate leadership team in 2014, and she was one of the people he most trusted to keep the Senate caucus on its bearings through the difficult weather ahead. Shortly before Thanksgiving, he summoned Warren to the minority leader’s office. When she arrived, the room was littered with art supplies; on an easel was a half-finished portrait of Reid that would be unveiled at his retirement party the following month. Its subject was preoccupied with the future of the party to which he had dedicated decades of his life. Reid told Warren she needed to think seriously about running for president in 2020. “He was worried in November,” Warren told me recently. “For me, it was so important to make clear: We will fight back — we will fight back. We’re not here to make this normal.”

by Charles Homans, NY Times |  Read more:
Image: Photo illustration by James Victore. Photograph by Alex Wong/Getty Images