Wednesday, April 15, 2015

Meh!-lennials

We live in an age of ceaseless generational analysis. Among certain classes, especially business elites, it is considered a sign of profound insight to speak only in terms of youth and its consumer preferences. The jargon once endemic to Ad Age (which coined the term “Generation Y”) now peppers style sections and business books, earnest organizing meetings and talk shows, such that no one of any age can open a newspaper or a website without reading about the “millennials” — people born between 1982 and 2004 — and their doings, interests, and needs.

It seems not to matter to the proliferation of writing about millennials that so much of it has been internally contradictory. In the year 2000, the sinister David Brooks said that stats suggested the boomers were raising friendly, sociable, and altruistic kids. In 2012, Jean Twenge at the Atlantic retaliated with fresh stats that revealed them to be inveterate narcissists profoundly uninterested in social problems. “Politicians: Millennials Won’t Vote Because They Hate You” declaimed Bloomberg, prompting an older Huffington Post correspondent to wonder ruefully, “Millennials: Why Do They Hate Us?” All this despite evidence that millennials vote in the same numbers as young people of previous generations. Millennials, according to Business Insider, are disaffected with workplace authority and value flexibility, but an IBM study written up in the Washington Post suggests that in this respect, too, millennials are indistinguishable from other generations. Reading around, you can form a picture of millennials either as great disrupters, creating massive discontinuities in civilization, or as essentially the same as everyone else. In this way generational analysis resembles astrology: ascribe any quality to a certain sign and your claims are guaranteed to be neither true nor false.

It’s easy, of course, to make fun of generational analysis. For many years generations have been the favored category of social pseudoscientists, not to mention marketing gurus and breathless lifestyle journalists. But much of the oxymoronic character of millennial-speak derives from its pairing of claims to statistical rigor with an utterly unscientific fondness for making wild predictions. Behind this is a confusion of logic, according to which the present desires of humans create the future: once you know what young people want, you know what tomorrow will be like (and how to make a buck off it). Institutions, classes, and environments play hardly any role in this view. One influential example is Richard Florida’s theory of the “creative class,” which imagined the salvation of postindustrial cities resulting from young people choosing to live in them. If millennials like cities, the thinking went, then cities will be rejuvenated. In 2012, Florida sheepishly qualified some points of this theory in a new introduction to 2002’s The Rise of the Creative Class, but his original thesis was so persuasive that it’s still regarded as common knowledge. Meanwhile, the cities that banked on this kind of thinking, like St. Louis or Baltimore, have foundered spectacularly.

The abundance of such lazy analysis may seem reason enough to dismiss “generations” as a meaningful tool for understanding history. What are generations, one might say, but an ingenious marketing rubric we have come to treat as natural? But the fact remains that generations capture everyday divides that everyone recognizes intuitively. People are born into spans of time, into worlds that precede them and survive them. If it makes sense to segment history into periods, it follows that those periods have something to do with the people growing up and dying within them. (...)

But millennials grew up not self-making but defined and redefined by people several decades older. When the term was coined in 1991 by demographers William Strauss and Neil Howe, a great deal of hope was placed in millennials (the oldest of whom were around 8, and not especially responsive to polls). Nurtured by caring parents, Strauss and Howe argued, this new generation would be civic-minded and ethical. Not only would they be less interested in TV than their parents, but “what programs Millennials do watch will be sanitized and laden with moral lessons.” This hopeful portrait was a reaction to its time. It was the close of the Reagan era, when the once socially minded boomers were seen (even by themselves) as having become irremediable narcissists, and twentysomethings were portrayed as Patrick Bateman–type sociopaths. A crazy messianism attached itself to the youth: millennials were going to save this involutionary, belligerent, and vacuous country from itself. And in the years that followed, proliferating urban farms and community-supported agriculture and bike-shares — all faithfully chronicled in GOOD, the echt-millennial, nonprofit-loving magazine of the larger, for-profit Good Worldwide Inc. — began, if you squinted and cherry-picked, to prove the point. The religious fervor peaked with the election of Obama in 2008 — proof, it seemed, that millennials would change the world (66 percent of 18- to 29-year-olds who voted chose Obama, though nearly half of that age group declined to vote at all).

Subsequently, the narrative changed. As the economy went into free fall, the fascination with millennials reached a new intensity, and the think pieces proliferated. And, increasingly, the think pieces disagreed. Who are the millennials, and how do we explain their behavior? What do they stand for? (As if 100 million people ever stood for a uniform thing.) The answers differed greatly depending on the writer and the poll, but the pitch of anxiety was constant. Much of the obsession came from the business world — from the older, wealthy, mostly white decision makers who longed for a master key to understanding the needs and attitudes of the young people who would make and consume their products. For their analysis, these businessfolk looked to the major media institutions — which, racked by the recession, in a panic to figure out the internet, and acutely aware that no one under 90 read the newspaper, were themselves obsessed with what young people wanted. So the papers and magazines catered to their loyal readership — wealthy older people — by feeding them piece after piece about millennials, who seemed less promising than they once had.

by The Editors, N+1 |  Read more:
Image: Google

Aernout Overbeeke

Flight Facilities feat. Micky Green

Would You Let the I.R.S. Prepare Your Taxes?

Around this time every year, Joseph Bankman, a professor of tax law at Stanford Law School and a longtime advocate of using technology to simplify tax filing, gets on the phone with reporters to explain what is wrong with how we do our taxes in the United States. Every year he says pretty much the same thing: No other industrialized country asks its citizens to jump through as many hoops to calculate their taxes as ours does.

It isn’t just lawmakers or the hapless-seeming Internal Revenue Service that is perpetuating the annoyance of tax time, he adds. Instead it is the private sector — specifically, the software company Intuit, which makes TurboTax, the most popular tax program in the country.

For more than a decade, Mr. Bankman and a small group of tax experts have called on the government to create a tax preparation method that they say would vastly reduce the time and cost of tax-filing for most people. Intuit has been a primary obstacle to the effort.

The reform plan would work like this: Today, employers, banks, brokerage firms and pretty much every other financial organization in the country send the federal government detailed records about our economic activity every year. These organizations also send you, the taxpayer, a similar set of documents, which are forms with names like W-2 and 1098. After you file your taxes, the government matches its two sets of documents to make sure you have filed correctly.

To Mr. Bankman, this double documentation doesn’t make much sense. If the government is already collecting financial data from employers and banks, why can’t the I.R.S. use that information to precalculate our tax returns for us? At the very least, why can’t tax software just connect to the government’s database to download all the information that the government has collected, saving us all that record-keeping and data entry?

“Imagine if your vehicle registration fee was done the same way,” Mr. Bankman asked in a recent interview. “Imagine if the state said, ‘Go to your car, find your VIN number and then look at this table that has different tax rates to find out how much you owe.’ If they did, people would probably need to hire an expert for that too.” (...)

Dennis Ventry, a professor at the School of Law at the University of California, Davis, who has studied the issue, said that while Free File and Intuit’s integrations with private companies were beneficial, they would not be nearly as helpful as a government program to reform tax filing.

Mr. Ventry said that if return-free filing were operated nationally, tens of millions of people with simple tax situations might have to do just a few minutes of work at tax time every year. The I.R.S. would send them a tax return that had already been filled in with their financial data, and if everything looked in order, they would file it either through the mail or electronically. The return would be completely voluntary. People who disputed the I.R.S.’s calculation would be able to do their taxes the old-fashioned way. Tens of millions of additional taxpayers with more complex returns would be able to save time by downloading all the financial information that the government has collected about them during the year. You would be able to do your return without hunting for every stray W-2 or 1099 in your household.

“It can help all 145 million taxpayers,” Mr. Ventry said.

But Intuit’s opposition to return-free filing has been ferocious.

by Farhad Manjoo, NY Times |  Read more:
Image: Stuart Goldenberg

One Company’s New Minimum Wage: $70,000 a Year

The idea began percolating, said Dan Price, the founder of Gravity Payments, after he read an article on happiness. It showed that, for people who earn less than about $70,000, extra money makes a big difference in their lives.

His idea bubbled into reality on Monday afternoon, when Mr. Price surprised his 120-person staff by announcing that he planned over the next three years to raise the salary of even the lowest-paid clerk, customer service representative and salesman to a minimum of $70,000.

“Is anyone else freaking out right now?” Mr. Price asked after the clapping and whooping died down into a few moments of stunned silence. “I’m kind of freaking out.”

If it’s a publicity stunt, it’s a costly one. Mr. Price, who started the Seattle-based credit-card payment processing firm in 2004 at the age of 19, said he would pay for the wage increases by cutting his own salary from nearly $1 million to $70,000 and using 75 to 80 percent of the company’s anticipated $2.2 million in profit this year.

The paychecks of about 70 employees will grow, with 30 ultimately doubling their salaries, according to Ryan Pirkle, a company spokesman. The average salary at Gravity is $48,000 a year.

Mr. Price’s small, privately owned company is by no means a bellwether, but his unusual proposal does speak to an economic issue that has captured national attention: the disparity between the soaring pay of chief executives and that of their employees.

The United States has one of the world’s largest pay gaps, with chief executives earning nearly 300 times what the average worker makes, according to some economists’ estimates. That is much higher than the 20-to-1 ratio recommended by Gilded Age magnates like J. Pierpont Morgan and the 20th-century management visionary Peter Drucker.

“The market rate for me as a C.E.O. compared to a regular person is ridiculous, it’s absurd,” said Mr. Price, who said his main extravagances were snowboarding and picking up the bar bill. He drives a 12-year-old Audi, which he received in a barter for service from the local dealer.

“As much as I’m a capitalist, there is nothing in the market that is making me do it,” he said, referring to paying wages that make it possible for his employees to go after the American dream, buy a house and pay for their children’s education. (...)

Mr. Price said he wanted to do something to address the issue of inequality, although his proposal “made me really nervous” because he wanted to do it without raising prices for his customers or cutting back on service.

Of all the social issues that he felt he was in a position to do something about as a business leader, “that one seemed like a more worthy issue to go after.”

by Patricia Cohen, NY Times |  Read more:
Image: Matthew Williams

Only the Good Dine Young


I have unreasonable expectations of what it means to “dine,” and I blame my parents.

Every night around seven p.m., my dad would turn on Ready, Set, Cook! on the Food Network and watch it while he drank wine and made dinner, and I would sit at the kitchen table and read the most recent Zagat guide to New York City. I would work through it page by page, starting with the most popular (Union Square Cafe), then the pages with the top food rankings, the top service rankings, and the top decor rankings, and then the alphabetical restaurant listings, skimming for the names in all caps and checking their scores. Have you been to Le Bernardin? Yes. Have you been to Jean-Georges? Yes. Why isn’t there anything above a twenty-eight? Because things can’t be perfect. But why isn’t there a twenty-nine? I don’t know, Brette. Maybe next year.

I learned that my parents had been to Daniel, too, and that the restaurant was pronounced Dan-yell, because the chef there was French. It was a twenty-eight. Later that year, when my parents asked me what I wanted for my tenth birthday, I asked if we could have dinner there.

At Daniel? Yes please.

I’m not sure if my parents were surprised or appalled or pleased that this quiet girl of theirs who wore basketball jerseys to school had just asked to eat at one of the fanciest and most expensive restaurants in New York City. But they said yes. They were in. I was delighted.

My dad supervised as I called and made a reservation, exactly a month in advance. My voice shook on the phone: Can I please make a reservation for three people? (My little brother, who was six, wasn’t allowed to come.) Oui, mademoiselle. My dad got on the phone afterwards to confirm. It was settled. We were going!

The night of the dinner, my mom blow-dried my hair and I put on a lavender suit and a matching headband. We drove for an hour, into the city, and when I walked into the restaurant—in my mind, a darkened arena ringed with lights, with a man in a suit stepping out behind the host stand to bend down and shake my hand—my eyes widened into saucers. I was so incredibly happy.

These are the things I remember:

by Brette Warshaw, Lucky Peach | Read more:
Image: uncredited

Tuesday, April 14, 2015

Second Hand Stories in a Rusting Steel City

A second hand store in a second hand town is about what you'd expect. A bit derelict, a bit kooky. Rows of thin gold necklaces, mostly crosses and hearts too small to melt down for scrap. Four Sony PlayStations, only two of which work. Boxes of cords to who knows what. Tools — piles of screwdrivers and buzz saws and toolboxes and doohickeys on collapsible tables. Scuffed guitars hanging by their necks. Guns and more guns — some mounted on a pegboard wall and others stuck in a cardboard barrel, butts up. Unopened, unwatchable movies. “Kitchen” scales never purchased for cooking.

In a place like Braddock, Pennsylvania, nothing much surprises you. It’s a poor place, mostly black, mostly a shadow of the boomtown steel days. Some say they should just tear the whole place up and pave a highway through it. Others suggest a new shopping mall in lieu of the roadway. Still others think it should become a green paradise, a playground for artists and intellectuals to put down roots and educate people on sustainability.

Norm doesn’t take much stock in those ideas.

“It didn’t look too good when I got here,” he says of Braddock and the shop. His father-in-law had owned Steel City Pawnbrokers since 1952 and Norm started working here because of his wife. Her father needed some help, and Norm had just quit his job at Stop-N-Go in nearby Oakland.

“It’s a timeless business, one of the oldest in the world — before malls,” he proclaims.

Despite the shop just barely staying afloat (he and the other employees call it a “miracle,” and none of them are quite sure of the true balance of the store’s books), Norm is in five days a week, usually perched on a secondhand stool between Dennis and Ray. Dennis is the “music guy,” mid-forties with black curly hair. He’s the odd man out of the trio — more spry than Norm or Ray, a bit more hip. He works here for the health insurance and to supplement his income as a jazz musician. Ray doesn’t seem to have anything better to do, so he keeps coming in. And Norm?

“I’ve wanted to get out of here a number of times,” he says. “I used to work with my wife’s uncle, and he wanted it his way” — stock perfectly aligned, interactions with customers strictly prescribed.

“I’d quit at least once a week. I’d walk out, and my father-in-law would call and tell me, ‘I’m not gonna be around forever, you’ll be in charge eventually.’”

“He left you an empire!” Ray chimes in. Norm rubs his balding head, amused. When he smiles, he looks a little like Dopey from Snow White. He took over Steel City in the mid-1990s after his wife’s dad died. Even so: “My father-in-law has got some grip on me and won’t go away.”

“It takes balls to own a pawn shop,” Ray assures. He’s in his mid-fifties and sits next to Norm in a chair with a strap-on back massager. The seat and style scream hotel décor; the chair is no doubt one of the many “finds” they’ve come across over the years. Ray has a bad… well, everything, and rarely rises from his throne unless he has to. Dennis mills around the store, fiddling with guitars or tools, singing “Let It Be” in front of a wall of worn LPs.

It would be easy to imagine Norm and Ray sitting almost anywhere — two retirees at the end of a dock with fishing poles propped in their metal lawn chairs; on a front porch gripping beers and using a cooler as a table; waiting for their wives in the lobby of a hair salon — and nothing would change, not the timbre of their days or their conversations over cans of Coke and bag lunches. Their work at Steel City strikes less as a job than a hobby, a habit, a favor to no one in particular.

Once this was the place to be. Steel country — home to Andrew Carnegie’s first mill, in fact. The industrial nexus of the Monongahela, Allegheny, and Ohio rivers. Jobs, families, and children. Streets with movie theaters, pharmacies, and restaurants. Schools with trophy-winning football teams. America’s first supermarket, an A&P, opened here. Carnegie’s first library, too.

The short and dirty narrative of its decline, oft repeated in national and local media: In the 1920s, more than 20,000 people called Braddock home. Industry bloomed; soot stuck to aging stony buildings. In the 1970s, steel crashed and took livelihoods with it in an economic free fall. In 2009, a full fifth of the population earned less than $10,000 annually. Today, fewer than 2,200 residents remain, and Braddock’s median income has dropped to less than half the state average.

To the extent that there are newcomers in Braddock, they seem drawn by a sense of possibility or duty; gentrifiers, yes, but also those who feel the pull of an aging mother. Some are lured by promises of cheap land and a marketing campaign underwritten by Levi’s and driven by the town’s big and brash mayor, John Fetterman. He’s a newcomer too, arriving in 2001 as a volunteer for AmeriCorps, which he joined after finishing grad school at Harvard. “Everyone in the country is asking, ‘Where’s the bottom?’” he told the New York Times in 2009. “I think we’ve found it.”

by Robyn K. Coggins, Wilson Quarterly |  Read more:
Image: Marcus Santos

Shibata Zeshin, Mouse

Paper Moons

Swindle and Fraud, the vaudeville team of nouns headlining this issue of Lapham’s Quarterly, are old dogs always keen to learn new tricks, and their spirited performance during the Great Recession showcased the attention paid to their studies since the Great Houdini, on the evening of January 7, 1918, vanished a five-ton elephant from the stage of New York’s Hippodrome Theater. The new act for a new century topped up the weight of the production values:

Nine banks emptied of more than $500 billion in capital, as much as $8 trillion withdrawn from the Dow Jones Industrial Average, $2 trillion from the nation’s pension and retirement accounts. Sure-handed juggling of the public trust into the private purse. Stock market touts tap dancing the old soft-shoe with the Securities and Exchange Commission, hedge fund operatives with the Federal Reserve. A cool $1 trillion lifted from the U.S. Treasury in broad daylight, members of Congress working the money-box routine with banks too big to fail.

Throughout the whole of its extended run, the spectacle drew holiday crowds into the circus tents of the tabloid press, and joyous in Mudville was the feasting on fools. Why then the gloom among the wizards of Oz in the upper income brackets of the national news media? One might have hoped for at least a tip of the hat from the Wall Street Journal and The Economist, from Bloomberg News and the American Enterprise Institute. How not exult in the powers of the unfettered free market, admire the entrepreneurial initiative, the scale of the revival of the go-ahead, can-do spirit that made America great?

But instead of disbursing laurels, the guardians at the gates of the country’s moral treasure delivered sermons on the text of American decline, many of them in tune with the one composed by New York Times columnist Thomas Friedman in October 2008, by which time the shearing of the sheep was rolling as merrily along as a Macy’s Thanksgiving Day Parade.

“The Puritan ethic of hard work and saving still matters,” said Friedman. “We need to get back to collaborating the old-fashioned way. That is, people making decisions based on business judgment, experience, prudence, clarity of communications, and thinking about how—not just how much.”

A noble sentiment and no doubt readily available in New England gift shops, but to account for the sacking of the Wall Street temple of Mammon as a falling away from the Puritan work ethic is to misread America’s economic and political history, to mistake the message encoded in the DNA of the American dream. Who among the faithful ever has preferred the bird in the hand to the five in the bush? The spoilsports in the pulpits of spiritual reawakening never lack for proof of shameful behavior and lackluster deportment, but when they call as witness for the prosecution the milk-white marble of Western civilization and the holy scripture of American exceptionalism, they tread on shaky ground.

by Lewis Lapham, Lapham's Quarterly |  Read more:
Image: Pierre-Louis Pierson

Gayle Bard, Skagit Flats

A Convention for the Bookish

To walk any part of the eight miles of skyway that connect much of downtown Minneapolis this past weekend was to hear snatches of dialogue endemic to writers. The forty-ninth annual Associated Writing Programs Conference—the largest gathering of poets, writers, writing students, creative-writing-program faculty, literary-journal editors, arts organizations, small presses, and literary entrepreneurs in the country—was under way, and it was snowing. Outside the glass walls of the cavernous Minneapolis Convention Center, big, fluffy, wet flakes were floating down.

But the fourteen thousand literary folks in attendance weren’t paying much attention to the weather. As a whole, they did not seem to be outdoorsy people. They spend most of their days, after all, staring into the blue glow of their computer screens, or sitting around workshop tables beneath fluorescent lights, or poring over piles and piles of manuscripts in windowless rooms. Their work, whether writing or reading, necessitates solitude, and they had travelled from all over the country to participate, to network, to party. They were here to be with their people, weather be damned. In the weeks leading up to the four-day conference, the literary community on Twitter swelled with excitement, and #AWP15 began to trend. It did not trend in a Kanye and Kardashian kind of way, obviously. It trended the way literary writers and poets trend, which is to say not very much. (...)

If every industry has its trade show, and if writing can possibly be described as an industry, A.W.P. has become a thriving nexus of all things literary. Founded in 1967, the organization held its first conference in 1972, at the Library of Congress, with six events and sixteen presenters, including George Garrett, Wallace Stegner, and Ralph Ellison. This year’s conference was host to five hundred and fifty events, two thousand presenters, and over seven hundred small presses, journals, and literary organizations. If Book Expo America, or B.E.A., which is held each spring, is the convention for book publishing, then A.W.P. is the convention for the bookish. (...)

In an age-old literary method for managing terror—though arguably one with diminishing returns—the parties around A.W.P. were booze-fests. (A Monday morning tweet: “Are you a writer? My truck driving husband/AWP escort: No, but I drink like one.”) On Friday afternoon, Electric Literature, The Paris Review, and the National Book Foundation hosted an invitation-only liquid lunch—one martini per guest. One Story magazine held a superhero-themed party, at the Walker Art Center, where the editors wore colorful Lone Ranger-style masks emblazoned with lightning bolts and the wine flowed freely. At the Sarabande Books booth, every purchase was accompanied by a shot of Jim Beam. Each night during the conference, the bar at the Hilton was packed three-deep with poets, writers, and those who love them. At breakfast, these same writers wore sunglasses and croaked out orders for lattes and dry scrambled eggs before heading off to a morning panel on, say, “The Bump and Grind of Meaning: Intuition and Formal Play in Hybrid Nonfiction.”

by Dani Shapiro, New Yorker |  Read more:
Image: HEEB/LAIF/REDUX

The Girls on Shit Duty

I never thought I’d work a job that was dictated by human shit. But things change. When you’re responsible for following men around and cleaning up after them it’s, at best, funny and humbling, and at worst, humiliating. At this remote fly-in fishing lodge in Northern Ontario, we housekeepers are not only modern-day chambermaids, but also plumbers, cleaning ladies, mother-figures, mock-wives, servants and, on the most difficult of days, whipping girls.

But mostly, we’re the Queens of Clean. Every day, the girls who serve the guests their heavy, rich meals of sticky ribs, oily flapjacks, and chocolate pudding are also responsible for tidying the rooms when the fishermen head out on the water. We housekeepers make the beds, sweep the floors in the cabins, refill the tissues and toilet paper, refold the towels, and replace the linens. We pick up garbage that’s been left on the floor, scoop pubic hairs out of the shower drains, and empty the slightly more palatable hair out of the sink traps. We do this with aplomb and a liquid efficiency. Yet somehow, when we have to clean the toilets, we always find ourselves staring down at the bowl and sighing. A weeklong trip filled with deep-fried shore lunches—beer-battered onion rings and fresh walleye fillets destroyed by a gallon of canola oil—does funny things to a man’s insides. Nine weeks of cleaning poop-covered toilets in the remoteness of the Canadian Shield wilderness is likely to do funny things to a woman, too.

by Anna Maxymiw, Hazlitt |  Read more:
Image: Vicki Nerino

Monday, April 13, 2015

Eating a Big Mac at the Arctic Circle


In January of 1989, the temperature got down to sixty degrees below zero in Fairbanks, Alaska, and stayed there for three weeks. Furnaces burned through heating oil at a serious rate, and parking lots slowly filled with cars that wouldn't start anymore, even with the engine block heaters that everyone up there has. Every year in the dead of winter, usually the Japanese imports were the last cars left on the road, but in 1989, even some of the Toyotas had given up.

In that kind of weather, automatic doors would freeze open or closed, so they'd have to be disabled. The people working the drive-thru window at McDonald's wore their parkas while they stood at their posts, because it was impossible to stay warm with the cold air blasting in with every transaction. And there were lots of them: in the winter of '89, almost no one actually got out of their cars and walked anywhere if they didn't have to, including me and my friend Lori, whose Datsun 200sx held up nicely during that particularly long cold snap.

Lori and I were in high school, and we thought that it would be hilariously funny to go through the drive-thru at McDonald's and order ice cream. We got a lot of mileage out of it: our satisfied giggling when the person taking the order paused and said, "You want what?" and then the spectacle we made of ourselves back at school after racing through the ice fog (when water particles in the air literally freeze solid, thick as any fog that rolls off the ocean). We danced around amidst the cacophony of slamming lockers and yelled conversations, waving the tall soft-serve cones over our heads, the hallways glowing orange in the early-afternoon setting sun, so close to the Arctic Circle.

Besides that visit for ice cream, I didn't otherwise go to McDonald's. Its parking lot was where kids met up on weekend nights to figure out where the party was ("party" meaning, usually, a pallet fire, a keg of the cheapest beer possible, and Def Leppard blasting from a boom box). The restaurant was, to me, tainted by its association with people who didn't have anywhere else to go. At 18 years old I was tired of the scene, eager to get out and away from what I saw as small town small-mindedness. I was ready for more.

by Elisabeth Fairfield Stokes, Eater | Read more:
Image: Nick Mealey

[ed. World, meet Calvin]
photo: Nate

Saturday, April 11, 2015

A War Well Lost

Johann Hari is a British journalist who has written for many of the world’s leading newspapers and magazines, including The New York Times, Le Monde, The Guardian, The Los Angeles Times, The New Republic, The Nation, Slate, El Mundo, and The Sydney Morning Herald. He was an op-ed columnist for The Independent for nine years. He graduated from King’s College, Cambridge, with a double first in social and political sciences in 2001.

Hari was twice named “National Newspaper Journalist of the Year” by Amnesty International. He was named “Environmental Commentator of the Year” at the Editorial Intelligence Awards, and “Gay Journalist of the Year” at the Stonewall Awards. He has also won the Martha Gellhorn Prize for political writing.

Hari’s latest book is the New York Times best seller Chasing the Scream: The First and Last Days of the War on Drugs. You can follow him on Twitter @johannhari101.

S. Harris: Thanks for taking the time to speak with me, Johann. You’ve written a wonderful book about the war on drugs—about its history and injustice—and I hope everyone will read it. The practice of making certain psychoactive compounds illegal raises some deep and difficult questions about how to create societies worth living in. I strongly suspect that you and I will agree about the ethics here: The drug war has been a travesty and a tragedy. But you’re much more knowledgeable about the history of this war, so I’d like to ask you a few questions before we begin staking out common ground.

The drug war started almost exactly 100 years ago. That means our great-grandparents could wander into any pharmacy and buy cocaine or heroin. Why did the drug war begin, and who started it?

J. Hari: It’s really fascinating, because when I realized we were coming up to this centenary, I thought of myself as someone who knew a good deal about the drug war. I’d written about it quite a lot, as you know, and I had drug addiction in my family. One of my earliest memories is of trying to wake up one of my relatives and not being able to.

And yet I just realized there were many basic questions I didn’t know the answer to, including exactly the one you’re asking: Why were drugs banned 100 years ago? Why do we continue banning them? What are the actual alternatives in practice? And what really causes drug use and drug addiction?

To find the answers, I went on this long journey—across nine countries, 30,000 miles—and I learned that almost everything I thought at the start was basically wrong. Drugs aren’t what we think they are. The drug war isn’t what we think it is. Drug addiction isn’t what we think it is. And the alternatives aren’t what we think they are.

If you had said to me, “Why were drugs banned?” I would have guessed that most people, if you stopped them in the street, would say, “We don’t want people to become addicted, we don’t want kids to use drugs,” that kind of thing.

What is fascinating when you go back and read the archives from the time is that that stuff barely comes up. Drugs were banned in the United States a century ago for a very different reason. They were banned in the middle of a huge race panic. (...)

S. Harris: We’ll talk about the phenomenon of addiction, and discuss the novel understanding of it you arrive at in the book. But first I think we should acknowledge that drugs and alcohol can cause social harms that every society has an interest in preventing. It’s not hard to see why some people think that the appropriate response to the chaos these substances often cause is to prohibit them.

Consider alcohol. We know, of course, that Prohibition was a disaster. But when you consider what cities were like before the Women’s Christian Temperance Union got working—with men abandoning their jobs and families, spending all day in saloons, and winding up just hammered in the gutter—it’s not hard to see what people were worried about. Ken Burns’s documentary on Prohibition explains this history in a very colorful way. As you and I race to the conclusion that prohibition of all sorts is both unethical and doomed to fail, I think we should acknowledge that many drugs, alcohol included, have the potential to ruin people’s lives.

And it wasn’t completely crazy to think that banning the use of specific drugs might be a good way, ethically and practically, to mitigate their harms. But ever since Prohibition we’ve known that the cure is worse than the disease. When you ban substances that people enjoy using so much that they’ll break the law to do it, you create a black market with huge profits. And since purveyors of illicit drugs have no legal way to secure their investment, the trade will be run by increasingly violent criminals.

In a single stroke, therefore, prohibition creates organized crime and all the social ills attributable to the skyrocketing cost of drugs—addicts are forced to become thieves and prostitutes in order to afford their next fix. Why isn’t the stupidity of prohibition now obvious to everyone?

J. Hari: What’s fascinating is that it was obvious at the time. The drug war really began in the 1930s, when Harry Anslinger was the first person to use the phrase “warfare against drugs”—and it was massively resisted across the United States and across the world. This is a forgotten and suppressed history, and I was startled to uncover it.

I tell it mainly through the story of this extraordinary doctor, Henry Smith Williams, who at the birth of the drug war prophesied all of it. It’s worth remembering that when drugs were first banned, doctors resisted to such a degree that 17,000 of them had to be rounded up and arrested because they insisted on continuing to prescribe to drug addicts. The mayor of Los Angeles stood outside a heroin-prescribing clinic and said, effectively, “You will not close this down. It does a good job for the people of Los Angeles.” The early drug war was hugely contested, and many people rightly pointed out why it wouldn’t work. This is a really important thing to remember. And one of the most fascinating things for me was seeing how much the arguments at both the beginning of the drug war and in societies where they have finally ended it have echoed each other. (...)

S. Harris: This brings us to the topic of addiction. Is addiction an easily defined physiological state that is purely a matter of which substance a person takes and how regularly he takes it? Or is it largely the product of external variables? In your book, you make the latter case. And I think most people would be surprised to learn that in a context where drug use is more normalized, a heroin addict, for instance, can be a fully productive member of society. There’s nothing about regularly taking heroin that by definition renders a person unable to function. So let’s talk a bit about what addiction is and the various ways it changes with its social context.

J. Hari: This is the thing that most surprised me in the research for the book. I thought I knew quite a lot about addiction, not least because I’ve had it in my life since I was a child, with my relatives. But if you had said to me four years ago, “What causes, say, heroin addiction?” I would have looked at you as if you were a bit simpleminded, and I would have said, “Heroin causes heroin addiction.”

For 100 years we’ve been told a story about addiction that’s just become part of our common sense. It’s obvious to us. We think that if you, I, and the first 20 people to read this on your site all used heroin together for 20 days, on day 21 we would be heroin addicts, because there are chemical hooks in heroin that our bodies would start to physically need, and that’s what addiction is. (...)

I didn’t know until I went and interviewed Bruce Alexander, who’s a professor in Vancouver and, I think, one of the most important figures in addiction studies in the world today. He explained to me that our idea of addiction comes in part from a series of experiments that were done earlier in the 20th century. They’re really simple experiments, and your readers can do them at home if they’re feeling a bit sadistic. You get a rat, you put it in a cage, and you give it two water bottles: One is water, and the other is water laced with heroin or cocaine. The rat will almost always prefer the drugged water and will almost always kill itself. So there you go. That’s our theory of addiction. You might remember the famous Partnership for a Drug-Free America ad from the 1980s that depicted this.

But in the 1970s, Bruce Alexander came along and thought, “Hang on a minute. We’re putting the rat in an empty cage. It’s got nothing to do except use these drugs. Let’s try this differently.”

So he built a very different cage and called it Rat Park. Rat Park was like heaven for rats. They had everything a rat could possibly want: lovely food, colored balls, tunnels, loads of friends. They could have loads of sex. And they had both the water bottles—the normal water and the drugged water. What’s fascinating is that in Rat Park they didn’t like the drugged water. They hardly ever drank it. None of them ever drank it in a way that looked compulsive. None of them ever overdosed.

An interesting human example of this was happening at the same time; I’ll talk about it in a second. What Bruce says is that this shows that both the right-wing and left-wing theories of addiction are flawed. The right-wing theory is that it’s a moral failing—you’re a hedonist, you indulge yourself, all of that. The left-wing theory is that your brain gets hijacked, you get taken over, and you become a slave.

Bruce says it’s not your morality and it’s not your brain. To a much larger degree than we’ve ever before appreciated, it’s your cage. Addiction is an adaptation to your environment.

by Sam Harris |  Read more:
Image: Pete Zarria