Tuesday, April 14, 2015


Shibata Zeshin, Mouse
via:

Paper Moons

Swindle and Fraud, the vaudeville team of nouns headlining this issue of  Lapham’s Quarterly, are old dogs always keen to learn new tricks, and their spirited performance during the Great Recession showcased the attention paid to their studies since the Great Houdini, on the evening of January 7, 1918, vanished a five-ton elephant from the stage of New York’s Hippodrome Theater. The new act for a new century topped up the weight of the production values:

Nine banks emptied of more than $500 billion in capital, as much as $8 trillion withdrawn from the Dow Jones Industrial Average, $2 trillion from the nation’s pension and retirement accounts. Sure-handed juggling of the public trust into the private purse. Stock market touts tap dancing the old soft-shoe with the Securities and Exchange Commission, hedge fund operatives with the Federal Reserve. A cool $1 trillion lifted from the U.S. Treasury in broad daylight, members of Congress working the money-box routine with banks too big to fail.

Throughout the whole of its extended run, the spectacle drew holiday crowds into the circus tents of the tabloid press, and joyous in Mudville was the feasting on fools. Why then the gloom among the wizards of Oz in the upper income brackets of the national news media? One might have hoped for at least a tip of the hat from the Wall Street Journal and The Economist, from Bloomberg News and the American Enterprise Institute. How not exult in the powers of the unfettered free market, admire the entrepreneurial initiative, the scale of the revival of the go-ahead, can-do spirit that made America great?

But instead of disbursing laurels, the guardians at the gates of the country’s moral treasure delivered sermons on the text of American decline, many of them in tune with the one composed by New York Times columnist Thomas Friedman in October 2008, by which time the shearing of the sheep was rolling as merrily along as a Macy’s Thanksgiving Day Parade.

“The Puritan ethic of hard work and saving still matters,” said Friedman. “We need to get back to collaborating the old-fashioned way. That is, people making decisions based on business judgment, experience, prudence, clarity of communications, and thinking about how—not just how much.”

A noble sentiment and no doubt readily available in New England gift shops, but to account for the sacking of the Wall Street temple of Mammon as a falling away from the Puritan work ethic is to misread America’s economic and political history, to mistake the message encoded in the DNA of the American dream. Who among the faithful ever has preferred the bird in the hand to the five in the bush? The spoilsports in the pulpits of spiritual reawakening never lack for proof of shameful behavior and lackluster deportment, but when they call as witness for the prosecution the milk-white marble of Western civilization and the holy scripture of American exceptionalism, they tread on shaky ground.

by Lewis Lapham, Lapham's Quarterly |  Read more:
Image: Pierre-Louis Pierson via:

Gayle Bard, Skagit Flats
via:

A Convention for the Bookish

To walk any part of the eight miles of skyway that connect much of downtown Minneapolis this past weekend was to hear snatches of dialogue endemic to writers. The forty-ninth annual Associated Writing Programs Conference—the largest gathering of poets, writers, writing students, creative-writing-program faculty, literary-journal editors, arts organizations, small presses, and literary entrepreneurs in the country—was under way, and it was snowing. Outside the glass walls of the cavernous Minneapolis Convention Center, big, fluffy, wet flakes were floating down.

But the fourteen thousand literary folks in attendance weren’t paying much attention to the weather. As a whole, they did not seem to be outdoorsy people. They spend most of their days, after all, staring into the blue glow of their computer screens, or sitting around workshop tables beneath fluorescent lights, or poring over piles and piles of manuscripts in windowless rooms. Their work, whether writing or reading, necessitates solitude, and they had travelled from all over the country to participate, to network, to party. They were here to be with their people, weather be damned. In the weeks leading up to the four-day conference, the literary community on Twitter swelled with excitement, and #AWP15 began to trend. It did not trend in a Kanye and Kardashian kind of way, obviously. It trended the way literary writers and poets trend, which is to say not very much. (...)

If every industry has its trade show, and if writing can possibly be described as an industry, A.W.P. has become a thriving nexus of all things literary. Founded in 1967, its first conference was held in 1972, at the Library of Congress, with six events and sixteen presenters including George Garrett, Wallace Stegner, and Ralph Ellison. This year’s conference was host to five hundred and fifty events, two thousand presenters, and over seven hundred small presses, journals, and literary organizations. If Book Expo America, or B.E.A., which is held each spring, is the convention for book publishing, then A.W.P. is the convention for the bookish. (...)

In an age-old literary method for managing terror—though arguably one with diminishing returns—the parties around A.W.P. were booze-fests. (A Monday morning tweet: “Are you a writer? My truck driving husband/AWP escort: No, but I drink like one.”) On Friday afternoon, Electric Literature, The Paris Review, and the National Book Foundation hosted an invitation-only liquid lunch—one martini per guest. One Story magazine held a superhero-themed party, at the Walker Museum, where the editors wore colorful Lone Ranger-style masks emblazoned with lightning bolts and the wine flowed freely. At the Sarabande Books booth, every purchase was accompanied by a shot of Jim Beam. Each night during the conference, the bar at the Hilton was packed three-deep with poets, writers, and those who love them. At breakfast, these same writers wore sunglasses and croaked out orders for lattes and dry scrambled eggs before heading off to a morning panel on, say, “The Bump and Grind of Meaning: Intuition and Formal Play in Hybrid Nonfiction.”

by Dani Shapiro, New Yorker |  Read more:
Image: HEEB/LAIF/REDUX

The Girls on Shit Duty

I never thought I’d work a job that was dictated by human shit. But things change. When you’re responsible for following men around and cleaning up after them it’s, at best, funny and humbling, and at worst, humiliating. At this remote fly-in fishing lodge in Northern Ontario, we housekeepers are not only modern-day chambermaids, but also plumbers, cleaning ladies, mother-figures, mock-wives, servants and, on the most difficult of days, whipping girls.

But mostly, we’re the Queens of Clean. Every day, the girls who serve the guests their heavy, rich meals of sticky ribs, oily flapjacks, and chocolate pudding are also responsible for tidying the rooms when the fishermen head out on the water. We housekeepers make the beds, sweep the floors in the cabins, refill the tissues and toilet paper, refold the towels, and replace the linens. We pick up garbage that’s been left on the floor, scoop pubic hairs out of the shower drains, and empty the slightly more palatable hair out of the sink traps. We do this with aplomb and a liquid efficiency. Yet somehow, when we have to clean the toilets, we always find ourselves staring down at the bowl and sighing. A weeklong trip filled with deep-fried shore lunches—beer-battered onion rings and fresh walleye fillets destroyed by a gallon of canola oil—does funny things to a man’s insides. Nine weeks of cleaning poop-covered toilets in the remoteness of the Canadian Shield wilderness is likely to do funny things to a woman, too.

by Anna Maxymiw, Hazlitt |  Read more:
Image: Vicki Nerino

Monday, April 13, 2015

Eating a Big Mac at the Arctic Circle


In January of 1989, the temperature got down to sixty degrees below zero in Fairbanks, Alaska, and stayed there for three weeks. Furnaces burned through heating oil at a serious rate, and parking lots slowly filled with cars that wouldn't start anymore, even with the engine block heaters that everyone up there has. Every year in the dead of winter, usually the Japanese imports were the last cars left on the road, but in 1989, even some of the Toyotas had given up.

In that kind of weather, automatic doors would freeze open or closed, so they'd have to be disabled. The people working the drive-thru window at McDonald's wore their parkas while they stood at their posts, because it was impossible to stay warm with the cold air blasting in with every transaction. And there were lots of them: in the winter of '89, almost no one actually got out of their cars and walked anywhere if they didn't have to, including me and my friend Lori, whose Datsun 200sx held up nicely during that particularly long cold snap.

Lori and I were in high school, and we thought that it would be hilariously funny to go through the drive-thru at McDonald's and order ice cream. We got a lot of mileage out of it: our satisfied giggling when the person taking the order paused and said, "You want what?" and then the spectacle we made of ourselves back at school after racing through the ice fog (when water particles in the air literally freeze solid, thick as any fog that rolls off the ocean). We danced around amidst the cacophony of slamming lockers and yelled conversations, waving the tall soft-serve cones over our heads, the hallways glowing orange in the early-afternoon setting sun, so close to the Arctic Circle.

Besides that visit for ice cream, I didn't otherwise go to McDonald's. Its parking lot was where kids met up on weekend nights to figure out where the party was ("party" meaning, usually, a pallet fire, a keg of the cheapest beer possible, and Def Leppard blasting from a boom box). The restaurant was, to me, tainted by its association with people who didn't have anywhere else to go. At 18 years old I was tired of the scene, eager to get out and away from what I saw as small town small-mindedness. I was ready for more.

by Elisabeth Fairfield Stokes, Eater | Read more:
Image: Nick Mealey

[ed. World, meet Calvin]
photo: Nate

Saturday, April 11, 2015

A War Well Lost

Johann Hari is a British journalist who has written for many of the world’s leading newspapers and magazines, including The New York Times, Le Monde, The Guardian, The Los Angeles Times, The New Republic, The Nation, Slate, El Mundo, and The Sydney Morning Herald. He was an op-ed columnist for The Independent for nine years. He graduated from King’s College, Cambridge with a double first in social and political sciences in 2001.

Hari was twice named “National Newspaper Journalist of the Year” by Amnesty International. He was named “Environmental Commentator of the Year” at the Editorial Intelligence Awards, and “Gay Journalist of the Year” at the Stonewall Awards. He has also won the Martha Gellhorn Prize for political writing.

Hari’s latest book is the New York Times best seller Chasing the Scream: The First and Last Days of the War on Drugs. You can follow him on Twitter @johannhari101

S. Harris: Thanks for taking the time to speak with me, Johann. You’ve written a wonderful book about the war on drugs—about its history and injustice—and I hope everyone will read it. The practice of making certain psychoactive compounds illegal raises some deep and difficult questions about how to create societies worth living in. I strongly suspect that you and I will agree about the ethics here: The drug war has been a travesty and a tragedy. But you’re much more knowledgeable about the history of this war, so I’d like to ask you a few questions before we begin staking out common ground.

The drug war started almost exactly 100 years ago. That means our great-grandparents could wander into any pharmacy and buy cocaine or heroin. Why did the drug war begin, and who started it?

J. Hari: It’s really fascinating, because when I realized we were coming up to this centenary, I thought of myself as someone who knew a good deal about the drug war. I’d written about it quite a lot, as you know, and I had drug addiction in my family. One of my earliest memories is of trying to wake up one of my relatives and not being able to.

And yet I just realized there were many basic questions I didn’t know the answer to, including exactly the one you’re asking: Why were drugs banned 100 years ago? Why do we continue banning them? What are the actual alternatives in practice? And what really causes drug use and drug addiction?

To find the answers, I went on this long journey—across nine countries, 30,000 miles—and I learned that almost everything I thought at the start was basically wrong. Drugs aren’t what we think they are. The drug war isn’t what we think it is. Drug addiction isn’t what we think it is. And the alternatives aren’t what we think they are.

If you had said to me, “Why were drugs banned?” I would have guessed that most people, if you stopped them in the street, would say, “We don’t want people to become addicted, we don’t want kids to use drugs,” that kind of thing.

What is fascinating when you go back and read the archives from the time is that that stuff barely comes up. Drugs were banned in the United States a century ago for a very different reason. They were banned in the middle of a huge race panic. (...)

S. Harris: We’ll talk about the phenomenon of addiction, and discuss the novel understanding of it you arrive at in the book. But first I think we should acknowledge that drugs and alcohol can cause social harms that every society has an interest in preventing. It’s not hard to see why some people think that the appropriate response to the chaos these substances often cause is to prohibit them.

Consider alcohol. We know, of course, that Prohibition was a disaster. But when you consider what cities were like before the Women’s Christian Temperance Union got working—with men abandoning their jobs and families, spending all day in saloons, and winding up just hammered in the gutter—it’s not hard to see what people were worried about. Ken Burns’s documentary on Prohibition explains this history in a very colorful way. As you and I race to the conclusion that prohibition of all sorts is both unethical and doomed to fail, I think we should acknowledge that many drugs, alcohol included, have the potential to ruin people’s lives.

And it wasn’t completely crazy to think that banning the use of specific drugs might be a good way, ethically and practically, to mitigate their harms. But ever since Prohibition we’ve known that the cure is worse than the disease. When you ban substances that people enjoy using so much that they’ll break the law to do it, you create a black market with huge profits. And since purveyors of illicit drugs have no legal way to secure their investment, the trade will be run by increasingly violent criminals.

In a single stroke, therefore, prohibition creates organized crime and all the social ills attributable to the skyrocketing cost of drugs—addicts are forced to become thieves and prostitutes in order to afford their next fix. Why isn’t the stupidity of prohibition now obvious to everyone?

J. Hari: What’s fascinating is that it was obvious at the time. The drug war really began in the 1930s, when Harry Anslinger was the first person to use the phrase “warfare against drugs”—and it was massively resisted across the United States and across the world. This is a forgotten and suppressed history, and I was startled to uncover it.

I tell it mainly through the story of this extraordinary doctor, Henry Smith Williams, who at the birth of the drug war prophesied all of it. It’s worth remembering that when drugs were first banned, doctors resisted to such a degree that 17,000 of them had to be rounded up and arrested because they insisted on continuing to prescribe to drug addicts. The mayor of Los Angeles stood outside a heroin-prescribing clinic and said, effectively, “You will not close this down. It does a good job for the people of Los Angeles.” The early drug war was hugely contested, and many people rightly pointed out why it wouldn’t work. This is a really important thing to remember. And one of the most fascinating things for me was seeing how much the arguments both at the beginning of the drug war and in societies that have finally ended it have echoed each other. (...)

S. Harris: This brings us to the topic of addiction. Is addiction an easily defined physiological state that is purely a matter of which substance a person takes and how regularly he takes it? Or is it largely the product of external variables? In your book, you make the latter case. And I think most people would be surprised to learn that in a context where drug use is more normalized, a heroin addict, for instance, can be a fully productive member of society. There’s nothing about regularly taking heroin that by definition renders a person unable to function. So let’s talk a bit about what addiction is and the various ways it changes with its social context.

J. Hari: This is the thing that most surprised me in the research for the book. I thought I knew quite a lot about addiction, not least because I’ve had it in my life since I was a child, with my relatives. But if you had said to me four years ago, “What causes, say, heroin addiction?” I would have looked at you as if you were a bit simpleminded, and I would have said, “Heroin causes heroin addiction.”

For 100 years we’ve been told a story about addiction that’s just become part of our common sense. It’s obvious to us. We think that if you, I, and the first 20 people to read this on your site all used heroin together for 20 days, on day 21 we would be heroin addicts, because there are chemical hooks in heroin that our bodies would start to physically need, and that’s what addiction is. (...)

I didn’t know until I went and interviewed Bruce Alexander, who’s a professor in Vancouver and, I think, one of the most important figures in addiction studies in the world today. He explained to me that our idea of addiction comes in part from a series of experiments that were done earlier in the 20th century. They’re really simple experiments, and your readers can do them at home if they’re feeling a bit sadistic. You get a rat, you put it in a cage, and you give it two water bottles: One is water, and the other is water laced with heroin or cocaine. The rat will almost always prefer the drugged water and will almost always kill itself. So there you go. That’s our theory of addiction. You might remember the famous Partnership for a Drug-Free America ad from the 1980s that depicted this.

But in the 1970s, Bruce Alexander came along and thought, “Hang on a minute. We’re putting the rat in an empty cage. It’s got nothing to do except use these drugs. Let’s try this differently.”

So he built a very different cage and called it Rat Park. Rat Park was like heaven for rats. They had everything a rat could possibly want: lovely food, colored balls, tunnels, loads of friends. They could have loads of sex. And they had both the water bottles—the normal water and the drugged water. What’s fascinating is that in Rat Park they didn’t like the drugged water. They hardly ever drank it. None of them ever drank it in a way that looked compulsive. None of them ever overdosed.

An interesting human example of this was happening at the same time; I’ll talk about it in a second. What Bruce says is that this shows that both the right-wing and left-wing theories of addiction are flawed. The right-wing theory is that it’s a moral failing—you’re a hedonist, you indulge yourself, all of that. The left-wing theory is that your brain gets hijacked, you get taken over, and you become a slave.

Bruce says it’s not your morality and it’s not your brain. To a much larger degree than we’ve ever before appreciated, it’s your cage. Addiction is an adaptation to your environment.

by Sam Harris |  Read more:
Image: Pete Zarria

Friday, April 10, 2015

Barely Keeping Up in TV’s New Golden Age

[ed. See also: Myths of the Golden Age]

Not long ago, a friend at work told me I absolutely, positively must watch “Broad City” on Comedy Central, describing it as slacker-infused hilarity.

My reaction? Oh no, not another one.

The vast wasteland of television has been replaced by an excess of excellence that is fundamentally altering my media diet and threatening to consume my waking life in the process. I am not alone. Even as alternatives proliferate and people cut the cord, they are continuing to spend ever more time in front of the TV without a trace of embarrassment.

I was never one of those snobby people who would claim to not own a television when the subject came up, but I was generally more a reader than a watcher. That was before the explosion in quality television tipped me over into a viewing frenzy.

Something tangible, and technical, is at work. The addition of ancillary devices onto what had been a dumb box has made us the programming masters of our own universes. Including the cable box — with its video on demand and digital video recorder — and Apple TV, Chromecast, PlayStation, Roku, Wii and Xbox, that universe is constantly expanding. Time-shifting allows not just greater flexibility, but increased consumption. According to Nielsen, Americans watched almost 15 hours of time-shifted television a month in 2013, two more hours a month than the year before.

And what a feast. Right now, I am on the second episode of Season 2 of “House of Cards” (Netflix), have caught up on “Girls” (HBO) and am reveling in every episode of “Justified” (FX). I may be a little behind on “The Walking Dead” (AMC) and “Nashville” (ABC) and have just started “The Americans” (FX), but I am pretty much in step with comedies like “Modern Family” (ABC) and “Archer” (FX) and, like everyone else I know, dying to see how “True Detective” (HBO) ends. Oh, and the fourth season of “Game of Thrones” (HBO) starts next month.

Whew. Never mind being able to hold all these serials simultaneously in my head, how can there possibly be room for anything else? So far, the biggest losers in this fight for mind share are not my employer or loved ones, but other forms of media.

My once beloved magazines sit in a forlorn pile, patiently waiting for their turn in front of my eyes. Television now meets many of the needs that pile previously satisfied. I have yet to read the big heave on Amazon in The New Yorker, or the feature on the pathology of contemporary fraternities in the March issue of The Atlantic, and while I have an unhealthy love of street food, I haven’t cracked the spine on Lucky Peach’s survey of the same. Ditto for what looks like an amazing first-person account in Mother Jones from the young Americans who were kidnapped in Iran in 2009. I am a huge fan of the resurgent trade magazines like Adweek and The Hollywood Reporter, but watching the products they describe usually wins out over reading about them.

Magazines in general had a tough year, with newsstand sales down over 11 percent, according to John Harrington, an industry analyst who tracks circulation.

And then there are books. I have a hierarchy: books I’d like to read, books I should read, books I should read by friends of mine and books I should read by friends of mine whom I am likely to bump into. They all remain on standby. That tablets now contain all manner of brilliant stories that happen to be told in video, not print, may be partly why e-book sales leveled out last year. After a day of online reading that has me bathed in the information stream, when I have a little me-time, I mostly want to hit a few buttons on one of my three remotes — cable, Apple, Roku — and watch the splendors unfurl.

by David Carr, NY Times | Read more:
Image: Nathaniel Bell for Netflix

Return of the King

Mad Men still has a half-season to go, but Don Draper’s obituary has already been written. We don’t know exactly how it will end for Don, but the critical consensus is that his fate is sealed: for the past seven years, we’ve watched him follow the same downward trajectory his silhouetted likeness traces in the opening credits, so that all that’s left is for him to land. In a piece lamenting the “death of adulthood in American culture,” A. O. Scott says that Mad Men is one of several recent pop cultural narratives — among them The Sopranos and Breaking Bad — that chart the “final, exhausted collapse” of white men and their regimes, but I’m not convinced. Don has a way of bouncing back. Where one episode opens with him on an examination table, lying to his doctor about how much he drinks and smokes as if his bloodshot eyes and smoker’s cough didn’t give him away (even bets on cirrhosis and emphysema), another finds him swimming laps, cutting down on his drinking, and keeping a journal in an effort to “gain a modicum of control.” Over the course of the past six and a half seasons, Don has been on the brink of personal and professional destruction too many times to count, and yet when we last saw him at the conclusion of “Waterloo,” the final episode of the last half-season, which aired last May, he was fresh-faced and back on top. The truth is that Mad Men has something far more unsettling (and historically accurate) to tell us about the way that white male power works to protect its own interests, precisely by staging and restaging its own death.

In fact, a closer look at “Waterloo” in particular makes clear that the show does not chronicle the last gasp of the white male, as Scott would have it, but outlines the way that a wily old guard has followed the advice of E. Digby Baltzell (who coined the acronym WASP in 1964) by “absorbing talented and distinguished members of minority groups into its privileged ranks” in order to maintain its grip on power. After several episodes of unrelenting humiliation for Don, this installment was so thoroughly upbeat that it had critics wondering just whose Waterloo it was, anyway. Unlike Napoleon, Don doesn’t defiantly march into a futile, fatal battle to save his job, but instead surprises everyone by stepping graciously aside, handing a big pitch for Burger Chef to his protégé, Peggy Olson. (...)

It’s tempting to read both the ad and Peggy’s triumphant performance as harbingers of our own more enlightened, inclusive era, where women and people of color have a seat and a voice at the clean well-lit table that Peggy describes. There are plenty of indications that we are witnessing the small steps that will ultimately amount to real progress (not least of which, the moon landing that provides the episode’s symbolic framework). Remember at the beginning of this season (in “A Day’s Work”), when senior partner Bertram Cooper, a member of the old guard if ever there was one, insists that a black secretary be moved from her post as receptionist at the front of the office? (“I’m all for the national advancement of colored people,” he says, “but I don’t believe people should advance all the way to the front.”) Now Joan Holloway obliges by promoting her to office manager, and it is she — her name is Dawn, naturally — who is not just front but center at the end of “Waterloo” when she calls to order the meeting at which Cooper’s death and a fresh start for the agency are announced.

But as exhilarating as it is to watch Peggy nail the presentation, and to watch Dawn command the room if just for a moment, the big winner in this episode is the status quo, which puts a new face on the same old model. Peggy’s pitch for Burger Chef promises that everyone will get a seat at the table, but if we’ve learned anything over the course of six and a half seasons, it’s that it is actually an invitation-only affair for an exceptional few. Yes, Mad Men narrates the crisis of white masculinity, but as this episode makes clear, that crisis is not about who gets a piece of pie, but about who controls the pie; as Bert tautologically instructs his younger partner Roger Sterling, “Whoever is in control is in charge.”

by Kathy Knapp, LA Review of Books |  Read more:
Image: via:

International Louie Louie Day


Louie Louie was written by R&B singer Richard Berry in 1955. His band, “The Pharaohs”, recorded and released it in 1957. It got some airplay on the band’s home turf around San Francisco, and became popular in the pacific northwest. It was covered by other garage bands and became a somewhat popular party tune in the western states.

In Berry's original recording the lyric is quite clear: It's a song about a sailor who spends three days traveling to Jamaica to see his girl. The story is told to a bartender named Louie. Nothing even remotely obscene in that original version.

The version we all know and love was recorded by the Kingsmen on April 6, 1963, in Portland, Oregon. The cover was not of the original Richard Berry recording, but of a later version by Robin Roberts with his backing band, "The Wailers." The Robin Roberts version was released in 1961 and became a local hit in Tacoma, Washington.

For reasons lost in the mists of time, the Kingsmen’s recording session cost $50, and consisted of a single take. Legend suggests they thought that take was a rehearsal, or maybe a demo tape.

A different version of Louie Louie was also recorded the same week, in the same recording studio, by Paul Revere and the Raiders. The Raiders version is considered much better musically, but the Kingsmen’s version got all the glory.

The Kingsmen’s lead singer on Louie Louie was Jack Ely, whose birthday is April 11. That date became the basis for the widely celebrated “International Louie Louie Day.” It was the only time Ely recorded with the Kingsmen as lead vocalist. He left the band shortly after to return to school, or over a dispute about who was to be lead vocalist. Accounts vary. When the song became popular the band refused to take him back. The TV and concert performances the Kingsmen did during the tune’s most popular years were lip synced.

by Gene Baucom, Medium | Read more:
Video: YouTube

Adrian Tomine, Jennifer Davis
via:

Pàtric Marín, Wild in the City
via:

What the Deer Are Telling Us

In 1909, a United States Forest Service officer named Aldo Leopold shot a mother wolf from a perch of rimrock in the Apache National Forest in Arizona. It was a revelatory moment in the life of the young naturalist. “In those days we never heard of passing up a chance to kill a wolf,” Leopold wrote in an essay called “Thinking Like a Mountain,” later included in his Sand County Almanac, which was published after his death in 1948 and went on to sell several million copies. “We reached the old wolf in time to watch a fierce green fire dying in her eyes. I realized then, and have known ever since, that there was something new to me in those eyes—something known only to her and to the mountain.”

Leopold, who today is revered among ecologists, was among the earliest observers of the impact of wolves on deer abundance, and of the impact of too many deer on plant life. In “Thinking Like a Mountain,” he outlined for the first time the basic theory of trophic cascades, which states that top-down predators determine the health of an ecosystem. The theory as presented by Leopold held that the extirpation of wolves and cougars in Arizona, and elsewhere in the West, would result in a booming deer population that would browse unsustainably in the forests of the high country. “I now suspect that just as a deer herd lives in mortal fear of its wolves,” Leopold wrote, “so does a mountain live in mortal fear of its deer.”

One of the areas where Leopold studied deer irruptions was the Kaibab Plateau near the Grand Canyon. By 1924, the deer population on the Kaibab had peaked at 100,000. Then it crashed. During 1924-26, 60 percent of the deer perished due to starvation. Leopold believed this pattern of deer exceeding the carrying capacity of the land would repeat across the U.S. wherever predators had been eliminated as a trophic force. By 1920, wolves and cougars were gone from the ecosystems east of the Mississippi—shot, trapped, poisoned, as human settlement fragmented their habitat— and they were headed toward extirpation in most parts of the American West. Within two generations, the hunting of deer had been heavily regulated, the calls from conservationists had been heeded for deer reintroduction throughout the eastern U.S., and swaths of state and federally managed forest had been protected from any kind of hunting.

Freed both of human and animal predation, however, deer did not follow the pattern predicted by Leopold. Instead of eating themselves out of house and home, they survived—they thrived—by altering their home range to their benefit. As recent studies have shown, certain kinds of grasses and sedges preferred by deer react to over-browsing the way the bluegrass on a suburban lawn reacts to a lawnmower. The grasses grow back faster and healthier, and provide more sustenance for more deer. In short, there has been enough food in our forests, mountains, and grasslands for white-tailed deer in the U.S. to reach unprecedented numbers, about 32 million, more than at any time since record-keeping began.

In 1968, Stanford biology professor Paul Ehrlich predicted that another widespread species would die out as a result of overpopulation. But he was spectacularly wrong. Like the deer, the steadily ingenious Homo sapiens altered its home range—most notably the arable land—to maximize its potential for survival. As Homo sapiens continues to thrive across the planet today, the species might take a moment to find its reflection in the rampant deer.

Conservation biologists who have followed the deer tend to make an unhappy assessment of its progress. They mutter dark thoughts about killing deer, and killing a lot of them. In fact, they already are. In 2011, in the name of conservation, the National Park Service and U.S. Department of Agriculture teamed up with hunters to “harvest” 3 million antlerless deer. I asked Thomas Rooney, one of the nation’s top deer irruption researchers, about the losses in forest ecosystems overrun by deer. “I’d say the word is ‘apocalypse,’ ” Rooney said.

On a warm fall day last year, I went to see Rooney, a professor of biology at Wright State University, in Dayton, Ohio. In his office, I noticed a well-thumbed copy of Ehrlich’s The Population Bomb, and I asked him if he thought a comparison might be drawn between human overpopulation and deer overpopulation. He looked at me as if the point was obvious. “Deer, like humans,” he said, “can come in and eliminate biodiversity, though not to their immediate detriment.” (...)

He told me about a study published last year in Conservation Biology that bemoaned “pandemic deer overabundance,” language suggesting the creature was a disease on the land. Ecosystem damage becomes apparent at roughly 15 deer per square mile, and the damage grows with density. Some areas of the northeast host as many as 100 deer per square mile. (The Wright State University reserve has a density of around 40 deer per square mile.) He noted a 2013 article co-authored by a group of Nature Conservancy scientists who warned that “no other threat to forested habitats is greater at this point in time—not lack of fire, not habitat conversion, not climate change.” (...)

I asked Rooney about the remarkable ability of deer to thrive in their home range—most of the U.S.—while producing ecosystem simplification and a biodiversity crash. In his own studies of deer habitats in Wisconsin, Rooney found that only a few types of grass thrive under a deer-dominant regime. The rest, amounting to around 80 percent of native Wisconsin plant species, had been eradicated. “The 80 percent represent the disappearance of 300 million years of evolutionary history,” he said. He looked deflated.

A turkey vulture pounded its wings through the canopy, and in the darkening sky a military cargo plane howled in descent toward nearby Wright-Patterson Air Force Base. Rooney and I emerged from the forest onto a campus parking lot where Homo sapiens held sway. The self-assured mammals crossed fields of exotic bluegrass under pruned hardwoods surrounded by a sea of concrete, tarmac, glass, and metal. There were no flowers except those managed in beds. There were no other animals to be seen except the occasional squirrel, and these were rat-like, worried, scurrying. The Homo sapiens got into cars that looked the same, on streets that looked the same, and they were headed to domiciles that looked more or less the same. This is home for us.

by Christopher Ketcham, Nautilus |  Read more:
Image: Chris Buzelli

The Wave That Swept the World

In the beginning was the wave. The blue and white tsunami, ascending from the left of the composition like a massive claw, descends pitilessly on Mount Fuji – the most august mountain in Japan, turned in Katsushika Hokusai’s vision into a small and vulnerable hillock. Under the Wave off Kanagawa, one of Hokusai’s Thirty-Six Views of Mount Fuji, has been an icon of Japan since the print was first struck in 1830–31, yet it forms part of a complex global network of art, commerce, and politics. Its intense blue comes from Hokusai’s pioneering use of Prussian Blue ink – a foreign pigment, imported, probably via China, from England or Germany. The wave, from the beginning, stretched beyond Japan. Soon, it would crash over Europe.

This week the Museum of Fine Arts in Boston, home to the greatest collection of Japanese art outside Japan, opens a giant retrospective of the art of Hokusai, showcasing his indispensable woodblock prints of the genre we call ukiyo-e, or ‘images of the floating world’. It’s the second Hokusai retrospective in under a year; last autumn, the wait to see the artist’s two-part mega-show at the Grand Palais in Paris stretched to two hours or more. American and French audiences adore Hokusai – and have for centuries. He is, after all, not only one of the great figures of Japanese art, but a father figure of much of Western modernism. Without Hokusai, there might have been no Impressionism – and the global art world we today take for granted might look very different indeed.

Fine print

Hokusai’s prints didn’t find their way to the West until after the artist’s death in 1849. During his lifetime Japan was still subject to sakoku, the longstanding policy that forbade foreigners from entering and Japanese from leaving, on penalty of death. But in the 1850s, with the arrival of the ‘black ships’ of the American navy under Matthew Perry, Japan gave up its isolationist policies – and officers and diplomats, then artists and collectors, discovered Japanese woodblock printing. In Japan, Hokusai was seen as vulgar, beneath the consideration of the imperial literati. In the West, his delineation of space with color and line, rather than via one-point perspective, would have revolutionary impact.

Both the style and the subject matter of ukiyo-e prints appealed to young artists like Félix Bracquemond, one of the first French artists to be seduced by Japan. Yet the Japanese prints traveling to the West in the first years after Perry were contemporary artworks, rather than the slightly earlier masterpieces of Hokusai, Hiroshige, and Utamaro. Many of the prints that arrived were used as wrapping paper for commercial goods. Everything changed on 1 April 1867, when the Exposition Universelle opened on the Champ de Mars, the massive Paris marching grounds that now lie in the shadow of the Eiffel Tower. It featured, for the first time, a Japanese pavilion – and its showcase of ukiyo-e prints revealed the depth of Japanese printmaking to French artists for the first time.

by Jason Farago, BBC |  Read more:
Image: Katsushika Hokusai 

Thursday, April 9, 2015

Just Don't Call It a Panama Hat

There are many types of Panama hats but they all have one thing in common: they’re made in Ecuador. Some say it was the Americans who came up with the misleading name, after they saw photographs of Theodore Roosevelt wearing one as he inspected the construction of the Panama Canal. Legend has it that the hat was actually a loan from Eloy Alfaro, the president of Ecuador and hero of the revolution of 1895. Others say the hats were named after the Isthmus of Panama, the point from which they have historically been exported to the rest of the world.

Yet the misnomer didn’t prevent the famous straw hat, more correctly referred to as a Montecristi hat, from being designated by UNESCO as Intangible Cultural Heritage in 2012; Ecuador has produced them since the early 17th century. It takes three months to make a superfino Montecristi hat (the best grade there is), and weavers can only work in the early and late hours of the day because the straw breaks when it’s exposed to high temperatures. According to tradition, hats are cleaned, finished and sold in the town of Montecristi, the Panama hat’s spiritual home in the province of Manabi.

In the small and remote village of Pile nearby, the craft is passed down through families. Manuel Lopez, 41, learned to weave with his father at the age of eight. He says he teaches his own children now, though making a Montecristi hat is becoming a lost art. A weaver makes only $700 to $1,200 for a superfino hat, which can fetch $25,000 abroad. And now that China has become the world’s top producer of straw hats (which they actually make from paper), Ecuador’s hat makers are unable to cope with the decline in price and demand. With most young people looking for more lucrative opportunities elsewhere, experts say the last traditionally made Montecristi hat will be woven within the next 15 years.

by Eduardo Leal, Roads & Kingdoms |  Read more:
Image: Eduardo Leal