Saturday, April 11, 2015
A War Well Lost
Johann Hari is a British journalist who has written for many of the world’s leading newspapers and magazines, including The New York Times, Le Monde, The Guardian, The Los Angeles Times, The New Republic, The Nation, Slate, El Mundo, and The Sydney Morning Herald. He was an op-ed columnist for The Independent for nine years. He graduated from King’s College, Cambridge with a double first in social and political sciences in 2001.
Hari was twice named “National Newspaper Journalist of the Year” by Amnesty International. He was named “Environmental Commentator of the Year” at the Editorial Intelligence Awards, and “Gay Journalist of the Year” at the Stonewall Awards. He has also won the Martha Gellhorn Prize for political writing.
Hari’s latest book is the New York Times best seller Chasing the Scream: The First and Last Days of the War on Drugs. You can follow him on Twitter @johannhari101
S. Harris: Thanks for taking the time to speak with me, Johann. You’ve written a wonderful book about the war on drugs—about its history and injustice—and I hope everyone will read it. The practice of making certain psychoactive compounds illegal raises some deep and difficult questions about how to create societies worth living in. I strongly suspect that you and I will agree about the ethics here: The drug war has been a travesty and a tragedy. But you’re much more knowledgeable about the history of this war, so I’d like to ask you a few questions before we begin staking out common ground.
The drug war started almost exactly 100 years ago. That means our great-grandparents could wander into any pharmacy and buy cocaine or heroin. Why did the drug war begin, and who started it?
J. Hari: It’s really fascinating, because when I realized we were coming up to this centenary, I thought of myself as someone who knew a good deal about the drug war. I’d written about it quite a lot, as you know, and I had drug addiction in my family. One of my earliest memories is of trying to wake up one of my relatives and not being able to.
And yet I just realized there were many basic questions I didn’t know the answer to, including exactly the one you’re asking: Why were drugs banned 100 years ago? Why do we continue banning them? What are the actual alternatives in practice? And what really causes drug use and drug addiction?
To find the answers, I went on this long journey—across nine countries, 30,000 miles—and I learned that almost everything I thought at the start was basically wrong. Drugs aren’t what we think they are. The drug war isn’t what we think it is. Drug addiction isn’t what we think it is. And the alternatives aren’t what we think they are.
If you had said to me, “Why were drugs banned?” I would have guessed that most people, if you stopped them in the street, would say, “We don’t want people to become addicted, we don’t want kids to use drugs,” that kind of thing.
What is fascinating when you go back and read the archives from the time is that that stuff barely comes up. Drugs were banned in the United States a century ago for a very different reason. They were banned in the middle of a huge race panic. (...)
S. Harris: We’ll talk about the phenomenon of addiction, and discuss the novel understanding of it you arrive at in the book. But first I think we should acknowledge that drugs and alcohol can cause social harms that every society has an interest in preventing. It’s not hard to see why some people think that the appropriate response to the chaos these substances often cause is to prohibit them.
Consider alcohol. We know, of course, that Prohibition was a disaster. But when you consider what cities were like before the Women’s Christian Temperance Union got working—with men abandoning their jobs and families, spending all day in saloons, and winding up just hammered in the gutter—it’s not hard to see what people were worried about. Ken Burns’s documentary on Prohibition explains this history in a very colorful way. As you and I race to the conclusion that prohibition of all sorts is both unethical and doomed to fail, I think we should acknowledge that many drugs, alcohol included, have the potential to ruin people’s lives.
And it wasn’t completely crazy to think that banning the use of specific drugs might be a good way, ethically and practically, to mitigate their harms. But ever since Prohibition we’ve known that the cure is worse than the disease. When you ban substances that people enjoy using so much that they’ll break the law to do it, you create a black market with huge profits. And since purveyors of illicit drugs have no legal way to secure their investment, the trade will be run by increasingly violent criminals.
In a single stroke, therefore, prohibition creates organized crime and all the social ills attributable to the skyrocketing cost of drugs—addicts are forced to become thieves and prostitutes in order to afford their next fix. Why isn’t the stupidity of prohibition now obvious to everyone?
J. Hari: What’s fascinating is that it was obvious at the time. The drug war really began in the 1930s, when Harry Anslinger was the first person to use the phrase “warfare against drugs”—and it was massively resisted across the United States and across the world. This is a forgotten and suppressed history, and I was startled to uncover it.
I tell it mainly through the story of this extraordinary doctor, Henry Smith Williams, who at the birth of the drug war prophesied all of it. It’s worth remembering that when drugs were first banned, doctors resisted to such a degree that 17,000 of them had to be rounded up and arrested because they insisted on continuing to prescribe to drug addicts. The mayor of Los Angeles stood outside a heroin-prescribing clinic and said, effectively, “You will not close this down. It does a good job for the people of Los Angeles.” The early drug war was hugely contested, and many people rightly pointed out why it wouldn’t work. This is a really important thing to remember. And one of the most fascinating things for me was seeing how much the arguments at the beginning of the drug war and in societies that have finally ended it have echoed each other. (...)
S. Harris: This brings us to the topic of addiction. Is addiction an easily defined physiological state that is purely a matter of which substance a person takes and how regularly he takes it? Or is it largely the product of external variables? In your book, you make the latter case. And I think most people would be surprised to learn that in a context where drug use is more normalized, a heroin addict, for instance, can be a fully productive member of society. There’s nothing about regularly taking heroin that by definition renders a person unable to function. So let’s talk a bit about what addiction is and the various ways it changes with its social context.
J. Hari: This is the thing that most surprised me in the research for the book. I thought I knew quite a lot about addiction, not least because I’ve had it in my life since I was a child, with my relatives. But if you had said to me four years ago, “What causes, say, heroin addiction?” I would have looked at you as if you were a bit simpleminded, and I would have said, “Heroin causes heroin addiction.”
For 100 years we’ve been told a story about addiction that’s just become part of our common sense. It’s obvious to us. We think that if you, I, and the first 20 people to read this on your site all used heroin together for 20 days, on day 21 we would be heroin addicts, because there are chemical hooks in heroin that our bodies would start to physically need, and that’s what addiction is. (...)
I didn’t know until I went and interviewed Bruce Alexander, who’s a professor in Vancouver and, I think, one of the most important figures in addiction studies in the world today. He explained to me that our idea of addiction comes in part from a series of experiments that were done earlier in the 20th century. They’re really simple experiments, and your readers can do them at home if they’re feeling a bit sadistic. You get a rat, you put it in a cage, and you give it two water bottles: One is water, and the other is water laced with heroin or cocaine. The rat will almost always prefer the drugged water and will almost always kill itself. So there you go. That’s our theory of addiction. You might remember the famous Partnership for a Drug-Free America ad from the 1980s that depicted this.
But in the 1970s, Bruce Alexander came along and thought, “Hang on a minute. We’re putting the rat in an empty cage. It’s got nothing to do except use these drugs. Let’s try this differently.”
So he built a very different cage and called it Rat Park. Rat Park was like heaven for rats. They had everything a rat could possibly want: lovely food, colored balls, tunnels, loads of friends. They could have loads of sex. And they had both the water bottles—the normal water and the drugged water. What’s fascinating is that in Rat Park they didn’t like the drugged water. They hardly ever drank it. None of them ever drank it in a way that looked compulsive. None of them ever overdosed.
An interesting human example of this was happening at the same time; I’ll talk about it in a second. What Bruce says is that this shows that both the right-wing and left-wing theories of addiction are flawed. The right-wing theory is that it’s a moral failing—you’re a hedonist, you indulge yourself, all of that. The left-wing theory is that your brain gets hijacked, you get taken over, and you become a slave.
Bruce says it’s not your morality and it’s not your brain. To a much larger degree than we’ve ever before appreciated, it’s your cage. Addiction is an adaptation to your environment.
by Sam Harris | Read more:
Image: Pete Zarria
Friday, April 10, 2015
Barely Keeping Up in TV’s New Golden Age
[ed. See also: Myths of the Golden Age]
Not long ago, a friend at work told me I absolutely, positively must watch “Broad City” on Comedy Central, saying it was slacker-infused hilarity.
My reaction? Oh no, not another one.
The vast wasteland of television has been replaced by an excess of excellence that is fundamentally altering my media diet and threatening to consume my waking life in the process. I am not alone. Even as alternatives proliferate and people cut the cord, they are continuing to spend ever more time in front of the TV without a trace of embarrassment.
I was never one of those snobby people who would claim to not own a television when the subject came up, but I was generally more a reader than a watcher. That was before the explosion in quality television tipped me over into a viewing frenzy.
Something tangible, and technical, is at work. The addition of ancillary devices onto what had been a dumb box has made us the programming masters of our own universes. Including the cable box — with its video on demand and digital video recorder — and Apple TV, Chromecast, PlayStation, Roku, Wii and Xbox, that universe is constantly expanding. Time-shifting allows not just greater flexibility, but increased consumption. According to Nielsen, Americans watched almost 15 hours of time-shifted television a month in 2013, two more hours a month than the year before.
And what a feast. Right now, I am on the second episode of Season 2 of “House of Cards” (Netflix), have caught up on “Girls” (HBO) and am reveling in every episode of “Justified” (FX). I may be a little behind on “The Walking Dead” (AMC) and “Nashville” (ABC) and have just started “The Americans” (FX), but I am pretty much in step with comedies like “Modern Family” (ABC) and “Archer” (FX) and, like everyone else I know, dying to see how “True Detective” (HBO) ends. Oh, and the fourth season of “Game of Thrones” (HBO) starts next month.
Whew. Never mind being able to hold all these serials simultaneously in my head, how can there possibly be room for anything else? So far, the biggest losers in this fight for mind share are not my employer or loved ones, but other forms of media.
My once beloved magazines sit in a forlorn pile, patiently waiting for their turn in front of my eyes. Television now meets many of the needs that pile previously satisfied. I have yet to read the big heave on Amazon in The New Yorker, or the feature on the pathology of contemporary fraternities in the March issue of The Atlantic, and while I have an unhealthy love of street food, I haven’t cracked the spine on Lucky Peach’s survey of the same. Ditto for what looks like an amazing first-person account in Mother Jones from the young Americans who were kidnapped in Iran in 2009. I am a huge fan of the resurgent trade magazines like Adweek and The Hollywood Reporter, but watching the products they describe usually wins out over reading about them.
Magazines in general had a tough year, with newsstand sales down over 11 percent, according to John Harrington, an industry analyst who tracks circulation.
And then there are books. I have a hierarchy: books I’d like to read, books I should read, books I should read by friends of mine and books I should read by friends of mine whom I am likely to bump into. They all remain on standby. That tablets now contain all manner of brilliant stories that happen to be told in video, not print, may be partly why e-book sales leveled out last year. After a day of online reading that has me bathed in the information stream, when I have a little me-time, I mostly want to hit a few buttons on one of my three remotes — cable, Apple, Roku — and watch the splendors unfurl.
by David Carr, NY Times | Read more:
Image: Nathaniel Bell for Netflix
Return of the King
Mad Men still has a half-season to go, but Don Draper’s obituary has already been written. We don’t know exactly how it will end for Don, but the critical consensus is that his fate is sealed: for the past seven years, we’ve watched him follow the same downward trajectory his silhouetted likeness traces in the opening credits, so that all that’s left is for him to land. In a piece lamenting the “death of adulthood in American culture,” A. O. Scott says that Mad Men is one of several recent pop cultural narratives — among them The Sopranos and Breaking Bad — that chart the “final, exhausted collapse” of white men and their regimes, but I’m not convinced. Don has a way of bouncing back. Where one episode opens with him on an examination table, lying to his doctor about how much he drinks and smokes as if his bloodshot eyes and smoker’s cough didn’t give him away (even bets on cirrhosis and emphysema), another finds him swimming laps, cutting down on his drinking, and keeping a journal in an effort to “gain a modicum of control.” Over the course of the past six and a half seasons, Don has been on the brink of personal and professional destruction too many times to count, and yet when we last saw him at the conclusion of “Waterloo,” the final episode of the last half-season, which aired last May, he was fresh-faced and back on top. The truth is that Mad Men has something far more unsettling (and historically accurate) to tell us about the way that white male power works to protect its own interests, precisely by staging and restaging its own death.
In fact, a closer look at “Waterloo” in particular makes clear that the show does not chronicle the last gasp of the white male, as Scott would have it, but outlines the way that a wily old guard has followed the advice of E. Digby Baltzell (who coined the acronym WASP in 1964) by “absorbing talented and distinguished members of minority groups into its privileged ranks” in order to maintain its grip on power. After several episodes of unrelenting humiliation for Don, this installment was so thoroughly upbeat that it had critics wondering just whose Waterloo it was, anyway. Unlike Napoleon, Don doesn’t defiantly march into a futile, fatal battle to save his job, but instead surprises everyone by stepping graciously aside, handing a big pitch for Burger Chef to his protégé, Peggy Olson. (...)
It’s tempting to read both the ad and Peggy’s triumphant performance as harbingers of our own more enlightened, inclusive era, where women and people of color have a seat and a voice at the clean well-lit table that Peggy describes. There are plenty of indications that we are witnessing the small steps that will ultimately amount to real progress (not least of which, the moon landing that provides the episode’s symbolic framework). Remember at the beginning of this season (in “A Day’s Work”), when senior partner Bertram Cooper, a member of the old guard if ever there was one, insists that a black secretary be moved from her post as receptionist at the front of the office? (“I’m all for the national advancement of colored people,” he says, “but I don’t believe people should advance all the way to the front.”) Now Joan Holloway obliges by promoting her to office manager, and it is she — her name is Dawn, naturally — who is not just front but center at the end of “Waterloo” when she calls to order the meeting at which Cooper’s death and a fresh start for the agency are announced.
But as exhilarating as it is to watch Peggy nail the presentation, and to watch Dawn command the room if just for a moment, the big winner in this episode is the status quo, which puts a new face on the same old model. Peggy’s pitch for Burger Chef promises that everyone will get a seat at the table, but if we’ve learned anything over the course of six and a half seasons, it’s that it is actually an invitation-only affair for an exceptional few. Yes, Mad Men narrates the crisis of white masculinity, but as this episode makes clear, that crisis is not about who gets a piece of pie, but about who controls the pie; as Bert tautologically instructs his younger partner Roger Sterling, “Whoever is in control is in charge.”
by Kathy Knapp, LA Review of Books | Read more:
Image: via:
International Louie Louie Day
Louie Louie was written by R&B singer Richard Berry in 1955. His band, “The Pharaohs,” recorded and released it in 1957. It got some airplay on the band’s home turf around San Francisco, and became popular in the Pacific Northwest. It was covered by other garage bands and became a somewhat popular party tune in the western states.
In Berry’s original recording the lyric is quite clear: It’s a song about a sailor who spends three days traveling to Jamaica to see his girl. The story is told to a bartender named Louie. Nothing even remotely obscene in that original version.
The version we all know and love was recorded by the Kingsmen on April 6, 1963, in Portland, Oregon. The cover was not of the original Richard Berry recording, but of a later version by Robin Roberts with his backing band “The Wailers.” The Robin Roberts version was released in 1961 and became a local hit in Tacoma, Washington.
For reasons lost in the mists of time, the Kingsmen’s recording session cost $50, and consisted of a single take. Legend suggests they thought that take was a rehearsal, or maybe a demo tape.
A different version of Louie Louie was also recorded the same week, in the same recording studio, by Paul Revere and the Raiders. The Raiders version is considered much better musically, but the Kingsmen’s version got all the glory.
The Kingsmen’s lead singer on Louie Louie was Jack Ely, whose birthday is April 11. That date became the basis for the widely celebrated “International Louie Louie Day.” It was the only time Ely recorded with the Kingsmen as lead vocalist. He left the band shortly after, either to return to school or over a dispute about who was to be lead vocalist. Accounts vary. When the song became popular, the band refused to take him back. The TV and concert performances the Kingsmen did during the tune’s most popular years were lip-synced.
by Gene Baucom, Medium | Read more:
Video: YouTube
What the Deer Are Telling Us
In 1909, a United States Forest Service officer named Aldo Leopold shot a mother wolf from a perch of rimrock in the Apache National Forest in Arizona. It was a revelatory moment in the life of the young naturalist. “In those days we never heard of passing up a chance to kill a wolf,” Leopold wrote in an essay called “Thinking Like a Mountain,” later included in his Sand County Almanac, published posthumously in 1949, a year after his death, and which went on to sell several million copies. “We reached the old wolf in time to watch a fierce green fire dying in her eyes. I realized then, and have known ever since, that there was something new to me in those eyes—something known only to her and to the mountain.”
Leopold, who today is revered among ecologists, was among the earliest observers of the impact of wolves on deer abundance, and of the impact of too many deer on plant life. In “Thinking Like a Mountain,” he outlined for the first time the basic theory of trophic cascades, which states that top-down predators determine the health of an ecosystem. The theory as presented by Leopold held that the extirpation of wolves and cougars in Arizona, and elsewhere in the West, would result in a booming deer population that would browse unsustainably in the forests of the high country. “I now suspect that just as a deer herd lives in mortal fear of its wolves,” Leopold wrote, “so does a mountain live in mortal fear of its deer.”
One of the areas where Leopold studied deer irruptions was the Kaibab Plateau near the Grand Canyon. By 1924, the deer population on the Kaibab had peaked at 100,000. Then it crashed. During 1924-26, 60 percent of the deer perished due to starvation. Leopold believed this pattern of deer exceeding the carrying capacity of the land would repeat across the U.S. wherever predators had been eliminated as a trophic force. By 1920, wolves and cougars were gone from the ecosystems east of the Mississippi—shot, trapped, poisoned, as human settlement fragmented their habitat—and they were headed toward extirpation in most parts of the American West. Within two generations, the hunting of deer had been heavily regulated, conservationists’ calls for deer reintroduction throughout the eastern U.S. had been heeded, and swaths of state and federally managed forest had been protected from any kind of hunting.
Freed both of human and animal predation, however, deer did not follow the pattern predicted by Leopold. Instead of eating themselves out of house and home, they survived—they thrived—by altering their home range to their benefit. As recent studies have shown, certain kinds of grasses and sedges preferred by deer react to over-browsing the way the bluegrass on a suburban lawn reacts to a lawnmower. The grasses grow back faster and healthier, and provide more sustenance for more deer. In short, there has been enough food in our forests, mountains, and grasslands for white-tailed deer in the U.S. to reach unprecedented numbers, about 32 million, more than at any time since record-keeping began.
In 1968, Stanford biology professor Paul Ehrlich predicted that another widespread species would die out as a result of overpopulation. But he was spectacularly wrong. Like the deer, the steadily ingenious Homo sapiens altered its home range—most notably the arable land—to maximize its potential for survival. As Homo sapiens continues to thrive across the planet today, the species might take a moment to find its reflection in the rampant deer.
Conservation biologists who have followed the deer tend to make an unhappy assessment of its progress. They mutter dark thoughts about killing deer, and killing a lot of them. In fact, they already are. In 2011, in the name of conservation, the National Park Service and U.S. Department of Agriculture teamed up with hunters to “harvest” 3 million antlerless deer. I asked Thomas Rooney, one of the nation’s top deer irruption researchers, about the losses in forest ecosystems overrun by deer. “I’d say the word is ‘apocalypse,’ ” Rooney said.
On a warm fall day last year, I went to see Rooney, a professor of biology at Wright State University, in Dayton, Ohio. In his office, I noticed a well-thumbed copy of Ehrlich’s The Population Bomb, and I asked him if he thought a comparison might be drawn between human overpopulation and deer overpopulation. He looked at me as if the point was obvious. “Deer, like humans,” he said, “can come in and eliminate biodiversity, though not to their immediate detriment.” (...)
He told me about a study published last year in Conservation Biology that bemoaned “pandemic deer overabundance,” language suggesting the creature was a disease on the land. Ecosystem damage becomes apparent at roughly 15 deer per square mile, and the damage grows with density. Some areas of the northeast host as many as 100 deer per square mile. (The Wright State University reserve has a density of around 40 deer per square mile.) He noted a 2013 article co-authored by a group of Nature Conservancy scientists who warned that “no other threat to forested habitats is greater at this point in time—not lack of fire, not habitat conversion, not climate change.” (...)
I asked Rooney about the remarkable ability of deer to thrive in their home range—most of the U.S.—while producing ecosystem simplification and a biodiversity crash. In his own studies of deer habitats in Wisconsin, Rooney found that only a few types of grass thrive under a deer-dominant regime. The rest, amounting to around 80 percent of native Wisconsin plant species, had been eradicated. “The 80 percent represent the disappearance of 300 million years of evolutionary history,” he said. He looked deflated.
A turkey vulture pounded its wings through the canopy, and in the darkening sky a military cargo plane howled in descent toward nearby Wright-Patterson Air Force Base. Rooney and I emerged from the forest onto a campus parking lot where Homo sapiens held sway. The self-assured mammals crossed fields of exotic bluegrass under pruned hardwoods surrounded by a sea of concrete, tarmac, glass, and metal. There were no flowers except those managed in beds. There were no other animals to be seen except the occasional squirrel, and these were rat-like, worried, scurrying. The Homo sapiens got into cars that looked the same, on streets that looked the same, and they were headed to domiciles that looked more or less the same. This is home for us.
by Christopher Ketcham, Nautilus | Read more:
Image: Chris Buzelli
The Wave That Swept the World
In the beginning was the wave. The blue and white tsunami, ascending from the left of the composition like a massive claw, descends pitilessly on Mount Fuji – the most august mountain in Japan, turned in Katsushika Hokusai’s vision into a small and vulnerable hillock. Under the Wave off Kanagawa, one of Hokusai’s Thirty-Six Views of Mount Fuji, has been an icon of Japan since the print was first struck in 1830–31, yet it forms part of a complex global network of art, commerce, and politics. Its intense blue comes from Hokusai’s pioneering use of Prussian Blue ink – a foreign pigment, imported, probably via China, from England or Germany. The wave, from the beginning, stretched beyond Japan. Soon, it would crash over Europe.
This week the Museum of Fine Arts in Boston, home to the greatest collection of Japanese art outside Japan, opens a giant retrospective of the art of Hokusai, showcasing his indispensable woodblock prints of the genre we call ukiyo-e, or ‘images of the floating world’. It’s the second Hokusai retrospective in under a year; last autumn, the wait to see the artist’s two-part mega-show at the Grand Palais in Paris stretched to two hours or more. American and French audiences adore Hokusai – and have for centuries. He is, after all, not only one of the great figures of Japanese art, but a father figure of much of Western modernism. Without Hokusai, there might have been no Impressionism – and the global art world we today take for granted might look very different indeed.
Fine print
Hokusai’s prints didn’t find their way to the West until after the artist’s death in 1849. During his lifetime Japan was still subject to sakoku, the longstanding policy that forbade foreigners from entering and Japanese from leaving, on penalty of death. But in the 1850s, with the arrival of the ‘black ships’ of the American navy under Matthew Perry, Japan gave up its isolationist policies – and officers and diplomats, then artists and collectors, discovered Japanese woodblock printing. In Japan, Hokusai was seen as vulgar, beneath the consideration of the imperial literati. In the West, his delineation of space with color and line, rather than via one-point perspective, would have revolutionary impact.
Both the style and the subject matter of ukiyo-e prints appealed to young artists like Félix Bracquemond, one of the first French artists to be seduced by Japan. Yet the Japanese prints traveling to the West in the first years after Perry were contemporary artworks, rather than the slightly earlier masterpieces of Hokusai, Hiroshige, and Utamaro. Many of the prints that arrived were used as wrapping paper for commercial goods. Everything changed on 1 April, 1867, when the Exposition Universelle opened on the Champ de Mars, the massive Paris marching grounds that now lies in the shadow of the Eiffel Tower. It featured, for the first time, a Japanese pavilion – and its showcase of ukiyo-e prints revealed the depth of Japanese printmaking to French artists for the first time.
by Jason Farago, BBC | Read more:
Image: Katsushika Hokusai
Thursday, April 9, 2015
Just Don't Call It a Panama Hat
Yet the misnomer didn’t prevent the famous straw hat, more correctly referred to as the Montecristi hat, from being designated by UNESCO as Intangible Cultural Heritage in 2012. Ecuador has produced the hats since the early 17th century. It takes three months to make a superfino Montecristi hat (the best grade there is), and weavers can only work in the early and late hours of the day because the straw breaks when it’s exposed to high temperatures. According to tradition, hats are cleaned, finished and sold in the town of Montecristi, the Panama hat’s spiritual home in the province of Manabi.
In the small and remote village of Pile nearby, the craft is passed on through family. Manuel Lopez, 41, learned to weave with his father at the age of eight. He says he teaches his own children now, though making a Montecristi hat is becoming a lost art. A weaver makes only $700 to $1,200 for a superfino hat, which can fetch $25,000 abroad. And now that China has become the world’s top producer of straw hats (which they actually make from paper), Ecuador’s hat makers are unable to keep up with the decline in price and demand. With most young people looking for more lucrative opportunities elsewhere, experts say the last traditionally made Montecristi hat will be woven within the next 15 years.
by Eduardo Leal, Roads & Kingdoms | Read more:
Image: Eduardo Leal
Thursday, April 2, 2015
Our Land, Up for Grabs
A battle is looming over America’s public lands.
It’s difficult to understand why, given decades of consistent, strong support from voters of both parties for protecting land, water and the thousands of jobs and billions of dollars in economic benefits these resources make possible.
Last week, the United States Senate voted 51 to 49 to support an amendment to a nonbinding budget resolution to sell or give away all federal lands other than the national parks and monuments.
If the measure is ever implemented, hundreds of millions of acres of national forests, rangelands, wildlife refuges, wilderness areas and historic sites will revert to the states or local governments or be auctioned off. These lands constitute much of what’s left of the nation’s natural and historical heritage.
This was bad enough. But it followed a 228-to-119 vote in the House of Representatives approving another nonbinding resolution that said “the federal estate is far too large” and voiced support for reducing it and “giving states and localities more control over the resources within their boundaries.” Doing so, the resolution added, “will lead to increased resource production and allow states and localities to take advantage of the benefits of increased economic activity.”
The measures, supported only by the Republicans who control both houses, were symbolic. But they laid down a marker that America’s public lands, long held in trust by the government for its people, may soon be up for grabs.
We’ll get a better sense of Congress’s commitment to conservation this year when it decides whether to reauthorize the Land and Water Conservation Fund, created in 1965 and financed by fees paid by oil companies for offshore drilling. The program underwrites state and local park and recreation projects, conservation easements for ranches and farms, plus national parks, forests and wildlife refuges.
Nearly $17 billion has gone to those purposes over the years, including 41,000 state and local park and recreation projects, some of which my organization has helped put together. (Another $19 billion was diverted by Congress to other purposes.) The program expires Sept. 30 unless Congress keeps it alive.
Land protection has long been an issue for which voters of both parties have found common cause. Since 1988, some $71.7 billion has been authorized to conserve land in more than 1,800 state and local elections in 43 states. Last year, $13.2 billion was approved by voters in 35 initiatives around the country — the most in a single year in the 27 years my organization has tracked these initiatives and, in some cases, led them.
But this consensus is being ignored, and not just in the nation’s capital.
by Will Rogers, NY Times | Read more:
Image: via:
The Common Man’s Crown
The 1903 World Series was the first of baseball’s modern era. Boston and Pittsburgh were adhering to newly codified rules of play — and also initiating a new code of dress, as no one could have known, least of all the men in the stands, uniformly obedient to the laws of Edwardian haberdashery. The spectators wore “derbies, boaters, checkered caps and porkpie hats,” wrote Beverly Chico in her book, “Hats and Headwear Around the World.” Each style signaled a distinct social identity. All are now regarded largely as museum pieces, having fallen away in favor of a hat that offers casual comfort and a comforting image of classlessness. Given our cult of youth, our populist preference for informality and our native inclination toward sportiness, its emergence as the common man’s crown was inevitable.
Frank Sinatra supposedly implored the fedora-wearers of his era to cock their brims: Angles are attitudes. Ballplayers have accepted this as truth since at least that first World Series, when Fred Clarke, Pittsburgh’s left fielder and manager, wore his visor insouciantly askew, and the general public has come to know the ground rules as well. Here’s a test of fluency in the sartorial vernacular of Americans: You can read the tilt of a bill like the cut of a jib. The way you wear your hat is essential to others’ memories of you, and the look of a ball cap’s brim communicates tribal identity more meaningfully than the symbols stitched across its front. Is the bill flatter than an AstroTurf outfield? Curved like the trajectory of a fly ball? Straightforwardly centered? Reversed like that of a catcher in his crouch or a loiterer on his corner? The cap conforms to most any cast of mind.
Watch people fiddling with their baseball caps as they sit at a stoplight or on a bar stool, primping and preening in what must be the most socially acceptable form of self-grooming. No one begrudges their fussiness, because everyone appreciates the attempt to express a point of view. The cap presents studies of plasticity in action and of the individual effort to stake out a singular place on the roster, and the meaning of the logo is as mutable as any other aspect. To wear a New York Yankees cap in the United States is to show support for the team, maybe, or to invest in the hegemony of an imperial city. To wear one abroad — the Yankees model is by far the best-selling Major League Baseball cap in Europe and Asia — is to invest in an idealized America, a phenomenon not unlike pulling on contraband bluejeans in the old Soviet Union. (...)
“Until the late 1970s, wearing a ball cap anywhere but on the baseball field carried with it a cultural stigma,” James Lilliefors writes in his book “Ball Cap Nation,” citing the Mets cap of the “Odd Couple” slob Oscar Madison as one example of its signaling mundane degeneracy. In Lilliefors’s reckoning, eight factors contributed to the cap’s increased legitimacy, including the explosion of television sports, the maturation of the first generation of Little League retirees and the relative suavity of the Detroit Tigers cap worn by Tom Selleck as the title character of “Magnum P.I.”: “It made sporting a ball cap seem cool rather than quirky; and it created an interest in authentic M.L.B. caps.” What had been merely juvenile came to seem attractively boyish, and New Era was poised to reap the rewards, having begun selling its wares to the general public, by way of a mail-order ad in the Sporting News, in 1979. (...)
Where the basic structure of a derby or a boater spoke of the wearer’s rank and region, the baseball cap is comparatively subtle. Angles are indeed accents, and a millimetric bend in the bill will inflect the article’s voice. The hip-hop habit is to wear the cap perfectly fresh and clean, as if it arrived on the head directly from the cash register, spotless except, perhaps, for the circle of the manufacturer’s label still stuck to it, alerting admirers that this is no counterfeit and that the cap is as new as the money that bought it. In tribute to this practice, New Era not long ago issued a limited-edition series of caps in the colors of its sticker, black and gold, as if the company were at once flattering its customers and further transforming them into advertisements for itself.
Peel the sticker away and bow the brim a bit: This is the simple start of asserting a further level of ownership. Taken to an extreme, the process can resemble a burlesque of the ancient ritual of breaking in the baseball mitts with which the cap’s contours rhyme. To speak to an undergraduate about a “dirty white baseball cap” is to evoke a frat-boy lifestyle devoted to jam bands and domestic lager and possibly lacrosse. To spend time among the frat boys themselves is to learn the baroque techniques for accelerating wear and tear. Some wear their caps in the shower; still others undertake artificial rituals involving the hair dryer and the dishwasher and the kitchen sink, recalling the collegians of midcentury who, expressing the prep fetish for the shabby genteel, took sandpaper to the collars of their Oxford shirts to gain a frayed edge.
by Troy Patterson, NY Times | Read more:
Image: Mauricio Alejo