Tuesday, July 29, 2014
Wax and Wane: The Tough Realities Behind Vinyl's Comeback
“You used to be able to turn over a record in four weeks,” says John Beeler, project manager at Asthmatic Kitty, the label home of Sufjan Stevens. “But I’m now telling my artists that we need at least three months from the time they turn it in to the time we get it back.” Across the board, lengthy lead times that were once anomalies are now the norm. “They’ve been longer this year than they were even nine months ago,” says Nick Blandford, managing director of the Secretly Label Group, which includes prominent indie imprints Secretly Canadian, Jagjaguwar, and Dead Oceans, and artists including Bon Iver and the War on Drugs. “We crossed our fingers and hoped that turn times would improve after Record Store Day in April, but they’re still about the same. We’ve just accepted this as the reality.”
So when it comes to the current state of the vinyl industry’s unlikely resurrection, everyone is happy. And everyone is frustrated.
Vinyl’s sharp rise began in 2008, when sales nearly doubled from the previous year’s 1 million to 1.9 million. The tallies have gone up each year since, and 2013’s 6.1 million is a 33 percent increase over 2012’s 4.6 million. (Those numbers are even larger when you account for releases that fall outside SoundScan’s reach.) The resurgent format’s market share is still far smaller than CDs, digital, and streaming—vinyl accounted for only 2 percent of all album sales last year, compared to 41 percent for digital and 57 percent for CDs—and no one expects it to regain dominance. But it’s more than a trend, and it’s not going away anytime soon. “Four years ago, maybe half our releases would get an LP option,” says James Cartwright, production manager at Merge Records. “Now every release we do has a vinyl format.”
Mounting today’s LPs side-by-side on a giant wall would offer a particularly kaleidoscopic display since a significant chunk of sales now come from colored discs. While some purists claim these sorts of limited-edition releases and Record Store Day exclusives are leading to the cartoonization of the format, it’s apparent after speaking with pressing plants, labels, and record stores that artists like Jack White are giving people what they want. As vinyl sales have climbed, so has the demand for exclusives. Musicol’s two-press operation in Columbus, Ohio, has been pressing vinyl since the 1960s, and though the place used to press about 90 percent black vinyl, color vinyl now accounts for about half of its orders. Meanwhile, Cleveland’s five-year-old Gotta Groove Records presses about 40 percent of its LPs and 45s on colored vinyl.
And White isn’t the only one upping the ante with quirky embellishments. On a recent tour of Gotta Groove’s operation, sparkling specks littered the ground near the 7” machine, where a run of 100 45s had just been pressed on clear vinyl with glitter. Covering the walls of a listening room were more custom orders that ranged from impressive to confounding. One band pressed coffee grounds into their records. Another incorporated the ashes of a 19th-century Bible. And an upcoming order will include shredded cash. The plant has to draw a line when a client’s order includes bodily fluids. “At least once a month a band wants to press their blood into the record,” says Gotta Groove VP of sales and marketing Matt Earley, who always says no.
Now, you might think adding blood or coffee to vinyl is a sign that the format has officially crossed the line from cultural commodity to tchotchke—and there are certainly bands that would agree. In fact, Beeler at Asthmatic Kitty says some of his label’s artists are beginning to resist colored vinyl and other exclusives. But Asthmatic Kitty and others still do it, because consumers demand it, and those limited-edition releases drive sales. (These sorts of exclusive releases also often bypass distributors and record stores, driving sales directly to a label’s web store.)
by Joel Oliphint, Pitchfork | Read more:
Image: Mara Robinson
The Overblown Stigma of Genital Herpes
Even after his friends hype him up, Jamin Peckham still backs out sometimes. It’s not that he’s shy or insecure about his looks. Instead, what keeps this 27-year-old from approaching the cute girl across the room is a set of hypotheticals that most people don’t deal with.
“My mind runs ahead to ‘the disclosure talk’ and then all the way down to, ‘What if we have sex and what if I give it to her?’” said Peckham, an IT professional who lives in Austin, Texas.
Peckham has had genital herpes for six years now; he got it from an ex-girlfriend who didn’t know she had it. He hasn’t been in a relationship since his diagnosis, and he’s been rejected by a few girls who asked to just be friends after hearing about his condition. Because of this, Peckham said, he has to work harder than ever to secure a romantic relationship.
Some think of people like Peckham as immoral, assuming only people who sleep around get genital herpes. The stigma of the virus, which exists at the heart of this faulty mindset, is usually worse than the symptoms themselves, as it affects dating, social life and psychological health.
According to the Centers for Disease Control and Prevention, about one out of six people in the United States aged 14 to 49 has genital herpes caused by HSV-2 infection (the herpes simplex virus most often responsible for genital herpes). The overall figure is probably higher, the CDC stated, since many people are also contracting genital herpes from HSV-1 (the kind of herpes usually responsible for cold sores) through oral sex. Taking that into account, genital herpes statistics are usually quoted at closer to 25 percent for women and 10 percent for men, but most of these people don’t even know they have it.
In terms of a person’s health, genital herpes is usually nothing to worry about. According to the National Institutes of Health, many people with genital herpes never even have outbreaks or their outbreaks decrease over time (one or two outbreaks a year is not uncommon). The virus can lie dormant in your system for years without coming to the surface. The initial outbreak is often the worst, occurring a few days to a couple of weeks after being infected. Symptoms may include a fever, headache, and muscle aches for a few weeks. But for the most part, outbreaks consist of painful fever blisters or sores on or near the genitals (or, in less common cases, sores appearing elsewhere) for a few days, as well as burning, itching, swelling, and irritation that may be triggered by stress or fatigue. The virus never goes away, and some take antiviral medicines to relieve or suppress outbreaks. (...)
Genital herpes is contracted during sexual contact, usually spread through fluids on the genitals or mouth. You can only get genital herpes from someone who already has it, can get it during just one sexual encounter, and can get it with or without a condom. Condoms merely lower your risk, according to the CDC. You can even get it if the other person doesn’t have symptoms, since the virus sheds about 10 percent of the time for asymptomatic HSV-2 infections, according to a 2011 study published in the Journal of the American Medical Association.
Herpes has a unique stigma among sexually transmitted diseases. HIV/AIDS is stigmatized, but few laugh at people who have it because it’s a serious illness. HPV can lead to cancer, on occasion, and women get tested regularly for it, making it no joke to most. Chlamydia, syphilis, crabs, scabies, and gonorrhea are sometimes the target of jokes, but these STDs are typically curable, so people won’t have to endure the annoyance for too long. Genital herpes, though, isn’t curable, is thought of as a disease only the promiscuous and cheating types get, and is a popular joke topic.
Despite the fact that herpes has been around since the time of the Ancient Greeks, according to Stanford University, the widespread stigma seems to be just decades old. Herpes is the “largest epidemic no one wants to talk about,” Eric Sabo wrote in the New York Times. Both Project Accept and HSV Singles Dating blame an antiviral drug marketing campaign during the late 1970s to mid-1980s for herpes’ stigma. But it’s difficult to pin down exactly when and why our negative associations started.
by Jon Fortenbury, The Atlantic | Read more:
Image: Instant Vantage/flickr
Monday, July 28, 2014
For Coconut Waters, a Street Fight for Shelf Space
Like kale salads and Robin Thicke, coconut water seems to have jumped from invisible to unavoidable without a pause in the realm of the vaguely familiar.
The stuff is everywhere — not just in supermarkets and convenience stores, but also on ads on buses (“Crack life open”) and bar signs (“Detox while you retox,” reads one in Manhattan, promoting a Vita Coco Arnold Palmer cocktail). It has turned up on television, as a question on “Jeopardy,” and it regularly makes cameos in glossy magazines, clutched by hydrating celebrities.
The battle for this market, worth $400 million a year and growing, now involves big players like Pepsi and Coke. But in the beginning, it looked more like a street fight between two guys. One was then a 29-year-old college dropout who rolled to Manhattan bodegas at night, on in-line skates, carrying samples in a backpack. The other was a former Peace Corps volunteer, driving a beat-up Ford Econoline van and fighting for the same turf.
Michael Kirban, who with a buddy founded Vita Coco, and Mark Rampolla, who founded its archrival Zico, happened to start selling nearly identical brands, in the same neighborhoods of New York City, at almost the same time — a week or two apart, in late 2004.
Those in the fray called it the coconut water wars. Each side quickly bulked up with sales teams and tried to win over Manhattan, one grocery store and yoga studio at a time.
The fighting quickly got ugly. It included simple acts of retail vandalism, like tossing the competition’s signs in the garbage, as well as attempts at psychological point-scoring that could charitably be described as sophomoric. Mr. Kirban sometimes placed a container of Zico beside a sleeping vagabond, took a photograph and then emailed it to Mr. Rampolla. And on more than a few occasions, the Zico sales force showed up outside Vita Coco’s offices, then near Union Square, and handed out free Zico samples.
“It was guerrilla tactics,” recalls Mr. Rampolla, talking from his home in Redondo Beach, Calif. “And not legal because you’re supposed to have permits. But if you were quick enough, no one would hassle you.”
Coconut water went from local skirmish to beverage fame despite what might seem like a major impediment: its flavor. Anyone expecting the confectioner’s version of coconut — the one you find in coconut ice cream, for instance — may be repelled. This is the juice of a green coconut, and the taste is a mix of faintly sweet and a tad salty. Some have compared it to socks, sweat and soap. And that group includes people crucial to coconut water’s success.
“When I tried it, I didn’t get it,” says Lewis Hershkowitz, the president of Big Geyser, which distributes Zico in New York City. “I thought it was disgusting.”
For many, the challenging taste is part of the appeal. Some are so smitten with the flavor they have created online forums that sound like support groups.
A decade ago, companies like Goya sold coconut water in stores catering to immigrants, and in quantities that hardly registered in market research. Today, more than 200 brands around the world sell “nature’s own sports drink,” as fans call it, and sales are rising by double-digit figures.
“This will eventually be a $1 billion-a-year category,” says John Craven, founder and chief executive of BevNet, a trade publication. “It’s the real deal. It isn’t a new flavor of Coke. It’s not Bud Light Lime-A-Rita. This has staying power. People put it in their diet and it stays there.”
The titans of the industry are on board. In 2009, Coca-Cola bought a 20 percent stake in Zico, and in 2010 PepsiCo acquired a majority stake in the distant third-place contender, O.N.E. Last year, Coke purchased Zico outright.
Coke’s initial investment in Zico seemed like catastrophic news for Vita Coco, the only brand still controlled by its founders.
“I thought we were dead,” says Mr. Kirban of Vita Coco. “I didn’t tell anybody at the time, but I remember wondering, ‘How are we going to beat Coke?’”
The answer would involve Madonna, Hula Hoops, a family-owned investment firm in Belgium and a former professional tennis player turned salesman named Goldy. Vita Coco now owns more than 60 percent of the coconut water market, while Zico has less than 20 percent, according to Euromonitor, a research company. Two weeks ago, Vita Coco agreed to sell a 25 percent stake in itself to Red Bull China, giving it a head start in the world’s most populous country and valuing the company at about $665 million.
How a tiny, privately held company outmaneuvered the biggest players in the world is material for a business school case study. And to tell the whole story, you need to start in 2003, at a bar on the Lower East Side of Manhattan. There, Mr. Kirban and his friend and future business partner, Ira Liran, spotted two Brazilian women.
by David Segal, NY Times | Read more:
Image: Serge Bloch
Saturday, July 26, 2014
Lessons From America's War for the Greater Middle East
For well over 30 years now, the United States military has been intensively engaged in various quarters of the Islamic world. An end to that involvement is nowhere in sight.
Tick off the countries in that region that U.S. forces in recent decades have invaded, occupied, garrisoned, bombed or raided and where American soldiers have killed or been killed. Since 1980, they include Iraq and Afghanistan, of course. But also Iran, Lebanon, Libya, Turkey, Kuwait, Saudi Arabia, Qatar, Bahrain, the United Arab Emirates, Jordan, Bosnia, Kosovo, Yemen, Sudan, Somalia and Pakistan. The list goes on.
To judge by various official explanations coming out of Washington, the mission of the troops dispatched to these various quarters has been to defend or deter or liberate, punishing the wicked and protecting the innocent while spreading liberal values and generally keeping Americans safe.
What are we to make of the larger enterprise in which the U.S. forces have been engaged since well before today’s Notre Dame undergraduates were even born? What is the nature of the military struggle we are waging? What should we call it?
For several years after 9/11, Americans referred to it as the Global War on Terrorism, a misleading term that has since fallen out of favor.
For a brief period during the early years of the George W. Bush administration, certain neoconservatives promoted the term World War IV. This never caught on, however, in part because, unlike other major 20th century conflicts, it found the American people sitting on the sidelines.
With interventions in Iraq and Afghanistan dragging on inconclusively, some military officers began referring to what they called the Long War. While nicely capturing the temporal dimension of the conflict, this label had nothing to say about purpose, adversary or location. As with World War IV, the Long War never gained much traction.
Here’s another possibility. Since 1980, back when President Jimmy Carter promulgated the Carter Doctrine, the United States has been engaged in what we should rightfully call America’s War for the Greater Middle East. The premise underlying that war can be simply stated: with disorder, dysfunction and disarray in the Islamic world posing a growing threat to vital U.S. national security interests, the adroit application of hard power would enable the United States to check those tendencies and thereby preserve the American way of life.
Choose whatever term you like: police, pacify, shape, control, dominate, transform. In 1980, President Carter launched the United States on a project aimed at nothing less than determining the fate and future of the peoples inhabiting the arc of nations from the Maghreb and the Arabian Peninsula to the Persian Gulf and Central Asia.
Since the end of World War II, American soldiers had fought and died in Asia. Even when the wars in Korea and Vietnam ended, U.S. troop contingents continued to garrison the region. In Europe, a major U.S. military presence dating from the start of the Cold War signaled Washington’s willingness to fight there as well. Prior to Carter’s watershed 1980 statement, no comparable U.S. commitment toward the Islamic world existed. Now that was going to change.
Only in retrospect does this become clear, of course. At the time President Carter declared the Persian Gulf a vital national security interest — that was the literal meaning of the Carter Doctrine — he did not intend to embark upon a war. Nor did he anticipate what course that war was going to follow — its duration, costs and consequences. Like the European statesmen who a hundred years ago touched off the cataclysm we know today as World War I, Carter merely lit a fuse without knowing where it led. (...)
Neither Carter nor his advisers foresaw what awaited 10 or 20 years down the line. They were largely clueless as to what lay inside the Pandora’s box they insisted on opening. But what they and their successors in government found there prompted them to initiate a sequence of military actions, some large, some small, that deserve collective recognition as a war. That war continues down to the present day.
Look closely enough and the dots connect. Much as, say, the Berlin Airlift, the Korean War, the Cuban Missile Crisis and the invasion of Grenada (among many other events) all constitute episodes in what we call the Cold War, so, too, do seemingly disparate events such as the Beirut bombing of 1983, the “Black Hawk Down” debacle of 1993 and the Iraq invasion of 2003 (among many others) all form part of a single narrative. Acknowledging the existence of that narrative — seeing America’s War for the Greater Middle East whole — is a prerequisite to learning.
Let me state plainly my own overall assessment of that war. We have not won it. We are not winning it. And simply pressing on is unlikely to produce more positive results next year or the year after — hence, the imperative of absorbing the lessons this ongoing war has to teach. Learning offers a first step toward devising wiser, more effective and less costly policies.
The “10 theses” that follow constitute a preliminary effort to identify the most important of those lessons.
by Andrew Bacevich, Notre Dame Magazine | Read more:
Image: via:
Where Do Cocktail Prices Come From?
Unlike the people who drink them, not all cocktails are created equal. Or at least that's what their prices seem to indicate. The mixed drinks at one bar in one city might be double what they cost at a cocktail-conscious watering hole in another part of the country.
But it doesn't even take a supersonic bar-hop across America to observe this phenomenon. A house cocktail at New York City's Pouring Ribbons, an innovative establishment slinging impeccable drinks, will cost you $14. Not too far uptown, at the stately bar at the NoMad Hotel—where the drinks are similarly innovative and well executed—an original cocktail sells for $16. Then there's ZZ's Clam Bar, in Greenwich Village, where sipping on one of chief bartender Thomas Waugh's elegant liquid creations will set you back $20—or nearly 43 percent more than the cost of a drink at Pouring Ribbons.
Complicating things further, there are plenty of bars and restaurants that go out of their way, it would appear, to price their house cocktails consistently—say, all for $12 apiece—suggesting to a casual observer that, perhaps, all these drinks are an equal value.
I reached out to several managers of serious cocktail destinations in order to better understand what accounts for the broad swings in price we encounter from place to place as we ply the now-extensive craft-cocktail landscape, as well as why some cocktail menus are priced uniformly.
A cocktail by nature is a combination, in differing ratios, of a set of ingredients that each have costs, so many cocktail bars spend a lot of time and effort crunching the numbers behind their drinks. Setting prices for a cocktail-focused list can take a lot more work than menu-pricing might take at a wine or beer bar. That's certainly the impression I get from Jeffrey Morgenthaler, the bar manager at Clyde Common in Portland, Oregon. He approaches the pricing of his cocktail menu with a great deal of mathematical precision, coupled with a small dose of professional intuition.
In addition to bartending, Morgenthaler maintains a blog about his craft, and pricing strategy has been a recurring subject over the years. He's even released Microsoft Excel spreadsheets to his readers, many of whom are in the service industry, as instructional tools. The charts are basic versions of the ones he uses at Clyde Common to calculate pour cost and, by extension, sales prices for drinks.
Pour cost is pretty much what it sounds like: the cost a bar incurs by pouring a given cocktail. But pour cost is typically expressed as a percentage of the sale price of a drink rather than a raw number; so if it costs a bar $2 in goods to produce a drink that it sells for $10, the pour cost of that drink is 20 percent. "Some places need the pour cost to come in at 18 percent," Morgenthaler tells me, "others are fine with 25 percent. It all depends on the business operations." In other words, a bar might decide upon an acceptable range in which its pour costs must fall, given how other aspects of the business factor in, and then calculate the price of drinks based on that range. Between two drinks sold for the same price, the one with the higher pour cost earns the bar a smaller profit.
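Morgenthaler's spreadsheets aren't hard to approximate. Here is a minimal sketch in Python of the two calculations he describes: deriving a pour-cost percentage from ingredient cost and sale price, and working backwards from a target pour cost to a suggested menu price. The $2.50 ingredient figure and 22 percent target below are hypothetical examples, not Clyde Common's actual numbers.

def pour_cost(ingredient_cost, menu_price):
    """Pour cost expressed as a percentage of the drink's sale price."""
    return ingredient_cost / menu_price * 100

def suggested_price(ingredient_cost, target_pour_cost):
    """Menu price needed to hit a target pour-cost percentage."""
    return ingredient_cost / (target_pour_cost / 100)

# The article's example: $2 in goods sold for $10 is a 20 percent pour cost.
print(pour_cost(2.00, 10.00))               # -> 20.0

# Working backwards: a hypothetical drink costing $2.50 in ingredients,
# priced to hit a 22 percent target, should sell for roughly $11.36.
print(round(suggested_price(2.50, 22), 2))  # -> 11.36

A bar would then round that raw figure to a menu-friendly price, which is one reason a uniformly priced menu can hide very different margins from drink to drink.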
by Roger Kamholz, Serious Eats | Read more:
Image: Alice Gao
Israel Mows the Lawn
In 2004, a year before Israel’s unilateral disengagement from the Gaza Strip, Dov Weissglass, éminence grise to Ariel Sharon, explained the initiative’s purpose to an interviewer from Haaretz:
The significance of the disengagement plan is the freezing of the peace process … And when you freeze that process, you prevent the establishment of a Palestinian state, and you prevent a discussion on the refugees, the borders and Jerusalem. Effectively, this whole package called the Palestinian state, with all that it entails, has been removed indefinitely from our agenda. And all this with … a [US] presidential blessing and the ratification of both houses of Congress … The disengagement is actually formaldehyde. It supplies the amount of formaldehyde that is necessary so there will not be a political process with the Palestinians.
In 2006 Weissglass was just as frank about Israel’s policy towards Gaza’s 1.8 million inhabitants: ‘The idea is to put the Palestinians on a diet, but not to make them die of hunger.’ He was not speaking metaphorically: it later emerged that the Israeli defence ministry had conducted detailed research on how to translate his vision into reality, and arrived at a figure of 2279 calories per person per day – some 8 per cent less than a previous calculation because the research team had originally neglected to account for ‘culture and experience’ in determining nutritional ‘red lines’.
This wasn’t an academic exercise. After pursuing a policy of enforced integration between 1967 and the late 1980s, Israeli policy shifted towards separation during the 1987-93 uprising, and then fragmentation during the Oslo years. For the Gaza Strip, an area about the size of Greater Glasgow, these changes entailed a gradual severance from the outside world, with the movement of persons and goods into and out of the territory increasingly restricted.
The screws were turned tighter during the 2000-5 uprising, and in 2007 the Gaza Strip was effectively sealed shut. All exports were banned, and just 131 truckloads of foodstuffs and other essential products were permitted entry per day. Israel also strictly controlled which products could and could not be imported. Prohibited items have included A4 paper, chocolate, coriander, crayons, jam, pasta, shampoo, shoes and wheelchairs.
In 2010, commenting on this premeditated and systematic degradation of the humanity of an entire population, David Cameron characterised the Gaza Strip as a ‘prison camp’ and – for once – did not neuter this assessment by subordinating his criticism to proclamations about the jailers’ right of self-defence against their inmates.
It’s often claimed that Israel’s reason for escalating this punitive regime to a new level of severity was to cause the overthrow of Hamas after its 2007 seizure of power in Gaza. The claim doesn’t stand up to serious scrutiny. Removing Hamas from power has indeed been a policy objective for the US and the EU ever since the Islamist movement won the 2006 parliamentary elections, and their combined efforts to undermine it helped set the stage for the ensuing Palestinian schism.
Israel’s agenda has been different. Had it been determined to end Hamas rule it could easily have done so, particularly while Hamas was still consolidating its control over Gaza in 2007, and without necessarily reversing the 2005 disengagement. Instead, it saw the schism between Hamas and the Palestinian Authority as an opportunity to further its policies of separation and fragmentation, and to deflect growing international pressure for an end to an occupation that has lasted nearly half a century. Its massive assaults on the Gaza Strip in 2008-9 (Operation Cast Lead) and 2012 (Operation Pillar of Defence), as well as countless individual attacks between and since, were in this context exercises in what the Israeli military called ‘mowing the lawn’: weakening Hamas and enhancing Israel’s powers of deterrence. As the 2009 Goldstone Report and other investigations have demonstrated, often in excruciating detail, the grass consists overwhelmingly of non-combatant Palestinian civilians, indiscriminately targeted by Israel’s precision weaponry.
by Mouin Rabbani, LRB | Read more:
Weed Weddings
[ed. As far as weddings are concerned, I'm pretty sure the novelty will wear off soon enough. Can't say that about other aspects of our culture, though.]
Earlier this month, when Ellen Epstein arrived at the Devil’s Thumb Ranch in Tabernash, Colo., for the wedding of her friends Lauren Meisels and Bradley Melshenker, she, like the other guests, found a gift bag waiting for her in her hotel room. But rather than a guide to activities in the area or a jar of locally made honey, the canvas bag contained a rolled joint, a lighter and lip balm infused with mango butter and cannabis, along with this note: “We wanted to show you some of the things we love the best.”
She knew then that the wedding of her fellow Boulder residents would be just a little different from the ones she had attended in the past.
The Meisels and Melshenker nuptials looked as if their inspiration had come not from the pages of Martha Stewart Weddings but from High Times. All of the floral arrangements, including the bride’s bouquet, contained a variety of white flowers mixed with marijuana buds and leaves. Mr. Melshenker and his groomsmen wore boutonnieres crafted out of twine and marijuana buds, and Mr. Melshenker’s three dogs, who were also in attendance, wore collars made of cannabis buds, eucalyptus leaves and pink ribbons.
Before going into dinner, the guests were given a baby marijuana plant in a ceramic pot with their name and table assignment written on a card in green ink, in the kind of stylish script you might find on a container of artisanal goat cheese. The tables were named after different strains of marijuana, like Blue Dream, Sour Diesel and Skywalker (the groom’s favorite strain). Ms. Epstein, who was seated at Skywalker, said that everyone at her table, where the ages ranged from 40 to 70, passed around a device similar to an electronic cigarette — except that it contained hash oil instead of nicotine. “It didn’t feel weird or bizarre,” she said. “It kind of becomes a new cocktail.”
With the sale of marijuana for recreational use now legal in Colorado and Washington State, pot and its various paraphernalia are becoming visible at weddings in those states — as table favors for guests, like miniature vaporizers, or group activities, like a hookah lounge. (...)
Jake Rosenbarger of Kim & Jake’s Cakes in Boulder said he would not make a cannabis cake if asked. Marijuana ruins the flavor, he said, and it can even ruin a wedding. “It can divide a room as much as pull it together,” he said. “It creates a vibe of, ‘Are you in the cool kids club or not?’ ”
Penni Ervin, a wedding planner in Crested Butte, was aghast when asked if she was working on any weddings in which pot was involved. “We’re talking about highly professional people, and I just don’t see C.E.O.s getting stoned,” she said. “It’s a family event with grandma and grandpa,” she added, “and you don’t want them to get shocked.”
by Lois Smith Brady, NY Times | Read more:
Image: Alison Vagnini
Friday, July 25, 2014
Fellow Vegans, Please Stop Making Me Hate You
When I was young and self-hating, I used to not-really-jokingly tell people that I was a "queer writer vegan who hates other queer writer vegans." We can unpack the sadness of that statement at a later date; suffice it to say that I am a competitive attention-seeker, and when competitive attention-seekers are uncertain and immature, sometimes they blame others for their own insecurities rather than examining their own behaviors.
But I digress. Obviously, I no longer hate other queers or other writers. Duh. I do, however, still sometimes hate other vegans. At least, I hate the way some other vegans behave about the whole shebang.
Here's the thing. I've been a vegan for five years now, and I can say that there are a lot of facets of the lifestyle that I appreciate. For example, I like the fact that I eat a hell of a lot more fruits and vegetables than I did in my Dorito-and-Diet-Coke-reliant teenage years. I like that I am occasionally driven by sheer necessity to create new, exciting combinations of breakfast foods (such as peanut-butter-and-frozen-pea tacos, or peanut-butter-and-broccoli stir fry, or peanut-butter-and-one's-own-hand despair-pops). I like that a lot of the vegans I have met are chill folks willing to swap nutritional yeast recipes or let me steal a bite of their tofu breakfast burrito.
And, on the real, I like that I'm not creating any personal demand for factory-farmed milk and eggs. That's what drove me to becoming vegan in the first place, and while I don't talk about it much -- because, frankly, I wrestle with the ethics of avoiding chick-maceration while gorging on strawberries picked by exploited farmworkers -- it's still a pretty big part of why I avoid everything pushed on me by the Cheese Lobby.
However, these positives are far from universal. Veganism is, by nature, not for everyone. And the sooner everyone realizes that, the less inclined I will be to automatically make an Aggressively Placid Face at the next person to espouse the evils of honey at me.
Take what just happened in Detroit, for example. On Thursday, PETA, never known for being a font of rationality when it comes to animal rights, offered to pay the water bill for 10 city families who "pledged to go vegan for a month." Despite the fact that half of Detroit's residents are struggling to, say, flush their toilets or cook on the stove, PETA apparently took it upon itself to use a basic human necessity as leverage for "pledging" to forgo animal products.
Clearly, this is a moronic, unsustainable venture. I am not a resident of Detroit, but if a stranger approached me in my hour of desperation and told me to kill a man just to watch him die, I would 100 percent promise her that her target would be at the bottom of Lake Michigan within the hour. I wouldn't do it, of course, but so long as she was willing to fork over the moolah for utilities, I'd tell her whatever she wanted to hear.
Similarly, there's no indication that PETA will check in with these folks after supposedly ponying up cash for their needs. As of now, they seem content to throw a few pro-vegan pamphlets at families before jetting back to wherever animal rights executives go when they're not trying to raise a stink. In other words, this smells like a publicity stunt, and a half-assed one at best.
From an outside perspective, it appeared as if PETA wanted to cast itself as the wise savior who just needed to offer a tiny incentive -- i.e., water in your own home -- to spark the wonder of veganism within the hearts of Detroiters. In fact, it even graciously pointed out to its would-be beneficiaries that "by accepting our offer to go vegan, not only will families be getting an immediate financial boost and helping animals, if they stick with it, they’ll also lower their risk of obesity, heart disease, cancer, diabetes and strokes." Shut up, PETA. (...)
Again, most vegans I know do not behave this poorly to such a large degree. Many, in fact, understand that food is a personal experience, and that it's unacceptable to shame others for listening to their own bodies, putting their needs ahead of what they perceive to be important, or just frankly not really caring what they place in their face. But I think we've all known vegans who refuse to empathize with other humans in favor of empathizing with farm animals -- and that is no way to create social or environmental change in the long run. For one thing, that's a dickish way to behave, period. For another, it's not going to shift anyone's eating habits, except maybe in the opposite direction out of spite.
by Kate Conway, XOJane | Read more:
Image: uncredited