Monday, July 28, 2014
For Coconut Waters, a Street Fight for Shelf Space
Like kale salads and Robin Thicke, coconut water seems to have jumped from invisible to unavoidable without a pause in the realm of the vaguely familiar.
The stuff is everywhere — not just in supermarkets and convenience stores, but also in ads on buses (“Crack life open”) and on bar signs (“Detox while you retox,” reads one in Manhattan, promoting a Vita Coco Arnold Palmer cocktail). It has turned up on television, as a question on “Jeopardy,” and it regularly makes cameos in glossy magazines, clutched by hydrating celebrities.
The battle for this market, worth $400 million a year and growing, now involves big players like Pepsi and Coke. But in the beginning, it looked more like a street fight between two guys. One was then a 29-year-old college dropout who rolled to Manhattan bodegas at night, on in-line skates, carrying samples in a backpack. The other was a former Peace Corps volunteer, driving a beat-up Ford Econoline van and fighting for the same turf.
Michael Kirban, who with a buddy founded Vita Coco, and Mark Rampolla, who founded its archrival Zico, happened to start selling nearly identical brands, in the same neighborhoods of New York City, at almost the same time — a week or two apart, in late 2004.
Those in the fray called it the coconut water wars. Each side quickly bulked up with sales teams and tried to win over Manhattan, one grocery store and yoga studio at a time.
The fighting quickly got ugly. It included simple acts of retail vandalism, like tossing the competition’s signs in the garbage, as well as attempts at psychological point-scoring that could charitably be described as sophomoric. Mr. Kirban sometimes placed a container of Zico beside a sleeping vagabond, took a photograph and then emailed it to Mr. Rampolla. And on more than a few occasions, the Zico sales force showed up outside Vita Coco’s offices, then near Union Square, and handed out free Zico samples.
“It was guerrilla tactics,” recalls Mr. Rampolla, talking from his home in Redondo Beach, Calif. “And not legal because you’re supposed to have permits. But if you were quick enough, no one would hassle you.”
Coconut water went from local skirmish to beverage fame despite what might seem like a major impediment: its flavor. Anyone expecting the confectioner’s version of coconut — the one you find in coconut ice cream, for instance — may be repelled. This is the juice of a green coconut, and the taste is a mix of faintly sweet and a tad salty. Some have compared it to socks, sweat and soap. And that group includes people crucial to coconut water’s success.
“When I tried it, I didn’t get it,” says Lewis Hershkowitz, the president of Big Geyser, which distributes Zico in New York City. “I thought it was disgusting.”
For many, the challenging taste is part of the appeal. Some are so smitten with the flavor they have created online forums that sound like support groups.
A decade ago, companies like Goya sold coconut water in stores catering to immigrants, and in quantities that hardly registered in market research. Today, more than 200 brands around the world sell “nature’s own sports drink,” as fans call it, and sales are rising by double-digit figures.
“This will eventually be a $1 billion-a-year category,” says John Craven, founder and chief executive of BevNet, a trade publication. “It’s the real deal. It isn’t a new flavor of Coke. It’s not Bud Light Lime-A-Rita. This has staying power. People put it in their diet and it stays there.”
The titans of the industry are on board. In 2009, Coca-Cola bought a 20 percent stake in Zico, and in 2010 PepsiCo acquired a majority stake in the distant third-place contender, O.N.E. Last year, Coke purchased Zico outright.
Coke’s initial investment in Zico seemed like catastrophic news for Vita Coco, the only brand still controlled by its founders.
“I thought we were dead,” says Mr. Kirban of Vita Coco. “I didn’t tell anybody at the time, but I remember wondering, ‘How are we going to beat Coke?' ”
The answer would involve Madonna, Hula Hoops, a family-owned investment firm in Belgium and a former professional tennis player turned salesman named Goldy. Vita Coco now owns more than 60 percent of the coconut water market, while Zico has less than 20 percent, according to Euromonitor, a research company. Two weeks ago, Vita Coco agreed to sell a 25 percent stake in itself to Red Bull China, giving it a head start in the world’s most populous country and valuing the company at about $665 million.
How a tiny, privately held company outmaneuvered the biggest players in the world is material for a business school case study. And to tell the whole story, you need to start in 2003, at a bar on the Lower East Side of Manhattan. There, Mr. Kirban and his friend and future business partner, Ira Liran, spotted two Brazilian women.
by David Segal, NY Times | Read more:
Image: Serge Bloch
Sunday, July 27, 2014
Saturday, July 26, 2014
Lessons From America's War for the Greater Middle East
For well over 30 years now, the United States military has been intensively engaged in various quarters of the Islamic world. An end to that involvement is nowhere in sight.
Tick off the countries in that region that U.S. forces in recent decades have invaded, occupied, garrisoned, bombed or raided and where American soldiers have killed or been killed. Since 1980, they include Iraq and Afghanistan, of course. But also Iran, Lebanon, Libya, Turkey, Kuwait, Saudi Arabia, Qatar, Bahrain, the United Arab Emirates, Jordan, Bosnia, Kosovo, Yemen, Sudan, Somalia and Pakistan. The list goes on.
To judge by various official explanations coming out of Washington, the mission of the troops dispatched to these various quarters has been to defend or deter or liberate, punishing the wicked and protecting the innocent while spreading liberal values and generally keeping Americans safe.
What are we to make of the larger enterprise in which the U.S. forces have been engaged since well before today’s Notre Dame undergraduates were even born? What is the nature of the military struggle we are waging? What should we call it?
For several years after 9/11, Americans referred to it as the Global War on Terrorism, a misleading term that has since fallen out of favor.
For a brief period during the early years of the George W. Bush administration, certain neoconservatives promoted the term World War IV. This never caught on, however, in part because, unlike other major 20th century conflicts, it found the American people sitting on the sidelines.
With interventions in Iraq and Afghanistan dragging on inconclusively, some military officers began referring to what they called the Long War. While nicely capturing the temporal dimension of the conflict, this label had nothing to say about purpose, adversary or location. As with World War IV, the Long War never gained much traction.
Here’s another possibility. Since 1980, back when President Jimmy Carter promulgated the Carter Doctrine, the United States has been engaged in what we should rightfully call America’s War for the Greater Middle East. The premise underlying that war can be simply stated: with disorder, dysfunction and disarray in the Islamic world posing a growing threat to vital U.S. national security interests, the adroit application of hard power would enable the United States to check those tendencies and thereby preserve the American way of life.
Choose whatever term you like: police, pacify, shape, control, dominate, transform. In 1980, President Carter launched the United States on a project aimed at nothing less than determining the fate and future of the peoples inhabiting the arc of nations from the Maghreb and the Arabian Peninsula to the Persian Gulf and Central Asia.
Since the end of World War II, American soldiers had fought and died in Asia. Even when the wars in Korea and Vietnam ended, U.S. troop contingents continued to garrison the region. In Europe, a major U.S. military presence dating from the start of the Cold War signaled Washington’s willingness to fight there as well. Prior to Carter’s watershed 1980 statement, no comparable U.S. commitment toward the Islamic world existed. Now that was going to change.
Only in retrospect does this become clear, of course. At the time President Carter declared the Persian Gulf a vital national security interest — that was the literal meaning of the Carter Doctrine — he did not intend to embark upon a war. Nor did he anticipate what course that war was going to follow — its duration, costs and consequences. Like the European statesmen who a hundred years ago touched off the cataclysm we know today as World War I, Carter merely lit a fuse without knowing where it led. (...)
Neither Carter nor his advisers foresaw what awaited 10 or 20 years down the line. They were largely clueless as to what lay inside the Pandora’s box they insisted on opening. But what they and their successors in government found there prompted them to initiate a sequence of military actions, some large, some small, that deserve collective recognition as a war. That war continues down to the present day.
Look closely enough and the dots connect. Much as, say, the Berlin Airlift, the Korean War, the Cuban Missile Crisis and the invasion of Grenada (among many other events) all constitute episodes in what we call the Cold War, so, too, do seemingly disparate events such as the Beirut bombing of 1983, the “Black Hawk Down” debacle of 1993 and the Iraq invasion of 2003 (among many others) all form part of a single narrative. Acknowledging the existence of that narrative — seeing America’s War for the Greater Middle East whole — is a prerequisite to learning.
Let me state plainly my own overall assessment of that war. We have not won it. We are not winning it. And simply pressing on is unlikely to produce more positive results next year or the year after — hence, the imperative of absorbing the lessons this ongoing war has to teach. Learning offers a first step toward devising wiser, more effective and less costly policies.
The “10 theses” that follow constitute a preliminary effort to identify the most important of those lessons.
by Andrew Bacevich, Notre Dame Magazine | Read more:
Image: via:
Where Do Cocktail Prices Come From?
Unlike the people who drink them, not all cocktails are created equal. Or at least that's what their prices seem to indicate. The mixed drinks at one bar in one city might be double what they cost at a cocktail-conscious watering hole in another part of the country.
But it doesn't even take a supersonic bar-hop across America to observe this phenomenon. A house cocktail at New York City's Pouring Ribbons, an innovative establishment slinging impeccable drinks, will cost you $14. Not too far uptown, at the stately bar at the NoMad Hotel—where the drinks are similarly innovative and well executed—an original cocktail sells for $16. Then there's ZZ's Clam Bar, in Greenwich Village, where sipping on one of chief bartender Thomas Waugh's elegant liquid creations will set you back $20—or nearly 43 percent more than the cost of a drink at Pouring Ribbons.
Complicating things further, there are plenty of bars and restaurants that go out of their way, it would appear, to price their house cocktails consistently—say, all for $12 apiece—suggesting to a casual observer that, perhaps, all these drinks are of equal value.
I reached out to several managers of serious cocktail destinations in order to better understand what accounts for the broad swings in price we encounter from place to place as we ply the now-extensive craft-cocktail landscape, as well as why some cocktail menus are priced uniformly.
A cocktail by nature is a combination, in differing ratios, of a set of ingredients that each have costs, so many cocktail bars spend a lot of time and effort crunching the numbers behind their drinks. Setting prices for a cocktail-focused list can take a lot more work than menu-pricing might take at a wine or beer bar. That's certainly the impression I get from Jeffrey Morgenthaler, the bar manager at Clyde Common in Portland, Oregon. He approaches the pricing of his cocktail menu with a great deal of mathematical precision, coupled with a small dose of professional intuition.
In addition to bartending, Morgenthaler maintains a blog about his craft, and pricing strategy has been a recurring subject over the years. He's even released Microsoft Excel spreadsheets to his readers, many of whom are in the service industry, as instructional tools. The charts are basic versions of the ones he uses at Clyde Common to calculate pour cost and, by extension, sales prices for drinks.
Pour cost is pretty much what it sounds like: the cost a bar incurs by pouring a given cocktail. But pour cost is typically expressed as a percentage of the sale price of a drink rather than a raw number; so if it costs a bar $2 in goods to produce a drink that it sells for $10, the pour cost of that drink is 20 percent. "Some places need the pour cost to come in at 18 percent," Morgenthaler tells me, "others are fine with 25 percent. It all depends on the business operations." In other words, a bar might decide upon an acceptable range in which its pour costs must fall, given how other aspects of the business factor in, and then calculate the price of drinks based on that range. Between two drinks sold for the same price, the one with the higher pour cost earns the bar a smaller profit.
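To make that arithmetic concrete, here is a minimal sketch of the pour-cost math in a few lines of Python. It is an illustration only, assuming a simple ingredient-cost model; the function names and target percentages are mine, not Morgenthaler's actual spreadsheet formulas.

def pour_cost(ingredient_cost, sale_price):
    # Pour cost: cost of goods as a fraction of the drink's sale price.
    return ingredient_cost / sale_price

def price_for_target(ingredient_cost, target_pour_cost):
    # Invert the relationship: given a target pour cost, back out the menu price.
    return ingredient_cost / target_pour_cost

# The example above: $2 in goods sold for $10 is a 20 percent pour cost.
print(pour_cost(2.00, 10.00))                   # 0.2
# At an 18 percent target, the same $2 drink must sell for about $11.11;
# at a looser 25 percent target, $8 covers it.
print(round(price_for_target(2.00, 0.18), 2))   # 11.11
print(round(price_for_target(2.00, 0.25), 2))   # 8.0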

by Roger Kamholz, Serious Eats | Read more:
Image: Alice Gao
Israel Mows the Lawn
In 2004, a year before Israel’s unilateral disengagement from the Gaza Strip, Dov Weissglass, éminence grise to Ariel Sharon, explained the initiative’s purpose to an interviewer from Haaretz:
The significance of the disengagement plan is the freezing of the peace process … And when you freeze that process, you prevent the establishment of a Palestinian state, and you prevent a discussion on the refugees, the borders and Jerusalem. Effectively, this whole package called the Palestinian state, with all that it entails, has been removed indefinitely from our agenda. And all this with … a [US] presidential blessing and the ratification of both houses of Congress … The disengagement is actually formaldehyde. It supplies the amount of formaldehyde that is necessary so there will not be a political process with the Palestinians.

In 2006 Weissglass was just as frank about Israel’s policy towards Gaza’s 1.8 million inhabitants: ‘The idea is to put the Palestinians on a diet, but not to make them die of hunger.’ He was not speaking metaphorically: it later emerged that the Israeli defence ministry had conducted detailed research on how to translate his vision into reality, and arrived at a figure of 2279 calories per person per day – some 8 per cent less than a previous calculation because the research team had originally neglected to account for ‘culture and experience’ in determining nutritional ‘red lines’.

This wasn’t an academic exercise. After pursuing a policy of enforced integration between 1967 and the late 1980s, Israeli policy shifted towards separation during the 1987-93 uprising, and then fragmentation during the Oslo years. For the Gaza Strip, an area about the size of Greater Glasgow, these changes entailed a gradual severance from the outside world, with the movement of persons and goods into and out of the territory increasingly restricted.

The screws were turned tighter during the 2000-5 uprising, and in 2007 the Gaza Strip was effectively sealed shut. All exports were banned, and just 131 truckloads of foodstuffs and other essential products were permitted entry per day. Israel also strictly controlled which products could and could not be imported. Prohibited items have included A4 paper, chocolate, coriander, crayons, jam, pasta, shampoo, shoes and wheelchairs.

In 2010, commenting on this premeditated and systematic degradation of the humanity of an entire population, David Cameron characterised the Gaza Strip as a ‘prison camp’ and – for once – did not neuter this assessment by subordinating his criticism to proclamations about the jailers’ right of self-defence against their inmates.

It’s often claimed that Israel’s reason for escalating this punitive regime to a new level of severity was to cause the overthrow of Hamas after its 2007 seizure of power in Gaza. The claim doesn’t stand up to serious scrutiny. Removing Hamas from power has indeed been a policy objective for the US and the EU ever since the Islamist movement won the 2006 parliamentary elections, and their combined efforts to undermine it helped set the stage for the ensuing Palestinian schism.

Israel’s agenda has been different. Had it been determined to end Hamas rule it could easily have done so, particularly while Hamas was still consolidating its control over Gaza in 2007, and without necessarily reversing the 2005 disengagement. Instead, it saw the schism between Hamas and the Palestinian Authority as an opportunity to further its policies of separation and fragmentation, and to deflect growing international pressure for an end to an occupation that has lasted nearly half a century. Its massive assaults on the Gaza Strip in 2008-9 (Operation Cast Lead) and 2012 (Operation Pillar of Defence), as well as countless individual attacks between and since, were in this context exercises in what the Israeli military called ‘mowing the lawn’: weakening Hamas and enhancing Israel’s powers of deterrence. As the 2009 Goldstone Report and other investigations have demonstrated, often in excruciating detail, the grass consists overwhelmingly of non-combatant Palestinian civilians, indiscriminately targeted by Israel’s precision weaponry.
by Mouin Rabbani, LRB | Read more:
Weed Weddings
[ed. As far as weddings are concerned, I'm pretty sure the novelty will wear off soon enough. Can't say that about other aspects of our culture, though.]

She knew then that the wedding of her fellow Boulder residents would be just a little different from the ones she had attended in the past.
The Meisels and Melshenker nuptials looked as if their inspiration had come not from the pages of Martha Stewart Weddings but from High Times. All of the floral arrangements, including the bride’s bouquet, contained a variety of white flowers mixed with marijuana buds and leaves. Mr. Melshenker and his groomsmen wore boutonnieres crafted out of twine and marijuana buds, and Mr. Melshenker’s three dogs, who were also in attendance, wore collars made of cannabis buds, eucalyptus leaves and pink ribbons.
Before going into dinner, the guests were given a baby marijuana plant in a ceramic pot with their name and table assignment written on a card in green ink, in the kind of stylish script you might find on a container of artisanal goat cheese. The tables were named after different strains of marijuana, like Blue Dream, Sour Diesel and Skywalker (the groom’s favorite strain). Ms. Epstein, who was seated at Skywalker, said that everyone at her table, where the ages ranged from 40 to 70, passed around a device similar to an electronic cigarette — except that it contained hash oil instead of nicotine. “It didn’t feel weird or bizarre,” she said. “It kind of becomes a new cocktail.”
With the sale of marijuana for recreational use now legal in Colorado and Washington State, pot and its various paraphernalia are becoming visible at weddings in those states — as table favors for guests, like miniature vaporizers, or group activities, like a hookah lounge. (...)
Jake Rosenbarger of Kim & Jake’s Cakes in Boulder said he would not make a cannabis cake if asked. Marijuana ruins the flavor, he said, and it can even ruin a wedding. “It can divide a room as much as pull it together,” he said. “It creates a vibe of, ‘Are you in the cool kids club or not?’ ”
Penni Ervin, a wedding planner in Crested Butte, was aghast when asked if she was working on any weddings in which pot was involved. “We’re talking about highly professional people, and I just don’t see C.E.O.s getting stoned,” she said. “It’s a family event with grandma and grandpa,” she added, “and you don’t want them to get shocked.”
by Lois Smith Brady, NY Times | Read more:
Image: Alison Vagnini
Friday, July 25, 2014
Fellow Vegans, Please Stop Making Me Hate You
When I was young and self-hating, I used to not-really-jokingly tell people that I was a "queer writer vegan who hates other queer writer vegans." We can unpack the sadness of that statement at a later date; suffice it to say that I am a competitive attention-seeker, and when competitive attention-seekers are uncertain and immature, sometimes they blame others for their own insecurities rather than examining their own behaviors.
But I digress. Obviously, I no longer hate other queers or other writers. Duh. I do, however, still sometimes hate other vegans. At least, I hate the way some other vegans behave about the whole shebang.
Here's the thing. I've been a vegan for five years now, and I can say that there are a lot of facets of the lifestyle that I appreciate. For example, I like the fact that I eat a hell of a lot more fruits and vegetables than I did in my Dorito-and-Diet-Coke-reliant teenage years. I like that I am occasionally driven by sheer necessity to create new, exciting combinations of breakfast foods (such as peanut-butter-and-frozen-pea tacos, or peanut-butter-and-broccoli stir fry, or peanut-butter-and-one's-own-hand despair-pops). I like that a lot of the vegans I have met are chill folks willing to swap nutritional yeast recipes or let me steal a bite of their tofu breakfast burrito.
And, on the real, I like that I'm not creating any personal demand for factory-farmed milk and eggs. That's what drove me to becoming vegan in the first place, and while I don't talk about it much -- because, frankly, I wrestle with the ethics of avoiding chick-maceration while gorging on strawberries picked by exploited farmworkers -- it's still a pretty big part of why I avoid everything pushed on me by the Cheese Lobby.
However, these positives are far from universal. Veganism is, by nature, not for everyone. And the sooner everyone realizes that, the less inclined I will be to automatically make an Aggressively Placid Face at the next person to espouse the evils of honey at me.
Take what just happened in Detroit, for example. On Thursday, PETA, never known for being a font of rationality when it comes to animal rights, offered to pay the water bill for 10 city families who "pledged to go vegan for a month." Despite the fact that half of Detroit's residents are struggling to, say, flush their toilets or cook on the stove, PETA apparently took it upon itself to use a basic human necessity as leverage for "pledging" to forgo animal products.
Clearly, this is a moronic, unsustainable venture. I am not a resident of Detroit, but if a stranger approached me in my hour of desperation and told me to kill a man just to watch him die, I would 100 percent promise her that her target would be at the bottom of Lake Michigan within the hour. I wouldn't do it, of course, but so long as she was willing to fork over the moolah for utilities, I'd tell her whatever she wanted to hear.
Similarly, there's no indication that PETA will check in with these folks after supposedly ponying up cash for their needs. As of now, they seem content to throw a few pro-vegan pamphlets at families before jetting back to wherever animal rights executives go when they're not trying to raise a stink. In other words, this smells like a publicity stunt, and a half-assed one at best.
From an outside perspective, it appeared as if PETA wanted to cast itself as the wise savior who just needed to offer a tiny incentive -- i.e., water in your own home -- to spark the wonder of veganism within the hearts of Detroiters. In fact, it even graciously pointed out to its would-be beneficiaries that "by accepting our offer to go vegan, not only will families be getting an immediate financial boost and helping animals, if they stick with it, they’ll also lower their risk of obesity, heart disease, cancer, diabetes and strokes." Shut up, PETA. (...)
Again, most vegans I know do not behave this poorly to such a large degree. Many, in fact, understand that food is a personal experience, and that it's unacceptable to shame others for listening to their own bodies, putting their needs ahead of what they perceive to be important, or just frankly not really caring what they place in their face. But I think we've all known vegans who refuse to empathize with other humans in favor of empathizing with farm animals -- and that is no way to create social or environmental change in the long run. For one thing, that's a dickish way to behave, period. For another, it's not going to shift anyone's eating habits, except maybe in the opposite direction out of spite.
by Kate Conway, XOJane | Read more:
Image: uncredited
Thursday, July 24, 2014
Guy Walks Into a Bar
So the guy asks the bartender, “Where’d he come from?”
And the bartender’s, like, “There’s a genie in the men’s room who grants wishes.”
So the guy runs into the men’s room and, sure enough, there’s this genie. And the genie’s, like, “Your wish is my command.” So the guy’s, like, “O.K., I wish for world peace.” And there’s this big cloud of smoke—and then the room fills up with geese.
So the guy walks out of the men’s room and he’s, like, “Hey, bartender, I think your genie might be hard of hearing.”
And the bartender’s, like, “No kidding. You think I wished for a twelve-inch pianist?”
So the guy processes this. And he’s, like, “Does that mean you wished for a twelve-inch penis?”
And the bartender’s, like, “Yeah. Why, what did you wish for?”
And the guy’s, like, “World peace.”
So the bartender is understandably ashamed.
And the guy orders a beer, like everything is normal, but it’s obvious that something has changed between him and the bartender.
And the bartender’s, like, “I feel like I should explain myself further.”
And the guy’s, like, “You don’t have to.”
But the bartender continues, in a hushed tone. And he’s, like, “I have what’s known as penile dysmorphic disorder. Basically, what that means is I fixate on my size. It’s not that I’m small down there. I’m actually within the normal range. Whenever I see it, though, I feel inadequate.”
And the guy feels sorry for him. So he’s, like, “Where do you think that comes from?”
And the bartender’s, like, “I don’t know. My dad and I had a tense relationship. He used to cheat on my mom, and I knew it was going on, but I didn’t tell her. I think it’s wrapped up in that somehow.”
And the guy’s, like, “Have you ever seen anyone about this?”
And the bartender’s, like, “Oh, yeah, I started seeing a therapist four years ago. But she says we’ve barely scratched the surface.”
by Simon Rich, New Yorker | Read more:
Image: Yann Kebbi
Inside Sun Noodle, the Secret Weapon of America's Best Ramen Shops
Now, ramen shops have proliferated in cities from Los Angeles and New York to DC, Chicago, and even Milwaukee. People stand in line for ramen. Chefs create mash-ups of ramen and hamburgers, and people stand in line for those, too.
Behind the scenes of the so-called ramen boom of recent years is Sun Noodle. Over the last 33 years, the Hawaiian company has built three factories which pump out a combined 90,000 servings of ramen noodles per day. It sells these noodles to notable ramenya across America, including nine of New York Times critic Pete Wells' picks for the top 10 ramen destinations in New York. Ivan Orkin, one of Japan's most respected ramen chefs, says that Sun Noodle was the clear choice when he recently opened two restaurants in New York City. And Momofuku's David Chang, who is often credited with the rise of ramen in America, believes that Sun Noodle facilitated that boom. "It's an entire micro-industry they've created," he says. (...)
Sun Noodle Begins
A trip to Hawaii was a once-in-a-lifetime opportunity for 19-year-old Hidehito Uki. He was working for a noodle factory in the Japanese countryside when he got the call from his father, who operated another noodle company named Unoki in Japan. His father's business partner had pulled out of their project in Hawaii just before it opened. The project was dead, but a noodle-making machine remained on-site. Did Hidehito want it?
Hidehito arrived in Honolulu in 1981. He didn't speak any English, and he didn't know anything about the Hawaiian noodle market. All he knew was that people in Hawaii were interested in noodles, particularly the local variety called saimin, a native Hawaiian noodle soup that is similar to ramen but made with egg noodles and topped with things like Spam. Saimin dates back to the islands' plantation history, and was such a locally beloved comfort food that McDonald's already offered saimin on its menus in Hawaii by the time Hidehito arrived.
There were about 20 noodle manufacturers on the island of Oahu at the time, mostly churning out saimin noodles. There were a few ramen shops and plenty of instant ramen available, but Hidehito didn't find much in the way of fresh ramen as he launched Sun Noodle. The quality of the flour wasn't very good either. "I was so surprised, and I wondered if I could have a successful business in Hawaii," Hidehito says.
Getting that first customer did turn out to be a challenge. Hidehito's strategy was to bring samples to potential clients who didn't really understand what he was offering after years of working with instant noodles. They didn't want to eat Hidehito's noodles with their unfamiliarly firm texture, a result of the alkalinity that is key to fresh ramen noodles. He would listen to their feedback, return to his factory, and make the noodles again. Hidehito went back and forth about 15 times with Ezogiku, a small Japanese ramen shop that had opened its first international location in Hawaii seven years earlier. The owners were impressed, and Ezogiku became Sun Noodle's first customer. More customers came. (...)
Sun Noodle has a reputation for working with chefs to create a noodle that best complements their broth recipe. At the New Jersey factory, there are 40 recipes for dough on the master sheet. Each of these can be cut differently — wavy, straight, thick, thin — meaning that there are altogether about 120 types of ramen noodle produced on just one assembly line in the 10,000-square-foot factory. "Can you imagine a bakery that makes 75 kinds of bread, 80 kinds of bread?" Orkin asks.
And Sun Noodle is obsessive about the quality of each of these 120 types of ramen noodles. Every detail matters, starting with the flour. Sun Noodle uses eight different types of flour from suppliers in Canada, Australia, and America, in various combinations. The flour is tempered for at least eight hours at a temperature between 62 and 67 degrees. The factory filters water on a reverse osmosis machine, and constantly measures the humidity of the factory to adjust the water levels correspondingly. Sun Noodle also adds kansui, a mix of sodium carbonate and potassium carbonate, to the water in order to reproduce the alkalinity of Japanese water that makes ramen noodles firm and springy.
Failing the Third Machine Age
[ed. See also: And So It Begins.]
A cheerily written op-ed in the New York Times proclaims: “It’s time for robot caregivers”.
Why? The piece argues that we have many elderly people who need care, and children too, especially those with disabilities, but not enough caregivers.
Call in the machines, she says:
"We do not have anywhere near enough human caregivers for the growing number of older Americans."

This is how to fail the third machine age.

This is not just an inhuman policy perspective, it's economically destructive and rests on accepting current economic policies and realities as if they were immutable.
Let me explain. When people confidently announce that once robots come for our jobs, we'll find something else to do like we always did, they are drawing from a very short history. The truth is, there have only been one and three-quarters machine ages: we are close to concluding the second one and moving into the third.
And there is probably no fourth one.
Humans have only so many “irreplaceable” skills, and the idea that we’ll just keep outrunning the machines, skill-wise, is a folly. (...)
But wait, you say, there’s a next set of skills, surely?
That has been the historical argument: sure, robots may replace us, but humans have always found a place to go.
As I recounted, there are really only one and maybe two-thirds examples of such shifts so far, so forgive me if I find such induction unconvincing. Manual labor (one), mental labor (still happening), and now mental skills are being replaced; we are retreating, partially, into emotional labor—i.e. care-giving.
And now machines, we are told, are coming for care-giving.
We are told that this is because there aren't enough humans?
Let’s just start with the obvious: Nonsense.
Of course we have enough human caregivers for the elderly. The country—and the world—is awash in underemployment and unemployment, and many people find caregiving to be a fulfilling and desirable profession. The only problem is that we—as a society—don’t want to pay caregivers well and don’t value their labor. Slightly redistributive policies that would slightly decrease the existing concentration of wealth to provide subsidies for childcare or elder care are, unfortunately, deemed untouchable goals by political parties beholden to a narrow slice of society.
Remember: whenever you hear there’s a shortage of humans (or food), it is almost always a code for shortage of money. (Modern famines are also almost always a shortage of money, not food). Modern shortages of “labor” are almost always a shortage of willingness to pay well, or a desire to avoid hiring the “wrong” kind of people. (...)
Next, consider that emotional labor is all that’s left for human workers to escape to after manual and mental labor have already been mostly taken over by machines.
(Creative labor is sometimes cited as another alternative but I am discounting this since it is already discounted—it is very difficult, already, to make a living through creative labor, and it’s getting harder and not easier. But that’s another post).
The US Bureau of Labor Statistics projects the following jobs as the ones with the largest growth in the next decade: personal care aides, registered nurses, retail salespersons, home health aides, fast-food workers, nursing assistants, secretaries, customer service representatives, janitors…
It’s those face-to-face professions, ones in which being in contact with another human being is important, that are growing in numbers—almost every other profession is shrinking, numerically.
(No, there won’t be a shortage of engineers and programmers either—engineers and programmers, better than anyone, should know that machine intelligence is coming for them fairly soon, and will move up the value chain pretty quickly. Also, much of this “shortage”, too, is about controlling workers and not paying them—note how Silicon Valley colluded to not pay its engineers too much, even as the companies in question had hoarded billions in cash. In a true shortage under market conditions, companies would pay more for that which was scarce.)
Many of these jobs the BLS says will grow, however, are only there by the grace of the generation that still wants to see a cashier while checking out—and besides, they are low-paid jobs. Automation plus natural language processing by machines is going to obliterate those jobs in the next decade or two. (Is anyone ready for the even worse labor crisis that will ensue?) Machines will take your order at the fast-food joint, they will check out your groceries without having to scan them, and it will become even harder to get a human on the customer service line.
What’s left as jobs are those transactions in which the presence of the human is something more than a smiling face that takes your order and enters it into another machine—the cashier and the travel agent have already been replaced by us in the “self-serve” economy.
What’s left is deep emotional labor: taking care of each other.
by Zeynep Tufekci, Medium | Read more:
A cheerily written op-ed in the New York Times proclaims: “It’s time for robot caregivers”.
Why? We have many elderly people who need care, and children—especially those with disabilities—the piece argues, and not enough caregivers.
Call in the machines, she says:
“We do not have anywhere near enough human caregivers for the growing number of older Americans.”This how to fail the third machine age.
This is not just an inhuman policy perspective, it’s economically destructive and rests on accepting current economic policies and realities as if they were immutable.

And there is probably no fourth one.
Humans have only so many “irreplaceable” skills, and the idea that we’ll just keep outrunning the machines, skill-wise, is a folly. (...)
But wait, you say, there’s a next set of skills, surely?
That has been the historical argument: sure, robots may replace us, but humans have always found a place to go.
As I recounted, there are really only one and a maybe two thirds examples of such shifts, so far, so forgive me if I find such induction unconvincing. Manual labor (one), mental labor (still happening) and now mental skills are getting replaced, we are retreating, partially into emotional labor—i.e. care-giving.
And now machines, we are told, are coming for care-giving.
We are told that this is because there aren't enough humans?
Let’s just start with the obvious: Nonsense.
Of course we have enough human caregivers for the elderly. The country –and the world— is awash in underemployment and unemployment, and many people find caregiving to be a fulfilling and desirable profession. The only problem is that we –as a society— don’t want to pay caregivers well and don’t value their labor. Slightly redistributive policies that would slightly decrease the existing concentration of wealth to provide subsidies for childcare or elder care are, unfortunately, deemed untouchable goals by political parties beholden to a narrow slice of society.
Remember: whenever you hear there’s a shortage of humans (or food), it is almost always a code for shortage of money. (Modern famines are also almost always a shortage of money, not food). Modern shortages of “labor” are almost always a shortage of willingness to pay well, or a desire to avoid hiring the “wrong” kind of people. (...)
Next, consider that emotional labor is all that’s left to escape to as humans workers after manual and mental labor have been already been mostly taken over by machines.
(Creative labor is sometimes cited as another alternative but I am discounting this since it is already discounted—it is very difficult, already, to make a living through creative labor, and it’s getting harder and not easier. But that’s another post).
US Bureau of Labor Statistics projects the following jobs as the ones with the largest growth in the next decade: Personal care aides, registered nurses, retail salespersons, home health aides, fast-food, nursing assistants, secretaries, customer service representatives, janitors…
It’s those face-to-face professions, ones in which being in contact with another human being are important, that are growing in numbers—almost every other profession is shrinking, numerically.
(No, there won’t be a shortage of engineers and programmers either—engineers and programmers, better than anyone, should know that machine intelligence is coming for them fairly soon, and will move up the value chain pretty quickly. Also, much of this “shortage,” too, is about controlling workers and not paying them—note how Silicon Valley colluded not to pay its engineers too much, even as the companies in question hoarded billions in cash. In a true shortage under market conditions, companies would pay more for that which was scarce.)
Many of the jobs the BLS says will grow, however, are only there by the grace of a generation that still wants to see a cashier while checking out—and besides, they are low-paid jobs. Automation plus natural language processing by machines is going to obliterate those jobs in the next decade or two. (Is anyone ready for the even worse labor crisis that will ensue?) Machines will take your order at the fast-food joint, they will check out your groceries without having to scan them, and it will become even harder to get a human on the customer service line.
What’s left as jobs is those transactions in which the presence of the human is something more than a smiling face that takes your order and enters into another machine—the cashier and the travel agent that has now been replaced by us, in the “self-serve” economy.
What’s left is deep emotional labor: taking care of each other.
by Zeynep Tufekci, Medium | Read more:
Image: Kyodo via AP Images
Wednesday, July 23, 2014
Arctic Man
Wild rides and crazed nights at America's most extreme ski race.
It's April in Alaska so the traffic on the Glenn Highway can't be blamed on either winter snow or summer tourists. The line of yellowing motorhomes, bulbous camper trailers, jacked-up pickups and shopworn Subarus inching out of Wasilla onto the hairpins and steep climbs of the Glenn is, as the bumper stickers say, "Alaska Grown," the annual migration of the state's Sledneck population to Arctic Man. Once clear of the sprawl of Wasilla, the signs along the way read like pages flying back on a calendar, flipping past the state's prospector and homestead era — "Jackass Creek," "Frost Heave," "Eureka" — to the Native names, from long before there was English to write them down: "Matanuska," "Chickaloon," "Tazlina." Then there's the highway itself, named for Edwin Glenn, a Spanish-American War vet and Army officer who was the first American soldier ever court-martialed for waterboarding. But earlier in his career, in the late 1890s, Glenn led two expeditions into this wilderness.
Maybe that's the lesson: If you put your name in the ground up here, it stays. Your life outside the state is your own concern.
After the Glenn, you head up past Gulkana — Athabascan for "winding river" — and then a final rush out onto the frozen moonscape of Summit Lake, where the peaks of the Alaska Range fill the horizon, all the way to mighty Denali, which might be the best example of Alaskan identity: William McKinley may have been president, but he never set foot in Alaska, so most Alaskans call the nation's largest mountain by its Native name, Denali.
You turn off the highway, down a road piled with eight feet of snow on both sides. This is Camp Isabel, once the single biggest work camp along the Trans-Alaska Pipeline, now a forgotten gravel airstrip at the base of the Hoodoo Mountains. Perhaps 1,000 motorhomes, RVs and trailers are already here, strewn like fallen Jenga pieces inside the frozen walls. Snowmachines buzz past your doors, above your head on the snow banks and over the distant peaks like swarming gnats. The temperature is way below freezing, but the air still carries the smell of gasoline, grilled meat and alcohol. A four-wheeler rumbles past pulling a big sled and on the big sled is a couch, a so-called Alaskan Rickshaw. Four people are riding, holding drinks. One of them is wearing a full wolf pelt, snout, eyes, ears and all. He nods and tips his cup "Hello."
Arctic Man is a weeklong, booze- and fossil-fueled Sledneck Revival bookended around the world's craziest ski race. Both the festival and the race at its heart have been firing off every year in these mountains for more than half as long as Alaska has been a state. Over the course of a week, something like 10,000 partiers and their snowmachines disgorge onto Camp Isabel's 300-acre pad to drink, grill, fight, drink and, at least while the sun is out, blast their sleds through the ear-deep powder in the surrounding hills one last time before it all melts away. Then on Friday morning, anyone not hopelessly hungover or already drunk by noon swarms up the valley south of camp to watch the damnedest ski race on earth.
by Matt White, SBNation | Read more:
Image: Brian Montalbo