Wednesday, December 31, 2014
Hawaiian Dreamers
Image: Alexander Tikhomirov
The Wreck of the Kulluk
On the morning of Dec. 21, as the Aiviq and Kulluk crews prepared to depart on the three-week journey back to Seattle, Shell’s new warranty surveyor took careful note of the certificates for the steel shackles that would connect the two vessels and bear the dead weight of the giant Kulluk — without noticing that they had been replaced with theoretically sturdier shackles of unknown origin. He examined the shackles but had little means by which to test their strength. He did not ask the crew to rotate the shackles, as one might rotate tires on a car. He did not consider it part of his job, he would tell Coast Guard investigators, to examine whether Shell’s overall plan to cross the Gulf of Alaska made any sense.
The Aiviq pulled the Kulluk away just after lunchtime. The Aiviq’s usual captain was on vacation for the holidays, as were Slaiby and other members of Shell’s senior Alaska staff. The rest of the tug’s crew had limited experience in the operation of the new, complex vessel. Onboard the Kulluk was a skeleton crew of 18 men, along for the ride in large part because of an inconvenience of the rig’s flag of convenience. The Kulluk was registered in the Marshall Islands, the same country where BP’s contractors had registered the Deepwater Horizon, and the Marshall Islands required that the rig be manned, even when under tow. The new captain of the Aiviq, Jon Skoglund, proposed aiming straight for the Seattle area, a direct and faster “great circle” route that would leave the two ships alone in the middle of the North Pacific but would avoid the shoals, rocks and waves of the Alaskan coast. There would be no shore to crash into. They could lengthen the towline for better shock absorption and would have room to move if problems arose. The idea was dropped because the Kulluk’s 18 crew members would then be out of range of Coast Guard search-and-rescue helicopters. Perversely, a flag of convenience, seen as a way to avoid government regulation, had Shell seeking a government safety net — and a longer, more dangerous, near-shore route.
Skoglund was so concerned as he began his first Gulf of Alaska tow that he sent an email to the chief Kulluk mariner on the other side of the towline. “To be blunt,” he wrote, “I believe that this length of tow, at this time of year, in this location, with our current routing guarantees an ass kicking.”(...)
By the early, dark hours of Dec. 29, the unified command was convinced that lives were at risk, and it dispatched two Coast Guard Jayhawk helicopters to evacuate the 18 increasingly anxious men stuck aboard the Kulluk.
The rescuers had thought it would be a relatively easy job. The Coast Guard air base on Kodiak Island, the northernmost such facility in the country, is an outpost at the edge of a wilderness of water and mountains. Its search-and-rescue area spans four million square miles. To have a case “so close to home plate,” as one of them put it — 45 flight minutes and roughly a hundred miles away — seemed a stroke of incredible luck. Further, the Kulluk’s size suggested stability, a straightforward rescue.
When it appeared out of the night, “the Kulluk was wicked lit up,” said Jason Bunch, the Coast Guard rescue swimmer on the first of two helicopters. “It was like a city.” The pilots illuminated the rig further with hover lights and spotlights. They wore night-vision goggles, just in case the lights weren’t enough. They had no problems seeing the rig and the crippled Aiviq and the taut line between them, but the surrounding darkness still eroded the pilots’ depth perception. With no horizon to reference, it was harder to hover, and the goggles took away their peripheral vision.
There are two primary ways Jayhawk crews save people at sea: They lower a basket to the deck of the stricken vessel and hoist them up, or they pluck them out of the water by lowering a basket and a swimmer like Bunch, a 12-year Kodiak veteran, who in 50-foot swells and 35-degree water will drag them to the basket himself.
But now that the Kulluk was being towed from its “stern,” the point of attachment for the emergency towline, it and the Aiviq were oriented exactly the wrong way for an approach. In order to maintain control, the helicopters would ideally face the heavy winds head-on, but the derrick blocked their path. If they tried to hover above the flight deck, the only obvious place for a hoist, there was a perfect tail wind, which made steering unpredictable and dangerous.
The wind blew the tops off some of the massive swells, but otherwise they weren’t breaking. The swells came in close sets, one right after another, interrupted by long sets of “monster waves” more than twice as high. The Kulluk tipped severely but somewhat rhythmically until the monsters arrived. “As the bigger and bigger ones came, they made it go around in a circle,” Bunch told me in Kodiak. His eyes went buggy. His right hand gyrated wildly. The deck, normally 70 feet above the water, “was dipping so deep that the water was surging on it.”
“You know when you have a bobber on a fishing pole,” he asked, “and then you throw it out there and reel in really fast, and it makes a wake over the bobber? That’s what it looked like.” If the Kulluk’s deck was pitching that badly, he said, you could imagine what the derrick was doing. It looked as if it were trying to bat the helicopters out of the sky.
The two helicopters took turns circling the drill rig, looking for a way in, for any place to lower the basket. They radioed back and forth with the Kulluk’s crew — “Very, very professional, very squared away,” Bunch said, “considering the environment” — and briefly wondered if there was a way to get them in the water to be picked up by the swimmers. It was impossible. The jump from the deck to the sea could kill a man if he timed it wrong. If it didn’t, the Kulluk could kill him on the next swing of the bat. They flew back to Kodiak, refueled, brainstormed some more at the base, then went back out.
Six hours had passed from the moment Bunch and his crew first flew over the Kulluk. It was still night. Bunch took out his watch and began timing the gaps between big sets of swells. “We started coming up with some harebrained schemes,” he said. They looked for “maybe doable” spots for the hoist. “We’d have 90 seconds to be in and out, which is just impossible, but we were actually talking about it.” Reality — what their commander called “the courage not to do it” — slowly set in. The two helicopters returned to base. “We were experienced,” Bunch said, “so eventually we were like, ‘This is stupid.’”
by McKenzie Funk, NY Times | Read more:
Image: James Mason, Toni Greaves/Getty Images
Bye Bae
The International House of Pancakes set itself apart among chain restaurants this September when it tweeted, “Pancakes. Errybody got time fo’ dat.” But the American starch dispensary—whose claims to internationality include a middling presence in Canada, four stores in the Middle East, and a menu disconcertingly inclusive of burritos, spaghetti, and the word French—failed to distinguish itself the next month with its tweet “Pancakes bae <3.”
At that point, the term bae had already been used by the official social-media accounts of Olive Garden, Jamba Juice, Pizza Hut, Whole Foods, Mountain Dew, AT&T, Wal-Mart, Burger King and, not surprisingly, the notoriously idiosyncratic Internet personas of Arby’s and Denny’s. Each time, the word was delivered with magnificently forceful offhandedness, the calculated ease of the doll that comes to life and tries to pass herself off as a real girl but fails to fully conceal the hinges in her knees. (“What hinges? Oh, these?”)
This bae trendspotting is courtesy of a newly minted Twitter account called Brands Saying Bae, which tweeted its first on December 27. Yesterday morning it had 7,000 followers, and by evening it had doubled to 14,000. That is the sort of audience engagement and growth that corporate accounts almost never see, despite their best attempts at hipness through dubious cultural appropriation. Brands Saying Bae is reminding people, rather, that advertising—of which social-media accounts for businesses are a part—seeks out that authenticity, twists it out of shape, and turns culture against people. Our brains are cannily adapted to sense inauthenticity and come to hate what is force-fed. So it is with a heavy heart that we mourn this year the loss of bae, inevitable as it was.
Bae was generally adored as a word in 2014, even finding itself among the runners-up for the Oxford Dictionaries’ Word of the Year. (Along with normcore and slacktivism, though all would eventually suffer a disappointing loss at the hands of the uninspired vape.) Oxford’s blog loosely defined bae as a “term of endearment for one’s romantic partner” common among teenagers, with “origins in African-American English,” perpetuated widely on social media and in music, particularly hip-hop and R&B. The lyrical database Rap Genius actually traces bae back as far as 2005. But after nearly a decade of subcultural percolation, 2014 was the year that bae went fully mainstream. (...)
In the case of bae, Urban Dictionary entries date back years and have been very widely read. One user on the site defined it as “baby, boo, sweetie” in December of 2008, pegging its usage to Western Florida. Even before that, in August of 2006, a user defined it as “a lover or significant other”—though in the ensuing years that definition has garnered equal shares of up-votes and down-votes, with an impressive 11,000 of each. It’s impossible to parse how many of those readers disagree with the particulars of the definition, and how many are simply expressing distaste for the word.
Video blogger William Haynes, who would be among the down-votes, made an adamant case in his popular YouTube series in August that “unknown to the general populace, bae is actually an acronym.” So it would technically be BAE. And according to Haynes, it means Before Anyone Else. That theory has mild support on Urban Dictionary, though it first appeared long after the initial definitions.
Katy Steinmetz in Time aptly mentioned another, more likely origin story earlier this year—one that also accounts for the uncommon a-e pairing—that bae is simply a shortened version of babe (or baby, or beau). “Slangsters do love to embrace the dropped-letter versions of words,” she wrote, noting that in some circles cool has become coo, crazy cray, et cetera. (...)
Now the ordinary people on the Internet appropriating bae are the people who run the social-media accounts for commercial brands. That all of this might be affecting linguistic patterns in a broader way is interesting. The commercial appropriation of a word signals the end of its hipness in any case, but as Kwame Opam at The Verge called it, “appropriation of urban youth culture” can banish a term to a particularly bleached sphere of irrelevance.
The most egregious usage involves the lack of any joke, or even logic. In August, Pizza Hut tweeted “Bacon Stuffed Crust. Bae-con Stuffed Crust.” What the Brands Saying Bae Twitter account has highlighted is the absurdity of that gimmick, which is the same one employed in a sitcom where an elderly woman says something sexual, and then, cue the laugh track. The humor is ostensibly to come from the juxtaposition of the source and the nature of the diction. Brands aren’t supposed to talk like that. Whaaaat? It’s the same tired device that killed OMG and basic. (IHOP also tweeted, in June, “Pancakes or you’re basic.”) Laughing. Out. Loud.
by James Hamblin, The Atlantic | Read more:
Image: Paul Michael Hughes/Shutterstock
Tuesday, December 30, 2014
Are Some Diets “Mass Murder”?
[T]he Inuit, the Masai, and the Samburu people of Kenya all originally ate diets that were 60-80% fat and yet were not obese and did not have hypertension or heart disease.
The hypothesis that saturated fat is the main dietary cause of cardiovascular disease is strongly associated with one man, Ancel Benjamin Keys, a biologist at the University of Minnesota. […] Keys launched his “diet-heart hypothesis” at a meeting in New York in 1952, when the United States was at the peak of its epidemic of heart disease, with his study showing a close correlation between deaths from heart disease and proportion of fat in the diet in men in six countries (Japan, Italy, England and Wales, Australia, Canada, and the United States). Keys studied few men and did not have a reliable way of measuring diets, and in the case of the Japanese and Italians he studied them soon after the second world war, when there were food shortages. Keys could have gathered data from many more countries and people (women as well as men) and used more careful methods, but, suggests Teicholz, he found what he wanted to find. […]
At a World Health Organization meeting in 1955 Keys’s hypothesis was met with great criticism, but in response he designed the highly influential Seven Countries Study, which was published in 1970 and showed a strong correlation between saturated fat (Keys had moved on from fat to saturated fat) and deaths from heart disease. Keys did not select countries (such as France, Germany, or Switzerland) where the correlation did not seem so neat, and in Crete and Corfu he studied only nine men. […]
[T]he fat hypothesis led to a massive change in the US and subsequently international diet. One congressional staffer, Nick Mottern, wrote a report recommending that fat be reduced from 40% to 30% of energy intake, saturated fat capped at 10%, and carbohydrate increased to 55-60%. These recommendations went through to Dietary Guidelines for Americans, which were published for the first time in 1980. (Interestingly, a recommendation from Mottern that sugar be reduced disappeared along the way.)
It might be expected that the powerful US meat and dairy lobbies would oppose these guidelines, and they did, but they couldn’t counter the big food manufacturers such as General Foods, Quaker Oats, Heinz, the National Biscuit Company, and the Corn Products Refining Corporation, which were both more powerful and more subtle. In 1941 they set up the Nutrition Foundation, which formed links with scientists and funded conferences and research before there was public funding for nutrition research. […]
Saturated fats such as lard, butter, and suet, which are solid at room temperature, had for centuries been used for making biscuits, pastries, and much else, but when saturated fat became unacceptable a substitute had to be found. The substitute was trans fats, and since the 1980s these fats, which are not found naturally except in some ruminants, have been widely used and are now found throughout our bodies. There were doubts about trans fats from the very beginning, but Teicholz shows how the food companies were highly effective in countering any research that raised the risks of trans fats. […]
Another consequence of the fat hypothesis is that around the world diets have come to include much more carbohydrate, including sugar and high fructose corn syrup, which is cheap, extremely sweet, and “a calorie source but not a nutrient.” More and more scientists believe that it is the surfeit of refined carbohydrates that is driving the global pandemic of obesity, diabetes, and non-communicable diseases.
by Richard Smith, BMJ | Read more:
Beyond Meat
I dumped meat a few weeks ago, and it was not an easy breakup. Some of my most treasured moments have involved a deck, a beer, and a cheeseburger. But the more I learned, the more I understood that the relationship wasn’t good for either of us. A few things you should never do if you want to eat factory meat in unconflicted bliss: write a story on water scarcity in the American Southwest; Google “How much shit is in my hamburger?”; watch an undercover video of a slaughterhouse in action; and read the 2009 Worldwatch Institute report “Livestock and Climate Change.”
I did them all. And that was that. By then I knew that with every burger I consumed, I was helping to suck America’s rivers dry, munching on a fecal casserole seasoned liberally with E. coli, passively condoning an orgy of torture that would make Hannibal Lecter blanch, and accelerating global warming as surely as if I’d plowed my Hummer into a solar installation. We all needed to kick the meat habit, starting with me.
Yet previous attempts had collapsed in the face of time-sucking whole-food preparation and cardboard-scented tofu products. All the veggie burgers I knew of seemed to come in two flavors of unappealing: the brown-rice, high-carb, nap-inducing mush bomb, and the colon-wrecking gluten chew puck. Soylent? In your pasty dreams. If I couldn’t have meat, I needed something damn close. A high-performance, low-commitment protein recharge, good with Budweiser.
I took long, moody walks on the dirt roads near my Vermont house. I passed my neighbor’s farm. One of his beef cattle stepped up to the fence and gazed at me. My eyes traced his well-marbled flanks and meaty chest. I stared into those bottomless brown eyes. “I can’t quit you,” I whispered to him.
But I did. Not because my willpower suddenly rose beyond its default Lebowski setting, but because a box arrived at my door and made it easy.
Inside were four quarter-pound brown patties. I tossed one on the grill. It hit with a satisfying sizzle. Gobbets of lovely fat began to bubble out. A beefy smell filled the air. I browned a bun. Popped a pilsner. Mustard, ketchup, pickle, onions. I threw it all together with some chips on the side and took a bite. I chewed. I thought. I chewed some more. And then I began to get excited about the future.
It was called the Beast Burger, and it came from a Southern California company called Beyond Meat, located a few blocks from the ocean. At that point, the Beast was still a secret, known only by its code name: the Manhattan Beach Project. I’d had to beg Ethan Brown, the company’s 43-year-old CEO, to send me a sample.
And it was vegan. “More protein than beef,” Brown told me when I rang him up after tasting it. “More omegas than salmon. More calcium than milk. More antioxidants than blueberries. Plus muscle-recovery aids. It’s the ultimate performance burger.”
“How do you make it so meat-like?” I asked.
“It is meat,” he replied enigmatically. “Come on out. We’ll show you our steer.”
Beyond Meat HQ was a brick warehouse located a stone’s throw from Chevron’s massive El Segundo refinery, which hiccuped gray fumes into the clear California sky. “Old economy, new economy,” Brown said as we stepped inside. Two dozen wholesome millennials tapped away at laptops on temporary tables in the open space, which looked remarkably like a set that had been thrown together that morning for a movie about startups. Bikes and surfboards leaned in the corners. In the test kitchen, the Beyond Meat chef, Dave Anderson—former celebrity chef to the stars and cofounder of vegan-mayo company Hampton Creek—was frying experimental burgers made of beans, quinoa, and cryptic green things.
The “steer” was the only one with its own space. It glinted, steely and unfeeling, in the corner of the lab. It was a twin-screw extruder, the food-industry workhorse that churns out all the pastas and PowerBars of the world. Beyond Meat’s main extruders, as well as its 60 other employees, labor quietly in Missouri, producing the company’s current generation of meat substitutes, but this was the R&D steer. To make a Beast Burger, powdered pea protein, water, sunflower oil, and various nutrients and natural flavors go into a mixer at one end, are cooked and pressurized, get extruded out the back, and are then shaped into patties ready to be reheated on consumers’ grills.
“It’s about the dimensions of a large steer, right?” Brown said to me as we admired it. “And it does the same thing.” By which he meant that plant stuff goes in one end, gets pulled apart, and is then reassembled into fibrous bundles of protein. A steer does this to build muscle. The extruder in the Beyond Meat lab does it to make meat. Not meat-like substances, Brown will tell you. Meat. Meat from plants. Because what is meat but a tasty, toothy hunk of protein? Do we really need animals to assemble it for us, or have we reached a stage of enlightenment where we can build machines to do the dirty work for us?
Livestock, in fact, are horribly inefficient at making meat. Only about 3 percent of the plant matter that goes into a steer winds up as muscle. The rest gets burned for energy, ejected as methane, blown off as excess heat, shot out the back of the beast, or repurposed into non-meat-like things such as blood, bone, and brains. The process buries river systems in manure and requires an absurd amount of land. Roughly three-fifths of all farmland is used to grow beef, although it accounts for just 5 percent of our protein. But we love meat, and with the developing world lining up at the table and sharpening their steak knives, global protein consumption is expected to double by 2050.
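The scale of that gap is easier to feel with a quick back-of-the-envelope calculation, sketched below in Python. The 3 percent feed-to-muscle figure is the article's; the 90 percent extruder yield is an invented illustrative assumption, not a number from Beyond Meat.

```python
# Back-of-the-envelope sketch of the feed-to-protein arithmetic above.
# The ~3% feed-to-muscle figure comes from the article; the extruder
# yield is a hypothetical illustration, not a Beyond Meat datum.

FEED_TO_MUSCLE = 0.03   # fraction of plant matter a steer turns into muscle
EXTRUDER_YIELD = 0.90   # assumed fraction of pea protein surviving extrusion

def plant_input_per_kg(conversion: float) -> float:
    """Kilograms of plant matter needed per kilogram of edible output."""
    return 1.0 / conversion

print(f"Steer:    {plant_input_per_kg(FEED_TO_MUSCLE):.0f} kg of plants per kg of meat")
print(f"Extruder: {plant_input_per_kg(EXTRUDER_YIELD):.1f} kg of plants per kg of 'meat'")
# Steer:    33 kg of plants per kg of meat
# Extruder: 1.1 kg of plants per kg of 'meat'
```

On these rough numbers, the machine needs about one-thirtieth of the plant input a steer does, which is the whole premise of Brown's pitch.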
by Rowan Jacobsen, Outside | Read more:
Image: Misha Gravenor
[ed. Style. I could really go for a coat like this.]
A deliciously soft and airy jacket. Mr Cesare Prandelli (Manager of Italy’s national side).
via:
The Tragedy of the American Military
In mid-September, while President Obama was fending off complaints that he should have done more, done less, or done something different about the overlapping crises in Iraq and Syria, he traveled to Central Command headquarters, at MacDill Air Force Base in Florida. There he addressed some of the men and women who would implement whatever the U.S. military strategy turned out to be.
The part of the speech intended to get coverage was Obama’s rationale for reengaging the United States in Iraq, more than a decade after it first invaded and following the long and painful effort to extricate itself. This was big enough news that many cable channels covered the speech live. I watched it on an overhead TV while I sat waiting for a flight at Chicago’s O’Hare airport. When Obama got to the section of his speech announcing whether he planned to commit U.S. troops in Iraq (at the time, he didn’t), I noticed that many people in the terminal shifted their attention briefly to the TV. As soon as that was over, they went back to their smartphones and their laptops and their Cinnabons as the president droned on.
Usually I would have stopped watching too, since so many aspects of public figures’ appearances before the troops have become so formulaic and routine. But I decided to see the whole show. Obama gave his still-not-quite-natural-sounding callouts to the different military services represented in the crowd. (“I know we’ve got some Air Force in the house!” and so on, receiving cheers rendered as “Hooyah!” and “Oorah!” in the official White House transcript.) He told members of the military that the nation was grateful for their nonstop deployments and for the unique losses and burdens placed on them through the past dozen years of open-ended war. He noted that they were often the face of American influence in the world, being dispatched to Liberia in 2014 to cope with the then-dawning Ebola epidemic as they had been sent to Indonesia 10 years earlier to rescue victims of the catastrophic tsunami there. He said that the “9/11 generation of heroes” represented the very best in its country, and that its members constituted a military that was not only superior to all current adversaries but no less than “the finest fighting force in the history of the world.”
If any of my fellow travelers at O’Hare were still listening to the speech, none of them showed any reaction to it. And why would they? This has become the way we assume the American military will be discussed by politicians and in the press: Overblown, limitless praise, absent the caveats or public skepticism we would apply to other American institutions, especially ones that run on taxpayer money. A somber moment to reflect on sacrifice. Then everyone except the few people in uniform getting on with their workaday concerns. (...)
This reverent but disengaged attitude toward the military—we love the troops, but we’d rather not think about them—has become so familiar that we assume it is the American norm. But it is not. When Dwight D. Eisenhower, as a five-star general and the supreme commander, led what may have in fact been the finest fighting force in the history of the world, he did not describe it in that puffed-up way. On the eve of the D-Day invasion, he warned his troops, “Your task will not be an easy one,” because “your enemy is well-trained, well-equipped, and battle-hardened.” As president, Eisenhower’s most famous statement about the military was his warning in his farewell address of what could happen if its political influence grew unchecked.
At the end of World War II, nearly 10 percent of the entire U.S. population was on active military duty—which meant most able-bodied men of a certain age (plus the small number of women allowed to serve). Through the decade after World War II, when so many American families had at least one member in uniform, political and journalistic references were admiring but not awestruck. Most Americans were familiar enough with the military to respect it while being sharply aware of its shortcomings, as they were with the school system, their religion, and other important and fallible institutions.
Now the American military is exotic territory to most of the American public. As a comparison: A handful of Americans live on farms, but there are many more of them than serve in all branches of the military. (Well over 4 million people live on the country’s 2.1 million farms. The U.S. military has about 1.4 million people on active duty and another 850,000 in the reserves.) The other 310 million–plus Americans “honor” their stalwart farmers, but generally don’t know them. So too with the military. Many more young Americans will study abroad this year than will enlist in the military—nearly 300,000 students overseas, versus well under 200,000 new recruits. As a country, America has been at war nonstop for the past 13 years. As a public, it has not. A total of about 2.5 million Americans, roughly three-quarters of 1 percent, served in Iraq or Afghanistan at any point in the post-9/11 years, many of them more than once.
The difference between the earlier America that knew its military and the modern America that gazes admiringly at its heroes shows up sharply in changes in popular and media culture. While World War II was under way, its best-known chroniclers were the Scripps Howard reporter Ernie Pyle, who described the daily braveries and travails of the troops (until he was killed near the war’s end by Japanese machine-gun fire on the island of Iejima), and the Stars and Stripes cartoonist Bill Mauldin, who mocked the obtuseness of generals and their distance from the foxhole realities faced by his wisecracking GI characters, Willie and Joe.
From Mister Roberts to South Pacific to Catch-22, from The Caine Mutiny to The Naked and the Dead to From Here to Eternity, American popular and high culture treated our last mass-mobilization war as an effort deserving deep respect and pride, but not above criticism and lampooning. The collective achievement of the military was heroic, but its members and leaders were still real people, with all the foibles of real life. A decade after that war ended, the most popular military-themed TV program was The Phil Silvers Show, about a con man in uniform named Sgt. Bilko. As Bilko, Phil Silvers was that stock American sitcom figure, the lovable blowhard—a role familiar from the time of Jackie Gleason in The Honeymooners to Homer Simpson in The Simpsons today. Gomer Pyle, USMC; Hogan’s Heroes; McHale’s Navy; and even the anachronistic frontier show F Troop were sitcoms whose settings were U.S. military units and whose villains—and schemers, and stooges, and occasional idealists—were people in uniform. American culture was sufficiently at ease with the military to make fun of it, a stance now hard to imagine outside the military itself. (...)
The most biting satirical novel to come from the Iraq-Afghanistan era, Billy Lynn’s Long Halftime Walk, by Ben Fountain, is a takedown of our empty modern “thank you for your service” rituals. It is the story of an Army squad that is badly shot up in Iraq; is brought back to be honored at halftime during a nationally televised Dallas Cowboys Thanksgiving Day game; while there, is slapped on the back and toasted by owner’s-box moguls and flirted with by cheerleaders, “passed around like everyone’s favorite bong,” as platoon member Billy Lynn thinks of it; and is then shipped right back to the front.
The people at the stadium feel good about what they’ve done to show their support for the troops. From the troops’ point of view, the spectacle looks different. “There’s something harsh in his fellow Americans, avid, ecstatic, a burning that comes of the deepest need,” the narrator says of Billy Lynn’s thoughts. “That’s his sense of it, they all need something from him, this pack of half-rich lawyers, dentists, soccer moms, and corporate VPs, they’re all gnashing for a piece of a barely grown grunt making $14,800 a year.” Fountain’s novel won the National Book Critics Circle Award for fiction in 2012, but it did not dent mainstream awareness enough to make anyone self-conscious about continuing the “salute to the heroes” gestures that do more for the civilian public’s self-esteem than for the troops’. As I listened to Obama that day in the airport, and remembered Ben Fountain’s book, and observed the hum of preoccupied America around me, I thought that the parts of the presidential speech few Americans were listening to were the ones historians might someday seize upon to explain the temper of our times.
by James Fallows, The Atlantic | Read more:
Image: David Goldman/AP
Monday, December 29, 2014
Pot Pie, Redefined?
Recreational marijuana is both illegal and controversial in most of the country, and its relationship to food does not rise much above a joke about brownies or a stoner chef’s late-night pork belly poutine.
But cooking with cannabis is emerging as a legitimate and very lucrative culinary pursuit.
In Colorado, which has issued more than 160 edible marijuana licenses, skilled line cooks are leaving respected restaurants to take more lucrative jobs infusing cannabis into food and drinks. In Washington, one of four states that allow recreational marijuana sales, a large cannabis bakery dedicated to affluent customers with good palates will soon open in Seattle.
Major New York publishing houses and noted cookbook authors are pondering marijuana projects, and chefs on both coasts and in food-forward countries like Denmark have been staging underground meals with modern twists like compressed watermelon, smoked cheese and marijuana-oil vinaigrette.
“It really won’t be long until it becomes part of haute cuisine and part of respectable culinary culture, instead of just an illegal doobie in the backyard,” said Ken Albala, director of the food studies program at the University of the Pacific in San Francisco.
Two problems, however, stand in the way: First, it’s hard to control how high people get when they eat marijuana. And second, it really doesn’t taste that good.
Still, what if chefs could develop a culinary canon around marijuana that tamed both its taste and mood-altering effects, and diners came to appreciate dishes with marijuana the way one appreciates good bourbon? Paired with delicious recipes and the pleasures of good company, cannabis cookery might open a new dimension in dining that echoes the evolutions in the wine and cocktail cultures.
“I am sure someone is going to grow some that is actually delicious and we’ll all learn about it,” said Ruth Reichl, the former editor of Gourmet magazine and a former New York Times restaurant critic. Who could have predicted that kale would be the trendiest green on the plate, or that people would line up for pear and blue cheese ice cream, she asked. (...)
Cooking with marijuana requires a scientist’s touch to draw out and control the cannabinoids like tetrahydrocannabinol, or THC, which alter one’s mood and physical sensations. To get a consistent, controllable effect, marijuana is best heated and combined with fats like butter, olive oil or cream. (...)
Twenty-three states and the District of Columbia have legalized medical marijuana sales. Only four states — Washington, Oregon, Alaska and Colorado — allow recreational sales. The people who sell edible marijuana often advise people who have not tried it before to start with 10 milligrams or less. Dosing is easier to control in batter-based dishes or chocolate, where the drug can be distributed more evenly. In savory applications, dosing is trickier. A cook might be able to make sure a tablespoon of lime-cilantro butter has 10 milligrams of THC, but will the guest eat exactly that amount?
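[ed. The dosing arithmetic here is simple enough to sketch in a few lines of Python. This is only an illustration of the article’s point, not a recipe: the potency and serving figures below are invented for the example.]

def thc_per_serving(total_thc_mg, servings):
    # Even split -- realistic for batter-based dishes or chocolate,
    # where the THC can be distributed uniformly.
    return total_thc_mg / servings

# A 100 mg batch cut into 10 brownies: a predictable 10 mg each.
print(thc_per_serving(100, 10))  # 10.0

# Savory dishes break the even-split assumption: a compound butter dosed
# at 10 mg THC per tablespoon delivers 10 mg only if the guest eats
# exactly one tablespoon.
mg_per_tablespoon = 10
tablespoons_eaten = 1.5  # a hypothetical guest who helps themselves to extra
print(mg_per_tablespoon * tablespoons_eaten)  # 15.0 mg -- past the advised 10 mg start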
by Kim Severson, NY Times | Read more:
Image: Matthew Staver for The New York Times
The Fall of the Creative Class
[ed. For more background on Richard Florida's work see: Questioning the Cult of the Creative Class. Also, if you're interested: the response and counter-response to this essay.]
In the late 1990s, my wife and I got in a U-Haul, hit I-90 and headed west for a few days until we came to Portland, Oregon. We had no jobs, no apartment, and no notion other than getting out of Minnesota.
We chose Portland mainly because it was cheaper than the other places we’d liked on a month-long road trip through the West (San Francisco, Seattle, Missoula), because it had a great book store we both fell in love with, and because I had a cousin who lived there in the northeast part of the city, which was somewhat less trendy back then. (Our first night, police found a body in the park across the street.) The plan was to stay a year, then try the other coast, then who knows? We were young! But we loved it and stayed for nearly five years. Then, when we started thinking of breeding, like salmon, we decided to swim back to the pool in which we were bred.
For a variety of not-very-well-thought-out reasons, this brought us to Madison, Wisconsin. It wasn’t too far from our families. It had a stellar reputation. And for the Midwest, it possessed what might pass for cachet. It was liberal and open minded. It was a college town. It had coffee shops and bike shops. Besides, it had been deemed a “Creative Class” stronghold by Richard Florida, the prophet of prosperous cool. We had no way of knowing how wrong he was about Madison…and about everything.
Florida’s idea was a nice one: Young, innovative people move to places that are open and hip and tolerant. They, in turn, generate economic innovation. I loved this idea because, as a freelance writer, it made me important. I was poor, but somehow I made everyone else rich! It seemed to make perfect sense. Madison, by that reasoning, should have been clamoring to have me, since I was one of the mystical bearers of prosperity. (...)
For some reason, these and most other relationships never quite blossomed the way we’d hoped, the way they had in all the other places we’d lived. For a time, my wife had a soulless job with a boss who sat behind her, staring at the back of her head. I found work in a dusty tomb of a bookstore, doing data entry with coworkers who complained about their neurological disorders, or who told me about the magical creatures they saw on their way home, and who kept websites depicting themselves as minotaurs.
I’m not sure what exactly I expected, but within a year or two it was clear that something wasn’t right. If Madison was such a Creative Class hotbed overflowing with independent, post-industrial workers like myself, we should have fit in. Yet our presence didn’t seem to matter to anyone, creatively or otherwise. And anyway, Madison’s economy was humming along with unemployment around four percent, while back in fun, creative Portland, it was more than twice that, at eight and a half percent. This was not how the world according to Florida was supposed to work. I started to wonder if I’d misread him. Around town I encountered a few other transplants who also found themselves scratching their heads over what the fuss had been about. Within a couple years, most of them would be gone. (...)
Jamie Peck is a geography professor who has been one of the foremost critics of Richard Florida’s Creative Class theory. He now teaches at the University of British Columbia in Vancouver, but at the time Florida’s book was published in 2002, he was also living in Madison. “The reason I wrote about this,” Peck told me on the phone, “is because Madison’s mayor started to embrace it. I lived on the east side of town, probably as near to this lifestyle as possible, and it was bullshit that this was actually what was driving Madison’s economy. What was driving Madison was public sector spending through the university, not the dynamic Florida was describing.”
In his initial critique, Peck said The Rise of the Creative Class was filled with “self-indulgent forms of amateur microsociology and crass celebrations of hipster embourgeoisement.” That’s another way of saying that Florida was just describing the “hipsterization” of wealthy cities and concluding that this was what was causing those cities to be wealthy. As some critics have pointed out, that’s a little like saying that the high number of hot dog vendors in New York City is what’s causing the presence of so many investment bankers. So if you want banking, just sell hot dogs. “You can manipulate your arguments about correlation when things happen in the same place,” says Peck.
What was missing, however, was any actual proof that the presence of artists, gays and lesbians or immigrants was causing economic growth, rather than economic growth causing the presence of artists, gays and lesbians or immigrants. Some more recent work has tried to get to the bottom of these questions, and the findings don’t bode well for Florida’s theory. In a four-year, $6 million study of thirteen cities across Europe called “Accommodating Creative Knowledge,” published in 2011, researchers found one of Florida’s central ideas—the migration of creative workers to places that are tolerant, open and diverse—was simply not happening.
“They move to places where they can find jobs,” wrote author Sako Musterd, “and if they cannot find a job there, the only reason to move is for study or for personal social network reasons, such as the presence of friends, family, partners, or because they return to the place where they have been born or have grown up.” But even if they had been pouring into places because of “soft” factors like coffee shops and art galleries, according to Stefan Krätke, author of a 2010 German study, it probably wouldn’t have made any difference, economically. Krätke broke Florida’s Creative Class (which includes accountants, realtors, bankers and politicians) into five separate groups and found that only the “scientifically and technologically creative” workers had an impact on regional GDP. Krätke wrote “that Florida’s conception does not match the state of findings of regional innovation research and that his way of relating talent and technology might be regarded as a remarkable exercise in simplification.”
Perhaps one of the most damning studies was in some ways the simplest. In 2009, Michele Hoyman and Chris Faricy published a study using Florida’s own data from 1990 to 2004, in which they tried to find a link between the presence of creative-class workers and any kind of economic growth. “The results were pretty striking,” said Faricy, who now teaches political science at Washington State University. “The measurement of the creative class that Florida uses in his book does not correlate with any known measure of economic growth and development. Basically, we were able to show that the emperor has no clothes.” Their study also questioned whether the migration of the creative class was happening. “Florida said that creative class presence—bohemians, gays, artists—will draw what we used to call yuppies in,” says Hoyman. “We did not find that.”
by Frank Bures, thirty two Magazine | Read more:
Image: Will Dinski
Kazuo Ishiguro, The Art of Fiction No. 196
[ed. One of my favorite authors has a new book coming out in March: The Buried Giant.]
Kazuo Ishiguro was born in Nagasaki in 1954 and moved with his family to the small town of Guildford, in southern England, when he was five. He didn’t return to Japan for twenty-nine years. (His Japanese, he says, is “awful.”) At twenty-seven he published his first novel, A Pale View of Hills (1982), set largely in Nagasaki, to near unanimous praise. His second novel, An Artist of the Floating World (1986), won Britain’s prestigious Whitbread award. And his third, The Remains of the Day (1989), sealed his international fame. It sold more than a million copies in English, won the Booker Prize, and was made into a Merchant Ivory movie starring Anthony Hopkins, with a screenplay by Ruth Prawer Jhabvala. (An earlier script by Harold Pinter, Ishiguro recalls, featured “a lot of game being chopped up on kitchen boards.”) Ishiguro was named an Officer of the Order of the British Empire and, for a while, his portrait hung at 10 Downing Street. Defying consecration, he surprised readers with his next novel, The Unconsoled (1995), more than five hundred pages of what appeared to be stream-of-consciousness. Some baffled critics savaged it; James Wood wrote that “it invents its own category of badness.” But others came passionately to its defense, including Anita Brookner, who overcame her initial doubts to call it “almost certainly a masterpiece.” The author of two more acclaimed novels—When We Were Orphans (2000) and Never Let Me Go (2005)—Ishiguro has also written screenplays and teleplays, and he composes lyrics, most recently for the jazz chanteuse Stacey Kent. Their collaborative CD, Breakfast on the Morning Tram, was a best-selling jazz album in France.
In the pleasant white stucco house where Ishiguro lives with his sixteen-year-old daughter, Naomi, and his wife, Lorna, a former social worker, there are three gleaming electric guitars and a state-of-the-art stereo system. The small office upstairs where Ishiguro writes is custom designed in floor-to-ceiling blond wood with rows of color-coded binders neatly stacked in cubbyholes. Copies of his novels in Polish, Italian, Malaysian, and other languages line one wall. (...)
INTERVIEWER
You had success with your fiction right from the start—but was there any writing from your youth that never got published?
KAZUO ISHIGURO
After university, when I was working with homeless people in west London, I wrote a half-hour radio play and sent it to the BBC. It was rejected but I got an encouraging response. It was kind of in bad taste, but it’s the first piece of juvenilia I wouldn’t mind other people seeing. It was called “Potatoes and Lovers.” When I submitted the manuscript, I spelled potatoes incorrectly, so it said potatos. It was about two young people who work in a fish-and-chips café. They are both severely cross-eyed, and they fall in love with each other, but they never acknowledge the fact that they’re cross-eyed. It’s the unspoken thing between them. At the end of the story they decide not to marry, after the narrator has a strange dream where he sees a family coming toward him on the seaside pier. The parents are cross-eyed, the children are cross-eyed, the dog is cross-eyed, and he says, All right, we’re not going to marry.
INTERVIEWER
What possessed you to write that story?
by Susannah Hunnewell, Paris Review | Read more:
Image: Matt Carr/Getty Images
The Foreign Spell
It’s fashionable in some circles to talk of Otherness as a burden to be borne, and there will always be some who feel threatened by—and correspondingly hostile to—anyone who looks and sounds different from themselves. But in my experience, foreignness can as often be an asset. The outsider enjoys a kind of diplomatic immunity in many places, and if he seems witless or alien to some, he will seem glamorous and exotic to as many others. In open societies like California, someone with Indian features such as mine is a target of positive discrimination, as strangers ascribe to me yogic powers or Vedic wisdom that couldn’t be further from my background (or my interest).
Besides, the very notion of the foreign has been shifting in our age of constant movement, with more than fifty million refugees; every other Torontonian you meet today is what used to be called a foreigner, and the number of people living in lands they were not born to will surpass 300 million in the next generation. Soon there’ll be more foreigners on earth than there are Americans. Foreignness is a planetary condition, and even when you walk through your hometown—whether that’s New York or London or Sydney—half the people around you are speaking in languages and dealing in traditions different from your own. (...)
Growing up, I soon saw that I was ill-equipped for many things by my multi-continental upbringing—I would never enjoy settling down in any one place, and I wouldn’t vote anywhere for my first half-century on earth—but I saw, too, that I had been granted a kind of magic broomstick that few humans before me had ever enjoyed. By the age of nine, flying alone over the North Pole six times a year—between my parents’ home in California and my schools in England—I realized that only one generation before, when my parents had gone to college in Britain, they had had to travel for weeks by boat, sometimes around the stormy Cape of Good Hope. When they bid goodbye to their loved ones—think of V. S. Naipaul hearing of his father’s death while in England, but unable to return to Trinidad—they could not be sure they’d ever see them again.
At seventeen, I was lucky enough to spend the summer in India, the autumn in England, the winter in California, and the spring bumping by bus from Tijuana down to Bolivia—and then up the west coast of South America. I wasn’t rich, but the door to the world was swinging open for those of us ready to live rough and call ourselves foreigners for life. If my native India, the England of my childhood, and the America of my official residence were foreign, why not spend time in Yemen and on Easter Island?
In retrospect, it seems inevitable that I would move, in early adulthood, to what still, after twenty-seven years of residence, remains the most foreign country I know, Japan. However long I live here, even if I speak the language fluently, I will always be a gaikokujin, an “outsider person,” whom customs officials strip-search and children stare at as they might a yeti. I’m reminded of this on a daily basis. Even the dogs I pass on my morning walks around the neighborhood bark and growl every time they catch wind of this butter-reeking alien.
Japan remains itself by maintaining an unbreachable divide between those who belong to the group and those who don’t. This has, of course, left the country behind in an ever more porous world of multiple homes, and is a source of understandable frustration among, say, those Koreans who have lived in the country for generations but were—until relatively recently—obliged to be fingerprinted every year and denied Japanese passports. Yet for a lifelong visitor, the clarity of its divisions is welcome; in free-and-easy California, I always feel as accepted as everyone else, but that doesn’t make me feel any more Californian. Besides, I know that Japan can work as smoothly as it does only by having everyone sing their specific parts from the same score, creating a single choral body. The system that keeps me out produces the efficiency and harmony that draws me in.
I cherish foreignness, personally and internationally, and feel short-shrifted when United Airlines, like so many multinationals today, assures me in a slogan, “The word foreign is losing its meaning”; CNN, for decades, didn’t even use the word, in deference to what it hoped would be a global audience. Big companies have an investment in telling themselves—and us—that all the world’s a single market. Yet all the taco shacks and Ayurvedic doctors and tai chi teachers in the world don’t make the depths of other cultures any more accessible to us. “Read The Sheltering Sky,” I want to tell my neighbors in California as they talk about that adorable urchin they met in the souk in Marrakesh. Next time you’re in Jamaica—or Sri Lanka or Cambodia—think of Forster’s Marabar Caves as much as of the postcard sights that leave you pleasantly consoled. Part of the power of travel is that you stand a good chance of being hollowed out by it. The lucky come back home complaining about crooked rug merchants and dishonest taxi drivers; the unlucky never come home at all.
by Pico Iyer, Lapham's Quarterly | Read more:
Image: Islands, by Brad Kunkle, 2012
Going Aboard
When Herman Melville was twenty-one, he embarked on the whaleship Acushnet, out of New Bedford. We all know what that led to. This past summer, Mystic Seaport finished their five-year, 7.5-million-dollar restoration of the 1841 whaleship Charles W. Morgan, the sister ship to the Acushnet. The Morgan is in many ways identical to Melville’s fictional Pequod, save that sperm whale jawbone tiller and a few other sinister touches. Mystic Seaport celebrated the completion by sailing the Morgan around New England for a couple months. I went aboard for a night and a day, intent on following in Ishmael’s footsteps, hoping to breathe a little life into my idea of the distant, literary ship.
by Ben Shattuck, Paris Review | Read more:
Image: Ben Shattuck
2014: The Year When Activist Documentaries Hit the Breaking Point
If I were making a documentary about the uniformity that has infested modern documentaries, it would go something like this: Open with a sequence detailing the extent of the problem, flashing on examples of its reach, cutting in quick, declarative sound bites, scored with music of steadily mounting tension that climaxes just as the title is revealed. Over the next 90-120 minutes, I would lay out the problem in greater detail, primarily via copious interviews with experts on the subject, their data points illustrated via scores of snazzily animated infographics. Along the way, I would introduce the viewer to a handful of Regular Folk affected by the issue at hand, and show how their daily lives have become a struggle (or an inspiration). But lest I send the viewer staggering from the theater bereft of hope, I’d conclude by explaining, in the simplest terms possible, exactly how to solve the problem. And then, over the end credits, I would tell you, the viewer, what you can do to help — beginning with a visit to my documentary’s official website.
What you would learn from this film is that too many of today’s documentaries have become feature-length versions of TV newsmagazine segments, each a 60 Minutes piece stretched out to two hours, two pounds of sugar in a five-pound bag. And perhaps this viewer became more aware of it in 2014 because, early in the year, I saw a film that was like a case study in what’s wrong with this approach: Fed Up, a position-paper doc on the obesity epidemic. It’s got the thesis-paragraph pre-title opening, the animated graphics (complete with cutesy, nonstop sound effects), the closing-credit instructions. And then, as if its TV-news style isn’t obvious enough, it’s even got the comically commonplace “headless fat people walking down the streets” B-roll and narration by, no kidding, Katie Couric.
Fed Up plays like something made to burn off time on MSNBC some Saturday afternoon between reruns of Caught On Camera and Lock-Up, but nope: I saw it at the beginning of 2014 because it was playing at the Sundance Film Festival. It received a simultaneous theatrical and VOD release in May; last month, Indiewire reported that its robust earnings in both have made it one of the year’s most successful documentaries.
Look, this could just be a matter of pet peeves and personal preferences, and of trends that have merely made themselves apparent to someone whose vocation requires consumption of more documentaries than the average moviegoer. But this formula, and the style that goes hand in hand with it, is infecting more and more nonfiction films, lending an air of troubling sameness to activist docs like Ivory Tower (on the financial crisis of higher education) and Citizen Koch (on the massive casualties of the Citizens United decision). But it’s been in the air for some time, with earlier films like Food Inc., Bully, The Invisible War, Waiting for “Superman,” and the granddaddy of the movement, Davis Guggenheim’s Oscar-winning An Inconvenient Truth — a film, lest we forget, about a PowerPoint presentation. And it doesn’t stop there; even a profile movie like Nas: Time Is Illmatic has a big, state-the-premise pre-title sequence, which plays, in most of these films, like the teaser before the first commercial break.
The formulaic construction of these documentaries — as set in stone as the meet-cute/hate/love progression of rom-coms or the rise/addiction/fall/comeback framework of the music biopic — is particularly galling because it’s shackling a form where even fewer rules should apply. The ubiquity (over the past decade and a half) of low-cost, low-profile, high-quality video cameras and user-friendly, dirt-cheap non-linear editing technology has revolutionized independent film in general, allowing young filmmakers opportunities to create professional-looking product even directors of the previous generation could only dream of. (...)
It’s easy to arrive at that point with these diverse subjects, the logic goes, but a more straightforward, news-doc approach is required for aggressive, activist documentaries with points to make and moviegoers to educate — and the commonness of that thinking is perhaps why so many critics have gone nuts for CITIZENFOUR, Laura Poitras’ account of Edward Snowden’s leak of NSA documents detailing surveillance programs around the world. That’s a giant topic, but the surprise of the picture is how intimate and personal it is, primarily due to the filmmaker’s place within the story: she was the contact point for Snowden, hooked in to his actions via encrypted messages, in the room with the whistleblower as he walked through the documents with Glenn Greenwald.
As a result, much of the film is spent in Snowden’s Hong Kong hotel, Poitras’ camera capturing those explanations and strategy sessions, a procedural detailing logistics, conferences, and conversations. There are no expert talking heads to provide (unnecessary, I would argue) context; there are no jazzy charts and graphs to explain it all to the (presumably) slower folks in the audience. The only such images come in a quick-cut montage of illustrations within the leaked documents, and they’re solely that — illustrations. The most powerful and informative graphics in the film are the mesmerizing images of encrypted messages from Snowden to Poitras, which fill the screen with impenetrable numbers, letters, and symbols, before clearing away to reveal the truth underneath, a powerful metaphor for Snowden’s actions (and the film itself).
by Jason Bailey, Flavorwire | Read more:
Image: Fed Up
Sunday, December 28, 2014
The Capitalist Nightmare at the Heart of Breaking Bad
Back in October, you could have gone to Toys ’R’ Us and picked up the perfect present for the Breaking Bad fan in your family. Fifty bucks (all right, let’s assume you’re in Albuquerque) would buy you “Heisenberg (Walter White)” complete with a dinky little handgun clutched in his mitt; his sidekick Jesse, in an orange hazmat suit, was yours for $40. But then a Florida mom (it’s always a mom; it’s often in Florida) objected, and got a petition going, needless to say. “While the show may be compelling viewing for adults, its violent content and celebration of the drug trade make this collection unsuitable to be sold alongside Barbie dolls and Disney characters,” she wrote.
It’s worth noting, perhaps, that if Barbie’s proportions had their equivalent in an adult female, that woman would have room for only half a liver and a few inches of intestine; her tiny feet and top-heavy frame would oblige her to walk on all fours. A great role model? I’m not so sure. (And Disney is not always entirely benign. My mother was five when Snow White came out; I’m not sure she ever really recovered from her encounter with its Evil Queen.)
“I’m so mad, I am burning my Florida Mom action figure in protest,” Bryan Cranston tweeted when the storm broke. Cranston went from advertising haemorrhoid cream (“Remember – oxygen action is special with Preparation H”) to playing Hal, the goofy dad on Malcolm in the Middle, to full-on superstardom as Breaking Bad became a talisman of modern popular culture. The show began broadcasting in the US in January 2008 and ran for five seasons. Stephen King called it the best television show in 15 years; it was showered with dozens of awards; Cranston took the Emmy for Outstanding Lead Actor in a Drama Series for four out of the show’s five seasons.
So get over it, Florida Mom. Breaking Bad was, and remains (at least for the time being), the apogee of water-cooler culture: serious but seriously cool, and the nerd’s revenge, to boot. Walter White – for those of you who are yet to have your lives devoured by the show – is a high-school chemistry teacher: you might think that’s a respected, reasonably well-compensated profession, but in 21st-century America he’s got to have a second job at a carwash just to make ends meet. When he is diagnosed, as the series begins, with terminal lung cancer, his terror (his existential, male, white-collar terror) focuses not on the prospect of his own death, but on how he will provide for his family. A chance encounter with a former student, Jesse Pinkman – a classic dropout nogoodnik with a sideline in drug sales – sets his unlikely career as a drug baron in motion. As his alter ego “Heisenberg” (the name a knowing echo of that icon of uncertainty), Walter has chemical skills that enable him to cook some of the purest methamphetamine the world has ever known . . . and the rest, as they say, is history. (...)
But here’s the thing: Florida Mom is on to something, even if she’s wrong about exactly what it is she was objecting to. “A celebration of the drug trade”? I don’t think so. But why did Breaking Bad get under my skin? Why does it still bother me, all these months later? And why do I think, in an era of exceptional television, that it’s the best thing I have ever seen? (...)
Not everyone wants to use words such as “metadiegetic” when talking about telly, and the close analysis of everything from the show’s vision of landscape to its use of music, or “the epistemological implications of the use of a criminal pseudonym”, may be exhausting for some. Yet Pierson’s essay, which opens the volume, draws attention to one of the chief reasons the show has such a terrible and enduring resonance.
Breaking Bad is, he argues, a demonstration of the true consequences of neoliberal ideology: the idea that “the market should be the organising agent for nearly all social, political, economic and personal decisions”. Under neoliberal criminology, the criminal is not a product of psychological disorder, but “a rational-economic actor who contemplates and calculates the risks and the rewards of his actions”. And there is Walter White in a nutshell.
by Erica Wagner, New Statesman | Read more:
Image: Ralph Steadman
Saturday, December 27, 2014
Bob Dylan
[ed. NSA, CIA, VA, Health Insurance, Hospitals, Facebook, Citicorp, College Tuition, Transportation, Public Utilities, Climate Change, Publishing, Big Pharma, Net Neutrality, Minimum Wage, Guantanamo, Afghanistan, Iraq, Torture, K-Street, Wall Street, Congress (and much, much more). Happy New Year.]
Fish Cakes Conquer Their Shyness
A Recipe for Spicy Fish Cakes
The typical fish cake does not call attention to itself. Potato-rich, monochromatic and satisfying, it is the kind of thing you’d make for a homey dinner when the food wasn’t the point.
Not so with these fish cakes, which, with their mix of aromatic chiles and herbs, are a brighter and more compelling take. The recipe starts out like any other by combining cooked white fillets with mashed potatoes, bread crumbs and eggs. After chilling, the mixture is coated in flour and fried until crisp and brown.
But that’s all for the similarities. I’ve added flavor in every step. Instead of merely boiling the fish, I sear it with garlic, then steam it in vermouth or white wine. After the fish is done, the potatoes are simmered in the same pan as a way to deglaze it and incorporate the tasty browned bits stuck to its bottom. I leave the garlic cloves in the pan, too, to thoroughly soften along with the potatoes, then I mash the roots all together. Those garlicky mashed potatoes make a rich and pungent base for the fish.
For seasoning, I stir in minced scallions, cilantro and basil, grated lime zest and hot green chiles. The cakes are speckled with green in the center, rather than dull all-white. And the flavor is vibrant and spicy — though the degree of spice depends on your chile. A small serrano will give you a mild but persistent heat. Substituting a jalapeño takes it down a notch, while using a Thai chile could make it fiery enough for your cheeks to flush.
by Melissa Clark, NY Times | Read more:
Image: NY Times
Lesbianism Made Easy
The easiest way to pick up a straight woman, which is so obvious you’ll be embarrassed you didn’t think of it, is to pick up her boyfriend and/or husband. Male heterosexuals, for reasons no one really understands, find the practice of lesbianism — particularly when utilizing their favorite film stars or own personal girlfriends — a particularly appealing way of spending time, second perhaps only to receiving blow jobs. In this, they are united with their homosexual brothers, except for the lesbian part.
Surprisingly many female heterosexuals attached to males are willing to please their boyfriends in this fashion. Of course, there is no reason, other than logic and common decency, to expect the female in question to admit the pleasure she may receive from this hobby of her boyfriend’s — particularly if it has ever been a little hobby of hers in those bouncy college days or other times in her excitingly varied life.
Should you not wish to be offended or disappointed by the degree of open enthusiasm your heterosexual displays about having carnal knowledge of, with or on you, it pays to adopt a hardened veneer so as to allow certain statements typical of her kind to bounce off your chest without injuring either your self-esteem or any future chances of being called upon for another go at enhancing her sacred relationship.
These statements will usually take the form of: “This isn’t really my thing”; “I’m not into women”; “I’m only doing this because I really really love Ted”; and “Oooh! That was — I mean, not that I’d ever want to do it again, but God, you’re … sweet.”
There are several possible responses to such clearly desperate, if insulting, statements. You may consider a reply along the lines of, “I don’t know what it is; I usually find sleeping with women much wilder, more uninhibited and multiorgasmic than this!” or a classically simple, “I never want to do that again.” These insults to your female heterosexual’s performance and appeal will, if she’s a woman worth having, effectively provoke her to prove to you, and herself, that you very much enjoyed sleeping with her, whatever you may think you’re pulling now. No doubt she will even be forced to make you repeat various acts until she’s satisfied it’s clear to all concerned that while she may not choose to enjoy what you’re doing together, you can’t deny that you find it fairly … compelling. You should feel free to continue denying your enjoyment, so that she will be forced to call you late into the evening to reiterate her point, during which time you can explain to her that the phone truly isn’t the place for such discussions so why doesn’t she come over so you can clear the air once and for all?
by Helen Eisenbach, Medium | Read more:
Image: uncredited
Friday, December 26, 2014
The Secret to the Uber Economy is Wealth Inequality
The same goes for services. When I lived there, a man came around every morning to collect my clothes and bring them back crisply ironed the next day; he would have washed them, too, but I had a washing machine.
These luxuries are not new. I took advantage of them long before Uber became a verb, before the world saw the first iPhone in 2007, even before the first submarine fibre-optic cable landed on our shores in 1997. In my hometown of Mumbai, we have had many of these conveniences for at least as long as we have had landlines—and some even earlier than that.
It did not take technology to spur the on-demand economy. It took masses of poor people.
In San Francisco, another peninsular city on another west coast on the other side of the world, a similar revolution of convenience is underway, spurred by the unstoppable rise of Uber, the on-demand taxi service, which went from offering services in 60 cities around the world at the end of last year to more than 200 today.
Uber’s success has sparked a revolution, covered in great detail this summer by Re/code, a tech blog, which ran a special series about “the new instant gratification economy.” As Re/code pointed out, after Uber showed how it’s done, nearly every pitch made by starry-eyed technologists “in Silicon Valley seemed to morph overnight into an ‘Uber for X’ startup.”
Various companies are described now as “Uber for massages,” “Uber for alcohol,” and “Uber for laundry and dry cleaning,” among many, many other things (“Uber for city permits”). So profound has been their cultural influence in 2014, one man wrote a poem about them for Quartz. (Nobody has yet written a poem dedicated to the other big cultural touchstone of 2014 for the business and economics crowd, French economist Thomas Piketty’s smash hit, Capital in the Twenty-First Century.)
The conventional narrative is this: enabled by smartphones, with their GPS chips and internet connections, enterprising young businesses are using technology to connect a vast market willing to pay for convenience with small businesses or people seeking flexible work.
This narrative ignores another vital ingredient, without which this new economy would fall apart: inequality.
There are only two requirements for an on-demand service economy to work, and neither is an iPhone. First, the market being addressed needs to be big enough to scale—food, laundry, taxi rides. Without that, it’s just a concierge service for the rich rather than a disruptive paradigm shift, as a venture capitalist might say. Second, and perhaps more importantly, there needs to be a large enough labor class willing to work at wages that customers consider affordable and that the middlemen consider worthwhile for their profit margins. (...)
There is no denying the seductive nature of convenience—or the cold logic of businesses that create new jobs, whatever quality they may be. But the notion that brilliant young programmers are forging a newfangled “instant gratification” economy is a falsehood. Instead, it is a rerun of the oldest sort of business: middlemen insinuating themselves between buyers and sellers.
by Leo Mirani, Quartz | Read more:
Image: Reuters