Monday, April 29, 2013


Blima Efraim - Hemingway Cat in a Bath
via:

Bags of Cash

For more than a decade, wads of American dollars packed into suitcases, backpacks and, on occasion, plastic shopping bags have been dropped off every month or so at the offices of Afghanistan’s president — courtesy of the Central Intelligence Agency.

All told, tens of millions of dollars have flowed from the C.I.A. to the office of President Hamid Karzai, according to current and former advisers to the Afghan leader.

“We called it ‘ghost money,’ ” said Khalil Roman, who served as Mr. Karzai’s deputy chief of staff from 2002 until 2005. “It came in secret, and it left in secret.”

The C.I.A., which declined to comment for this article, has long been known to support some relatives and close aides of Mr. Karzai. But the new accounts of off-the-books cash delivered directly to his office show payments on a vaster scale, and with a far greater impact on everyday governing.

Moreover, there is little evidence that the payments bought the influence the C.I.A. sought. Instead, some American officials said, the cash has fueled corruption and empowered warlords, undermining Washington’s exit strategy from Afghanistan.

“The biggest source of corruption in Afghanistan,” one American official said, “was the United States.”

The United States was not alone in delivering cash to the president. Mr. Karzai acknowledged a few years ago that Iran regularly gave bags of cash to one of his top aides.

At the time, in 2010, American officials jumped on the payments as evidence of an aggressive Iranian campaign to buy influence and poison Afghanistan’s relations with the United States. What they did not say was that the C.I.A. was also plying the presidential palace with cash — and unlike the Iranians, it still is.

American and Afghan officials familiar with the payments said the agency’s main goal in providing the cash has been to maintain access to Mr. Karzai and his inner circle and to guarantee the agency’s influence at the presidential palace, which wields tremendous power in Afghanistan’s highly centralized government. The officials spoke about the money only on the condition of anonymity. (...)

Like the Iranian cash, much of the C.I.A.’s money goes to paying off warlords and politicians, many of whom have ties to the drug trade and, in some cases, the Taliban. The result, American and Afghan officials said, is that the agency has greased the wheels of the same patronage networks that American diplomats and law enforcement agents have struggled unsuccessfully to dismantle, leaving the government in the grips of what are basically organized crime syndicates.

by Matthew Rosenberg, NY Times |  Read more:
Image: Anja Niedringhaus/Associated Press

Battle for Buddha


When Janice Marturano conducted the “mindful leadership experience” workshop last January at the World Economic Forum in Davos, Switzerland, she was hoping for an audience of 20—at most. “I was prepared for one or two,” the founder and executive director of the Washington-based Institute for Mindful Leadership admits. She needn’t have worried. There was a lineup; people were even turned away. More than 70 of the world’s most influential people crammed into the room, many standing for 90 minutes to learn “techniques for developing focus, clarity and compassion.” The next morning, Marturano led a meditation—a Davos first—that drew 40 people, two-thirds of whom had never meditated before.

The spectre of masters of the universe chanting Om at Davos serves as only one measure of how “mindfulness” has become the new Western mantra. The technique, linked to Buddhist practice, teaches being present in the moment, always attentive to, and accepting of, one’s thoughts and responses, without judgment. In a 1977 study, mindfulness pioneer Jack Kornfield presented the approach as a remedy to Western excesses, or “the egoistic, hedonic treadmill of continually avoiding discomfort and seeking pleasure from outside sources that are ultimately unsatisfying and short-lived.”

Mindfulness entered the medical mainstream in the 1980s as a clinically proven method for alleviating chronic pain and stress. Since then, it has metastasized into an omnibus panacea—to help children with attention-deficit hyperactivity disorder concentrate, soldiers with post-traumatic stress disorder recover and, now, Fortune 500 executives compete. In Paul Harrison’s upcoming documentary, The Mindfulness Movie, psychologist Guy Claxton frames the benefits in mercantile terms: “At the most basic level, mindfulness enables you to get value for money out of life,” he says.

What has gripped Western attention is mindfulness’s ability to improve performance—of Olympic athletes, parents, and even nations, as promised in U.S. Congressman Tim Ryan’s 2012 bestseller, Mindful Nation: How a simple practice can help us reduce stress, improve performance and recapture the American spirit. Institutions and companies are racing to adopt “mindful” practices—among them Google, the U.S. military and Monsanto. Jeff Weiner, CEO of the social-networking site LinkedIn, is a disciple, boasting that “compassion” and “listening to others” are now his central management tenets.

A mindfulness industry has taken root, seen in a boom in corporate coaching, “yin” yoga (which develops mindfulness by holding poses at a point of intensity for five to 10 minutes) and books such as Raji Lukkoor’s Inner Pilgrimage: Ten Days to a Mindful Me and 10 Mindful Minutes by actress Goldie Hawn, who runs a mindfulness foundation. The first mass-market magazine devoted to the topic, Mindful, has just launched; the first issue of the Halifax-based bimonthly bills itself as “your guide to less stress and more joy” with features such as “The science of changing your brain.” Publisher Jim Gimian says he wants to send “a very broad message that mindfulness is a lifestyle, a broadly appealing part of life and not something esoteric or foreign.” Even the ads are “curated” to reflect this message, he says; placing a full-page ad for women’s clothing line Eileen Fisher on the first page was strategic: the company also advertises in Vogue.

The trend to mindfulness would seem to signal mass recognition of the need to slow down and pay attention in a turbo-driven, reactive society. Yet its migration from ashram to boardroom is not without tensions. High-profile Buddhists are taking off the gloves, albeit thoughtfully; they say mindfulness is part of a continuum—one of the seven factors of enlightenment—not a self-help technique or “a path which can lead to bigger profits,” as the Financial Times put it. And long-time practitioners worry that mindfulness repackaged as a quick fix or a commercial platform could in fact lead to mindlessness, and reinforce the very problems it’s trying to heal.

by Anne Kingston, Macleans |  Read more: 
Illustration by Taylor Shute

The Impossible Decision

Graduate students are always thinking about the pleasures and travails of grad school, and springtime is a period of especially intense reflection. It’s in the spring, often in March and April, that undergraduates receive their acceptance letters. When that happens, they turn to their teachers, many of them graduate students, for advice. They ask the dreaded, complicated, inevitable question: To go, or not to go?

Answering that question is not easy. For graduate students, being consulted about grad school is a little like starring in one of those “Up” documentaries (“28 Up,” ideally; “35 Up,” in some cases). Your students do the work of Michael Apted, the series’s laconic director, asking all sorts of tough, personal questions. They push you to think about the success and failure of your life projects; to decide whether or not you are happy; to guess what the future holds; to consider your life on a decades-long scale. This particular spring, the whole conversation has been enriched by writers from around the Web, who have weighed in on the pros and cons of graduate school, especially in the humanities. In addition to the usual terrifying articles in the advice section of the Chronicle of Higher Education, a pair of pieces in Slate—“Thesis Hatement,” by Rebecca Schuman, and “Thesis Defense” by Katie Roiphe—have sparked many thoughtful responses from bloggers and journalists. It’s as though a virtual symposium has been convened.

I’m a former humanities graduate student myself—I went to grad school in English from 2003 through 2011 before becoming a journalist, and am still working nights on my dissertation—and I’m impressed by the clarity of the opinions these essays express. (Rebecca Schuman: “Don’t do it. Just don’t”; Katie Roiphe: “It gives you a habit of intellectual isolation that is… useful, bracing, that gives you strength and originality.”) I can’t muster up that clarity myself, though. I’m very glad that I went to graduate school—my life would be different, and definitely worse, without it. But when I’m asked to give students advice about what they should do, I’m stumped. Over time, I’ve come to feel that giving good advice about graduate school is impossible. It’s like giving people advice about whether they should have children, or move to New York, or join the Army, or go to seminary.

Maybe I’ve been in school too long; doctoral study has a way of turning your head into a never-ending seminar, and I’m now capable of having complicated, inconclusive thoughts about nearly any subject. But advice helps people when they are making rational decisions, and the decision to go to grad school in English is essentially irrational. In fact, it’s representative of a whole class of decisions that bring you face to face with the basic unknowability and uncertainty of life.

by Joshua Rothman, New Yorker |  Read more:
Illustration by Michael Crawford

Sunday, April 28, 2013

LARC - Awesome Cat Shelter on Lanai


[ed. Feral cats used to be everywhere on the Hawaiian island of Lanai; now there are virtually none. Instead, they're all (300+) housed, fed and cared for at this adoption facility:
The sanctuary is a spacious open air 15,000 square foot enclosure. This enclosure is complete with spacious cubicles for sleeping, “pallet palaces” for hiding, large 8-foot long irrigation pipes for hiding in or chasing each other through and kitty-climbing jungle gyms. The enclosure also boasts numerous bushes, long grasses for catching zzzzz’s under and trees for climbing. Some of these trees also provide sleeping perches for those more adventuresome felines. On many a day you can spot up to 6 or 8 cats swaying in the breeze in the crooks of the tree branches. We are privileged to have a “purrrfect” welcoming committee greet guests when they enter this feline sanctuary.
I had a hard time leaving when I visited. Wherever I went there were seven or eight cats following me, wanting attention, hoping to be petted. With very little funding, LARC provides a great community service and is a deeply caring organization. If you'd like to help out, or even adopt a kitty, here's their website where you can read more about what they do: http://lanaianimalrescue.org/ ]

Where Vitamin Supplements Come From

I don’t know about you, but ever since I swallowed my first Flintstones’ chewable, I envisioned vitamin supplements coming from a magical fairyland where wizards would squeeze all the nutrients from whole vegetables and fruits. Do you have these visions too?

People who use vitamin supplements likely start with good intentions. But where do these products actually come from? Are vitamin supplements any more natural than white flour or pharmaceuticals?

Where do vitamin supplements come from?

When people think of drugs, most think “artificial.” When people think of vitamin supplements, most think “natural.”

But both drugs and vitamin supplements can be artificial or natural. Many vitamin supplements produced today are artificial. Meanwhile, the world of “natural” isn’t all hopscotch tournaments and fairy dances. Poison hemlock, hallucinogenic mushrooms, rhubarb leaves and sprouted kidney beans are all natural – and potentially deadly.

There are six categories of nutrients used in the manufacturing of vitamin supplements.

1. Natural Source

These include nutrients from vegetable, animal or mineral sources. But before making it into the supplement bottle, they undergo significant processing and refining. Examples include vitamin D from fish liver oils, vitamin E from vegetable oils, and natural beta-carotene.

When a vitamin is marked “natural”, it only has to include 10% of actual natural plant-derived ingredients. The other 90% could be synthetic.

Consider vitamin E tocopherols, which can be extracted from vegetable oils (often soybean, due to low costs).
  1. The soybeans are crushed, and the protein is removed by precipitation.
  2. The resultant oil is distilled off to become bottled vegetable oil.
  3. The remaining materials are solubilized to remove any carbohydrates.
  4. The vitamin E is solvent-extracted away from the remaining waxes and lecithin.
Synthetic alpha-tocopherol is a mixture of eight isomers, whereas natural alpha-tocopherol is a single isomer; consuming the mixed isomers can decrease bioavailability.

Natural vitamin E (notice the d-alpha tocopherol)

Synthetic vitamin E (notice the dl-alpha)

Another example is vitamin D3. The manufacturing starts with 7-dehydrocholesterol (usually from wool oil), which turns into cholecalciferol (vitamin D3) when exposed to ultraviolet light.

by Ryan Andrews, Precision Nutrition |  Read more:
Images: uncredited

We Copy Like We Breathe


When Cory Doctorow opened his keynote speech at this year's SIGGRAPH conference, he began bravely by granting the audience "unequivocal permission to record video, audio, and to use those recordings ... in all media now known or yet to be invented throughout the known universe." This past Wednesday, two days after the speech, the keynote was available on YouTube.

In the speech, Doctorow, co-editor of Boing Boing, outlined the current state of copyright and digital rights management, providing details and examples that took the conversation far beyond the typically polarized copyright debate that divides the analysis into two mutually exclusive parts - either bad or good. In warming up to a proposal of his own set of laws, he outlined an important issue affecting those experimenting on multiple portable platforms such as the iPhone, iPad, Android, and other emerging devices. Apple served as the central example because of its sophisticated management of DRM, backed by the fact that the company is generally good at what it does. Doctorow's concern about Apple's proprietary restrictions on transferring purchases from iTunes or the App Store was compounded by a recent report in the Guardian that a German patent court had granted Apple a preliminary injunction that would prevent any import of Samsung's new Galaxy tablet into the country. This is certainly a concern for consumers and adds to the importance of Doctorow’s speech - but it’s an even bigger concern for artists who are experimenting on these platforms. As more artists make apps for the App Store, they are opting into a restricted environment. If a consumer buys their app and wants to transfer it to another device, they have no recourse except to ask Apple for permission. The chance that Apple will forgo its ownership of the app's DRM for the sake of creative freedom is slim. Combined with the myriad extraneous copyright laws that Doctorow outlines, and the fact, as he states it, that artists are by far the most aggressive copiers and producers of content, there is definitely reason to be concerned.

The second half of the keynote was spent reviewing Doctorow's three laws:
1: "Any time someone puts a lock on something that belongs to you and won't give you the key, they didn't put the lock there for your benefit."
2: "Fame won't guarantee fortune, but no one has ever gotten rich by being obscure."
3: "Information doesn't want to be free, people do."
Throughout his explanation of the laws, Doctorow raised many interesting points and managed to make a few well-timed jokes - there was even one about kittens. In the end, the laws served more to highlight the unfair copyright practices currently in use around the world and to give his argument a more critical angle. Before concluding, he made a universal plea asking for freedom for his daughter, his country, and our collective digital future. He finished with a charge to take action and call for laws friendly to creatives and creative industries - laws that encourage production without the fear of surveillance or a loss of rights.

by Jason Huff, Rhizome (2011) |  Read more (transcript):

If This Was a Pill, You’d Do Anything to Get It


When Ken Coburn has visitors to the cramped offices of Health Quality Partners in Doylestown, Pa., he likes to show them a graph. It’s not his graph, he’s quick to say. Coburn is not the sort to take credit for others’ work. But it’s a graph that explains why he’s doing what he’s doing. It’s a graph he particularly wishes the folks who run Medicare would see, because if they did, then there’s no way they’d be threatening to shut down his program.

The graph shows the U.S. death rate for infectious diseases between 1900 and 1996. The line starts all the way at the top. In 1900, 800 of every 100,000 Americans died from infectious diseases. The top killers were pneumonia, tuberculosis and diarrhea. But the line quickly begins falling. By 1920, fewer than 400 of every 100,000 Americans died from infectious diseases. By 1940, it was less than 200. By 1960, it was below 100. When’s the last time you heard of an American dying from diarrhea?

“For all the millennia before this in human history,” Coburn says, “it was all about tuberculosis and diarrheal diseases and all the other infectious disease. The idea that anybody lived long enough to be confronting chronic diseases is a new invention. Average life expectancy was 45 years old at the turn of the century. You didn’t have 85-year-olds with chronic diseases.”

With chronic illnesses like diabetes and heart disease you don’t get better, or at least not quickly. They don’t require cures so much as management. Their existence is often proof of medicine’s successes. Three decades ago, cancer typically killed you. Today, many cancers can be fought off for years or even indefinitely. The same is true for AIDS, acute heart failure, and so much else. This, to Coburn, is the core truth, and core problem, of today’s medical system: Its successes have changed the problems, but the health-care system hasn’t kept up.

Kenneth Thorpe, chairman of the health policy and management school at Emory University, estimates that 95 percent of spending in Medicare goes to patients with one or more chronic conditions — with enrollees suffering five or more chronic conditions accounting for 78 percent of its spending. “This is the Willie Sutton rule,” he says. “If 80 percent of the spending is going to patients with five or more conditions, that’s where our health-care system needs to go.”

Health Quality Partners is all about going there. The program enrolls Medicare patients with at least one chronic illness and one hospitalization in the past year. It then sends a trained nurse to see them every week, or every month, whether they’re healthy or sick. It sounds simple and, in a way, it is. But simple things can be revolutionary.

Most care-management systems rely on nurses sitting in call centers, checking up on patients over the phone. That model has mostly been a failure. And while many health systems send a nurse regularly in the weeks or months after a serious hospitalization, few send one regularly to even seemingly healthy patients. This is a radical redefinition of the health-care system’s role in the lives of the elderly. It redefines being old and chronically ill as a condition requiring professional medical management.

Health Quality Partners’ results have been extraordinary. According to an independent analysis by the consulting firm Mathematica, HQP has reduced hospitalizations by 33 percent and cut Medicare costs by 22 percent.

Others in the profession have taken notice. “It’s like they’ve discovered the fountain of youth in Doylestown, Pa.,” marvels Jeffrey Brenner, founder of the Camden Coalition of Healthcare Providers.

Now Medicare is thinking of shutting it off.

by Ezra Klein, Washington Post |  Read more:
Image: Amanda Voisard, for The Washington Post

Eyvind Earle, Soft Green Meadows
via:

Why Your Supermarket Only Sells 5 Kinds of Apples

Every Fall at Maine's Common Ground Country Fair, the Lollapalooza of sustainable agriculture, John Bunker sets out a display of eccentric apples. Last September, once again, they covered every possible size, shape, and color in the wide world of appleness. There was a gnarled little yellow thing called a Westfield Seek-No-Further; a purplish plum impostor called a Black Oxford; a massive, red-streaked Wolf River; and one of Thomas Jefferson's go-to fruits, the Esopus Spitzenburg. Bunker is known in Maine as "The Apple Whisperer," or simply "The Apple Guy," and, after laboring for years in semi-obscurity, he has never been in more demand. Through the catalog of Fedco Trees, a mail-order company he founded in Maine 30 years ago, Bunker has sown the seeds of a grassroots apple revolution.

All weekend long, I watched people gravitate to what Bunker ("Bunk" to his friends, a category that seems to include half the population of Maine) calls "the vibrational pull" of a table laden with bright apples. "Baldwin!" said a tiny old man with white hair and intermittent teeth, pointing to a brick-red apple that was one of America's most important until the frigid winter of 1933-34 knocked it into obscurity. "That's the best!"

A leathery blonde from the coast held up a Blue Pearmain in wonder. "Blue Peahmain," she marveled. "My ma had one in her yahd."

Another woman got choked up by the sight of the Pound Sweet. "My grandmother had a Pound Sweet! She used to let me have one every time I hung out the laundry."

It wasn't just nostalgia. A steady conga line of homesteading hipsters—Henry David Thoreau meets Johnny Depp—paraded up to Bunk to get his blessing on their farm plans. "I've got three Kavanaghs and two Cox's Orange Pippins for fresh eating, a Wolf River for baking, and three Black Oxfords for winter keeping, but I feel like there are some gaps I need to fill. What do you recommend for cider?" Bunk, who is 62, dished out free advice through flayed vocal cords that made his words sound as if they were made of New England slate.

Most people approached with apples in hand, hoping for an ID of the tree that had been in their driveway or field ever since they bought the place. Some showed him photos on iPhones. Everywhere he travels in Maine, from the Common Ground Country Fair to the many Rotary Clubs and historical societies where he speaks, Bunk is presented with a series of mystery apples to identify. He's happy to oblige, but what he's really looking for are the ones he can't identify. It's all part of being an apple detective.

In the mid-1800s, there were thousands of unique varieties of apples in the United States, some of the most astounding diversity ever developed in a food crop. Then industrial agriculture crushed that world. The apple industry settled on a handful of varieties to promote worldwide, and the rest were forgotten. They became commercially extinct—but not quite biologically extinct.

by Rowan Jacobsen, Mother Jones |  Read more:
Image: USDA

Saturday, April 27, 2013


Rob Hann
via:

Pinging the Whole Internet


You probably haven’t heard of HD Moore, but up to a few weeks ago every Internet device in the world, perhaps including some in your own home, was contacted roughly three times a day by a stack of computers that sit overheating in his spare room. “I have a lot of cooling equipment to make sure my house doesn’t catch on fire,” says Moore, who leads research at computer security company Rapid7. In February last year he decided to carry out a personal census of every device on the Internet as a hobby. “This is not my day job; it’s what I do for fun,” he says.

Moore has now put that fun on hold. “[It] drew quite a lot of complaints, hate mail, and calls from law enforcement,” he says. But the data collected has revealed some serious security problems, and exposed some vulnerable business and industrial systems of a kind used to control everything from traffic lights to power infrastructure.

Moore’s census involved regularly sending simple, automated messages to each one of the 3.7 billion IP addresses assigned to devices connected to the Internet around the world (Google, in contrast, collects information offered publicly by websites). Many of the two terabytes (2,000 gigabytes) worth of replies Moore received from 310 million IPs indicated that they came from devices vulnerable to well-known flaws, or configured in a way that could let anyone take control of them.
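
[ed. For a rough sense of what "sending simple, automated messages" to an address range looks like in practice, here is a minimal sketch in Python. It is not Moore's actual tooling, which the article doesn't detail; it simply attempts one TCP connection per address in a small block you control, and the port, timeout, and address range below are illustrative assumptions only.

import ipaddress
import socket

def probe(ip, port=80, timeout=0.5):
    """Return True if `ip` accepts a TCP connection on `port`."""
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:
        return False

def sweep(cidr, port=80):
    """Probe every host address in `cidr`, yielding the ones that answered."""
    for addr in ipaddress.ip_network(cidr).hosts():
        if probe(str(addr), port):
            yield str(addr)

# Example: sweep a small private block on a network you own.
for responder in sweep("192.168.1.0/28"):
    print("responded:", responder)

A census on Moore's scale parallelizes probes like this across many machines and protocols; a sequential sweep of all 3.7 billion addresses at half a second per timeout would take decades. Scanning addresses you don't control can violate laws and acceptable-use policies.]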

On Tuesday, Moore published results on a particularly troubling segment of those vulnerable devices: ones that appear to be used for business and industrial systems. Over 114,000 of those control connections were logged as being on the Internet with known security flaws. Many could be accessed using default passwords, and 13,000 offered direct access through a command prompt without a password at all.

Those vulnerable accounts offer attackers significant opportunities, says Moore, including rebooting company servers and IT systems, accessing medical device logs and customer data, and even gaining access to industrial control systems at factories or power infrastructure. Moore’s latest findings were aided by a similar dataset published by an anonymous hacker last month, gathered by compromising 420,000 pieces of network hardware.

by Tom Simonite, MIT Technology Review |  Read more:
Image by Carna Botnet

Cheryl Kelley, 396, 2009, oil on aluminum panel, 36 x 48"

Fapstinence

Traditionally, people undergo a bit of self-examination when faced with a potentially fatal rupture in their long-term relationship. Thirty-two-year-old Henry* admits that what he did was a little more extreme. “If you’d told me that I wasn’t going to masturbate for 54 days, I would have told you to fuck off,” he says.

Masturbation had been part of Henry’s daily routine since childhood. Although he remembered a scandalized babysitter who “found me trying to have sex with a chair” at age 5, Henry says he never felt shame about his habit. While he was of the opinion that a man who has a committed sexual relationship with porn was probably not going to have as successful a relationship with a woman, he had no qualms about watching it. Which he did most days.

Then, early last year and shortly before his girlfriend of two years moved to Los Angeles, Henry happened to watch a TED talk by the psychologist Philip Zimbardo called “The Demise of Guys.” It described males who “prefer the asynchronistic Internet world to the spontaneous interactions in social relationships” and therefore fail to succeed in school, work, and with women. When his girlfriend left, Henry went on to watch a TEDx talk by Gary Wilson, an anatomist and physiologist, whose lecture series, “Your Brain on Porn,” claims, among other things, that porn conditions men to want constant variety—an endless set of images and fantasies—and requires them to experience increasingly heightened stimuli to feel aroused. A related link led Henry to a community of people engaged in attempts to quit masturbation on the social news site Reddit. After reading the enthusiastic posts claiming improved virility, Henry began frequenting the site.

by Emily Witt, New York Magazine | Read more:
Photo: Bobby Doherty/New York Magazine



Pierrot and Harlequin - Anya Stasenko & Slava Leontiev
via: