Thursday, September 5, 2013

Rhumba

Found a baby rattlesnake in the house when we got here last fall—not dead yet but dying, stretched S-like on the large glue board I’d set out to catch the scorpions and daddy longlegs that hold barn dances in our empty house while we’re gone. This is my fancy about what the critters do in our absence: dozens of pale tan creatures, barbed tails arced high, sashay their long-legged partners across the living room floor, or lasso and ride the skittery six-inch black centipedes that sometimes scurry along our baseboards flaunting their wicked orange feelers. We find their desiccated corpses under chairs, beneath windows, laid out in the empty laundry basket when we return in the fall. My mind thinks: What has exhausted them so? A barn dance. A play party. A Wild West rodeo. More likely it’s the poison I spray in the crevices before we close up the house to head north for the summers, but I enjoy the barn-dance fancy.

Our rattlesnake problem isn’t fancy, though. It’s fact.

“Watch out for the snakes,” my husband and I tell each other when we go outside to walk or work. Our rock house sits on a rocky bluff in the Sans Bois Mountains of southeastern Oklahoma. Somewhere north on the ridge behind us is a winter den. If you’ve seen the rattler scene in True Grit, you’ve got the picture. This land where we live is the same country. The Sans Bois and the Winding Stair are twin ranges: rugged, low, humpbacked mountains slashing east to west above the plains, thick with hickory, oak, southern pine; they hold the same jagged sandstone bluffs and limestone caverns. When the days shorten, the rattlers come passing through on their way to that hidden crevice, where, until spring, they’ll sleep entwined in great complicated knots, moil all over each other, crawl outside to sun themselves on warm days. A group of rattlers gathered together like that is called a rhumba, by the way. A rhumba of rattlesnakes.

One strangely warm November day in 2011, Paul and I watched a five-foot diamondback crawl out from beneath the ramp to our shed. The snake was beautiful, really, ruddier than most, and as thick around as my forearm, its rattles lifted high as it moved in no particular hurry but with clear purpose, down into a rocky drainage ditch, up again on the far side, continuing across the ridge in obedience to that den’s siren song. I come from a people who will not allow any poisonous snake to live. My husband is a city guy from Boston with no family tradition of snake killing. Whether either of us could have brought ourselves to shoot that diamondback is a moot point, however. We didn’t have a gun.

My dad has been mentioning this lack ever since we bought the place years ago. He took one look at the rocky ridge behind the house, the huge sandstone slabs that lie tumbled about the property like a giant’s toy blocks—a thousand places for a rattler to love—and shook his head. “Y’all might want to think about getting yourselves a gun.”

Okay, we’d think about it, we said. (...)

I wrote a story once about a fellow who liked to drink in bars in southeastern Oklahoma with a young rattlesnake coiled under his hat—he’d sweep off his hat to reveal the baby rattler, just to shock and impress the ladies. That fiction was based on a true story my dad told about a cowboy snake wrangler from Heavener who did such things, and died from it. When I was researching the story, I learned a few facts. Rattlesnakes can strike two-thirds their own length and get back to coil so fast the human eye can hardly see it. They have medium-to-poor eyesight, a perfected sense of smell through that constantly flickering tongue, and excellent motion detectors via bones in their jaws that can feel the tiniest mammal footfall. They have acute heat sensors in the pits between eye and nostril—that’s what gives them the name “pit viper.” When a rattler strikes, its fangs pierce the flesh, instantaneously injecting venom as if through a hypodermic needle. Venom is the correct term, of course, not poison, although locally we still, and probably always will, call them poisonous snakes. In humans, a rattlesnake bite is horrifically painful. Symptoms include swelling, hemorrhage, lowered blood pressure, difficulty breathing, increased heart rate, fever, sweating, weakness, giddiness, nausea, vomiting, intense burning pain. If left untreated, death can come within hours.

Most species of rattlesnake—the velvet-tail, the ground rattler, even the eastern diamondback—when disturbed, will try to withdraw. Not the western diamondback: it will stand its ground. It may even advance to get within better striking distance. Western diamondbacks account for the majority of snakebite deaths in the United States. They account for most of the rattlesnakes we see on our mountain. Two weeks after the shed-ramp rattler, we encountered another. This time the whole family was there to witness.

by Rilla Askew, Tin House |  Read more:
Image: Wikipedia

Keith Stanley, 365 Days of Ikebana - Day 26
via:

The Global Elite’s Favorite Strongman

Paul Kagame, the president of Rwanda, agreed to meet me at 11 a.m. on a recent Saturday. Kagame’s office is on top of a hill near the center of Kigali, Rwanda’s capital, and I took a taxi there, driven by a man in a suit and tie. Whenever I’m in Kigali, I am always impressed by how spotless it is, how the city hums with efficiency, which is all the more remarkable considering that Rwanda remains one of the poorest nations in the world. Even on a Saturday morning, platoons of women in white gloves rhythmically swept the streets, softly singing to themselves. I passed the Union Trade Center mall in the middle of town, where traffic circulates smoothly around a giant fountain. There was no garbage in the streets and none of the black plastic bags that get tangled up in the fences and trees of so many other African cities — Kagame’s government has banned them. There were no homeless youth sleeping on the sidewalks or huffing glue to kill their hunger. In Rwanda, vagrants and petty criminals have been scooped up by the police and sent to a youth “rehabilitation center” on an island in the middle of Lake Kivu that some Rwandan officials jokingly call their Hawaii — because it is so lush and beautiful — though people in Kigali whisper about it as if it were Alcatraz. There aren’t even large slums in Kigali, because the government simply doesn’t allow them.

The night before, I strolled back to my hotel from a restaurant well past midnight — a stupid idea in just about any other African capital. But Rwanda is one of the safest places I’ve been, this side of Zurich, which is hard to reconcile with the fact that less than 20 years ago more civilians were murdered here in a three-month spree of madness than during just about any other three-month period in human history, including the Holocaust. During Rwanda’s genocide, the majority Hutus turned on the minority Tutsis, slaughtering an estimated one million men, women and children, most dispatched by machetes or crude clubs. Rwandans say it is difficult for any outsider to appreciate how horrifying it was. Nowadays, it’s hard to find even a jaywalker.

No country in Africa, if not the world, has so thoroughly turned itself around in so short a time, and Kagame has shrewdly directed the transformation. Measured against many of his colleagues, like the megalomaniac Robert Mugabe of Zimbabwe, who ran a beautiful, prosperous nation straight into the ground, or the Democratic Republic of Congo’s amiable but feckless Joseph Kabila, who is said to play video games while his country falls apart, Kagame seems like a godsend. Spartan, stoic, analytical and austere, he routinely stays up until 2 or 3 a.m. to thumb through back issues of The Economist or study progress reports from red-dirt villages across his country, constantly searching for better, more efficient ways to stretch the billion dollars his government gets each year from donor nations that hold him up as a shining example of what aid money can do in Africa. He is a regular at Davos, the World Economic Forum, and friendly with powerful people, including Bill Gates and Bono. The Clinton Global Initiative honored him with a Global Citizen award, and Bill Clinton said that Kagame “freed the heart and the mind of his people.”

This praise comes in part because Kagame has made indisputable progress fighting the single greatest ill in Africa: poverty. Rwanda is still very poor — the average Rwandan lives on less than $1.50 a day — but it is a lot less poor than it used to be. Kagame’s government has reduced child mortality by 70 percent; expanded the economy by an average of 8 percent annually over the past five years; and set up a national health-insurance program — which Western experts had said was impossible in a destitute African country. Progressive in many ways, Kagame has pushed for more women in political office, and today Rwanda has a higher percentage of them in Parliament than any other country. His countless devotees, at home and abroad, say he has also delicately re-engineered Rwandan society to defuse ethnic rivalry, the issue that exploded there in 1994 and that stalks so many African countries, often dragging them into civil war.

But Kagame may be the most complicated leader in Africa. The question is not so much about his results but his methods. He has a reputation for being merciless and brutal, and as the accolades have stacked up, he has cracked down on his own people and covertly supported murderous rebel groups in neighboring Congo. At least, that is what a growing number of critics say, including high-ranking United Nations officials and Western diplomats, not to mention the countless Rwandan dissidents who have recently fled. They argue that Kagame’s tidy, up-and-coming little country, sometimes described as the Singapore of Africa, is now one of the most straitjacketed in the world. Few people inside Rwanda feel comfortable speaking freely about the president, and many aspects of life are dictated by the government — Kagame’s administration recently embarked on an “eradication campaign” of all grass-roofed huts, which the government meticulously counted (in 2009 there were 124,671). In some areas of the country, there are rules, enforced by village commissars, banning people from dressing in dirty clothes or sharing straws when drinking from a traditional pot of beer, even in their own homes, because the government considers it unhygienic. Many Rwandans told me that they feel as if their president is personally watching them. “It’s like there’s an invisible eye everywhere,” said Alice Muhirwa, a member of an opposition political party. “Kagame’s eye.”

by Jeffrey Gettleman, NY Times |  Read more:
Image: Nadav Kander

Cancer's Origins Revealed

Researchers have provided the first comprehensive compendium of mutational processes that drive tumour development. Together, these mutational processes explain most mutations found in 30 of the most common cancer types. This new understanding of cancer development could help to treat and prevent a wide range of cancers.

Each mutational process leaves a particular pattern of mutations, an imprint or signature, in the genomes of cancers it has caused. By studying 7,042 genomes of people with the most common forms of cancer, the team uncovered more than 20 signatures of processes that mutate DNA. For many of the signatures, they also identified the underlying biological process responsible.
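
[ed. For readers curious what "uncovering signatures" looks like in practice: the usual idea is to tally each tumour's mutations into a count matrix and factorise it into a handful of recurring patterns. The sketch below uses non-negative matrix factorization on made-up data; the press release doesn't say which method the team used, so the approach, the sample counts and the variable names are illustrative assumptions only.]

```python
# Hypothetical sketch of signature extraction via non-negative matrix
# factorization (NMF); the method, sizes and data here are illustrative only.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Rows: tumour samples; columns: the 96 trinucleotide mutation contexts
# commonly used to describe single-base substitutions.
n_samples, n_contexts, n_signatures = 200, 96, 5
counts = rng.poisson(lam=3.0, size=(n_samples, n_contexts)).astype(float)

# Factorise counts ~ exposures @ signatures, with every entry non-negative.
model = NMF(n_components=n_signatures, init="nndsvda", max_iter=500, random_state=0)
exposures = model.fit_transform(counts)   # contribution of each signature to each tumour
signatures = model.components_            # each row is one candidate mutational signature

# Normalise each signature so its 96 context probabilities sum to 1.
signatures = signatures / signatures.sum(axis=1, keepdims=True)
print(exposures.shape, signatures.shape)  # (200, 5) (5, 96)
```

Here `exposures` records how strongly each signature appears in each tumour, and each row of `signatures` is one recurring mutation pattern.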

All cancers are caused by mutations in DNA occurring in cells of the body during a person's lifetime. Although we know that chemicals in tobacco smoke cause mutations in lung cells that lead to lung cancers and ultraviolet light causes mutations in skin cells that lead to skin cancers, we have remarkably little understanding of the biological processes that cause the mutations which are responsible for the development of most cancers.

“We have identified the majority of the mutational signatures that explain the genetic development and history of cancers in patients,” says Ludmil Alexandrov, first author from the Wellcome Trust Sanger Institute. “We are now beginning to understand the complicated biological processes that occur over time and leave these residual mutational signatures on cancer genomes.”

All of the cancers contained two or more signatures, reflecting the variety of processes that work together during the development of cancer. However, different cancers have different numbers of mutational processes. For example, two mutational processes underlie the development of ovarian cancer, while six mutational processes underlie the development of liver cancer.

Some of the mutational signatures are found in multiple cancer types, while others are confined to a single cancer type. Out of the 30 cancer types, 25 had signatures arising from age-related mutational processes. Another signature, caused by defects in DNA repair arising from mutations in the breast cancer susceptibility genes BRCA1 and BRCA2, was found in breast, ovarian and pancreatic cancers.

by Wellcome Trust Sanger Institute |  Read more:
Image: WTSI

Wednesday, September 4, 2013

Porn: the Shocking Truth

I recently came across a 15-year-old pupil of mine absolutely beside herself with grief. After a certain amount of cajoling, she eventually wailed that some boys in her class had called her “brainy”, before bursting into a fresh flood of hysteria.

Fumbling for the Kleenex, and wondering - not for the first time - what the hell had happened with feminism and this generation of teenagers, I attempted to mop up her tears and explain that being described as “brainy” was a Good Thing and something she should be pleased about, even if the idiotic boys were too stupid to recognise that. “I love being called brainy,” I said, soothingly.

These words had the desired effect, or so I thought. She instantly stopped crying and looked at me in astonishment. “But that’s nasty, Miss,” she cried.

It was now my turn to look puzzled. She sighed. “Miss, being brainy means you give head to a lot of people. Giving brain means giving head.”

This conversation crystallised something that had been nagging at me for quite some time.

Language is a powerful signifier for all humans, but perhaps more for teenagers than most. For them, language is not just a tool of communication, but a means of establishing class, race, religion, which part of town they’re from, the music they listen to, the groups to which they are affiliated and myriad other things that adults forgot when they hit 20. This has always been the case, but what is new among the teenagers I work with is the casual appropriation of what were once compliments as insults - invariably targeted at females. If a girl “gives character”, she needs slapping down. If she’s “lively”, she’s slutty.

Teenagers are not particularly thoughtful about their words or actions, so when you spot a trend, worrying or otherwise, the only way to understand it is to ask them why they’re doing it. It is actually surprising how much this question surprises them - but it gets them talking.

“Sex is a way to get girls to do more stuff weirdly. If they do something and you threaten to tell everyone, they do more stuff.”

“I hated giving blow jobs, but didn’t want to look weird. And he said I was a freak if I didn’t. He was my best mate, but sex turned him into someone else.”

If this was a purely linguistic trend we could all sigh with relief, put the kettle on and hope it goes away. But it’s not just a linguistic one. It is manifesting in the thoughts, actions, character and behaviour of teenagers everywhere, and if you think it’s just the bad ones, the naughty ones, the poor ones, the sink estate ones, you are being very naive indeed.

This is, thanks to the internet, the first generation with free, easy and mass exposure to hardcore pornography. These are the first teenagers to have grown up with “sexting”, sex tapes, making their own sex tapes on phones, saucy snaps of classmates on Facebook, MSN orgies and extensive insight into the sex lives of celebrities and politicians - hell, even teachers.

It is impossible to discuss the long-term effect this will have on a generation of teenagers because we’re not there yet, but I can tell you about the short-term effects. I’ve been observing them closely for a couple of years. And some of what I’ve witnessed would be shocking to less seasoned adults.

by Chloe Combi, TES Connect | Read more:
Image via:

Where Nokia Went Wrong

Nokia’s agreement on Tuesday to sell its handset business to Microsoft for $7.2 billion is something of a minor business coup for Nokia, since a year from now that business might well turn out to have been worth nothing. It also demonstrates just how far and fast Nokia has fallen in recent years. Not that long ago, it was the world’s dominant and pace-setting mobile-phone maker. Today, it has just three per cent of the global smartphone market, and its market cap is a fifth of what it was in 2007—even after rising more than thirty per cent on Tuesday.

What happened to Nokia is no secret: Apple and Android crushed it. But the reasons for that failure are a bit more mysterious. Historically, after all, Nokia had been a surprisingly adaptive company, moving in and out of many different businesses—paper, electricity, rubber galoshes. Recently, it successfully reinvented itself again. For years, the company had been a conglomerate, with a number of disparate businesses operating under the Nokia umbrella; in the early nineteen-nineties, anticipating the rise of cell phones, executives got rid of everything but the telecom business. Even more strikingly, Nokia was hardly a technological laggard—on the contrary, it came up with its first smartphone back in 1996, and built a prototype of a touch-screen, Internet-enabled phone at the end of the nineties. It also spent enormous amounts of money on research and development. What it was unable to do, though, was translate all that R. & D. spending into products that people actually wanted to buy.

One way to explain this is to point out that Nokia was an engineering company that needed more marketing savvy. But this isn’t quite right; in the early aughts, Nokia was acclaimed for its marketing, and was seen as the company that had best figured out how to turn mobile phones into fashion accessories. It’s more accurate to say that Nokia was, at its heart, a hardware company rather than a software company—that is, its engineers were expert at building physical devices, but not the programs that make those devices work. In the end, the company profoundly underestimated the importance of software, including the apps that run on smartphones, to the experience of using a phone. (...)

[ed. From the Comments Section: This article repeats common misperceptions: “Nokia was acclaimed for its marketing, and was seen as the company that had best figured out how to turn mobile phones into fashion accessories.” In fact, Nokia had hired a good industrial designer and produced fairly good-looking phones for a while; this was not because of any deep consumer insight. It was a cool product from an interesting company that few had heard of from a country that few knew much about. Many within the company believed that their growing and soon-to-be-dominant market share was proof of marketing prowess. There was no traditional consumer research-based approach to product design. (As we have seen, you can skip that step if your company is headed by Steve Jobs.) So despite spending money on research and paying lip service to consumer needs, the product development process was never based on deep consumer insight. There was almost an attitude within the company that people had learned to use Nokia devices and the Nokia UI because they loved the company and the product, and therefore would stay with Nokia in the future.

This is a fundamental flaw that doesn't show up in a way that is plain enough for senior management to understand until you stop having the hot product from the cool company. We in the States saw this slip fairly early on when young people started telling us in research that Nokia phones were what their parents owned and not a product or brand that spoke to them. (Yes, there was an appreciation of industrial design within the company, but design was far removed, physically and metaphorically, from the company, where there was never a true design ethos—beyond the physical appearance of the phone.)

Nowhere was this gap between self-perception and reality greater than with the "Communicator." Referred to by some as "the brick," the Communicator went through several iterations, with very little software or form factor innovation, and ultimately very modest sales (and a complete flop in the US). As with most large companies, they were operating within an echo chamber. Many top executives used Communicators, despite their un-user-friendly software and hardware, and you even saw administrative assistants using these expensive, high-end devices at Nokia's HQ in Espoo.

There was also a profound failure to understand the importance of the ecosystem that surrounds a platform, something that Apple has capitalized on masterfully. This mistake was grounded in the belief that it was hardware that mattered. The company’s approach to N-Gage, Nokia’s attempt to establish a mobile games platform, was a great demonstration of this. As a member of the global developer organization, I was sitting in the audience in Espoo when Anssi Vanjoki announced the device. We were shocked. No one had briefed the developer organization management, and as it turns out, that was by design. There was no plan for a broad games developer effort and no understanding of the importance of one. Instead, they had adopted the old console approach of working with a small number of hand-picked publishers. (It is no small irony that Finland’s most successful mobile phone export of the last few years has been the global game hit, Angry Birds.)]

by James Surowiecki, The New Yorker |  Read more:
Image: Angel Franco/The New York Times/Redux

Silicon Valley Tactics to Keep Control

The gods of Silicon Valley have repeatedly sought to take the companies they founded public while retaining control as if they were still private.

Recent events at Google and other technology companies suggest that this control may be bad not only for the companies but also for the founders, who are increasingly living in a world bereft of checks and balances.

Silicon Valley has for the most part held public shareholders in mixed regard. Preferring to keep them on the sidelines is not a new development.

Google was a leader in this movement. When it went public in 2004, it did so using a dual-class structure. Its co-founders, Sergey Brin and Larry Page, were issued shares with 10 votes apiece, while public shareholders received shares with only one vote. The idea was to ensure that the co-founders kept control of Google even if they sold some of their shares.

But boundaries get pushed, as does everything in the tech world. Facebook went further in restricting shareholder control when it went public by adopting a dual-class structure that allowed its co-founder Mark Zuckerberg to keep control even if he owned less than 10 percent of the company. In fact, if Mr. Zuckerberg dies, his heirs still have the potential to control the company.
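
[ed. The arithmetic behind keeping control while selling down is simple enough to sketch. With a 10-votes-per-share class like the one described above, founders holding a modest slice of the equity can still command a majority of the votes. The numbers below are hypothetical, not Google's or Facebook's actual share counts.]

```python
# Illustrative arithmetic only: hypothetical share counts, not Google's or
# Facebook's actual capital structure.
def voting_power(founder_shares: float, public_shares: float,
                 founder_votes_per_share: int = 10) -> tuple[float, float]:
    """Return (economic ownership, voting control) as fractions of the total."""
    total_equity = founder_shares + public_shares
    total_votes = founder_shares * founder_votes_per_share + public_shares
    economic = founder_shares / total_equity
    voting = founder_shares * founder_votes_per_share / total_votes
    return economic, voting

# Founders holding only 15% of the equity under a 10:1 dual-class structure...
economic, voting = voting_power(founder_shares=15, public_shares=85)
print(f"economic stake: {economic:.0%}, voting control: {voting:.0%}")
# -> economic stake: 15%, voting control: 64%
```

Shrink the founders' stake further or raise the votes per share and the same pattern holds: economic ownership and voting control come apart.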

Putting this in perspective, had Apple gone public with Facebook’s structure, Steven P. Jobs’s widow, Laurene Powell Jobs, and Apple’s co-founder Steve Wozniak (most recently a “Dancing With the Stars” contestant) would possibly still be in control.

Not to be outdone, Google proposed last year that the company issue a new class of shares with no voting rights. According to public documents filed by Google, this share class was put into place at the behest of the founders, being justified by Mr. Brin and Mr. Page as allowing them to “concentrate on the long term.”

The idea is that absent the pressures of the public market, executives can look after the long-term best interests of the company. This will allow Google to experiment with things like Google Glass, which may not be immediately profitable.

It’s an alluring argument. On the plus side, it allows the company to look out for a wide array of interests beyond those of shareholders focused solely on the stock price. In the media, this structure has worked with some success (The New York Times Company, for example, has a dual-class structure).

But the problem with this structure is that the shareholders’ voice of dissent is locked out. And studies have shown that in general, this type of dual-class structure does not perform as well as traditional arrangements.

by Steven M. Davidoff, Dealbook |  Read more:
Image: Harry Campbell

Magnolia
via:

Tuesday, September 3, 2013


Dubai from above.
via:

Planetary Boundaries


[ed. At least somebody's thinking about these things (pdf).]

The planetary boundaries framework quickly became popular among various stakeholders, arguably because of its scientific grounding combined with its intuitive rationale and easily accessible visual presentation. A common request since its publication has been to downscale the planetary boundaries to the level of “individuals”, companies and countries, that is, what is required for each to stay within the “safe operating space”.
                                                                                             
— Swedish study on methodology
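
[ed. One concrete way to read "downscaling": take a global boundary and divide it among countries or individuals under some sharing rule. The toy sketch below splits a notional CO2 budget equally per capita; the budget figure, the populations and the allocation rule are illustrative assumptions, not the study's actual methodology.]

```python
# Toy equal-per-capita downscaling of a single global boundary.
# The budget and population figures are made-up round numbers for illustration.
GLOBAL_CO2_BUDGET_GT = 11.0      # notional annual global CO2 budget, gigatonnes
WORLD_POPULATION = 7.1e9         # roughly the 2013 world population

populations = {"Sweden": 9.6e6, "United States": 316e6, "India": 1.25e9}

per_capita_tonnes = GLOBAL_CO2_BUDGET_GT * 1e9 / WORLD_POPULATION  # tonnes per person

for country, pop in populations.items():
    national_share_mt = per_capita_tonnes * pop / 1e6  # megatonnes per year
    print(f"{country}: {national_share_mt:,.1f} Mt CO2/yr under an equal-per-capita rule")
```

Different sharing rules would of course give very different national shares, which is part of what makes the downscaling question hard.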
via:

Do You Believe in Sharing?

While delivering his Nobel lecture in 2007, Al Gore declared: “Today, we dumped another 70 million tons of global-warming pollution into the thin shell of atmosphere surrounding our planet, as if it were an open sewer.”

It’s a powerful example of the way we tend to argue about the impact of the human race on the planet that supports us: statistical or scientific claims combined with a call to action. But the argument misses something important: if we are to act, then how? Who must do what, who will benefit and how will all this be agreed and policed?

To ask how people work together to deal with environmental problems is to ask one of the fundamental questions in social science: how do people work together at all? This is the story of two researchers who attacked the question in very different ways – and with very different results.

“The Tragedy of the Commons” is a seminal article about why some environmental problems are so hard to solve. It was published in the journal Science in 1968 and its influence was huge. Partly this was the zeitgeist: the late 1960s and early 1970s was an era of big environmental legislation and regulation in the US. Yet that cannot be the only reason that the “tragedy of the commons” has joined a very small group of concepts – such as the “prisoner’s dilemma” or the “selfish gene” – to have escaped from academia to take on a life of their own.

The credit must go to Garrett Hardin, the man who coined the phrase and wrote the article. Hardin was a respected ecologist but “The Tragedy of the Commons” wasn’t an ecological study. It wasn’t really a piece of original research at all.

“Nothing he wrote in there had not been said by fisheries economists,” says Daniel Cole, a professor at Indiana University and a scholar of Hardin’s research. The key idea, indeed, goes back to Aristotle. Hardin’s genius was in developing a powerful, succinct story with a memorable name.

The story goes as follows: imagine common pasture, land owned by everyone and no one, “open to all” for grazing livestock. Now consider the incentives faced by people bringing animals to feed. Each new cow brought to the pasture represents pure private profit for the individual herdsman in question. But the commons cannot sustain an infinite number of cows. At some stage it will be overgrazed and the ecosystem may fail. That risk is not borne by any individual, however, but by society as a whole.

With a little mathematical elaboration Hardin showed that these incentives led inescapably to ecological disaster and the collapse of the commons. The idea of a communally owned resource might be appealing but it was ultimately self-defeating.
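
[ed. Hardin's "little mathematical elaboration" is easy to mimic with a toy model: each herdsman pockets the full value of an extra cow while the grazing damage is shared by all, so the herd keeps growing well past the size that would be best for the group. The payoff function, numbers and stopping rule below are my own illustrative assumptions, not Hardin's original mathematics.]

```python
# Toy model of the commons (my own illustrative assumptions, not Hardin's math).
# The value of a cow falls as the shared pasture fills up, reaching zero at collapse.
def value_per_cow(total_cows: int, capacity: int = 100) -> float:
    return max(0.0, 1.0 - total_cows / capacity)

herders = 10
cows = [0] * herders

while True:
    grew = False
    for i in range(herders):
        total = sum(cows)
        # Private gain: this herder keeps the full value of one more cow.
        private_gain = value_per_cow(total + 1)
        # Shared cost: the drop in value of the existing herd, split among all herders.
        shared_cost = (value_per_cow(total) - value_per_cow(total + 1)) * total / herders
        if private_gain > shared_cost:
            cows[i] += 1
            grew = True
    if not grew:
        break

total = sum(cows)
print(total, round(value_per_cow(total), 2))  # -> 90 cows, each worth only 0.1
```

With these numbers the group would do best to stop at about 50 cows; individual incentives push the herd to about 90, where each animal is worth almost nothing.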

It was in this context that Hardin deployed the word “tragedy”. He didn’t use it to suggest that this was sad. He meant that this was inevitable. Hardin, who argued that much of the natural sciences was bounded by limits – such as the speed of light or the force of gravity – quoted the philosopher Alfred North Whitehead, who wrote that tragedy “resides in the solemnity of the remorseless working of things”.

Lin Ostrom never believed in “the remorseless working of things”. Born Elinor Awan in Los Angeles in 1933, by the time she first saw Garrett Hardin present his ideas she had already beaten the odds.

Lin was brought up in Depression-era poverty after her Jewish father left her Protestant mother. She was bullied at school – Beverly Hills High, of all places – because she was half-Jewish. She divorced her first husband, Charles Scott, after he discouraged her from pursuing an academic career, where she suffered discrimination for years. Initially steered away from mathematics at school, Lin was rejected by the economics programme at UCLA. She was only – finally – accepted on a PhD in political science after observing that UCLA’s political science department hadn’t admitted a woman for 40 years.

She persevered and secured her PhD after studying the management of fresh water in Los Angeles. In the first half of the 20th century, the city’s water supply had been blighted by competing demands to pump fresh water for drinking and farming. By the 1940s, however, the conflicting parties had begun to resolve their differences. In both her PhD, which she completed in 1965, and subsequent research, Lin showed that such outcomes often came from private individuals or local associations, who came up with their own rules and then lobbied the state to enforce them. In the case of the Los Angeles water producers, they drew up contracts to share their resources and the city’s water supply stabilised.

It was only when Lin saw Hardin lecture that she realised that she had been studying the tragedy of the commons all along. It was 1968, the year that the famous article was published. Garrett Hardin was 53, in the early stages of a career as a campaigning public intellectual that would last the rest of his life. Lin was 35, now Ostrom: she had married Vincent Ostrom, a respected political scientist closer to Hardin’s age, and together they had moved to Indiana University. Watching Hardin lecture galvanised her. But that wasn’t because she was convinced he was right. It was because she was convinced that he was wrong.

In his essay, Hardin explained that there was no way to manage communal property sustainably. The only solution was to obliterate the communal aspect. Either the commons could be nationalised and managed by the state – a Leviathan for the age of environmentalism – or the commons could be privatised, divided up into little parcels and handed out to individual farmers, who would then look after their own land responsibly. The theory behind all this is impeccable and, despite coming from a biologist, highly appealing to anyone with an economics training.

But Lin Ostrom could see that there must be something wrong with the logic. Her research on managing water in Los Angeles, watching hundreds of different actors hammer out their messy yet functional agreements, provided a powerful counter-example to Hardin. She knew of other examples, too, in which common resources had been managed sustainably without Hardin’s black-or-white solutions.

The problem with Hardin’s logic was the very first step: the assumption that communally owned land was a free-for-all. It wasn’t. The commons were owned by a community. They were managed by a community. These people were neighbours. They lived next door to each other. In many cases, they set their own rules and policed those rules.

This is not to deny the existence of the tragedy of the commons altogether. Hardin’s analysis looks prescient when applied to our habit of pumping carbon dioxide into the atmosphere or overfishing the oceans. But the existence of clear counter-examples should make us hesitate before accepting Hardin’s argument that tragedy is unstoppable. Lin Ostrom knew that there was nothing inevitable about the self-destruction of “common pool resources”, as economists call them. The tragedy of the commons wasn’t a tragedy at all. It was a problem – and problems have solutions.

If Garrett Hardin and Lin Ostrom had reached different conclusions about the commons, perhaps that was because their entire approaches to academic research were different. Hardin wanted to change the world; Ostrom merely wanted to describe it.

by Tim Harford, Financial Times |  Read more:
Images via: here and here 

The Liberal Dilemma

[ed. See also: Your Labor Day Syria Reader, Part 2 with William R. Polk.]

I think this piece by Paul Waldman is a thoughtful rundown of the way many liberals are sorting out the difficult question of Syria (and why they've moved on to the discussion of the politics of it instead):
I’m paid to have opinions, and I can’t figure out what my opinion is. On one hand, Bashar Assad is a mass murderer who, it seems plain, would be happy to kill half the population of his country if it would keep him in power. On the other hand, if he was taken out in a strike tomorrow the result would probably be a whole new civil war, this time not between the government and rebels but among competing rebel groups. On one hand, there’s value in enforcing international norms against certain kinds of despicable war crimes; on the other hand, Assad killed 100,000 Syrians quite adequately with guns and bombs before everybody got really mad about the 1,400 he killed with poison gas. On one hand, a round of missile strikes isn’t going to have much beyond a symbolic effect without changing the outcome of the civil war; on the other hand, the last thing we want is to get into another protracted engagement like Iraq.
In short, we’re confronted with nothing but bad options, and anyone who thinks there’s an unambiguously right course of action is a fool. So it’s a lot easier to talk about the politics.
I honestly don't find this quite that difficult although I am sympathetic to the emotional need to "do something." For the second time today, I'll offer my maxim: "If it's not obvious that violence is the only answer then it's not the answer."

And in this case, it's actually pretty clear to me. Violence is being proposed as a symbolic gesture that virtually no one expects will change a thing for the Syrian people and which could make things worse. That's just not good enough.

by Digby, Hullabaloo |  Read more:

The New Speed of Fashion


I was chatting with the hot young London designer Jonathan Anderson, marveling at how in just three years he had matched his transgender frilly men’s wear with the addition of his intriguing women’s collections.

“What’s that?” I asked, looking at a spread of drawings on the wall of his studio-cum-workroom in London’s down-at-the-heels Dalston neighborhood. (Think: East Village.)

“Resort!” said the 28-year-old Northern Irishman whose label is known as J. W. Anderson.

Resort? Already! This guy has been in business only five years and has just 12 people in his studio. Does he really have to join the fashion treadmill, churning out more than four collections a year? A treadmill it is, as Alber Elbaz of Lanvin said with a sigh recently, before his men’s-wear show: he used to go on exploratory trips and hang out in downtown galleries, trawling for inspiration for his shows. But with the number of collections now doubled, there is no time to do much travel beyond the virtual kind.

If we accept that the pace of fashion today was part of the problem behind the decline of John Galliano and the demise of Alexander McQueen, and the cause of other well-known rehab cleanups, nonstop shows seem a high price to pay for the endless “newness” demanded of fashion now.

The strain on both budgets and designers is heavy. And only the fat-cat corporations can really afford to put on two mega ready-to-wear shows a year, or four if you add two haute couture shows, or six if you count men’s wear. Resort and prefall push the number up to eight. A couple of promotional shows in Asia, Brazil, Dubai or Moscow can bring the count to 10.

Ten shows a year! If you knock off the holiday season and the summer break, that means a show nearly every month.

But who needs more fashion and is gagging for yet another show? And how can designers cope, given that even the prolific Picasso did not churn out work like factory-baked cookies?

For all the promotional excitement attached to the international collections, it is the resort or prefall lines that are on the shelves for close to six months, while the so-called main line is in and out in about eight brief weeks.

How to make sense of this endless rush for the new when there are no longer any simple markers, like seasons? During the summer, when you are looking for a breezy maxi dress, the fall wool coats are hanging on the rails. Come early November, they will have vanished in favor of resort, which used to be called cruise, as if everyone hopped on a boat to the Caribbean with the first autumn chill.

Who are the crazy ones? The buying public demanding fashion now!, clicking online to buy during Burberry’s live-stream runway show months before the clothes are produced for the stores? The online shoppers hitting on special delivery pieces from Net-a-Porter that no one else will have — at least for the next two weeks?

Or has fashion itself gone mad, gathering speed so ferociously that it seems as if the only true luxury today is the ability to buy new and exclusive clothes every microsecond?

by Suzy Menkes, NY Times |  Read more:
Image: William Klein/Trunk Archive

Henri Rousseau, The Flamingos
via: