Tuesday, November 19, 2013

Bringing God Into It

The political left struggles, Rabbi Michael Lerner believes, because it has abandoned the spiritual values that undergird it—kindness, compassion and generosity spur the left’s concern for social justice and a benevolent approach to public policy, yet these things can’t be weighed by science or valued on the stock exchange. The left, argues Lerner, the editor of Tikkun magazine and one of the nation’s most influential progressive intellectuals and political leaders, has ceased to talk about the motivators that lend meaning to people’s lives.

“The left’s hostility to religion is one of the main reasons people who otherwise might be involved with progressive politics get turned off,” he said. “So it becomes important to ask why.

“One reason is that conservatives have historically used religion to justify oppressive social systems and political regimes. Another reason is that many of the most rigidly anti-religious folk on the left are themselves refugees from repressive religious communities. Rightly rejecting the sexism, homophobia and authoritarianism they experienced in their own religious community, they unfairly generalize that to include all religious communities, unaware of the many religious communities that have played leadership roles in combating these and other forms of social injustice. Yet a third possible reason is that some on the left have never seen a religious community that embodies progressive values. But the left enjoyed some of its greatest success in the 1960s, when it was led by a black religious community and by a religious leader, Martin Luther King Jr.”

Indeed, Lerner points out, the great changes in American society—the end of slavery, the increase of rights for women and minorities—all have their progressive origins in the religious community. It’s time to reclaim that legacy, he said, and create a new community.

“It’s not true that the left is without belief,” he said. “The left is captivated by a belief I’ve called scientism.

“Science is not the same as scientism—the belief that the only things that are real or can be known are those that can be empirically observed and measured. As a religious person, I don’t rely on science to tell me what is right and wrong or what love means or why my life is important. I understand that such questions cannot be answered through empirical observations. Claims about God, ethics, beauty and any other facet of human experience that is not subject to empirical verification—all these spiritual dimensions of life—are dismissed by the scientistic worldview as inherently unknowable and hence meaningless.

“Scientism extends far beyond an understanding and appreciation of the role of science in society,” Lerner said. “It has become the religion of the secular consciousness. Why do I say it’s a religion? Because it is a belief system that has no more scientific foundation than any other belief system. The view that that which is real and knowable is that which can be empirically verified or measured is a view that itself cannot be empirically measured or verified and thus by its own criterion is unreal or unknowable. It is a religious belief system with powerful adherents. Spiritual progressives therefore insist on the importance of distinguishing between our strong support for science and our opposition to scientism.”

Liberalism, he argues, emerged as part of the broad movement against the feudal order, which taught that God had appointed people to their place in the hierarchical economic and political order for the good of the greater whole. Our current economic system, capitalism, was created by challenging the church’s role in organizing social life, and empirical observation and rational thought became the battering ram the merchant class used to weaken the church’s authority.

“The idea that people are only motivated by material self-interest became the basis for a significant part of what we now call the political left, or labor movement, and the Democratic Party,” Lerner said. “We reduce it to, ‘It’s the economy, stupid.’ But in the research I did with thousands of middle-income working-class people, I found that there was a pervasive desire for meaning and a purpose-driven life, and for recognition by others in a nonutilitarian way, and that the absence of this kind of recognition and deprivation of meaning caused a huge amount of suffering and could best be described as a deep spiritual hunger that had little to do with how much money people were making.”

by Tim Johnson, Cascadia Weekly |  Read more:
Image: uncredited

Monday, November 18, 2013

Auto Correct

Human beings make terrible drivers. They talk on the phone and run red lights, signal to the left and turn to the right. They drink too much beer and plow into trees or veer into traffic as they swat at their kids. They have blind spots, leg cramps, seizures, and heart attacks. They rubberneck, hotdog, and take pity on turtles, cause fender benders, pileups, and head-on collisions. They nod off at the wheel, wrestle with maps, fiddle with knobs, have marital spats, take the curve too late, take the curve too hard, spill coffee in their laps, and flip over their cars. Of the ten million accidents that Americans are in every year, nine and a half million are their own damn fault.

A case in point: The driver in the lane to my right. He’s twisted halfway around in his seat, taking a picture of the Lexus that I’m riding in with an engineer named Anthony Levandowski. Both cars are heading south on Highway 880 in Oakland, going more than seventy miles an hour, yet the man takes his time. He holds his phone up to the window with both hands until the car is framed just so. Then he snaps the picture, checks it onscreen, and taps out a lengthy text message with his thumbs. By the time he puts his hands back on the wheel and glances up at the road, half a minute has passed.

Levandowski shakes his head. He’s used to this sort of thing. His Lexus is what you might call a custom model. It’s surmounted by a spinning laser turret and knobbed with cameras, radar, antennas, and G.P.S. It looks a little like an ice-cream truck, lightly weaponized for inner-city work. Levandowski used to tell people that the car was designed to chase tornadoes or to track mosquitoes, or that he belonged to an élite team of ghost hunters. But nowadays the vehicle is clearly marked: “Self-Driving Car.”

Every week for the past year and a half, Levandowski has taken the Lexus on the same slightly surreal commute. He leaves his house in Berkeley at around eight o’clock, waves goodbye to his fiancée and their son, and drives to his office in Mountain View, forty-three miles away. The ride takes him over surface streets and freeways, old salt flats and pine-green foothills, across the gusty blue of San Francisco Bay, and down into the heart of Silicon Valley. In rush-hour traffic, it can take two hours, but Levandowski doesn’t mind. He thinks of it as research. While other drivers are gawking at him, he is observing them: recording their maneuvers in his car’s sensor logs, analyzing traffic flow, and flagging any problems for future review. The only tiresome part is when there’s roadwork or an accident ahead and the Lexus insists that he take the wheel. A chime sounds, pleasant yet insistent, then a warning appears on his dashboard screen: “In one mile, prepare to resume manual control.” (...)

Not everyone finds this prospect appealing. As a commercial for the Dodge Charger put it two years ago, “Hands-free driving, cars that park themselves, an unmanned car driven by a search-engine company? We’ve seen that movie. It ends with robots harvesting our bodies for energy.” Levandowski understands the sentiment. He just has more faith in robots than most of us do. “People think that we’re going to pry the steering wheel from their cold, dead hands,” he told me, but they have it exactly wrong. Someday soon, he believes, a self-driving car will save your life. (...)

The driverless-car project occupies a lofty, garagelike space in suburban Mountain View. It’s part of a sprawling campus built by Silicon Graphics in the early nineties and repurposed by Google, the conquering army, a decade later. Like a lot of high-tech offices, it’s a mixture of the whimsical and the workaholic—candy-colored sheet metal over a sprung-steel chassis. There’s a Foosball table in the lobby, exercise balls in the sitting room, and a row of what look like clown bicycles parked out front, free for the taking. When you walk in, the first things you notice are the wacky tchotchkes on the desks: Smurfs, “Star Wars” toys, Rube Goldberg devices. The next things you notice are the desks: row after row after row, each with someone staring hard at a screen.

It had taken me two years to gain access to this place, and then only with a staff member shadowing my every step. Google guards its secrets more jealously than most. At the gourmet cafeterias that dot the campus, signs warn against “tailgaters”—corporate spies who might slink in behind an employee before the door swings shut. Once inside, though, the atmosphere shifts from vigilance to an almost missionary zeal. “We want to fundamentally change the world with this,” Sergey Brin, the co-founder of Google, told me.

Brin was dressed in a charcoal hoodie, baggy pants, and sneakers. His scruffy beard and flat, piercing gaze gave him a Rasputinish quality, dulled somewhat by his Google Glass eyewear. At one point, he asked if I’d like to try the glasses on. When I’d positioned the miniature projector in front of my right eye, a single line of text floated poignantly into view: “3:51 p.m. It’s okay.”

“As you look outside, and walk through parking lots and past multilane roads, the transportation infrastructure dominates,” Brin said. “It’s a huge tax on the land.” Most cars are used only for an hour or two a day, he said. The rest of the time, they’re parked on the street or in driveways and garages. But if cars could drive themselves, there would be no need for most people to own them. A fleet of vehicles could operate as a personalized public-transportation system, picking people up and dropping them off independently, waiting at parking lots between calls. They’d be cheaper and more efficient than taxis—by some calculations, they’d use half the fuel and a fifth the road space of ordinary cars—and far more flexible than buses or subways. Streets would clear, highways shrink, parking lots turn to parkland. “We’re not trying to fit into an existing business model,” Brin said. “We are just on such a different planet.”

by Burkhard Bilger, New Yorker |  Read more:
Image: Harry Campbell

[ed. It's a little-known fact (... or maybe not) that brown bears often step in the same footprints as other bears. I've heard of a place in S.E. Alaska where their prints are worn into solid rock. I haven't seen the site, but I have observed the phenomenon on a lot of other trails. I'll post some pics one of these days.]

via:

The Insanity of Our Food Policy

American food policy has long been rife with head-scratching illogic. We spend billions every year on farm subsidies, many of which help wealthy commercial operations to plant more crops than we need. The glut depresses world crop prices, harming farmers in developing countries. Meanwhile, millions of Americans live tenuously close to hunger, which is barely kept at bay by a food stamp program that gives most beneficiaries just a little more than $4 a day. (...)

The House has proposed cutting food stamp benefits by $40 billion over 10 years — that’s on top of $5 billion in cuts that already came into effect this month with the expiration of increases to the food stamp program that were included in the 2009 stimulus law. Meanwhile, House Republicans appear satisfied to allow farm subsidies, which totaled some $14.9 billion last year, to continue apace. Republican proposals would shift government assistance from direct payments — paid at a set rate to farmers every year to encourage them to keep growing particular crops, regardless of market fluctuations — to crop insurance premium subsidies. But this is unlikely to be any cheaper. Worse, unlike direct payments, the insurance premium subsidies carry no income limit for the farmers who would receive this form of largess. (...)

Farm subsidies were much more sensible when they began eight decades ago, in 1933, at a time when more than 40 percent of Americans lived in rural areas. Farm incomes had fallen by about a half in the first three years of the Great Depression. In that context, the subsidies were an anti-poverty program.

Now, though, the farm subsidies serve a quite different purpose. From 1995 to 2012, the top 1 percent of farms received about $1.5 million each, which together amounted to more than a quarter of all subsidies, according to the Environmental Working Group. Some three-quarters of the subsidies went to just 10 percent of farms. These farms received an average of more than $30,000 a year — about 20 times the amount received by the average individual beneficiary last year from the federal Supplemental Nutrition Assistance Program, or SNAP, commonly called food stamps.
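
[ed. For anyone who wants to check that comparison, here's a minimal back-of-the-envelope sketch in Python. The $30,000 figure is the average annual subsidy quoted above; the roughly $4.39-a-day SNAP benefit is the per-recipient figure Stiglitz cites later in the piece.]

# Rough check of the farm-subsidy vs. SNAP comparison, using only the
# figures quoted in the article (not independent data).
farm_subsidy_per_year = 30_000      # average annual subsidy to a top-10% farm, in dollars
snap_benefit_per_day = 4.39         # average SNAP benefit per recipient per day, in dollars

snap_benefit_per_year = snap_benefit_per_day * 365    # roughly $1,600 per recipient per year
ratio = farm_subsidy_per_year / snap_benefit_per_year

print(f"Annual SNAP benefit per recipient: ${snap_benefit_per_year:,.0f}")
print(f"Subsidy-to-SNAP ratio: {ratio:.1f}x")         # about 18.7, i.e. "about 20 times"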

Today, food stamps are one of the main support beams in our anti-poverty efforts. More than 80 percent of the 45 million or so Americans who participated in SNAP in 2011, the last year for which there is comprehensive data from the United States Department of Agriculture, had gross household incomes below the poverty level. (Since then, the total number of participants has expanded to nearly 48 million.) Even with that support, many of them experienced food insecurity; that is, they had trouble putting food on the table at some point during the year. (...)

This is not how America is supposed to work. In his famous 1941 “four freedoms” speech, Franklin D. Roosevelt enunciated the principle that all Americans should have certain basic economic rights, including “freedom from want.” These ideas were later embraced by the international community in the Universal Declaration of Human Rights, which also enshrined the right to adequate food. But while the United States was instrumental in advocating for these basic economic human rights on the international scene — and getting them adopted — America’s performance back home has been disappointing.

It is, of course, no surprise that, with the high level of poverty, millions of Americans have had to turn to the government to meet the basic necessities of life. And those numbers increased drastically with the onset of the Great Recession. The number of Americans on food stamps went up by more than 80 percent between 2007 and 2013.

To say that most of these Americans are technically poor only begins to get at the depth of their need. In 2012, for example, two in five SNAP recipients had gross incomes that were less than half of the poverty line. The amount they get from the program is very small — $4.39 a day per recipient. This is hardly enough to survive on, but it makes an enormous difference in the lives of those who get it: The Center on Budget and Policy Priorities estimates that SNAP lifted four million Americans out of poverty in 2010.

by Joseph E. Stiglitz, NY Times |  Read more:
Image: Javier Jaén

Peter Lindbergh for Vogue Italia May 1999.
via:

Sunday, November 17, 2013

Boulevard of Broken Dreams

Six in the morning, Beverly Hills. The air is filled with the aroma of expensive lawns, warming in the pallid sun. Plastic-bound copies of the LA Times lie before wrought iron gates, watched by security cameras, a chatter of birds, a glimpse of pink sky. Stand quite still on the sidewalk here, and the neighbourhood draws into focus. Box hedges, orange trees, the scent of magnolia. The ineluctable neatness of here.

For several blocks, Sunset Boulevard is home to LA as we know it—millionaires and billionaires, Oscar-winners and entrepreneurs, supermodels and TV shrinks. And over its high fences you catch flickers of affluence: a floodlit basketball court, a sliver of turquoise swimming pool.

But stand a little longer, and you see things that do not fit so neatly. Close to where Sunset meets the curve of Foothill Road, a woman waits at a bus stop. She is nondescript—black coat, white trainers, scarf, short hair, Trader Joe's bag. She speaks softly, as if her voice might ruffle the grass.

Her name is Petra, and she is a 64-year-old live-in housekeeper. She talks of how she moved to Los Angeles from Peru over two decades ago, and of the longing she still feels for home. Today is Sunday, her day off, so she is going to the Catholic church, two bus rides away in Culver City. The Number 2 bus draws up, and she is swallowed by the soft hiss of the doors. As the bus slides by, the faces in the windows are all Hispanic or black, all weary.

The street resumes its steady composure. A red sports car hums towards the coast, and a woman in white walks in circles in the middle of Arden Drive.

This is a story of belonging and not belonging, of preposterous wealth and immense poverty; of how, in a city where people love to be seen, so many can slip through the cracks unnoticed.

It is also the story of a single street, Sunset Boulevard, a 22-mile vein that goes from the coast to the clutter of downtown, past Sunset Strip, the Church of Scientology and on through Silver Lake. And of how, if you should choose to walk that street, from sunrise to sunset, you will come to see a city unadorned and unmade, a city at odds with itself. (...)

It is still early as Sarah the photographer and I reach Sunset Strip; the streets below the high-rises lie smooth and quiet. We can still smell the early lilacs of Beverly Hills, hear the low call of wood pigeons as we pass City National Bank, billboards for Guess jeans and Jack Daniel's. Scratched on an electricity cupboard is a warning: YOUNG HOLLYWOOD WILL PAY.

At this hour, the Strip is largely populated by late-night stragglers and morning street-sweepers. The cleaners in their orange tabards work head-down, tidying all evidence of the evening’s revelry—broken glass swept from patios, beer bottles fished from eucalyptus hedges. A group of young women in short skirts, bare legs and leopard-print heels totter by in a cloud of boozy laughter. In a bus shelter sits a young man wearing shorts, a Chanel earring and elaborate sunglasses, ready to make his weekly journey home from an electronica club. His name is Jake. "I live far," he says sleepily. "It’s in LA county, but it's far." When a woman jogs past, he looks faintly baffled by this strange collision of night and day.

Past the Viper Room, where River Phoenix died 20 years ago, and the clairvoyant and the tattoo parlour, and the window of the Hustler store, with its gimp masks and its stripper shoes and the huge sign that reads: "The Screaming O—Have One Tonight". Past the car-rental store where you can lease a Bentley, the better to impress your date or your business associates. Past the gaggle of Nickelback fans camped outside a plush hotel, hoping to catch a glimpse of them. And on to Book Soup, which has occupied this spot for nearly 40 years. Nicholas, a 63-year-old beautician, is flipping through Paris Match. "I love this place," he says. "It's the only civilised place on the Strip. I first started coming here way back in the early Eighties, when I had a little nook up there, a salon, and the choice was either to come here or get drunk in the bars."

He loves the smell of books, and he likes to buy the European magazines. "It gives me a different perspective," he explains. "There's more truth, more reality than flash. At my age I can't deal with fluff, I need something more in my brain. My daughter says to me 'Dad, what are you doing here? This is La-La Land!'"

by Laura Barton, Intelligent Life |  Read more:
Image: Sarah Lee

Saturday, November 16, 2013


Alex Colville, Refrigerator, 1977
via:

Who Is Conservation For?


Once, Gretchen Daily only had eyes for the rain forest.

Eighteen years ago, as a young scientist on the rise, Daily arrived at a renowned research station in the hills of Costa Rica armed with nearly 100 shellacked plywood platforms. As a student at Stanford University, studying under the famed biologist Paul Ehrlich, she had seen how large birds, defying expectations, seemed to thrive on small bits of forest spackled in the area's coffee plantations, when theory predicted their demise. On her return, she planned to spread her feeding platforms in staggered densities to test that observation; local kids promised to monitor the mesitas.

But when the morning came, so did the bees.

Africanized honeybees had swarmed the mesitas. The locals, always supportive of research on their lands, were peeved; every year these killer bees claim a few lives in Costa Rica. No one died, but the experiment was an utter, fast failure. "It was an 'aha!' moment," Daily said later, "but it was, 'Aha, what an idiot I've been.'" She was at a loss. She already had a spot at the station. She couldn't just leave, nor could she learn how to study a different creature before her stint was over. She knew birds, of course, but was never great at sorting species by their song, which ruled out work in the cacophonous forest. On the farms, though, she realized, she could use her eyes and master a smaller list of warbles, tying the birds' incidence to cultivation methods and the forest's verge. It was pure survey work, but it hadn't been done. And so it was that Daily looked outside the forest.

"Because of that chance of bad luck," she said. "I went out and opened my eyes and finally awakened to all the biodiversity in the countryside."

What she saw helped change the future of environmental science.

Daily crept among the arabica's cottony blooms, indexing hundreds of species thriving in what she had expected to be a dull monoculture. There were fiery-billed aracari, rufous-breasted wrens, even violaceous trogons, their golden bellies burning bright. Few of these birds—and, in later surveys, insects, frogs, bats, or other mammals—could be considered pests. There was a weave at work among the plantation, the forest, and the animals strung between them. The natural world had never left this man-made system; it was, in many ways, benefiting it, pollinating crops and chomping up berry borers.

In turn, the farmers were dependent on this natural capital, as Daily would call it, for their own economic well-being. Ehrlich had mentioned the benefits that humanity derived from nature. But why had she stayed so focused on the forest? Daily wondered. Because it seemed pristine, untouched? That was a lie; global warming was well under way. Humanity's shadow cloaked the planet, and all of its shades deserved study. "Any sensible conservation science should look at this," she thought.

Her own field of conservation biology—then a hot young science dedicated to saving endangered species, and a dominant voice in environmental science at the time—did not. And so Daily, now a professor at Stanford, along with a host of collaborators, set out to change the science.

Though Daily would never say this, her quest in many ways reflects the failure of a past generation. For decades, scientists have warned that the world is showing signs of deep environmental strain, close to suffering a great wave of human-caused species extinctions. Yet despite these calls of alarm, victories for conservation have been few and dear, and development has continued apace. Farming has grown to cover a quarter of the world's land. Fisheries and fresh water are ever closer to exhaustion. In the United States, wetlands are disappearing, and contaminants are often found in inland fish at harmful levels. Up to a third of the country's native species are at risk of extinction. In 2010 the world failed nearly every target the United Nations had set for halting biodiversity loss. And on top of all that, we are wrapped in warming at a rate unprecedented in modern times, thanks to emissions of fossil fuels. As one scientist told me, given the rising temperatures, the Joshua trees are leaving Joshua Tree National Park.

Humanity's great influence across the planet has even prompted many scientists to argue that we have left the Holocene and entered a new geological epoch, dubbed the Anthropocene. Many of the large nonprofit conservation groups, like the Nature Conservancy and the World Wildlife Fund, prompted as much by the need for new donors as by scientific imperative, have embraced the concept, emphasizing pragmatic work that protects people and the natural world. It's strange to say, but climate change came with a silver lining, says Jonathan Hoekstra, director of WWF's conservation-science program and one of Daily's collaborators.

"We were a field that always looked backwards in terms of trying to frame where we wanted to go," Hoekstra says. "It was like walking backwards through life. It was crazy when you think about it. Climate change has forced us to say, man, the world is changing. It's changing in ways that are unprecedented relative to our historic benchmarks. We need to be open to the possibility that the best possible future is going to be different, in possibly profound ways, from the past."

The rhetorical shift to a human-centered conservation has been quick, if not always easy—angry debate and ethical qualms are hallmarks of the change. But it has also called for a new kind of science, one that finds a way to understand humans, animals, and the environment at once; a science built to knit together the forest and crop rows of the Costa Rican coffee plantation. It's a science Daily has helped construct for the past two decades, combining economics and applied ecology to describe the benefits that humans gain from the natural world—drinking water, pollination, recreation. And at the base of it all is one snooze-inducing term: ecosystem services.

You can call it the jargon that ate conservation. The study of ecosystem services has exploded in recent years, passing from fad to the defining frame of conservation science. In 1995, ecosystem services were mentioned just seven times in the scholarly literature; by 2012, they came up 4,116 times. Biodiversity, once the north star of conservation, has become one light in a constellation. Even the famed Harvard biologist E.O. Wilson, a sentinel against capitulation in conservation, can now be seen singing the benefits of nature's services.

But the rise of this human-centered science has not come without pain, or loss. A cohort of leaders who only 30 years ago created another radical science—conservation biology—is increasingly marginalized. The vigor of activism has waned. And much uncertainty remains about whether ecosystem services, as it steps into the real world, will serve as a conciliatory vision to save species and the world or will simply be ignored, its models spinning away unnoticed by the powers that be. Perhaps worse, it could be taken as an apologia for climate change, absolving humanity of its collective environmental toll.

Few are more responsible for popularizing ecosystem services than Daily, yet these are fears she shares. Which is in part why, in 2005, she and several influential peers began the Natural Capital Project to apply their nascent science in the real world. It's taken time, more time than they first imagined, but in the past couple of years, the project's efforts have begun to flower, Daily says.

"I'm hoping conservation will have legitimacy and relevance like it's never had in the past," she says. "And thereby have impact and success like it hasn't really had in the past. Not on the scale that's required."

by Paul Voosen, Chronicle of Higher Education | Read more:
Image: Nick Norman, National Geographic, Aurora Photos

Sharecropping in the Cloud


Members of the contemporary tech industry speak of cloud computing with such awe and reverence that one might think that they were referring to the Kingdom of Heaven. “The cloud is for everyone. The cloud is a democracy,” declared Marc Benioff, CEO of Salesforce.com, a major business software company, in 2010.

Today, more and more companies are shifting their products and services to the cloud, most recently including Adobe with the successor to its Creative Suite of graphic design and editing software. Tech websites fill daily with articles arguing for businesses and individuals to transfer their data to the cloud. As Steve Jobs once commented, “I don’t need a hard disk in my computer if I can get to the server faster… carrying around these non-connected computers is byzantine by comparison.” Few in the industry would argue against the convenience and opportunities provided by the technology.

This consensus, however, is not without its discontents. Instead of functioning as a digital democracy, the net activist Jaron Lanier sees the cloud as more of a feudal kingdom. In his 2010 book, You Are Not a Gadget, Lanier illustrated the stratification of the digital world into “Peasants and Lords of the Clouds”: the lords own the digital architecture and are rewarded handsomely for it, while the creative class forms the peasantry, reduced to providing content for free and hoping for patronage.

To extend Lanier’s metaphor further, one might compare the emerging predominance of the cloud with the economic transition from feudalism to capitalism. As happened to their historical counterparts in the countryside during the emergence of capitalism, economic transition and technological improvements are transforming digital peasants into sharecroppers who must pay periodic fees under the lord’s terms for the privilege of utilizing software or viewing content. Historically, as today, elites used legal mechanisms combined with paeans to rights and efficiency to justify their new systems of rents and control at the expense of ordinary people. (...)

In this shift to the cloud, consumers of media are being transformed from effective owners, still legally subject to licensing restrictions but in physical possession of media, to renters, held captive by the whims of corporate rentiers backed by a tightening intellectual property regime. As Peter Frase has argued, this emphasis on intellectual property and rents has been and will remain a defining feature of contemporary capitalism.

by Harry C. Merritt, Jacobin | Read more:
Image: Florian Herzinger / Wikimedia

Jackson Browne


Elizabeth Couloigner, Other Places 31 
via:

Scientists Discover World's Oldest Clam, Killing It in the Process

A team of researchers has reported that Ming the Mollusk, the oldest clam ever found, is in fact 507 years old, 102 years older than the previous estimate of its age. But that is as old as Ming will ever get.

Ming, an ocean quahog clam, was pulled up from 262-foot-deep waters off the coast of Iceland in 2006. Scientists from Bangor University, in the United Kingdom, who were studying the long-lived clams as palimpsests of climate change, analyzed the lines on its shell to estimate its age, much as alternating bands of light and dark in a fish’s ear-bones are used to tell how old the animal is. This clam was 405 years old, the team said. It was called Ming, after the 1368-to-1644 Chinese dynasty during which it was born.

But a new analysis of the clam has put the hoary mollusk at 507 years old, which means that it was born in 1499. This is the same year that the English hanged a Flemish man, Perkin Warbeck, for (doing a bad job of) pretending to be the lost son of King Edward IV and the heir to the British throne. It’s also the same year that Switzerland became its own state, the French King Louis XII got married, and Diane de Poitiers, future mistress to another French king, Henry II, was born.

When it was first found in 2006, Ming was celebrated as the world’s oldest animal: a disinterested non-observer of centuries of world upheavals, a hermetic parable of the benefits of not interacting at all with humans, with whom the clam is unlucky enough to share the planet. But, after some quibbling about whether that distinction should go to some venerable corals, the title was downgraded to “world’s oldest non-colonial animal,” because clams don’t grow in colonies as corals do. The Guinness Book of World Records simplifies the grandness of it all and just calls Ming the world’s oldest mollusk.

But this is a record that other clams are well placed to beat. That’s because Ming is not getting any older. To study Ming’s senescent insides in 2006, the researchers had to pop the clam open. Ming died. Its Wikipedia page reads in the past tense.

by Elizabeth Barber, CSM |  Read more:
Image: geobeats/YouTube

Friday, November 15, 2013


Dan Eldon
via:

The Battle of Bretton Woods

At the end of the Second World War, many thought that a lasting peace would be possible only if we learned to manage the world economy. The fact that the worst war in history had followed shortly on the heels of the worst economic crisis seemed to confirm that international political crisis and economic instability went hand in hand. In the 1940s, this was a relatively new way of thinking about interstate relations. Negotiations for the peace settlement after the First World War had largely skirted economic questions in favour of political and legal ones – settling territorial borders, for example, or the rights of national minorities. When Keynes criticised the peacemakers in 1919 for ignoring Europe’s economic troubles, and for thinking of money only in terms of booty for the victors, he was ahead of his time: ‘It is an extraordinary fact that the fundamental economic problems of a Europe starving and disintegrating before their eyes, was the one question in which it was impossible to arouse the interest of the Four,’ Keynes wrote in The Economic Consequences of the Peace, referring to the quartet of national leaders who shaped the Treaty of Versailles. Their indifference wasn’t much of a surprise: national leaders at the time had little direct experience in managing economic affairs beyond their own borders. The worldwide commercial system that had sprung up in the decades before the war had been facilitated largely through the efforts of private business and finance; the gold standard set the rules of exchange, but states mostly stayed out of the way, except when lowering trade barriers or enforcing contracts. When things went badly, they didn’t try to intervene. (...)

When the Anglo-American conversation shifted away from trade and towards the seemingly technical issues of currency and finance, progress towards a deal proceeded more smoothly. In August 1941, Keynes, now adviser to the chancellor and leading postwar economic planning, returned from negotiations over Lend-Lease in Washington to draft plans for a new international monetary regime. Over the course of several meetings from the summer of 1942, Keynes and his American counterpart, the economist and US Treasury official Harry Dexter White, traded blows over how to rewrite the monetary rules of the international economy. They made curious sparring partners: Keynes, the world-famous economist and public intellectual, pitted against White, an obscure technocrat and late-blooming academic born to working-class Jewish immigrants from Lithuania and plucked by the US Treasury from his post at a small Wisconsin university. Neither seemed to enjoy the company of the other: Keynes was disdainful of what he saw as the inferior intellect and gruff manners of the ‘aesthetically oppressive’ White, whose ‘harsh rasping voice’ proved a particular annoyance. Keynes, meanwhile, was the archetype of the haughty English lord; as White remarked to the British economist Lionel Robbins, ‘your Baron Keynes sure pees perfume.’

Squabbles aside, the two men ended up largely in agreement about the basic aims of the new international monetary system: to stabilise exchange rates; facilitate international financial co-operation; prohibit competitive currency depreciations and arbitrary alterations of currency values; and restrict the international flow of capital to prevent the short-term, speculative investments widely believed to have destabilised the interwar monetary system. They also agreed on the need to establish a new international institution to provide financial assistance to states experiencing exchange imbalances and to enforce rules about currency values (what would become the International Monetary Fund), and another to provide capital for postwar reconstruction (the future World Bank). A closely managed and regulated international financial system would replace the unco-ordinated and competitive system of the interwar years. And with currencies stabilised – so they hoped – world trade could be resumed. (...)

One of the most innovative aspects of the Anglo-American deal was the fact that it prioritised the need for full employment and social insurance policies at the national level over thoroughgoing international economic integration. To this extent, it was more Keynesian than not – and it represented a dramatic departure from older assumptions about the way the world’s financial system should function. Under the gold standard, which had facilitated a period of financial and commercial globalisation in the late 19th and early 20th centuries, governments had possessed few means of responding to an economic downturn beyond cutting spending and raising interest rates in the hope that prices and wages would drop so low that the economy would right itself. Populations simply had to ride out periods of deflation and mass unemployment, as the state couldn’t do much to help them: pursuing expansionary fiscal or monetary measures (what states tend to do today) would jeopardise the convertibility of the state’s currency into gold. For these reasons, the gold standard was well suited to a 19th-century world in which there were few organised workers’ parties and labour unions, but not so well suited to a messy world of mass democracy. The Keynesian revolution in economic governance gave the state a set of powerful new tools for responding to domestic economic distress – but they wouldn’t work as long as the gold standard called the shots. (...)

American dominance over the system was guaranteed by another crucial fact: in 1944, the US dollar was the only currency available widely enough to facilitate international exchange under the new ‘gold exchange standard’. This was intended to be a modified version of the gold standard which, in practice, would allow states to adjust their currency values against the dollar as they saw fit (depending on whether they prioritised economic growth, for example, or controlling inflation), with the value of the dollar convertible into gold at a fixed rate of $35 an ounce. What this meant was that, after the end of the war, the US dollar would effectively become the world’s currency of reserve – which it remains to this day (although it’s no longer pegged to gold). This arrangement would give the US the privilege of being indebted to the world ‘free of charge’, as Charles de Gaulle later put it, but would work only as long as the US saw maintaining gold convertibility as working in its national interest. Harry Dexter White apparently hadn’t envisaged a scenario in which it wouldn’t, but this eventually happened in the 1970s, when deficits from financing the Vietnam War piled so high that the US began to face a run on its gold reserves. In 1971, Richard Nixon removed the dollar’s peg to gold – effectively bringing Bretton Woods to an end – rather than raising interest rates to staunch the outflow of gold, which would probably have caused a recession (with an election on the horizon). Before this, the track record of the gold exchange standard had been pretty good: the years of its operation had seen stable exchange rates, unprecedented global economic growth, the rebirth of world trade and relatively low unemployment. This period also saw the emergence of many different models of the welfare state – in Europe, the United States and Japan – just as the economists behind Bretton Woods had intended.
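
[ed. A minimal sketch, in Python, of how the "gold exchange standard" arithmetic worked. The $35-an-ounce dollar-gold parity is from the article; the foreign-currency peg and the devaluation below are hypothetical, chosen only to illustrate the mechanics.]

# Under Bretton Woods, member currencies were pegged to the dollar, and the
# dollar alone was convertible into gold at a fixed $35 per ounce.
GOLD_PRICE_USD_PER_OUNCE = 35.0

def implied_gold_parity(units_per_dollar):
    """Units of a pegged currency needed to buy one ounce of gold."""
    return units_per_dollar * GOLD_PRICE_USD_PER_OUNCE

print(implied_gold_parity(4.0))   # hypothetical peg of 4 units per dollar -> 140.0 units per ounce

# A devaluation against the dollar (say, to 5 units per dollar) raises that
# currency's implied gold price while the dollar-gold parity stays fixed.
print(implied_gold_parity(5.0))   # -> 175.0 units per ounce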

by Jamie Martin, LRB |  Read more:
Image: uncredited

Armando Barrios - Cantata (1985)
via: