Monday, November 18, 2013


Peter Lindbergh for Vogue Italia May 1999.
via:

Sunday, November 17, 2013

Boulevard of Broken Dreams

Six in the morning, Beverly Hills. The air is filled with the aroma of expensive lawns, warming in the pallid sun. Plastic-bound copies of the LA Times lie before wrought iron gates, watched by security cameras, a chatter of birds, a glimpse of pink sky. Stand quite still on the sidewalk here, and the neighbourhood draws into focus. Box hedges, orange trees, the scent of magnolia. The ineluctable neatness of here.

For several blocks, Sunset Boulevard is home to LA as we know it—millionaires and billionaires, Oscar-winners and entrepreneurs, supermodels and TV shrinks. And over its high fences you catch flickers of affluence: a floodlit basketball court, a sliver of turquoise swimming pool.

But stand a little longer, and you see things that do not fit so neatly. Close to where Sunset meets the curve of Foothill Road, a woman waits at a bus stop. She is nondescript—black coat, white trainers, scarf, short hair, Trader Joe's bag. She speaks softly, as if her voice might ruffle the grass.

Her name is Petra, and she is a 64-year-old live-in housekeeper. She talks of how she moved to Los Angeles from Peru over two decades ago, and of the longing she still feels for home. Today is Sunday, her day off, so she is going to the Catholic church, two bus rides away in Culver City. The Number 2 bus draws up, and she is swallowed by the soft hiss of the doors. As the bus slides by, the faces in the windows are all Hispanic or black, all weary.

The street resumes its steady composure. A red sports car hums towards the coast, and a woman in white walks in circles in the middle of Arden Drive.

This is a story of belonging and not belonging, of preposterous wealth and immense poverty; of how, in a city where people love to be seen, so many can slip through the cracks unnoticed.

It is also the story of a single street, Sunset Boulevard, a 22-mile vein that goes from the coast to the clutter of downtown, past Sunset Strip, the Church of Scientology and on through Silver Lake. And of how, if you should choose to walk that street, from sunrise to sunset, you will come to see a city unadorned and unmade, a city at odds with itself. (...)

It is still early as Sarah the photographer and I reach Sunset Strip; the streets below the high-rises lie smooth and quiet. We can still smell the early lilacs of Beverly Hills, hear the low call of wood pigeons as we pass City National Bank, billboards for Guess jeans and Jack Daniel's. Scratched on an electricity cupboard is a warning: YOUNG HOLLYWOOD WILL PAY.

At this hour, the Strip is largely populated by late-night stragglers and morning street-sweepers. The cleaners in their orange tabards work head-down, tidying all evidence of the evening’s revelry—broken glass swept from patios, beer bottles fished from eucalyptus hedges. A group of young women in short skirts, bare legs and leopard-print heels totter by in a cloud of boozy laughter. In a bus shelter sits a young man wearing shorts, a Chanel earring and elaborate sunglasses, ready to make his weekly journey home from an electronica club. His name is Jake. "I live far," he says sleepily. "It’s in LA county, but it's far." When a woman jogs past, he looks faintly baffled by this strange collision of night and day.

Past the Viper Room, where River Phoenix died 20 years ago, and the clairvoyant and the tattoo parlour, and the window of the Hustler store, with its gimp masks and its stripper shoes and the huge sign that reads: "The Screaming O—Have One Tonight". Past the car-rental store where you can lease a Bentley, the better to impress your date or your business associates. Past the gaggle of Nickelback fans camped outside a plush hotel, hoping to catch a glimpse of the band. And on to Book Soup, which has occupied this spot for nearly 40 years. Nicholas, a 63-year-old beautician, is flipping through Paris Match. "I love this place," he says. "It's the only civilised place on the Strip. I first started coming here way back in the early Eighties, when I had a little nook up there, a salon, and the choice was either to come here or get drunk in the bars."

He loves the smell of books, and he likes to buy the European magazines. "It gives me a different perspective," he explains. "There's more truth, more reality than flash. At my age I can't deal with fluff, I need something more in my brain. My daughter says to me 'Dad, what are you doing here? This is La-La Land!'"

by Laura Barton, Intelligent Life |  Read more:
Image: Sarah Lee

Saturday, November 16, 2013


Alex Colville, Refrigerator, 1977
via:

Who Is Conservation For?


Once, Gretchen Daily only had eyes for the rain forest.

Eighteen years ago, as a young scientist on the rise, Daily arrived at a renowned research station in the hills of Costa Rica armed with nearly 100 shellacked plywood platforms. As a student at Stanford University, studying under the famed biologist Paul Ehrlich, she had seen how large birds, defying expectations, seemed to thrive on small bits of forest spackled in the area's coffee plantations, when theory predicted their demise. On her return, she planned to spread her feeding platforms in staggered densities to test that observation; local kids promised to monitor the mesitas.

But when the morning came, so did the bees.

Africanized honeybees had swarmed the mesitas. The locals, always supportive of research on their lands, were peeved; every year these killer bees claim a few lives in Costa Rica. No one died, but the experiment was an utter, fast failure. "It was an 'aha!' moment," Daily said later, "but it was, 'Aha, what an idiot I've been.'" She was at a loss. She already had a spot at the station. She couldn't just leave, nor could she learn how to study a different creature before her stint was over. She knew birds, of course, but was never great at sorting species by their song, which ruled out work in the cacophonous forest. On the farms, though, she realized, she could use her eyes and master a smaller list of warbles, tying the birds' incidence to cultivation methods and the forest's verge. It was pure survey work, but it hadn't been done. And so it was that Daily looked outside the forest.

"Because of that chance of bad luck," she said. "I went out and opened my eyes and finally awakened to all the biodiversity in the countryside."

What she saw helped change the future of environmental science.

Daily crept among the arabica's cottony blooms, indexing hundreds of species thriving in what she had expected to be a dull monoculture. There were fiery-billed aracari, rufous-breasted wrens, even violaceous trogons, their golden bellies burning bright. Few of these birds—and, in later surveys, insects, frogs, bats, or other mammals—could be considered pests. There was a weave at work among the plantation, the forest, and the animals strung between them. The natural world had never left this man-made system; it was, in many ways, benefiting it, pollinating crops and chomping up berry borers.

In turn, the farmers were dependent on this natural capital, as Daily would call it, for their own economic well-being. Ehrlich had mentioned the benefits that humanity derived from nature. But why had she stayed so focused on the forest? Daily wondered. Because it seemed pristine, untouched? That was a lie; global warming was well under way. Humanity's shadow cloaked the planet, and all of its shades deserved study. "Any sensible conservation science should look at this," she thought.

Her own field of conservation biology—then a hot young science dedicated to saving endangered species and a dominant voice in environmental science—did not. And so Daily, now a professor at Stanford, along with a host of collaborators, set out to change the science.

Though Daily would never say this, her quest in many ways reflects the failure of a past generation. For decades, scientists have warned that the world is showing signs of deep environmental strain, close to suffering a great wave of human-caused species extinctions. Yet despite these calls of alarm, victories for conservation have been few and dear, and development has continued apace. Farming has grown to cover a quarter of the world's land. Fisheries and fresh water are ever closer to exhaustion. In the United States, wetlands are disappearing, and contaminants are often found in inland fish at harmful levels. Up to a third of the country's native species are at risk of extinction. In 2010 the world failed nearly every target the United Nations had set for halting biodiversity loss. And on top of all that, the planet is warming at a rate unprecedented in modern times, thanks to fossil-fuel emissions. As one scientist told me, given the rising temperatures, the Joshua trees are leaving Joshua Tree National Park.

Humanity's great influence across the planet has even prompted many scientists to argue that we have left the Holocene and entered a new geological epoch, dubbed the Anthropocene. Many of the large nonprofit conservation groups, like the Nature Conservancy and the World Wildlife Fund, prompted as much by the need for new donors as by scientific imperative, have embraced the concept, emphasizing pragmatic work that protects people and the natural world. It's strange to say, but climate change came with a silver lining, says Jonathan Hoekstra, director of WWF's conservation-science program and one of Daily's collaborators.

"We were a field that always looked backwards in terms of trying to frame where we wanted to go," Hoekstra says. "It was like walking backwards through life. It was crazy when you think about it. Climate change has forced us to say, man, the world is changing. It's changing in ways that are unprecedented relative to our historic benchmarks. We need to be open to the possibility that the best possible future is going to be different, in possibly profound ways, from the past."

The rhetorical shift to a human-centered conservation has been quick, if not always easy—angry debate and ethical qualms are hallmarks of the change. But it has also called for a new kind of science, one that finds a way to understand humans, animals, and the environment at once; a science built to knit together the forest and crop rows of the Costa Rican coffee plantation. It's a science Daily has helped construct for the past two decades, combining economics and applied ecology to describe the benefits that humans gain from the natural world—drinking water, pollination, recreation. And at the base of it all is one snooze-inducing term: ecosystem services.

You can call it the jargon that ate conservation. The study of ecosystem services has exploded in recent years, passing from fad to the defining frame of conservation science. In 1995, ecosystem services were mentioned just seven times in the scholarly literature; by 2012, they came up 4,116 times. Biodiversity, once the north star of conservation, has become one light in a constellation. Even the famed Harvard biologist E.O. Wilson, a sentinel against capitulation in conservation, can now be heard singing the praises of nature's services.

But the rise of this human-centered science has not come without pain, or loss. A cohort of leaders who only 30 years ago created another radical science—conservation biology—is increasingly marginalized. The vigor of activism has waned. And much uncertainty remains about whether ecosystem services, as it steps into the real world, will serve as a conciliatory vision to save species and the world or will simply be ignored, its models spinning away unnoticed by the powers that be. Perhaps worse, it could be taken as an apologia for climate change, absolving humanity of its collective environmental toll.

Few are more responsible for popularizing ecosystem services than Daily, yet these are fears she shares. Which is in part why, in 2005, she and several influential peers began the Natural Capital Project to apply their nascent science in the real world. It's taken time, more time than they first imagined, but in the past couple of years, the project's efforts have begun to flower, Daily says.

"I'm hoping conservation will have legitimacy and relevance like it's never had in the past," she says. "And thereby have impact and success like it hasn't really had in the past. Not on the scale that's required."

by Paul Voosen, Chronicle of Higher Education | Read more:
Image: Nick Norman, National Geographic, Aurora Photos

Sharecropping in the Cloud


Members of the contemporary tech industry speak of cloud computing with such awe and reverence that one might think that they were referring to the Kingdom of Heaven. “The cloud is for everyone. The cloud is a democracy,” declared Marc Benioff, CEO of Salesforce.com, a major business software company, in 2010.

Today, more and more companies are shifting their products and services to the cloud, most recently including Adobe with the successor to its Creative Suite of graphic design and editing software. Tech websites fill daily with articles arguing for businesses and individuals to transfer their data to the cloud. As Steve Jobs once commented, “I don’t need a hard disk in my computer if I can get to the server faster… carrying around these non-connected computers is byzantine by comparison.” Few in the industry would argue against the convenience and opportunities provided by the technology.

This consensus, however, is not without its discontents. Instead of functioning as a digital democracy, the net activist Jaron Lanier sees the cloud as more of a feudal kingdom. In his 2010 book, You Are Not a Gadget, Lanier illustrated the stratification of the digital world into “Peasants and Lords of the Clouds”: the lords own the digital architecture and are rewarded handsomely for it, while the creative class forms the peasantry, reduced to providing content for free and hoping for patronage.

To extend Lanier’s metaphor further, one might compare the emerging predominance of the cloud with the economic transition from feudalism to capitalism. As with their historical counterparts in the countryside during the emergence of capitalism, economic transition and technological improvements are transforming digital peasants into sharecroppers who must pay periodic fees under the lord’s terms for the privilege of utilizing software or viewing content. Historically, as today, elites used legal mechanisms combined with paeans to rights and efficiency to justify their new systems of rents and control at the expense of ordinary people. (...)

In this shift to the cloud, consumers of media are being transformed from effective owners, still legally subject to licensing restrictions but in physical possession of media, to renters, held captive by the whims of corporate rentiers backed by a tightening intellectual property regime. As Peter Frase has argued, this emphasis on intellectual property and rents has been and will remain a defining feature of contemporary capitalism.

by Harry C. Merritt, Jacobin | Read more:
Image: Florian Herzinger / Wikimedia

Jackson Browne


Elizabeth Couloigner, Other Places 31 
via:

Scientists Discover World's Oldest Clam, Killing It in the Process

A team of researchers has reported that Ming the Mollusk, the oldest clam ever found, is in fact 507 years old, 102 years older than the previous estimate of its age. But that is as old as Ming will ever get.

Ming, an ocean quahog clam, was pulled up from 262-foot-deep waters off the coast of Iceland in 2006. Scientists from Bangor University, in the United Kingdom, who were studying the long-lived clams as palimpsests of climate change, analyzed the lines on its shell to estimate its age, much as alternating bands of light and dark in a fish's ear-bones are used to tell how old the animal is. This clam was 405 years old, the team said. It was called Ming, after the 1368-to-1644 Chinese dynasty during which it was born.

But a new analysis of the clam has put the hoary mollusk at 507 years old, which means that it was born in 1499. This is the same year that the English hanged a Flemish man, Perkin Warbeck, for (doing a bad job of) pretending to be the lost son of King Edward IV and the heir to the English throne. It's also the same year that Switzerland became its own state, the French King Louis XII got married, and Diane de Poitiers, future mistress to another French king, Henry II, was born.

When it was first found in 2006, Ming, celebrated as a disinterested non-observer to centuries of world upheavals, a hermetic parable of the benefits of not interacting at all with humans, with whom the clam is unlucky enough to share the planet, was called the world's oldest animal. But, after some quibbling about whether that distinction should go to some venerable corals, Ming's title was downgraded to "world's oldest non-colonial animal," because clams don't grow in colonies as corals do. The Guinness Book of World Records simplifies the grandness of it all and just calls Ming the world's oldest mollusk.

But this is a record that other clams are well placed to beat. That's because Ming is not getting any older. To study Ming's senescent insides in 2006, the researchers had to pop the clam open. Ming died. Its Wikipedia page reads in the past tense.

by Elizabeth Barber, CSM |  Read more:
Image: geobeats/YouTube

Friday, November 15, 2013


Dan Eldon
via:

The Battle of Bretton Woods

At the end of the Second World War, many thought that a lasting peace would be possible only if we learned to manage the world economy. The fact that the worst war in history had followed shortly on the heels of the worst economic crisis seemed to confirm that international political crisis and economic instability went hand in hand. In the 1940s, this was a relatively new way of thinking about interstate relations. Negotiations for the peace settlement after the First World War had largely skirted economic questions in favour of political and legal ones – settling territorial borders, for example, or the rights of national minorities. When Keynes criticised the peacemakers in 1919 for ignoring Europe’s economic troubles, and for thinking of money only in terms of booty for the victors, he was ahead of his time: ‘It is an extraordinary fact that the fundamental economic problems of a Europe starving and disintegrating before their eyes, was the one question in which it was impossible to arouse the interest of the Four,’ Keynes wrote in The Economic Consequences of the Peace, referring to the quartet of national leaders who shaped the Treaty of Versailles. Their indifference wasn’t much of a surprise: national leaders at the time had little direct experience in managing economic affairs beyond their own borders. The worldwide commercial system that had sprung up in the decades before the war had been facilitated largely through the efforts of private business and finance; the gold standard set the rules of exchange, but states mostly stayed out of the way, except when lowering trade barriers or enforcing contracts. When things went badly, they didn’t try to intervene. (...)

When the Anglo-American conversation shifted away from trade and towards the seemingly technical issues of currency and finance, progress towards a deal proceeded more smoothly. In August 1941, Keynes, now adviser to the chancellor and leading postwar economic planning, returned from negotiations over Lend-Lease in Washington to draft plans for a new international monetary regime. Over the course of several meetings from the summer of 1942, Keynes and his American counterpart, the economist and US Treasury official Harry Dexter White, traded blows over how to rewrite the monetary rules of the international economy. They made curious sparring partners: Keynes, the world-famous economist and public intellectual, pitted against White, an obscure technocrat and late-blooming academic born to working-class Jewish immigrants from Lithuania and plucked by the US Treasury from his post at a small Wisconsin university. Neither seemed to enjoy the company of the other: Keynes was disdainful of what he saw as the inferior intellect and gruff manners of the ‘aesthetically oppressive’ White, whose ‘harsh rasping voice’ proved a particular annoyance. Keynes, meanwhile, was the archetype of the haughty English lord; as White remarked to the British economist Lionel Robbins, ‘your Baron Keynes sure pees perfume.’

Squabbles aside, the two men ended up largely in agreement about the basic aims of the new international monetary system: to stabilise exchange rates; facilitate international financial co-operation; prohibit competitive currency depreciations and arbitrary alterations of currency values; and restrict the international flow of capital to prevent the short-term, speculative investments widely believed to have destabilised the interwar monetary system. They also agreed on the need to establish a new international institution to provide financial assistance to states experiencing exchange imbalances and to enforce rules about currency values (what would become the International Monetary Fund), and another to provide capital for postwar reconstruction (the future World Bank). A closely managed and regulated international financial system would replace the unco-ordinated and competitive system of the interwar years. And with currencies stabilised – so they hoped – world trade could be resumed. (...)

One of the most innovative aspects of the Anglo-American deal was the fact that it prioritised the need for full employment and social insurance policies at the national level over thoroughgoing international economic integration. To this extent, it was more Keynesian than not – and it represented a dramatic departure from older assumptions about the way the world’s financial system should function. Under the gold standard, which had facilitated a period of financial and commercial globalisation in the late 19th and early 20th centuries, governments had possessed few means of responding to an economic downturn beyond cutting spending and raising interest rates in the hope that prices and wages would drop so low that the economy would right itself. Populations simply had to ride out periods of deflation and mass unemployment, as the state couldn’t do much to help them: pursuing expansionary fiscal or monetary measures (what states tend to do today) would jeopardise the convertibility of the state’s currency into gold. For these reasons, the gold standard was well suited to a 19th-century world in which there were few organised workers’ parties and labour unions, but not so well suited to a messy world of mass democracy. The Keynesian revolution in economic governance gave the state a set of powerful new tools for responding to domestic economic distress – but they wouldn’t work as long as the gold standard called the shots. (...)

American dominance over the system was guaranteed by another crucial fact: in 1944, the US dollar was the only currency available widely enough to facilitate international exchange under the new ‘gold exchange standard’. This was intended to be a modified version of the gold standard which, in practice, would allow states to adjust their currency values against the dollar as they saw fit (depending on whether they prioritised economic growth, for example, or controlling inflation), with the value of the dollar convertible into gold at a fixed rate of $35 an ounce. What this meant was that, after the end of the war, the US dollar would effectively become the world’s currency of reserve – which it remains to this day (although it’s no longer pegged to gold). This arrangement would give the US the privilege of being indebted to the world ‘free of charge’, as Charles de Gaulle later put it, but would work only as long as the US saw maintaining gold convertibility as working in its national interest. Harry Dexter White apparently hadn’t envisaged a scenario in which it wouldn’t, but this eventually happened in the 1970s, when deficits from financing the Vietnam War piled so high that the US began to face a run on its gold reserves. In 1971, Richard Nixon removed the dollar’s peg to gold – effectively bringing Bretton Woods to an end – rather than raising interest rates to staunch the outflow of gold, which would probably have caused a recession (with an election on the horizon). Before this, the track record of the gold exchange standard had been pretty good: the years of its operation had seen stable exchange rates, unprecedented global economic growth, the rebirth of world trade and relatively low unemployment. This period also saw the emergence of many different models of the welfare state – in Europe, the United States and Japan – just as the economists behind Bretton Woods had intended.

by Jamie Martin, LRB |  Read more:
Image: uncredited

Armando Barrios - Cantata (1985)
via:

How I Found Out I Didn't Have Herpes

Six months ago, I sat waiting in my gynecologist’s exam room chair, fully clothed and wishing I were anywhere else. At that particular moment, I’d even have preferred being naked and spread-eagled on the paper-lined bed. It’s not true what they say about the stirrups being the worst part of the ladyparts exam room: it’s the chair. Once you’re clothed and in the chair, it means you’re there to talk.

You never forget your first time debriefing with your gynecologist. Mine was four years ago, at age 22, when I sat crumpled in a chair just like this. A few days before, I’d had a rough romp of casual oral sex, a one-night head-stand. Minutes after the guy went down on me, I felt that something wasn’t right with my vagina, and two days later, I broke out in sores. “You poor thing,” the nurse practitioner at my college’s health center told me. “You have herpes.”

“Don’t I need to be tested?” I choked out between sobs. She’d cocked her head and tossed me a pity smile, as if to say, don’t you think I’ve seen enough herpes to know what it looks like?

My sores couldn’t be anything else, she told me. It didn’t matter if it was HSV-1 or HSV-2, because once it presents genitally, herpes is herpes. And it’s mine for life.

I never got another outbreak, but at 22, I still entered the dating world feeling like damaged goods. I was young, healthy, attractive, and grateful to anyone who agreed to fuck me after I told him I had herpes. (Only at first. As I wrote on this site a year and a half ago, herpes eventually helped me become a better dater and gravitate toward decent men.) But the conversation—the “before we do this, I have to tell you something” routine—never got easy. And the diagnosis inevitably warped the way I thought about myself. I no longer felt like a free agent in the world of love and sex; instead, I assumed I’d have to settle a notch or two down from the man who could have loved a herpes-free me. I may never have had another sore, but I still felt marked.

Four years after being diagnosed, I was at the gyno for my annual pap smear when I decided to order the sex-haver’s special: tests for HIV, gonorrhea, chlamydia and syphilis. I also figured it was time to meet my herpes, so I requested an off-menu HSV blood test that isn’t considered part of the routine STD-screening panel. “If you don’t hear from us by Wednesday, everything’s normal,” the doc told me.

And then the “we found something” call never came. That wasn’t my normal.

I called the lab to see what had happened to my test. “Oh yeah, here you are,” the lab tech told me as she pulled up my record. “You’re negative for everything.”

What.

“No,” I told the tech. “Check again. I definitely have herpes.”

“I don’t know who told you that, but you don’t,” she said. Both of my blood tests for HSV-1 and HSV-2 were negative. (...)

Follow me down the herpetic rabbit hole, which is muddied first by stigma and second by the fact that, biologically, the herpes infection is rather complicated. The other expert I spoke with, H. Hunter Handsfield, MD, is Professor Emeritus of Medicine at the University of Washington Center for AIDS and STD. It’s one of the hardest STDs to teach to medical students, he said, and he dedicates more time lecturing about it than almost any other infection. “It is complex for a lot of doctors out there,” he said. “A lot of practitioners don’t have the level of nuance.”

by The Hairpin |  Read more:
Image: senoranderson/flickr

Locked in the Cabinet


Sixteen years ago, President Bill Clinton’s secretary of labor, Robert Reich, summed up the frustrations of adjusting to life in the Cabinet, where even a close personal relationship with the president, dating to their Oxford days, didn’t spare him from being bossed around by arrogant West Wing nobodies. “From the view of the White House staff, cabinet officials are provincial governors presiding over alien, primitive territories,” Reich wrote in a classic of the pissed-off-secretary genre, Locked in the Cabinet. “Anything of any importance occurs in the national palace.”

Two presidents later, the Cabinet is a swarm of 23 people that includes 15 secretaries and eight other Cabinet-rank officers. And yet never has the job of Cabinet secretary seemed smaller. The staffers who rule Obama’s West Wing often treat his Cabinet as a nuisance: At the top of the pecking order are the celebrity power players, like former Secretary of State Hillary Clinton, to be warily managed; at the bottom, what they see as a bunch of well-intentioned political naifs only a lip-slip away from derailing the president’s agenda. Energy Secretary Steven Chu might have been the first Obama Cabinet secretary to earn the disdain of White House aides, but he was hardly the last.

“We are completely marginalized … until the shit hits the fan,” says one former Cabinet deputy secretary, summing up the view of many officials I interviewed. “If your question is: Did the president rely a lot on his Cabinet as a group of advisers? No, he didn’t,” says former Obama Transportation Secretary Ray LaHood.

Little wonder, then, that Obama has called the group together only rarely, for what by most accounts are not much more than ritualistic team-building exercises: According to CBS News White House reporter Mark Knoller, the Cabinet met 19 times in Obama’s first term and four times in the first 10 months of his second term. That’s once every three months or so—about as long as you can drive around before you’re supposed to change your oil.

For any modern president, the advantages of hoarding power in the White House at the expense of the Cabinet are obvious—from more efficient internal communication and better control of external messaging to avoiding messy confirmation battles and protecting against pesky congressional subpoenas. But over the course of his five years in office, Obama has taken this White House tendency to an extreme, according to more than 50 interviews with current and former secretaries, White House staffers and executive branch officials, who described his Cabinet as a restless nest of ambition, fits-and-starts achievement and power-jockeying under a shadow of unfulfilled promise.

That’s a far cry from the vision Obama sketched out in the months leading up to his 2008 election. Back then, he waxed expansive about the Cabinet, promising to rejuvenate the institution as a venue for serious innovation and genuine decision making. “I don’t want to have people who just agree with me,” he told Time magazine, after reading Doris Kearns Goodwin’s classic account of President Abraham Lincoln and his advisers, Team of Rivals. “I want people who are continually pushing me out of my comfort zone.”

Obama, many of his associates now concede, never really intended to be pushed out of his comfort zone. While he personally recruited stars such as Clinton, Treasury Secretary Timothy Geithner and Defense Secretary Robert Gates, most other picks for his first Cabinet were made by his staff, with less involvement from the president. “[Bill] Clinton spent almost all of his time picking the Cabinet at the expense of the White House staff; Obama made the opposite mistake,” says a person close to both presidents.

by Glenn Thrush, Politico |  Read more:
Image: Composite/Politico

Thursday, November 14, 2013


Leidy Churchman, Chuck 2010
via:

[ed. This kind of reminds me of my old house, except it was log constructed and had a big deck all the way around.]
via:

Tuomas Kivinen, Alley in Nishi-Nakasu. Fukuoka City, Japan.

Why We Are Allowed to Hate Silicon Valley

If Ronald Reagan was the first Teflon President, then Silicon Valley is the first Teflon Industry: no matter how much dirt one throws at it, nothing seems to stick. While “Big Pharma,” “Big Food” and “Big Oil” are derogatory terms used to describe the greediness that reigns supreme in those industries, this is not the case with “Big Data.” This innocent term is never used to refer to the shared agendas of technology companies. What shared agendas? Aren’t these guys simply improving the world, one line of code at a time?

Something odd is going on here. While we understand that the interests of pharmaceutical, food and oil companies naturally diverge from our own, we rarely approach Silicon Valley with the requisite suspicion. Instead, we continue to treat data as if it were a special, magical commodity that could single-handedly defend itself against any evil genius who dares to exploit it.

Earlier this year, a tiny scratch appeared on the rhetorical Teflon of Silicon Valley. The Snowden affair helped – but so did other events. The world seems to have finally realized that “disruption” – the favorite word of the digital elites –describes a rather ugly, painful phenomenon. Thus, university professors are finally complaining about the “disruption” brought on by the massive open online courses (MOOCs); taxi drivers are finally fighting services like Uber; residents of San Francisco are finally bemoaning the “disruption” of monthly rents in a city that has suddenly been invaded by millionaires. And then, of course, there are the crazy, despicable ideas coming from Silicon Valley itself: the latest proposal, floated by one tech executive at a recent conference, is that Silicon Valley should secede from the country and “build an opt-in society, ultimately outside the United States, run by technology.” Let’s share his pain: A country that needs a congressional hearing to fix a web-site is a disgrace to Silicon Valley.

This bubbling discontent is reassuring. It might even help bury some of the myths spun by Silicon Valley. Wouldn’t it be nice if one day, told that Google’s mission is to “organize the world’s information and make it universally accessible and useful,” we would finally read between the lines and discover its true meaning: “to monetize all of the world’s information and make it universally inaccessible and profitable”? With this act of subversive interpretation, we might eventually hit upon the greatest emancipatory insight of all: Letting Google organize all of the world’s information makes as much sense as letting Halliburton organize all of the world’s oil.

But any jubilation is premature: Silicon Valley still holds a firm grip on the mechanics of the public debate. As long as our critique remains tied to the plane of technology and information– a plane that is often described by that dreadful, meaningless, overused word “digital” – Silicon Valley will continue to be seen as an exceptional and unique industry. When food activists go after Big Food and accuse those companies of adding too much salt and fat to their snacks to make us crave even more of them, no one dares accuse these activists of being anti-science. Yet, a critique of Facebook or Twitter along similar lines – for example, that they have designed their services to play up our anxieties and force us to perpetually click the “refresh” button to get the latest update – almost immediately brings accusations of technophobia and Luddism.

The reason why the digital debate feels so empty and toothless is simple: framed as a debate over “the digital” rather than “the political” and “the economic,” it’s conducted on terms that are already beneficial to technology companies. Unbeknownst to most of us, the seemingly exceptional nature of the commodities in question – from “information” to “networks” to “the Internet” – is coded into our language. It’s this hidden exceptionalism that allows Silicon Valley to dismiss its critics as Luddites who, by opposing “technology,” “information” or “the Internet” – they don’t do plurals in Silicon Valley, for the nuance risks overwhelming their brains – must also be opposed to “progress.”

How do you spot “the digital debate”? Look for arguments that appeal to the essences of things – of technology, information, knowledge and, of course, the Internet itself. Thus, whenever you hear someone say “this law is bad because it will break the Internet” or “this new gadget is good because that’s what technology wants,” you know that you have left the realm of the political – where arguments are usually framed around the common good – and have entered the realm of bad metaphysics. In that realm, what you are being asked to defend is the well-being of phantom digital gods that function as convenient stand-ins for corporate interests. Why does anything that might “break the Internet” also risk breaking Google? This can’t be a coincidence, can it?

Perhaps we should ditch the technology/progress dialectic altogether. “Is it O.K. to be a Luddite?” ran the title of a fabulous 1984 essay by Thomas Pynchon – a question that he answered, by and large, in the affirmative. This question feels outdated today. “Is it okay not to be a Luddite but still hate Silicon Valley?” is a much better question, for the real enemy is not technology but the present political and economic regime – a wild combination of the military-industrial complex and the out-of-control banking and advertising industries – that deploys the latest technologies to achieve its ugly (even if lucrative and occasionally pleasant) ends. Silicon Valley represents the most visible, the most discussed, and the most naive part of this assemblage. In short, it’s okay to hate Silicon Valley – we just need to do it for the right reasons. Below are three of them – but this is hardly an exhaustive list.

by Evgeny Morozov, Frankfurter Allgemeine | Read more:
Image: via Telegraph UK