Sunday, August 31, 2014


Julia Kater
via:

Packrafts in the Parks

The morning sun was just breaking over Bonanza Ridge when a group of wilderness racers gathered last summer in the decommissioned copper mining town of Kennicott in Alaska’s Wrangell Mountains. The iconic red-and-white industrial buildings from nearly 100 years ago, now a national historic monument and undergoing restoration by the National Park Service, glowed in the light of the new day as Monte Montepare, co-owner of Kennicott Wilderness Guides, faced the crowd. Everyone had gathered for a one-of-a-kind race, a dash on foot up Bonanza, then down the backside to the upper reaches of McCarthy Creek.

McCarthy Creek parallels the spine of Bonanza Ridge for several miles until it curves like a fishhook around the base of Sourdough Peak.

Upon reaching the creek, the racers would dump their packs, inflate the boats they carried with them, then shoot down 10 miles of rapids to Kennicott’s sister town of McCarthy. The finish line, not coincidentally, was right in front of the town’s only bar.

The boat carried by each of these racers is the 21st century incarnation of a design concept that’s been around for a couple decades now. Packrafts are lightweight (about 5 pounds), compact, and easily stuffed into a backpack. Essentially, they’re super-tough one-person rubber rafts, the diminutive cousins of the 16- and 20-footers used for more mainstream river trips. Small size is the secret of their advantage: A packraft gives the wilderness traveler the sort of amphibious capability that humans have longed for since the earliest days of our species. A hundred years ago, your only real option was to build a raft or hope you could find a canoe cached on your side of the river. Now, with a packraft, the backcountry trekker can go virtually anywhere, including a fast boogie up and over a mountain, then downstream through some substantial whitewater in time for beer-thirty.

Montepare welcomed all the racers and laid out the rules in front of the Kennicott Wilderness Guides main office; then head ranger Stephens Harper got up and delivered a safety and environmental briefing every bit as mandatory as the helmet and drysuit each racer was required to have. The first annual McCarthy Creek Packraft Race, which started as a way for Kennicott Wilderness to promote its guiding business and have some fun, had grown into something more than just a bunch of whitewater bums looking for a thrill.

Pretty much by accident, the company and its owners, as well as the racers, found themselves front and center in a rancorous debate over land use, backcountry permitting and public lands policy taking place thousands of miles from the Wrangell Mountains. The jaundiced eyes of nonprofit conservation groups were watching.

Grand Canyon episode

Back in 2011, an erstwhile river warrior hiked down into the Grand Canyon with a packraft, blew it up, and shoved off into Hance, one of the longer and more difficult rapids. Within seconds he’d dumped his boat and was being sucked down into the gorge below while his girlfriend stood helplessly on the bank.

He made it out by the skin of his teeth, with the whole thing on tape, thanks to a GoPro cam. And, of course, what good is a near-drowning experience if you haven’t posted the video on YouTube? It didn’t take long for the National Park Service staff at the Grand Canyon to see it and decide, based on this one incident, that packrafters were a menace both to themselves and to public lands. The video was pulled after a few days but the damage was done.

This tale might seem familiar to readers here in Alaska, given the recent tragic death of Rob Kehrer while packrafting in Wrangell-St. Elias as part of the Alaska Wilderness Classic race. Earlier this month he launched his packraft into the treacherous Tana River and disappeared behind a wall of whitewater. His body was found on a gravel bar downstream.

One can understand how NPS managers might take a dim view of packrafters in their parks, given events such as these. But as with most thorny management issues, there is a lot more to the story than those few incidents that make the headlines.

by Kris Farmen, Alaska Dispatch |  Read more:
Image: Luc Mehl via:

What Your 1st-Grade Life Says About the Rest of It

In the beginning, when they knew just where to find everyone, they pulled the children out of their classrooms.

They sat in any quiet corner of the schools they could claim: the sociologists from Johns Hopkins and, one at a time, the excitable first-graders. Monica Jaundoo, whose parents never made it past the eighth grade. Danté Washington, a boy with a temper and a dad who drank too much. Ed Klein, who came from a poor white part of town where his mother sold cocaine.

They talked with the sociologists about teachers and report cards, about growing up to become rock stars or police officers. For many of the children, this was something that seldom happened in their raucous classrooms or overwhelmed homes: a quiet, one-on-one conversation with an adult eager to hear just about them. “I have this special friend,” Jaundoo thought as a 6-year-old, “who’s only talking to me.”

Later, as the children grew and dispersed, some falling out of the school system and others leaving the city behind, the conversations took place in McDonald’s, in public libraries, in living rooms or lock-ups. The children — 790 of them, representative of the Baltimore public school system’s first-grade class in 1982 — grew harder to track as the patterns among them became clearer.

Over time, their lives were constrained — or cushioned — by the circumstances they were born into, by the employment and education prospects of their parents, by the addictions or job contacts that would become their economic inheritance. Johns Hopkins researchers Karl Alexander and Doris Entwisle watched as less than half of the group graduated high school on time. Before they turned 18, 40 percent of the black girls from low-income homes had given birth to their own babies. At the time of the final interviews, when the children were now adults of 28, more than 10 percent of the black men in the study were incarcerated. Twenty-six of the children, among those they could find at last count, were no longer living.

A mere 4 percent of the first-graders Alexander and Entwisle had classified as the “urban disadvantaged” had by the end of the study completed the college degree that’s become more valuable than ever in the modern economy. A related reality: Just 33 of 314 had left the low-income socioeconomic status of their parents for the middle class by age 28.

Today, the “kids” — as Alexander still calls them — are 37 or 38. Alexander, now 68, retired from Johns Hopkins this summer just as the final, encompassing book from the 25-year study was published. Entwisle, then 89, died of lung cancer last November shortly after the final revisions on the book. Its sober title, “The Long Shadow,” names the thread running through all those numbers and conversations: The families and neighborhoods these children were born into cast a heavy influence over the rest of their lives, from how they fared in the first grade to what they became as grownups.

Some of them — children largely from the middle-class and blue-collar white families still in Baltimore’s public school system in 1982 — grew up to managerial jobs and marriages and their own stable homes. But where success occurred, it was often passed down, through family resources or networks simply out of reach of most of the disadvantaged.

Collectively, the study of their lives, and the outliers among them, tells an unusually detailed story — both empirical and intimate — of the forces that surround and steer children growing up in a post-industrial city like Baltimore.

“The kids they followed grew up in the worst era for big cities in the U.S. at any point in our history,” says Patrick Sharkey, a sociologist at New York University familiar with the research. Their childhood spanned the crack epidemic, the decline of urban industry, the waning national interest in inner cities and the war on poverty.

In that sense, this study is also about Baltimore itself — how it appeared to researchers and their subjects, to children and the adults they would later become.

by Emily Badger, Washington Post |  Read more:
Image: Linda Davidson/The Washington Post

The Dawn of the Post-Clinic Abortion

In June 2001, under a cloud-streaked sky, Rebecca Gomperts set out from the Dutch port of Scheveningen in a rented 110-foot ship bound for Ireland. Lashed to the deck was a shipping container, freshly painted light blue and stocked with packets of mifepristone (which used to be called RU-486) and misoprostol. The pills are given to women in the first trimester to induce a miscarriage. Medical abortion, as this procedure is called, had recently become available in the Netherlands. But use of misoprostol and mifepristone to end a pregnancy was illegal in Ireland, where abortion by any means remains against the law, with few exceptions.

Gomperts is a general-practice physician and activist. She first assisted with an abortion 20 years ago on a trip to Guinea, just before she finished medical school in Amsterdam. Three years later, Gomperts went to work as a ship’s doctor on a Greenpeace vessel. Landing in Mexico, she met a girl who was raising her younger siblings because her mother had died during a botched illegal abortion. When the ship traveled to Costa Rica and Panama, women told her about hardships they suffered because they didn’t have access to the procedure. “It was not part of my medical training to talk about illegal abortion and the public-health impact it has,” Gomperts told me this summer. “In those intense discussions with women, it really hit me.”

When she returned to the Netherlands, Gomperts decided she wanted to figure out how to help women like the ones she had met. She did some legal and medical research and concluded that in a Dutch-registered ship governed by Dutch law, she could sail into the harbor of a country where abortion is illegal, take women on board, bring them into international waters, give them the pills at sea and send them home to miscarry. Calling the effort Women on Waves, she chose Dublin as her first destination.

Ten women each gave Gomperts 10,000 Dutch guilders (about $5,500), part of the money needed to rent a boat and pay for a crew. But to comply with Dutch law, she also had to build a mobile abortion clinic. Tapping contacts she made a decade earlier, when she attended art school at night while studying medicine, she got in touch with Joep van Lieshout, a well-known Dutch artist, and persuaded him to design the clinic. They applied for funds from the national arts council and built it together inside the shipping container. When the transport ministry threatened to revoke the ship’s authorization because of the container on deck, van Lieshout faxed them a certificate decreeing the clinic a functional work of art, titled “a-portable.” The ship was allowed to sail, and van Lieshout later showed a mock-up of the clinic at the Venice Biennale.

As the boat sailed toward Dublin, Gomperts and her shipmates readied their store of pills and fielded calls from the press and emails from hundreds of Irish women seeking appointments. The onslaught of interest took them by surprise. So did a controversy that was starting to brew back home. Conservative politicians in the Netherlands denounced Gomperts for potentially breaking a law that required a special license for any doctor to provide an abortion after six and a half weeks of pregnancy. Gomperts had applied for it a few months earlier and received no reply. She set sail anyway, planning to perform abortions only up to six and a half weeks if the license did not come through.

When Gomperts’s ship docked in Dublin, she still didn’t have the license. Irish women’s groups were divided over what to do. Gomperts decided she couldn’t go ahead without their united support and told a group of reporters and protesters that she wouldn’t be able to give out a single pill. “This is just the first of many trips that we plan to make,” she said from the shore, wrapped in a blanket, a scene that is captured in “Vessel,” a documentary about her work that will be released this winter. Gomperts was accused of misleading women. A headline in The Telegraph in London read: “Abortion Boat Admits Dublin Voyage Was a Publicity Sham.”

Gomperts set sail again two years later, this time resolving to perform abortions only up to six and a half weeks. She went to Poland first and to Portugal in 2004. The Portuguese minister of defense sent two warships to stop the boat, then just 12 miles offshore, from entering national waters. No local boat could be found to ferry out the women who were waiting onshore. “In the beginning we were very pissed off, thinking the campaign was failing because the ship couldn’t get in,” one Portuguese activist says in “Vessel.” “But at a certain point, we realized that was the best thing that could ever happen. Because we had media coverage from everywhere.”

Without consulting her local allies, Gomperts changed strategy. She appeared on a Portuguese talk show, held up a pack of pills on-screen and explained exactly how women could induce an abortion at home — specifying the number of pills they needed to take, at intervals, and warning that they might feel pain. A Portuguese anti-abortion campaigner who was also on the show challenged the ship’s operation on legal grounds. “Excuse me,” Gomperts said. “I really think you should not talk about things that you don’t know anything about, O.K. . . . I know what I can do within the law.” Looking directly at him, she added, “Concerning pregnancy, you’re a man, you can walk away when your girlfriend is pregnant. I’m pregnant now, and I had an abortion when I was — a long time ago. And I’m very happy that I have the choice to continue my pregnancy how I want, and that I had the choice to end it when I needed it.” She pointed at the man. “You have never given birth, so you don’t know what it means to do that.”

Two and a half years later, Portugal legalized abortion. As word of Gomperts’s TV appearance spread, activists in other countries saw it as a breakthrough. Gomperts had communicated directly to women what was still, in many places, a well-kept secret: There were pills on the market with the power to end a pregnancy. Emails from women all over the world poured into Women on Waves, asking about the medication and how to get it. Gomperts wanted to help women “give themselves permission” to take the pills, as she puts it, with as little involvement by the government, or the medical profession, as possible. She realized that there was an easier way to do this than showing up in a port. She didn’t need a ship. She just needed the Internet.

by Emily Bazelon, NY Times |  Read more:
Image: Linda Nylind via:

Does It Help to Know History?

About a year ago, I wrote about some attempts to explain why anyone would, or ought to, study English in college. The point, I thought, was not that studying English gives anyone some practical advantage over non-English majors, but that it enables us to enter, as equals, into a long-existing, ongoing conversation. It isn’t productive in a tangible sense; it’s productive in a human sense. The action, whether rewarded or not, really is its own reward. The activity is the answer.

It might be worth asking similar questions about the value of studying, or at least, reading, history these days, since it is a subject that comes to mind many mornings on the op-ed page. Every writer, of every political flavor, has some neat historical analogy, or mini-lesson, with which to preface an argument for why we ought to bomb these guys or side with those guys against the guys we were bombing before. But the best argument for reading history is not that it will show us the right thing to do in one case or the other, but rather that it will show us why even doing the right thing rarely works out. The advantage of having a historical sense is not that it will lead you to some quarry of instructions, the way that Superman can regularly return to the Fortress of Solitude to get instructions from his dad, but that it will teach you that no such crystal cave exists. What history generally “teaches” is how hard it is for anyone to control it, including the people who think they’re making it.

Roger Cohen, for instance, wrote on Wednesday about all the mistakes that the United States is supposed to have made in the Middle East over the past decade, with the implicit notion that there are two histories: one recent, in which everything that the United States has done has been ill-timed and disastrous; and then some other, superior, alternate history, in which imperial Western powers sagaciously, indeed, surgically, intervened in the region, wisely picking the right sides and thoughtful leaders, promoting militants without aiding fanaticism, and generally aiding the cause of peace and prosperity. This never happened. As the Libyan intervention demonstrates, the best will in the world—and, seemingly, the best candidates for our support—can’t cure broken polities quickly. What “history” shows is that the same forces that led to the Mahdi’s rebellion in Sudan more than a century ago—rage at the presence of a colonial master; a mad turn towards an imaginary past as a means to equal the score—keep coming back and remain just as resistant to management, close up or at a distance, as they did before. ISIS is a horrible group doing horrible things, and there are many factors behind its rise. But they came to be a threat and a power less because of all we didn’t do than because of certain things we did do—foremost among them that massive, forward intervention, the Iraq War. (The historical question to which ISIS is the answer is: What could possibly be worse than Saddam Hussein?)

Another, domestic example of historical blindness is the current cult of the political hypersagacity of Lyndon B. Johnson. L.B.J. was indeed a ruthless political operator and, when he had big majorities, got big bills passed—the Civil Rights Act, for one. He also engineered, and masterfully bullied through Congress, the Vietnam War, a moral and strategic catastrophe that ripped the United States apart and, more important, visited a kind of hell on the Vietnamese. It also led American soldiers to commit war crimes, almost all left unpunished, of a kind that it still shrivels the heart to read about. Johnson did many good things, but to use him as a positive counterexample of leadership to Barack Obama or anyone else is marginally insane.

Johnson’s tragedy was critically tied to the cult of action, of being tough and not just sitting there and watching. But not doing things too disastrously is not some minimal achievement; it is a maximal achievement, rarely managed. Studying history doesn’t argue for nothing-ism, but it makes a very good case for minimalism: for doing the least violent thing possible that might help prevent more violence from happening.

The real sin that the absence of a historical sense encourages is presentism, in the sense of exaggerating our present problems out of all proportion to those that have previously existed. It lies in believing that things are much worse than they have ever been—and, thus, than they really are—or are uniquely threatening rather than familiarly difficult. Every episode becomes an epidemic, every image is turned into a permanent injury, and each crisis is a historical crisis in need of urgent, aggressive handling—even if all experience shows that aggressive handling of such situations has, in the past, quite often made things worse. (The history of medicine is that no matter how many interventions are badly made, the experts who intervene make more: the sixteenth-century doctors who bled and cupped their patients and watched them die just bled and cupped others more.) What history actually shows is that nothing works out as planned, and that everything has unintentional consequences. History doesn’t show that we should never go to war—sometimes there’s no better alternative. But it does show that the results are entirely uncontrollable, and that we are far more likely to be made by history than to make it. History is past, and singular, and the same year never comes round twice.

by Adam Gopnik, New Yorker |  Read more:
Image: Nathan Huang

Saturday, August 30, 2014


Joan Miró (Spanish, 1893-1983), The Conductor [Le Chef d’orchestre], 1976
via:

Predictive First: How A New Era Of Apps Will Change The Game

Over the past several decades, enterprise technology has consistently followed a trail that’s been blazed by top consumer tech brands. This has certainly been true of delivery models – first there were software CDs, then the cloud, and now all kinds of mobile apps. In tandem with this shift, the way we build applications has changed and we’re increasingly learning the benefits of taking a mobile-first approach to software development.

Case in point: Facebook, which of course began as a desktop app, struggled to keep up with emerging mobile-first experiences like Instagram and WhatsApp, and ended up acquiring them for billions of dollars to play catch up.

The Predictive-First Revolution

Recent events like the acquisition of RelateIQ by Salesforce demonstrate that we’re at the beginning of another shift toward a new age of predictive-first applications. The value of data science and predictive analytics has been proven again and again in the consumer landscape by products like Siri, Waze and Pandora.

Big consumer brands are going even deeper, investing in artificial intelligence (AI) models such as “deep learning.” Earlier this year, Google spent $400 million to snap up AI company DeepMind, and just a few weeks ago, Twitter bought another sophisticated machine-learning startup called MadBits. Even Microsoft is jumping on the bandwagon, with claims that its “Project Adam” network is faster than the leading AI system, Google Brain, and that its Cortana virtual personal assistant is smarter than Apple’s Siri.

The battle for the best data science is clearly underway. Expect even more data-intelligent applications to emerge beyond the ones you use every day, like Google web search. In fact, this shift is long overdue for enterprise software.

Predictive-first developers are well poised to overtake the incumbents because predictive apps enable people to work smarter and reduce their workloads even more dramatically than last decade’s basic data bookkeeping approaches to customer relationship management, enterprise resource planning and human resources systems.

Look at how Bluenose is using predictive analytics to help companies engage at-risk customers and identify drivers of churn, how Stripe’s payments solution is leveraging machine learning to detect fraud, or how Gild is mining big data to help companies identify the best talent.

These products are revolutionizing how companies operate by using machine learning and predictive modeling techniques to factor in thousands of signals about whatever problem a business is trying to solve, and feeding that insight directly into day-to-day decision workflows. But predictive technologies aren’t the kind of tools you can just add later. Developers can’t bolt predictive onto CRM, marketing automation, applicant tracking, or payroll platforms after the fact. You need to think predictive from day one to fully reap the benefits.
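
[ed. A rough illustration, for the technically curious, of what "feeding that insight directly into day-to-day decision workflows" can look like: a toy churn model trained on synthetic account signals with scikit-learn, whose scores route high-risk accounts to a retention queue. The signals, threshold, and workflow step are all hypothetical; this is a sketch of the general technique, not how Bluenose, Stripe, or Gild actually build their products.]

```python
# Hypothetical sketch: score accounts for churn risk and feed the
# high-risk ones into a follow-up workflow. All names and thresholds
# are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "signals" per account: logins last month, support tickets,
# days since last purchase. Churn is loosely driven by low engagement.
n = 1000
X = np.column_stack([
    rng.poisson(8, n),           # logins
    rng.poisson(2, n),           # support tickets
    rng.integers(0, 120, n),     # days since last purchase
])
churn_logit = 0.03 * X[:, 2] + 0.4 * X[:, 1] - 0.3 * X[:, 0]
y = rng.random(n) < 1 / (1 + np.exp(-churn_logit))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# The "decision workflow" step: rather than reporting an aggregate
# churn rate, route each at-risk account to a retention queue.
risk = model.predict_proba(X_test)[:, 1]
for account_id, score in enumerate(risk):
    if score > 0.7:
        print(f"account {account_id}: churn risk {score:.2f} -> retention queue")
```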

by Vik Singh, TechCrunch |  Read more:
Image: Holger Neimann

Friday, August 29, 2014

Seven Days and Nights in the World's Largest, and Rowdiest, Retirement Community

[ed. I've posted about The Villages before, but this is another good, ground-level perspective. Obviously, the alternative community's raison d'etre strikes a resonant chord with a number of people. Not me.]

Seventy miles northwest of Orlando International Airport, amid the sprawling, flat central Florida nothingness — past all of those billboards for Jesus and unborn fetuses and boiled peanuts and gator meat — springs up a town called Wildwood. Storefront churches. O’Shucks Oyster Bar. Family Dollar. Nordic Gun & Pawn. A community center with a playground overgrown by weeds. Vast swaths of tree-dotted pastureland. This area used to be the very center of Florida’s now fast-disappearing cattle industry. The houses are low-slung, pale stucco. One has a weight bench in the yard. There’s a rail yard crowded with static freight trains. The owners of a dingy single-wide proudly fly the stars and bars.

And then, suddenly, unexpectedly, Wildwood’s drabness explodes into green Southern splendor: majestic oaks bearing spindly fronds of Spanish moss that hang down almost to the ground. What was once rolling pasture land has been leveled with clay and sand. Acres of palmetto, hummock, and pine forest clear-cut and covered with vivid sod. All around me, old men drive golf carts styled to look like German luxury automobiles or that have tinted windows and enclosures to guard against the morning chill, along a wide, paved cart path. It’s a bizarre sensation, like happening upon a geriatric man’s vision of heaven itself. I have just entered The Villages.

This is one of the fastest-growing small cities in America, a place so intoxicating that weekend visitors frequently impulse-purchase $200,000 homes. The community real estate office sells about 250 houses every month. The grass is always a deep Pakistan green. The sunrises and sunsets are so intensely pink and orange and red they look computer-enhanced. The water in the public pools is always the perfect temperature. Residents can play golf on one of 40 courses every day for free. Happy hour begins at 11 a.m. Musical entertainment can be found in three town squares 365 nights a year. It’s landlocked but somehow still feels coastal. There’s no (visible) poverty or suffering. Free, consensual, noncommittal sex with a new partner every night is an option. There’s zero litter or dog shit on the sidewalks and hardly any crime and the laws governing the outside world don’t seem to apply here. You can be the you you’ve always dreamed of.

One hundred thousand souls over the age of 55 live here, packed into 54,000 homes spread over 32 square miles and three counties, a greater expanse of land than the island of Manhattan. Increasingly, this is how Americans are spending their golden years — not in the cities and towns where they established their roots, but in communities with people their own age, with similar interests and values. Trailer parks are popping up outside the gates; my aunt and uncle spend the summer months in western Pennsylvania in a gated 55-plus community, and when the weather turns they live in one through the winter to play golf and line-dance in the town squares.

There are people, younger than 55, generally, who suspect that this all seems too good to be true. They — we — point to the elusive, all-powerful billionaire developer who lords over, and profits from, every aspect of his residents’ lives; or the ersatz approximation of some never-realized Main Street USA idyll — so white, so safe — exemplified by Mitt Romney’s tone-deaf rendition of “God Bless America,” performed at one of his many campaign swings through The Villages. But those who live and will likely die here and who feel they’ve earned the right to indulge themselves aren’t anguishing over it. I am living here for a week to figure out if The Villages is a supersize, reinvigorated vision of the American dream, or a caricature, or if there’s even a difference. The question I’m here to try to answer is a scary one: How do we want to finish our lives?

by Alex French, BuzzFeed |  Read more:
Image: Edward Linsmier

Thursday, August 28, 2014


Walter Gerth, Joke Chair, for Strassle.
via:

Feel Good Inc.


[ed. Original and a killer cover version.]
Image: via:

America’s Tech Guru Steps Down—But He’s Not Done Rebooting the Government

The White House confirmed today the rumors that Todd Park, the nation’s Chief Technology Officer and the spiritual leader of its effort to reform the way the government uses technology, is leaving his post. Largely for family reasons—a long-delayed promise to his wife to raise their family in California—he’s moving back to the Bay Area he left when he began working for President Barack Obama in 2009.

But Park is not departing the government, just continuing his efforts on a more relevant coast. Starting in September, he’s assuming a new post, so new that the White House had to figure out what to call him. It finally settled on technology adviser to the White House based in Silicon Valley. But Park knows how he will describe himself: the dude in the Valley who’s working for the president. President Obama said in a statement, “Todd has been, and will continue to be, a key member of my administration.” Park will lead the effort to recruit top talent to help the federal government overhaul its IT. In a sense, he is doubling down on an initiative he’s already set well into motion: bringing a Silicon Valley sensibility to the public sector.

It’s a continuation of what Park has already been doing for months. If you were at the surprisingly louche headquarters of the nonprofit Mozilla Foundation in Mountain View, California, one evening in June, you could have seen for yourself. Park was looking for recruits among the high-performing engineers of Silicon Valley, a group that generally ignores the government.

There were about a hundred of them, filling several lounges and conference rooms. As they waited, they nibbled on the free snacks and beverages from the open pantry; pizza would arrive later. Park, a middle-aged Asian American in a blue polo shirt, approached a makeshift podium. Though he hates the spotlight, in events like these—where his passion for reforming the moribund state of government information technology flares—he has a surprising propensity for breathing fire.

“America needs you!” he said to the crowd. “Not a year from now! But Right. The. Fuck. Now!”

Indeed, America needs them, badly. Astonishing advances in computer technology and connectivity have dramatically transformed just about every aspect of society but government. Achievements that Internet companies seem to pull off effortlessly—innovative, easy-to-use services embraced by hundreds of millions of people—are tougher than Mars probes for federal agencies to execute. The recent history of government IT initiatives reads like a catalog of overspending, delays, and screwups. The Social Security Administration has spent six years and $300 million on a revamp of its disability-claim-filing process that still isn’t finished. The FBI took more than a decade to complete a case-filing system in 2012 at a cost of $670 million. And this summer a routine software update fried the State Department database used in processing visas; the fix took weeks, ruining travel plans for thousands.

Park knows the problem is systemic—a mindset that locks federal IT into obsolete practices—“a lot of people in government are, like, suspended in amber,” he said to the crowd at Mozilla. In the rest of the tech world, nimbleness, speed, risk-taking and relentless testing are second nature, essential to surviving in a competitive landscape that works to the benefit of consumers. But the federal government’s IT mentality is still rooted in caution, as if the digital transformation that has changed our lives is to be regarded with the utmost suspicion. It favors security over experimentation and adherence to bureaucratic procedure over agile problem-solving. That has led to an inherently sclerotic and corruptible system that doesn’t just hamper innovation, it leaves government IT permanently lagging, unable to perform even the most basic functions we expect. So it’s not at all surprising that the government has been unable to attract the world-class engineers who might be able to fix this mess, a fact that helps perpetuate a cycle of substandard services and poorly performing agencies that seems to confirm the canard that anything produced by government is prima facie lousy. “If we don’t get this right,” says Tom Freedman, coauthor of Future of Failure, a 61-page study on the subject for the Ford Foundation, “the future of governing effectively is in real question.”

No one believes this more deeply than Park, a Harvard-educated son of Korean immigrants. Mozilla board member and LinkedIn founder Reid Hoffman had secured the venue on short notice. (“I do what I can to help Todd,” Hoffman later explained. “We’re very fortunate to have him.”) Park, 41, founded two health IT companies—athenahealth and Castlight Health—and led them to successful IPOs before joining the Department of Health and Human Services in 2009 as CTO. In 2012, President Obama named him CTO of the entire US. Last fall, Park’s stress levels increased dramatically when he caught the hot-potato task of rebooting the disastrously dysfunctional HealthCare.gov website. But he was also given special emergency dispensation to ignore all the usual government IT procedures and strictures, permission that he used to pull together a so-called Ad Hoc team of Silicon Valley talent. The team ultimately rebooted the site and in the process provided a potential blueprint for reform. What if Park could duplicate this tech surge, creating similar squads of Silicon Valley types, parachuting them into bureaucracies to fix pressing tech problems? Could they actually clear the way for a golden era of gov-tech, where transformative apps were as likely to come from DC as they were from San Francisco or Mountain View, and people loved to use federal services as much as Googling and buying products on Amazon?

Park wants to move government IT into the open source, cloud-based, rapid-iteration environment that is second nature to the crowd considering his pitch tonight. The president has given reformers like him leave, he told them, “to blow everything the fuck up and make it radically better.” This means taking on big-pocketed federal contractors, risk-averse bureaucrats, and politicians who may rail at overruns but thrive on contributions from those benefiting from the waste.

by Steven Levy, Wired |  Read more:
Image: Michael George

André 3000 Is Moving On in Film, Music and Life

Since April, André 3000 has been on the road, traveling from festival to festival with his old partner Big Boi to celebrate the 20th anniversary of their debut album as Outkast. And on Sept. 26, he’ll star, under his original name, André Benjamin, as Jimi Hendrix in “Jimi: All Is by My Side,” a biopic about the year just before Hendrix’s breakthrough, when he moved to London, underwent a style transformation and squared off against Eric Clapton.

Don’t let those things fool you. Over the eight years since the last Outkast project, the Prohibition-era film and album “Idlewild,” André 3000, now 39, has become, through some combination of happenstance and reluctance, one of the most reclusive figures in modern pop, verging on the chimerical.

Invisible but for his fingerprints, that is. For the better part of his career, André 3000 has been a pioneer, sometimes to his detriment. Outkast was a titan of Southern hip-hop when it was still being maligned by coastal rap purists. On the 2003 double album “Speakerboxxx/The Love Below,” which has been certified 11 times platinum, he effectively abandoned rapping altogether in favor of tender singing, long before melody had become hip-hop’s coin of the realm. His forays into fashion (Benjamin Bixby) and animated television (“Class of 3000”) would have made far more sense — and had a far bigger impact — a couple years down the line. In many ways, André 3000 anticipated the sound and shape of modern hip-hop ambition.

And yet here he is, on a quiet summer afternoon in his hometown, dressed in a hospital scrubs shirt, paint-splattered jeans and black wrestling shoes, talking for several hours before heading to the studio to work on a song he’s producing for Aretha Franklin’s coming album. In conversation, he’s open-eared, contemplative and un-self-conscious, a calm artist who betrays no doubt about the purity of his needs. And he’s a careful student of Hendrix, nailing his sing-songy accent (likening it to Snagglepuss) and even losing 20 pounds off his already slim frame for the part.

“I wanted André for the role, beyond the music, because of where he was psychologically — his curiosity about the world was a lot like Jimi,” said John Ridley, the film’s writer and director, who also wrote the screenplay for “12 Years a Slave.”

In the interview, excerpts from which are below, André 3000 spoke frankly about a tentative return to the spotlight that has at times been tumultuous — the loss of both of his parents, followed by early tour appearances that drew criticism and concern — as well as his bouts with self-doubt and his continuing attempts to redefine himself as an artist. “You do the world a better service,” he said, “by sticking to your guns.”

by Jon Caramanica, NY Times |  Read more:
Image: Patrick Redmond/XLrator Media

The New Commute

Americans already spend an estimated 463 hours, or 19 days annually, inside their cars, 38 hours of which, on average (almost an entire work week), are spent either stalled or creeping along in traffic. Because congestion is now so prevalent, Americans factor in up to 60 minutes of travel time for important trips (like to the airport) that might normally take only 20 minutes. All of this congestion is caused, to a significant degree, by a singular fact—that most commuters in the U.S. (76.4 percent) not only drive to work, but they drive to work alone. For people like Paul Minett, this number represents a spectacular inefficiency, all those empty and available passenger seats flying by on the highway like a vast river of opportunity. And so the next realm of transportation solutions is based on the idea that if we can’t build our way out of our traffic problems, we might be able to think our way out, devising technological solutions that try to fill those empty seats. A lot of that thinking, it turns out, has been happening in San Francisco. (...)

First launched in 2008, Avego’s ridesharing commuter app promised to extend the social media cloud in ways that could land you happily in someone’s unused passenger seat. Sean O’Sullivan, the company’s founder, has described their product as a cross between car-pooling, public transport, and eBay. By using the app, one could, within minutes, or perhaps a day in advance, find a ride with someone going from downtown San Francisco, say, to Sonoma County—with no more planning than it takes to update your Facebook page. Put this tool in the hands of many tens of thousands of users, so goes the vision, and add to it a platform for evaluating riders, a method of automated financial transactions, and a variety of incentives and rewards for participating in the scheme, and you could have something truly revolutionary on your hands.

Urban planners like Mark Hallenbeck, director of the Washington State Transportation Center at the University of Washington, see ridesharing as evidence of a paradigm shift from “thinking about corridors”—about building more and wider highways, for instance—to “thinking about markets.” That shift is happening in large part because there’s nowhere to put larger roads. Seattle, for instance, sits between two topographically limiting features—Lake Washington to the east and Puget Sound to the west. Interstate 5, the main north-south corridor on the West Coast, runs between the two. Widening Interstate 5 to eliminate traffic delays would require the construction of an additional fourteen lanes, according to Hallenbeck. “The reality is, that’s not going to happen,” he says, and not just in Seattle, but in most urban centers. “So, you say, what’s next?” Ridesharing could be one answer, harnessing the vast reservoir of empty seats and, in market terms, giving those empty seats a value, creating liquidity in a new marketplace of seats, passengers, and drivers.

It's worth pausing here to make a distinction between ridesharing and so-called peer-to-peer taxi services like Lyft, Uber, and Sidecar, which have gained considerable traction in larger cities. This became clear to me when, on my first day in San Francisco, I used Uber to arrange a trip from Japantown to Fort Mason in the Marina District. In the lobby of my hotel, I opened my Uber app and got a message that said, “Jose will be arriving in three minutes.” I walked out of the lobby, past the taxi stand, and within seconds a car slid up beside me, tinted window rolling down to reveal a handsome, friendly face asking, “Mark?” Jose was no shlub. He drove his own car, which fairly sparkled inside and out. He offered me a bottle of chilled water. Another driver, a beautician who asked to be called Amy because her boyfriend didn’t know she was working for Lyft, distributes late-night hangover gift bags. She arrived to take me to an interview waving Lyft’s trademark pink mustache out of her sunroof, and soon commenced a GPS-assisted navigation across town that seemed part taxi ride, part amusement park ride, and part therapy session. At the end of the trip, I wanted to pay my fare and ask her to marry me. Of course, Lyft/Uber/Sidecar weren’t actual bona fide “ridesharing” operations, as Brant Arthur, manager of the WeGo project in Sonoma County, reminded me. “They weren’t taking cars off the road—they were just people driving around town in their own cars making extra money.” They were basically, in his words, “unregulated taxis.”

But an entrepreneur named Paul Kogan saw something striking in the way the Lyft app allowed you to view all the potential Lyft drivers in your area, in real time. With the help of a few partners, he founded a company called Hovee, and in May of 2013 launched a series of beta tests at several large Bay Area tech firms. “Our trigger was that screen,” Kogan told me. “The fact that you could now do this was important.” By “this,” Kogan meant match people up to share a certain experience. Kogan’s team turned to dating services for inspiration. They looked at the mobile app Tinder, which helps people locate each other for encounters of the flesh. “What’s interesting about carpooling services,” says Kogan, “is that you always come back to the dating analogy.” It was, of course, the analogy that Kogan’s team was interested in—transportation match-making to facilitate shared rides.
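
[ed. For the technically inclined: a minimal sketch of the matching idea Kogan describes, pairing riders with drivers whose origins, destinations, and departure times roughly line up. The data structures, distance test, and thresholds below are invented for illustration; a real service would match over road networks and live GPS data, not straight-line distance.]

```python
# Hypothetical ride-matching sketch: pair each rider with the first
# driver whose trip starts nearby, ends nearby, and departs within a
# time window. Purely illustrative; units and thresholds are made up.
from dataclasses import dataclass
from math import hypot

@dataclass
class Trip:
    name: str
    origin: tuple        # (x, y) in arbitrary map units
    destination: tuple
    depart_minute: int   # minutes after midnight

def near(a, b, radius=2.0):
    return hypot(a[0] - b[0], a[1] - b[1]) <= radius

def match(riders, drivers, window=20):
    pairs = []
    for rider in riders:
        for driver in drivers:
            if (near(rider.origin, driver.origin)
                    and near(rider.destination, driver.destination)
                    and abs(rider.depart_minute - driver.depart_minute) <= window):
                pairs.append((rider.name, driver.name))
                break  # rider takes the first compatible empty seat
    return pairs

riders = [Trip("Ana", (0, 0), (10, 8), 8 * 60)]
drivers = [Trip("Raj", (1, 1), (9, 8), 8 * 60 + 10)]
print(match(riders, drivers))  # [('Ana', 'Raj')]
```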

by Mark Svenvold, Orion |  Read more:
Image: via:

Lena Gurr (1897-1992), Still Life with Chair
via:

Home Again

Salman Rushdie wrote an amusing little book in 1992. The title of the book is The Wizard of Oz. It’s about the famous movie with Judy Garland’s Dorothy and Toto and the Wicked Witches, East and West. The movie The Wizard of Oz is celebrating its 75th anniversary this month. For three-quarters of a century, this unusual movie has been infecting the brains of young people all over the world. Rushdie was one of them. At age ten, Rushdie wrote his first story. He called it “Over the Rainbow.” Strange to think that there is a direct line from The Wizard of Oz to Rushdie’s now-classic tale of the partition of India, Midnight’s Children (1981).

Rushdie is an unabashed lover of the film. Call the film, he writes, “imaginative truth. Call it (reach for your revolvers now) art.” Rushdie also has strong opinions about what this artful film is and is not about. It is not about going home. Yes, Dorothy frequently talks about going home. After her house falls on the Wicked Witch of the East, the munchkins and the Good Witch Glinda tell her to go home immediately. She isn’t safe in Oz, they tell her, not with the Wicked Witch of the West still lurking about. So Dorothy follows the Yellow Brick Road in order to find the Wizard, who will help her return to her home in Kansas. At the end of the movie, she clicks her ruby slippers together and repeats, “There’s no place like home, there’s no place like home.” In a movie that Rushdie says is not about going home, there is quite a lot of home-talk.

But that’s not, says Rushdie, the real story. “Anybody,” he writes, “who has swallowed the screenwriters’ notion that this is a film about the superiority of ‘home’ over ‘away’, that the ‘moral’ of The Wizard of Oz is as sickly-sweet as an embroidered sampler — East, West, home’s best — would do well to listen to the yearning in Judy Garland’s voice, as her face tilts up towards the skies.”

Point taken. Garland’s Dorothy does yearn and tilt as she sings her famous song. That song is, indeed, the soul of the movie. Ostensibly, Dorothy runs away from home to save her little dog Toto. But maybe that is just an excuse to go on an adventure. One minute Dorothy is worrying about her scruffy small beast, the next minute she is crooning about a magical place she heard of once in a lullaby. Everybody needs a good reason to break the bonds of home and seek out something new, something “over the rainbow.” “In its most emotional moments,” writes Rushdie:
This is unarguably a film about the joys of going away, of leaving the greyness and entering the colour, of making a new life in the place ‘where there isn’t any trouble’. ‘Over the Rainbow’ is, or ought to be, the anthem of all the world’s migrants, all those who go in search of the place where ‘the dreams that you dare to dream really do come true’.
Rushdie is, therefore, highly annoyed by the ending of The Wizard of Oz. Waking up again on her bed in good ol’ Kansas, Dorothy declares, “If I ever go looking for my heart’s desire again, I won’t look further than my own back yard. And if it isn’t there, I never really lost it to begin with.” This is no longer the anthem of all the world’s migrants. This is an encomium to staying put. “How does it come about,” Rushdie asks, “at the close of this radical and enabling film, which teaches us in the least didactic way possible to build on what we have, to make the best of ourselves, that we are given this conservative little homily?” How could anyone renounce the colorful world of Oz, Rushdie wants to know, the dream of a new and different place, for the black and white comforts, the utter drabness of Kansas and all the homespun hominess it represents?

Rushdie’s explanation for this failure of imagination in the movie is that the filmmakers lost their nerve. The default position for the “moralizing” and “sentimental” Hollywood studio system was always for the safe and staid. The pat ending of The Wizard of Oz, thinks Rushdie, can and therefore should be ignored.

But it must be asked: What tension does the plot of the movie contain if Dorothy doesn’t go home?

In fact, Dorothy starts going home very early in the movie. First, she sings her lovely song about rainbows, grabs Toto and leaves the family farm. In a few minutes, she runs into Professor Marvel, the travelling salesman/quack/fortune teller. Professor Marvel realizes he has stumbled upon a little girl who has foolishly run away. He tricks her into worrying about her aunt and wanting to go back home. Within minutes, Dorothy is on her way. That would have been the entire plot of the movie. Little girl worries about dog, dreams about far away places, “runs away” from home for an hour or so, then rushes back to where she belongs and to the people who love her.

But fate has other things in store for Dorothy. There is an event. A freak of nature. A tornado comes. Dorothy is knocked unconscious and the world starts spinning. Perhaps Dorothy has even invoked the cyclone with all the yearning in her song. Now, she is going to have a real journey, whether she likes it or not. But it is a journey in which she never, actually, leaves home. The conceit of the movie (we find out later) is that Dorothy never even leaves her bed. She merely dreamed up Oz after getting bumped on the head. Or did she?

The question to ask, however, is not “Did Dorothy leave home?” but rather “Did she want to leave home?” Did she want to leave Kansas more than she wanted to stay? Did she want to leave simply in order to experience the joy of coming back home? Do we have to leave one home forever in order to find our true home? Is it even possible to talk of home without some knowledge of what is not our home? Home seems an eternal problem.

by Morgan Meis, The Smart Set |  Read more:
Image via: 

The Meet Cute


[ed. This video is interesting not because of the themes it explores: internet dating, expectations gone awry, the manner in which simple misunderstandings escalate into failed relationships. What's more interesting (at least to me) is that an underwear company commissioned this short 'Rom/Commentary' to sell their product, taking product placement to a whole new level. In effect, becoming content creators themselves. A little value-added twist on the one-way blast we usually get from most advertisements.]

Wednesday, August 27, 2014

What Kind of Father Am I?

[ed. Repost]

One evening—not long after my family moved to the old country farmhouse where my wife and I have lived for 45 years—our youngest son (my namesake, Jim, then three-year-old Jimmy) came into the woodshed, while I was there putting away some tools. “Look,” he said proudly, cradling in his arms the largest rat I had ever seen.

Instinctively, in what no doubt would be a genetic response of any parent, I tried to grab the rat from his arms before it bit him; but, as I reached toward it, the rat tightened its body, menacing me with its sharp teeth. At once, I stepped back: that, too, was an instinctive response, though rational thought immediately followed it. Was the rat rabid? Whether that was so or not, it was clear that the rat trusted Jimmy but not me, and yet it might bite both of us if I threatened it further.

“Where did you find it?” I asked my son.

“In the barn.”

“Which barn? The one with all the hay?”

“Yes.”

“It was just lying there, on the hay?”

“Yes, and he likes me.”

“I can see that it does.”

With the possible exception of the difference in our use of pronouns (which just now came to me without conscious intent; could it have risen from some submerged level of my memory?), that little dialogue isn’t an exact transcription—not only because it happened decades ago, but because while I was talking, my mind was elsewhere. I was looking at the garden tools I’d just returned to the wall behind Jimmy, thinking I might ask him to put the rat on the floor so that I could kill it with a whack of a shovel or some other implement. But my son trusted me, just as the rat apparently trusted him; and what kind of traumatic shock would I be visiting upon Jimmy if I smashed the skull of an animal he considered his friend?

The woodshed is in a wing of the house connected to the kitchen, where my wife, Jean, had been preparing dinner. She surprised me by coming quietly to my side; apparently she had overheard our conversation through the screen door and now was offering a solution to the dilemma. She said, “We need to find something to put your pet in, Jimmy.”

“A box,” I said. “Just keep holding it while I find one.” For I remembered at that moment a stout box I had seen while rummaging among all the agricultural items that had collected over the years in the carriage barn across the road—items that fell into disuse after the fields had been cleared, the house and barns constructed, and finally after tractors and cars had replaced horses. Amid the jumble of old harnesses, horse-drawn plow parts, scythes, and two-man saws was a small oblong box that might have contained dynamite fuses or explosives for removing stumps. It had been sawed and sanded from a plank about two inches thick. Like the house itself, it was made of wood far more durable than anything available since the virgin forests were harvested, and all of its edges were covered in metal. Though I felt guilty for leaving Jimmy and Jean with the rat, I was glad to have remembered the box I had admired for its craftsmanship, and I ran in search of it. For the longest time, I couldn’t find it and thought (as I often did later, whenever I found myself unable to resolve a crisis besetting one of our adolescent sons), What kind of father am I? I was close to panic before I finally found the box, more valuable to me at that moment than our recently purchased Greek-revival farmhouse—the kind of family home I’d long dreamed of owning.

A film of these events still runs through my mind, but I will summarize the rest of it here. Jimmy was initially the director of this movie, with Jean and me the actors obedient to his command: that is to say, he obstinately refused to put the rat into the box until a suitable bed was made for it—old rags wouldn’t do, for it had to be as soft as his favorite blanket. The rat gave him his authority, for it trusted Jean no more than it trusted me; it remained unperturbed in his embrace for a few minutes more, while Jean searched for and then cut several sections from a tattered blanket. Our son was satisfied with that bed, and the rat—whose trust in a three-year-old seemed infinite—seemed equally pleased, permitting Jimmy to place it on the soft strips. As soon as we put the lid on the box, I called the county health department, only to be told that the office had closed; I was to take in the rat first thing in the morning so that its brain could be dissected.

In response to Jean’s immediate question, “Did the rat bite you?” Jimmy said, “No, he kissed me.” Could any parent have believed an answer like that? My response was simply to put the box outside. Before giving our son a bath, we scrutinized every part of his body, finding no scratches anywhere on it. During the night the rat gnawed a hole through the wood, and by dawn it had disappeared.

Forty-odd years ago, rabies vaccination involved a lengthy series of shots, each of them painful, and occasionally the process itself was fatal. Neither the health department nor our pediatrician would tell us what to do. Once again we searched Jimmy’s body for the slightest scratch and again found nothing; so we decided to withhold the vaccination—though Jean and I slept poorly for several nights. Long after it had become apparent that our son had not contracted a fatal disease, I kept thinking—as I again do, in remembering the event—of the errors I had made, of what I should have done instead, of how helpless I had felt following my discovery that the rat had escaped.

While reading a recent biography of William James by Robert D. Richardson Jr., I found myself recalling those suspenseful and seemingly never-ending hours. As Richardson demonstrates, James was aware of the extent that circumstance and random events (like the one that led my young son to a particular rat so long ago) can alter the course of history as well as the lives of individuals, making the future unpredictable. James, like my favorite writer, Chekhov, was trained as a medical doctor and became an author—though not of stories and plays (his younger brother Henry was the fiction writer) but of books and articles on philosophical, psychological, and spiritual matters. One of the founders of American pragmatism, James rejected European reliance on Platonic absolutes or on religious and philosophical doctrines that declared the historical necessity of certain future events. Despite his realization that much lies beyond our present and future control, James still believed in the independence of individual will, a view essential to the long-lasting but often precarious freedom underlying our democratic system.

by James McConkey, American Scholar |  Read more:
Image: via:

Why Is Bumbershoot Better This Year Than Previous Years?

[ed. Personally, I wouldn't go to a rock festival again if you paid me. But I understand everyone has a different burnout point. See also: A Rational Conversation: Do We Really Need A Rock Festival?]

Something feels different about Bumbershoot this year. In the weeks after One Reel announced the lineup for this summer's festival, artists, musicians, critics, and friends began saying something I hadn't heard in years: "Wow, this year's Bumbershoot looks amazing." (...)

Anecdotally, it feels like a better spread and a break from Bumbershoots past that seemed to spend a huge amount of money on superstars like Bob Dylan and leave the rest of the acts in relative neglect. I'm sure the folks at One Reel would take issue with any implication that they weren't working their asses off every year, but the public perception was that it felt less like an integrated music and culture festival and more like a Tacoma Dome gig with a few ragtag bands invited to busk in the parking lot.

The reason this year feels different, say the people at One Reel, is because it actually is. "Any given year is one person's best-ever year and another person's worst-ever year," says One Reel executive director Jon Stone. "Every year we are beat up and held up as champions at the same time, which is part of the fun." But he also says that things changed dramatically in the wake of the 2010 festival, which starred Bob Dylan, Mary J. Blige, Weezer, and Hole—and turned out to be a bust, forcing One Reel to lay off 8 of its 14 full-time, year-round festival employees. Soon after that, Teatro ZinZanni, which had started as a One Reel project, spun off and became its own entity. (...)

Why was 2010 such a crucible for the festival?

The first reason, Stone says, is that Bumbershoot found itself pouring "phenomenal resources" into headline acts. "That part is inversely proportional to the death of the record industry," he explains. "Artists used to make money on record sales and tour as a loss leader. Now artists make nothing on record sales... so fees for performances went up." In the early 2000s, he says, it cost $30,000 to put a main-stage name in Memorial Stadium and fill it up. But by the late 2000s, that number increased tenfold, costing One Reel $350,000 or more to do the same thing. "It was us not seeing the writing on the wall," he says.

The second reason was something more like hubris. In the 2000s, Stone says, Bumbershoot was getting national media attention and being compared to the big shots like Coachella and Bonnaroo. "We began to drink that Kool-Aid and thought, 'We've got to follow the leaders'" and book superstars. In retrospect, he says, that was "a huge mistake" for a few reasons. "What's been happening with the music industry in general—and festivals in particular—is a path towards unsustainability. They're not local, curated celebrations anymore. Global corporations run them." And when global corporations take over music festivals, he says, "innovation stops and the soulless and relentless milking of the consumer dollar starts."

by Brendan Kiley, The Stranger | Read more:
Image: Mark Kaufman

Tuesday, August 26, 2014

The One Who Knocks

[ed. I started watching Breaking Bad several weeks ago while house-sitting at my kids' place - six episodes the first night. It's that good. After re-starting my Netflix account just to continue the saga, I'm now up to episode 56. I wouldn't say Bryan Cranston is the second coming of Marlon Brando, but he does an admirable job as Walter White (meek chemistry teacher turned meth kingpin). And everyone else in the series is first rate, too. Spoiler alert: if you haven't seen the show, you might not want to read this review.]

For years, Cranston scrabbled after guest-star roles on crappy TV shows while making his living in commercials. He played a bland smoothie with bread-loaf hair who just happened to love Shield Deodorant Soap, Arrow Shirts, Coffee-mate, and Excedrin, a middle-of-the-bell-curve guy who, despite his initial skepticism, was really sold on the product: “Now you can relieve inflamed hemorrhoidal tissue with the oxygen action of Preparation H.” He says, “I had that everyman look—nonthreatening, non-distracting, no facial hair. I fit in.”

As he was often the last person cast on a show or film, his strategy was to play the opposite of what the ensemble already had. Drama is conflict, after all. When he auditioned for the father on “Malcolm in the Middle,” the Fox sitcom about a crew of unruly brothers, he knew that the boys’ mother was bombastic, fearless, and insightful, so he played the father as gentle, timid, and obtuse. “It was a genius way to make an underwritten part work,” Linwood Boomer, the show’s creator, says. “By the third episode, we realized we had to do a lot more writing for the guy.”

“Malcolm” aired from 2000 to 2006, and established Cranston as a television fixture, if not a star. Yet even after he landed the lead in “Breaking Bad,” in 2007, he framed his character, Walter White, as an opposite—in this case, the opposite of the man Walter would become. The show is about a fifty-year-old high-school chemistry teacher in Albuquerque who, after getting a diagnosis of terminal lung cancer, secretly works with a former student, the sweet yo-yo Jesse Pinkman (Aaron Paul), to make enough crystal meth to leave a nest egg for his family. Walt’s extremely pure product becomes wildly successful, but at great cost to everyone around him.

Vince Gilligan, the show’s creator and executive producer, had sold it to the AMC network as “a man who goes from Mr. Chips to Scarface,” and, in the pilot, Walt tells his students that chemistry is “the study of change.” But Cranston quietly shifted the arc from good-man-becomes-bad to invisible-man-becomes-vivid. In pre-production, Gilligan recalls, Cranston began to construct an ideal nebbish: “Bryan said, ‘I think I should have a mustache, and it should be light and thin and look like a dead caterpillar, and I should be pale, and a little doughier, a hundred and eighty-six pounds.’ ”

Cranston explains, “I wanted Walt to have the body type of my dad, who’s now eighty-nine, like Walt was a much older man. When I was studying my dad, taking on his posture and burdens—I didn’t tell him I was doing it—I noticed I was also taking on some of his characteristics, the ‘Aw, jeez,’ or an eye roll, or”—he gave a skeptical grimace—“when Jesse did something stupid.”

Gilligan, an amiable, fatalistic Virginian, says, “I had a very schematic understanding of Walt in the early going. I was thinking structurally: we’d have a good man beset from all sides by remorseless fate.” Not only does Walt have cancer, an empty savings account, and searing regrets about his career path but his son has cerebral palsy and his wife, Skyler, is unexpectedly pregnant. Gilligan gave a wry smile. “The truth is you have to be very schematic indeed to force someone into cooking crystal meth.”

Instead, Cranston played the role so that Walter’s lung-cancer diagnosis catalyzes a gaudy midlife crisis—so that a luna moth breaks from the drabbest of cocoons. Across the show’s five seasons, which depict a lively two years, Walt is increasingly inhabited by Heisenberg, his drug-dealing pseudonym and alter ego—a figure Cranston describes as “the emotionless, brave, risk-taking imaginary friend who looks out for Walt’s best interests.” Early in the first season, when Walt scurries out of his Pontiac Aztek to retrieve the drug dealer Krazy-8, who lies unconscious on a suburban corner in broad daylight, he’s terrified of being seen, and takes tiny nerdy steps, his shoulders twitching with self-consciousness. There is a touch of Hal, the father Cranston played on “Malcolm in the Middle,” about him still—he might almost waggle his hands in panic for comic effect. (The first season of the show was particularly funny, if darkly so, and Vince Gilligan asked his colleagues whether he should submit it to the Emmys as a drama or a comedy.)

After undergoing chemotherapy, Walt shaves his head and grows a Vandyke, alpha-male plumage that helps him play the bruiser. By the end of the second season, he rousts two would-be meth cooks from his territory with pure assurance: a wide stance, arms relaxed yet poised to strike. And when he reveals his hidden powers to his wife in the famous “I am the one who knocks!” speech, he levels his hand at her like a gun. “The more believable humanity of Walter White—the discovery that he’s not a good man but an everyman—is due to Bryan,” Gilligan said. “The writers realized, from his acting, that Walt isn’t cooking for his family; he’s cooking for himself.”

By the fifth season, having killed Krazy-8 and become responsible for at least a hundred and ninety-four other deaths, Walt has no anxiety left. His voice is low and commanding, his manner brash—he’s eager to be seen. He was cowed at first by his brother-in-law, Hank Schrader, a bluff D.E.A. agent who treats him with kindly contempt. But soon enough he’s snarling at Hank, “I’m done explaining myself,” and taunting him for misidentifying Heisenberg: “This genius of yours, maybe he’s still out there.” Then he eliminates his boss, a drug lord named Gus Fring (Giancarlo Esposito), by blowing his face off with a wheelchair bomb. As Walt takes on the role of the dominant dealer, Cranston has him unconsciously appropriate some of Esposito’s coiled stillness. “I wanted to plant a subliminal thing with the audience,” he says. “But it was Bryan who modelled Walt’s body language on Gus’s—Walt didn’t know what he was doing. All he knew is that he felt more confident with his shoulders back.”

In movies, unless you’re the star, you’re going to play an archetype. Studios, noticing the authority in Cranston’s persona, have often cast him as a colonel (“Saving Private Ryan,” “John Carter,” “Red Tails”). Ben Affleck, who hired him to be the C.I.A.’s version of a colonel in “Argo,” says, “Bryan is the boss you might actually like. He’s not a general and he’s not a sergeant—he’s a colonel.” Yet Cranston’s friend Jason Alexander, who starred as George Costanza on “Seinfeld,” says, “Bryan doesn’t play an idea particularly well, those military roles. That’s because his strongest card is complexity, where you can’t figure out what he represents until he gradually reveals himself.” A producer friend of Cranston’s observes that he doesn’t stand out in such films as “Total Recall,” where he chewed the scenery as a dictator, “because he wasn’t reined in. Actors want to act, but you need someone who will say, ‘Give me the take where he’s doing less.’ ”

A cable series, a format that showcases accretive subtlety, is where Cranston could truly shine. Luckily, cable’s golden age arrived just as he did. “Bryan had to grow into his weight as an actor,” John O’Hurley, a close friend of Cranston’s since the mid-eighties, when they were both married to the same woman on the soap opera “Loving,” says. “He became dangerous when he began letting his eyes go dead. It’s the sign of a man with nothing to lose.”

by Tad Friend, New Yorker |  Read more:
Image: Ian Wright

How Plagues Really Work

The latest epidemic to terrify the Western world is Ebola, a virus that has killed hundreds in Africa in 2014 alone. No wonder there was so much worry when two infected health care workers from the United States were transported home from Liberia for treatment – why bring this plague to the US, exposing the rest of the country as well? But the truth is that Ebola, murderous though it is, doesn’t have what it takes to produce a pandemic, a worldwide outbreak of infectious disease. It spreads only through intimate contact with infected body fluids; to avoid Ebola, just refrain from touching sweat, blood or the bodies of the sick or dead.

Yet no logic can quell our pandemic paranoia, which first infected the zeitgeist with the publication of Laurie Garrett’s The Coming Plague (1994) and Richard Preston’s Hot Zone (1995). These books suggested that human incursion into rainforests and jungles would stir deadly viruses in wait; perturb nature and she nails you in the end. By the late 1990s, we were deep into the biological weapons scare, pumping billions of dollars in worldwide government funding to fight evil, lab-made disease. As if this weren’t enough, the panic caused from 2004 to 2007 by reports of the H5N1 or bird flu virus etched the prospect of a cross-species Andromeda strain in the Western mind.

The fear seems confirmed by historical memory: after all, plagues have killed a lot of people, and deadly diseases litter history like black confetti. The Antonine Plague, attributed to measles or smallpox in the year 165 CE, killed the Roman Emperor Marcus Aurelius and millions of his subjects. The Justinian Plague, caused by the deadly bacterial pathogen Yersinia pestis, spread from North Africa across the Mediterranean Sea to Constantinople and other cities along the Mediterranean. By 542, infected rats and fleas had carried the infection as far north as Rennes in France and into the heart of Germany. Millions died.

Then there was the Black Death of 1348-50, also caused by Yersinia pestis, but this time spread by human fleas and from human lung to human lung, through the air. The plague spread along the Silk Road to what is now Afghanistan, India, Persia, Constantinople, and thence across the Mediterranean to Italy and the rest of Europe, killing tens of millions worldwide. Of all the past pandemics, the 1918 influenza (also known as the Spanish flu) is now considered the über-threat, the rod by which all other pandemics are measured. It killed 40 million people around the globe.

It was the great Australian virologist Frank Macfarlane Burnet who argued that the deadliest diseases were those newly introduced into the human species. It seemed to make sense: the parasite that kills its host is a dead parasite since, without the host, the germ has no way to survive and spread. According to this argument, new germs that erupt into our species will be potential triggers for pandemics, while germs that have a long history in a host species will have evolved to be relatively benign.

Many health experts take the notion further, contending that any coming plague will come from human intrusion into the natural world. One risk, they suggest, comes when hungry people in Africa and elsewhere forge deep into forests and jungles to hunt ‘bushmeat’ – rodents, rabbits, monkeys, apes – with exposure to dangerous pathogens the unhappy result. Those pathogens move silently among wild animals, but can also explode with terrifying ferocity among people when humans venture where they shouldn’t. According to the same line of thought, another proposed risk would result when birds spread a new pandemic strain to chickens in factory farms and, ultimately, to us.

But there’s something in these scenarios that’s not entirely logical. There is nothing new in the intimate contact between animals and people. Our hominid ancestors lived on wildlife before we ever evolved into Homo sapiens: that’s why anthropologists call them hunter-gatherers, a term that still applies to some modern peoples, including bushmeat hunters in West Africa. After domesticating animals, we lived close beside them, keeping cows, pigs and chickens in farmyards and even within households for thousands of years. Pandemics arise out of more than mere contact between human beings and animals: from an evolutionary point of view, there is a missing step between animal pathogen and human pandemic that’s been almost completely overlooked in these terrifying but entirely speculative ideas.

by Wendy Orent, Aeon |  Read more:
Image: Stefano Rellandini/Reuters

Monday, August 25, 2014


Photo: markk

Photo: markk

Mimicking Airlines, Hotels Get Fee-Happy

[ed. Companion piece to the post following this one. From cable charges, to airline fees, to road tolls, to credit/debit card penalties, to miscellaneous utility assessments and on and on and on... consumers are getting dinged like never before.] 

Forget bad weather, traffic jams and kids asking, "Are we there yet?" The real headache for many travelers is a quickly growing list of hotel surcharges, even for items they never use.

Guaranteeing two queen beds or one king bed will cost you, as will checking in early or checking out late. Don't need the in-room safe? You're likely still paying. And the overpriced can of soda may be the least of your issues with the hotel minibar.

Vacationers are finding it harder to anticipate the true cost of their stay, especially because many of these charges vary from hotel to hotel, even within the same chain.

Coming out of the recession, the travel industry grew fee-happy. Car rental companies charged extra for services such as electronic toll collection devices and navigation systems. And airlines gained notoriety for adding fees for checking luggage, picking seats in advance, skipping lines at security and boarding early. Hotel surcharges predate the recession, but recently properties have been catching up to the rest of the industry.

"The airlines have done a really nice job of making hotel fees and surcharges seem reasonable," says Bjorn Hanson, a professor at New York University's hospitality school.

This year, hotels will take in a record $2.25 billion in revenue from such add-ons, 6 percent more than in 2013 and nearly double that of a decade ago, according to a new study released Monday by Hanson. Nearly half of the increase can be attributed to new surcharges and hotels increasing the amounts of existing fees.

by Scott Mayerowitz, AP |  Read more:
Image: John Locher/AP

Did Congestion Charging Just Go Viral?

[ed. I'd never heard of congestion charging until today. Sounds like a pretty hard sell.]

Congestion charging or pricing is the practice of setting up cordon tolls around a city on a large scale, charging vehicles that enter during peak hours. Ideally, this is done automatically, with cameras registering your license plate and billing you directly. This is different from low emissions zones, which are specific zones that limit the type of vehicles that can enter, and when.
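To make the mechanics concrete, here is a minimal sketch of the billing decision an automated cordon system has to make for each camera-detected entry. This is an illustration only, not any city's actual system; the peak windows and the flat fee are made-up placeholders.

    from datetime import datetime, time

    # Hypothetical peak-hour windows and flat cordon charge
    PEAK_WINDOWS = [(time(7, 0), time(9, 30)), (time(16, 0), time(18, 30))]
    CORDON_FEE = 3.50

    def charge_for_entry(plate: str, entered_at: datetime) -> float:
        """Return the toll owed for one camera-detected entry into the cordon."""
        t = entered_at.time()
        in_peak = any(start <= t <= end for start, end in PEAK_WINDOWS)
        return CORDON_FEE if in_peak else 0.0

    # A plate photographed entering at 8:15 a.m. gets billed; one at noon does not.
    print(charge_for_entry("ABC123", datetime(2014, 8, 25, 8, 15)))  # 3.5
    print(charge_for_entry("ABC123", datetime(2014, 8, 25, 12, 0)))  # 0.0

Real systems layer exemptions, caps, and time-of-day pricing tiers on top of this basic check, but the core is just plate recognition plus a clock.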

City-scale congestion charging is picking up steam as a policy tool to free cities from crippling traffic. Singapore led the way starting in 1975, and London, Milan, and Stockholm have since followed suit. In 2008, the former Mayor of New York City Michael Bloomberg led a valiant, but eventually doomed effort to install congestion charging around Manhattan. However, despite New York’s setback and otherwise sporadic progress, three news items make me wonder if congestion pricing is reaching a tipping point:

First, despite New York’s failed attempt, it looks as if a bottom-up plan could revive the city’s efforts. With crippling congestion and underfunded transit projects, New Yorkers are starting to rally to the cause. The key to success this time might be better consultation and more community engagement. So far so good.

Second, Stockholm’s initially shaky congestion pricing plan is now considered an unobtrusive part of life. In fact, its popularity spurred Gothenburg to follow suit, and there are now proposals for all major Swedish cities to adopt the system [in Swedish].

Finally, we turn to the mother lode of traffic: China. Not only have Beijing and Shanghai studied the possibility of congestion charging for a while now, it appears that Beijing is going to institute it next year, using its many ring roads to its advantage.

by Tali Trigg, Scientific American | Read more:
Image: Stockholm Transport Styrelsen.

Sunday, August 24, 2014


Xiao Wen Ju fronts the Lane Crawford Spring/Summer 2014 Campaign
via:

Mutablend (on Flickr), No Communication No Love
via:

Every Insanely Mystifying Paradox in Physics: A Complete List


Today’s brain-melter: Every Insanely Mystifying Paradox in Physics. It’s all there, from the Greisen-Zatsepin-Kuzmin limit to quantum immortality to, of course, the tachyonic antitelephone.
A tachyonic antitelephone is a hypothetical device in theoretical physics that could be used to send signals into one’s own past. Albert Einstein in 1907 presented a thought experiment of how faster-than-light signals can lead to a paradox of causality, which was described by Einstein and Arnold Sommerfeld in 1910 as a means “to telegraph into the past”.
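A short worked inequality from standard special relativity (not part of the list itself, and using my own symbols: signal speed a, observer velocity v) shows where the causality trouble comes from. Send a signal at speed a in one frame, so it covers a distance \(\Delta x = a\,\Delta t\) in time \(\Delta t\); the Lorentz transformation gives the interval in a frame moving at velocity v:

\[
\Delta t' = \gamma\!\left(\Delta t - \frac{v\,\Delta x}{c^{2}}\right)
= \gamma\,\Delta t\left(1 - \frac{v a}{c^{2}}\right).
\]

If the signal speed a exceeds c, then any observer moving at \(c^{2}/a < v < c\) measures \(\Delta t' < 0\): in that frame the message arrives before it was sent, which is exactly the opening Einstein and Sommerfeld needed to "telegraph into the past."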
If you emerge with your brain intact, at the very least, you’ll have lost a couple of hours to the list.

by Cliff Pickover, Sprott Physics, Univ. of Wisconsin | Read more:
via: Kottke.org