Thursday, September 4, 2014

Wait Six Years to Buy Your Next Car

You’ll be able to buy a car that can drive itself under most conditions, with an option for override by a human driver, in 2020, according to the median estimate in a survey of 217 attendees of the 2014 Automated Vehicles Symposium. By 2030, the group estimated, you’ll be able to buy a car that is so fully automated it won’t even have the option for a human driver.

Though 2020 is just six years away, there remains a lot of debate over how the industry is going to get there. Most auto manufacturers are incrementalists, adding automated features such as adaptive cruise control, self-parking, and traffic-jam assist, two or three at a time. Google and some others in Silicon Valley, however, are more interested in producing highly or even fully automated cars as soon as possible.

The Society of Automotive Engineers and the National Highway Traffic Safety Administration have slightly different definitions for the levels of automated cars. But both basically agree that a “partially automated” car can take over some driving functions, such as speed and steering, but can’t actually drive itself; a “highly automated” car can drive itself under most conditions but has a human override; and a “fully automated” car can drive itself without a human override.

Don Norman, a human-factors engineer from UC San Diego, seemingly endorsed Google’s strategy in a keynote speech Wednesday that argued that even highly automated vehicles might be too dangerous for people to use. “I strongly favor full automation,” he said, but feared that highly automated vehicles might find themselves unable to handle some condition and give the human driver an inadequate amount of time to safely take over.

Airplanes have been highly automated for years, Norman pointed out, but if a plane that is 30,000 feet up suddenly decides it can’t handle current conditions and demands that a human take over, the human pilot has several minutes before the plane might crash. An auto driver in the same position might have only a fraction of a second.

As I indicated here July 16, another source of debate is whether cars should come with vehicle-to-vehicle (V2V) communications. This would allow, among other things, cars to operate in “platoons” of several to many cars each. Since the cars would be in constant contact with each other, they could close the gaps between them and greatly increase highway capacities.
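
The capacity claim is easy to see with a rough back-of-the-envelope sketch (my own illustration, not from the article): a lane's throughput is roughly speed divided by the road space each car occupies, so shrinking the following gap from a human reaction-time headway to a tight platoon gap multiplies how many cars the lane can carry per hour. The speed, car length, and gap values below are assumptions chosen only to make the arithmetic concrete.

```python
# Back-of-the-envelope lane-capacity sketch (illustrative numbers, not from the article).
# Capacity (vehicles/hour) ~= speed / (vehicle length + following gap).

def lane_capacity(speed_kmh, gap_m, car_length_m=4.5):
    """Approximate vehicles per hour in one lane at a steady speed and following gap."""
    speed_m_per_h = speed_kmh * 1000.0
    spacing_m = car_length_m + gap_m      # road space each car occupies
    return speed_m_per_h / spacing_m

speed = 100.0        # km/h, assumed cruising speed

human_gap = 55.0     # ~2-second human following gap at 100 km/h is roughly 55 m
platoon_gap = 10.0   # cars in constant V2V contact might hold a ~10 m gap (assumption)

human = lane_capacity(speed, human_gap)
platoon = lane_capacity(speed, platoon_gap)
print(f"Human-driven:  ~{human:,.0f} vehicles/hour per lane")
print(f"Platooned V2V: ~{platoon:,.0f} vehicles/hour per lane ({platoon / human:.1f}x)")
```

With these assumed numbers the platooned lane carries roughly four times as many cars, which is the sense in which closing the gaps "greatly increases" capacity.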

by Randal O'Toole, Kurzweil |  Read more:
Image: Harbrick

Wednesday, September 3, 2014


Andreas Paradise, Lafkos, Greece, 2013
via:

Why Nerdy White Guys Who Love the Blues Are Obsessed With a Wisconsin Chair Factory

In the 2001 movie “Ghost World,” 18-year-old Enid picks up the arm on her turntable, drops the needle in the groove, and plays a song yet another time. She can’t get over the emotional power of bluesman Skip James’ 1931 recording of “Devil Got My Woman.” If you know anything about 78 records, it only makes sense that a nerdy 40-something 78 collector named Seymour would have introduced her to this tune. As played by Steve Buscemi, Seymour is an awkward, introverted sadsack based on the film’s director, Terry Zwigoff, who—along with his comic-artist pal, Robert Crumb—is an avid collector of 78s, a medium whose most haunting and rarest tracks are the blues songs recorded in the 1920s and ’30s.

Nearly a decade later, music critic and reporter Amanda Petrusich had the same intoxicating experience Enid (Thora Birch) did, listening to the very same song, although she got to hear “Devil Got My Woman” played on its original 78, courtesy of a real-life collector, who owns this prohibitively expensive shellac record pressed by Paramount. Only three or four copies are known to exist.

The gramophone, a type of phonograph that played 10-inch shellac discs at 78 rpm, was developed in the late 19th century. But it wasn’t until the 1910s and ’20s that the technology became more affordable and less cumbersome so that an average family could have one at home. The records, which could only play 2 to 3 minutes of sound per side, had their heyday in the ’20s and ’30s. They lost their cachet in the ’40s, when radio became the most popular format for music lovers. Then in the ’50s and ’60s, 78 records were phased out in favor of long-playing vinyl records.

Paramount blues records, in particular, seem to get under the skin of modern 78 collectors. From 1922 to 1932, the label, founded by a furniture company in suburban Wisconsin, discovered some of the most legendary blues icons of the 20th century—Charley Patton, Son House, Blind Blake, Ma Rainey, and Blind Lemon Jefferson—thanks to African American producer J. Mayo Williams, who recruited talent scouts to find these impoverished artists in the South, and then paid the artists a pittance to record for Paramount. These “race records,” meant exclusively for black audiences, were made in limited runs from a cheap, low-quality mixture of shellac that gives them a ghostly, crackling sound. Their rarity, the strange sounds they make, and the brilliance of these artists (who mostly remained obscure at the time) have led to a full-blown fervor in the 78 world. Even rock star Jack White, who founded Third Man Records, is obsessed with Paramount. Last year, White teamed up with Revenant Records’ Dean Blackwood to release a box set of vinyl albums featuring 800 known Paramount tracks. (Yours for a paltry $400.)

Petrusich, who spent years immersing herself in the world of 78 collectors as a reporter, got so obsessed with Paramount Records, she went diving into the murky waters of the Milwaukee River to look for discarded shellac. Now, she’s released a book about her experience of getting swept up in this mania, Do Not Sell at Any Price: The Wild, Obsessive Hunt for the World’s Rarest 78 rpm Records. We talked to Petrusich about the characters she met, the important preservationist work they’re doing, and how white men ended up writing the narrative of a music genre created by impoverished African Americans. (...)

Collectors Weekly: Can you tell me a little bit about the history of Paramount Records?

Petrusich: Paramount is this incredible label that was born from a company called the Wisconsin Chair Company, which was making chairs, obviously. The company had started building phonograph cabinets to contain turntables, which they also were licensing. And they developed, like many furniture companies, an arm that was a record label so that they could make records to sell with the cabinets. This was before a time in which record stores existed. People bought their records at the furniture store, because they were things you needed to make your furniture work.

So the Wisconsin Chair Company, based in the Grafton-Port Washington area of Wisconsin, started the Paramount label. And they accidentally ended up recording whom I believe to be some of the most incredible performers in American musical history. Paramount started a “race record” series in the late 1920s after a few other labels had success doing that model, by which African American artists recorded music for African American audiences. Through a complex series of talent scouts, they would bring artists mostly from the Southeast up to Wisconsin to record, which in and of itself was just insane and miraculous. These are Mississippi bluesmen, being brought to this white rural town in Wisconsin, and you can’t imagine how foreign it must have been to them to see that landscape. Sometimes the performers would record for Paramount in Chicago, but later in Paramount’s history, the company built a studio right in Grafton, and it was a notoriously bad studio. It had shoddy, handmade equipment, and then the records that Paramount was pressing were really cheap. It was a very bad mixture of shellac, and Paramount records are infamous for having a lot of surface noise.

But as I said, they captured some of the best performers in American history, folks like Skip James, Charley Patton, Blind Lemon Jefferson, and Geeshie Wiley—all these really incredible singers. At the time, Paramount didn’t know what it was doing. It hasn’t been until now that people are like, “Oh my God, this label rewrote American history.” I don’t think Paramount was remotely cognizant of the significance of the work that was being recorded in their studio.

by Lisa Hix, Collectors Weekly |  Read more:
Image: Robert Crumb

Creativity Creep

Every culture elects some central virtues, and creativity is one of ours. In fact, right now, we’re living through a creativity boom. Few qualities are more sought after, few skills more envied. Everyone wants to be more creative—how else, we think, can we become fully realized people?

Creativity is now a literary genre unto itself: every year, more and more creativity books promise to teach creativity to the uncreative. A tower of them has risen on my desk—Ed Catmull and Amy Wallace’s “Creativity, Inc.”; Philippe Petit’s “Creativity: The Perfect Crime”—each aiming to “unleash,” “unblock,” or “start the flow” of creativity at home, in the arts, or at work. Work-based creativity, especially, is a growth area. In “Creativity on Demand,” one of the business-minded books, the creativity guru Michael Gelb reports on a 2010 survey conducted by I.B.M.’s Institute for Business Value, which asked fifteen hundred chief executives what they valued in their employees. “Although ‘execution’ and ‘engagement’ continue to be highly valued,” Gelb reports, “the CEOs had a new number-one priority: creativity,” which is now seen as “the key to successful leadership in an increasingly complex world.” Meanwhile, at the other end of the spectrum, Julia Cameron’s best-selling “The Artist’s Way” proposes creativity as a path to personal, even spiritual fulfillment: “The heart of creativity is an experience of the mystical union,” Cameron writes. “The heart of the mystical union is an experience of creativity.” It’s a measure of creativity’s appeal that we look to it to solve such a wide range of problems. Creativity has become, for many of us, the missing piece in a life that seems routinized, claustrophobic, and frivolous.

How did we come to care so much about creativity? The language surrounding it, of unleashing, unlocking, awakening, developing, flowing, and so on, makes it sound like an organic and primordial part of ourselves which we must set free—something with which it’s natural to be preoccupied. But it wasn’t always so; people didn’t always care so much about, or even think in terms of, creativity. In the ancient world, good ideas were thought to come from the gods, or, at any rate, from outside of the self. During the Enlightenment, rationality was the guiding principle, and philosophers sought out procedures for thinking, such as the scientific method, that might result in new knowledge. People back then talked about “imagination,” but their idea of it was less exalted than ours. They saw imagination as a kind of mental scratch pad: a system for calling facts and images to the mind’s eye and for comparing and making connections between them. They didn’t think of the imagination as “creative.” In fact, they saw it as a poor substitute for reality; Hobbes called it “decayed sense.”

It was Romanticism, in the late eighteenth and early nineteenth centuries, which took the imagination and elevated it, giving us the “creative imagination.” (That’s the title of a classic intellectual history of this period, by the literary scholar James Engell.) People like Samuel Taylor Coleridge argued that we don’t just store things in our imaginations; we transform them. Coleridge made a useful distinction, largely lost today, between two kinds of imagining. All of us, he thought, have a workaday imagination, which we use to recall memories, make plans, and solve problems; he called this practical imagination “fancy.” But we also have a nobler kind of imagination, which operates, as Engell puts it, like “a human reflex of God’s creative energy.” The first kind of imagination understands the world; the second kind cares about it and brings it to life. In the “Prelude,” Wordsworth describes this kind of imagination as “an auxiliary light” that changes everything it illuminates:

An auxiliary light
Came from my mind which on the setting sun
Bestowed new splendor, the melodious birds,
The gentle breezes, fountains that ran on,
Murmuring so sweetly in themselves, obeyed
A like dominion; and the midnight storm
Grew darker in the presence of my eye.

This watchful, inner kind of creativity is not about making things but about experiencing life in a creative way; it’s a way of asserting your own presence amidst the much larger world of nature, and of finding significance in that wider world. By contrast, our current sense of creativity is almost entirely bound up with the making of stuff. If you have a creative imagination but don’t make anything, we regard that as a problem—we say that you’re “blocked.”

How did creativity transform from a way of being to a way of doing? The answer, essentially, is that it became a scientific subject, rather than a philosophical one.

by Joshua Rothman, New Yorker |  Read more:
Image: Boyoun Kim

Tuesday, September 2, 2014

Feist


[ed. How many people know that Feist and Peaches used to be roommates? I know, blows your mind. Here's some vintage Feist, just because I love her... 1234 (and the Sesame Street version) and So Sorry. You can check out Peaches kickin' it with Iggy Pop in the next post.]

The Taming of the Stooge


"I would characterize it sort of like a powerful interest group within a political party at this point. It used to be the entire political party."
—Iggy Pop explains his current relationship with his penis.

h/t The Awl 

[ed. Thanks to whoever pulled this up from the archives today, I'd forgotten I posted it (like so many other things). I need to get back there once in a while (and you do, too). ps. Peaches cracks me up: Fuck the pain away - here and here.]

Monday, September 1, 2014






Henri-Georges Clouzot, La Prisonnière (1968)
via:

Hoda Afshar, Westoxicated #7, 2013
via:

Operation 'Washtub'

Fearing a Russian invasion and occupation of Alaska, the U.S. government in the early Cold War years recruited and trained fishermen, bush pilots, trappers and other private citizens across Alaska for a covert network to feed wartime intelligence to the military, newly declassified Air Force and FBI documents show.

Invasion of Alaska? Yes. It seemed like a real possibility in 1950.

"The military believes that it would be an airborne invasion involving bombing and the dropping of paratroopers," one FBI memo said. The most likely targets were thought to be Nome, Fairbanks, Anchorage and Seward.

So FBI director J. Edgar Hoover teamed up on a highly classified project, code-named "Washtub," with the newly created Air Force Office of Special Investigations, headed by Hoover protege and former FBI official Joseph F. Carroll.

The secret plan was to have citizen-agents in key locations in Alaska ready to hide from the invaders of what was then only a U.S. territory. The citizen-agents would find their way to survival caches of food, cold-weather gear, message-coding material and radios. In hiding they would transmit word of enemy movements.

This was not civil defense of the sort that became common later in the Cold War as Americans built their own bomb shelters. This was an extraordinary enlistment of civilians as intelligence operatives on U.S. soil. (...)

"Washtub" was known inside the government by several other codenames, including Corpuscle, Stigmatic and Catboat, according to an official Air Force history of the OSI, which called it one of OSI's "most extensive and long-running Cold War projects." The FBI had its own code word for the project: STAGE.

"Washtub" had two phases.

The first and more urgent was the stay-behind agent program. The second was a parallel effort to create a standby pool of civilian operatives in Alaska trained to clandestinely arrange for the evacuation of downed military air crews in danger of being captured by Soviet forces. This "evasion and escape" plan was coordinated with the CIA.

Among those listed as a stay-behind agent was Dyton Abb Gilliland of Cooper Landing, a community on the Kenai Peninsula south of Anchorage. A well-known bush pilot, Gilliland died in a plane crash on Montague Island in Prince William Sound in May 1955 at age 45. FBI records say he spent 12 days in Washington D.C., in June-July 1951 undergoing a range of specialized training, including in the use of parachutes.

The agents also got extensive training in coding and decoding messages, but this apparently did not always go well. Learning these techniques was "an almost impossible task for backwoodsmen to master in 15 hours of training," one document said. Details in the document were blacked out.

Many agent names in the OSI and FBI documents also were removed before being declassified.

None of the indigenous population was included. The program founders believed that agents from the "Eskimo, Indian and Aleut groups in the Territory should be avoided in view of their propensities to drink to excess and their fundamental indifference to constituted governments and political philosophies. It is pointed out that their prime concern is with survival and their allegiance would easily shift to any power in control."

Recruiters pitched patriotism and were to offer retainer fees of up to $3,000 a year (nearly $30,000 in 2014 dollars). That sum was to be doubled "after an invasion has commenced," according to one planning document. The records do not say how much was actually paid during the course of the program.

by Robert Burns, AP |  Read more:
Image: J. Edgar Hoover, AP

Foucault and Social Media: Life in a Virtual Panopticon


You start the day bleary-eyed and anxious. You stayed up late last night working on a post for your blog, gathering facts and memes from around the web and weaving them into an incisive whole. Has it produced a spike in the stats? You sign in on your iPhone as you brew the coffee. But it’s too early to slip into the professional headspace – you decide that you don’t want to know. Someone has messaged you on Facebook, so you check that instead. Japanese manga mashup! Killer breaks off the coast of Lombok. Lady Gaga is a man and we have photoshopped evidence to prove it! A friend will appreciate that one, so you share it with her directly. Perhaps not something that you’d want to share widely. Two new contact requests on LinkedIn. Your profile needs updating. Should you include details about the design work you completed for the local event the week before? You are not sure. You are building your profile as a graphic artist and looking for quality clients. Perhaps this is a part of your person that you will let incubate for a while longer.

You jump on HootSuite and start sharing targeted content: Facebook for friends, tweets for professional contacts. The day has barely started and already you are split into half a dozen pieces.

How did we ever get by without social media? In under a decade, free online services like Facebook, Twitter, and LinkedIn have utterly transformed how we work, play, and communicate. For hundreds of millions of people, sharing content across a range of social media services is a familiar part of life. Yet little is known about how social media is impacting us on a psychological level. A wealth of commentators are exploring how social media is refiguring forms of economic activity, reshaping our institutions, and transforming our social and organizational practices. We are still learning about how social media impacts on our sense of personal identity.

The French philosopher Michel Foucault (1926-1984) has a set of insights that can help clarify how social media affects us on a psychological level. Foucault died before the advent of the internet, yet his studies of social conditioning and identity formation in relation to power are applicable to life online. Seen from a Foucaultian perspective, social media is more than a vehicle for exchanging information. Social media is a vehicle for identity-formation. Social media involves ‘subjectivation’.

A Foucaultian perspective on social media targets the mechanism that makes it tick: sharing. Sharing is basic to social media. Sharing content is not just a neutral exchange of information, however. Mostly, when we share content on social media services, we do it transparently, visibly – that is, in the presence of a crowd. The act of sharing is a performance, to an extent – it is a performative act, an act that does something in the world, as J.L. Austin would say. This is important. The performative aspect of sharing shapes the logic and experience of the act itself.

There is a self-reflexive structure to sharing content on Facebook or Twitter. Just as actors on stage know that they are being watched by the audience and tailor their behaviour to find the best effect, effective use of social media implies selecting and framing content with a view to pleasing and/or impressing a certain crowd. We may not intend to do this but it is essential to doing it well. Unless we are sharing anonymously (and the radical end of internet culture, Anonymous, favours anonymity), all the content we share is tagged with an existential marker:

‘I sent this – it is part of my work. You shall know me by my works.’

Foucault understood how being made constantly visible impacts on us psychologically. Foucault was fascinated by Jeremy Bentham’s model of the ideal prison, the Panopticon, which has been incorporated in the architecture of prisons, schools, hospitals, workplaces, and urban spaces since Bentham designed it in the eighteenth century. In Bentham’s design, the Panopticon comprises a ring of cells surrounding a central guard tower. The prisoners in the cells are perpetually exposed to the gaze of the guards in the tower, yet since they cannot themselves see into the tower, they are never certain whether or not they are being watched.

Bentham’s Panopticon, Foucault argues, functions to make prisoners take responsibility for regulating their behaviour. Assuming that they care about the implications of bad behaviour, prisoners will act in the manner prescribed by the institution at all times on the chance that they are being watched. In time, as the sense of being watched gets under their skin, prisoners come to regulate their behaviour as if they were in a Panopticon at all times, even after they have been released from the institution.

This, Foucault claims, is ‘the major effect of the Panopticon: to induce in the inmate a state of conscious and permanent visibility that assures the automatic functioning of power’ (Foucault, Discipline and Punish, 201).

‘Conscious and permanent visibility’… Apparently this is what Mark Zuckerberg thinks social media is all about. By making our actions and shares visible to a crowd, social media exposes us to a kind of virtual Panopticon. This is not just because our activities are monitored and recorded by the social media service for the purposes of producing market analysis or generating targeted advertising. For the most part, we can and do ignore this kind of data harvesting. The surveillance that directly affects us and impacts on our behaviour comes from the people with whom we share.

There are no guards and no prisoners in Facebook’s virtual Panopticon. We are both guards and prisoners, watching and implicitly judging one another as we share content.

In sharing online, we are playing to a crowd.

by Tim Rayner, Philosophy For Change |  Read more:
Image: Michel Foucault, uncredited

Sunday, August 31, 2014


Julia Kater
via:

Packrafts in the Parks

The morning sun was just breaking over Bonanza Ridge when a group of wilderness racers gathered last summer in the decommissioned copper mining town of Kennicott in Alaska’s Wrangell Mountains. The iconic red-and-white industrial buildings from nearly 100 years ago, now a national historic monument and undergoing restoration by the National Park Service, glowed in the light of the new day as Monte Montepare, co-owner of Kennicott Wilderness Guides, faced the crowd. Everyone had gathered for a one-of-a-kind race, a dash on foot up Bonanza, then down the backside to the upper reaches of McCarthy Creek.

McCarthy Creek parallels the spine of Bonanza Ridge for several miles until it curves like a fishhook around the base of Sourdough Peak.

Upon reaching the creek, the racers would dump their packs, inflate the boats they carried with them, then shoot down 10 miles of rapids to Kennicott’s sister town of McCarthy. The finish line, not coincidentally, was right in front of the town’s only bar.

The boat carried by each of these racers is the 21st century incarnation of a design concept that’s been around for a couple decades now. Packrafts are lightweight (about 5 pounds), compact, and easily stuffed into a backpack. Essentially, they’re super-tough one-person rubber rafts, the diminutive cousins of the 16- and 20-footers used for more mainstream river trips. Small size is the secret of their advantage: A packraft gives the wilderness traveler the sort of amphibious capability that humans have longed for since the earliest days of our species. A hundred years ago, your only real option was to build a raft or hope you could find a canoe cached on your side of the river. Now, with a packraft, the backcountry trekker can go virtually anywhere, including a fast boogie up and over a mountain, then downstream through some substantial whitewater in time for beer-thirty.

Montepare welcomed all the racers and laid out the rules in front of the Kennicott Wilderness Guides main office; then head ranger Stephens Harper got up and delivered a safety and environmental briefing every bit as mandatory as the helmet and drysuit each racer was required to have. The first annual McCarthy Creek Packraft Race, which started as a way for Kennicott Wilderness to promote its guiding business and have some fun, had grown in importance from being just a bunch of whitewater bums looking for a thrill.

Pretty much by accident, the company and its owners, as well as the racers, found themselves front and center in a rancorous debate over land use, backcountry permitting and public lands policy taking place thousands of miles from the Wrangell Mountains. The jaundiced eyes of nonprofit conservation groups were watching.

Grand Canyon episode

Back in 2011, an erstwhile river warrior hiked down into the Grand Canyon with a packraft, blew it up, and shoved off into Hance, one of the longer and more difficult rapids. Within seconds he’d dumped his boat and was being sucked down into the gorge below while his girlfriend stood helplessly on the bank.

He made it out by the skin of his teeth, with the whole thing on tape, thanks to a GoPro cam. And, of course, what good is a near-drowning experience if you haven’t posted the video on YouTube? It didn’t take long for the National Park Service staff at the Grand Canyon to see it and decide, based on this one incident, that packrafters were a menace both to themselves and to public lands. The video was pulled after a few days but the damage was done.

This tale might seem familiar to readers here in Alaska, given the recent tragic death of Rob Kehrer while packrafting in Wrangell-St. Elias as part of the Alaska Wilderness Classic race. Earlier this month he launched his packraft into the treacherous Tana River and disappeared behind a wall of whitewater. His body was found on a gravel bar downstream.

One can understand how NPS managers might take a dim view of packrafters in their parks, given events such as these. But as with most thorny management issues, there is a lot more to the story than those few incidents that make the headlines.

by Kris Farmen, Alaska Dispatch |  Read more:
Image: Luc Mehl via:

What Your 1st-Grade Life Says About the Rest of It

In the beginning, when they knew just where to find everyone, they pulled the children out of their classrooms.

They sat in any quiet corner of the schools they could claim: the sociologists from Johns Hopkins and, one at a time, the excitable first-graders. Monica Jaundoo, whose parents never made it past the eighth grade. Danté Washington, a boy with a temper and a dad who drank too much. Ed Klein, who came from a poor white part of town where his mother sold cocaine.

They talked with the sociologists about teachers and report cards, about growing up to become rock stars or police officers. For many of the children, this was something that seldom happened in raucous classrooms or overwhelmed homes: a quiet, one-on-one conversation with an adult eager to hear just about them. “I have this special friend,” Jaundoo thought as a 6-year-old, “who’s only talking to me.”

Later, as the children grew and dispersed, some falling out of the school system and others leaving the city behind, the conversations took place in McDonald’s, in public libraries, in living rooms or lock-ups. The children — 790 of them, representative of the Baltimore public school system’s first-grade class in 1982 — grew harder to track as the patterns among them became clearer.

Over time, their lives were constrained — or cushioned — by the circumstances they were born into, by the employment and education prospects of their parents, by the addictions or job contacts that would become their economic inheritance. Johns Hopkins researchers Karl Alexander and Doris Entwisle watched as less than half of the group graduated high school on time. Before they turned 18, 40 percent of the black girls from low-income homes had given birth to their own babies. At the time of the final interviews, when the children were now adults of 28, more than 10 percent of the black men in the study were incarcerated. Twenty-six of the children, among those they could find at last count, were no longer living.

A mere 4 percent of the first-graders Alexander and Entwisle had classified as the “urban disadvantaged” had by the end of the study completed the college degree that’s become more valuable than ever in the modern economy. A related reality: Just 33 of 314 had left the low-income socioeconomic status of their parents for the middle class by age 28.

Today, the “kids” — as Alexander still calls them — are 37 or 38. Alexander, now 68, retired from Johns Hopkins this summer just as the final, encompassing book from the 25-year study was published. Entwisle, then 89, died of lung cancer last November shortly after the final revisions on the book. Its sober title, “The Long Shadow,” names the thread running through all those numbers and conversations: The families and neighborhoods these children were born into cast a heavy influence over the rest of their lives, from how they fared in the first grade to what they became as grownups.

Some of them — children largely from the middle-class and blue-collar white families still in Baltimore’s public school system in 1982 — grew up to managerial jobs and marriages and their own stable homes. But where success occurred, it was often passed down, through family resources or networks simply out of reach of most of the disadvantaged.

Collectively, the study of their lives, and the outliers among them, tells an unusually detailed story — both empirical and intimate — of the forces that surround and steer children growing up in a post-industrial city like Baltimore.

“The kids they followed grew up in the worst era for big cities in the U.S. at any point in our history,” says Patrick Sharkey, a sociologist at New York University familiar with the research. Their childhood spanned the crack epidemic, the decline of urban industry, the waning national interest in inner cities and the war on poverty.

In that sense, this study is also about Baltimore itself — how it appeared to researchers and their subjects, to children and the adults they would later become.

by Emily Badger, Washington Post |  Read more:
Image: Linda Davidson/The Washington Post

The Dawn of the Post-Clinic Abortion

In June 2001, under a cloud-streaked sky, Rebecca Gomperts set out from the Dutch port of Scheveningen in a rented 110-foot ship bound for Ireland. Lashed to the deck was a shipping container, freshly painted light blue and stocked with packets of mifepristone (which used to be called RU-486) and misoprostol. The pills are given to women in the first trimester to induce a miscarriage. Medical abortion, as this procedure is called, had recently become available in the Netherlands. But use of misoprostol and mifepristone to end a pregnancy was illegal in Ireland, where abortion by any means remains against the law, with few exceptions.

Gomperts is a general-practice physician and activist. She first assisted with an abortion 20 years ago on a trip to Guinea, just before she finished medical school in Amsterdam. Three years later, Gomperts went to work as a ship’s doctor on a Greenpeace vessel. Landing in Mexico, she met a girl who was raising her younger siblings because her mother had died during a botched illegal abortion. When the ship traveled to Costa Rica and Panama, women told her about hardships they suffered because they didn’t have access to the procedure. “It was not part of my medical training to talk about illegal abortion and the public-health impact it has,” Gomperts told me this summer. “In those intense discussions with women, it really hit me.”

When she returned to the Netherlands, Gomperts decided she wanted to figure out how to help women like the ones she had met. She did some legal and medical research and concluded that in a Dutch-registered ship governed by Dutch law, she could sail into the harbor of a country where abortion is illegal, take women on board, bring them into international waters, give them the pills at sea and send them home to miscarry. Calling the effort Women on Waves, she chose Dublin as her first destination.

Ten women each gave Gomperts 10,000 Dutch guilders (about $5,500), part of the money needed to rent a boat and pay for a crew. But to comply with Dutch law, she also had to build a mobile abortion clinic. Tapping contacts she made a decade earlier, when she attended art school at night while studying medicine, she got in touch with Joep van Lieshout, a well-known Dutch artist, and persuaded him to design the clinic. They applied for funds from the national arts council and built it together inside the shipping container. When the transport ministry threatened to revoke the ship’s authorization because of the container on deck, van Lieshout faxed them a certificate decreeing the clinic a functional work of art, titled “a-portable.” The ship was allowed to sail, and van Lieshout later showed a mock-up of the clinic at the Venice Biennale.

As the boat sailed toward Dublin, Gomperts and her shipmates readied their store of pills and fielded calls from the press and emails from hundreds of Irish women seeking appointments. The onslaught of interest took them by surprise. So did a controversy that was starting to brew back home. Conservative politicians in the Netherlands denounced Gomperts for potentially breaking a law that required a special license for any doctor to provide an abortion after six and a half weeks of pregnancy. Gomperts had applied for it a few months earlier and received no reply. She set sail anyway, planning to perform abortions only up to six and a half weeks if the license did not come through.

When Gomperts’s ship docked in Dublin, she still didn’t have the license. Irish women’s groups were divided over what to do. Gomperts decided she couldn’t go ahead without their united support and told a group of reporters and protesters that she wouldn’t be able to give out a single pill. “This is just the first of many trips that we plan to make,” she said from the shore, wrapped in a blanket, a scene that is captured in “Vessel,” a documentary about her work that will be released this winter. Gomperts was accused of misleading women. A headline in The Telegraph in London read: “Abortion Boat Admits Dublin Voyage Was a Publicity Sham.”

Gomperts set sail again two years later, this time resolving to perform abortions only up to six and a half weeks. She went to Poland first and to Portugal in 2004. The Portuguese minister of defense sent two warships to stop the boat, then just 12 miles offshore, from entering national waters. No local boat could be found to ferry out the women who were waiting onshore. “In the beginning we were very pissed off, thinking the campaign was failing because the ship couldn’t get in,” one Portuguese activist says in “Vessel.” “But at a certain point, we realized that was the best thing that could ever happen. Because we had media coverage from everywhere.”

Without consulting her local allies, Gomperts changed strategy. She appeared on a Portuguese talk show, held up a pack of pills on-screen and explained exactly how women could induce an abortion at home — specifying the number of pills they needed to take, at intervals, and warning that they might feel pain. A Portuguese anti-abortion campaigner who was also on the show challenged the ship’s operation on legal grounds. “Excuse me,” Gomperts said. “I really think you should not talk about things that you don’t know anything about, O.K. . . . I know what I can do within the law.” Looking directly at him, she added, “Concerning pregnancy, you’re a man, you can walk away when your girlfriend is pregnant. I’m pregnant now, and I had an abortion when I was — a long time ago. And I’m very happy that I have the choice to continue my pregnancy how I want, and that I had the choice to end it when I needed it.” She pointed at the man. “You have never given birth, so you don’t know what it means to do that.”

Two and a half years later, Portugal legalized abortion. As word of Gomperts’s TV appearance spread, activists in other countries saw it as a breakthrough. Gomperts had communicated directly to women what was still, in many places, a well-kept secret: There were pills on the market with the power to end a pregnancy. Emails from women all over the world poured into Women on Waves, asking about the medication and how to get it. Gomperts wanted to help women “give themselves permission” to take the pills, as she puts it, with as little involvement by the government, or the medical profession, as possible. She realized that there was an easier way to do this than showing up in a port. She didn’t need a ship. She just needed the Internet.

by Emily Bazelon, NY Times |  Read more:
Image: Linda Nylind via:

Does It Help to Know History?

About a year ago, I wrote about some attempts to explain why anyone would, or ought to, study English in college. The point, I thought, was not that studying English gives anyone some practical advantage over non-English majors, but that it enables us to enter, as equals, into a long-existing, ongoing conversation. It isn’t productive in a tangible sense; it’s productive in a human sense. The action, whether rewarded or not, really is its own reward. The activity is the answer.

It might be worth asking similar questions about the value of studying, or at least, reading, history these days, since it is a subject that comes to mind many mornings on the op-ed page. Every writer, of every political flavor, has some neat historical analogy, or mini-lesson, with which to preface an argument for why we ought to bomb these guys or side with those guys against the guys we were bombing before. But the best argument for reading history is not that it will show us the right thing to do in one case or the other, but rather that it will show us why even doing the right thing rarely works out. The advantage of having a historical sense is not that it will lead you to some quarry of instructions, the way that Superman can regularly return to the Fortress of Solitude to get instructions from his dad, but that it will teach you that no such crystal cave exists. What history generally “teaches” is how hard it is for anyone to control it, including the people who think they’re making it.

Roger Cohen, for instance, wrote on Wednesday about all the mistakes that the United States is supposed to have made in the Middle East over the past decade, with the implicit notion that there are two histories: one recent, in which everything that the United States has done has been ill-timed and disastrous; and then some other, superior, alternate history, in which imperial Western powers sagaciously, indeed, surgically, intervened in the region, wisely picking the right sides and thoughtful leaders, promoting militants without aiding fanaticism, and generally aiding the cause of peace and prosperity. This never happened. As the Libyan intervention demonstrates, the best will in the world—and, seemingly, the best candidates for our support—can’t cure broken polities quickly. What “history” shows is that the same forces that led to the Mahdi’s rebellion in Sudan more than a century ago—rage at the presence of a colonial master; a mad turn towards an imaginary past as a means to equal the score—keep coming back and remain just as resistant to management, close up or at a distance, as they did before. ISIS is a horrible group doing horrible things, and there are many factors behind its rise. But they came to be a threat and a power less because of all we didn’t do than because of certain things we did do—foremost among them that massive, forward intervention, the Iraq War. (The historical question to which ISIS is the answer is: What could possibly be worse than Saddam Hussein?)

Another, domestic example of historical blindness is the current cult of the political hypersagacity of Lyndon B. Johnson. L.B.J. was indeed a ruthless political operator and, when he had big majorities, got big bills passed—the Civil Rights Act, for one. He also engineered, and masterfully bullied through Congress, the Vietnam War, a moral and strategic catastrophe that ripped the United States apart and, more important, visited a kind of hell on the Vietnamese. It also led American soldiers to commit war crimes, almost all left unpunished, of a kind that it still shrivels the heart to read about. Johnson did many good things, but to use him as a positive counterexample of leadership to Barack Obama or anyone else is marginally insane.

Johnson’s tragedy was critically tied to the cult of action, of being tough and not just sitting there and watching. But not doing things too disastrously is not some minimal achievement; it is a maximal achievement, rarely managed. Studying history doesn’t argue for nothing-ism, but it makes a very good case for minimalism: for doing the least violent thing possible that might help prevent more violence from happening.

The real sin that the absence of a historical sense encourages is presentism, in the sense of exaggerating our present problems out of all proportion to those that have previously existed. It lies in believing that things are much worse than they have ever been—and, thus, than they really are—or are uniquely threatening rather than familiarly difficult. Every episode becomes an epidemic, every image is turned into a permanent injury, and each crisis is a historical crisis in need of urgent aggressive handling—even if all experience shows that aggressive handling of such situations has, in the past, quite often made things worse. (The history of medicine is that no matter how many interventions are badly made, the experts who intervene make more: the sixteenth-century doctors who bled and cupped their patients and watched them die just bled and cupped others more.) What history actually shows is that nothing works out as planned, and that everything has unintentional consequences. History doesn’t show that we should never go to war—sometimes there’s no better alternative. But it does show that the results are entirely uncontrollable, and that we are far more likely to be made by history than to make it. History is past, and singular, and the same year never comes round twice.

by Adam Gopnik, New Yorker |  Read more:
Image: Nathan Huang

Saturday, August 30, 2014


Joan Miró (Spanish, 1893-1983), The Conductor [Le Chef d’orchestre], 1976
via: