Thursday, October 31, 2013

Banksy, October 2013.

Gueorgui Pinkhassov, India. Rajasthan. Jaisalmer. 1995

Food Fighter

Kyoto, Japan—The meal begins simply, almost religiously: a bowl of rice, a plate of pickles, a pot of green tea. Pour the tea over the rice and take a sip, then pinch a half moon of daikon between your chopsticks. Later comes a plate of tofu scraps dressed with green onions and dried fish, a seaweed salad, and a small bowl of miso soup.

This is obanzai, Japanese home-style cooking, but the cook is no ordinary homemaker: Setsuko Sugimoto is the matriarch of one of the oldest families in Kyoto, a city where everyone knows exactly how far your family goes back. Her home is older than the United States and protected by the Kyoto government. Tonight’s dinner stretches back to the Edo period, and to prove it she drops before me a telephone book–thick copy of the original recipes her family has preserved for 10 generations. “These are the traditions that we are starting to lose,” she tells me.

Not more than a few blocks from Sugimoto’s centuries-old home is a thicket of unwelcome invaders: Starbucks slinging monster soy lattes, a pizza delivery chain prepping seafood pies, a rainbow array of 24-hour convenience stores, portals of warmed-over carbohydrates and general gastronomic mischief. It’s a familiar tale: waves of brutish Western culture crashing on the shores of foreign countries and encroaching upon their long-held traditions. But the phenomenon is all the more striking here in Kyoto, in the heart of one of the world’s richest culinary cultures, with cooking traditions that stretch back millennia and more Michelin stars per capita than any other city in the world.

But Kyoto and the rest of Japan are not prepared to see their food yield to the homogenizing forces of the modern world. Sugimoto is a part of a formidable coalition of government officials, nonprofit organizations, scholars, and food luminaries who have been working for two years on a proposal to include washoku—the traditional dining cultures of Japan—on the United Nations Educational, Scientific, and Cultural Organization’s list of “intangible world heritages.” On Thursday, they received word that their bid had advanced to the final stage, making Japanese cuisine all but certain to win this prized UNESCO designation in early December. It may seem a benign marker, but the UNESCO program is itself not without controversy. Moreover, it raises the question: Can a U.N. body’s imprimatur do anything to protect something as intangible as a style of cooking?

by Matt Goulding, Slate | Read more:
Image: Matt Goulding

Big Mother is Watching You

The other day, my eleven-year-old son handed me my iPhone with an accusatory air, as if to say: So this is what you people do behind our backs. While he was looking at stocks, he came across a news item reporting that AT&T, with another company, was about to introduce a snap-around-the-wrist, GPS-tracking, emergency-button-featuring, watch-like thingie for children. It’s called FiLIP, comes in bright colors, and has two-way calling and parent-to-child texting. It allows you to set safe zones, so that you’re alerted when your child enters or leaves a designated area.

A little stunned, I checked it out online. FiLIP, I found, is far from the first such gizmo; this one just has more bells and whistles than most. “The world used to be a little simpler,” went its mom-and-apple-pie pitch. “Kids ran free and returned at dinnertime, and parents didn’t worry so much. But today, parents are under more pressure than ever. ... FiLIP has a simple mission—to help kids be kids again, while giving parents an amazing new window into their children’s lives.” Right. And the Invisible Fence collar on my late lamented cairn terrier let my dog be a dog.

All parents have to let their children off the leash eventually—to let them go out unsupervised, to grant them free-ish range on the Internet. That moment always comes before you’re ready for it. For me, it came after a ninth birthday, when we hooked up a Nintendo Wii, then discovered, months later, that it could be used to roam the Internet. Another point was reached toward the end of elementary school, when my children announced that they were the very last kids in their class to get a smartphone. I stalled. Then my son showed me the FiLIP ad, and I discovered a universe of options.

For the iPhone I will soon be buying him, I can get an iPhone Spy Stick, to be plugged into a USB port while he sleeps; it downloads Web histories, e-mails, and text messages, even the deleted ones. Or I can get Mobile Spy, software that would let me follow, in real time, his online activity and geographical location. Also available are an innocent-looking iPhone Dock Camera that would recharge his battery while surreptitiously recording video in his room, and a voice-activated audio monitor, presumably for the wild parties he’s going to throw when his father and I go out of town.

by Judith Shulevitz, TNR |  Read more:
Image: uncredited

Wednesday, October 30, 2013

Harry Callahan, Untitled (Atlanta), 1984

NSA Broke Into Yahoo, Google Data Centers

[ed. Out. Of. Control. The NSA's revealed data gathering activities are the biggest threat to democracy I think I've ever experienced. Not to mention business - with tech companies pushing consumers and corporations to migrate to the "cloud" where everything now mostly resides on NSA servers. Good luck with that. See also: Europe Erupts and NSA Chief wants Media Stopped; and Angry Over U.S. Surveillance, Tech Giants Bolster Defenses]

The National Security Agency has secretly broken into the main communications links that connect Yahoo and Google data centers around the world, The Washington Post reported Wednesday, citing documents obtained from former NSA contractor Edward Snowden.

A secret accounting dated Jan. 9, 2013, indicates that NSA sends millions of records every day from Yahoo and Google internal networks to data warehouses at the agency's Fort Meade, Md., headquarters. In the last 30 days, field collectors had processed and sent back more than 180 million new records — ranging from "metadata," which would indicate who sent or received emails and when, to content such as text, audio and video, the Post reported Wednesday on its website.

The latest revelations were met with outrage from Google, and triggered legal questions, including whether the NSA may be violating federal wiretap laws.

"Although there's a diminished standard of legal protection for interception that occurs overseas, the fact that it was directed apparently to Google's cloud and Yahoo's cloud, and that there was no legal order as best we can tell to permit the interception, there is a good argument to make that the NSA has engaged in unlawful surveillance," said Marc Rotenberg, executive director of the Electronic Privacy Information Center. "Clouds" here refers to the sites where the companies store data.

The new details about the NSA's access to Yahoo and Google data centers around the world come at a time when Congress is reconsidering the government's collection practices and authority, and as European governments are responding angrily to revelations that the NSA collected data on millions of communications in their countries. Details about the government's programs have been trickling out since Snowden shared documents with the Post and Guardian newspapers in June.

The NSA's principal tool to exploit the Google and Yahoo data links is a project called MUSCULAR, operated jointly with the agency's British counterpart, GCHQ. The Post said NSA and GCHQ are copying entire data flows across fiber-optic cables that carry information between the data centers of the Silicon Valley giants.

by Lolita C. Baldor, Yahoo News |  Read more:
Image: AP/Google

Helmut Newton, Karen Mulder & Yves Saint Laurent

Bertien van Manen, A Hundred Summers a Hundred Winters, Tomsk, Russia railway station, 1992

Slaves of the Internet, Unite!

Not long ago, I received, in a single week, three (3) invitations to write an original piece for publication or give a prepared speech in exchange for no ($0.00) money. As with stinkbugs, it’s not any one instance of this request but their sheer number and relentlessness that make them so tiresome. It also makes composing a polite response a heroic exercise in restraint.

People who would consider it a bizarre breach of conduct to expect anyone to give them a haircut or a can of soda at no cost will ask you, with a straight face and a clear conscience, whether you wouldn’t be willing to write an essay or draw an illustration for them for nothing. They often start by telling you how much they admire your work, although not enough, evidently, to pay one cent for it. “Unfortunately we don’t have the budget to offer compensation to our contributors...” is how the pertinent line usually starts. But just as often, they simply omit any mention of payment.

A familiar figure in one’s 20s is the club owner or event promoter who explains to your band that they won’t be paying you in money, man, because you’re getting paid in the far more valuable currency of exposure. This same figure reappears over the years, like the devil, in different guises — with shorter hair, a better suit — as the editor of a Web site or magazine, dismissing the issue of payment as an irrelevant quibble and impressing upon you how many hits they get per day, how many eyeballs, what great exposure it’ll offer. “Artist Dies of Exposure” goes the rueful joke.

In fairness, most of the people who ask me to write things for free, with the exception of Arianna Huffington, aren’t the Man; they’re editors of struggling magazines or sites, or school administrators who are probably telling me the truth about their budgets. The economy is still largely in ruins, thanks to the people who “drive the economy” by doing imaginary things on Wall Street, and there just isn’t much money left to spare for people who do actual things anymore.

This is partly a side effect of our information economy, in which “paying for things” is a quaint, discredited old 20th-century custom, like calling people after having sex with them. The first time I ever heard the word “content” used in its current context, I understood that all my artist friends and I — henceforth, “content providers” — were essentially extinct. This contemptuous coinage is predicated on the assumption that it’s the delivery system that matters, relegating what used to be called “art” — writing, music, film, photography, illustration — to the status of filler, stuff to stick between banner ads.

Just as the atom bomb was the weapon that was supposed to render war obsolete, the Internet seems like capitalism’s ultimate feat of self-destructive genius, an economic doomsday device rendering it impossible for anyone to ever make a profit off anything again. It’s especially hopeless for those whose work is easily digitized and accessed free of charge. I now contribute to some of the most prestigious online publications in the English-speaking world, for which I am paid the same amount as, if not less than, I was paid by my local alternative weekly when I sold my first piece of writing for print in 1989. More recently, I had the essay equivalent of a hit single — endlessly linked to, forwarded and reposted. A friend of mine joked, wistfully, “If you had a dime for every time someone posted that ...” Calculating the theoretical sum of those dimes, it didn’t seem all that funny.

I’ve been trying to understand the mentality that leads people who wouldn’t ask a stranger to give them a keychain or a Twizzler to ask me to write them a thousand words for nothing. I have to admit my empathetic imagination is failing me here. I suppose people who aren’t artists assume that being one must be fun since, after all, we do choose to do it despite the fact that no one pays us. They figure we must be flattered to have someone ask us to do our little thing we already do.

I will freely admit that writing beats baling hay or going door-to-door for a living, but it’s still shockingly unenjoyable work. I spent 20 years and wrote thousands of pages learning the trivial craft of putting sentences together. My parents blew tens of thousands of 1980s dollars on tuition at a prestigious institution to train me for this job. They also put my sister the pulmonologist through medical school, and as far as I know nobody ever asks her to perform a quick lobectomy — doesn’t have to be anything fancy, maybe just in her spare time, whatever she can do would be great — because it’ll help get her name out there.

by Tim Kreider, NY Times |  Read more:
Image: Post Typography

Anchovies Elevate a Pan-Seared Chicken Dish

There’s nothing wrong with a dinner of pan-seared chicken seasoned with salt and pepper. But there’s everything right about the same chicken when you add anchovies, capers, garlic and plenty of lemon to the pan.

What was once timid and a little dull turns vibrant, tangy and impossible to stop eating. And the only real extra work is chopping the garlic and a little parsley for garnish.

In this dish, the cut of chicken is less important than the pungent pan sauce. Most people will probably want to use the workhorse of all poultry dinners, the boneless, skinless breasts. But being a dark-meat lover, I prefer the thighs. They cook nearly as quickly, and have a greater margin of error in terms of doneness. Overcook your breasts by even a minute, and you’ll get dry, tough meat. Thighs are more forgiving. However, if your family insists on white meat, you can substitute breasts and subtract about 3 minutes from the cooking time.

Although you can make this dish entirely on the stove top, I take a cue from chefs and finish it in the oven. It cooks more evenly there, with less monitoring. This frees you up to toss a salad and slice up a crusty loaf of bread for mopping up the juices. If you love anchovies and garlic, you won’t want to leave even a drop behind.

Recipe: Garlicky Chicken With Lemon-Anchovy Sauce

by Melissa Clark, NY Times |  Read more:
Image: via

Monday, October 28, 2013

A Curmudgeon’s Guide to Praise

Occasionally I meet someone else who’s weathered the extravagant praise of a parent, like the writer who published six books of poetry but whose dad brags that she’s won a Guggenheim. That would certainly be miraculous and worth bragging about, because she’s never applied. This kind of praise misconstrues facts “accidentally on purpose,” and a parent has plausible deniability (“I must have misheard you”) until a pattern sets in; though, from their point of view, they really don’t know what the problem is. If praise masquerades as an objective assessment that really conveys the subjective feeling of their love — that boundless, intoxicating feeling — why not sacrifice a few details? Heck, why not sacrifice them all?

A sense of truth — the difference between potential and accomplishment, dream and fact — can be the first casualty in families like this. Growing up I felt like a private eye tracking down unseemly rumors about me, adapting a private eye’s worldview along the way: dark, noirish and suspicious of anyone who claimed to “know” what “happened” about “anything.” I was desperate for praise but paranoid about its ability to manipulate. If you paid me a compliment I’d love you for a second, then squint suspiciously, spit out an imaginary cigar and ask “What do you want, anyway?”

As my girlfriend and I became serious, she spent time with my mom and heard some of the extravagant stories of my childhood. I was concerned that if she rolled her eyes any harder, she might injure them. Was I really so verbal and wise — and jaded — at age two, after my father died, that I offered my mom irrefutable proof that Santa was a fake? Did I really teach myself to read music at age five? Become a concert-grade pianist without practicing? My girlfriend called bullshit on the highlight reel of my childhood. I didn’t believe half this stuff either, but I believed some of it. That is, I wanted the option in my early 20s of believing it some nights, to lie in bed after a lame day as a secretary and revel in my secret specialness that had not yet been recognized by the world. Doubting the myths of your childhood can be more destabilizing than reading you’re a finalist for a prize you didn’t apply for. Our parents are the historians of our lives. We want to think we have a coherent identity that stretches back to birth, and for the first time I stared back and saw what looked an awful lot like nothing.

Careless praise and exaggerated stories are just a quirky part of growing up in some families, but they can be damaging when they amplify the weird romance kids already have about themselves — the secret identities and magical powers they know they have that adults around them can’t see. I can’t walk five minutes in Prospect Park without seeing some girl spin around, sprinkle herself with magic pixie dust and chant I am the princess, I am the princess, I am the princess. I needed no prompting myself to think I was the next Ron Guidry. I had a natural slider that was unhittable — and uncontrollable — and stayed up each night before I pitched imagining how I’d record 18 straight strikeouts, end the game by myself, and set a record. Onlookers would gasp as they saw a small thing of perfection in a fallen world, or the evening would be a tough slog.

You can guess which one it was.

Our first task as children is to dream ourselves into being and, oddly enough, we pretend to be other people to do that, as if life were a drama filled with larger-than-life characters first glimpsed from afar. We try on identities, just as I assigned one to my dad and was given one by my mom, until we find one that fits, but then face the task of insinuating ourselves into a world that’s indifferent or hostile to us or just more complicated than we imagined. That, of course, is the day childhood starts to end, when we may start to suspect that the praise echoing in our ears sounds an awful lot like lying.

I had no idea there might be a cultural component to all this, that the fantasies families weave around their children might have changed over time, until I was expecting a son myself and read some parenting books. Apparently I grew up at the height of the self-esteem movement, and well-intentioned but excessive praise has harmed a number of children. That, at least, is the claim of Po Bronson and Ashley Merryman in NurtureShock. In the first chapter they trace the self-esteem movement back to Nathaniel Branden, who asserted in 1969 that self-esteem is our most important attribute. California even created a self-esteem task force, thinking if it could raise the self-esteem of its citizens, they’d “do everything from lower dependence on welfare to decrease teenage pregnancy.” Soon we were handing out medals to every participant at sporting events and not keeping score. Competition is too competitive — our children’s egos are at stake.

But can self-esteem be pinned on someone like a ribbon? Is it an effect or a cause of competence? Does telling children to feel good about themselves actually make them feel good about themselves? The authors, citing research from Carol Dweck, suggest that constant, overly general praise can have the opposite of the intended effect. Their parenting advice, to treat the mind like a muscle that needs exercise, echoes Seneca, our oldest self-help guru, who insisted that the mind be exercised “day and night.” (All of these writers value effort above all; for Seneca, it is the one way we surpass even the gods.) Constant praise makes children both fragile and risk averse. Since they aren’t praised for their effort, the one thing they control, children take failure as a sign that they were never intelligent to begin with. I got a sense of how far off the deep end we’d gone when an NYU psychiatry professor was trotted out late in the chapter to stress what should have been obvious. Praise, she says, “has to be based on a real thing.”

Well, what has praise been based on? (...)

For the parents interviewed in NurtureShock, praise is a practical attempt to bolster their children’s confidence before they go out to battle the world, so any amount of affirmation can be justified as long as it works. You might call this the Bluster Theory of Praise. At first it doesn’t sound unreasonable. We all know people whose self-promotion allows them to achieve what their capabilities wouldn’t on their own. The downside — in addition to having irritating children — is that their confidence won’t be based on something that should inspire confidence. They get their way not through their ability but through their aggression and can mistake one thing for the other. It can also start an arms race in which every child is inevitably considered a genius.

by Christopher Wall, LA Review of Books |  Read more:
Image: uncredited


[ed... and other quotes from This Side of Paradise.]

Harry Gruyaert, Moscow, 1989.

Andrea Kowch, Dream Chaser


As the oldest university in the English-speaking world, Oxford is a strange choice to host a futuristic think tank, a salon where the concepts of science fiction are debated in earnest. The Future of Humanity Institute seems like a better fit for Silicon Valley or Shanghai. During the week that I spent with Nick Bostrom, he and I walked most of Oxford’s small cobblestone grid. On foot, the city unfolds as a blur of yellow sandstone, topped by grey skies and gothic spires, some of which have stood for nearly 1,000 years. There are occasional splashes of green, open gates that peek into lush courtyards, but otherwise the aesthetic is gloomy and ancient. When I asked Bostrom about Oxford’s unique ambience, he shrugged, as though habit had inured him to it. But he did once tell me that the city’s gloom is perfect for thinking dark thoughts over hot tea.

There are good reasons for any species to think darkly of its own extinction. Ninety-nine percent of the species that have lived on Earth have gone extinct, including more than five tool-using hominids. A quick glance at the fossil record could frighten you into thinking that Earth is growing more dangerous with time. If you carve the planet's history into nine ages, each spanning five hundred million years, only in the ninth do you find mass extinctions, events that kill off more than two thirds of all species. But this is deceptive. Earth has always had her hazards; it's just that for us to see them, she had to fill her fossil beds with variety, so that we could detect discontinuities across time. The tree of life had to fill out before it could be pruned. (...)

Bostrom isn’t too concerned about extinction risks from nature. Not even cosmic risks worry him much, which is surprising, because our starry universe is a dangerous place. Every 50 years or so, one of the Milky Way’s stars explodes into a supernova, its detonation the latest gong note in the drumbeat of deep time. If one of our local stars were to go supernova, it could irradiate Earth, or blow away its thin, life-sustaining atmosphere. Worse still, a passerby star could swing too close to the Sun, and slingshot its planets into frigid, interstellar space. Lucky for us, the Sun is well-placed to avoid these catastrophes. Its orbit threads through the sparse galactic suburbs, far from the dense core of the Milky Way, where the air is thick with the shrapnel of exploding stars. None of our neighbours look likely to blow before the Sun swallows Earth in four billion years. And, so far as we can tell, no planet-stripping stars lie in our orbital path. Our solar system sits in an enviable bubble of space and time.

But as the dinosaurs discovered, our solar system has its own dangers, like the giant space rocks that spin all around it, splitting off moons and scarring surfaces with craters. In her youth, Earth suffered a series of brutal bombardments and celestial collisions, but she is safer now. There are far fewer asteroids flying through her orbit than in epochs past. And she has sprouted a radical new form of planetary protection, a species of night watchmen that track asteroids with telescopes.

‘If we detect a large object that’s on a collision course with Earth, we would likely launch an all-out Manhattan project to deflect it,’ Bostrom told me. Nuclear weapons were once our asteroid-deflecting technology of choice, but not anymore. A nuclear detonation might scatter an asteroid into a radioactive rain of gravel, a shotgun blast headed straight for Earth. Fortunately, there are other ideas afoot. Some would orbit dangerous asteroids with small satellites, in order to drag them into friendlier trajectories. Others would paint asteroids white, so the Sun’s photons bounce off them more forcefully, subtly pushing them off course. Who knows what clever tricks of celestial mechanics would emerge if Earth were truly in peril. (...)

The risks that keep Bostrom up at night are those for which there are no geological case studies, and no human track record of survival. These risks arise from human technology, a force capable of introducing entirely new phenomena into the world.

Nuclear weapons were the first technology to threaten us with extinction, but they will not be the last, nor even the most dangerous. A species-destroying exchange of fissile weapons looks less likely now that the Cold War has ended, and arsenals have shrunk. There are still tens of thousands of nukes, enough to incinerate all of Earth’s dense population centers, but not enough to target every human being. The only way nuclear war will wipe out humanity is by triggering nuclear winter, a crop-killing climate shift that occurs when smoldering cities send Sun-blocking soot into the stratosphere. But it’s not clear that nuke-levelled cities would burn long or strong enough to lift soot that high. The Kuwait oil field fires blazed for ten months straight, roaring through 6 million barrels of oil a day, but little smoke reached the stratosphere. A global nuclear war would likely leave some decimated version of humanity in its wake; perhaps one with deeply rooted cultural taboos concerning war and weaponry.

Such taboos would be useful, for there is another, more ancient technology of war that menaces humanity. Humans have a long history of using biology’s deadlier innovations for ill ends; we have proved especially adept at the weaponisation of microbes. In antiquity, we sent plagues into cities by catapulting corpses over fortified walls. Now we have more cunning Trojan horses. We have even stashed smallpox in blankets, disguising disease as a gift of good will. Still, these are crude techniques, primitive attempts to loose lethal organisms on our fellow man. In 1993, the death cult that gassed Tokyo’s subways flew to the African rainforest in order to acquire the Ebola virus, a tool it hoped to use to usher in Armageddon. In the future, even small, unsophisticated groups will be able to enhance pathogens, or invent them wholesale. Even something like corporate sabotage could generate catastrophes that unfold in unpredictable ways. Imagine an Australian logging company sending synthetic bacteria into Brazil’s forests to gain an edge in the global timber market. The bacteria might mutate into a dominant strain, a strain that could ruin Earth’s entire soil ecology in a single stroke, forcing 7 billion humans to the oceans for food.

These risks are easy to imagine. We can make them out on the horizon, because they stem from foreseeable extensions of current technology. But surely other, more mysterious risks await us in the epochs to come. After all, no 18th-century prognosticator could have imagined nuclear doomsday. Bostrom’s basic intellectual project is to reach into the epistemological fog of the future, to feel around for potential threats. It’s a project that is going to be with us for a long time, until — if — we reach technological maturity, by inventing and surviving all existentially dangerous technologies.

There is one such technology that Bostrom has been thinking about a lot lately. Early last year, he began assembling notes for a new book, a survey of near-term existential risks. After a few months of writing, he noticed one chapter had grown large enough to become its own book. ‘I had a chunk of the manuscript in early draft form, and it had this chapter on risks arising from research into artificial intelligence,’ he told me. ‘As time went on, that chapter grew, so I lifted it over into a different document and began there instead.’

by Ross Andersen, Aeon |  Read more:
Image: Andy Sansom

Book of Lamentations

The best dystopian literature, or at least the most effective, manages to show us a hideous and contorted future while resisting the temptation to point fingers and invent villains. This is one of the major flaws in George Orwell’s 1984: When O’Brien laughingly expounds on his vision of “a boot stamping on a human face – forever” he starts to acquire the ludicrousness of a Bond villain; he may as well be a cartoon – one of the Kamp Krusty counsellors in The Simpsons, raising a glass “to Evil.” Orwell’s satire of Stalinism and Margaret Atwood’s of the religious right in The Handmaid’s Tale tend to let our present world off the hook a little by comparison. More subtle works, like Huxley’s Brave New World, are far more effective. His Controller, when interrogated, doesn’t burst out in maniacal laughter and start twiddling his moustache. He explains, in quite reasonable terms, why the dystopia he lives in is the best way to ensure the happiness of all – and he means it. Everything’s broken, but it’s not anyone’s fault; it’s terrifying because it’s so familiar. (...)

The opening passages of DSM-5 give us a long history of the purported previous editions of the book and the endless revisions and fine-tunings that have gone into the work. This mad project is clearly something that its authors are fixated on to a somewhat unreasonable extent. In a retrospectively predictable ironic twist, this precise tendency is outlined in the book itself. The entry for obsessive-compulsive disorder with poor insight describes this taxonomical obsession in deadpan tones: “repetitive behavior, the goal of which is […] to prevent some dreaded event or situation.” Our narrator seems to believe that by compiling an exhaustive list of everything that might go askew in the human mind, this wrong state might somehow be overcome or averted. References to compulsive behavior throughout the book repeatedly refer to the “fear of dirt in someone with an obsession about contamination.” The tragic clincher comes when we’re told, “the individual does not recognize that the obsessions or compulsions are excessive or unreasonable.” This mad project is so overwhelming that its originator can’t even tell that they’ve subsumed themselves within its matrix. We’re dealing with a truly unreliable narrator here, not one that misleads us about the course of events (the narrator is compulsive, they do have poor insight), but one whose entire conceptual framework is radically off-kilter. As such, the entire story is a portrait of the narrator’s own particular madness. With this realization, DSM-5 starts to enter the realm of the properly dystopian. (...)

The idea emerges that every person’s illness is somehow their own fault, that it comes from nowhere but themselves: their genes, their addictions, and their inherent human insufficiency. We enter a strange shadow-world where for someone to engage in prostitution isn’t the result of intersecting environmental factors (gender relations, economic class, family and social relationships) but a symptom of “conduct disorder,” along with “lying, truancy, [and] running away.” A mad person is like a faulty machine. The pseudo-objective gaze only sees what they do, rather than what they think or how they feel. A person who shits on the kitchen floor because it gives them erotic pleasure and a person who shits on the kitchen floor to ward off the demons living in the cupboard are both shunted into the diagnostic category of encopresis. It’s not just that their thought-processes don’t matter, it’s as if they don’t exist. The human being is a web of flesh spun over a void. (...)

The word “disorder” occurs so many times that it almost detaches itself from any real signification, so that the implied existence of an ordered state against which a disorder can be measured is almost forgotten. Throughout the novel, this ordered normality never appears except as an inference; it is the object of a subdued, hopeless yearning. With normality as a negatively defined and nebulously perfect ideal, anything and everything can then be condemned as a deviation from it. Even an outburst of happiness can be diagnosed as a manic episode. We’re told this consists of a prolonged period during which the sufferer’s mood “may be described as euphoric, unusually good, cheerful, or high” and in which “the person may spontaneously start extensive conversations with strangers in public places,” or – more distressingly, admittedly – “a salesperson may telephone strangers at home in the early morning hours to initiate sales.” And then there are the “not otherwise specified” personality disorder categories. Here all pretensions to objectivity fall apart and the novel’s carefully warped imitation of scientific categories fades into an examination of petty viciousness. A personality disorder not otherwise specified is the diagnosis for anyone whose behaviors “do not meet the full criteria for any one Personality Disorder, but that together cause clinically significant distress […] e.g. social or occupational.” It’s hard not to be reminded of a few people who’ve historically caused social or occupational distress. If you don’t believe that people really exist, any radical call for their emancipation is just sickness at its most annoying.

If there is a normality here, it’s a state of near-catatonia. DSM-5 seems to have no definition of happiness other than the absence of suffering. The normal individual in this book is tranquilized and bovine-eyed, mutely accepting everything in a sometimes painful world without ever feeling much in the way of anything about it. The vast absurd excesses of passion that form the raw matter of art, literature, love, and humanity are too distressing; it’s easier to stop being human altogether, to simply plod on as a heaped collection of diagnoses with a body vaguely attached.

by Sam Kriss, TNI |  Read more:
Image: Vincent van Gogh, Corridor in the Asylum (1889)

Isabella Morawetz, Jesse and Jane

Is Glenn Greenwald the Future of News?

Much of the speculation about the future of news focuses on the business model: How will we generate the revenues to pay the people who gather and disseminate the news? But the disruptive power of the Internet raises other profound questions about what journalism is becoming, about its essential character and values. This week’s column is a conversation — a (mostly) civil argument — between two very different views of how journalism fulfills its mission.

Glenn Greenwald broke what is probably the year’s biggest news story, Edward Snowden’s revelations of the vast surveillance apparatus constructed by the National Security Agency. He has also been an outspoken critic of the kind of journalism practiced at places like The New York Times, and an advocate of a more activist, more partisan kind of journalism. Earlier this month he announced he was joining a new journalistic venture, backed by eBay billionaire Pierre Omidyar, who has promised to invest $250 million and to “throw out all the old rules.” I invited Greenwald to join me in an online exchange about what, exactly, that means.

Dear Glenn,

We come at journalism from different traditions. I’ve spent a life working at newspapers that put a premium on aggressive but impartial reporting, that expect reporters and editors to keep their opinions to themselves unless they relocate (as I have done) to the pages clearly identified as the home of opinion. You come from a more activist tradition — first as a lawyer, then as a blogger and columnist, and soon as part of a new, independent journalistic venture financed by the eBay founder Pierre Omidyar. Your writing proceeds from a clearly stated point of view.

In a post on Reuters this summer, media critic Jack Shafer celebrated the tradition of partisan journalism — “From Tom Paine to Glenn Greenwald” — and contrasted it with what he called “the corporatist ideal.” He didn’t explain the phrase, but I don’t think he meant it in a nice way. Henry Farrell, who blogs for The Washington Post, wrote more recently that publications like The New York Times and The Guardian “have political relationships with governments, which make them nervous about publishing (and hence validating) certain kinds of information,” and he suggested that your new project with Omidyar would represent a welcome escape from such relationships.

I find much to admire in America’s history of crusading journalists, from the pamphleteers to the muckrakers to the New Journalism of the ’60s to the best of today’s activist bloggers. At their best, their fortitude and passion have stimulated genuine reforms (often, as in the Progressive Era, thanks to the journalists’ “political relationships with governments”). I hope the coverage you led of the National Security Agency’s hyperactive surveillance will lead to some overdue accountability.

But the kind of journalism The Times and other mainstream news organizations practice — at their best — includes an awful lot to be proud of, too, revelations from Watergate to torture and secret prisons to the malfeasance of the financial industry, and including some pre-Snowden revelations about the N.S.A.’s abuse of its authority. Those are highlights that leap to mind, but you’ll find examples in just about every day’s report. Journalists in this tradition have plenty of opinions, but by setting them aside to follow the facts — as a judge in court is supposed to set aside prejudices to follow the law and the evidence — they can often produce results that are more substantial and more credible. The mainstream press has had its failures — episodes of credulousness, false equivalency, sensationalism and inattention — for which we have been deservedly flogged. I expect you’ll say, not flogged enough. So I pass you the lash.

Dear Bill,

There’s no question that journalists at establishment media venues, certainly including The New York Times, have produced some superb reporting over the last couple of decades. I don’t think anyone contends that what has become (rather recently) the standard model for a reporter — concealing one’s subjective perspectives or what appears to be “opinions” — precludes good journalism.

But this model has also produced lots of atrocious journalism and some toxic habits that are weakening the profession. A journalist who is petrified of appearing to express any opinions will often steer clear of declarative sentences about what is true, opting instead for a cowardly and unhelpful “here’s-what-both-sides-say-and-I-won’t-resolve-the-conflicts” formulation. That rewards dishonesty on the part of political and corporate officials who know they can rely on “objective” reporters to amplify their falsehoods without challenge (i.e., reporting is reduced to “X says Y” rather than “X says Y and that’s false”).

Worse still, this suffocating constraint on how reporters are permitted to express themselves produces a self-neutering form of journalism that becomes as ineffectual as it is boring. A failure to call torture “torture” because government officials demand that a more pleasant euphemism be used, or lazily equating a demonstrably true assertion with a demonstrably false one, drains journalism of its passion, vibrancy, vitality and soul.

Worst of all, this model rests on a false conceit. Human beings are not objectivity-driven machines. We all intrinsically perceive and process the world through subjective prisms. What is the value in pretending otherwise?

The relevant distinction is not between journalists who have opinions and those who do not, because the latter category is mythical. The relevant distinction is between journalists who honestly disclose their subjective assumptions and political values and those who dishonestly pretend they have none or conceal them from their readers.

Moreover, all journalism is a form of activism. Every journalistic choice necessarily embraces highly subjective assumptions — cultural, political or nationalistic — and serves the interests of one faction or another. Former Bush D.O.J. lawyer Jack Goldsmith in 2011 praised what he called “the patriotism of the American press,” meaning their allegiance to protecting the interests and policies of the U.S. government. That may (or may not) be a noble thing to do, but it most definitely is not objective: it is quite subjective and classically “activist.”

But ultimately, the only real metric of journalism that should matter is accuracy and reliability. I personally think honestly disclosing rather than hiding one’s subjective values makes for more honest and trustworthy journalism. But no journalism — from the most stylistically “objective” to the most brazenly opinionated — has any real value unless it is grounded in facts, evidence, and verifiable data. The claim that overtly opinionated journalists cannot produce good journalism is every bit as invalid as the claim that the contrived form of perspective-free journalism cannot.

by Bill Keller with Glenn Greenwald, NY Times | Read more:
Images: Tony Cenicola and Wikipedia

A Game of Shark And Minnow

[ed. Interactive informative story, exceptionally well-done.]

To understand how Ayungin (known to the Western world as Second Thomas Shoal) could become contested ground is to confront, in miniature, both the rise of China and the potential future of U.S. foreign policy. It is also to enter into a morass of competing historical, territorial and even moral claims in an area where defining what is true or fair may be no easier than it has proved to be in the Middle East.

The Spratly Islands sprawl over roughly 160,000 square miles in the waters off the coasts of the Philippines, Malaysia, Brunei, Taiwan and China — all of which claim part of the islands.

China is currently in disputes with several of its neighbors, and the Chinese have become decidedly more willing to wield a heavy stick. There is a growing sense that they have been waiting a long time to flex their muscles and that that time has finally arrived. “Nothing in China happens overnight,” Stephanie Kleine-Ahlbrandt, the director of Asia-Pacific programs at the United States Institute of Peace, said. “Any move you see was planned and prepared for years, if not more. So obviously this maritime issue is very important to China.”

It is also very important to the United States, as Secretary of State Hillary Rodham Clinton made clear at a gathering of the Association of Southeast Asian Nations (Asean) in Hanoi in July 2010. Clinton declared that freedom of navigation in the South China Sea was a “national interest” of the United States, and that “legitimate claims to maritime space in the South China Sea should be derived solely from legitimate claims to land features,” which could be taken to mean that China’s nine-dash line was illegitimate. The Chinese foreign minister, Yang Jiechi, chafed visibly, left the meeting for an hour and returned only to launch into a long, vituperative speech about the danger of cooperation with outside powers.

President Obama and his representatives have reiterated America’s interest in the region ever since. The Americans pointedly refuse to take sides in the sovereignty disputes. But China’s behavior as it becomes more powerful, along with freedom of navigation and control over South China Sea shipping lanes, will be among the major global political issues of the 21st century. According to the Council on Foreign Relations, of the $5.3 trillion in global trade that transits the South China Sea each year, $1.2 trillion of it touches U.S. ports — and so American foreign policy has begun to shift accordingly.

In a major speech in Singapore last year, Leon Panetta, then the secretary of defense, described the coming pivot in U.S. strategy in precise terms: “While the U.S. will remain a global force for security and stability, we will of necessity rebalance toward the Asia-Pacific region.” He referred to the United States as a “Pacific nation,” with a capital “P” and no irony, and then announced a series of changes — most notably that the roughly 50-50 balance of U.S. naval forces between the Pacific and the Atlantic would become 60-40 Pacific by 2020. Given the size of the U.S. Navy, this is enormously significant.

In June of last year, the United States helped broker an agreement for both China’s and the Philippines’s ships to leave Scarborough Shoal peacefully, but China never left. Chinese ships eventually blocked access to the shoal and formed a nest of boats around it to ward off foreign fishermen.

“Since [the standoff], we have begun to take measures to seal and control the areas around the Huangyan Island,” Maj. Gen. Zhang Zhaozhong, of China’s People’s Liberation Army, said in a television interview in May, using the Chinese term for Scarborough. (That there are three different names for the same set of uninhabitable rocks tells you much of what you need to know about the region.) He described a “cabbage strategy,” which entails surrounding a contested area with so many boats — fishermen, fishing administration ships, marine surveillance ships, navy warships — that “the island is thus wrapped layer by layer like a cabbage.”

by Jeff Himmelman, NY Times |  Read more:
Image: Ashley Gilbertson

Sunday, October 27, 2013

Seattle Asian Art Museum
photo: markk

Seattle Art Museum
photo: markk

Electoral Nirvana, for Krist's Sake

The nominating committee of the Rock and Roll Hall of Fame has announced its finalists for induction in 2014, and leading the list of sixteen is Nirvana, represented now by its two surviving members, the bassist Krist Novoselic and the then-drummer Dave Grohl (and, if you count “Sirvana,” Sir Paul McCartney). Nirvana is this round’s only band to have been nominated in its first year of eligibility, i.e., twenty-five years after the release of a first single or album. (That would be “Love Buzz,” 1988—pre-Grohl, who joined in 1990.)

The Hall of Fame’s purpose, according to itself, “is to recognize the contributions of those who have had a significant impact on the evolution, development and perpetuation of rock and roll.” By that standard, Nirvana is a shoo-in. Although it produced just three albums before Kurt Cobain’s death, it is as highly regarded by my fifteen-year-old son’s cohort as it is by its original fans. The ceremony will be held in New York next April, exactly twenty years after Cobain died at the sadly fabled rock-and-roll age of twenty-seven.

Among this year’s nominees are several artists who have pursued political and social issues: N.W.A. (inner-city turmoil); Peter Gabriel (apartheid, human rights); the divine Linda Ronstadt (the environment, gay rights, immigration reform), whose nomination comes just two months after her distressing announcement that Parkinson’s has rendered her unable to sing; and, perhaps the most politically daring choice, Cat Stevens—who, as Yusuf Islam, was denied entry to the United States by the Department of Homeland Security but is, after all, the composer of “Peace Train.”

Still, it’s safe to say that this year will mark the first time since Bono that a rock star who is also a full-blown policy wonk has made the cut. What a lot of Nirvana fans may not know is that Krist Novoselic, the band’s bassist, has another passion: electoral reform.

Novoselic tells how it happened in his little book, “Of Grunge and Government,” a combination of memoir and political tract. As he describes his political education, it began at age eighteen, when he picked up a copy of Jack Kerouac’s “The Dharma Bums” at a Seattle bookstore. Five years later, Nirvana was on tour in Europe, playing Berlin the day after the Wall fell. During Nirvana’s “two years of meteoric fame,” the trio used its popularity to rally opposition to an anti-gay ballot initiative, to protest the first Gulf War, and to “bring attention to the plight of women in the Balkan conflicts of the time.” (Krist’s parents, Kristo and Marija, immigrated from Croatia.)

Krist, on his own after Nirvana, jumped into political advocacy in both Washingtons. In his home state, he lobbied lawmakers and rallied young people on behalf of various causes, especially opposition to music censorship. In 1998, in the other Washington, he testified before the Commerce Committee of the U.S. Senate. Back home again, he helped forge a satisfactory compromise on a bill before the state legislature. “Succeeding in the face of the slow grind of the democratic process is a wonderful experience,” he writes.

But Novoselic was developing doubts about the grinder itself. He kept telling his fans how important it was to get out and vote, but they kept telling him they felt that their votes didn’t count, that their votes were wasted. “I have always tried to be positive,” he writes.
I’d go on about how every vote does count, and how it’s our duty as Americans to participate. But it didn’t take long to sink in that I was just pitching platitudes. This caused me a crisis of faith … The more I thought about the high rate of nonparticipation, the more I felt it was the result of a political structure that discourages diverse ideas.
So he began to think outside the box—but not outside the ballot box. He concluded that the biggest problem with American legislative elections—even bigger than big money and partisan redistricting—is districting itself: the one-member-constituency, winner-take-all system of elections.

That’s the system we inherited from the British centuries ago. Few democracies, almost all of them former British colonies, use it. Its unfortunate side effects are legion. Among others: zero-sum negative campaigns. Low voter participation. No guarantee of majority rule. And no guarantee of minority political representation. In the United States, in fact, racial representation is the enemy of political representation. When overwhelmingly black single-member districts were created in the South, African-American political representation—that is, representation for the policy ideas that African-Americans prefer—declined. There are plenty of white liberals in the South, but there are no Southern white liberal members of Congress. Likewise, there are plenty of conservatives in big Northern cities, but their representation in the House of Representatives is zero.

In “Of Grunge and Government,” Novoselic touts two big, visionary reforms.

by Hendrik Hertzberg, New Yorker |  Read more:
Image: Ted S. Warren/AP

The Man Who Would Teach Machines to Think

“It depends on what you mean by artificial intelligence.” Douglas Hofstadter is in a grocery store in Bloomington, Indiana, picking out salad ingredients. “If somebody meant by artificial intelligence the attempt to understand the mind, or to create something human-like, they might say—maybe they wouldn’t go this far—but they might say this is some of the only good work that’s ever been done.”

Hofstadter says this with an easy deliberateness, and he says it that way because for him, it is an uncontroversial conviction that the most-exciting projects in modern artificial intelligence, the stuff the public maybe sees as stepping stones on the way to science fiction—like Watson, IBM’s Jeopardy-playing supercomputer, or Siri, Apple’s iPhone assistant—in fact have very little to do with intelligence. For the past 30 years, most of them spent in an old house just northwest of the Indiana University campus, he and his graduate students have been picking up the slack: trying to figure out how our thinking works, by writing computer programs that think.

Their operating premise is simple: the mind is a very unusual piece of software, and the best way to understand how a piece of software works is to write it yourself. Computers are flexible enough to model the strange evolved convolutions of our thought, and yet responsive only to precise instructions. So if the endeavor succeeds, it will be a double victory: we will finally come to know the exact mechanics of our selves—and we’ll have made intelligent machines.

The idea that changed Hofstadter’s existence, as he has explained over the years, came to him on the road, on a break from graduate school in particle physics. Discouraged by the way his doctoral thesis was going at the University of Oregon, feeling “profoundly lost,” he decided in the summer of 1972 to pack his things into a car he called Quicksilver and drive eastward across the continent. Each night he pitched his tent somewhere new (“sometimes in a forest, sometimes by a lake”) and read by flashlight. He was free to think about whatever he wanted; he chose to think about thinking itself. Ever since he was about 14, when he found out that his youngest sister, Molly, couldn’t understand language, because she “had something deeply wrong with her brain” (her neurological condition probably dated from birth, and was never diagnosed), he had been quietly obsessed by the relation of mind to matter. The father of psychology, William James, described this in 1890 as “the most mysterious thing in the world”: How could consciousness be physical? How could a few pounds of gray gelatin give rise to our very thoughts and selves?

Roaming in his 1956 Mercury, Hofstadter thought he had found the answer—that it lived, of all places, in the kernel of a mathematical proof. In 1931, the Austrian-born logician Kurt Gödel had famously shown how a mathematical system could make statements not just about numbers but about the system itself. Consciousness, Hofstadter wanted to say, emerged via just the same kind of “level-crossing feedback loop.” He sat down one afternoon to sketch his thinking in a letter to a friend. But after 30 handwritten pages, he decided not to send it; instead he’d let the ideas germinate a while. Seven years later, they had not so much germinated as metastasized into a 2.9‑pound, 777-page book called Gödel, Escher, Bach: An Eternal Golden Braid, which would earn for Hofstadter—only 35 years old, and a first-time author—the 1980 Pulitzer Prize for general nonfiction. (...)

In GEB, Hofstadter was calling for an approach to AI concerned less with solving human problems intelligently than with understanding human intelligence—at precisely the moment that such an approach, having borne so little fruit, was being abandoned. His star faded quickly. He would increasingly find himself out of a mainstream that had embraced a new imperative: to make machines perform in any way possible, with little regard for psychological plausibility.

Take Deep Blue, the IBM supercomputer that bested the chess grandmaster Garry Kasparov. Deep Blue won by brute force. For each legal move it could make at a given point in the game, it would consider its opponent’s responses, its own responses to those responses, and so on for six or more steps down the line. With a fast evaluation function, it would calculate a score for each possible position, and then make the move that led to the best score. What allowed Deep Blue to beat the world’s best humans was raw computational power. It could evaluate up to 330 million positions a second, while Kasparov could evaluate only a few dozen before having to make a decision.

Hofstadter wanted to ask: Why conquer a task if there’s no insight to be had from the victory? “Okay,” he says, “Deep Blue plays very good chess—so what? Does that tell you something about how we play chess? No. Does it tell you about how Kasparov envisions, understands a chessboard?” A brand of AI that didn’t try to answer such questions—however impressive it might have been—was, in Hofstadter’s mind, a diversion. He distanced himself from the field almost as soon as he became a part of it. “To me, as a fledgling AI person,” he says, “it was self-evident that I did not want to get involved in that trickery. It was obvious: I don’t want to be involved in passing off some fancy program’s behavior for intelligence when I know that it has nothing to do with intelligence. And I don’t know why more people aren’t that way.”

One answer is that the AI enterprise went from being worth a few million dollars in the early 1980s to billions by the end of the decade. (After Deep Blue won in 1997, the value of IBM’s stock increased by $18 billion.) The more staid an engineering discipline AI became, the more it accomplished. Today, on the strength of techniques bearing little relation to the stuff of thought, it seems to be in a kind of golden age. AI pervades heavy industry, transportation, and finance. It powers many of Google’s core functions, Netflix’s movie recommendations, Watson, Siri, autonomous drones, the self-driving car.

“The quest for ‘artificial flight’ succeeded when the Wright brothers and others stopped imitating birds and started … learning about aerodynamics,” Stuart Russell and Peter Norvig write in their leading textbook, Artificial Intelligence: A Modern Approach. AI started working when it ditched humans as a model, because it ditched them. That’s the thrust of the analogy: Airplanes don’t flap their wings; why should computers think?

by James Somers, The Atlantic |  Read more:
Image: MIT

Dickson Payne, Popeye


[ed. The New York Times declined to publish this op-ed piece so it appeared on Banksy's site instead, along with other art from his month long stay.]

Lou Reed (March 2, 1942 – October 27, 2013)

Saturday, October 26, 2013


Three weeks ago, renowned graffiti artist Banksy started a 30-day “show” in New York entitled Better Out Than In: an Artist’s Residency on the Streets of New York. Each day the enigmatic and elusive artist (we still don’t know his identity) posts a picture to his website and Instagram account of a new piece gracing some part of New York. On the first day of the Instagram account, which now has over 200,000 followers, Banksy posted a photo of his first work, with his comment, “The street is in play Manhattan 2013 #banksyny.” Indeed, Banksy has used the street walls of NYC as canvas for whimsical pieces, from a dog pissing on a hydrant with a dialogue bubble reading “You complete me,” to a plastic heart-shaped balloon with band aids on it. Banksy has also availed himself of original sculptures, like Ronald McDonald receiving a shoe shine, and makes clever use of trucks. One Banksy piece is an installation on wheels: a truck driving through the city streets with stuffed livestock-animal dolls, their cute heads sticking out of the sides, seemingly on their way to the slaughter; another he calls a mobile garden, a dirty-looking delivery truck carrying a spray-painted oasis in its carriage. Yet, in all the attempts at playfulness, an overwrought seriousness has crept into the project. Many Banksy devotees appear offended—almost personally slighted—by how some New Yorkers deign to treat these pristine works of art.

Indeed, some of Banksy’s installments have been either painted over, or painted on, or tagged by other graffiti artists. Given the way the pieces are embedded into the city’s environment, it is perhaps unsurprising that just a day after the first Banksy went up, it was painted over; the piece was tagged, “(c) PHATLIPP Sweaty palms made me lose the love of my life ):” One tagger hit a few pieces, which elicited more graffiti responses—these making fun of him—but also an outpouring of hate and vitriol on the Internet. (A representative comment from one commenter on Banksy’s Instagram: “Fuck those jealous losers that ruin your art. They are just mad that their ugly scribbles don’t get the attention you get.”) Even when more creative additions and replications went up, as with Banksy’s “Concrete Confessional,” people still felt affronted by what they perceived as a defacing of art. That so many people have taken, en masse, to denigrating anyone who might think to tag, add to, or otherwise “deface” a precious Banksy betrays a static and empty (though understandable) idea of art, an idea that undermines the greatness of graffiti. (...)

Graffiti art, let’s remember, is purposefully one of the less controlled and more transient of art forms. It is inherently communal, whether in the enjoyment or in its genesis and longevity. Performance art, of course, is more fleeting by definition, but both graffiti and performance art take away much of the control from the artist, whether limiting themselves in time or creating a painting that will necessarily be under the community’s control. Both though, tend to raise a strange frustration in people, perhaps because these forms so diverge from our traditional notions of art as eternal, as belonging in a museum. Graffiti takes the city as its canvas, the walls, alleyways, and windows of lived life, an intrusion of art into the stuffiness of the city, but always as part of the city. To then treat it as an objet d’art, to quarantine it off, transforms it and takes it out of its natural and proper context. Banksy once wrote, “Graffiti is one of the few tools you have if you have almost nothing. And even if you don’t come up with a picture to cure world poverty you can make someone smile while they’re having a piss.” But can you imagine the outrage if someone were to take a piss on or even next to one of the new Banksies?

In the last two weeks, owners of buildings with Banksy art have taken to hiring guards, putting up plexiglass, rolling gates, and ropes to create lines, all of which is practical and perhaps understandable but undermines much of the purpose of these 30 days. All of these protections simply turn these outdoors, public pieces into indoor museum pieces, introducing a sterility that subverts the spirit of the project. These tactics isolate the art from the bustling environment. The viewer becomes passive, just another viewer waiting in line, no longer a participant. From a theoretical perspective, this all seems backwards. The owners of the building, from the perspective of the actual graffiti art, ought to hold no more rights than the community in deciding what to do with the graffiti.

by Joe Winkler, Guernica |  Read more:
Image: from Flickr via Dan Brady

Miguel Rio Branco, 1994. Havana.

Alex Webb

Friday, October 25, 2013

JPMorgan Reaches Deal With Agency Over Loans

[ed. See also: Reparations From Banks. I guess this is what passes for regulatory oversight and restitution these days. Despite the oddness of seeing two governmental agencies engaged in a public dispute over how hard to slap JPMorgan's wrist, the fact remains that $13 billion in fines is no small potatoes. That alone should give pause to consider just how enormous the company's transgressions have been (and perhaps still are). I say, book 'em to the full extent of the law (and even more, were it possible).]

JPMorgan Chase has secured important concessions in a $13 billion settlement over its mortgage practices, allowances that could ultimately reduce the bank’s financial burden and leave the government itself on the hook for a small portion of the cost.

The concessions emerged on Friday in an agreement with one of the federal regulators suing JPMorgan, the nation’s largest bank. The regulator, the Federal Housing Finance Agency, ran ahead of a broader deal that the Justice Department and other authorities were negotiating with the bank.

The housing agency, which oversees Fannie Mae and Freddie Mac, extracted a $5.1 billion payout on Friday.

But unlike other regulators pursuing the bank, it did not require JPMorgan to admit wrongdoing. And in a provision buried in the settlement, the agency effectively allows JPMorgan to try later to recoup about $1 billion from another federal regulator: the Federal Deposit Insurance Corporation.

The results show that, even as JPMorgan is facing an onslaught from the government, the bank is seeking to contain the fallout — and is succeeding on some fronts.

In a statement, JPMorgan called the deal “an important step towards a broader resolution” with the Justice Department and the other government authorities.

And for its part, the housing agency, while not responding to questions about the wording of the agreement, also heralded the settlement. “This is a significant step as the government and JPMorgan Chase move to address outstanding mortgage-related issues,” Edward J. DeMarco, the acting director of the housing agency, said in a statement. Yet the housing agency’s announcement also suggests that the government may be split over how to punish the bank for misrepresenting the quality of mortgage securities it sold to investors before the 2008 financial crisis. The Justice Department, which has orchestrated the $13 billion settlement, is by contrast demanding that JPMorgan not pass on its liabilities to the F.D.I.C.

JPMorgan has been locked in a legal battle with the F.D.I.C. over mortgage securities sold by Washington Mutual. In a deal that the F.D.I.C. orchestrated, JPMorgan bought the failed bank at the height of the financial crisis in 2008, just months after it acquired Bear Stearns. By JPMorgan’s account, the F.D.I.C. agreed that it would shoulder some liabilities from Washington Mutual. The agency disputes that notion and is fighting the bank in court.

The bank’s legal stance has now carried over into the negotiations over the $13 billion mortgage settlement.

Because the settlement deal with the housing regulator involves mortgage securities sold by Washington Mutual, JPMorgan could try to push some of the settlement cost onto the F.D.I.C. And the deal with the housing agency does not explicitly prevent the bank from doing so.

by Ben Protess and Peter Eavis, NY Times Dealbook |  Read more:
Image: Mike Segar/Reuters

Kanye West Knows You Think He Sounded Nuts

Here are three stories.

I'm 9 or 10 and my mother and I are on a cross-country road trip when we decide to stop for breakfast at a small roadside diner in Mississippi. I'm too young to be aware of the charged atmosphere of racial tension, but something feels odd. It feels odd when the people in the diner—most of whom are white—turn to look at my white mother and me, her brown son, as we enter and make our way to a table. It feels odd when my mom asks if there are raisins to put in her oatmeal and the waitress irritably spits, "No!" It feels so odd, in fact, that my mother asks our server if something is wrong: "No!" she barks again. It feels odd when the woman throws down the bill when we're done eating. No one calls us names. No one threatens us. The surly waitress has even specifically told us nothing is wrong. But when we return to the car my visibly shaken mom pulls a canister of pepper spray out of the glove compartment and tests it on the ground to make certain it's functioning properly.

Years later, in high school, I'm headed to a party in my friend Spencer's convertible. It's a warm Arizona evening, and the hot wind is blowing through our hair; we're laughing and listening to rap music. A gum wrapper from somewhere in the car catches a gust of air and takes flight. I'm sure it hasn't even landed before a police car pulls us over, and soon my two black friends and I have flashlights in our faces.

"You think you can litter around here?" one white officer asks.

"You mean that gum wrapper?" I ask back. "I'm very sorry that happened, sir, but this is a convertible. It was a mistake." He writes me a ticket.

A month ago, I'm at a dive bar in Brooklyn. My white friend tells me on the way over that the last time he'd been at this particular bar, a week or two before, he was so drunk he'd danced wildly in the middle of the place and belted out songs along with the jukebox—his wife and young son were out of town and he was cutting loose. I order a round of beers for our party. We drink them and another friend, a Middle Eastern man with a mop of curly black hair, goes to order another round. When I see it's taking him longer to order than it should, I walk up to the bar and ask what's wrong.

"Your friend here's too drunk," the bartender says. "I'm cutting him off." My friend—having spent years in Germany, no stranger to beer—has had less to drink than anyone else in our party.

"Then I'll buy a round," I tell the bartender.

"You're too drunk, too," he says. "You're both cut off."

I survey the room. "I'm not sure it's a coincidence that you're cutting off the only two brown people in the whole bar," I say.

"I don't have a prejudiced bone in my body," the bartender says.

We walk back and tell our friends what's happened. Two of them—a white woman and a white man who's had a comparable amount to drink as me—order two beers apiece, no questions asked. Despite the fact that we've now got four new beers, nobody much feels like drinking them, and so we leave. I'm served at two more bars that night, and at both I wonder to myself if these bartenders are being unscrupulously generous with an obviously inebriated man. Are my brown friend and I really drunker than all of our friends? Are we shameful? Are we the wasted minorities in a bar full of unprejudiced white people who want us out of there?

I think one of the most damaging effects America's omnipresent racism has on a person's psyche isn't the brief pang of hurt that comes from being called a slur, or seeing a picture of Barack Obama portrayed as a chimpanzee. Those things are common and old-fashioned, and when they happen I tend to feel more sad than angry, because I'm seeing someone who engages with the world like a wall instead of a human being. Rather, I think what's far more corrosive and insidious, the thing that lingers in the back of my mind the most, is the framework of plausible deniability built up around racism, and how insane that plausible deniability can make a person feel when wielded. How unsure of oneself. How worried that you might be overreacting, oversensitive, irrational.

by Cord Jefferson, Gawker |  Read more:
Image: uncredited

Jacob Lawrence, Pool Parlor, 1942