Sunday, May 27, 2012


Henri Rousseau- Le Chat Tigre. Oil on canvas, undated.

Jonathan Franzen: the path to Freedom

[ed. Fascinating glimpse into the life of an acclaimed writer, and the process of writing a great novel.]

I'm going to begin by addressing four unpleasant questions that novelists often get asked. These questions are apparently the price we have to pay for the pleasure of appearing in public. They're maddening not just because we hear them so often but also because, with one exception, they're difficult to answer and, therefore, very much worth asking.

The first of these perennial questions is: Who are your influences?

Sometimes the person asking this question merely wants some book recommendations, but all too often the question seems to be intended seriously. And part of what annoys me about it is that it's always asked in the present tense: who are my influences? The fact is, at this point in my life, I'm mostly influenced by my own past writing. If I were still labouring in the shadow of, say, EM Forster, I would certainly be at pains to pretend that I wasn't. According to Harold Bloom, whose clever theory of literary influence helped him make a career of distinguishing "weak" writers from "strong" writers, I wouldn't even be conscious of the degree to which I was still labouring in EM Forster's shadow. Only Harold Bloom would be fully conscious of that.

Direct influence makes sense only with very young writers, who, in the course of figuring out how to write, first try copying the styles and attitudes and methods of their favourite authors. I personally was very influenced, at the age of 21, by CS Lewis, Isaac Asimov, Louise Fitzhugh, Herbert Marcuse, PG Wodehouse, Karl Kraus, my then-fiancée, and The Dialectic of Enlightenment by Max Horkheimer and Theodor Adorno. For a while, in my early 20s, I put a lot of effort into copying the sentence rhythms and comic dialogue of Don DeLillo; I was also very taken with the strenuously vivid and all-knowing prose of Robert Coover and Thomas Pynchon. But to me these various "influences" seem not much more meaningful than the fact that, when I was 15, my favourite music group was the Moody Blues. A writer has to begin somewhere, but where exactly he or she begins is almost random.

It would be somewhat more meaningful to say that I was influenced by Franz Kafka. By this I mean that it was Kafka's novel The Trial, as taught by the best literature professor I ever had, that opened my eyes to the greatness of what literature can do, and made me want to try to create some myself. Kafka's brilliantly ambiguous rendering of Josef K, who is at once a sympathetic and unjustly persecuted Everyman and a self-pitying and guilt-denying criminal, was my portal to the possibilities of fiction as a vehicle of self-investigation: as a method of engagement with the difficulties and paradoxes of my own life. Kafka teaches us how to love ourselves even as we're being merciless toward ourselves; how to remain humane in the face of the most awful truths about ourselves. The stories that recognise people as they really are – the books whose characters are at once sympathetic subjects and dubious objects – are the ones capable of reaching across cultures and generations. This is why we still read Kafka.

The bigger problem with the question about influences, however, is that it seems to presuppose that young writers are lumps of soft clay on which certain great writers, dead or living, have indelibly left their mark. And what maddens the writer trying to answer the question honestly is that almost everything a writer has ever read leaves some kind of mark. To list every writer I've learned something from would take me hours, and it still wouldn't account for why some books matter to me so much more than other books: why, even now, when I'm working, I often think about The Brothers Karamazov and The Man Who Loved Children and never about Ulysses or To the Lighthouse. How did it happen that I did not learn anything from Joyce or Woolf, even though they're both obviously "strong" writers?

The common understanding of influence, whether Harold Bloomian or more conventional, is far too linear and one-directional. When I write, I don't feel like a craftsman influenced by earlier craftsmen who were themselves influenced by earlier craftsmen. I feel like a member of a single, large virtual community in which I have dynamic relationships with other members of the community, most of whom are no longer living. By means of what I write and how I write, I fight for my friends and I fight against my enemies. I want more readers to appreciate the glory of the 19th-century Russians; I'm indifferent to whether readers love James Joyce; and my work represents an active campaign against the values I dislike: sentimentality, weak narrative, overly lyrical prose, solipsism, self-indulgence, misogyny and other parochialisms, sterile game-playing, overt didacticism, moral simplicity, unnecessary difficulty, informational fetishes, and so on. Indeed, much of what might be called actual "influence" is negative: I don't want to be like this writer or that writer. (...)

The second perennial question is: What time of day do you work, and what do you write on?

by Jonathan Franzen, The Guardian | Read more:

U n’ Me by Scott Westmoreland

The Self Illusion: An Interview With Bruce Hood

[ed. Jonah Lehrer interviews Bruce Hood, author of The Self Illusion, on the nature of self and what it means when we use that term.]

LEHRER: The title of The Self Illusion is literal. You argue that the self – this entity at the center of our personal universe – is actually just a story, a “constructed narrative.” Could you explain what you mean?

HOOD: The best stories make sense. They follow a logical path where one thing leads to another and provide the most relevant details and signposts along the way so that you get a sense of continuity and cohesion. This is what writers refer to as the narrative arc – a beginning, middle and an end. If a sequence of events does not follow a narrative, then it is incoherent and fragmented, and so has no meaning. Our brains think in stories. The same is true for the self, and I use a distinction that William James drew between the self as “I” and “me.” Our consciousness of the self in the here and now is the “I” and most of the time, we experience this as being an integrated and coherent individual – a bit like the character in the story. The self we tell others about is autobiographical – the “me” – which again is a coherent account of who we think we are based on past experiences, current events and aspirations for the future. (...)

LEHRER: If the self is an illusion, then why does it exist? Why do we bother telling a story about ourselves?

HOOD: For the same reason that our brains create a highly abstracted version of the world around us. Our brain already hogs a large share of our energy requirements, but it does this to reduce the workload of acting. That’s the original reason why the brain evolved in the first place – to plan and control movements and keep track of the environment. It’s why living creatures that do not act or navigate around their environments do not have brains. So the brain generates maps and models on which to base current and future behaviors. Now the value of a map or a model is the extent to which it provides the most relevant useful information without overburdening you with too much detail.

The same can be said for the self. Whether it is the “I” of consciousness or the “me” of personal identity, both are summaries of the complex information that feeds into our consciousness. The self is an efficient way of having experience and interacting with the world. For example, imagine you ask me whether I would prefer vanilla or chocolate ice cream. I know I would like chocolate ice cream. Don’t ask me why, I just know. When I answer with chocolate, I have the seemingly obvious experience that my self made the decision. However, when you think about it, my decision covers a vast multitude of hidden processes, past experiences and cultural influences that would take too long to consider individually. Each one of them fed into that decision.

LEHRER: Let’s say the self is just a narrative. Who, then, is the narrator? Which part of me is writing the story that becomes me?

HOOD: This is the most interesting question and also the most difficult to answer because we are entering into the realms of consciousness. For example, only this morning as I was waking up, I was aware that I was gathering my thoughts together and I suddenly became fixated by this phrase, “gathering my thoughts.” I felt I could focus on my thoughts, turn them over in my mind and consider how I was able to do this. Who was doing the gathering and who was focusing? This was a compelling experience of the conscious self.

I would argue that while I had the very strong impression that I was gathering my thoughts together, you do have to ask how the thought to start this investigation began. Certainly, most of us never bother to think about this, so I must have had an unconscious agenda that this would be an interesting exercise. Maybe it was your question that I read a few days ago or maybe this is a problem that has been ticking over in my brain for some time. It seemed like a story that I was playing out in my head to try and answer a question about how I was thinking. But unless you believe in a ghost in the machine, it is impossible to interrogate your own mind independently. In other words, the narrator and the audience are one and the same.

by Jonah Lehrer, Wired |  Read more:

The Imperial Mind

Americans of all types — Democrats and Republicans, even some Good Progressives — are just livid that a Pakistani tribal court (reportedly in consultation with Pakistani officials) has imposed a 33-year prison sentence on Shakil Afridi, the Pakistani physician who secretly worked with the CIA to find Osama bin Laden on Pakistani soil. Their fury tracks the standard American media narrative: by punishing Dr. Afridi for the “crime” of helping the U.S. find bin Laden, Pakistan has revealed that it sympathizes with Al Qaeda and is hostile to the U.S. (NPR headline: “33 Years In Prison For Pakistani Doctor Who Aided Hunt For Bin Laden”; NYT headline: “Prison Term for Helping C.I.A. Find Bin Laden”). Except that’s a woefully incomplete narrative: incomplete to the point of being quite misleading.

What Dr. Afridi actually did was concoct a pretextual vaccination program, whereby Pakistani children would be injected with a single Hepatitis B vaccine, with the hope of gaining access to the Abbottabad house where the CIA believed bin Laden was located. The plan was that, under the ruse of vaccinating the children in that province, he would obtain DNA samples that could confirm the presence in the suspected house of the bin Laden family. But the vaccine program he was administering was fake: as Wired’s public health reporter Maryn McKenna detailed, “since only one of three doses was delivered, the vaccination was effectively useless.” An on-the-ground Guardian investigation documented that “while the vaccine doses themselves were genuine, the medical professionals involved were not following procedures. In an area called Nawa Sher, they did not return a month after the first dose to provide the required second batch. Instead, according to local officials and residents, the team moved on.”

That means that numerous Pakistani children who thought they were being vaccinated against Hepatitis B were in fact left exposed to the virus. Worse, international health workers have long faced serious problems in many parts of the world — including remote Muslim areas — in convincing people that the vaccines they want to give to their children are genuine rather than Western plots to harm them. These suspicions have prevented the eradication of polio and the containment of other preventable diseases in many areas, including in parts of Pakistan. This faux CIA vaccination program will, for obvious and entirely foreseeable reasons, significantly exacerbate that problem.

As McKenna wrote this week, this fake CIA vaccination program was “a cynical attempt to hijack the credibility that public health workers have built up over decades with local populations” and thus “endangered the status of the fraught polio-eradication campaign, which over the past decade has been challenged in majority-Muslim areas in Africa and South Asia over beliefs that polio vaccination is actually a covert campaign to harm Muslim children.” She further notes that while this suspicion “seems fantastic” to oh-so-sophisticated Western ears — what kind of primitive people would harbor suspicions about Western vaccine programs? — there are actually “perfectly good reasons to distrust vaccination campaigns” from the West (in 1996, for instance, 11 children died in Nigeria when Pfizer, ostensibly to combat a meningitis outbreak, conducted drug trials — experiments — on Nigerian children that did not comport with binding safety standards in the U.S.).

When this fake CIA vaccination program was revealed last year, Doctors Without Borders harshly denounced the CIA and Dr. Afridi for their “grave manipulation of the medical act” that will cause “vulnerable communities – anywhere – needing access to essential health services [to] understandably question the true motivation of medical workers and humanitarian aid.” The group’s President pointed out the obvious: “The potential consequence is that even basic healthcare, including vaccination, does not reach those who need it most.” That is now clearly happening, as the CIA program “is casting its shadow over campaigns to vaccinate Pakistanis against polio.” Gulrez Khan, a Peshawar-based anti-polio worker, recently said that tribesman in the area now consider public health workers to be CIA agents and are more reluctant than ever to accept vaccines and other treatments for their children.

For the moment, leave to the side the question of whether knowingly administering ineffective vaccines to Pakistani children is a justified ruse to find bin Laden (just by the way, it didn’t work, as none of the health workers actually were able to access the bin Laden house, though CIA officials claim the program did help obtain other useful information). In light of all the righteous American outrage over this prison sentence, let’s consider what the U.S. Government would do if the situation were reversed: namely, if an American citizen secretly cooperated with a foreign intelligence service to conduct clandestine operations on U.S. soil, all without the knowledge or consent of the U.S. Government, and let’s further consider what would happen if the American citizen’s role in those operations involved administering a fake vaccine program to unwitting American children. Might any serious punishment ensue? Does anyone view that as anything more than an obvious rhetorical question?

by Glenn Greenwald, Salon |  Read more:

Golf's Hardest Shot? Opinions Vary


What’s the hardest shot in golf?

Knowledgeable golfers have been trained to respond, almost by reflex: the long bunker shot. It requires a long carry from sand, from 25 to 100 yards, and that scares most recreational golfers.

But is that truly the hardest shot in golf? Is it harder than hitting your first tee shot of the day with a waiting crowd of other golfers watching? Is it harder than the same first tee shot to open a member-guest tournament — and you’re the guest? Is it harder than the first tee shot of the day if the first tee is positioned directly in front of the clubhouse deck, where dozens of people usually gather?

And if you hit that first tee shot of the day out of bounds and have to re-tee with everyone watching, does that next shot become the hardest shot in golf?

Or, is the hardest shot in golf the dreaded “playing through” shot? You know the one I mean. Your group has been coming up on a slower group for several holes, then, as you approach the tee of a nasty par 3 over water, they decide to wave you through. They stand to the side of the green, their hands on their hips, and wait for you to hit.

Now that’s a tough situation. Even if they’re not annoyed at the interruption, you worry that they are. You must hit, and do so quickly. But you sense, perhaps irrationally, that the golfers in front are judging you, standing there thinking, “Well, you’ve been pushing us for an hour, let’s see how good you are.”

And this always seems to happen on a treacherous hole, usually the highest-handicap hole on the scorecard.

So is that the hardest shot in golf?

I have a few other candidates to propose:

by Bill Pennington, NY Times |  Read more:
Photo: Andy Lyons/Getty Images

This Is How We Ride


This summer the city’s Department of Transportation inaugurates a new bike-share program. People who live and work in New York will be able to travel quickly and cheaply between many neighborhoods. This is major. It will make New Yorkers rethink their city and rewrite the mental maps we use to decide what is convenient, what is possible. Parks, restaurants and friends who once seemed beyond plausible commuting distance on public transportation will seem a lot closer. The possibilities aren’t limitless, but the change will be pretty impressive.

I’ve used a bike to get around New York for decades. There’s an exhilaration you get from self-propelled transportation — skateboarding, in-line skating and walking as well as biking; New York has good public transportation, but you just don’t get the kind of rush I’m talking about on a bus or subway train. I got hooked on biking because it’s a pleasure, not because biking lowers my carbon footprint, improves my health or brings me into contact with different parts of the city and new adventures. But it does all these things, too — and sometimes makes us a little self-satisfied for it; still, the reward is emotional gratification, which trumps reason, as it often does.

More than 200 cities around the world have bike-share programs. We’re not the first, but ours will be one of the largest systems. The program will start with 420 stations spread through the lower half of Manhattan, Long Island City and much of western Brooklyn; eventually more than 10,000 bikes will be available. It will cost just under $10 for a day’s rental. The charge includes unlimited rides during a 24-hour period, as long as each ride is under 30 minutes. So, for example, I could ride from Chelsea to the Lower East Side, from there to food shopping, later to the Brooklyn Academy of Music, and after that, home. This system is not geared for leisurely rides up to the George Washington Bridge or to Coney Island. This is for getting around.

I’ve used bike-sharing programs in London, Ottawa, Washington, Toronto, Barcelona, Milan and Paris. In London, where they introduced a public bike program two years ago, I could enjoy a night out without having to worry about catching the last tube home or finding a no longer readily available black cab. In Paris, the Vélib’ program has more than 20,000 bikes and extends all the way to the city’s borders. Significantly, the banlieues, the low-income housing projects that surround that city, aren’t included, so the system reinforces a kind of economic discrimination, but maybe more coverage is coming.  (...)

New York’s system will be a lot like the one in London, which I used last summer. Before setting off, I downloaded a map and app that showed me where to find the bike station closest to my hotel, near Soho Square, and to my destinations, an art gallery in Mayfair and later a restaurant in Notting Hill. I made one payment — a pound (about $1.50 — cheap!) — and I was good all day; there are no additional charges as long as each bike trip is under 30 minutes. (It’s easy to keep bike trips within that time limit because there are loads of stations where you can drop the bike off, and you can get a new bike after having a coffee.)

So, I don’t have to worry about leaving my bike somewhere if it rains or if I decide to cab home? Nope. I don’t have to worry about parking my bike outside for hours? Nope. I don’t have to think about whether my friend has a bike if we’re going somewhere together? Nope. Everyone has a bike now.

by David Byrne, NY Times |  Read more:
Illustration: Josh Cochran

Saturday, May 26, 2012

Talking Heads


Bettye LaVette


[ed. I'm curious what Ringo Starr thinks of Ms. LaVette's cover of his song. I did a quick Google search but didn't turn up anything.]

Ryan Adams


Sex or Sleep?

Something surprising is going on in the American bedroom. In droves, people are outfitting their beds with a plush, squishy, and decidedly controversial type of mattress. While these products support the body just-so during sleep, they distress some people during sex. The complaint is lack of "traction," if you get the drift. "It's like trying to do it in quicksand," one owner writes on an Internet message board. New York sex therapist Sari Eckler Cooper couldn't be clearer: "There's a lack of resistance for the knees and feet. And whoever is on the bottom is sinking into the bed."

These are memory-foam mattresses, and they are far and away the fastest-growing segment of the $4.6 billion wholesale market for U.S. mattresses. Memory foam's market share has shot up from 14% to nearly 20% in just the past eight years. In other words, mattress shoppers are weighing the risk -- bad sex -- against the promise -- good sleep -- and are voting with their eyelids: They choose to snooze.

It's no secret that people are stressed out and exhausted in these hurried times. Baby boomers, the chief buyers of memory-foam mattresses, have the additional problem of creaky bones. Everyone could use a deep, soothing sleep. But at the possible expense of sex?  (...)

 Memory foam is a dense material that softens in reaction to body heat; it is both denser and more responsive to heat than standard mattress foam. It consists of tiny air-filled cells that compress when pressure and heat are applied. The cells closer to the body release their air, allowing the foam to mold to the body's shape.

The material dates back to 1966, when it was developed for NASA to absorb shock in spacecraft seats. It also has been used in football helmets and padding for the insides of shoes. A North Carolina-based company called Dynamic Systems still manufactures memory foam for automotive and aircraft seating, though the patents on the technology have long since expired.

Memory-foam mattresses arrived on the market in the late 1990s, as work lives went 24/7 and folks began hunting far and wide for help in getting to sleep. Nearly 60% of Americans experience insomnia symptoms or sleep disorders, according to market-research firm Marketdata Enterprises. That, in turn, has created a thriving market for sleep aids, including pills, high-tech pillows, white-noise machines, aromatherapy, and, of course, premium mattresses.


Memory-foam mattresses, which can cost anywhere from several hundred to several thousand dollars, aren't the most expensive models on the market. A queen-size Tempur-Pedic mattress generally ranges from $3,000 to $7,500, depending on the materials. Mattresses from Sweden's Duxiana, made with multiple layers of more than 1,500 springs, are significantly pricier, at around $10,000, while $15,000 will get you a Relyon mattress, manufactured in the U.K. with hand-made coils. And Hästens, a 160-year-old Swedish manufacturer, makes the Vividus, a $67,000 made-to-order mattress. The hand-stitched mattress is filled with an intricate blend of horsehair (good for ventilation), cotton, wool (for perspiration absorption), and flax (for strength and elasticity).

Memory foam is geared more toward the mass affluent, making it the Lexus of bedding, so to speak. Barron's recently visited a Sleepy's store in Manhattan to kick the tires.

by Miriam Gottfried, Barrons |  Read more:
Illustration: Matt Collins for Barron's

JPMorgan’s Debacle, and its Parallels to AIG

Last week, the once-future Treasury secretary and current JPMorgan Chase CEO Jamie Dimon revealed a $2 billion loss. This previously undisclosed derivative trade should be a wake-up call for those claiming that finance has been “reined in” and no longer presents a threat to the global economy.

As it turns out, nothing could be further from the truth.

Finance has become a low-margin, high-leverage business. This is not surprising in an environment in which trading volumes are exceedingly low and interest rates even lower. In any other industry, a slowdown in economic activity sends management scurrying to cut costs, develop new products, become more productive. In short, to innovate. Companies can throw money at new products, marketing campaigns or discounted pricing, but a slowing economy brings down demand. What we have today is a deleveraging economy, and that is even more challenging — limiting the options that CEOs can take to increase their company revenue.

The world of finance refuses to accept that reality. Whenever Wall Street is confronted with a decrease in profits, we see the same response: Increase leverage. We usually don’t hear about it until some market wobble causes the excessive leverage to blow up in someone’s face. This time, the novelty cigar was smoked by Dimon, and the damage was inflicted on his reputation. The losses, we learned, were a “mere” $2 billion, described as manageable.

Consider any major finance disaster of the past 30 years, and what you will invariably see is the result of trying to spin dross into gold. The magic of finance is that this can work for a while. The reality of finance is simple mathematics. Eventually, the probabilities play themselves out and the dice come up snake eyes.

One thing that makes the JPMorgan trade look especially foolish is that it’s nearly the same sort of recklessness that AIG exhibited: selling derivatives against zero reserves. As Doug Kass, who heads the hedge fund Seabreeze Partners Management, explained: “Under the knowledge of Dimon, the JPM investment office sold massive amounts of CDS [credit-default swap] premium on large U.S. corporations in 2011. Like AIG, they accumulated a large amount of reported profits in the three-year period ending 2011. In an equally familiar manner, the principals of the London investment office were handsomely rewarded. And so was Dimon.”

Gee, why does that sound so familiar?

So how long did it take after AIG blew itself up selling derivatives until some trader came up short making the same reckless bet? Less than four years.

The parallels to AIG continue to mount, including on the JPMorgan risk management committee. Astonishingly, Ellen Futter, who was a director at AIG, was also on the risk management committee at JPMorgan. It’s unclear what you need to do to get kicked off that committee, but the directorial equivalent of steering the Titanic into the iceberg apparently won’t do it.

Most financial debacles have a few things in common:

by Barry Ritholtz, The Big Picture |  Read more:

Tom Wesselmann, Interior No. 2, 1964, Acrylic and collage, including working fan, clock and fluorescent light. (Source: arttattler.com)

Danielle Frankenthal
(Source: jcacciolagallery.com)

How the World's Weather Could Quickly Run Amok

The eminent British scientist James Lovelock, back in the 1970s, formulated his theory of Gaia, which held that the Earth was a kind of super organism. It had a self-regulating quality that would keep everything within that narrow band that made life possible. If things got too warm or too cold—if sunlight varied, or volcanoes caused a fall in temperatures, and so forth—Gaia would eventually compensate. This was a comforting notion. It was also wrong, as Lovelock himself later concluded. "I have to tell you, as members of the Earth's family and an intimate part of it, that you and especially civilization are in grave danger," he wrote in the Independent in 2006.

The world has warmed since those heady days of Gaia, and scientists have grown gloomier in their assessment of the state of the world's climate. NASA climate scientist James Hansen has warned of a "Venus effect," in which runaway warming turns Earth into an uninhabitable desert, with a surface temperature high enough to melt lead, sometime in the next few centuries. Even Hansen, though, is beginning to look downright optimistic compared to a new crop of climate scientists, who fret that things could head south as quickly as a handful of years, or even months, if we're particularly unlucky. Ironically, some of them are intellectual offspring of Lovelock, the original optimist gone sour.

The true gloomsters are scientists who look at climate through the lens of "dynamical systems," a mathematics that describes things that tend to change suddenly and are difficult to predict. It is the mathematics of the tipping point—the moment at which a "system" that has been changing slowly and predictably will suddenly "flip." The colloquial example is the straw that breaks the camel's back. Or you can also think of it as a ship that is stable until it tips too far in one direction and then capsizes. In this view, Earth's climate is, or could soon be, ready to capsize, causing sudden, perhaps catastrophic, changes. And once it capsizes, it could be next to impossible to right it again.

The idea that climate behaves like a dynamical system addresses some of the key shortcomings of the conventional view of climate change—the view that looks at the planet as a whole, in terms of averages. A dynamical systems approach, by contrast, considers climate as a sum of many different parts, each with its own properties, all of them interdependent in ways that are hard to predict.
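[ed. The "capsizing ship" idea has a standard textbook toy model — a system with two stable states and a slowly drifting forcing. This little sketch is mine, not anything from the article or from Lenton's models, but it shows the behavior the author is describing: the state barely moves for a long time, then flips abruptly once the forcing crosses a threshold.]

```python
# Toy bistable system: dx/dt = c + x - x**3. For small forcing c there
# are two stable states (near x = -1 and x = +1). The lower state only
# exists while c < 2/(3*sqrt(3)), about 0.385; once c drifts past that
# threshold, the lower state simply vanishes and x jumps abruptly to
# the upper branch -- the "ship" capsizes, with no easy way back.

def simulate(steps=200_000, dt=0.01, c_max=0.6):
    x = -1.0                       # start in the lower stable state
    history = []
    for i in range(steps):
        c = c_max * i / steps      # forcing creeps up very slowly
        x += dt * (c + x - x**3)   # Euler step of the toy system
        history.append(x)
    return history

history = simulate()
# x tracks the lower branch for most of the run, then tips suddenly:
print(f"start: {history[0]:.2f}, "
      f"halfway (c = 0.3): {history[100_000]:.2f}, "
      f"end (c = 0.6): {history[-1]:.2f}")
```

Note that nothing dramatic happens to the forcing at the tipping moment — c just keeps creeping up at the same rate — which echoes Lenton's point that past abrupt shifts in the climate record need not have an obvious external trigger.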

One of the most productive scientists in applying dynamical systems theory to climate is Tim Lenton at the University of East Anglia in England. Lenton is a Lovelockian two generations removed—his mentors were mentored by Lovelock. "We are looking quite hard at past data and observational data that can tell us something," says Lenton. "Classical case studies in which you've seen abrupt changes in climate data. For example, in the Greenland ice-core records, you're seeing climate jump. And the end of the Younger Dryas," about twelve thousand years ago, "you get a striking climate change." So far, he says, nobody has found a big reason for such an abrupt change in these past events—no meteorite or volcano or other event that is an obvious cause—which suggests that perhaps something about the way these climate shifts occur simply makes them sudden.

Lenton is mainly interested in the future. He has tried to look for things that could possibly change suddenly and drastically even though nothing obvious may trigger them. He's come up with a short list of nine tipping points—nine weather systems, regional in scope, that could make a rapid transition from one state to another.

by Fred Guterl, Scientific American |  Read more: