Wednesday, August 22, 2012

Massive Attack feat. Liz Fraser


The Playboy Interview: Richard Dawkins


Richard Dawkins, the patron saint of nonbelievers, caused a stir earlier this year during a debate with the Archbishop of Canterbury, who noted that his opponent is often described as the world’s most famous atheist. “Not by me,” Dawkins replied before providing his standard explanation—a supreme being is possible but highly improbable—which led a London newspaper to proclaim that the world’s most notorious skeptic was hedging his bets. Far from it. Dawkins, at 71, remains an unbending and sharp-tongued critic of religious dogmatism. Like any scientist who challenges the Bible and its lyrical version of creation, he spends a great deal of time defending Charles Darwin’s theory that all life, including humans, evolved over eons through natural selection, rather than being molded 10,000 years ago by an intelligent but unseen hand.

Dawkins, who retired from Oxford University in 2008 after 13 years as a professor of public understanding of science (meaning he lectured and wrote books), stepped into the limelight in 1976, at the age of 35, with the publication of The Selfish Gene. The book, which has sold more than a million copies, argues persuasively that evolution takes place at the genetic level; individuals die, but the fittest genes survive. Dawkins has since written 10 more best-sellers, including most recently The Magic of Reality: How We Know What’s Really True. Since 9/11 he has become more outspoken about his skepticism, culminating in The God Delusion, which provides the foundation for his continuing debates with believers. Published in 2006, the book has become Dawkins’s most popular, available in 31 languages with 2 million copies sold. That same year he founded the Richard Dawkins Foundation for Reason and Science “to support scientific education, critical thinking and evidence-based understanding of the natural world in the quest to overcome religious fundamentalism, superstition, intolerance and suffering.” (...)

Excerpts:

PLAYBOY: Albert Einstein and Stephen Hawking reference God in their writings. Are they using the word in the sense of an intelligent designer?

DAWKINS: Certainly not. They use god in a poetic, metaphorical sense. Einstein in particular loved using the word to convey an idea of mystery, which I think all decent scientists do. But nowadays we’ve learned better than to use the word god because it will be willfully misunderstood, as Einstein was. And poor Einstein got quite cross about it. “I do not believe in a personal god,” he said over and over again. In a way he was asking for it. Hawking uses it in a similar way in A Brief History of Time. In his famous last line he says that if we understood the universe, “then we would know the mind of God.” Once again he is using god in the Einsteinian, not the religious sense. And so Hawking’s The Grand Design, in which he says the universe could have come from nothing, is not him turning away from God; his beliefs are exactly the same.

PLAYBOY: You’ve had a lot of fun deconstructing the idea of the intelligent designer. You point out that God made a cheetah fast enough to catch a gazelle and a gazelle fast enough to outrun a cheetah——

DAWKINS: Yes. Is God a sadist?

PLAYBOY: And bad design such as the fact we breathe and eat through the same tube, making it easy to choke to death.

DAWKINS: Or the laryngeal nerve, which loops around an artery in the chest and then goes back up to the larynx.

PLAYBOY: Not very efficient.

DAWKINS: Not in a giraffe, anyway. (...)

PLAYBOY: What will happen when you die?

DAWKINS: Well, I shall either be buried or be cremated.

PLAYBOY: Funny. But without faith in an afterlife, in what do you take comfort in times of despair?

DAWKINS: Human love and companionship. But in more thoughtful, cerebral moments, I take—comfort is not quite the right word, but I draw strength from reflecting on what a privilege it is to be alive and what a privilege it is to have a brain that’s capable in its limited way of understanding why I exist and of reveling in the beauty of the world and the beauty of the products of evolution. The magnificence of the universe and the sense of smallness that gives us in space and in geologically deep time is humbling but in a strangely comforting way. It’s nice to feel you’re part of a hugely bigger picture. (...)

PLAYBOY: We hear constantly that America is a Christian nation and that the founding fathers were all Christians.

DAWKINS: They were deists. They didn’t believe in a personal god, or one who interferes in human affairs. And they were adamant that they did not want to found the United States as a Christian nation.

PLAYBOY: But you hear quite often that if you let atheists run things you end up with Hitler and Stalin.

DAWKINS: Hitler wasn’t an atheist; he was a Roman Catholic. But I don’t care what he was. There is no logical connection between atheism and doing bad things, nor good things for that matter. It’s a philosophical belief about the absence of a creative intelligence in the world. Anybody who thinks you need religion in order to be good is being good for the wrong reason. I’d rather be good for moral reasons. Morals were here before religion, and morals change rather rapidly in spite of religion. Even people who rely on the Bible use nonbiblical criteria. If your criteria are scriptural, you have no basis for choosing the verse that says turn the other cheek rather than the verse that says stone people to death. So you pick and choose without guidance from the Bible.

by Chip Rowe, Playboy |  Read more:

Metamotivation


[ed. I was familiar with Maslow's general hierarchy of needs but not the term Metamotivation, i.e., striving to realize one's fullest potential. I wonder how a person's outlook on life and their personality are affected by an inability to achieve that need (if it is felt)? Furthermore, since basic needs are fluid (like health, friendship, economic security, intimacy, etc.), is metamotivation a temporary luxury (and ultimately an unsustainable goal)?]

Maslow's hierarchy of needs is often portrayed in the shape of a pyramid, with the largest and most fundamental levels of needs at the bottom, and the need for self-actualization at the top. While the pyramid has become the de facto way to represent the hierarchy, Maslow himself never used a pyramid to describe these levels in any of his writings on the subject.

The most fundamental and basic four layers of the pyramid contain what Maslow called "deficiency needs" or "d-needs": esteem, friendship and love, security, and physical needs. With the exception of the most fundamental (physiological) needs, if these "deficiency needs" are not met, the body gives no physical indication but the individual feels anxious and tense. Maslow's theory suggests that the most basic level of needs must be met before the individual will strongly desire (or focus motivation upon) the secondary or higher level needs. Maslow also coined the term Metamotivation to describe the motivation of people who go beyond the scope of the basic needs and strive for constant betterment. Metamotivated people are driven by B-needs (Being Needs), instead of deficiency needs (D-Needs).

via: Wikipedia

The Next Wave for the Wristwatch


Cellphones have already muscled onto watches’ turf as a time-telling tool. Now, some of the biggest technology companies are eyeing your wrist, too.

Companies like Apple, Nike and Sony, along with dozens of start-ups, hope to strap a device on your wrist.

It is quite a disruption for the wristwatch, which has not actually been around all that long. Though said to have been invented in 1868 by the Swiss watchmaker Patek Philippe, it didn’t really catch on until after World War I. Before that, people carried watches in their pockets or on chains.

“Watch manufacturers were asking themselves this in the 1900s, if it made sense to have a watch in their pocket,” said Blaise Bertrand, industrial design director for IDEO, a design company. “I think that’s the same question that is being asked now, but it’s in a completely different context with the smartphone in our pockets.”

The new wrist devices won’t replace smartphones, but rather connect to them. Most will continue the basic task of telling the time, while eliminating the need to dig a smartphone out of your pocket or purse. But they will provide far more information than the most advanced G-Shock watch available today, or the most expensive chronometer.

For example, Sony this year released the Smartwatch, a two-inch-square screen that can display e-mails, Twitter posts and other pieces of text, all pulled from an Android smartphone. Nike FuelBand, a black band with an array of colored lights, measures the energy you exert on a daily basis and sends it to a smartphone. (It also tells the time.) Jawbone sells the Up, a unisex bracelet that tracks a user's daily activity and sends the information to an iPhone application. Pebble, an innovative watch that can play music and display text, the weather and other information from a phone, caught the public's imagination on Kickstarter, where it raised $10.3 million. It is expected to arrive next year.

by Nick Bilton, NY Times |  Read more:

Tuesday, August 21, 2012


Xu Wei (1521-1593)
via:

The Vast Left-Wing Conspiracy Is on Your Screen

Two decades ago, conservative anger against popular culture burned so intensely that it seemed at the time that Hollywood had come to fill the space in the right-wing fear center vacated by the end of Communism. The anger came out in an endless series of skirmishes. In 1989, after watching an episode of the sitcom Married With Children that included a gay man and a woman removing her bra, Michigan housewife Terry Rakolta (whose sister, Ronna Romney, married the brother of … yes, him) launched a national crusade against the show. Dan Quayle gave a speech denouncing the single-motherhood of Murphy Brown. Advertising boycotts by such groups as Christian Leaders for Responsible Television or Rakolta’s own Americans for ­Responsible Television were a regular occurrence, as were anti-Hollywood rallies that drew thousands of protesters.

The country was “involved in a ­Kulturkampf,” declared Illinois Republican congressman Henry Hyde, a “war between cultures and a war about the meaning of culture.” Liberals, too, considered their way of life threatened by the conservative campaign against Hollywood. “We are in the midst of a culture war,” announced the vice-president of People for the American Way, a group founded by liberal producer Norman Lear. In his keynote speech at the 1992 Republican convention, Pat Buchanan floridly exhorted his party to fight (or, in its view, fight back) in a “cultural war, as critical to the kind of nation we will one day be as was the Cold War itself.”

When Buchanan delivered that terrifying (or exhilarating) speech in Houston, it would have been impossible to imagine that twenty years later, all traces of this war would have disappeared from the national political scene. If you visit Mitt Romney's campaign website, the issues tab labeled "Values" lists Romney's unwavering opposition to abortion and gay marriage, and Bushian opposition to stem-cell research, but nary a glancing reference can be found to the state of the culture, let alone a full-throated denunciation of Hollywood filth merchants. An immediate and easy explanation is that popular culture has ceased its provocations, or that the culture war has been shoved aside by the war over the role of government in the economy. The more uncomfortable reality is that the culture war is an ongoing liberal rout. Hollywood is as liberal as ever, and conservatives have simply despaired of changing it.

You don't have to be an especially devoted consumer of film or television (I'm not) to detect a pervasive, if not total, liberalism. Americans for Responsible Television and Christian Leaders for Responsible Television would be flipping out over the modern family in Modern Family, not to mention the girls of Girls and the gays of Glee, except that those groups went defunct long ago. The liberal analysis of the economic crisis—that unregulated finance took wild gambles—has been widely reflected, even blatantly so, in movies like Margin Call, Too Big to Fail, and the Wall Street sequel. The conservative view that all blame lies with regulations forcing banks to lend to poor people has not, except perhaps in the amateur-hour production of Atlas Shrugged. The muscular Rambo patriotism that briefly surged in the eighties, and seemed poised to return after 9/11, has disappeared. In its place we have series like Homeland, which probes the moral complexities of a terrorist's worldview, and action stars like Jason Bourne, whose enemies are not just foreign baddies but also paranoid Dick Cheney figures. The conservative denial of climate change, and the low opinion of environmentalism that accompanies it, stands in contrast to cautionary end-times tales like Ice Age 2: The Meltdown and the tree-hugging mysticism of Avatar. The decade has also seen a revival of political films and shows, from the Aaron Sorkin oeuvre through Veep and The Campaign, both of which cast oilmen as the heavies. Even The Muppets features an evil oil driller stereotypically named "Tex Richman."

In short, the world of popular culture increasingly reflects a shared reality in which the Republican Party is either absent or anathema. That shared reality is the cultural assumptions, in particular, of the younger voters whose support has become the bedrock of the Democratic Party.

A member of President Obama’s reelection team recently told New York’s John Heilemann that it plans on painting its opponent as a man out of time—Mitt Romney is “the fifties, he is retro, he is backward.” This may sound at first blush like a particular reference to Romney’s uptight persona, but the line of attack would have been available against any Republican nominee—Rick Santorum, Michele Bachmann, Rick Perry, or any other of the dour reactionaries who might have snatched the nomination. The message is transmitted in a thousand ways, both obvious and obscure: Tina Fey’s devastating portrayal of Sarah Palin. Obama appearing on Late Night With Jimmy Fallon to “slow jam the news,” which meant to recite his campaign message of the week. The severed head of George W. Bush appearing on Game of Thrones. An episode of Mad Men that included the odd throwaway line “Romney’s a clown,” putatively to describe George Romney, who was anything but.

When Joe Biden endorsed gay marriage in May, he cited Will & Grace as the single-most important driving force in transforming public opinion on the subject. In so doing he actually confirmed the long-standing fear of conservatives—that a coterie of Hollywood elites had undertaken an invidious and utterly successful propaganda campaign, and had transmuted the cultural majority into a minority. Set aside the substance of the matter and consider the process of it—that is, think of it from the conservative point of view, if you don't happen to be one. Imagine that large chunks of your entertainment mocked your values and even transformed once-uncontroversial beliefs of yours into a kind of bigotry that might be greeted with revulsion.

You’d probably be angry, too.

by Jonathan Chait, New York Magazine | Read more:
Photo: Courtesy of Warner Bros. Pictures. Photo-illustration by Gluekit

The Sleep Racket

[ed. See also: Sleeping pills: Britain's hidden addiction.]

Over the last year you could have made a pile of money by betting on a little company in the business of ... insomnia. Shares of ResMed, in Poway, Calif., leaped 44%, selling $465 million worth of “sleep-disordered-breathing” equipment--face masks, nasal pillows, humidifiers and so-called continuous positive airway pressure devices. “It’s a monster market; it’s bigger than Ben-Hur,” says Peter C. Farrell, ResMed’s voluble chief executive. You’d have done even better as an original investor in Pacific Sleep Medicine Services, a small chain of sleep centers, mostly in southern California. One of its founders who chipped in an undisclosed amount in 1998 saw his ante jump a hundredfold, says Tom J. Wiedel, Pacific Sleep’s chief financial officer.

A bad night's sleep is reason for a very big business. Sleeping pills, led by Ambien, rack up more than $2 billion a year in the U.S. Then there is the revenue from overnight stays at sleep clinics, over-the-counter pills, a parade of gimmicks and a thriving business for sleep specialists. For four bucks you can pick up the Insomnia Relief Scent Inhaler (lavender, rosemary, chamomile and vetiver, a grass root) from Earth Solutions of Atlanta. The Original Sound Pillow ($50) comes with two thin speakers and a headphone jack for your iPod or Discman. Spend a bit more ($147) for an MP3-like player called Pzizz that plays music, voices and tones geared to helping you fall asleep. Dreamate, an $80 device worn as a bracelet, supposedly delivers a 6,000 rpm massage to the “sleeping golden triangle,” a.k.a. the wrist. Sealy's Stearns & Foster unit is now selling a $10,000 mattress fit for a princess afflicted with insomnia. It is topped with latex and stitched with real silver threads. Hypnos goes one better, with a king-size mattress filled with layers of silk, cashmere and lamb's wool and topped with a $20,000 sticker. And coming this spring from Matsushita Electric Works: a room tricked out to induce “deep sleep” within 30 minutes. It includes a reclining bed, sound-absorbent walls and somniferous sights and sounds from a wide-screen TV.

“Sleep is the new sex.” So says psychologist Arthur J. Spielman, associate director of the Center for Sleep Disorders Medicine & Research at New York Methodist Hospital in Brooklyn, N.Y. “People want it, need it, can’t get enough of it.” The same could be said of profits. Spielman is coauthor of The Insomnia Answer ($24, Perigee Books, 2006). He is also developing light-delivering goggles that are supposed to help people reset the circadian rhythms that govern when they nod off and wake up, so they fall asleep faster and stay asleep longer.

Sleep is also the new snake oil--the promise of a good snooze from a book or a bed or a bottle. It’s easy pickings. Who isn’t somewhat slumber-deprived? Given the demands of work and family, no one gets “enough” sleep, whatever that is. The Morpheus mongers point to all kinds of studies on their behalf. The number of Americans who say they sleep less than six hours a night--16% in 2005, compared with 12% in 1998--is on the rise, claims the National Sleep Foundation, which, coincidentally, gets funding from pharmaceutical companies. According to the American Academy of Sleep Medicine, there are 81 chronic sleeping disorders, from apnea, which causes interrupted breathing, to restless leg syndrome. No wonder sleep labs are popping up everywhere--650 and counting, compared with just a few in the mid-1970s.

According to the National Institutes of Health, sleeplessness creates $16 billion in annual health care expenses and $50 billion in lost productivity. Scientists are finding that chronically reduced or disrupted sleep may increase the risk of obesity, diabetes and cardiovascular disease. “We know from all the research that sleep is just as important to overall health as exercise and diet,” says Carl Hunt, special assistant to the director of the National Heart, Lung & Blood Institute at the NIH.

by Melanie Wells, Forbes (2006) |  Read more:

Kamisaka Sekka (1866 - 1942) Japanese Woodblock Print
Rolling Hillside
Sekka’s A World of Things Series (Momoyogusa)
via:

Monday, August 20, 2012

Cat Power


Spirited Away (2002)

I was so fortunate to meet Miyazaki at the 2002 Toronto film festival. I told him I love the “gratuitous motion” in his films; instead of every movement being dictated by the story, sometimes people will just sit for a moment, or sigh, or gaze at a running stream, or do something extra, not to advance the story but only to give the sense of time and place and who they are.

“We have a word for that in Japanese,” he said. “It’s called ‘ma.’ Emptiness. It’s there intentionally.” He clapped his hands three or four times. “The time in between my clapping is ‘ma.’ If you just have non-stop action with no breathing space at all, it’s just busyness.”

I think that helps explain why Miyazaki's films are more absorbing than the frantic action in a lot of American animation. “The people who make the movies are scared of silence,” he said, “so they want to paper and plaster it over. They're worried that the audience will get bored. But just because it's 80 percent intense all the time doesn't mean the kids are going to bless you with their concentration. What really matters is the underlying emotions—that you never let go of those.

“What my friends and I have been trying to do since the 1970’s is to try and quiet things down a little bit; don’t just bombard them with noise and distraction. And to follow the path of children’s emotions and feelings as we make a film. If you stay true to joy and astonishment and empathy you don’t have to have violence and you don’t have to have action. They’ll follow you. This is our principle.”

He said he has been amused to see a lot of animation in live-action superhero movies. “In a way, live action is becoming part of that whole soup called animation. Animation has become a word that encompasses so much, and my animation is just a little tiny dot over in the corner. It’s plenty for me.”

It’s plenty for me, too.

by Roger Ebert, Chicago Sun Times |  Read more:

Where Has the Retail Investor Gone?

Lots of folks are wondering what happened to the Main Street-mom-and-pop retail investors. They seem to have taken their ball and gone home. I don’t blame them for feeling put upon, but it might be instructive to figure out why. Perhaps it could even help us determine what this means for risk capital.

We see evidence of this all over the place: The incredibly light volume of stock trading; the abysmal television ratings of CNBC; the closing of investing magazines such as Smart Money, whose final print issue is on newsstands as it transitions to a digital format; the dearth of stock chatter at cocktail parties. Why, it is almost as if America has fallen out of love with equities.

Given the events of the past decade and a half, this should come as no surprise. Average investors have seen not one but two equity collapses (2000 and 2008). They got caught in the real estate boom and bust. Accredited investors (i.e., the wealthier ones) also discovered that venture capital and private equity were no sure thing either. The Facebook IPO may have been the last straw.

What has driven the typical investor away from equities?

The short answer is that there is no single answer. It is complex, not reducible to single variable analysis. This annoys pundits who thrive on dumbing down complex and nuanced issues to easily digestible sound bites. Television is not particularly good at subtlety, hence the overwhelming tendency for shout-fests and silly bull/bear debates.

The factors that have been weighing on people-formerly-known-as-stock-investors are many. Consider the top 10 reasons investors are unenthused about the stock market:

1 Secular cycle: As we have discussed before, there are long-term cycles of alternating bull and bear markets. The current bear market that began in March 2000 has provided lots of ups and downs — but no lasting gains. Markets are effectively unchanged since 1999 (the Nasdaq is off only 40 percent from its 2000 peak).

The way secular bear markets end is with investors ignoring stocks, enormous P/E multiple compression and bargains galore. Bond king Bill Gross and his Death of the Cult of Equities is a good sign we are getting closer to the final denouement.

2 Psychology: Investors are scarred and scared. They have been scarred by the 57 percent crash in the major indexes from the 2007 peak to the 2009 bottom. They are scared to get back into equities because that is their most recent experience, and it has affected them deeply. While this psychological shift from love to hate to indifference is a necessary part of working toward the end of a secular bear, it is no fun for them — or anyone who trades or invests for a living.

3 Risk on/risk off: Let’s be brutally honest — the fundamentals have been utterly trumped by unprecedented central bank intervention. While this may be helping the wounded bank sector, it is not doing much for long-term investors in fixed income or equities. The Fed’s dual mandate of maximum employment and stable prices seems to have a newer unspoken goal: Driving risk asset prices higher.

When investors can no longer fashion a thesis other than “Buy when the Fed rolls out the latest bailout,” it takes a toll on psychology, and scares them away.

by Barry Ritholtz, Washington Post |  Read more:

Marvin Gaye


Finally the full version of Marvin's funk classic! Enjoy!

Augusta National Adds First Two Female Members

[ed. And in other news today, the temperature in Hell is reported to have dipped below 32 degrees.]

For the first time in its 80-year history, Augusta National Golf Club has female members.

The home of the Masters, under increasing criticism the last decade because of its all-male membership, invited former Secretary of State Condoleezza Rice and South Carolina financier Darla Moore to become the first women in green jackets when the club opens for a new season in October.

Both women accepted.

"This is a joyous occasion," Augusta National chairman Billy Payne said Monday.

The move likely ends a debate that intensified in 2002 when Martha Burk of the National Council of Women's Organizations urged the club to include women among its members. Former club chairman Hootie Johnson stood his ground, even at the cost of losing Masters television sponsors for two years, when he famously said Augusta National might one day have a woman in a green jacket, "but not at the point of a bayonet."

The comment took on a life of its own, becoming either a slogan of the club's resolve not to give in to public pressure or a sign of its sexism, depending on which side of the debate was interpreting it.

"Oh my God. We won," Burk said. "It's about 10 years too late for the boys to come into the 20th century, never mind the 21st century. But it's a milestone for women in business."

Payne, who took over as chairman in 2006 when Johnson retired, said consideration for new members is deliberate and private, and that Rice and Moore were not treated differently from other new members. Even so, he took the rare step of announcing two of the latest members to join because of the historical significance.

"These accomplished women share our passion for the game of golf and both are well known and respected by our membership," Payne said in a statement. "It will be a proud moment when we present Condoleezza and Darla their green jackets when the club opens this fall. This is a significant and positive time in our club's history and, on behalf of our membership, I wanted to take this opportunity to welcome them and all of our new members into the Augusta National family."

by Doug Ferguson, AP |  Read more:

DS Super (by uncertainworld)
via:

Secret of AA: After 75 Years, We Don’t Know How It Works


AA originated on the worst night of Bill Wilson’s life. It was December 14, 1934, and Wilson was drying out at Towns Hospital, a ritzy Manhattan detox center. He’d been there three times before, but he’d always returned to drinking soon after he was released. The 39-year-old had spent his entire adult life chasing the ecstasy he had felt upon tasting his first cocktail some 17 years earlier. That quest destroyed his career, landed him deeply in debt, and convinced doctors that he was destined for institutionalization.

Wilson had been quite a mess when he checked in the day before, so the attending physician, William Silkworth, subjected him to a detox regimen known as the Belladonna Cure—hourly infusions of a hallucinogenic drug made from a poisonous plant. The drug was coursing through Wilson’s system when he received a visit from an old drinking buddy, Ebby Thacher, who had recently found religion and given up alcohol. Thacher pleaded with Wilson to do likewise. “Realize you are licked, admit it, and get willing to turn your life over to God,” Thacher counseled his desperate friend. Wilson, a confirmed agnostic, gagged at the thought of asking a supernatural being for help.

But later, as he writhed in his hospital bed, still heavily under the influence of belladonna, Wilson decided to give God a try. “If there is a God, let Him show Himself!” he cried out. “I am ready to do anything. Anything!”

What happened next is an essential piece of AA lore: A white light filled Wilson’s hospital room, and God revealed himself to the shattered stockbroker. “It seemed to me, in the mind’s eye, that I was on a mountain and that a wind not of air but of spirit was blowing,” he later said. “And then it burst upon me that I was a free man.” Wilson would never drink again.

At that time, the conventional wisdom was that alcoholics simply lacked moral fortitude. The best science could offer was detoxification with an array of purgatives, followed by earnest pleas for the drinker to think of his loved ones. When this approach failed, alcoholics were often consigned to bleak state hospitals. But having come back from the edge himself, Wilson refused to believe his fellow inebriates were hopeless. He resolved to save them by teaching them to surrender to God, exactly as Thacher had taught him.

Following Thacher's lead, Wilson joined the Oxford Group, a Christian movement that was in vogue among wealthy mainstream Protestants. Headed by an ex-YMCA missionary named Frank Buchman, who stirred controversy with his lavish lifestyle and attempts to convert Adolf Hitler, the Oxford Group combined religion with pop psychology, stressing that all people can achieve happiness through moral improvement. To help reach this goal, the organization's members were encouraged to meet in private homes so they could study devotional literature together and share their inmost thoughts.

In May 1935, while on an extended business trip to Akron, Ohio, Wilson began attending Oxford Group meetings at the home of a local industrialist. It was through the group that he met a surgeon and closet alcoholic named Robert Smith. For weeks, Wilson urged the oft-soused doctor to admit that only God could eliminate his compulsion to drink. Finally, on June 10, 1935, Smith (known to millions today as Dr. Bob) gave in. The date of Dr. Bob’s surrender became the official founding date of Alcoholics Anonymous.

In its earliest days, AA existed within the confines of the Oxford Group, offering special meetings for members who wished to end their dependence on alcohol. But Wilson and his followers quickly broke away, in large part because Wilson dreamed of creating a truly mass movement, not one confined to the elites Buchman targeted. To spread his message of salvation, Wilson started writing what would become AA’s sacred text: Alcoholics Anonymous, now better known as the Big Book.

The core of AA is found in chapter five, entitled “How It Works.” It is here that Wilson lists the 12 steps, which he first scrawled out in pencil in 1939. Wilson settled on the number 12 because there were 12 apostles.

In writing the steps, Wilson drew on the Oxford Group’s precepts and borrowed heavily from William James’ classic The Varieties of Religious Experience, which Wilson read shortly after his belladonna-fueled revelation at Towns Hospital. He was deeply affected by an observation that James made regarding alcoholism: that the only cure for the affliction is “religiomania.” The steps were thus designed to induce an intense commitment, because Wilson wanted his system to be every bit as habit-forming as booze.

The first steps famously ask members to admit their powerlessness over alcohol and to appeal to a higher power for help. Members are then required to enumerate their faults, share them with their meeting group, apologize to those they’ve wronged, and engage in regular prayer or meditation. Finally, the last step makes AA a lifelong duty: “Having had a spiritual awakening as the result of these steps, we tried to carry this message to alcoholics and to practice these principles in all our affairs.” This requirement guarantees not only that current members will find new recruits but that they can never truly “graduate” from the program.

Aside from the steps, AA has one other cardinal rule: anonymity. Wilson was adamant that the anonymous component of AA be taken seriously, not because of the social stigma associated with alcoholism, but rather to protect the nascent organization from ridicule. He explained the logic in a letter to a friend:

[In the past], alcoholics who talked too much on public platforms were likely to become inflated and get drunk again. Our principle of anonymity, so far as the general public is concerned, partly corrects this difficulty by preventing any individual receiving a lot of newspaper or magazine publicity, then collapsing and discrediting AA.

AA boomed in the early 1940s, aided by a glowing Saturday Evening Post profile and the public admission by a Cleveland Indians catcher, Rollie Hemsley, that joining the organization had done wonders for his game. Wilson and the founding members were not quite prepared for the sudden success. “You had really crazy things going on,” says William L. White, author of Slaying the Dragon: The History of Addiction Treatment and Recovery in America. “Some AA groups were preparing to run AA hospitals, and there was this whole question of whether they should have paid AA missionaries. You even had some reports of AA groups drinking beers at their meetings.”

The growing pains spurred Wilson to write AA’s governing principles, known as the 12 traditions. At a time when fraternal orders and churches with strict hierarchies dominated American social life, Wilson opted for something revolutionary: deliberate organizational chaos. He permitted each group to set its own rules, as long as they didn’t conflict with the traditions or the steps. Charging a fee was forbidden, as was the use of the AA brand to endorse anything that might generate revenue. “If you look at this on paper, it seems like it could never work,” White says. “It’s basically anarchy.” But this loose structure actually helped AA flourish. Not only could anyone start an AA group at any time, but they could tailor each meeting to suit regional or local tastes. And by condemning itself to poverty, AA maintained a posture of moral legitimacy.

Despite the decision to forbid members from receiving pay for AA-related activity, it had no problem letting professional institutions integrate the 12 steps into their treatment programs. AA did not object when Hazelden, a Minnesota facility founded in 1947 as “a sanatorium for curable alcoholics of the professional class,” made the steps the foundation of its treatment model. Nor did AA try to stop the proliferation of steps-centered addiction groups from adopting the Anonymous name: Narcotics Anonymous, Gamblers Anonymous, Overeaters Anonymous. No money ever changed hands—the steps essentially served as open source code that anyone was free to build upon, adding whatever features they wished. (Food Addicts Anonymous, for example, requires its members to weigh their meals.)

By the early 1950s, as AA membership reached 100,000, Wilson began to step back from his invention. Deeply depressed and an incorrigible chain smoker, he would go on to experiment with LSD before dying from emphysema in 1971. By that point, AA had become ingrained in American culture; even people who’d never touched a drop of liquor could name at least a few of the steps.

“For nearly 30 years, I have been saying Alcoholics Anonymous is the most effective self-help group in the world,” advice columnist Ann Landers wrote in 1986. “The good accomplished by this fellowship is inestimable … God bless AA.”

There’s no doubt that when AA works, it can be transformative. But what aspect of the program deserves most of the credit? Is it the act of surrendering to a higher power? The making of amends to people a drinker has wronged? The simple admission that you have a problem? Stunningly, even the most highly regarded AA experts have no idea. “These are questions we’ve been trying to answer for, golly, 30 or 40 years now,” says Lee Ann Kaskutas, senior scientist at the Alcohol Research Group in Emeryville, California. “We can’t find anything that completely holds water.”

by Brendan I. Koerner, Wired | Read more:
Photo: Christian Stoll

Rappresentazione della competizione …
via:

Parking Lot by Clarence Holbrook Carter - 1953
via:

Schmooze or Lose

The summer before the 2010 congressional elections, the Democrats’ prospects began to look alarmingly weak. On July 28th, President Barack Obama flew to New York City for two high-priced fund-raisers aimed at replenishing his party’s war chest, largely with money from Wall Street. For a busy President, such events could be a chore. And Obama had never been a Wall Street type. In 1983, Obama, then a recent college graduate who wore a leather jacket and smoked cigarettes, took a job on the periphery of New York’s financial sector: for a year, he worked for Business International, a firm that produced economic trade reports for multinational companies. According to Obama’s mother, he told her that this foray into the corporate world amounted to “working for the enemy,” as David Maraniss recounts in his new biography, “Barack Obama: The Story.” By the time that Obama ran for President, in 2008, his relations with the financial industry had grown warmer, and he attracted more donations from Wall Street leaders than John McCain, his Republican opponent, did. Yet this good feeling did not last, despite the government’s bailout of the banking sector. Many financial titans felt that the President’s attitude toward the “one per cent” was insufficiently admiring, even hostile.

The planning for the fund-raisers seemed to underline this estrangement. Obama’s first event was a 6 P.M. dinner at the Four Seasons. About forty contributors, many of them from Wall Street, had paid thirty thousand dollars each to dine with him. Some of the invitees were disgruntled supporters who felt unfairly blamed for the country’s economic problems, and they wanted to vent about what they considered Obama’s anti-business tone. But the President did not have enough time to hear them out—or even share a meal—because after only an hour he was scheduled to leave for the second fund-raiser, at the downtown home of Anna Wintour, the editor of Vogue. At the Four Seasons, the President could spend about seven minutes per table, each of which accommodated eight donors. This was fund-raising as speed-dating.

The President’s staff knew that Obama wouldn’t have a moment to eat properly that day, and that it would be hard for him to do so while being the focus of attention at the fund-raisers. So time was set aside at the Four Seasons for Obama to grab a bite, in a “ready room,” with Reggie Love, his personal aide, and Valerie Jarrett, his close friend, senior adviser, and liaison to the business community. This arrangement, however, inadvertently left the impression that Obama preferred his staff’s company to that of the paying guests.

“Obama is very meticulous—they have clockwork timing,” one of the attendees says. “After a few minutes at each table, a staffer would come and tap him on the shoulder, and he’d get up. But when people pay thirty thousand they want to talk to you, and take a picture with you. He was trying to be fair, and that’s great, but every time he started to have a real conversation he got tapped.”

The attendee appreciates that such events must get tiresome for Obama. “Each person, at each table, says to the President, ‘Here’s what you have to do . . .’ At the next table, it’s the same.” Even so, he noted that Bill Clinton—who set the gold standard for the art form known as “donor maintenance”—would have presided over the same event with more enthusiasm: “He would have stayed an extra hour.” After that Four Seasons dinner, the attendee adds, “people were a little mad.”

Top Obama donors began grumbling on the first day of the Administration. “The swearing-in was the beginning of pissing off the donors,” a longtime Washington fund-raiser says. “During the inaugural weekend, they didn’t have the capacity to handle all the people who had participated at the highest levels, because there were so many.” One middle-aged widow, from whom the fund-raiser had secured fifty thousand dollars, got four tickets to the swearing-in, but none of them were together. “She was so offended!” the fund-raiser says. “And I got no credit, by the way, for bringing her in. Important donors need to be cultivated so that they’re there four years later.”

As the Washington fund-raiser sees it, the White House social secretary must spend the first year of an Administration saying, “Thank you, thank you, thank you.” Instead, the fund-raiser says, Obama’s first social secretary, Desirée Rogers—a stylish Harvard Business School graduate and a friend from Chicago—made some donors feel unwelcome. Anita McBride, the chief of staff to Laura Bush, says, “It’s always a very delicate balance at the White House. Do donors think they are buying favors or access? You have to be very conscious of how you use the trappings of the White House. But you can go too far in the other direction, too. Donors are called on to do a lot. It doesn’t take a lot to say thank you.” One of the simplest ways, she notes, is to provide donors with “grip-and-grin” photographs with the President. “It doesn’t require a lot of effort on anyone’s part, but there’s been a reluctance to do it” in the Obama White House. “That can produce some hurt feelings.”

Big donors were particularly offended by Obama’s reluctance to pose with them for photographs at the first White House Christmas and Hanukkah parties. Obama agreed to pose with members of the White House press corps, but not with donors, because, a former adviser says, “he didn’t want to have to stand there for fourteen parties in a row.” This decision continues to provoke disbelief from some Democratic fund-raisers. “It’s as easy as falling off a log!” one says. “They just want a picture of themselves with the President that they can hang on the bathroom wall, so that their friends can see it when they take a piss.” Another says, “Oh, my God—the pictures, the fucking pictures!” (In 2010, the photograph policy was reversed; Rogers left the Administration that year.)

Creating a sense of intimacy with the President is especially important with Democratic donors, a frustrated Obama fund-raiser argues: “Unlike Republicans, they have no business interest being furthered by the donation—they just like to be involved. So it makes them more needy. It’s like, ‘If you’re not going to deregulate my industry, or lower my taxes, can’t I at least get a picture?’ ”

by Jane Mayer, New Yorker |  Read more:
Illustration: Barry Blitt

Deluded Individualism

There is a curious passage early in Freud's “Ego and the Id” where he remarks that the id behaves “as if” it were unconscious. The phrase is puzzling, but the meaning is clear: the id is the secret driver of our desires, the desires that animate our conscious life, but the ego does not recognize it as such. The ego — what we take to be our conscious, autonomous self — is ignorant of the agency of the id, and sees itself in the driver's seat instead. Freud offers the following metaphor: the ego is like a man on horseback, struggling to contain the powerful beast beneath; to the extent that the ego succeeds in guiding this beast, it's only by “transforming the id's will into action as if it were its own.”

By Freud’s account, conscious autonomy is a charade. “We are lived,” as he puts it, and yet we don’t see it as such. Indeed, Freud suggests that to be human is to rebel against that vision — the truth. We tend to see ourselves as self-determining, self-conscious agents in all that we decide and do, and we cling to that image. But why? Why do we resist the truth? Why do we wish — strain, strive, against the grain of reality — to be autonomous individuals, and see ourselves as such?

Perhaps Freud is too cynical regarding conscious autonomy, but he is right to question our presumption to it. He is right to suggest that we typically — wrongly — ignore the extent to which we are determined by unknown forces, and overestimate our self-control. The path to happiness for Freud, or some semblance of it in his stormy account of the psyche, involves accepting our basic condition. But why do we presume individual agency in the first place? Why do we insist on it stubbornly, irrationally, often recklessly?

I was reminded of Freud's paradox by a poignant article in The Times a few months back, which described a Republican-leaning district in Minnesota, and its constituents' conflicted desire to be self-reliant (“Even Critics of the Safety Net Increasingly Depend on It,” Feb. 11). The article cited a study from Dartmouth political science professor Dean Lacy, which revealed that, though Republicans call for deep cuts to the safety net, their districts rely more on government support than their Democratic counterparts.

In Chisago County, Minn., The Times’s reporters spoke with residents who supported the Tea Party and its proposed cuts to federal spending, even while they admitted they could not get by without government support. Tea Party aficionados, and many on the extreme right of the Republican party for that matter, are typically characterized as self-sufficient middle class folk, angry about sustaining the idle poor with their tax dollars. Chisago County revealed a different aspect of this anger: economically struggling Americans professing a robust individualism and self-determination, frustrated with their failures to achieve that ideal.

Why the stubborn insistence on self-determination, in spite of the facts? One might say there is something profoundly American in this. It’s our fierce individualism shining through. Residents of Chisago County are clinging to notions of past self-reliance before the recession, before the welfare state. It’s admirable in a way. Alternately, it evokes the delusional autonomy of Freud’s poor ego.

These people, like many across the nation, rely on government assistance, but pretend they don’t. They even resent the government for their reliance. If they looked closely though, they’d see that we are all thoroughly saturated with government assistance in this country: farm subsidies that lower food prices for us all, mortgage interest deductions that disproportionately favor the rich, federal mortgage guarantees that keep interest rates low, a bloated Department of Defense that sustains entire sectors of the economy and puts hundreds of thousands of people to work. We can hardly fathom the depth of our dependence on government, and pretend we are bold individualists instead. (...)

Thanks to a decades-long safety net, we have forgotten the trials of living without it. This is why, the historian Tony Judt argued, it’s easy for some to speak fondly of a world without government: we can’t fully imagine or recall what it’s like. We can’t really appreciate the horrors Upton Sinclair witnessed in the Chicago slaughterhouses before regulation, or the burden of living without Social Security and Medicare to look forward to. Thus, we can entertain nostalgia for a time when everyone pulled his own weight, bore his own risk, and was the master of his destiny. That time was a myth. But the notion of self-reliance is also a fallacy.

by Firmin Debrabander, NY Times |  Read more:
Illustration: Leif Parsons