Wednesday, August 22, 2012

The Beatles - Love Full Album (HD)


In 2006, George Martin and his son Giles were presented with a new project: a total mashup of the Beatles' career into just over an hour, to be used for the new Cirque du Soleil show Love. The Grammy-winning album uses only recorded Beatles material to create a soundbed from the original two-, four-, and eight-track tapes. Using over 130 commercial and demo recordings, Love is, according to Giles Martin, "a way of re-living the whole Beatles musical lifespan in a very condensed period." (80 min)


DNA Data Storage


A bioengineer and a geneticist at Harvard’s Wyss Institute have successfully stored 5.5 petabits of data — around 700 terabytes — in a single gram of DNA, smashing the previous DNA data density record by a thousand times.

The work, carried out by George Church and Sri Kosuri, basically treats DNA as just another digital storage device. Instead of binary data being encoded as magnetic regions on a hard drive platter, strands of DNA that store 96 bits are synthesized, with each of the bases (TGAC) representing a binary value (T and G = 1, A and C = 0). (...)
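[ed. The scheme is simple enough to sketch. Here is a minimal, illustrative Python version of the one-bit-per-base encoding described above, not the researchers' actual pipeline; in particular, the free choice between the two bases for each bit value is my own stand-in.]

```python
import random

# One bit per base, as described above: T and G encode 1, A and C encode 0.
# Which of the two bases represents a given bit is picked at random here;
# that freedom is purely illustrative (the real encoding reportedly uses it
# to avoid sequences that are hard to synthesize).
BASE_TO_BIT = {"T": "1", "G": "1", "A": "0", "C": "0"}

def encode(data: bytes) -> str:
    """Map each bit of `data` to one DNA base (8 bases per byte)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(random.choice("TG" if b == "1" else "AC") for b in bits)

def decode(strand: str) -> bytes:
    """Recover the original bytes from a strand of bases."""
    bits = "".join(BASE_TO_BIT[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

# A 96-bit strand, as in the experiment, carries 12 bytes of payload.
message = b"Hello, DNA!!"  # 12 bytes -> 96 bases
strand = encode(message)
assert decode(strand) == message
print(len(strand), strand[:16] + "...")
```

[In the actual experiment the data was split across a great many such short strands, each reportedly carrying addressing information so the fragments can be sequenced and reassembled in order.]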

Scientists have been eyeing up DNA as a potential storage medium for a long time, for three very good reasons: It’s incredibly dense (you can store one bit per base, and a base is only a few atoms large); it’s volumetric (beaker) rather than planar (hard disk); and it’s incredibly stable — where other bleeding-edge storage media need to be kept in sub-zero vacuums, DNA can survive for hundreds of thousands of years in a box in your garage.

It is only with recent advances in microfluidics and labs-on-a-chip that synthesizing and sequencing DNA has become an everyday task, though. While it took years for the original Human Genome Project to analyze a single human genome (some 3 billion DNA base pairs), modern lab equipment with microfluidic chips can do it in hours. Now this isn’t to say that Church and Kosuri’s DNA storage is fast — but it’s fast enough for very-long-term archival.

Just think about it for a moment: One gram of DNA can store 700 terabytes of data. That’s 14,000 50-gigabyte Blu-ray discs… in a droplet of DNA that would fit on the tip of your pinky. To store the same kind of data on hard drives — the densest storage medium in use today — you’d need 233 3TB drives, weighing a total of 151 kilos. In Church and Kosuri’s case, they have successfully stored around 700 kilobytes of data in DNA — Church’s latest book, in fact — and proceeded to make 70 billion copies (which they claim, jokingly, makes it the best-selling book of all time!) totaling 44 petabytes of data stored.
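[ed. The arithmetic is easy to verify. A quick sanity check in Python, assuming decimal units, which is the storage-marketing convention:]

```python
TB = 10**12                     # decimal terabyte, in bytes
PETABIT = 10**15 / 8            # one petabit, in bytes

print(5.5 * PETABIT / TB)       # 687.5 -> "around 700 terabytes"
print(700 * TB / (50 * 10**9))  # 14000.0 fifty-gigabyte Blu-ray discs
print(700 * TB / (3 * TB))      # 233.33... -> the 233 3TB drives quoted
```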

Looking forward, they foresee a world where biological storage would allow us to record anything and everything without reservation. Today, we wouldn’t dream of blanketing every square meter of Earth with cameras, and recording every moment for all eternity/human posterity — we simply don’t have the storage capacity. There is a reason that backed up data is usually only kept for a few weeks or months — it just isn’t feasible to have warehouses full of hard drives, which could fail at any time. If the entirety of human knowledge — every book, uttered word, and funny cat video — can be stored in a few hundred kilos of DNA, though… well, it might just be possible to record everything (hello, police state!)

by Sebastian Anthony, ExtremeTech |  Read more:

The Death of the Cyberflâneur


The other day, while I was rummaging through a stack of oldish articles on the future of the Internet, an obscure little essay from 1998 — published, of all places, on a Web site called Ceramics Today — caught my eye. Celebrating the rise of the “cyberflâneur,” it painted a bright digital future, brimming with playfulness, intrigue and serendipity, that awaited this mysterious online type. This vision of tomorrow seemed all but inevitable at a time when “what the city and the street were to the Flâneur, the Internet and the Superhighway have become to the Cyberflâneur.”

Intrigued, I set out to discover what happened to the cyberflâneur. While I quickly found other contemporaneous commentators who believed that flânerie would flourish online, the sad state of today’s Internet suggests that they couldn’t have been more wrong. Cyberflâneurs are few and far between, while the very practice of cyberflânerie seems at odds with the world of social media. What went wrong? And should we worry?

Engaging the history of flânerie may be a good way to start answering these questions. Thanks to the French poet Charles Baudelaire and the German critic Walter Benjamin, both of whom viewed the flâneur as an emblem of modernity, his figure (and it was predominantly a “he”) is now firmly associated with 19th-century Paris. The flâneur would leisurely stroll through its streets and especially its arcades — those stylish, lively and bustling rows of shops covered by glass roofs — to cultivate what Honoré de Balzac called “the gastronomy of the eye.”

While not deliberately concealing his identity, the flâneur preferred to stroll incognito. “The art that the flâneur masters is that of seeing without being caught looking,” the Polish sociologist Zygmunt Bauman once remarked. The flâneur was not asocial — he needed the crowds to thrive — but he did not blend in, preferring to savor his solitude. And he had all the time in the world: there were reports of flâneurs taking turtles for a walk.

The flâneur wandered in the shopping arcades, but he did not give in to the temptations of consumerism; the arcade was primarily a pathway to a rich sensory experience — and only then a temple of consumption. His goal was to observe, to bathe in the crowd, taking in its noises, its chaos, its heterogeneity, its cosmopolitanism. Occasionally, he would narrate what he saw — surveying both his private self and the world at large — in the form of short essays for daily newspapers.

It’s easy to see, then, why cyberflânerie seemed such an appealing notion in the early days of the Web. The idea of exploring cyberspace as virgin territory, not yet colonized by governments and corporations, was romantic; that romanticism was even reflected in the names of early browsers (“Internet Explorer,” “Netscape Navigator”).

Online communities like GeoCities and Tripod were the true digital arcades of that period, trading in the most obscure and the most peculiar, without any sort of hierarchy ranking them by popularity or commercial value. Back then eBay was weirder than most flea markets; strolling through its virtual stands was far more pleasurable than buying any of the items. For a brief moment in the mid-1990s, it did seem that the Internet might trigger an unexpected renaissance of flânerie.

However, anyone entertaining such dreams of the Internet as a refuge for the bohemian, the hedonistic and the idiosyncratic probably didn’t know the reasons behind the disappearance of the original flâneur.

by Evgeny Morozov, NY Times |  Read more:
Illustration: Gustave Caillebotte's "Paris Street; Rainy Day," from 1877.

Massive Attack feat. Liz Fraser


The Playboy Interview: Richard Dawkins


Richard Dawkins, the patron saint of nonbelievers, caused a stir earlier this year during a debate with the Archbishop of Canterbury, who noted that his opponent is often described as the world’s most famous atheist. “Not by me,” Dawkins replied before providing his standard explanation—a supreme being is possible but highly improbable—which led a London newspaper to proclaim that the world’s most notorious skeptic was hedging his bets. Far from it. Dawkins, at 71, remains an unbending and sharp-tongued critic of religious dogmatism. Like any scientist who challenges the Bible and its lyrical version of creation, he spends a great deal of time defending Charles Darwin’s theory that all life, including humans, evolved over eons through natural selection, rather than being molded 10,000 years ago by an intelligent but unseen hand.

Dawkins, who retired from Oxford University in 2008 after 13 years as a professor of public understanding of science (meaning he lectured and wrote books), stepped into the limelight in 1976, at the age of 35, with the publication of The Selfish Gene. The book, which has sold more than a million copies, argues persuasively that evolution takes place at the genetic level; individuals die, but the fittest genes survive. Dawkins has since written 10 more best-sellers, including most recently The Magic of Reality: How We Know What’s Really True. Since 9/11 he has become more outspoken about his skepticism, culminating in The God Delusion, which provides the foundation for his continuing debates with believers. Published in 2006, the book has become Dawkins’s most popular, available in 31 languages with 2 million copies sold. That same year he founded the Richard Dawkins Foundation for Reason and Science “to support scientific education, critical thinking and evidence-based understanding of the natural world in the quest to overcome religious fundamentalism, superstition, intolerance and suffering.” (...)

Excerpts:

PLAYBOY: Albert Einstein and Stephen Hawking reference God in their writings. Are they using the word in the sense of an intelligent designer?

DAWKINS: Certainly not. They use god in a poetic, metaphorical sense. Einstein in particular loved using the word to convey an idea of mystery, which I think all decent scientists do. But nowadays we’ve learned better than to use the word god because it will be willfully misunderstood, as Einstein was. And poor Einstein got quite cross about it. “I do not believe in a personal god,” he said over and over again. In a way he was asking for it. Hawking uses it in a similar way in A Brief History of Time. In his famous last line he says that if we understood the universe, “then we would know the mind of God.” Once again he is using god in the Einsteinian, not the religious sense. And so Hawking’s The Grand Design, in which he says the universe could have come from nothing, is not him turning away from God; his beliefs are exactly the same.

PLAYBOY: You’ve had a lot of fun deconstructing the idea of the intelligent designer. You point out that God made a cheetah fast enough to catch a gazelle and a gazelle fast enough to outrun a cheetah—

DAWKINS: Yes. Is God a sadist?

PLAYBOY: And bad design such as the fact we breathe and eat through the same tube, making it easy to choke to death.

DAWKINS: Or the laryngeal nerve, which loops around an artery in the chest and then goes back up to the larynx.

PLAYBOY: Not very efficient.

DAWKINS: Not in a giraffe, anyway. (...)

PLAYBOY: What will happen when you die?

DAWKINS: Well, I shall either be buried or be cremated.

PLAYBOY: Funny. But without faith in an afterlife, in what do you take comfort in times of despair?

DAWKINS: Human love and companionship. But in more thoughtful, cerebral moments, I take—comfort is not quite the right word, but I draw strength from reflecting on what a privilege it is to be alive and what a privilege it is to have a brain that’s capable in its limited way of understanding why I exist and of reveling in the beauty of the world and the beauty of the products of evolution. The magnificence of the universe and the sense of smallness that gives us in space and in geologically deep time is humbling but in a strangely comforting way. It’s nice to feel you’re part of a hugely bigger picture. (...)

PLAYBOY: We hear constantly that America is a Christian nation and that the founding fathers were all Christians.

DAWKINS: They were deists. They didn’t believe in a personal god, or one who interferes in human affairs. And they were adamant that they did not want to found the United States as a Christian nation.

PLAYBOY: But you hear quite often that if you let atheists run things you end up with Hitler and Stalin.

DAWKINS: Hitler wasn’t an atheist; he was a Roman Catholic. But I don’t care what he was. There is no logical connection between atheism and doing bad things, nor good things for that matter. It’s a philosophical belief about the absence of a creative intelligence in the world. Anybody who thinks you need religion in order to be good is being good for the wrong reason. I’d rather be good for moral reasons. Morals were here before religion, and morals change rather rapidly in spite of religion. Even people who rely on the Bible use nonbiblical criteria. If your criteria are scriptural, you have no basis for choosing the verse that says turn the other cheek rather than the verse that says stone people to death. So you pick and choose without guidance from the Bible.

by Chip Rowe, Playboy |  Read more:

Metamotivation


[ed. I was familiar with Maslow's general hierarchy of needs but not the term Metamotivation, i.e., striving to realize one's fullest potential. I wonder how a person's outlook on life and personality are affected by an inability to achieve that need (if it is felt). Furthermore, since basic needs are fluid (like health, friendship, economic security, intimacy, etc.), is metamotivation a temporary luxury (and ultimately an unsustainable goal)?]

Maslow's hierarchy of needs is often portrayed in the shape of a pyramid, with the largest and most fundamental levels of needs at the bottom, and the need for self-actualization at the top. While the pyramid has become the de facto way to represent the hierarchy, Maslow himself never used a pyramid to describe these levels in any of his writings on the subject.

The most fundamental and basic four layers of the pyramid contain what Maslow called "deficiency needs" or "d-needs": esteem, friendship and love, security, and physical needs. With the exception of the most fundamental (physiological) needs, if these "deficiency needs" are not met, the body gives no physical indication but the individual feels anxious and tense. Maslow's theory suggests that the most basic level of needs must be met before the individual will strongly desire (or focus motivation upon) the secondary or higher level needs. Maslow also coined the term Metamotivation to describe the motivation of people who go beyond the scope of the basic needs and strive for constant betterment. Metamotivated people are driven by B-needs (Being Needs), instead of deficiency needs (D-Needs).

via: Wikipedia

The Next Wave for the Wristwatch


Cellphones have already muscled onto watches’ turf as a time-telling tool. Now, some of the biggest technology companies are eyeing your wrist, too.

Companies like Apple, Nike and Sony, along with dozens of start-ups, hope to strap a device on your wrist.

It is quite a disruption for the wristwatch, which has not actually been around all that long. Though said to have been invented in 1868 by the Swiss watchmaker Patek Philippe, it didn’t really catch on until after World War I. Before that, people carried watches in their pockets or on chains.

“Watch manufacturers were asking themselves this in the 1900s, if it made sense to have a watch in their pocket,” said Blaise Bertrand, industrial design director for IDEO, a design company. “I think that’s the same question that is being asked now, but it’s in a completely different context with the smartphone in our pockets.”

The new wrist devices won’t replace smartphones, but rather connect to them. Most will continue the basic task of telling the time, while eliminating the need to dig a smartphone out of your pocket or purse. But they will provide far more information than the most advanced G-Shock watch available today, or the most expensive chronometer.

For example, Sony this year released the Smartwatch, a two-inch-square screen that can display e-mails, Twitter posts and other pieces of text, all pulled from an Android smartphone. Nike FuelBand, a black band with an array of colored lights, measures the energy you exert on a daily basis and sends it to a smartphone. (It also tells the time.) Jawbone sells the Up, a unisex bracelet that tracks a user’s daily activity and sends the information to an iPhone application. Pebble, an innovative watch that can play music and display text, the weather and other information from a phone, caught the public’s imagination on Kickstarter, where it raised $10.3 million. It is expected to arrive next year.

by Nick Bilton, NY Times |  Read more:

Tuesday, August 21, 2012


Xu Wei (1521-1593)
via:

The Vast Left-Wing Conspiracy Is on Your Screen

Two decades ago, conservative anger against popular culture burned so intensely that it seemed at the time that Hollywood had come to fill the space in the right-wing fear center vacated by the end of Communism. The anger came out in an endless series of skirmishes. In 1989, after watching an episode of the sitcom Married With Children that included a gay man and a woman removing her bra, Michigan housewife Terry Rakolta (whose sister, Ronna Romney, married the brother of … yes, him) launched a national crusade against the show. Dan Quayle gave a speech denouncing the single-motherhood of Murphy Brown. Advertising boycotts by such groups as Christian Leaders for Responsible Television or Rakolta’s own Americans for Responsible Television were a regular occurrence, as were anti-Hollywood rallies that drew thousands of protesters.

The country was “involved in a ­Kulturkampf,” declared Illinois Republican congressman Henry Hyde, a “war between cultures and a war about the meaning of culture.” Liberals, too, considered their way of life threatened by the conservative campaign against Hollywood. “We are in the midst of a culture war,” announced the vice-president of People for the American Way, a group founded by liberal producer Norman Lear. In his keynote speech at the 1992 Republican convention, Pat Buchanan floridly exhorted his party to fight (or, in its view, fight back) in a “cultural war, as critical to the kind of nation we will one day be as was the Cold War itself.”

When Buchanan delivered that terrifying (or exhilarating) speech in Houston, it would have been impossible to imagine that twenty years later, all traces of this war would have disappeared from the national political scene. If you visit Mitt Romney’s campaign website, the issues tab labeled “Values” lists Romney’s unwavering opposition to abortion and gay marriage, and Bushian opposition to stem-cell research, but nary a glancing reference can be found to the state of the culture, let alone a full-throated denunciation of Hollywood filth merchants. An immediate and easy explanation is that popular culture has ceased its provocations, or that the culture war has been shoved aside by the war over the role of government in the economy. The more uncomfortable reality is that the culture war is an ongoing liberal rout. Hollywood is as liberal as ever, and conservatives have simply despaired of changing it.

You don’t have to be an especially devoted consumer of film or television (I’m not) to detect a pervasive, if not total, liberalism. Americans for Responsible Television and Christian Leaders for Responsible Television would be flipping out over the modern family in Modern Family, not to mention the girls of Girls and the gays of Glee, except that those groups went defunct long ago. The liberal analysis of the economic crisis—that unregulated finance took wild gambles—has been widely reflected, even blatantly so, in movies like Margin Call, Too Big to Fail, and the Wall Street sequel. The conservative view that all blame lies with regulations forcing banks to lend to poor people has not, except perhaps in the amateur-hour production of Atlas Shrugged. The muscular Rambo patriotism that briefly surged in the eighties, and seemed poised to return after 9/11, has disappeared. In its place we have series like Homeland, which probes the moral complexities of a terrorist’s worldview, and action stars like Jason Bourne, whose enemies are not just foreign baddies but also paranoid Dick Cheney figures. The conservative denial of climate change, and the low opinion of environmentalism that accompanies it, stands in contrast to cautionary end-times tales like Ice Age 2: The Meltdown and the tree-hugging mysticism of Avatar. The decade has also seen a revival of political films and shows, from the Aaron Sorkin oeuvre through Veep and The Campaign, both of which cast oilmen as the heavies. Even The Muppets features an evil oil driller stereotypically named “Tex Richman.”

In short, the world of popular culture increasingly reflects a shared reality in which the Republican Party is either absent or anathema. That shared reality is the cultural assumptions, in particular, of the younger voters whose support has become the bedrock of the Democratic Party.

A member of President Obama’s reelection team recently told New York’s John Heilemann that it plans on painting its opponent as a man out of time—Mitt Romney is “the fifties, he is retro, he is backward.” This may sound at first blush like a particular reference to Romney’s uptight persona, but the line of attack would have been available against any Republican nominee—Rick Santorum, Michele Bachmann, Rick Perry, or any other of the dour reactionaries who might have snatched the nomination. The message is transmitted in a thousand ways, both obvious and obscure: Tina Fey’s devastating portrayal of Sarah Palin. Obama appearing on Late Night With Jimmy Fallon to “slow jam the news,” which meant to recite his campaign message of the week. The severed head of George W. Bush appearing on Game of Thrones. An episode of Mad Men that included the odd throwaway line “Romney’s a clown,” putatively to describe George Romney, who was anything but.

When Joe Biden endorsed gay marriage in May, he cited Will & Grace as the single most important driving force in transforming public opinion on the subject. In so doing he actually confirmed the long-standing fear of conservatives—that a coterie of Hollywood elites had undertaken an invidious and utterly successful propaganda campaign, and had transmuted the cultural majority into a minority. Set aside the substance of the matter and consider the process of it—that is, think of it from the conservative point of view, if you don’t happen to be one. Imagine that large chunks of your entertainment mocked your values and even transformed once-uncontroversial beliefs of yours into a kind of bigotry that might be greeted with revulsion.

You’d probably be angry, too.

by Jonathan Chait, New York Magazine | Read more:
Photo: Courtesy of Warner Bros. Pictures. Photo-illustration by Gluekit

The Sleep Racket

[ed. See also: Sleeping pills: Britain's hidden addiction.]

Over the last year you could have made a pile of money by betting on a little company in the business of ... insomnia. Shares of ResMed, in Poway, Calif., leaped 44%, selling $465 million worth of “sleep-disordered-breathing” equipment--face masks, nasal pillows, humidifiers and so-called continuous positive airway pressure devices. “It’s a monster market; it’s bigger than Ben-Hur,” says Peter C. Farrell, ResMed’s voluble chief executive. You’d have done even better as an original investor in Pacific Sleep Medicine Services, a small chain of sleep centers, mostly in southern California. One of its founders who chipped in an undisclosed amount in 1998 saw his ante jump a hundredfold, says Tom J. Wiedel, Pacific Sleep’s chief financial officer.

A bad night’s sleep is reason for a very big business. Sleeping pills, led by Ambien, rack up more than $2 billion a year in the U.S. Then there is the revenue from overnight stays at sleep clinics, over-the-counter pills, a parade of gimmicks and a thriving business for sleep specialists. For four bucks you can pick up the Insomnia Relief Scent Inhaler (lavender, rosemary, chamomile and vetiver, a grass root) from Earth Solutions of Atlanta. The Original Sound Pillow ($50) comes with two thin speakers and a headphone jack for your iPod or Discman. Spend a bit more ($147) for an MP3-like player called Pzizz that plays music, voices and tones geared to helping you fall asleep. Dreamate, an $80 device worn as a bracelet, supposedly delivers a 6,000rpm massage to the “sleeping golden triangle,” a.k.a. the wrist. Sealy’s Stearns & Foster unit is now selling a $10,000 mattress fit for a princess afflicted with insomnia. It is topped with latex and stitched with real silver threads. Hypnos goes one better, with a king-size mattress filled with layers of silk, cashmere and lamb’s wool and topped with a $20,000 sticker. And coming this spring from Matsushita Electric Works: a room tricked out to induce “deep sleep” within 30 minutes. It includes a reclining bed, sound-absorbent walls and somniferous sights and sounds from a wide-screen TV.

“Sleep is the new sex.” So says psychologist Arthur J. Spielman, associate director of the Center for Sleep Disorders Medicine & Research at New York Methodist Hospital in Brooklyn, N.Y. “People want it, need it, can’t get enough of it.” The same could be said of profits. Spielman is coauthor of The Insomnia Answer ($24, Perigee Books, 2006). He is also developing light-delivering goggles that are supposed to help people reset the circadian rhythms that govern when they nod off and wake up, so they fall asleep faster and stay asleep longer.

Sleep is also the new snake oil--the promise of a good snooze from a book or a bed or a bottle. It’s easy pickings. Who isn’t somewhat slumber-deprived? Given the demands of work and family, no one gets “enough” sleep, whatever that is. The Morpheus mongers point to all kinds of studies on their behalf. The number of Americans who say they sleep less than six hours a night--16% in 2005, compared with 12% in 1998--is on the rise, claims the National Sleep Foundation, which, coincidentally, gets funding from pharmaceutical companies. According to the American Academy of Sleep Medicine, there are 81 chronic sleeping disorders, from apnea, which causes interrupted breathing, to restless leg syndrome. No wonder sleep labs are popping up everywhere--650 and counting, compared with just a few in the mid-1970s.

According to the National Institutes of Health, sleeplessness creates $16 billion in annual health care expenses and $50 billion in lost productivity. Scientists are finding that chronically reduced or disrupted sleep may increase the risk of obesity, diabetes and cardiovascular disease. “We know from all the research that sleep is just as important to overall health as exercise and diet,” says Carl Hunt, special assistant to the director of the National Heart, Lung & Blood Institute at the NIH.

by Melanie Wells, Forbes (2006) |  Read more:

Kamisaka Sekka (1866 - 1942) Japanese Woodblock Print
Rolling Hillside
Sekka’s A World of Things Series (Momoyogusa)
via:

Monday, August 20, 2012

Cat Power


Spirited Away (2002)

I was so fortunate to meet Miyazaki at the 2002 Toronto film festival. I told him I love the “gratuitous motion” in his films; instead of every movement being dictated by the story, sometimes people will just sit for a moment, or sigh, or gaze at a running stream, or do something extra, not to advance the story but only to give the sense of time and place and who they are.

“We have a word for that in Japanese,” he said. “It’s called ‘ma.’ Emptiness. It’s there intentionally.” He clapped his hands three or four times. “The time in between my clapping is ‘ma.’ If you just have non-stop action with no breathing space at all, it’s just busyness.”

I think that helps explain why Miyazaki’s films are more absorbing than the frantic action in a lot of American animation. “The people who make the movies are scared of silence,” he said, “so they want to paper and plaster it over. They’re worried that the audience will get bored. But just because it’s 80 percent intense all the time doesn’t mean the kids are going to bless you with their concentration. What really matters is the underlying emotions—that you never let go of those.

“What my friends and I have been trying to do since the 1970’s is to try and quiet things down a little bit; don’t just bombard them with noise and distraction. And to follow the path of children’s emotions and feelings as we make a film. If you stay true to joy and astonishment and empathy you don’t have to have violence and you don’t have to have action. They’ll follow you. This is our principle.”

He said he has been amused to see a lot of animation in live-action superhero movies. “In a way, live action is becoming part of that whole soup called animation. Animation has become a word that encompasses so much, and my animation is just a little tiny dot over in the corner. It’s plenty for me.”

It’s plenty for me, too.

by Roger Ebert, Chicago Sun Times |  Read more:

Where Has the Retail Investor Gone?

Lots of folks are wondering what happened to the Main Street-mom-and-pop retail investors. They seem to have taken their ball and gone home. I don’t blame them for feeling put upon, but it might be instructive to figure out why. Perhaps it could even help us determine what this means for risk capital.

We see evidence of this all over the place: The incredibly light volume of stock trading; the abysmal television ratings of CNBC; the closing of investing magazines such as Smart Money, whose final print issue is on newsstands as it transitions to a digital format; the dearth of stock chatter at cocktail parties. Why, it is almost as if America has fallen out of love with equities.

Given the events of the past decade and a half, this should come as no surprise. Average investors have seen not one but two equity collapses (2000 and 2008). They got caught in the real estate boom and bust. Accredited investors (i.e., the wealthier ones) also discovered that venture capital and private equity were no sure thing either. The Facebook IPO may have been the last straw.

What has driven the typical investor away from equities?

The short answer is that there is no single answer. It is complex, not reducible to single variable analysis. This annoys pundits who thrive on dumbing down complex and nuanced issues to easily digestible sound bites. Television is not particularly good at subtlety, hence the overwhelming tendency for shout-fests and silly bull/bear debates.

The factors that have been weighing on people-formerly-known-as-stock-investors are many. Consider the top 10 reasons investors are unenthused about the stock market:

1. Secular cycle: As we have discussed before, there are long-term cycles of alternating bull and bear markets. The current bear market that began in March 2000 has provided lots of ups and downs — but no lasting gains. Markets are effectively unchanged since 1999 (the Nasdaq is off only 40 percent from its 2000 peak).

The way secular bear markets end is with investors ignoring stocks, enormous P/E multiple compression and bargains galore. Bond king Bill Gross’s Death of the Cult of Equities is a good sign we are getting closer to the final denouement.

2. Psychology: Investors are scarred and scared. They have been scarred by the 57 percent crash in the major indexes from the 2007 peak to the 2009 bottom. They are scared to get back into equities because that is their most recent experience, and it has affected them deeply. While this psychological shift from love to hate to indifference is a necessary part of working toward the end of a secular bear, it is no fun for them — or anyone who trades or invests for a living.

3. Risk on/risk off: Let’s be brutally honest — the fundamentals have been utterly trumped by unprecedented central bank intervention. While this may be helping the wounded bank sector, it is not doing much for long-term investors in fixed income or equities. The Fed’s dual mandate of maximum employment and stable prices seems to have a newer unspoken goal: Driving risk asset prices higher.

When investors can no longer fashion a thesis other than “Buy when the Fed rolls out the latest bailout,” it takes a toll on psychology, and scares them away.

by Barry Ritholtz, Washington Post |  Read more:

Marvin Gaye


Finally the full version of Marvin's funk classic! Enjoy!

Augusta National Adds First Two Female Members

[ed. And in other news today, the temperature in Hell is reported to have dipped below 32 degrees.]

For the first time in its 80-year history, Augusta National Golf Club has female members.

The home of the Masters, under increasing criticism the last decade because of its all-male membership, invited former Secretary of State Condoleezza Rice and South Carolina financier Darla Moore to become the first women in green jackets when the club opens for a new season in October.

Both women accepted.

"This is a joyous occasion," Augusta National chairman Billy Payne said Monday.

The move likely ends a debate that intensified in 2002 when Martha Burk of the National Council of Women's Organizations urged the club to include women among its members. Former club chairman Hootie Johnson stood his ground, even at the cost of losing Masters television sponsors for two years, when he famously said Augusta National might one day have a woman in a green jacket, "but not at the point of a bayonet."

The comment took on a life of its own, becoming either a slogan of the club's resolve not to give in to public pressure or a sign of its sexism, depending on which side of the debate was interpreting it.

"Oh my God. We won," Burk said. "It's about 10 years too late for the boys to come into the 20th century, never mind the 21st century. But it's a milestone for women in business."

Payne, who took over as chairman in 2006 when Johnson retired, said consideration for new members is deliberate and private, and that Rice and Moore were not treated differently from other new members. Even so, he took the rare step of announcing two of the latest members to join because of the historical significance.

"These accomplished women share our passion for the game of golf and both are well known and respected by our membership," Payne said in a statement. "It will be a proud moment when we present Condoleezza and Darla their green jackets when the club opens this fall. This is a significant and positive time in our club's history and, on behalf of our membership, I wanted to take this opportunity to welcome them and all of our new members into the Augusta National family."

by Doug Ferguson, AP |  Read more:

DS Super (by uncertainworld)
via:

Secret of AA: After 75 Years, We Don’t Know How It Works


AA originated on the worst night of Bill Wilson’s life. It was December 14, 1934, and Wilson was drying out at Towns Hospital, a ritzy Manhattan detox center. He’d been there three times before, but he’d always returned to drinking soon after he was released. The 39-year-old had spent his entire adult life chasing the ecstasy he had felt upon tasting his first cocktail some 17 years earlier. That quest destroyed his career, landed him deeply in debt, and convinced doctors that he was destined for institutionalization.

Wilson had been quite a mess when he checked in the day before, so the attending physician, William Silkworth, subjected him to a detox regimen known as the Belladonna Cure—hourly infusions of a hallucinogenic drug made from a poisonous plant. The drug was coursing through Wilson’s system when he received a visit from an old drinking buddy, Ebby Thacher, who had recently found religion and given up alcohol. Thacher pleaded with Wilson to do likewise. “Realize you are licked, admit it, and get willing to turn your life over to God,” Thacher counseled his desperate friend. Wilson, a confirmed agnostic, gagged at the thought of asking a supernatural being for help.

But later, as he writhed in his hospital bed, still heavily under the influence of belladonna, Wilson decided to give God a try. “If there is a God, let Him show Himself!” he cried out. “I am ready to do anything. Anything!”

What happened next is an essential piece of AA lore: A white light filled Wilson’s hospital room, and God revealed himself to the shattered stockbroker. “It seemed to me, in the mind’s eye, that I was on a mountain and that a wind not of air but of spirit was blowing,” he later said. “And then it burst upon me that I was a free man.” Wilson would never drink again.

At that time, the conventional wisdom was that alcoholics simply lacked moral fortitude. The best science could offer was detoxification with an array of purgatives, followed by earnest pleas for the drinker to think of his loved ones. When this approach failed, alcoholics were often consigned to bleak state hospitals. But having come back from the edge himself, Wilson refused to believe his fellow inebriates were hopeless. He resolved to save them by teaching them to surrender to God, exactly as Thacher had taught him.

Following Thacher’s lead, Wilson joined the Oxford Group, a Christian movement that was in vogue among wealthy mainstream Protestants. Headed by an ex-YMCA missionary named Frank Buchman, who stirred controversy with his lavish lifestyle and attempts to convert Adolf Hitler, the Oxford Group combined religion with pop psychology, stressing that all people can achieve happiness through moral improvement. To help reach this goal, the organization’s members were encouraged to meet in private homes so they could study devotional literature together and share their inmost thoughts.

In May 1935, while on an extended business trip to Akron, Ohio, Wilson began attending Oxford Group meetings at the home of a local industrialist. It was through the group that he met a surgeon and closet alcoholic named Robert Smith. For weeks, Wilson urged the oft-soused doctor to admit that only God could eliminate his compulsion to drink. Finally, on June 10, 1935, Smith (known to millions today as Dr. Bob) gave in. The date of Dr. Bob’s surrender became the official founding date of Alcoholics Anonymous.

In its earliest days, AA existed within the confines of the Oxford Group, offering special meetings for members who wished to end their dependence on alcohol. But Wilson and his followers quickly broke away, in large part because Wilson dreamed of creating a truly mass movement, not one confined to the elites Buchman targeted. To spread his message of salvation, Wilson started writing what would become AA’s sacred text: Alcoholics Anonymous, now better known as the Big Book.

The core of AA is found in chapter five, entitled “How It Works.” It is here that Wilson lists the 12 steps, which he first scrawled out in pencil in 1939. Wilson settled on the number 12 because there were 12 apostles.

In writing the steps, Wilson drew on the Oxford Group’s precepts and borrowed heavily from William James’ classic The Varieties of Religious Experience, which Wilson read shortly after his belladonna-fueled revelation at Towns Hospital. He was deeply affected by an observation that James made regarding alcoholism: that the only cure for the affliction is “religiomania.” The steps were thus designed to induce an intense commitment, because Wilson wanted his system to be every bit as habit-forming as booze.

The first steps famously ask members to admit their powerlessness over alcohol and to appeal to a higher power for help. Members are then required to enumerate their faults, share them with their meeting group, apologize to those they’ve wronged, and engage in regular prayer or meditation. Finally, the last step makes AA a lifelong duty: “Having had a spiritual awakening as the result of these steps, we tried to carry this message to alcoholics and to practice these principles in all our affairs.” This requirement guarantees not only that current members will find new recruits but that they can never truly “graduate” from the program.

Aside from the steps, AA has one other cardinal rule: anonymity. Wilson was adamant that the anonymous component of AA be taken seriously, not because of the social stigma associated with alcoholism, but rather to protect the nascent organization from ridicule. He explained the logic in a letter to a friend:

[In the past], alcoholics who talked too much on public platforms were likely to become inflated and get drunk again. Our principle of anonymity, so far as the general public is concerned, partly corrects this difficulty by preventing any individual receiving a lot of newspaper or magazine publicity, then collapsing and discrediting AA.

AA boomed in the early 1940s, aided by a glowing Saturday Evening Post profile and the public admission by a Cleveland Indians catcher, Rollie Hemsley, that joining the organization had done wonders for his game. Wilson and the founding members were not quite prepared for the sudden success. “You had really crazy things going on,” says William L. White, author of Slaying the Dragon: The History of Addiction Treatment and Recovery in America. “Some AA groups were preparing to run AA hospitals, and there was this whole question of whether they should have paid AA missionaries. You even had some reports of AA groups drinking beers at their meetings.”

The growing pains spurred Wilson to write AA’s governing principles, known as the 12 traditions. At a time when fraternal orders and churches with strict hierarchies dominated American social life, Wilson opted for something revolutionary: deliberate organizational chaos. He permitted each group to set its own rules, as long as they didn’t conflict with the traditions or the steps. Charging a fee was forbidden, as was the use of the AA brand to endorse anything that might generate revenue. “If you look at this on paper, it seems like it could never work,” White says. “It’s basically anarchy.” But this loose structure actually helped AA flourish. Not only could anyone start an AA group at any time, but they could tailor each meeting to suit regional or local tastes. And by condemning itself to poverty, AA maintained a posture of moral legitimacy.

Despite the decision to forbid members from receiving pay for AA-related activity, AA had no problem letting professional institutions integrate the 12 steps into their treatment programs. It did not object when Hazelden, a Minnesota facility founded in 1947 as “a sanatorium for curable alcoholics of the professional class,” made the steps the foundation of its treatment model. Nor did AA try to stop the proliferation of steps-centered addiction groups from adopting the Anonymous name: Narcotics Anonymous, Gamblers Anonymous, Overeaters Anonymous. No money ever changed hands—the steps essentially served as open source code that anyone was free to build upon, adding whatever features they wished. (Food Addicts Anonymous, for example, requires its members to weigh their meals.)

By the early 1950s, as AA membership reached 100,000, Wilson began to step back from his invention. Deeply depressed and an incorrigible chain smoker, he would go on to experiment with LSD before dying from emphysema in 1971. By that point, AA had become ingrained in American culture; even people who’d never touched a drop of liquor could name at least a few of the steps.

“For nearly 30 years, I have been saying Alcoholics Anonymous is the most effective self-help group in the world,” advice columnist Ann Landers wrote in 1986. “The good accomplished by this fellowship is inestimable … God bless AA.”

There’s no doubt that when AA works, it can be transformative. But what aspect of the program deserves most of the credit? Is it the act of surrendering to a higher power? The making of amends to people a drinker has wronged? The simple admission that you have a problem? Stunningly, even the most highly regarded AA experts have no idea. “These are questions we’ve been trying to answer for, golly, 30 or 40 years now,” says Lee Ann Kaskutas, senior scientist at the Alcohol Research Group in Emeryville, California. “We can’t find anything that completely holds water.”

by Brendan I. Koerner, Wired | Read more:
Photo: Christian Stoll