Thursday, August 23, 2012
Money Market Funds 'Operating Without a Net'
Attempts to make sweeping changes to a popular type of mutual fund that played a central role in the 2008 financial crisis have been derailed.
The chairwoman of the Securities and Exchange Commission, Mary L. Schapiro, wanted to bring her vision for regulating money market mutual funds to a vote as early as next week. But Ms. Schapiro acknowledged on Wednesday evening that three of the five commissioners opposed her plan and said she was calling off the vote. (...)
Until the financial crisis, money market funds were considered a dull, low-return corner of the markets. But now, most of the nation’s top financial regulators view the sector as one of the most vulnerable parts of the American financial system. (...)
Regulators view the funds as vulnerable because they act like banks by taking in money and promising to return every dollar that investors put in. Unlike banks, though, they do not have to pay deposit insurance or keep capital buffers to protect against defaults.
The funds, which provide short-term loans to banks and other borrowers, grew wildly over the last 30 years because they typically offered a higher return than bank accounts and at their peak held $3.8 trillion.
Most investors have used the funds like low-risk bank accounts from which money could be immediately withdrawn.
But in the financial crisis the vulnerability of the funds was laid bare. In September 2008, the Reserve Primary Fund suffered losses on $785 million of debt issued by Lehman Brothers and fell below $1 a share, known as “breaking the buck.”
Investors fled the Reserve Primary Fund and a panic ensued in which they withdrew about $300 billion from money market funds in one week, contributing to the credit freeze that gripped global markets. The Federal Reserve and the Treasury Department stepped in to bail out the money market fund sector with a guarantee and a special loan facility.
The S.E.C. voted in 2010 to introduce several new rules aimed at making the funds more stable. The most significant change forced fund managers to hold more assets that could be easily sold for cash. (...)
“Money market funds effectively are operating without a net,” Ms. Schapiro said.
by Nathaniel Popper, NY Times | Read more:
A Community on Overdose
About half of those living in McDowell County depend on some kind of relief check such as Social Security, Disability, Supplemental Security Income (SSI), Temporary Assistance for Needy Families, retirement benefits, and unemployment to survive. They live on the margins, check to check, expecting no improvement in their lives and seeing none. The most common billboards along the roads are for law firms that file disability claims and seek state and federal payments. “Disability and Injury Lawyers,” reads one. It promises to handle “Social Security. Car Wrecks. Veterans. Workers’ Comp.” The 800 number ends in COMP.
Harry M. Caudill, in his monumental 1963 book Night Comes to the Cumberlands, describes how relief checks became a kind of bribe for the rural poor in Appalachia. The decimated region was the pilot project for outside government assistance, which had issued the first food stamps in 1961 to a household of fifteen in Paynesville, West Virginia. “Welfarism” began to be practiced, as Caudill wrote, “on a scale unequalled elsewhere in America and scarcely surpassed anywhere in the world.” Government “handouts,” he observed, were “speedily recognized as a lode from which dollars could be mined more easily than from any coal seam.”
Obtaining the monthly “handout” became an art form. People were reduced to what Caudill called “the tragic status of ‘symptom hunters.’ If they could find enough symptoms of illness, they might convince the physicians they were ‘sick enough to draw’... to indicate such a disability as incapacitating the men from working. Then his children, as public charges, could draw enough money to feed the family.”
Joe and I are sitting in the Tug River Health Clinic in Gary with a registered nurse who does not want her name used. The clinic handles federal and state black lung applications. It runs a program for those addicted to prescription pills. It also handles what in the local vernacular is known as “the crazy check” -- payments obtained for mental illness from Medicaid or SSI -- a vital source of income for those whose five years of welfare payments have run out. Doctors willing to diagnose a patient as mentally ill are important to economic survival.
“They come in and want to be diagnosed as soon as they can for the crazy check,” the nurse says. “They will insist to us they are crazy. They will tell us, ‘I know I’m not right.’ People here are very resigned. They will avoid working by being diagnosed as crazy.”
The reliance on government checks, and a vast array of painkillers and opiates, has turned towns like Gary into modern opium dens. The painkillers OxyContin, fentanyl (80 times stronger than morphine) and Lortab, as well as a wide variety of anti-anxiety medications such as Xanax, are widely abused. Many top off their daily cocktail of painkillers at night with sleeping pills and muscle relaxants. And for fun, addicts, especially the young, hold “pharm parties,” in which they combine their pills in a bowl, scoop out handfuls of medication, swallow them, and wait to feel the result.
A decade ago only about 5% of those seeking treatment in West Virginia needed help with opiate addiction. Today that number has ballooned to 26%. The state recorded 91 overdose deaths in 2001. By 2008 that number had risen to 390.
Drug overdoses are the leading cause of accidental death in West Virginia, and the state leads the country in fatal drug overdoses. OxyContin -- nicknamed “hillbilly heroin” -- is king. At a drug market like the Pines it costs a dollar a milligram. And a couple of 60- or 80-milligram pills sold at the Pines is a significant boost to a family’s income. Not far behind OxyContin is Suboxone, the brand name for a drug whose primary ingredient is buprenorphine, a semisynthetic opioid. Dealers, many of whom are based in Detroit, travel from clinic to clinic in Florida to stock up on the opiates and then sell them out of the backs of gleaming SUVs in West Virginia, usually around the first of the month, when the government checks arrive. Those who have legal prescriptions also sell the drugs for a profit. Pushers are often retirees. They can make a few hundred extra dollars a month on the sale of their medications. The temptation to peddle pills is hard to resist.
Tiny Hawaiian Island Will See if New Owner Tilts at Windmills
Yet for all its seeming serenity, Lanai — a privately owned island in easy sight of Maui’s western shore — is torn these days by economic and cultural conflict, struggling with its identity and an uncertain future after its reclusive residents learned that their island had been sold to the reclusive billionaire owner of a software company.
Since James Drummond Dole bought Lanai from a rancher 90 years ago, the island has undergone a series of wrenching economic transformations. Under Dole, it became the world’s largest pineapple plantation, known as Pineapple Island, with bristling fields and a colony of workers. When Dole moved its operations overseas in the late 1980s, Lanai turned to tourism, opening two high-end resorts where rooms go for as much as $1,100 a night, providing a new source of employment for this community.
But when those resorts struggled with the recent economic downturn and the challenge of bringing tourists to a remote island with single-propeller air service, the island’s owner proposed building a field of 45-story turbine windmills, across bluffs and beaches covering over a quarter of the island, to produce energy to sell to Oahu. The plan polarized residents, dividing those who saw the turbines as the economic salvation of their struggling island from those who treasured its wild and undeveloped isolation.
“It’s awful, just awful,” said Robin Kaye, one of the opponents, sweeping his arm across the land where the windmills would rise, a tumble of otherworldly rock formations framed by views across the Pacific to Maui and Molokai. “There are families who won’t talk to each other anymore. It has really ripped us up.”
Lanai’s new owner is Larry Ellison, a co-founder of Oracle. He bought 98 percent of the island — the remainder is government property and privately owned homes — six weeks ago from David H. Murdock, another billionaire, whose holdings include Dole and who was the force behind the windmill proposal. The price was not disclosed.
Mr. Ellison now owns the gas station, the car rental agency and the supermarket. He owns the Lanai City Grille, the Hotel Lanai, the two Four Seasons resorts, two championship golf courses, about 500 cottages and luxury homes, a solar farm, and nearly every one of the small shops and cafes that line the streets of Lanai City. He owns 88,000 acres of overgrown pineapple fields and arid, boulder-strewn hills, thick with red dust, as well as 50 miles of beaches.
But Mr. Murdock is not quite gone. As part of his deal, he retained the option to build the windmills should he win the requisite approvals. That was viewed here as one final anxiety-causing shot at his Lanai neighbors.
For all the speculation about Mr. Ellison’s intentions — the most prevalent being that the new owner, whose yacht-racing team won the America’s Cup in 2010, would turn Lanai into a hub for sailing — he has yet to appear in public, speak with elected officials or tell anyone what he might have in mind. He did not respond to a request for comment.
“Everybody is basically in the dark,” said Mary Charles, who runs the Hotel Lanai. “It’s been a very tough struggle for Lanai for the past five years.”
by Adam Nagourney, NY Times | Read more:
Monica Almeida/The New York Times

Wednesday, August 22, 2012
The Beatles - Love Full Album (HD)
DNA Data Storage
The work, carried out by George Church and Sri Kosuri, basically treats DNA as just another digital storage device. Instead of binary data being encoded as magnetic regions on a hard drive platter, strands of DNA that store 96 bits are synthesized, with each of the bases (TGAC) representing a binary value (T and G = 1, A and C = 0). (...)
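The one-bit-per-base mapping described above can be sketched in a few lines of Python. This is a hypothetical illustration of the encoding rule only (T and G read as 1, A and C read as 0); the actual work also splits data into addressed 96-bit blocks and adds redundancy, which this sketch omits.

```python
def bits_to_dna(bits):
    # Encode each bit as a base. Mapping 1 -> 'T' and 0 -> 'A' is one
    # valid choice, since the scheme lets a 1 be T or G and a 0 be A or C.
    return ''.join('T' if b else 'A' for b in bits)

def dna_to_bits(strand):
    # Decode: T and G read as 1, A and C read as 0.
    value = {'T': 1, 'G': 1, 'A': 0, 'C': 0}
    return [value[base] for base in strand]

message = [1, 0, 1, 1, 0, 0, 1, 0]
strand = bits_to_dna(message)
assert dna_to_bits(strand) == message
```

Having two possible bases per bit value gives the encoder freedom to avoid sequences that are hard to synthesize, such as long runs of the same base.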
Scientists have been eyeing up DNA as a potential storage medium for a long time, for three very good reasons: It’s incredibly dense (you can store one bit per base, and a base is only a few atoms large); it’s volumetric (beaker) rather than planar (hard disk); and it’s incredibly stable — where other bleeding-edge storage mediums need to be kept in sub-zero vacuums, DNA can survive for hundreds of thousands of years in a box in your garage.
It is only with recent advances in microfluidics and labs-on-a-chip that synthesizing and sequencing DNA has become an everyday task, though. While it took years for the original Human Genome Project to analyze a single human genome (some 3 billion DNA base pairs), modern lab equipment with microfluidic chips can do it in hours. Now this isn’t to say that Church and Kosuri’s DNA storage is fast — but it’s fast enough for very-long-term archival.
Just think about it for a moment: One gram of DNA can store 700 terabytes of data. That’s 14,000 50-gigabyte Blu-ray discs… in a droplet of DNA that would fit on the tip of your pinky. To store the same kind of data on hard drives — the densest storage medium in use today — you’d need 233 3TB drives, weighing a total of 151 kilos. In Church and Kosuri’s case, they have successfully stored around 700 kilobytes of data in DNA — Church’s latest book, in fact — and proceeded to make 70 billion copies (which they claim, jokingly, makes it the best-selling book of all time!) totaling 44 petabytes of data stored.
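The figures quoted above can be sanity-checked with quick arithmetic, assuming the decimal units (1 TB = 1000 GB) that storage vendors use:

```python
# Back-of-the-envelope check of the capacity comparison.
dna_capacity_tb = 700   # claimed capacity of one gram of DNA, in TB
bluray_gb = 50          # one dual-layer Blu-ray disc
drive_tb = 3            # one 3 TB hard drive

blurays = dna_capacity_tb * 1000 / bluray_gb   # discs needed: 14,000
drives = dna_capacity_tb / drive_tb            # drives needed: about 233

print(int(blurays), round(drives))  # prints: 14000 233
```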
Looking forward, they foresee a world where biological storage would allow us to record anything and everything without reservation. Today, we wouldn’t dream of blanketing every square meter of Earth with cameras, and recording every moment for all eternity/human posterity — we simply don’t have the storage capacity. There is a reason that backed up data is usually only kept for a few weeks or months — it just isn’t feasible to have warehouses full of hard drives, which could fail at any time. If the entirety of human knowledge — every book, uttered word, and funny cat video — can be stored in a few hundred kilos of DNA, though… well, it might just be possible to record everything (hello, police state!)
by Sebastian Anthony, ExtremeTech | Read more:
The Death of the Cyberflâneur
Intrigued, I set out to discover what happened to the cyberflâneur. While I quickly found other contemporaneous commentators who believed that flânerie would flourish online, the sad state of today’s Internet suggests that they couldn’t have been more wrong. Cyberflâneurs are few and far between, while the very practice of cyberflânerie seems at odds with the world of social media. What went wrong? And should we worry?
Engaging the history of flânerie may be a good way to start answering these questions. Thanks to the French poet Charles Baudelaire and the German critic Walter Benjamin, both of whom viewed the flâneur as an emblem of modernity, his figure (and it was predominantly a “he”) is now firmly associated with 19th-century Paris. The flâneur would leisurely stroll through its streets and especially its arcades — those stylish, lively and bustling rows of shops covered by glass roofs — to cultivate what Honoré de Balzac called “the gastronomy of the eye.”
While not deliberately concealing his identity, the flâneur preferred to stroll incognito. “The art that the flâneur masters is that of seeing without being caught looking,” the Polish sociologist Zygmunt Bauman once remarked. The flâneur was not asocial — he needed the crowds to thrive — but he did not blend in, preferring to savor his solitude. And he had all the time in the world: there were reports of flâneurs taking turtles for a walk.
The flâneur wandered in the shopping arcades, but he did not give in to the temptations of consumerism; the arcade was primarily a pathway to a rich sensory experience — and only then a temple of consumption. His goal was to observe, to bathe in the crowd, taking in its noises, its chaos, its heterogeneity, its cosmopolitanism. Occasionally, he would narrate what he saw — surveying both his private self and the world at large — in the form of short essays for daily newspapers.
It’s easy to see, then, why cyberflânerie seemed such an appealing notion in the early days of the Web. The idea of exploring cyberspace as virgin territory, not yet colonized by governments and corporations, was romantic; that romanticism was even reflected in the names of early browsers (“Internet Explorer,” “Netscape Navigator”).
Online communities like GeoCities and Tripod were the true digital arcades of that period, trading in the most obscure and the most peculiar, without any sort of hierarchy ranking them by popularity or commercial value. Back then eBay was weirder than most flea markets; strolling through its virtual stands was far more pleasurable than buying any of the items. For a brief moment in the mid-1990s, it did seem that the Internet might trigger an unexpected renaissance of flânerie.
However, anyone entertaining such dreams of the Internet as a refuge for the bohemian, the hedonistic and the idiosyncratic probably didn’t know the reasons behind the disappearance of the original flâneur.
Illustration: Gustave Caillebotte's "Paris Street; Rainy Day," from 1877.
The Playboy Interview: Richard Dawkins
Dawkins, who retired from Oxford University in 2008 after 13 years as a professor of public understanding of science (meaning he lectured and wrote books), stepped into the limelight in 1976, at the age of 35, with the publication of The Selfish Gene. The book, which has sold more than a million copies, argues persuasively that evolution takes place at the genetic level; individuals die, but the fittest genes survive. Dawkins has since written 10 more best-sellers, including most recently The Magic of Reality: How We Know What’s Really True. Since 9/11 he has become more outspoken about his skepticism, culminating in The God Delusion, which provides the foundation for his continuing debates with believers. Published in 2006, the book has become Dawkins’s most popular, available in 31 languages with 2 million copies sold. That same year he founded the Richard Dawkins Foundation for Reason and Science “to support scientific education, critical thinking and evidence-based understanding of the natural world in the quest to overcome religious fundamentalism, superstition, intolerance and suffering.” (...)
Excerpts:
PLAYBOY: Albert Einstein and Stephen Hawking reference God in their writings. Are they using the word in the sense of an intelligent designer?
DAWKINS: Certainly not. They use god in a poetic, metaphorical sense. Einstein in particular loved using the word to convey an idea of mystery, which I think all decent scientists do. But nowadays we’ve learned better than to use the word god because it will be willfully misunderstood, as Einstein was. And poor Einstein got quite cross about it. “I do not believe in a personal god,” he said over and over again. In a way he was asking for it. Hawking uses it in a similar way in A Brief History of Time. In his famous last line he says that if we understood the universe, “then we would know the mind of God.” Once again he is using god in the Einsteinian, not the religious sense. And so Hawking’s The Grand Design, in which he says the universe could have come from nothing, is not him turning away from God; his beliefs are exactly the same.
PLAYBOY: You’ve had a lot of fun deconstructing the idea of the intelligent designer. You point out that God made a cheetah fast enough to catch a gazelle and a gazelle fast enough to outrun a cheetah——
DAWKINS: Yes. Is God a sadist?
PLAYBOY: And bad design such as the fact we breathe and eat through the same tube, making it easy to choke to death.
DAWKINS: Or the laryngeal nerve, which loops around an artery in the chest and then goes back up to the larynx.
PLAYBOY: Not very efficient.
DAWKINS: Not in a giraffe, anyway. (...)
PLAYBOY: What will happen when you die?
DAWKINS: Well, I shall either be buried or be cremated.
PLAYBOY: Funny. But without faith in an afterlife, in what do you take comfort in times of despair?
DAWKINS: Human love and companionship. But in more thoughtful, cerebral moments, I take—comfort is not quite the right word, but I draw strength from reflecting on what a privilege it is to be alive and what a privilege it is to have a brain that’s capable in its limited way of understanding why I exist and of reveling in the beauty of the world and the beauty of the products of evolution. The magnificence of the universe and the sense of smallness that gives us in space and in geologically deep time is humbling but in a strangely comforting way. It’s nice to feel you’re part of a hugely bigger picture. (...)
PLAYBOY: We hear constantly that America is a Christian nation and that the founding fathers were all Christians.
DAWKINS: They were deists. They didn’t believe in a personal god, or one who interferes in human affairs. And they were adamant that they did not want to found the United States as a Christian nation.
PLAYBOY: But you hear quite often that if you let atheists run things you end up with Hitler and Stalin.
DAWKINS: Hitler wasn’t an atheist; he was a Roman Catholic. But I don’t care what he was. There is no logical connection between atheism and doing bad things, nor good things for that matter. It’s a philosophical belief about the absence of a creative intelligence in the world. Anybody who thinks you need religion in order to be good is being good for the wrong reason. I’d rather be good for moral reasons. Morals were here before religion, and morals change rather rapidly in spite of religion. Even people who rely on the Bible use nonbiblical criteria. If your criteria are scriptural, you have no basis for choosing the verse that says turn the other cheek rather than the verse that says stone people to death. So you pick and choose without guidance from the Bible.
by Chip Rowe, Playboy | Read more:
Metamotivation
[ed. I was familiar with Maslow's general hierarchy of needs but not the term Metamotivation i.e., striving to realize one's fullest potential. I wonder how a person's outlook on life and their personality are affected by an inability to achieve that need (if it is felt)? Furthermore, since basic needs are fluid (like health, friendship, economic security, intimacy, etc.) is metamotivation a temporary luxury (and ultimately an unsustainable goal)?]
The most fundamental and basic four layers of the pyramid contain what Maslow called "deficiency needs" or "d-needs": esteem, friendship and love, security, and physical needs. With the exception of the most fundamental (physiological) needs, if these "deficiency needs" are not met, the body gives no physical indication but the individual feels anxious and tense. Maslow's theory suggests that the most basic level of needs must be met before the individual will strongly desire (or focus motivation upon) the secondary or higher level needs. Maslow also coined the term Metamotivation to describe the motivation of people who go beyond the scope of the basic needs and strive for constant betterment. Metamotivated people are driven by B-needs (Being Needs), instead of deficiency needs (D-Needs).
via: Wikipedia
The Next Wave for the Wristwatch
Companies like Apple, Nike and Sony, along with dozens of start-ups, hope to strap a device on your wrist.
It is quite a disruption for the wristwatch, which has not actually been around all that long. Though said to have been invented in 1868 by the Swiss watchmaker Patek Philippe, it didn’t really catch on until after World War I. Before that, people carried watches in their pockets or on chains.
“Watch manufacturers were asking themselves this in the 1900s, if it made sense to have a watch in their pocket,” said Blaise Bertrand, industrial design director for IDEO, a design company. “I think that’s the same question that is being asked now, but it’s in a completely different context with the smartphone in our pockets.”
The new wrist devices won’t replace smartphones, but rather connect to them. Most will continue the basic task of telling the time, while eliminating the need to dig a smartphone out of your pocket or purse. But they will provide far more information than the most advanced G-Shock watch available today, or the most expensive chronometer.
For example, Sony this year released the Smartwatch, a two-inch-square screen that can display e-mails, Twitter posts and other pieces of text, all pulled from an Android smartphone. Nike FuelBand, a black band with an array of colored lights, measures the energy you exert on a daily basis and sends it to a smartphone. (It also tells the time.) Jawbone sells the Up, a unisex bracelet that tracks a user’s daily activity and sends the information to an iPhone application. Pebble, an innovative watch that can play music and display text, the weather and other information from a phone, caught the public’s imagination on Kickstarter, where it raised $10.3 million. It is expected to arrive next year.
by Nick Bilton, NY Times | Read more:
Tuesday, August 21, 2012
The Vast Left-Wing Conspiracy Is on Your Screen
Two decades ago, conservative anger against popular culture burned so intensely that it seemed at the time that Hollywood had come to fill the space in the right-wing fear center vacated by the end of Communism. The anger came out in an endless series of skirmishes. In 1989, after watching an episode of the sitcom Married With Children that included a gay man and a woman removing her bra, Michigan housewife Terry Rakolta (whose sister, Ronna Romney, married the brother of … yes, him) launched a national crusade against the show. Dan Quayle gave a speech denouncing the single-motherhood of Murphy Brown. Advertising boycotts by such groups as Christian Leaders for Responsible Television or Rakolta’s own Americans for Responsible Television were a regular occurrence, as were anti-Hollywood rallies that drew thousands of protesters.
The country was “involved in a Kulturkampf,” declared Illinois Republican congressman Henry Hyde, a “war between cultures and a war about the meaning of culture.” Liberals, too, considered their way of life threatened by the conservative campaign against Hollywood. “We are in the midst of a culture war,” announced the vice-president of People for the American Way, a group founded by liberal producer Norman Lear. In his keynote speech at the 1992 Republican convention, Pat Buchanan floridly exhorted his party to fight (or, in its view, fight back) in a “cultural war, as critical to the kind of nation we will one day be as was the Cold War itself.”
When Buchanan delivered that terrifying (or exhilarating) speech in Houston, it would have been impossible to imagine that twenty years later, all traces of this war would have disappeared from the national political scene. If you visit Mitt Romney’s campaign website, the issues tab labeled “Values” lists Romney’s unwavering opposition to abortion and gay marriage, and Bushian opposition to stem-cell research, but nary a glancing reference can be found to the state of the culture, let alone a full-throated denunciation of Hollywood filth merchants. An immediate and easy explanation is that popular culture has ceased its provocations, or that the culture war has been shoved aside by the war over the role of government in the economy. The more uncomfortable reality is that the culture war is an ongoing liberal rout. Hollywood is as liberal as ever, and conservatives have simply despaired of changing it.
You don’t have to be an especially devoted consumer of film or television (I’m not) to detect a pervasive, if not total, liberalism. Americans for Responsible Television and Christian Leaders for Responsible Television would be flipping out over the modern family in Modern Family, not to mention the girls of Girls and the gays of Glee, except that those groups went defunct long ago. The liberal analysis of the economic crisis—that unregulated finance took wild gambles—has been widely reflected, even blatantly so, in movies like Margin Call, Too Big to Fail, and the Wall Street sequel. The conservative view that all blame lies with regulations forcing banks to lend to poor people has not, except perhaps in the amateur-hour production of Atlas Shrugged. The muscular Rambo patriotism that briefly surged in the eighties, and seemed poised to return after 9/11, has disappeared. In its place we have series like Homeland, which probes the moral complexities of a terrorist’s worldview, and action stars like Jason Bourne, whose enemies are not just foreign baddies but also paranoid Dick Cheney figures. The conservative denial of climate change, and the low opinion of environmentalism that accompanies it, stands in contrast to cautionary end-times tales like Ice Age 2: The Meltdown and the tree-hugging mysticism of Avatar. The decade has also seen a revival of political films and shows, from the Aaron Sorkin oeuvre through Veep and The Campaign, both of which cast oilmen as the heavies. Even The Muppets features an evil oil driller stereotypically named “Tex Richman.”
In short, the world of popular culture increasingly reflects a shared reality in which the Republican Party is either absent or anathema. That shared reality is the cultural assumptions, in particular, of the younger voters whose support has become the bedrock of the Democratic Party.
A member of President Obama’s reelection team recently told New York’s John Heilemann that it plans on painting its opponent as a man out of time—Mitt Romney is “the fifties, he is retro, he is backward.” This may sound at first blush like a particular reference to Romney’s uptight persona, but the line of attack would have been available against any Republican nominee—Rick Santorum, Michele Bachmann, Rick Perry, or any other of the dour reactionaries who might have snatched the nomination. The message is transmitted in a thousand ways, both obvious and obscure: Tina Fey’s devastating portrayal of Sarah Palin. Obama appearing on Late Night With Jimmy Fallon to “slow jam the news,” which meant to recite his campaign message of the week. The severed head of George W. Bush appearing on Game of Thrones. An episode of Mad Men that included the odd throwaway line “Romney’s a clown,” putatively to describe George Romney, who was anything but.
When Joe Biden endorsed gay marriage in May, he cited Will & Grace as the single-most important driving force in transforming public opinion on the subject. In so doing he actually confirmed the long-standing fear of conservatives—that a coterie of Hollywood elites had undertaken an invidious and utterly successfully propaganda campaign, and had transmuted the cultural majority into a minority. Set aside the substance of the matter and consider the process of it—that is, think of it from the conservative point of view, if you don’t happen to be one. Imagine that large chunks of your entertainment mocked your values and even transformed once-uncontroversial beliefs of yours into a kind of bigotry that might be greeted with revulsion.
You’d probably be angry, too.
by Jonathan Chait, New York Magazine | Read more:
Photo: Courtesy of Warner Bros. Pictures. Photo-illustration by Gluekit
The country was “involved in a Kulturkampf,” declared Illinois Republican congressman Henry Hyde, a “war between cultures and a war about the meaning of culture.” Liberals, too, considered their way of life threatened by the conservative campaign against Hollywood. “We are in the midst of a culture war,” announced the vice-president of People for the American Way, a group founded by liberal producer Norman Lear. In his keynote speech at the 1992 Republican convention, Pat Buchanan floridly exhorted his party to fight (or, in its view, fight back) in a “cultural war, as critical to the kind of nation we will one day be as was the Cold War itself.”
When Buchanan delivered that terrifying (or exhilarating) speech in Houston, it would have been impossible to imagine that twenty years later, all traces of this war would have disappeared from the national political scene. If you visit Mitt Romney’s campaign website, the issues tab labeled “Values” lists Romney’s unwavering opposition to abortion and gay marriage, and Bushian opposition to stem-cell research, but nary a glancing reference can be found to the state of the culture, let alone a full-throated denunciation of Hollywood filth merchants. An immediate and easy explanation is that popular culture has ceased its provocations, or that the culture war has been shoved aside by the war over the role of government in the economy. The more uncomfortable reality is that the culture war is an ongoing liberal rout. Hollywood is as liberal as ever, and conservatives have simply despaired of changing it.
You don’t have to be an especially devoted consumer of film or television (I’m not) to detect a pervasive, if not total, liberalism. Americans for Responsible Television and Christian Leaders for Responsible Television would be flipping out over the modern family in Modern Family, not to mention the girls of Girls and the gays of Glee, except that those groups went defunct long ago. The liberal analysis of the economic crisis—that unregulated finance took wild gambles—has been widely reflected, even blatantly so, in movies like Margin Call, Too Big to Fail, and the Wall Street sequel. The conservative view that all blame lies with regulations forcing banks to lend to poor people has not, except perhaps in the amateur-hour production of Atlas Shrugged. The muscular Rambo patriotism that briefly surged in the eighties, and seemed poised to return after 9/11, has disappeared. In its place we have series like Homeland, which probes the moral complexities of a terrorist’s worldview, and action stars like Jason Bourne, whose enemies are not just foreign baddies but also paranoid Dick Cheney figures. The conservative denial of climate change, and the low opinion of environmentalism that accompanies it, stands in contrast to cautionary end-times tales like Ice Age 2: The Meltdown and the tree-hugging mysticism of Avatar. The decade has also seen a revival of political films and shows, from the Aaron Sorkin oeuvre through Veep and The Campaign, both of which cast oilmen as the heavies. Even The Muppets features an evil oil driller stereotypically named “Tex Richman.”
In short, the world of popular culture increasingly reflects a shared reality in which the Republican Party is either absent or anathema. That shared reality is the cultural assumptions, in particular, of the younger voters whose support has become the bedrock of the Democratic Party.
A member of President Obama’s reelection team recently told New York’s John Heilemann that it plans on painting its opponent as a man out of time—Mitt Romney is “the fifties, he is retro, he is backward.” This may sound at first blush like a particular reference to Romney’s uptight persona, but the line of attack would have been available against any Republican nominee—Rick Santorum, Michele Bachmann, Rick Perry, or any other of the dour reactionaries who might have snatched the nomination. The message is transmitted in a thousand ways, both obvious and obscure: Tina Fey’s devastating portrayal of Sarah Palin. Obama appearing on Late Night With Jimmy Fallon to “slow jam the news,” that is, to recite his campaign message of the week. The severed head of George W. Bush appearing on Game of Thrones. An episode of Mad Men that included the odd throwaway line “Romney’s a clown,” putatively to describe George Romney, who was anything but.
When Joe Biden endorsed gay marriage in May, he cited Will & Grace as the single most important driving force in transforming public opinion on the subject. In so doing he actually confirmed the long-standing fear of conservatives—that a coterie of Hollywood elites had undertaken an invidious and utterly successful propaganda campaign, and had transmuted the cultural majority into a minority. Set aside the substance of the matter and consider the process of it—that is, think of it from the conservative point of view, if you don’t happen to be one. Imagine that large chunks of your entertainment mocked your values and even transformed once-uncontroversial beliefs of yours into a kind of bigotry that might be greeted with revulsion.
You’d probably be angry, too.
by Jonathan Chait, New York Magazine | Read more:
Photo: Courtesy of Warner Bros. Pictures. Photo-illustration by Gluekit
The Sleep Racket
Over the last year you could have made a pile of money by betting on a little company in the business of ... insomnia. Shares of ResMed, in Poway, Calif., leaped 44% as the company sold $465 million worth of “sleep-disordered-breathing” equipment--face masks, nasal pillows, humidifiers and so-called continuous positive airway pressure devices. “It’s a monster market; it’s bigger than Ben-Hur,” says Peter C. Farrell, ResMed’s voluble chief executive. You’d have done even better as an original investor in Pacific Sleep Medicine Services, a small chain of sleep centers, mostly in southern California. One of its founders who chipped in an undisclosed amount in 1998 saw his ante jump a hundredfold, says Tom J. Wiedel, Pacific Sleep’s chief financial officer.
A bad night’s sleep is reason for a very big business. Sleeping pills, led by Ambien, rack up more than $2 billion a year in the U.S. Then there is the revenue from overnight stays at sleep clinics, over-the-counter pills, a parade of gimmicks and a thriving business for sleep specialists. For four bucks you can pick up the Insomnia Relief Scent Inhaler (lavender, rosemary, chamomile and vetiver, a grass root) from Earth Solutions of Atlanta. The Original Sound Pillow ($50) comes with two thin speakers and a headphone jack for your iPod or Discman. Spend a bit more ($147) for an MP3-like player called Pzizz that plays music, voices and tones geared to helping you fall asleep. Dreamate, an $80 device worn as a bracelet, supposedly delivers a 6,000 rpm massage to the “sleeping golden triangle,” a.k.a. the wrist. Sealy’s Stearns & Foster unit is now selling a $10,000 mattress fit for a princess afflicted with insomnia. It is topped with latex and stitched with real silver threads. Hypnos goes one better, with a king-size mattress filled with layers of silk, cashmere and lamb’s wool and topped with a $20,000 sticker. And coming this spring from Matsushita Electric Works: a room tricked out to induce “deep sleep” within 30 minutes. It includes a reclining bed, sound-absorbent walls and somniferous sights and sounds from a wide-screen TV.
“Sleep is the new sex.” So says psychologist Arthur J. Spielman, associate director of the Center for Sleep Disorders Medicine & Research at New York Methodist Hospital in Brooklyn, N.Y. “People want it, need it, can’t get enough of it.” The same could be said of profits. Spielman is coauthor of The Insomnia Answer ($24, Perigee Books, 2006). He is also developing light-delivering goggles that are supposed to help people reset the circadian rhythms that govern when they nod off and wake up, so they fall asleep faster and stay asleep longer.
Sleep is also the new snake oil--the promise of a good snooze from a book or a bed or a bottle. It’s easy pickings. Who isn’t somewhat slumber-deprived? Given the demands of work and family, no one gets “enough” sleep, whatever that is. The Morpheus mongers point to all kinds of studies on their behalf. The number of Americans who say they sleep less than six hours a night--16% in 2005, compared with 12% in 1998--is on the rise, claims the National Sleep Foundation, which, coincidentally, gets funding from pharmaceutical companies. According to the American Academy of Sleep Medicine, there are 81 chronic sleeping disorders, from apnea, which causes interrupted breathing, to restless leg syndrome. No wonder sleep labs are popping up everywhere--650 and counting, compared with just a few in the mid-1970s.
According to the National Institutes of Health, sleeplessness creates $16 billion in annual health care expenses and $50 billion in lost productivity. Scientists are finding that chronically reduced or disrupted sleep may increase the risk of obesity, diabetes and cardiovascular disease. “We know from all the research that sleep is just as important to overall health as exercise and diet,” says Carl Hunt, special assistant to the director of the National Heart, Lung & Blood Institute at the NIH.
by Melanie Wells, Forbes (2006) | Read more:
Kamisaka Sekka (1866–1942), Japanese Woodblock Print
Rolling Hillside
From Sekka’s A World of Things series (Momoyogusa)
Monday, August 20, 2012