Saturday, November 17, 2012
The Real Scandal
[ed. See also, Collateral damage of our surveillance state.]
So not only did the FBI - again, all without any real evidence of a crime - trace the locations and identity of Broadwell and Petraeus, and read through Broadwell's emails (and possibly Petraeus'), but they also got their hands on and read through 20,000-30,000 pages of emails between Gen. Allen and Kelley. This is a surveillance state run amok. It also highlights how any remnants of internet anonymity have been all but obliterated by the union between the state and technology companies.
But, as unwarranted and invasive as this all is, there is some sweet justice in having the stars of America's national security state destroyed by the very surveillance system which they implemented and over which they preside. As Trevor Timm of the Electronic Frontier Foundation put it this morning: "Who knew the key to stopping the Surveillance State was to just wait until it got so big that it ate itself?" [ed. Or, as Kurt Opsahl, a senior staff attorney at EFF, asks, "If the director of central intelligence isn't able to successfully keep his emails private, what chance do I have?"]
It is usually the case that abuses of state power become a source for concern and opposition only when they begin to subsume the elites who are responsible for those abuses. Recall how former Democratic Rep. Jane Harman - one of the most outspoken defenders of the illegal Bush National Security Agency (NSA) warrantless eavesdropping program - suddenly began sounding like an irate, life-long ACLU privacy activist when it was revealed that the NSA had eavesdropped on her private communications with a suspected Israeli agent over alleged attempts to intervene on behalf of AIPAC officials accused of espionage. Overnight, one of the Surveillance State's chief assets, the former ranking member of the House Intelligence Committee, transformed into a vocal privacy proponent because now it was her activities, rather than those of powerless citizens, which were invaded.
With the private, intimate activities of America's most revered military and intelligence officials being smeared all over newspapers and televisions for no good reason, perhaps similar conversions are possible. Put another way, having the career of the beloved CIA Director and the commanding general in Afghanistan instantly destroyed due to highly invasive and unwarranted electronic surveillance is almost enough to make one believe not only that there is a god, but that he is an ardent civil libertarian.
The US operates a sprawling, unaccountable Surveillance State that - in violent breach of the core guarantees of the Fourth Amendment - monitors and records virtually everything even the most law-abiding citizens do. Just to get a flavor for how pervasive it is, recall that the Washington Post, in its 2010 three-part "Top Secret America" series, reported: "Every day, collection systems at the National Security Agency intercept and store 1.7 billion e-mails, phone calls and other types of communications."
by Glenn Greenwald, The Guardian | Read more:
Image: NSA Headquarters via: BGR
A Preface to "Song Reader"
[ed. Story behind Beck's new "album" Song Reader, 20 songbooks designed to be interpreted by the listener/player.]
After releasing an album in the mid-nineteen-nineties, I was sent a copy of the sheet-music version by a publisher who had commissioned piano transcriptions and guitar-chord charts of everything on the original recording. Seeing the record’s sonic ideas distilled down to notation made it obvious that most of the songs weren’t intended to work that way. Reversing the process and putting together a collection of songs in book form seemed more natural—it would be an album that could only be heard by playing the songs.
A few years later, I came across a story about a song called “Sweet Leilani,” which Bing Crosby had released in 1937. Apparently, it was so popular that, by some estimates, the sheet music sold fifty-four million copies. Home-played music had been so widespread that nearly half the country had bought the sheet music for a single song, and had presumably gone through the trouble of learning to play it. It was one of those statistics that offers a clue to something fundamental about our past.
I met with Dave Eggers in 2004 to talk about doing a songbook project with McSweeney’s. Initially I was going to write the songs the same way I’d write one of my albums, only in notated form, leaving the interpretation and performance to the player. But after a few discussions, the approach broadened. We started collecting old sheet music, and becoming acquainted with the art work, the ads, the tone of the copy, and the songs themselves. They were all from a world that had been cast so deeply into the shadow of contemporary music that only the faintest idea of it seemed to exist anymore. I wondered if there was a way to explore that world that would be more than an exercise in nostalgia—a way to represent how people felt about music back then, and to speak to what was left, in our nature, of that instinct to play popular music ourselves.
When I started out on guitar, I gravitated toward folk and country blues; they seemed to work well with the limited means I had to make music of my own. The popular songs, by contrast, didn’t really translate to my Gibson flat-top acoustic. There was an unspoken division between the music you heard on the radio and the music you were able to play with your own hands. By then, recorded music was no longer just the document of a performance—it was a composite of style, hooks, and production techniques, an extension of a popular personality’s image within a current sound.
The pop of the early twentieth century had a different character. Songs could function as an accompaniment to some action; they could speak to specific parts of life, even as they indulged in fantasy and lyricism. They could be absurd as often as they were sentimental—you could pick up “The Unlucky Velocipedist” along with “Get Off of Cuba’s Toes” and “I’m a Cake-Eating Man.” Motifs repeated—there are thousands of “moon” songs, exotic-locale songs, place-name songs, songs about new inventions, stuttering songs—but even though much of the music was formulaic, there was originality and eccentricity as well. And professional songwriters, remote figures, names on a page, occupied a central place in the lives of millions. The culture was closer to its folk traditions, to the time of songs being passed down. The music felt like it could belong to almost anybody.
You could say that things like karaoke or band-replicating video games have filled that vacuum, but home music was different in its demands—a fundamentally more individual expression. Learning to play a song is its own category of experience; recorded music made much of that participation unnecessary. More recently, digital developments have made songs even less substantial-seeming than they were when they came on vinyl or CD. Access to music has changed the perception of it. Songs have lost their cachet; they compete with so much other noise now that they can become more exaggerated in an attempt to capture attention. The question of what a song is supposed to do, and how its purpose has altered, has begun to seem worth asking.
by Beck Hansen, New Yorker | Read more:
The Pretty Restless
[ed. As one commenter noted, this sounds like it should have been in the Hangover.]
Last Call
It’s apparent in their hospitals, where since the 1970s rates of cirrhosis and other liver diseases among the middle-aged have increased by eightfold for men and sevenfold for women. And it’s apparent in their streets, where the carousing, violent “lager lout” is as much a symbol of modern Britain as Adele, Andy Murray, and the London Eye. Busting a bottle across someone’s face in a bar is a bona fide cultural phenomenon—so notorious that it has its own slang term, “glassing,” and so common that at one point the Manchester police called for bottles and beer mugs to be replaced with more shatter-resistant material. In every detail but the style of dress, the alleys of London on a typical Saturday night look like the scenes in William Hogarth’s famous pro-temperance print Gin Lane. It was released in 1751.
The United States, although no stranger to alcohol abuse problems, is in comparatively better shape. A third of the country does not drink, and teenage drinking is at a historic low. The rate of alcohol use among seniors in high school has fallen 25 percentage points since 1980. Glassing is something that happens in movies, not at the corner bar.
Why has the United States, so similar to Great Britain in everything from language to pop culture trends, managed to avoid the huge spike of alcohol abuse that has gripped the UK? The reasons are many, but one stands out above all: the market in Great Britain is rigged to foster excessive alcohol consumption in ways it is not in the United States—at least not yet. (...)
A moment’s thought makes it obvious that alcohol is different from, say, apples. Apples don’t form addicts. Apples don’t foster disease. Society doesn’t bear the cost of excessive apple consumption. Society does bear the cost of alcoholism, drink-related illness, and drunken violence and crime. The fact that alcohol is habit forming and life threatening among a substantial share of those who use it (and kills or damages the lives of many who don’t) means that a market for it inevitably imposes steep costs on society.
It was the recognition of this plain truth that led post-Prohibition America to regulate the alcohol market as a rancher might fetter a horse—letting it roam freely within certain confines, neither as far nor as fast as it might choose.
The UK, by contrast, spent most of the last eighty years fussing with the barn door while the beast ran wild. It made sure that every pub closed at the appointed hour, that every glass of ale contained a full Queen’s pint, that every dram of whiskey was doled out in increments precise to the milliliter—and simultaneously allowed the industry to adopt virtually any tactic that could get more young people to start drinking and keep at it throughout their lives. It is no coincidence that one of the first major studies to prompt a shift in Britain’s approach to liquor regulation, published in 2003, is titled Alcohol: No Ordinary Commodity.
The UK’s modern drinking problem started appearing in the years following World War II. Some of the developments were natural. Peace reigned; people wanted to have fun again; there was an understandable push toward relaxing wartime restrictions and loosening puritan attitudes left over from the more temperance-minded prewar years.
But other changes were happening that deserved, but did not get, a dose of caution. As the nation shifted to a service and banking economy, and from agricultural and industrial towns to modern cities and suburbs, social life moved from pubs to private homes and shopping moved from the local grocer, butcher, and fishmonger to the all-in-one supermarket. In the 1960s, loosened regulations led to a boom in the off-license sale of alcohol—that is, store-based sale for private consumption, as opposed to on-license sale in public drinking establishments. But whereas pubs were required to meet certain responsibilities (such as refusing to serve the inebriated), and had their hours of operation strictly regulated (for example, having to close their doors temporarily in the afternoon, to prevent all-day drinking), few limits were placed on off-licenses.
Supermarkets, in particular, profited from the new regime. They were free to stock wine, beer, and liquor alongside other consumables, making alcohol as convenient to purchase as marmalade. They were free, also, to offer discounts on bulk sales, and to use alcoholic beverages as so-called loss leaders, selling them below cost to lure customers into their stores and recouping the losses through increased overall sales. Very quickly, cheap booze became little more than a force multiplier for groceries.
When the supermarkets themselves subsequently underwent a wave of consolidation, the multiplier only increased. Four major chains—Tesco, Sainsbury’s, Asda, and Morrisons—now enjoy near-total dominance in the UK, and their vast purchasing power lets them cut alcohol prices even further. Relative to disposable income, alcohol today costs 40 percent less than it did in 1980. The country is awash in a river of cheap drink, available on seemingly every corner.
by Tim Heffernan, Washington Monthly | Read more:
Can American Diplomacy Ever Come Out of Its Bunker?
Three decades later, after serving as an ambassador in three countries, Neumann found himself marveling at how much his profession had changed. “The dangers have gotten worse, but the change is partly psychological,” he told me. “There’s less willingness among our political leaders to accept risks, and all that has driven us into the bunker.”
Nothing illustrated those changes better than the death of J. Christopher Stevens, after an assault by jihadis on the U.S. mission in Benghazi on Sept. 11. Stevens was a brave and thoughtful diplomat who, like Neumann, lived to engage with ordinary people in the countries where he served, to get past the wire. Yet his death was treated as a scandal, and it set off a political storm that seems likely to tie the hands of American diplomats around the world for some time to come. Congressmen and Washington pundits accused the administration of concealing the dangers Americans face abroad and of failing Stevens by providing inadequate security. Threats had been ignored, the critics said, seemingly unaware that a background noise of threats is constant at embassies across the greater Middle East. The death of an ambassador would not be seen as the occasional price of a noble but risky profession; someone had to be blamed.
Lost in all this partisan wrangling was the fact that American diplomacy has already undergone vast changes in the past few decades and is now so heavily encumbered by fortresslike embassies, body armor and motorcades that it is almost unrecognizable. In 1985 there were about 150 security officers in U.S. embassies abroad, and now there are about 900. That does not include the military officers and advisers, whose presence in many embassies — especially in the Middle East — can change the atmosphere. Security has gone from a marginal concern to the very heart of American interactions with other countries.
The barriers are there for a reason: Stevens’s death attests to that, as do those of Americans in Beirut, Baghdad and other violent places. But the reaction to the attack in Benghazi crystallized a sense among many diplomats that risks are less acceptable in Washington than they once were, that the mantra of “security” will only grow louder. As a result, some of the country’s most distinguished former ambassadors are now asking anew what diplomacy can achieve at such a remove.
“No one has sat back to say, ‘What are our objectives?’ ” said Prudence Bushnell, who was ambassador to Kenya when the Qaeda bombing took place there in 1998, killing more than 200 people and injuring 4,000. “The model has become, we will go to dangerous places and transform them, and we will do it from secure fortresses. And it doesn’t work.”
Photo: Tara Todras-Whitehill for The New York Times
Friday, November 16, 2012
A Phony Hero for a Phony War
[ed. See also: The Medals They Carried.]
“What’s wrong with a general looking good?” you may wonder. I would propose that every moment a general spends on his uniform jacket is a moment he’s not doing his job, which is supposed to be leading soldiers in combat and winning wars — something we, and our generals, stopped doing about the time that MacArthur gold-braided his way around the stalemated Korean War.
And now comes “Dave” Petraeus, and the Iraq and Afghanistan conflicts. No matter how good he looked in his biographer-mistress’s book, it doesn’t make up for the fact that we failed to conquer the countries we invaded, and ended up occupying undefeated nations.
The genius of General Petraeus was to recognize early on that the war he had been sent to fight in Iraq wasn’t a real war at all. This is what the public and the news media — lamenting the fall of the brilliant hero undone by a tawdry affair — have failed to see. He wasn’t the military magician portrayed in the press; he was a self-constructed hologram, emitting an aura of preening heroism for the ever eager cameras. (...)
The thing he learned to do better than anything else was present the image of The Man You Turn To When Things Get Tough. (Who can forget the Newsweek cover, “Can This Man Save Iraq?” with a photo of General Petraeus looking very Princeton-educated in his Westy-starched fatigues?) He was so good at it that he conned the news media into thinking he was the most remarkable general officer in the last 40 years, and, by playing hard to get, he conned the political establishment into thinking that he could morph into Ike Part Deux and might one day be persuaded to lead a moribund political party back to the White House.
The problem was that he hadn’t led his own Army to win anything even approximating a victory in either Iraq or Afghanistan. It’s not just General Petraeus. The fact is that none of our generals have led us to a victory since men like Patton and my grandfather, Lucian King Truscott Jr., stormed the beaches of North Africa and southern France with blood in their eyes and military murder on their minds.
Those generals, in my humble opinion, were nearly psychotic in their drive to kill enemy soldiers and subjugate enemy nations. Thankfully, we will probably never have cause to go back to those blood-soaked days. But we still shouldn’t allow our military establishment to give us one generation after another of imitation generals who pretend to greatness on talk shows and photo spreads, jetting around the world in military-spec business jets.
by Lucian K. Truscott IV, NY Times | Read more:
Photo: TYWKIWDBI
Dry Cabins: Living the Dream
[ed. This is almost a rite of passage if you're a UAF student living the Alaska dream.]
The rinse water washes down six inches of pipe into a bucket beneath your sink. Dishes done, you carefully pick up the bucketful of rancid waste water and inch outside, mindful not to slop any on the floor. You fling the water from your deck and it evaporates instantly into the air.
With the dishes done, you prepare to brave the cold for the bathroom, an outhouse 20 feet away. Hopefully there are no moose on the trail, but you grab your headlamp just in case.
You're living the "dry cabin" lifestyle, just like several thousand others in Fairbanks, an Alaska town known for its extreme climate and endless winters. It's also the epicenter of an unusual cultural phenomenon: dry-cabin living, a.k.a. living without running water.
That means no plumbing.
No toilet.
No shower.
No kitchen faucets. These modern amenities are replaced by outhouses, five-gallon water jugs and trips to the laundromat.
Why would anyone live this way in one of America's coldest cities?
Dry cabin communities in Fairbanks are partially a product of geology – yes, you read that right. Patches of ground remain frozen year-round in the Interior; that permafrost presents builders with a lot of problems. You can’t dig into frozen ground, so installing septic and water systems becomes difficult if not impossible.
People turn to dry cabins instead. Some are drawn to dry-cabin living for the mystique that the lifestyle offers. Others gravitate toward dry cabins for economic reasons. Either way, it’s a life that offers rewards and challenges found only in Alaska.
by Laurel Andrews, Alaska Dispatch | Read more:
Photo: Leah Hill
How a Vicious Circle of Self-Interest Sank a California City
When this sun-drenched exurb east of Los Angeles filed for bankruptcy protection in August, the city attorney suggested fraudulent accounting was the root of the problem.
The mayor blamed a dysfunctional city council and greedy police and fire unions. The unions blamed the mayor. Even now, there is little agreement on how the city got into this crisis or how it can extricate itself.
"It's total political chaos," said John Husing, a former San Bernardino resident and regional economist. "There is no solution. They'll never fix anything."
Yet on close examination, the city's decades-long journey from prosperous, middle-class community to bankrupt, crime-ridden, foreclosure-blighted basket case is straightforward — and alarmingly similar to the path traveled by many municipalities around America's largest state. San Bernardino succumbed to a vicious circle of self-interests among city workers, local politicians and state pension overseers.
Little by little, over many years, the salaries and retirement benefits of San Bernardino's city workers — and especially its police and firemen — grew richer and richer, even as the city lost its major employers and gradually got poorer and poorer.
Unions poured money into city council elections, and the city council poured money into union pay and pensions. The California Public Employees' Retirement System (Calpers), which manages pension plans for San Bernardino and many other cities, encouraged ever-sweeter benefits. Investment bankers sold clever bond deals to pay for them. Meanwhile, state law made it impossible to raise local property taxes and difficult to boost any other kind.
No single deal or decision involving benefits and wages over the years killed the city. But cumulatively, they built a pension-fueled financial time-bomb that finally exploded.
In bankrupt San Bernardino, a third of the city's 210,000 people live below the poverty line, making it the poorest city of its size in California. But a police lieutenant can retire in his 50s and take home $230,000 in one-time payouts on his last day, before settling in with a guaranteed $128,000-a-year pension. Forty-six retired city employees receive over $100,000 a year in pensions.
Almost 75 percent of the city's general fund is now spent solely on the police and fire departments, according to a Reuters analysis of city bankruptcy documents - most of that on wages and pension costs.
by Tim Reid and Cezary Podkul and Ryan McNeill, Reuters | Read more:
Photo: Lucy Nicholson/Reuters
How to Survive Societal Collapse in Suburbia
Many so-called survivalists would take pride in keeping far away from places that sell espresso drinks. But Douglas, a 38-year-old entrepreneur and founder of one of the largest preparedness expos in the country, isn’t your typical prepper.
At that morning’s meeting, a strategy session with two new colleagues, Douglas made it clear that he doesn’t even like the word “survivalist.” He believes the word is ruined, evoking “the nut job who lives out in the mountains by himself on the retreat.” Instead, he prefers “self-reliance.”
When prompted by his colleagues to define the term, Douglas leaned forward in his chair. “I’m glad you asked,” he replied. “Take notes. This is good.”
For the next several minutes, Douglas talked about emergency preparedness, sustainable living and financial security — what he called the three pillars of self-reliance. He detailed the importance of solar panels, gardens, water storage and food stockpiles. People shouldn’t just have 72-hour emergency kits for when the power grid goes down; they should learn how to live on their own. It’s a message that Douglas is trying to move from the fringe to the mainstream.
“Our main goal is to reach as many people and get the word out to as many people as we can, to get them thinking and moving in this direction,” he said. “Sound good?”
The preparedness industry, always prosperous during hard times, is thriving again now. In Douglas’s circles, people talk about “the end of the world as we know it” with such regularity that the acronym Teotwawki (tee-ought-wah-kee) has come into widespread use. The Vivos Group, which sells luxury bunkers, until recently had a clock on its Web site that was ticking down to Dec. 21, 2012 — a date that, thanks to the Mayan calendar, some believe will usher in the end times. But amid the alarmism, there is real concern that the world is indeed increasingly fragile — a concern highlighted most recently by Hurricane Sandy. The storm’s aftermath has shown just how unprepared most of us are to do without the staples of modern life: food, fuel, transportation and electric power.
The survivalist business surged in the wake of 9/11, when authorities instructed New Yorkers to prepare disaster kits, learn how to seal doors and vents with duct tape and be ready to evacuate at any time. Threat-level warnings about possible terrorist attacks kept Americans rattled for years, and were followed by various disasters of other types: the financial meltdown, Hurricanes Katrina and Ike, drought, blackouts and concerns over everything from rising sea levels to Iran’s nuclear program.
Late last year, Douglas and his partners formed the Red Shed Media Group, a single corporate home for several endeavors: the Self Reliance Expo, conventions that Douglas founded in 2010, dedicated to showcasing survival gear and skills; Self Reliance Broadcasting, an Internet-based channel devoted to the cause; and an entity that controls the rights to publishing “Making the Best of Basics,” a popular survivalist handbook. The name Red Shed was symbolic for Douglas. “When your grandfather went and did a project,” he told me, “he went out to the red shed and pulled out all the tools he needed for the job.” Douglas wants his virtual red shed to be a single place where people can get all the preparedness information they need. Five expos this year have drawn 40,000 people who pay $10 each. The radio network has logged more than two million podcast downloads; in one day alone in July, it reported nearly 90,000 downloads. The book, which was first published in 1974, includes recipes for everything from wild pig (“they are easy to prepare”) to dove pie (“simmer for one hour or until doves are tender”). Douglas said it had sold about 20,000 copies this year.
But the goal isn’t just to sell to the same old preparedness crowd. Red Shed wants to attract liberals and political moderates to a marketplace historically populated by conservatives and right-wing extremists. “It’s not the end of the world,” Douglas told me last spring, making a bold statement for someone in his industry. “It’s not doomsday.” It’s about showing the gun-toting mountain man in his camouflage and the suburban soccer mom in her minivan that they want the same thing: peace of mind. “We don’t say, ‘Hurry up and buy your stuff because Obama is going to ruin the country,’ ” Douglas said. “We don’t get into the political crap. We just want to teach people the lifestyle.”
by Keith O'Brien, NY Times | Read more:
Photograph by Dwight Eschliman for The New York Times

Meditation For Better Health
African Americans with heart disease who practiced Transcendental Meditation regularly were 48 percent less likely to have a heart attack, stroke or die from all causes compared with African Americans who attended a health education class over more than five years, according to new research published in the American Heart Association journal Circulation: Cardiovascular Quality and Outcomes.
Those practicing meditation also lowered their blood pressure and reported less stress and anger. And the more regularly patients meditated, the greater their survival, said researchers who conducted the study at the Medical College of Wisconsin in Milwaukee.
"We hypothesized that reducing stress by managing the mind-body connection would help improve rates of this epidemic disease," said Robert Schneider, M.D., lead researcher and director of the Institute for Natural Medicine and Prevention in Fairfield, Iowa. "It appears that Transcendental Meditation is a technique that turns on the body's own pharmacy — to repair and maintain itself."
For the study, researchers randomly assigned 201 people to participate in a Transcendental Meditation stress-reducing program or a health education class about lifestyle modification for diet and exercise.
- Forty-two percent of the participants were women, average age 59, and half reported earning less than $10,000 per year.
- Average body mass index was about 32, which is clinically obese.
- Nearly 60 percent in both treatment groups took cholesterol-lowering drugs; 41 percent of the meditation group and 31 percent of the health education group took aspirin; and 38 percent of the meditation group and 43 percent of the health education group smoked.
Participants in the health education group were advised, under the instruction of professional health educators, to spend at least 20 minutes a day at home practicing heart-healthy behaviors such as exercise, healthy meal preparation and nonspecific relaxation.
Researchers evaluated participants at the start of the study, at three months and every six months thereafter for body mass index, diet, program adherence, blood pressure and cardiovascular hospitalizations. They found:
- There were 52 primary end point events. Of these, 20 events occurred in the meditation group and 32 in the health education group.
- Blood pressure was reduced by 5 mm Hg and anger decreased significantly among Transcendental Meditation participants compared to controls.
- Both groups showed beneficial changes in exercise and alcohol consumption, and the meditation group showed a trend toward reduced smoking. However, there were no significant differences between the groups in weight, exercise or diet.
- Regular meditation was correlated with reduced death, heart attack and stroke.
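The event counts above invite a quick sanity check. As a rough sketch only (the excerpt does not state the group sizes, so an even split of the 201 participants is assumed here), the raw event rates compare like this:

```python
# Back-of-the-envelope check of the reported benefit.
# ASSUMPTION: the 201 participants split roughly evenly between groups;
# the excerpt does not give the actual group sizes.
meditation_events, education_events = 20, 32
n_meditation, n_education = 100, 101  # assumed split of 201

risk_ratio = (meditation_events / n_meditation) / (education_events / n_education)
reduction = 1 - risk_ratio
print(round(reduction * 100))  # ~37% raw event-rate reduction
```

The article's 48 percent figure comes from a time-to-event (hazard ratio) analysis, which accounts for follow-up time and covariates, so it is expected to differ from this crude proportion.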
"Transcendental Meditation may reduce heart disease risks for both healthy people and those with diagnosed heart conditions," said Schneider, who is also dean of Maharishi College of Perfect Health in Fairfield, Iowa.
"The research on Transcendental Meditation and cardiovascular disease is established well enough that physicians may safely and routinely prescribe stress reduction for their patients with this easy to implement, standardized and practical program," he said.
by Maggie Francis, American Heart Association | Read more (citation):
Image: Wikipedia
The Looming Fertilizer Shortage
I have yet to meet a climate scientist who does not believe that global warming is a worse problem than they thought a few years ago. The seriousness of this change is not appreciated by politicians and the public. The scientific world carefully measures the speed with which we approach the cliff and will, no doubt, carefully measure our rate of fall. But it is not doing enough to stop it. I am a specialist in investment bubbles, not climate science. But the effects of climate change can only exacerbate the ecological trouble I see reflected in the financial markets — soaring commodity prices and impending shortages.
My firm warned of vastly inflated Japanese equities in 1989 — the grandmother of all bubbles — US growth stocks in 2000 and everything risky in late 2007. The usual mix of investor wishful thinking and dangerous and cynical encouragement from industrial vested interests made these bubbles possible. Prices of global raw materials are now rising fast. This does not constitute a bubble, however, but is a genuine paradigm shift, perhaps the most important economic change since the Industrial Revolution. Simply, we are running out.
The price index of 33 important commodities declined by 70% over the 100 years up to 2002 — an enormous help to industrialized countries in getting rich. Only one commodity, oil, had been flat until 1972 and then, with the advent of the Organization of the Petroleum Exporting Countries, it began to rise. But since 2002, prices of almost all the other commodities, plus oil, tripled in six years; all without a world war and without much comment. Even if prices fell tomorrow by 20% they would still on average have doubled in 10 years, the equivalent of a 7% annual rise.
This price surge is a response to global population growth and the explosion of capital spending in China. Especially dangerous to social stability and human well-being are food prices and food costs. Growth in the productivity of grains has fallen to 1.2% a year, which is exactly equal to the global population growth rate. There is now no safety margin.
Then there is the impending shortage of two fertilizers: phosphorus (phosphate) and potassium (potash). These two elements cannot be made, cannot be substituted, are necessary to grow all life forms, and are mined and depleted. It’s a scary set of statements. Former Soviet states and Canada have more than 70% of the potash. Morocco has 85% of all high-grade phosphates. It is the most important quasi-monopoly in economic history.
What happens when these fertilizers run out is a question I can’t get satisfactorily answered and, believe me, I have tried. There seems to be only one conclusion: their use must be drastically reduced in the next 20–40 years or we will begin to starve.
The world’s blind spot when it comes to the fertilizer problem is seen also in the shocking lack of awareness on the part of governments and the public of the increasing damage to agriculture by climate change; for example, runs of extreme weather that have slashed grain harvests in the past few years. Recognition of the facts is delayed by the frankly brilliant propaganda and obfuscation delivered by energy interests that virtually own the US Congress. (It is not unlike the part played by the financial industry when investment bubbles start to form … but that, at least, is only money.) We need oil producers to leave 80% of proven reserves untapped to achieve a stable climate. As a former oil analyst, I can easily calculate oil companies’ enthusiasm to leave 80% of their value in the ground — absolutely nil.
by Jeremy Grantham, Nature | Read more:
Thursday, November 15, 2012