Friday, November 16, 2012


Bo Bartlett, Study for Light Years
via:

How a Vicious Circle of Self-Interest Sank a California City

When this sun-drenched exurb east of Los Angeles filed for bankruptcy protection in August, the city attorney suggested fraudulent accounting was the root of the problem.

The mayor blamed a dysfunctional city council and greedy police and fire unions. The unions blamed the mayor. Even now, there is little agreement on how the city got into this crisis or how it can extricate itself.

"It's total political chaos," said John Husing, a former San Bernardino resident and regional economist. "There is no solution. They'll never fix anything."

Yet on close examination, the city's decades-long journey from prosperous, middle-class community to bankrupt, crime-ridden, foreclosure-blighted basket case is straightforward — and alarmingly similar to the path traveled by many municipalities around America's largest state. San Bernardino succumbed to a vicious circle of self-interests among city workers, local politicians and state pension overseers.

Little by little, over many years, the salaries and retirement benefits of San Bernardino's city workers — and especially its police and firemen — grew richer and richer, even as the city lost its major employers and gradually got poorer and poorer.

Unions poured money into city council elections, and the city council poured money into union pay and pensions. The California Public Employees' Retirement System (Calpers), which manages pension plans for San Bernardino and many other cities, encouraged ever-sweeter benefits. Investment bankers sold clever bond deals to pay for them. Meanwhile, state law made it impossible to raise local property taxes and difficult to boost any other kind.

No single deal or decision involving benefits and wages over the years killed the city. But cumulatively, they built a pension-fueled financial time-bomb that finally exploded.

In bankrupt San Bernardino, a third of the city's 210,000 people live below the poverty line, making it the poorest city of its size in California. But a police lieutenant can retire in his 50s and take home $230,000 in one-time payouts on his last day, before settling in with a guaranteed $128,000-a-year pension. Forty-six retired city employees receive over $100,000 a year in pensions.

Almost 75 percent of the city's general fund is now spent solely on the police and fire departments, according to a Reuters analysis of city bankruptcy documents - most of that on wages and pension costs.

by Tim Reid, Cezary Podkul and Ryan McNeill, Reuters |  Read more:
Photo: Lucy Nicholson/Reuters

How to Survive Societal Collapse in Suburbia


On a clear morning in May, Ron Douglas left his home in exurban Denver, eased into his Toyota pickup truck and drove to a business meeting at a Starbucks. Douglas, a bearded bear of a man, ordered a venti double-chocolate-chip Frappuccino — “the girliest drink ever,” he called it — and then sat down to discuss the future of the growing survivalist industry.

Many so-called survivalists would take pride in keeping far away from places that sell espresso drinks. But Douglas, a 38-year-old entrepreneur and founder of one of the largest preparedness expos in the country, isn’t your typical prepper.

At that morning’s meeting, a strategy session with two new colleagues, Douglas made it clear that he doesn’t even like the word “survivalist.” He believes the word is ruined, evoking “the nut job who lives out in the mountains by himself on the retreat.” Instead, he prefers “self-reliance.”

When prompted by his colleagues to define the term, Douglas leaned forward in his chair. “I’m glad you asked,” he replied. “Take notes. This is good.”

For the next several minutes, Douglas talked about emergency preparedness, sustainable living and financial security — what he called the three pillars of self-reliance. He detailed the importance of solar panels, gardens, water storage and food stockpiles. People shouldn’t just have 72-hour emergency kits for when the power grid goes down; they should learn how to live on their own. It’s a message that Douglas is trying to move from the fringe to the mainstream.

“Our main goal is to reach as many people and get the word out to as many people as we can, to get them thinking and moving in this direction,” he said. “Sound good?”

The preparedness industry, always prosperous during hard times, is thriving again now. In Douglas’s circles, people talk about “the end of the world as we know it” with such regularity that the acronym Teotwawki (tee-ought-wah-kee) has come into widespread use. The Vivos Group, which sells luxury bunkers, until recently had a clock on its Web site that was ticking down to Dec. 21, 2012 — a date that, thanks to the Mayan calendar, some believe will usher in the end times. But amid the alarmism, there is real concern that the world is indeed increasingly fragile — a concern highlighted most recently by Hurricane Sandy. The storm’s aftermath has shown just how unprepared most of us are to do without the staples of modern life: food, fuel, transportation and electric power.

The survivalist business surged in the wake of 9/11, when authorities instructed New Yorkers to prepare disaster kits, learn how to seal doors and vents with duct tape and be ready to evacuate at any time. Threat-level warnings about possible terrorist attacks kept Americans rattled for years, and were followed by various disasters of other types: the financial meltdown, Hurricanes Katrina and Ike, drought, blackouts and concerns over everything from rising sea levels to Iran’s nuclear program.

Late last year, Douglas and his partners formed the Red Shed Media Group, a single corporate home for several endeavors: the Self Reliance Expo, conventions that Douglas founded in 2010, dedicated to showcasing survival gear and skills; Self Reliance Broadcasting, an Internet-based channel devoted to the cause; and an entity that controls the rights to publishing “Making the Best of Basics,” a popular survivalist handbook. The name Red Shed was symbolic for Douglas. “When your grandfather went and did a project,” he told me, “he went out to the red shed and pulled out all the tools he needed for the job.” Douglas wants his virtual red shed to be a single place where people can get all the preparedness information they need. Five expos this year have drawn 40,000 people who pay $10 each. The radio network has logged more than two million podcast downloads; in one day alone in July, it reported nearly 90,000 downloads. The book, which was first published in 1974, includes recipes for everything from wild pig (“they are easy to prepare”) to dove pie (“simmer for one hour or until doves are tender”). Douglas said it had sold about 20,000 copies this year.

But the goal isn’t just to sell to the same old preparedness crowd. Red Shed wants to attract liberals and political moderates to a marketplace historically populated by conservatives and right-wing extremists. “It’s not the end of the world,” Douglas told me last spring, making a bold statement for someone in his industry. “It’s not doomsday.” It’s about showing the gun-toting mountain man in his camouflage and the suburban soccer mom in her minivan that they want the same thing: peace of mind. “We don’t say, ‘Hurry up and buy your stuff because Obama is going to ruin the country,’ ” Douglas said. “We don’t get into the political crap. We just want to teach people the lifestyle.”

by Keith O'Brien, NY Times |  Read more:
Photograph by Dwight Eschliman for The New York Times

Meditation For Better Health

African Americans with heart disease who practiced Transcendental Meditation regularly were 48 percent less likely, over more than five years of follow-up, to have a heart attack, have a stroke or die from any cause than African Americans who attended a health education class, according to new research published in the American Heart Association journal Circulation: Cardiovascular Quality and Outcomes.

Those practicing meditation also lowered their blood pressure and reported less stress and anger. And the more regularly patients meditated, the greater their survival, said researchers who conducted the study at the Medical College of Wisconsin in Milwaukee.

"We hypothesized that reducing stress by managing the mind-body connection would help improve rates of this epidemic disease," said Robert Schneider, M.D., lead researcher and director of the Institute for Natural Medicine and Prevention in Fairfield, Iowa. "It appears that Transcendental Meditation is a technique that turns on the body's own pharmacy — to repair and maintain itself."

For the study, researchers randomly assigned 201 people to participate in a Transcendental Meditation stress-reducing program or a health education class about lifestyle modification for diet and exercise.
  • Forty-two percent of the participants were women, average age 59, and half reported earning less than $10,000 per year.
  • Average body mass index was about 32, which is clinically obese.
  • Nearly 60 percent in both treatment groups took cholesterol-lowering drugs; 41 percent of the meditation group and 31 percent of the health education group took aspirin; and 38 percent of the meditation group and 43 percent of the health education group smoked.
Those in the meditation program sat with eyes closed for about 20 minutes twice a day practicing the technique, allowing their minds and bodies to rest deeply while remaining alert.

Participants in the health education group were advised, under the instruction of professional health educators, to spend at least 20 minutes a day at home practicing heart-healthy behaviors such as exercise, healthy meal preparation and nonspecific relaxation.

Researchers evaluated participants at the start of the study, at three months and every six months thereafter for body mass index, diet, program adherence, blood pressure and cardiovascular hospitalizations. They found:
  • There were 52 primary end point events. Of these, 20 events occurred in the meditation group and 32 in the health education group.
  • Blood pressure was reduced by 5 mm Hg and anger decreased significantly among Transcendental Meditation participants compared to controls.
  • Both groups showed beneficial changes in exercise and alcohol consumption, and the meditation group showed a trend towards reduced smoking, although there were no significant differences between the groups in weight, exercise or diet.
  • Regular meditation was correlated with reduced death, heart attack and stroke.
Cardiovascular disease is the leading cause of death worldwide. In the United States, the death rate from heart disease is about 50 percent higher among black adults than among whites. Researchers focused on African Americans because of these health disparities.

"Transcendental Meditation may reduce heart disease risks for both healthy people and those with diagnosed heart conditions," said Schneider, who is also dean of Maharishi College of Perfect Health in Fairfield, Iowa.

"The research on Transcendental Meditation and cardiovascular disease is established well enough that physicians may safely and routinely prescribe stress reduction for their patients with this easy to implement, standardized and practical program," he said.

by Maggie Francis, American Heart Association | Read more (citation):
Image: Wikipedia

The Looming Fertilizer Shortage

I have yet to meet a climate scientist who does not believe that global warming is a worse problem than they thought a few years ago. The seriousness of this change is not appreciated by politicians and the public. The scientific world carefully measures the speed with which we approach the cliff and will, no doubt, carefully measure our rate of fall. But it is not doing enough to stop it. I am a specialist in investment bubbles, not climate science. But the effects of climate change can only exacerbate the ecological trouble I see reflected in the financial markets — soaring commodity prices and impending shortages.

My firm warned of vastly inflated Japanese equities in 1989 — the grandmother of all bubbles — US growth stocks in 2000 and everything risky in late 2007. The usual mix of investor wishful thinking and dangerous and cynical encouragement from industrial vested interests made these bubbles possible. Prices of global raw materials are now rising fast. This does not constitute a bubble, however, but is a genuine paradigm shift, perhaps the most important economic change since the Industrial Revolution. Simply, we are running out.

The price index of 33 important commodities declined by 70% over the 100 years up to 2002 — an enormous help to industrialized countries in getting rich. Only one commodity, oil, had been flat until 1972 and then, with the advent of the Organization of the Petroleum Exporting Countries, it began to rise. But since 2002, prices of almost all the other commodities, plus oil, tripled in six years; all without a world war and without much comment. Even if prices fell tomorrow by 20% they would still on average have doubled in 10 years, the equivalent of a 7% annual rise.
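
[ed. A quick back-of-the-envelope check of that last claim: a sketch assuming simple compound annual growth over the ten years since 2002, using only the figures quoted in the paragraph above.]

% assumes simple compound annual growth; "tripled" and "fell 20%" are the figures from the paragraph above
\[
3 \times 0.8 = 2.4 \approx 2,
\qquad
(1 + r)^{10} = 2 \;\Rightarrow\; r = 2^{1/10} - 1 \approx 7.2\% \text{ per year}.
\]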

This price surge is a response to global population growth and the explosion of capital spending in China. Especially dangerous to social stability and human well-being are food prices and food costs. Growth in the productivity of grains has fallen to 1.2% a year, which is exactly equal to the global population growth rate. There is now no safety margin.

Then there is the impending shortage of two fertilizers: phosphorus (phosphate) and potassium (potash). These two elements cannot be made, cannot be substituted, are necessary to grow all life forms, and are mined and depleted. It’s a scary set of statements. Former Soviet states and Canada have more than 70% of the potash. Morocco has 85% of all high-grade phosphates. It is the most important quasi-monopoly in economic history.

What happens when these fertilizers run out is a question I can’t get satisfactorily answered and, believe me, I have tried. There seems to be only one conclusion: their use must be drastically reduced in the next 20–40 years or we will begin to starve.

The world’s blind spot when it comes to the fertilizer problem is seen also in the shocking lack of awareness on the part of governments and the public of the increasing damage to agriculture by climate change; for example, runs of extreme weather that have slashed grain harvests in the past few years. Recognition of the facts is delayed by the frankly brilliant propaganda and obfuscation delivered by energy interests that virtually own the US Congress. (It is not unlike the part played by the financial industry when investment bubbles start to form … but that, at least, is only money.) We need oil producers to leave 80% of proven reserves untapped to achieve a stable climate. As a former oil analyst, I can easily calculate oil companies’ enthusiasm to leave 80% of their value in the ground — absolutely nil.

by Jeremy Grantham, Nature |  Read more:

Thursday, November 15, 2012


Wabi Sabi, Elena Ray
via:

Robert John Thornton, New Illustration of the Sexual System of Carolus von Linnaeus (1807)
via:

My Unrealizable Postmodern Novel

I have never really aspired to write anything that you might consider literary fiction, finding its style—what the late Dennis Potter so memorably defined as "he said, she said, descriptions of the sky"—to be terminally tiresome, but about fifteen years or so back, when I was still young enough to think I could pull it off but (as it turns out) too old to really have the energy to get it together, I came up with the idea for a novel that I was absolutely sure would show the world my as-then-undiscovered genius.

The book was going to be called To Be Sure. (Don't cringe yet, you don't know how much worse it gets, or why!) It would follow the career of an aspiring writer from his early days breaking into the literary scene (this is back when there actually was a literary scene, although even then it was starting to show signs of fatigue) until his death. Let's call him Stephen Hero, because I never got as far as giving him a name and that one seems to have a pretty decent pedigree.

So far, so what, you say. Who wants to read another book about a man making his way through the bookish demimonde of New York, experiencing the vicissitudes of literary style, the cut-and-thrust of pretentious people at cocktail parties, the growing bitterness of a protagonist who realizes he might not achieve everything that once seemed so promising to him? It's a fair question. I sure as fuck don't. But my book was going to be different, different in an amazing way that nothing had ever been before.

And here was the idea: the book would be told solely through reviews written by its protagonist. There would never be a line of dialogue. You would only be able to follow the character's development through the bio appended to each review—the plan was to start with "Stephen Hero is finishing his first novel" and follow it up throughout the years listing various teaching posts, professional affiliations, anthologies edited, etc., but to make it clear that his novel was never finished—or the occasional letter to the editor from a disgruntled recipient of a poor review which delineated conflicts of interest and the like. Over the course of 100 or so reviews you'd watch as Stephen Hero went from enthusiastic young aspirant to embittered old failure. While some of that would come through in each review, it was those bios that would really show the rise and fall of this literary wannabe.

I hear your amazement. "Alex, what an astounding idea! What utter genius! How come you never did anything with it?" you ask. Well, lemme tell you. There are two major flaws in this concept as far as I could see. (There may very well be more, but I gave up after two.)

First: I am what you would consider a low-on-energy, low-on-inspiration kind of guy. The prospect of coming up with 100 different plots that would be under review—lampooning so many styles and fads over the course of 40 years, coming up with the flaws the character would complain about in the essays, mimicking the standard conventions of literary criticism, and, honestly, doing 100 versions of anything—was so daunting as to make the entire prospect untenable. And don't forget, this was right around the time "Behind The Music" debuted; there were so many other distractions.

Second, even I was not unaware of the off-the-charts pretension and showy postmodernism-run-amuck behind the concept. Given an outline of this project even Jorge Luis Borges would have been all, "Fuck this bullshit, I'm going to go listen to some movies." Consider this: one of my ideas was that in each of the reviews, the penultimate paragraph would begin with "To be sure," which would lead into five or six lines that stated the complete opposite of everything that had preceded it in the review before seguing back into the original tone of the piece. Also? The bio for the very last review would have been something like, "Stephen Hero, who passed away in December, was a contributor to this publication for over forty years. This spring, To Be Sure, a collection of his reviews, will be published by Hemingway House." Do you see what I did there? Not only did poor Stephen Hero only finally get a book published after he died, it was the very book you were reading right now! Could you not just choke yourself to death with your own fingers?

In the end, I am happy to report that good sense prevailed and I abandoned the idea entirely. (Whenever I allow sloth to win out over industry I award it the appellation of "good sense," which is a life strategy you might profitably adopt yourself if you have not already.) Is the world a worse place because I never put the book to paper? The only sense in which I can say yes to that question is when I think about the fires we are all going to need to flame eventually; no one would have bought this sucker, and I bet they would have burned really well.

You Google Wrong

One of the first things you learn in Google's Power Searching class is that if you know about the magic of CTRL+F, then you are in the top 10 percent of all searchers. That made someone like me, who uses the find function on the regular, a little cocky about my searching skills as I embarked on Google's free online class, which teaches you how to type words into a search box. The thing I didn't realize, however, is that it takes a lot of other, more obscure skills to move into the top 1 percent of savviest Googlers.

If you don't know the CTRL+F skill, learn it now. It's easy: pressing CTRL on Windows or ⌘ on a Mac and F at the same time will prompt you to enter a word or series of words that your browser will then highlight on that page.


OK. So you're now in the top 10 percent of searchers. On to the harder stuff.

Every few months, Google offers its Power Searching with Google class, which consists of six 50-minute classes split up into 5-to-10-minute YouTube clips. Each and every lesson is taught by Google research scientist (and Search expert) Dan Russell from the same couch, with the same MacBook, wearing the same light-blue button-down shirt. It's monotony, just like a real live class! Also, like a true place of learning, there is homework. An activity follows each clip, going over (and testing) the information just discussed. There are also a midterm and a final, both graded. (If the prospect of limitless shame at not passing your Google Search final doesn't motivate you, nothing will.) And, in order to get a certificate (to hang on your Facebook wall?), you must complete these assessments on time.

Someone who searches all day every day might call this overkill for a skill they already possess. I mean, searching for stuff is what I do for my job all day long. CTRL+F is an amateur move. If that qualifies as something that puts someone in the 90th percentile, then how hard could the rest of the class be? But, I soon learned, Googling isn't just a skill; it's a series of skills. You can choose to just type into that empty box. Or you can take this class and join the top 1 percent of Google searchers. But beware: getting into this elite group of searchers involves watching some very dry YouTube videos. Since I've already spent the time with Professor Russell and have weeks of Googling with my new tricks behind me, let me give you my little cheat sheet.

by Rebecca Greenfield, The Atlantic Wire |  Read more:

I Lived a CIA Conspiracy Theory


I had an interesting weekend. Maybe you did, too. It's always a mixed bag, you know? Some Friday nights are drunken and exhilarating; other Friday nights are empty and reserved. And then, of course, there are those Friday nights when random people believe you accidentally forced the resignation of the head of the CIA.

We've all been there.

I'm not sure what I should write about the previous 72 hours of my life, or even if I should write anything at all. Technically, nothing happened. But I've been asked to "explain" how and why a certain non-event occurred, and I will try my best to do so. If you already know what I'm referring to, you will likely be disappointed by the banality of the forthcoming details. If you have no idea what I'm referring to, I will now attempt to explain what a bunch of other people desperately wanted to believe, mostly for their own amusement. It's a good story (not a great one, but a good one).

On Friday evening, I started watching a movie in my living room just after 9 p.m. This particular movie was 184 minutes long. I didn't want to be distracted, so I turned off my phone. When the film was over, my wife mentioned that she had just received an odd, alarmist e-mail from a mutual friend of ours. I subsequently turned on my phone and instantaneously received a dozen text messages that ranged from the instructional ("You're on the Internet") to the inscrutable ("This totally makes up karmically for that time you caused Billy Joel to go to rehab"). I had no idea what any of this meant (or even what it could mean). But what had transpired was this: At 9:09 p.m., the managing editor of Foreign Policy magazine had tweeted the words "interesting letter" to his 48,000 followers, along with a link to an article published in the New York Times Magazine on July 13. What happened after that is totally bizarre and stupidly predictable.

It was an honor to be involved.

First, some necessary background: Since June, I've been writing a column for the New York Times Magazine called "The Ethicist." The existence of this column predates my involvement by many years (I'm now the third person who's occupied this particular title). "The Ethicist" is structured like a conventional advice column, but that's not really what it is; it's more like a collection of nonfictional thought experiments based on questions from the public. The ongoing goal is to isolate moral dilemmas within the day-to-day experience of modern life and to examine the potential ramifications of those quandaries in a readable, objective way.

by Chuck Klosterman, Grantland |  Read more:

Wednesday, November 14, 2012


[ed. Repost]
Mort Kunstler
via:

Jr.

Four years ago, in the fading light of a chilly December afternoon, Jesse Louis Jackson Jr. arrived at a Chicago office building for the most important meeting of his political life. As the eldest son of the Reverend Jesse L. Jackson, Jesse Jr. was no stranger to high-powered summitry. When Jackson was an infant, Martin Luther King Jr. paid visits to his family’s tiny apartment; as a teenager, he accompanied his father to meet with presidents in the Oval Office; by the time he was a young man, and a key adviser to “Reverend” (as he often addressed his father), he was traveling the globe for encounters with Fidel Castro and Nelson Mandela. Now, as the representative for Illinois’s Second Congressional District, Jackson was a political player in his own right—someone whose time was in demand by any number of powerful people, including Barack Obama, who’d tapped Jackson as a co-chair for both his 2004 Senate bid and his just-concluded presidential campaign.

The man with whom Jackson was meeting that afternoon was not a world-historical figure. Illinois governor Rod Blagojevich was under federal investigation for corruption, and a recent poll had put his approval rating at 13 percent. And yet, as far as Jackson was concerned, Blagojevich was a political titan. It was his job to appoint the person who would fill Obama’s Senate seat—an appointment Jackson desperately coveted. Although he was just 43 years old, he had already spent thirteen years in Congress and was itching to move on to bigger things. “I grew up wanting to be just like Dad,” Jackson once said. “Dad wanted to be president.” He’d flirted with runs for U.S. senator and Chicago mayor as possible stepping-stones and was determined not to lose this opportunity. “He’d watched all these people whom he had helped pass him by, especially Barack,” Delmarie Cobb, a Chicago political consultant and a former Jackson adviser, says. “And he was like, ‘Wait a minute, I’ve got to do something!’”

Blagojevich and Jackson had once been friends. When they served together in Congress in the late nineties, they were so close that a colleague referred to the pair as “Salt and Pepper.” And when Blagojevich decided to run for governor in 2002, Jackson pledged his support. But then, according to people close to Jesse Jr., the Reverend Jackson intervened, urging his son to endorse a black candidate. “Junior said, ‘Reverend told me that I needed to shore up my base,’ ” one Jackson confidant recalls. “And he decided to take his dad’s advice.”

As a result, Jesse Jr. knew that, left to his own devices, Blagojevich would never appoint him to the Senate. So he launched an aggressive lobbying effort that would essentially leave Blagojevich with no other alternative. Jackson commissioned a poll that showed him to be the leading choice of Illinois voters to replace Obama. He secured endorsements from newspaper editorial boards and Illinois politicos. He even turned to his family’s celebrity friends. In a memo prepared for Bill Cosby, Jesse Jr. furnished the comedian with the governor’s home telephone number, the correct pronunciation of his name (“Blah-goy-a-vitch”), and talking points in favor of his appointment. “My strategy was to run a public campaign,” Jackson later explained, “as public as possible.”

But the campaign to get Jesse Jr. to the Senate also had a private side. Four days before the presidential election, Raghu Nayak, a Chicago businessman and a longtime friend and supporter of the Jacksons, approached Robert Blagojevich, the governor’s brother and chief fund-raiser, with a proposal. If Rod Blagojevich appointed Jesse Jr. to fill Obama’s Senate seat, Nayak and his friends in Chicago’s Indian community would raise $6 million for the governor’s reelection campaign.

Initially, the governor was not moved. As he infamously explained to an aide, unaware that the FBI had tapped his phone: “I’ve got this thing and it’s fucking golden, and … I’m just not giving it up for fuckin’ nothing.” He told another aide that the thought of appointing Jackson was “repugnant” and that “I can’t believe anything he says.” But as Blagojevich repeatedly tried and failed to auction off the Senate seat—for monetary and political concessions—his opposition to Jesse Jr. seemed to soften. For weeks, Jackson had been seeking a meeting with Blagojevich, with whom he hadn’t spoken in four years, to discuss the appointment. On December 8, 2008, the governor finally granted him one.

For 90 minutes, the erstwhile friends, along with Blagojevich’s chief of staff, met in the governor’s Chicago office, one of the few places in Blago’s world the Feds hadn’t bugged. Jackson began the conversation with a mea culpa, apologizing to Blagojevich for not endorsing him in 2002. Then he proceeded to make the case for his appointment. He presented a binder full of polling data, newspaper endorsements, and letters of support. He also pledged that, if appointed, he would run with Blagojevich when both men were up for election in 2010. At no point, according to the subsequent sworn testimony of all three men who attended, was there any discussion of money. When the meeting was over, Blagojevich seemed impressed and told Jackson that it had been a good interview and he would soon have him back for another.

by Jason Zengerle, New York Magazine |  Read more:
Photo: Yuri Gripas/Reuters. Photo-illustration by Joe Darrow

An Author Can Dream

[ed. I haven't read the book. I'm already exhausted by the review.]

Six years ago, with his rambunctious debut novel, “Dr. Pitcher’s Experimental Mistress,” the chronicle of a timid Iowa chiropractor’s Ambien-fueled erotic awakening aboard a sinking Alaskan cruise ship, Samson Graham-Muñoz, then just 23 years old, gained an instant reputation as a limber verbal gymnast. Told in the form of a blog-within-a-blog, written by the eponymous physician in a blurred state of somnambulant arousal (the doctor types notes on his iPad during sex), the book gained a small but zealous following among fans of droll divertissements. Still, there were some critics, including this one, who found the performance more impish than inspired. Graham-Muñoz was clearly a talent on a tear, but where exactly he was headed was anybody’s guess.

The question now is why we ever doubted him. “The String Theory Quartet,” his sophomore effort, is no less audacious than its predecessor. But this time the pyrotechnics are imbued with a wounded humanity, like firecrackers that go “ouch” instead of “pop” or Roman candles that sigh as they shoot off sparks. Graham-Muñoz the antic boy wonder has matured, enriching the cerebral with the intestinal. His smart, soulful writing lodges in the gut, delivering resonant artistic thrills that even casual readers will find accessible.

The book of the moment? To be sure. A book for the ages? It’s too soon to say. But it isn’t too soon to say, loudly, in public, with arms raised high: The literary times they are a-changin’ and “The String Theory Quartet” is why.

by Walter Kirn, NY Times | Read more:
Illustration by Rodrigo Corral

Is There Such a Thing as the Female Conscience?

The foundational divide between moral law and virtue is usually cast in Greek terms as “the universal” vs. “the particular.” Is moral law the same everywhere? Yes, say the universalists. What we call virtue in Sparta is also virtue in Athens. If murder is wrong in one city, it is wrong in all. To universalists, there can be no separate morality for each and every culture, creating thereby a world in which we are strangers to one another, a world in which moral evaluations are turned upside down as we move from place to place.

Those who defend the other side of the debate—the particular—argue strong and moderate versions. The strong holds that there is no universal language of moral virtue, no general moral truth. There is only the code of virtue embedded in our culture—in our own “language game,” as twentieth-century philosopher Ludwig Wittgenstein later put it. Wittgenstein famously argued that the limits of his language were the limits of his world. A weaker version of his thesis holds that, while there may not be a bright line between moral universals and particulars, there can be major distinctions between virtuous and non-virtuous behavior, and what cultures consider right or wrong.

Philosophers never truly settled this matter to anyone’s satisfaction. Which is why it goes unresolved to this day. It surfaces most dramatically in times of violent cultural encounter. Thus, there are those who would defend al Qaeda terrorism on the grounds that, within Osama bin Laden’s frame of reference, killing as many enemy non-combatants as possible, including women and children, is the “right thing to do.” In carrying out his violent quest, as the argument goes, bin Laden was only defending his religious convictions.

I recall occasions post-9/11 when I was challenged on this issue in specifically gendered terms. The questions went something like this: What if they (Arab women) don’t even have the words for gender equality that we take for granted? If the faith demands that a woman wear a burqa, how can we say that her equality is violated when a man requires her to do so? Or beats her with a stick should she unwittingly display a bit of ankle? We may not like it, but it is the way the culture works. These were for me rather surprising questions to be put by young American university students in an age of gender parity. But perhaps, I mused, it isn’t so odd after all if we trace the debate from ancient to present and think through the ways our forebears determined all questions of truth, justice, and moral conscience.

We need to add one more ingredient to this already fulsome mix—namely, whether the moral law we set for ourselves was conceived along lines of what we now refer to routinely as race, gender, ethnicity, or religion. The Greeks distinguished between those who were authentically Greek and those who were barbaroi, barbarians. Among the authentically Greek, a further internal distinction was made, and it was by gender. Are men and women identical when it comes to moral law? Can women know the truth of the Forms (Plato’s question) as men of merit can?

Furthermore, if men and women play different social roles based on their respective natures, how do we calibrate their moral standing? How can we judge where the greatest moral good is to be found? Are men, on average, more capable than women of understanding and internalizing universal standards of Truth and Virtue? Plato’s argument cannot be unpacked in detail here but, to reduce it to its simplest form, he held that men were, by nature, more likely to be fit subjects for the contemplative life, a way of life made possible within the polis or city.

In The Republic, Plato’s utopian picture of the ideal if not perfect city, those who rise to the heights at which truth is contemplated are overwhelmingly male and form a class he calls “the guardians.” What sort of society was good, just, and worthy of serving as a template of human virtue? Plato’s formula was simple: A just man can exist without a just city, but a just city cannot exist without at least a few just men. Plato’s guardians were responsible for society’s highest functions; as public-spirited, virtuous men, they would rule for the common good.

As it turns out, Plato made room for women: a few could get in on the act. But, according to him, it would be difficult. Why? Because women were oriented to the particular, to an ethic circumscribed by the household. Such citizens would not be capable of achieving the necessary virtue. They would not easily surrender themselves to the unconditional bond between individual and state that Plato believed necessary to render the polis as one. It follows that the few women who made it into the guardian class would be mated with the boldest and bravest male guardians.

But those women were forbidden from knowing their own infants. When a guardian woman gave birth, her child was taken at once to a special section of the city. There, minders cared for the young. When a child needed to nurse, he or she was handed randomly to a lactating female. Why all these wrenchings? In addition to the hope that breeding between superior males and females would continue to perpetuate an aristocracy of the best and the brightest, it was held that private homes, sexual attachments, and dedication to personal aims would undermine a citizen’s allegiance to the city. Plato cried: “Have we any greater evil for a city than what splits it and makes it many instead of one? Or a greater good than what binds it together?”

And so, the gauntlet was thrown. Every subsequent dispute or dialogue about gender and virtue and conscience owes something to these early formulations. In them, women are a divisive force in the polis. Their devotion to their children and their petty, private worlds limits their moral imaginations and knowledge. Mind you, we need women as we need children to be born. But in the context of early Greek philosophy, women were not trustworthy moral beings. This underlying perception set the basis for all subsequent debates about women and their political and social roles, including their niche in what one might call the “moral economy.” Plato considered them civically unreliable in light of their attachment to narrow, family loves. Does the same hold for that other titan of Greek philosophy, Aristotle? Yes, but this wants a bit of explaining. Aristotle ranks action, the vita activa, above all other human enterprise. In his estimation, two classes of people are cut off from the vita activa: women and slaves.

So it was that the Greek philosophers consigned women to a world of lesser virtue, for the oikos, or household, can never rise to universal moral truths. The home is too mired in the realm of biology and reproduction—an indispensable realm, surely, but limited. Women, slaves, and laborers are “necessary conditions” of the state. Men, by contrast, are integral. In such ways was the class or category “woman” deemed inferior to the class or category “man.” From that premise the rest was straightforward: Women are to be barred from citizenship and an active participation in the polis. They cannot be judged in the same way as a free male. And so, despite disagreements on the moral life, Plato and Aristotle held hands on the gender question—with exceptions here and there. That Plato was willing to admit a few women into his guardian class does little to remedy his overall view of the morally limited family and the private life that the overwhelming majority of women serve.

by Jean Bethke Elshtain, VQR | Read more:

harvest blooms | cosmicspread flickr society6
via:

As Not Seen on TV

[ed. Ouch.]

Guy Fieri, have you eaten at your new restaurant in Times Square? Have you pulled up one of the 500 seats at Guy’s American Kitchen & Bar and ordered a meal? Did you eat the food? Did it live up to your expectations?

Did panic grip your soul as you stared into the whirling hypno wheel of the menu, where adjectives and nouns spin in a crazy vortex? When you saw the burger described as “Guy’s Pat LaFrieda custom blend, all-natural Creekstone Farm Black Angus beef patty, LTOP (lettuce, tomato, onion + pickle), SMC (super-melty-cheese) and a slathering of Donkey Sauce on garlic-buttered brioche,” did your mind touch the void for a minute?

Did you notice that the menu was an unreliable predictor of what actually came to the table? Were the “bourbon butter crunch chips” missing from your Almond Joy cocktail, too? Was your deep-fried “boulder” of ice cream the size of a standard scoop?

What exactly about a small salad with four or five miniature croutons makes Guy’s Famous Big Bite Caesar (a) big (b) famous or (c) Guy’s, in any meaningful sense?

Were you struck by how very far from awesome the Awesome Pretzel Chicken Tenders are? If you hadn’t come up with the recipe yourself, would you ever guess that the shiny tissue of breading that exudes grease onto the plate contains either pretzels or smoked almonds? Did you discern any buttermilk or brine in the white meat, or did you think it tasted like chewy air?

Why is one of the few things on your menu that can be eaten without fear or regret — a lunch-only sandwich of chopped soy-glazed pork with coleslaw and cucumbers — called a Roasted Pork Bahn Mi, when it resembles that item about as much as you resemble Emily Dickinson?

When you have a second, Mr. Fieri, would you see what happened to the black bean and roasted squash soup we ordered?

Hey, did you try that blue drink, the one that glows like nuclear waste? The watermelon margarita? Any idea why it tastes like some combination of radiator fluid and formaldehyde?

At your five Johnny Garlic’s restaurants in California, if servers arrive with main courses and find that the appetizers haven’t been cleared yet, do they try to find space for the new plates next to the dirty ones? Or does that just happen in Times Square, where people are used to crowding?

If a customer shows up with a reservation at one of your two Tex Wasabi’s outlets, and the rest of the party has already been seated, does the host say, “Why don’t you have a look around and see if you can find them?” and point in the general direction of about 200 seats?

What is going on at this new restaurant of yours, really?

by Pete Wells, NY Times |  Read more:
Photo: Casey Kelbaugh for The New York Times

The Secret History of the Aeron Chair

After the great dot-com bust of 2000, there was one lasting symbol of the crash: Herman Miller’s Aeron chair. The ergonomic, mesh-backed office chair was launched in 1994, at the start of the bubble; at a cost of more than $1,000 at the time, it quickly became a status symbol in Silicon Valley—spotted constantly in magazines and in cameos on TV and film. Then, as the dot-coms failed, the chairs went empty. As one information architect told New York magazine years later, he remembered them "piled up in a corner as a kind of corporate graveyard." He went on: “They’re not in my mind an example of hubris as much as they are an example of companies trying to treat their staff more generously than they could actually afford.”

The Aeron was a throne perfectly tailored to Silicon Valley’s vanities. With a frame of high-tech molded plastic, a skin of woven plastic fibers pulled taut, and mechanics that accommodated slouchy rebels, the chair flattered the people who bought it. It was the best engineering money could buy, and it seemed purpose-built for squeaky-voiced billionaires inventing the future in front of a computer. But the Aeron’s origin story isn’t so simple. The apotheosis of the office chair—and perhaps the only one ever to become a recognizable and coveted brand name among cubicle-dwellers—was actually the unexpected fruit of a 10-year effort to create better furniture for the elderly.

One of the Aeron’s designers was Bill Stumpf, the son of a gerontology nurse and a preternaturally keen observer of human behavior. So he was well primed in the late 1970s, when the American furniture company Herman Miller began casting about for growth prospects and hired Stumpf and Don Chadwick—who had done several pieces for Herman Miller—to investigate the potential of furniture for the elderly. It seemed like a tantalizing market opportunity. The American populace was aging quickly, assisted living facilities were rare, and hospitals lacked ergonomic furniture suited to long-term care. In each environment, Stumpf and Chadwick observed the surest sign of an opportunity: furniture being used in unintended ways. The homely workhorse common in both medical and residential settings was the La-Z-Boy. In hospitals, the elderly often got dialysis in semireclined La-Z-Boys; at home they spent hours in them watching TV. “The chair becomes the center of one’s universe. These sorts of realizations at the time weren’t just overlooked, they weren’t [deemed] important,” says Clark Malcolm, who helped manage the project. Those observation studies and focus groups “made Bill and Don focus on seating, in a way they never had before.”

The La-Z-Boy was terribly suited to both settings. The elderly, with weakened legs, had to back up to the chair and simply fall backward. The lever for reclining was awkward to reach and hard to engage. And, worst of all, the foam stuffing, often upholstered in vinyl, spread the sitter’s weight unevenly while retaining body heat and moisture—potentially causing bedsores.

Stumpf and Chadwick addressed all of those problems with the Sarah chair, which was finally completed in 1988, as part of a larger study of in-home medical equipment dubbed Metaforms. To solve the falling-backward problem, they settled on a footrest that, when closed, folded in under the seat, leaving the sitter with room to curl her legs under the chair as she sat down, thus bracing herself. When a sitter was fully reclined, fins flipped up, supporting her feet—like the fins on a wheelchair—and keeping them from falling asleep. The lever was banished in favor of a pneumatic control inspired by the recline buttons found on airplane seats.

But the chair’s greatest innovation was hidden: Its foam cushions were supported not by an upholstered wooden box, as was typical at the time, but by a span of plastic fabric stretched across a plastic frame. The foam could thus be thinner and more able to mold to the body. And because the foam’s backing was exposed to air, the design mitigated heat build-up.

“People became emotionally attached to that chair,” says Gary Miller, who headed Herman Miller’s R&D department at the time and eventually oversaw the Aeron project. Everyone who sat in it seemed to mention a parent or a sibling who needed it. But Herman Miller’s management balked at how futuristic it was. No one could figure out how to sell it, since there weren’t any stores selling high-design furniture to the elderly. The company was in far greater need of high-margin office chairs, so they killed the Sarah.

by Cliff Kuang, Slate |  Read more:
Image: Herman Miller

Clive Thompson: The Folding Game


On a Thursday night in September, I raced from Midtown to Bushwick for an impromptu conference organized by Arikia Millikan in what was dubbed a mansion, but I understood to be a large house. I sat on a wooden floor as ten people talked for ten minutes each, all speaking about secrets. One such person was Wired columnist Clive Thompson, who told us how gamers had solved a decade-long scientific mystery in a single month. As a suspicious non-gamer, I was amazed to find altruism within the World of Warcraft. Weeks later, we met at a café in Park Slope to discuss how the increasing complexity of video games led to groupthink, and how groupthink has been harnessed by researchers for scientific gain.
—Erika Anderson for Guernica

Guernica: How would you describe the evolution of video games?

Clive Thompson: When games started out, they were very, very simple affairs, and that was partly just technical—you couldn’t do very much. They had like 4K of memory. And so the games started off really not needing instructions at all. The first Pong game had one instruction. It was, “Avoid missing ball for high score.” So it was literally just that: don’t fail to hit the ball. I remember when I read it, it was actually a confusing construction: avoid missing ball for high score. It’s weirdly phrased, as if it were being translated from Swedish or something, you know? But they didn’t know what they were doing.

But what started happening very early on was that if you were in the arcades as I was—I’m 44 in October, so I was right at that age when these games were coming out—the games were really quite hard in a way, and because they were taking a quarter from you, their goal was to have you stop playing quickly because they need more money. They ramped up in difficulty very quickly, like the next wave is harder, and the third wave is unbelievably harder. And so you had to learn how to play them by trial and error with yourself but you only had so much money. And so what you started doing was you started observing other people and you started talking to all the other people. What you saw when you went to a game was one person playing and a semi-circle of people around them and they were all talking about what was going on, to try to figure out how to play the game. And they would learn all sorts of interesting strategy.

So the early stuff was literally strategy, and they discovered some very clever things, like in Asteroids, there was this strategy called lurking, whereby if you parked, if you got rid of almost all the asteroids except for one little one that would be sort of soaring through the air, you could hide in the corner of the screen and it would take a long time for it to come anywhere close to hitting. You could use this strategy to hunt for hours. You would sit there for hours getting more and more points.

So there are all these little strategies like that. But there was also like weird little bugs inside the game that weren’t intentionally put there, that people would discover. Like there were certain situations when you were being chased by the red ghosts of Pac-Man, certain places on the board where you could suddenly go racing right through the ghost without being hit. That was not intentional, that’s just a bug, and there are little bugs like that. And these are incredibly difficult things to notice, I mean the designers didn’t notice them. But if you have tens of thousands of kids in these arcades talking and observing and sharing notes, it’s a little bit like a scientific process whereby everyone notices one little fact and you slowly compile them into a theory of gravity or a theory of how cell biology works.

What started happening is that the game designers began to intentionally put secrets inside the games. The first one was Adventure, it was a game on the Atari 2600, and back then you didn’t know who had made the game, there were no credits anywhere. The designers would be hired for six months to produce a game and they were dispatched and it was this freelance economy, and they didn’t like not having any credit, so this guy decided to hide the credits inside the game. And he made a secret room inside Adventure and the only way to find it if you’re like stumbling around, trying to find something and get out without getting killed by these dragons, there’s a single pixel on one room—you might not even notice it, right?—but if you picked it up, you could go through a wall—an unmarked area in a wall—and get to this room, and inside the room it would have his name spelled out. It was incredible. There were no clues at all, but somehow kids started finding this. Someone would blunder through the wall, and they’d tell someone else and it just went through the grapevine and that’s the first example of an intentional secret put inside a game that was discovered by this sort of groupthink process that kids all playing and talking in school about what’s going on.

And that was sort of the beginning of what became an arms race, because essentially, as more kids started talking to each other and sharing information, any secrets inside the game would get discovered very quickly. And so the game designers started responding by putting more stuff inside the game that was harder to find, because they knew that kids were not just playing the game by themselves but playing it collectively, and so they weren’t just designing a game for one player, but for a 100 or 1,000 networked players, and a 100 or 1,000 people talking to each other is much smarter than one person individually.

by Clive Thompson, Guernica |  Read more:
Image from Flickr via billmcclair

Tuesday, November 13, 2012


[ed. This could be one of my all-time favorite pics.]
via: