Thursday, August 11, 2011

It’s the Economy, Dummkopf!

With Greece and Ireland in economic shreds, while Portugal, Spain, and perhaps even Italy head south, only one nation can save Europe from financial Armageddon: a highly reluctant Germany. The ironies—like the fact that bankers from Düsseldorf were the ultimate patsies in Wall Street’s con game—pile up quickly as Michael Lewis investigates German attitudes toward money, excrement, and the country’s Nazi past, all of which help explain its peculiar new status.

by Michael Lewis

By the time I arrived in Hamburg the fate of the financial universe seemed to turn on which way the German people jumped. Moody’s was set to downgrade the Portuguese government’s debt to junk-bond status, and Standard & Poor’s had hinted darkly that Italy might be next. Ireland was about to be downgraded to junk status, too, and there was a very real possibility that the newly elected Spanish government might seize the moment to announce that the old Spanish government had miscalculated, and owed foreigners a lot more money than they previously imagined. Then there was Greece. Of the 126 countries with rated debt, Greece now ranked 126th: the Greeks were officially regarded as the least likely people on the planet to repay their debts. As the Germans were not only the biggest creditor of the various deadbeat European nations but their only serious hope for future funding, it was left to Germans to act as moral arbiter, to decide which financial behaviors would be tolerated and which would not. As a senior official at the Bundesbank put it to me, “If we say ‘no,’ it’s ‘no.’ Nothing happens without Germany. This is where the losses come to live.” Just a year ago, when German public figures called Greeks cheaters, and German magazines ran headlines like WHY DON’T YOU SELL YOUR ISLANDS, YOU BANKRUPT GREEKS? ordinary Greeks took it as an outrageous insult. In June of this year the Greek government started selling islands, or at any rate created a fire-sale list of a thousand properties—golf courses, beaches, airports, farmlands, roads—that they hoped to sell, to help repay their debts. It was safe to say that the idea for doing this had not come from the Greeks.

To no one but a German is Hamburg an obvious place to spend a vacation, but it happened to be a German holiday, and Hamburg was overrun by German tourists. When I asked the hotel concierge what there was to see in his city, he had to think for a few seconds before he said, “Most people just go to the Reeperbahn.” The Reeperbahn is Hamburg’s red-light district, the largest red-light district in Europe, according to one guidebook, though you have to wonder how anyone figured that out. And the Reeperbahn, as it happens, was why I was there.

Perhaps because they have such a gift for creating difficulties with non-Germans, the Germans have been on the receiving end of many scholarly attempts to understand their collective behavior. In this vast and growing enterprise, a small book with a funny title towers over many larger, more ponderous ones. Published in 1984 by a distinguished anthropologist named Alan Dundes, Life Is Like a Chicken Coop Ladder set out to describe the German character through the stories that ordinary Germans liked to tell one another. Dundes specialized in folklore, and in German folklore, as he put it, “one finds an inordinate number of texts concerned with anality. Scheisse (shit), Dreck (dirt), Mist (manure), Arsch (ass).… Folksongs, folktales, proverbs, riddles, folk speech—all attest to the Germans’ longstanding special interest in this area of human activity.”

The Hamburg red-light district had caught Dundes’s eye because the locals made such a big deal of mud-wrestling. Naked women fought in a metaphorical ring of filth while the spectators wore plastic caps, a sort of head condom, to avoid being splattered. “Thus,” wrote Dundes, “the audience can remain clean while enjoying dirt!” Germans longed to be near the shit, but not in it. This, as it turns out, was an excellent description of their role in the current financial crisis.

The Scheisse Hits the Fan

A week or so earlier, in Berlin, I had gone to see Germany’s deputy minister of finance, a 44-year-old career government official named Jörg Asmussen. The Germans are now in possession of the only Finance Ministry in the big-time developed world whose leaders don’t need to worry whether their economy will collapse the moment investors stop buying their bonds. As unemployment in Greece climbs to the highest level on record (16.2 percent at last count), it falls in Germany to 20-year lows (6.9 percent). Germany appears to have experienced a financial crisis without economic consequences. They’d donned head condoms in the presence of their bankers, and so they had avoided being splattered by the mud. As a result, for the past year or so the financial markets have been trying and failing to get a bead on the German people: they can probably afford to pay off the debts of their fellow Europeans, but will they actually do it? Are they now Europeans, or are they still Germans? Any utterance or gesture by any German official anywhere near this decision for the past 18 months has been a market-moving headline, and there have been plenty, most of them echoing German public opinion, and expressing incomprehension and outrage that other peoples can behave so irresponsibly. Asmussen is one of the Germans now being obsessively watched. He and his boss, Wolfgang Schäuble, are the two German officials present in every conversation between the German government and the deadbeats.

The Finance Ministry, built in the mid-1930s, is a monument to both the Nazis’ ambition and their taste. A faceless butte, it is so big that if you circle it in the wrong direction it can take you 20 minutes to find the front door. I circle it in the wrong direction, then sweat and huff to make up for lost time, all the while wondering if provincial Nazis in from the sticks had had the same experience, wandering outside these forbidding stone walls and trying to figure out how to get in. At length I find a familiar-looking courtyard: the only difference between it and famous old photographs of it is that Hitler is no longer marching in and out of the front door, and the statues of eagles perched atop swastikas have been removed. “It was built for Göring’s Air Ministry,” says the waiting Finance Ministry public-relations man, who is, oddly enough, French. “You can tell from the cheerful architecture.” He then explains that the building is so big because Hermann Göring wanted to be able to land planes on its roof.


Wednesday, August 10, 2011

Luli Sanchez

Screw Optimism and Screw “Sanity”

by Ian Welsh

I recently stumbled across a book on the link between leadership and what we call madness.  From the Amazon review:
Take realism, for instance: study after study has shown that those suffering depression are better than “normal” people at assessing current threats and predicting future outcomes. Looking at Lincoln and Churchill among others, Ghaemi shows how depressive realism helped these men tackle challenges both personal and national. Or consider creativity, a quality psychiatrists have studied extensively in relation to bipolar disorder. A First-Rate Madness shows how mania inspired General Sherman and Ted Turner to design and execute their most creative—and successful—strategies.
Ghaemi’s thesis is both robust and expansive; he even explains why eminently sane men like Neville Chamberlain and George W. Bush made such poor leaders. Though sane people are better shepherds in good times, sanity can be a severe liability in moments of crisis. A lifetime without the cyclical torment of mood disorders, Ghaemi explains, can leave one ill equipped to endure dire straits. He also clarifies which kinds of insanity—like psychosis—make for despotism and ineptitude, sometimes on a grand scale.
Now, I’m not depressive, strictly speaking.  I don’t stay in bed all day, and so on. But the Welsh family motto, no kidding, is this:
An optimist and a damn fool are the same thing.
Ordinary people, what we call “sane” in our society, are really shitty analysts.  Really, really shitty analysts.

Their bias to the upside is tiresome and predictable, and it makes them wrong, over and over and over again.  They don’t know what real threats are; they are constantly confused about what is really dangerous.  They think stranger pedophiles are a big danger to their kids, when it’s really their family members or their own driving.  They think terrorism is dangerous, when almost no one dies from it, as opposed to crossing the street or eating too many Big Macs.  They fear “Osama” when the men who are most likely to cause their death or impoverishment have names like Bush, Paulson, Geithner, Obama and so on.

I walked through Calcutta’s slums, as a teenager, by myself.  I know what’s actually dangerous, and what isn’t.  But my parents didn’t coddle me, didn’t think their job was to make sure I never faced any danger, no matter how minor, so that when released as an adult I wouldn’t know how to evaluate threats.  They also didn’t think my self-esteem should outrun my ability.

Of course optimism is wonderfully adaptive as long as optimists aren’t your leaders or analysts, and don’t run your nuclear power plants, or plan your economies, or make any decisions about anything which if it goes wrong can go catastrophically wrong.  Optimists are happier, they live longer, they’re healthier, they “get up and go”, blah, blah, blah.  Optimism is good for optimists and hey, they’re generally more pleasant to be around, too.  There are time periods when they’re even right a lot (say during the 50s).  But basically, they’re blind.  One imagines conversations between cows. “Hey, they feed us every day, we get free health care, no real responsibility!  The dog makes sure the wolves don’t bother us.  This is great!  I do wonder what happened to Thelma and Fred, when they took them away in that truck?  But I’m sure it wasn’t anything bad, and if it was they must have deserved it, and anyway, that’d never happen to me, because I’m a good cow and this is the best herd in the whole world!”

And you can tell people what will happen, in advance, and be right, over and over and over again.  And what that will do is get you marginalized.  “Oh, he’s so negative! Such a downer. He should make us feel good about ourselves and our future, and if he doesn’t, we won’t listen. Let’s watch some TV!”

Obsessives: Soda Pop

Kerry James Marshall, Nude (Spotlight), 2009

Marvin Gaye


The Cult of Cats


Cats first decided to live among humans over 9,000 years ago. A burial site in Cyprus dating from 7,500 BC provides the earliest evidence, with the corpse of an eight-month-old cat carefully laid out in its own tiny plot less than two feet away from its companion human. This gives human-feline cohabitation a more recent pedigree than human-canine, with dogs having lived alongside humans for well over 10,000 years, but puts cats comfortably ahead of such lesser beasts as chickens, ducks, horses, silkworms and ferrets. And among all domestic animals cats boast a unique distinction: to the best of our knowledge, it was they who chose us.

Or rather, cats chose what humans represented: the plentiful supply of tasty vermin that lived among the stock and refuse of early civilisation. In this, the central dynamic of human-feline relations has altered little over ten millennia: food and shelter are welcome, and the bipeds who come packaged with these lie somewhere between a nuisance and a bonus. As I type these words, a well-fed feline called Jacob is lying across my forearms, where he spends much of the day when I’m writing. I know that he appreciates the stroking as well as the feeding; but I’m equally certain that, if our sizes were reversed, the only thing that would stop him from eating me instantly would be the pleasure of hunting me first.

Vermin-catching skills aside, cats are not useful to humans in any instrumental sense, nor much inclined to put themselves at our service. In contrast to the empathetic, emphatically useful dog, a cat’s mind is an alien and often unsympathetic mix of impulses. And it’s perhaps this combination of indifference and intimacy that has made it a beast of such ambivalent fascination throughout our history. Felines have been gods, demons, spirits and poppets to humankind over the centuries—and that’s before you reach the maelstrom of the internet and its obsessions. They are, in effect, a blank page onto which we doodle our dreams, fears and obsessions.

Thanks to a new book from independent London publishers Merrell, we now have a lavish and delightfully illustrated synopsis of the role of cats in our visual culture. Titled The Cat, and rather more helpfully subtitled 3500 Years of the Cat in Art, it gets off to a bad start by misdating the origins of cats in human history by some 5,000 years, but from then on improves into a thorough monument to feline fascination. Under ten chapter headings, ranging from “early” and “religious” cats to “legendary,” “eastern” and “portrait” examples, author Caroline Bugler dashes through the years with a rich store of anecdotes and antecedents.

Throughout history, the domestication of a species has typically involved humans remoulding the world to suit themselves. In cats, though, we meet the gaze of an alien but equal opportunism; of the only mammal to have invited itself into our homes, persuaded us to feed it, then got us cleaning up the mess afterwards.



Is That All There Is?

The disappearance of God is often considered elegiacally, as a loss. But secularism can also be an affirmation of the here and now.

by James Wood

I have a friend, an analytic philosopher and convinced atheist, who told me that she sometimes wakes in the middle of the night, anxiously turning over a series of ultimate questions: “How can it be that this world is the result of an accidental big bang? How could there be no design, no metaphysical purpose? Can it be that every life—beginning with my own, my husband’s, my child’s, and spreading outward—is cosmically irrelevant?” In the current intellectual climate, atheists are not supposed to have such thoughts. We are locked into our rival certainties—religiosity on one side, secularism on the other—and to confess to weakness on this order is like a registered Democrat wondering if she is really a Republican, or vice versa.

These are theological questions without theological answers, and, if the atheist is not supposed to entertain them, then, for slightly different reasons, neither is the religious believer. Religion assumes that they are not valid questions because it has already answered them; atheism assumes that they are not valid questions because it cannot answer them. But as one gets older, and parents and peers begin to die, and the obituaries in the newspaper are no longer missives from a faraway place but local letters, and one’s own projects seem ever more pointless and ephemeral, such moments of terror and incomprehension seem more frequent and more piercing, and, I find, as likely to arise in the middle of the day as the night. I think of these anxieties as the Virginia Woolf Question, after a passage in that most metaphysical of novels “To the Lighthouse,” when the painter Lily Briscoe is at her easel, mourning her late friend Mrs. Ramsay. Next to her sits the poet, Augustus Carmichael, and suddenly Lily imagines that she and Mr. Carmichael might stand up and demand “an explanation” of life:

For one moment she felt that if they both got up, here, now on the lawn, and demanded an explanation, why was it so short, why was it so inexplicable, said it with violence, as two fully equipped human beings from whom nothing should be hid might speak, then, beauty would roll itself up; the space would fill; those empty flourishes would form into shape; if they shouted loud enough Mrs. Ramsay would return. “Mrs. Ramsay!” she said aloud, “Mrs. Ramsay!” The tears ran down her face.

Why is life so short, why so inexplicable? These are the questions Lily wants answered. More precisely, these are the questions she needs to ask, ironically aware that an answer cannot be had if there is no one to demand it from. We may hope that “nothing should be hid” from us, but certain explanations can only ever be hidden. Just as Mrs. Ramsay has died, and cannot be shouted back to life, so God is dead, and cannot be reimplored into existence. And, as Terrence Malick’s oddly beautiful film “The Tree of Life” reminds us, the answers are still hidden even if we believe in God. Lily Briscoe’s “Why?” is not very different from Job’s “Why, Lord?”


Overdone

Why are restaurant web sites so horrifically bad?

by Farhad Manjoo

The first thing that pops up when you visit the website of the San Francisco restaurant Fleur de Lys is a nearly full-screen animation of celebrity chef Hubert Keller's autograph. That makes sense—when I'm choosing a restaurant, the first thing I want to know is, Can the chef sign his name?

Wait a second, though. What does Chef Keller look like? You're not going to bother with this place if the chef doesn't have a good headshot. Good news! After the signature, the site fades into a snappy photo of Keller. Fortunately, he's a looker—think Peter Fonda with Fabio's hair.

After the autograph and headshot, the site transitions to a "main menu," which presents you with links to Keller's other restaurants and his PBS TV show. Tempted though you are, you stay focused and click for the San Francisco restaurant. One bit of advice: If you've got a subwoofer attached to your computer, now's the time to crank it up, because you're in for some auto-playing, royalty-free, ambient techno smooth jazz! As you stifle your urge to get up and dance, you click around in search of information about the restaurant. (The page emits a friendly beep every time you click.) If you spend the better part of your lunch hour scouring the site, you'll eventually find the menu. What you won't find is the price—it takes a Web search to determine that the tasting menu at Fleur de Lys costs $72 a person.

By this point, you've likely been so beaten down by the music, the nested menus, and the interminable "Loading …" prompts that you're considering Taco Bell for dinner (though it too has a terrible site). Still, I'm not arguing that Hubert Keller is responsible for the worst restaurant website ever created. That's a bit like trying to decide on the most awful serial killer in history. The head-poundingly awful Fleur de Lys site is just one of many in an industry whose collective crimes against Web design are as routine as they are horrific. If you think Fleur de Lys is ugly, check out the site for New York's Buddakan, which launches a full-screen window, auto-plays sitar-heavy technopop, and subjects you to a series of flying panels every time you click. (Eater NY described the site as "like the Inception trailer, but with summer rolls.") Next, check out Cavatore, an Italian restaurant in Houston that hired Web designers who were either a) on a Monty Python-besotted acid trip, or b) looking to induce epileptic seizures. Seriously, this site is so bad it's evil.

While lots of people have noted the general terribleness of restaurant sites, I haven't ever seen an explanation for why this industry's online presence is so singularly bruising. The rest of the Web long ago did away with auto-playing music, Flash buttons and menus, and elaborate intro pages, but restaurant sites seem stuck in 1999. The problem is getting worse in the age of the mobile Web—Flash doesn't work on Apple's devices, and while some of these sites do load on non-Apple smartphones, they take forever to do so, and their finicky navigation makes them impossible to use.

Over the last few weeks I've spent countless hours, now lost forever, plumbing the depths of restaurant Web hell. I also spoke to several industry experts about the reasons behind all these maliciously poorly designed pages. I heard several theories for why restaurant sites are so bad—that they can't afford to pay for good designers, that they don't understand what people want from a site, and that they don't really care what's on their site. But the best answer I found was this: Restaurant sites are the product of restaurant culture. These nightmarish websites were spawned by restaurateurs who mistakenly believe they can control the online world the same way they lord over a restaurant. "In restaurants, the expertise is in the kitchen and in hospitality in general," says Eng San Kho, a partner at the New York design firm Love and War, which has created several unusually great restaurant sites (more on those in a bit). "People in restaurants have a sense that they want to create an entertainment experience online—that's why disco music starts, that's why Flash slideshows open. They think they can still play the host even here online."

But it'd be a mistake to blame the chefs for these sites. They were all aided by Web designers who were either too unscrupulous or unsophisticated to disabuse them of their ideas. In fact, when you dig into some of the worst restaurant sites, you notice that they share the same designers. The Inception-like Buddakan site was built by a firm called 160over90, whose portfolio also includes New York's Morimoto (10-second load time, obnoxious bass-heavy music, a cut-out snapshot of the chef that constantly hovers on the screen) and Philadelphia's Butcher and Singer (flashes an old-movie-reel countdown while it loads, then plays old-timey music that will drive you mad). I tried to contact 160over90 and the designers behind other sites mentioned in this article to ask them, essentially, why they sucked. Not surprisingly, I got no response.

I did get a plausible-sounding explanation of the design process from Tom Bohan, who heads up Menupages, the fantastic site that lists menus of restaurants in several large cities. "Say you're a designer and you've got to demo a site you've spent two months creating," Bohan explains. "Your client is someone in their 50s who runs a restaurant but is not very in tune with technology. What's going to impress them more: Something with music and moving images, something that looks very fancy to someone who doesn't know about optimizing the Web for consumer use, or if you show them a bare-bones site that just lists all the information? I bet it would be the former—they would think it's great and money well spent."


Tuesday, August 9, 2011

Kristian Schuller - Fashion by c.neeon

Frank Zappa - Decline of the Music Industry


[ed.  Seems like this has broader applications than just the music industry.]

Aloe Blacc


The Endless Summer

by Andrew Cohen

I have never surfed—never even dreamed of surfing or had any inclination to pick up a board—and yet in this summer of our discontent I have become mesmerized by Bruce Brown's timeless documentary The Endless Summer. It is a beautifully shot film from pristine locales chronicling the worldly travels of two dashing surfer dudes in the mid 1960s. Brown's masterpiece has been airing over and over again this summer on ESPN Classic, and it seems to me like a perfect antidote to all the bad news coming out of Washington these days.

Wouldn't we all like to leave everything behind and go out in search of the perfect wave right about now? Wouldn't we like to worry about nothing more than finding the right beach with the right surf and the right water temperature? I'd bet the ranch that President Barack Obama, he of the Hawaiian birthplace, would sign on to that deal if he could. If the movie were food it would be your favorite dish at the local diner. If it were a song it would be the sort people pay to listen to in order to fall asleep. If I were a doctor, I would prescribe it to my patients.

Here's how Brown's people subsequently described what he accomplished nearly 50 years ago:
In 1964, filmmaker Bruce Brown decided to follow two surfers around the world in search of a perfect wave. On a budget of only US $50 thousand, with a 16mm camera, he captured the essence, the adventure, and the art of surfing. Hence the renowned The Endless Summer. From the waters of West Africa, through the seas of Australia, to Tahiti, two surfers from California achieved their great dream: to try the wildest waves in the world.

The documentary was released in 1966 to surprisingly good reviews from mainstream movie critics. The timing was serendipitous. The technology of filmmaking would not have allowed the film to be made five years earlier. And five years later, in 1971, the sun and fun would have seemed far too frivolous following the race riots, Kent State, and the body bags coming home from Southeast Asia. For these reasons, The Endless Summer seems as much of a period piece as Citizen Kane or Gone With the Wind. Yes, son, there really was a time when the beaches were clear and no one bugged you to put on sunscreen.

The film indeed revels in the absence of anything weighty. There is a single remark by Brown about South Africa's apartheid—he lamely notes that the area's sharks and porpoises segregate themselves in the water. There is a sexist remark about the bathing suits of Australia's female surfers. A few locals here and there are made fun of. And that's about the extent of the film's political message. We don't know what the boys think about anything beyond what they think of the water and the waves and the size of the surf. They aren't characters so much as props.

The film's philosophical message, on the other hand, is front and center: There is art and science in most human endeavors, including the ones that ultimately matter the least to the story of our existence on Earth. The "perfect wave" doesn't exist only in the perfect world these men inhabited during their journey. And yet the surfers were as beautiful and as graceful as the beaches and waves upon which they played. They were as carefree as the fish they saw in the water or the animals they saw on land. No wonder the Beach Boys used the title for their memorable 1974 compilation album (Side 1: "Surfin' Safari," "Surfer Girl," "Catch a Wave," "The Warmth of the Sun," and "Surfin' USA").

Can the Middle Class be Saved?

by Don Peck

In October 2005, three Citigroup analysts released a report describing the pattern of growth in the U.S. economy. To really understand the future of the economy and the stock market, they wrote, you first needed to recognize that there was “no such animal as the U.S. consumer,” and that concepts such as “average” consumer debt and “average” consumer spending were highly misleading.

In fact, they said, America was composed of two distinct groups: the rich and the rest. And for the purposes of investment decisions, the second group didn’t matter; tracking its spending habits or worrying over its savings rate was a waste of time. All the action in the American economy was at the top: the richest 1 percent of households earned as much each year as the bottom 60 percent put together; they possessed as much wealth as the bottom 90 percent; and with each passing year, a greater share of the nation’s treasure was flowing through their hands and into their pockets. It was this segment of the population, almost exclusively, that held the key to future growth and future returns. The analysts, Ajay Kapur, Niall Macleod, and Narendra Singh, had coined a term for this state of affairs: plutonomy.

In a plutonomy, Kapur and his co-authors wrote, “economic growth is powered by and largely consumed by the wealthy few.” America had been in this state twice before, they noted—during the Gilded Age and the Roaring Twenties. In each case, the concentration of wealth was the result of rapid technological change, global integration, laissez-faire government policy, and “creative financial innovation.” In 2005, the rich were nearing the heights they’d reached in those previous eras, and Citigroup saw no good reason to think that, this time around, they wouldn’t keep on climbing. “The earth is being held up by the muscular arms of its entrepreneur-plutocrats,” the report said. The “great complexity” of a global economy in rapid transformation would be “exploited best by the rich and educated” of our time.

Kapur and his co-authors were wrong in some of their specific predictions about the plutonomy’s ramifications—they argued, for instance, that since spending was dominated by the rich, and since the rich had very healthy balance sheets, the odds of a stock-market downturn were slight, despite the rising indebtedness of the “average” U.S. consumer. And their division of America into only two classes is ultimately too simple. Nonetheless, their overall characterization of the economy remains resonant. According to Gallup, from May 2009 to May 2011, daily consumer spending rose by 16 percent among Americans earning more than $90,000 a year; among all other Americans, spending was completely flat. The consumer recovery, such as it is, appears to be driven by the affluent, not by the masses. Three years after the crash of 2008, the rich and well educated are putting the recession behind them. The rest of America is stuck in neutral or reverse.

The ease with which the rich and well educated have shrugged off the recession shouldn’t be surprising; strong winds have been at their backs for many years. The recession, meanwhile, has restrained wage growth and enabled faster restructuring and offshoring, leaving many corporations with lower production costs and higher profits—and their executives with higher pay.

“The rich seem to be on the road to recovery,” says Emmanuel Saez, an economist at Berkeley, while those in the middle, especially those who’ve lost their jobs, “might be permanently hit.” Coming out of the deep recession of the early 1980s, Saez notes, “you saw an increase in inequality … as the rich bounced back, and unionized labor never again found jobs that paid as well as the ones they’d had. And now I fear we’re going to see the same phenomenon, but more dramatic.” Middle-paying jobs in the U.S., in which some workers have been overpaid relative to the cost of labor overseas or technological substitution, “are being wiped out. And what will be left is a hard and a pure market,” with the many paid less than before, and the few paid even better—a plutonomy strengthened in the crucible of the post-crash years.


Nickel and Dimed (2011 Version)

It was at lunch with the editor of Harper’s Magazine that the subject came up: How does anyone actually live “on the wages available to the unskilled”?  And then Barbara Ehrenreich said something that altered her life and resulted, improbably enough, in a bestselling book with almost two million copies in print.  “Someone,” she commented, “ought to do the old-fashioned kind of journalism -- you know, go out there and try it for themselves.”  She meant, she hastened to point out on that book’s first page, “someone much younger than myself, some hungry neophyte journalist with time on her hands.”

That was 1998 and, somewhat to her surprise, Ehrenreich soon found herself beginning the first of a whirl of unskilled “careers” as a waitress at a “family restaurant” attached to a big discount chain hotel in Key West, Florida, at $2.43 an hour plus tips.  And the rest, of course, is history.  The now famous book that resulted, Nickel and Dimed: On (Not) Getting By in America, is just out in its tenth anniversary edition with a new afterword by Ehrenreich -- perfectly timed for an American era in which the book’s subtitle might have to be changed to “On (Not) Getting a Job in America.”  TomDispatch takes special pride in offering Ehrenreich’s new afterword, adapted and shortened, for a book that, in its latest edition, deserves to sell another million copies.

On Turning Poverty into an American Crime
By Barbara Ehrenreich

I completed the manuscript for Nickel and Dimed in a time of seemingly boundless prosperity. Technology innovators and venture capitalists were acquiring sudden fortunes, buying up McMansions like the ones I had cleaned in Maine -- and much larger. Even secretaries in some high-tech firms were striking it rich with their stock options. There was loose talk about a permanent conquest of the business cycle, and a sassy new spirit infecting American capitalism. In San Francisco, a billboard for an e-trading firm proclaimed, “Make love not war,” and then -- down at the bottom -- “Screw it, just make money.”

When Nickel and Dimed was published in May 2001, cracks were appearing in the dot-com bubble and the stock market had begun to falter, but the book still evidently came as a surprise, even a revelation, to many. Again and again, in that first year or two after publication, people came up to me and opened with the words, “I never thought...” or “I hadn’t realized...”

To my own amazement, Nickel and Dimed quickly ascended to the bestseller list and began winning awards. Criticisms, too, have accumulated over the years. But for the most part, the book has been far better received than I could have imagined it would be, with an impact extending well into the more comfortable classes. A Florida woman wrote to tell me that, before reading it, she’d always been annoyed at the poor for what she saw as their self-inflicted obesity. Now she understood that a healthy diet wasn’t always an option. And if I had a quarter for every person who’s told me he or she now tipped more generously, I would be able to start my own foundation.

Even more gratifying to me, the book has been widely read among low-wage workers. In the last few years, hundreds of people have written to tell me their stories: the mother of a newborn infant whose electricity had just been turned off, the woman who had just been given a diagnosis of cancer and has no health insurance, the newly homeless man who writes from a library computer.

At the time I wrote Nickel and Dimed, I wasn’t sure how many people it directly applied to -- only that the official definition of poverty was way off the mark, since it defined an individual earning $7 an hour, as I did on average, as well out of poverty. But three months after the book was published, the Economic Policy Institute in Washington, D.C., issued a report entitled “Hardships in America: The Real Story of Working Families,” which found an astounding 29% of American families living in what could be more reasonably defined as poverty, meaning that they earned less than a barebones budget covering housing, child care, health care, food, transportation, and taxes -- though not, it should be noted, any entertainment, meals out, cable TV, Internet service, vacations, or holiday gifts. Twenty-nine percent is a minority, but not a reassuringly small one, and other studies in the early 2000s came up with similar figures.

The big question, 10 years later, is whether things have improved or worsened for those in the bottom third of the income distribution, the people who clean hotel rooms, work in warehouses, wash dishes in restaurants, care for the very young and very old, and keep the shelves stocked in our stores. The short answer is that things have gotten much worse, especially since the economic downturn that began in 2008.

Post-Meltdown Poverty

When you read about the hardships I found people enduring while I was researching my book -- the skipped meals, the lack of medical care, the occasional need to sleep in cars or vans -- you should bear in mind that those occurred in the best of times. The economy was growing, and jobs, if poorly paid, were at least plentiful.

In 2000, I had been able to walk into a number of jobs pretty much off the street. Less than a decade later, many of these jobs had disappeared and there was stiff competition for those that remained. It would have been impossible to repeat my Nickel and Dimed “experiment,” had I been so inclined, because I would probably never have found a job.

For the last couple of years, I have attempted to find out what was happening to the working poor in a declining economy -- this time using conventional reporting techniques like interviewing. I started with my own extended family, which includes plenty of people without jobs or health insurance, and moved on to trying to track down a couple of the people I had met while working on Nickel and Dimed.

This wasn’t easy, because most of the addresses and phone numbers I had taken away with me had proved to be inoperative within a few months, probably due to moves and suspensions of telephone service. Over the years I had kept in touch with “Melissa,” who was still working at Wal-Mart, where her wages had risen from $7 to $10 an hour, but in the meantime her husband had lost his job. “Caroline,” now in her 50s and partly disabled by diabetes and heart disease, had left her deadbeat husband and was subsisting on occasional cleaning and catering jobs. Neither seemed unduly afflicted by the recession, but only because they had already been living in what amounts to a permanent economic depression.