Friday, July 20, 2012

War Is Betrayal

We condition the poor and the working class to go to war. We promise them honor, status, glory, and adventure. We promise boys they will become men. We hold these promises up against the dead-end jobs of small-town life, the financial dislocations, credit card debt, bad marriages, lack of health insurance, and dread of unemployment. The military is the call of the Sirens, the enticement that has for generations seduced young Americans working in fast food restaurants or behind the counters of Walmarts to fight and die for war profiteers and elites.

The poor embrace the military because every other cul-de-sac in their lives breaks their spirit and their dignity. Pick up Erich Maria Remarque’s All Quiet on the Western Front or James Jones’s From Here to Eternity. Read Henry IV. Turn to the Iliad. The allure of combat is a trap, a ploy, an old, dirty game of deception in which the powerful, who do not go to war, promise a mirage to those who do.

I saw this in my own family. At the age of ten I was given a scholarship to a top New England boarding school. I spent my adolescence in the schizophrenic embrace of the wealthy, on the playing fields and in the dorms and classrooms that condition boys and girls for privilege, and came back to my working-class relations in the depressed former mill towns in Maine. I traveled between two universes: one where everyone got chance after chance after chance, where connections and money and influence almost guaranteed that you would not fail; the other where no one ever got a second try. I learned at an early age that when the poor fall no one picks them up, while the rich stumble and trip their way to the top.

Those I knew in prep school did not seek out the military and were not sought by it. But in the impoverished enclaves of central Maine, where I had relatives living in trailers, nearly everyone was a veteran. My grandfather. My uncles. My cousins. My second cousins. They were all in the military. Some of them—including my Uncle Morris, who fought in the infantry in the South Pacific during World War II—were destroyed by the war. Uncle Morris drank himself to death in his trailer. He sold the hunting rifle my grandfather had given to me to buy booze.

He was not alone. After World War II, thousands of families struggled with broken men who, because they could never read the approved lines from the patriotic script, had been discarded. They were not trotted out for red-white-and-blue love fests on the Fourth of July or Veterans Day.

The myth of war held fast, despite the deep bitterness of my grandmother—who acidly denounced what war had done to her only son—and of others like her. The myth held because it was all the soldiers and their families had. Even those who knew it to be a lie—and I think most did—were loath to give up the fleeting moments of recognition, the only times in their lives they were told they were worth something.

“For it’s Tommy this, an’ Tommy that, an’ ‘Chuck him out, the brute!’” Rudyard Kipling wrote. “But it’s ‘Saviour of ’is country’ when the guns begin to shoot.”

Any story of war is a story of elites preying on the weak, the gullible, the marginal, the poor. I do not know of a single member of my graduating prep school class who went into the military. You could not say this about the high school class that graduated the same year in Mechanic Falls, Maine.

by Chris Hedges, Boston Review |  Read more:
Photograph by Teddy Wade, U.S. Army

Global Warming's Terrifying New Math

If the pictures of those towering wildfires in Colorado haven't convinced you, or the size of your AC bill this summer, here are some hard numbers about climate change: June broke or tied 3,215 high-temperature records across the United States. That followed the warmest May on record for the Northern Hemisphere – the 327th consecutive month in which the temperature of the entire globe exceeded the 20th-century average, the odds of which occurring by simple chance were one in 3.7 x 10^99, a number considerably larger than the number of stars in the universe.

Meteorologists reported that this spring was the warmest ever recorded for our nation – in fact, it crushed the old record by so much that it represented the "largest temperature departure from average of any season on record." The same week, Saudi authorities reported that it had rained in Mecca despite a temperature of 109 degrees, the hottest downpour in the planet's history.

Not that our leaders seemed to notice. Last month the world's nations, meeting in Rio for the 20th-anniversary reprise of a massive 1992 environmental summit, accomplished nothing. Unlike George H.W. Bush, who flew in for the first conclave, Barack Obama didn't even attend. It was "a ghost of the glad, confident meeting 20 years ago," the British journalist George Monbiot wrote; no one paid it much attention, footsteps echoing through the halls "once thronged by multitudes." Since I wrote one of the first books for a general audience about global warming way back in 1989, and since I've spent the intervening decades working ineffectively to slow that warming, I can say with some confidence that we're losing the fight, badly and quickly – losing it because, most of all, we remain in denial about the peril that human civilization is in.

When we think about global warming at all, the arguments tend to be ideological, theological and economic. But to grasp the seriousness of our predicament, you just need to do a little math. For the past year, an easy and powerful bit of arithmetical analysis first published by financial analysts in the U.K. has been making the rounds of environmental conferences and journals, but it hasn't yet broken through to the larger public. This analysis upends most of the conventional political thinking about climate change. And it allows us to understand our precarious – our almost-but-not-quite-finally hopeless – position with three simple numbers.

The First Number: 2° Celsius

If the movie had ended in Hollywood fashion, the Copenhagen climate conference in 2009 would have marked the culmination of the global fight to slow a changing climate. The world's nations had gathered in the December gloom of the Danish capital for what a leading climate economist, Sir Nicholas Stern of Britain, called the "most important gathering since the Second World War, given what is at stake." As Danish energy minister Connie Hedegaard, who presided over the conference, declared at the time: "This is our chance. If we miss it, it could take years before we get a new and better one. If ever."

In the event, of course, we missed it. Copenhagen failed spectacularly. Neither China nor the United States, which between them are responsible for 40 percent of global carbon emissions, was prepared to offer dramatic concessions, and so the conference drifted aimlessly for two weeks until world leaders jetted in for the final day. Amid considerable chaos, President Obama took the lead in drafting a face-saving "Copenhagen Accord" that fooled very few. Its purely voluntary agreements committed no one to anything, and even if countries signaled their intentions to cut carbon emissions, there was no enforcement mechanism. "Copenhagen is a crime scene tonight," an angry Greenpeace official declared, "with the guilty men and women fleeing to the airport." Headline writers were equally brutal: COPENHAGEN: THE MUNICH OF OUR TIMES? asked one.

The accord did contain one important number, however. In Paragraph 1, it formally recognized "the scientific view that the increase in global temperature should be below two degrees Celsius." And in the very next paragraph, it declared that "we agree that deep cuts in global emissions are required... so as to hold the increase in global temperature below two degrees Celsius." By insisting on two degrees – about 3.6 degrees Fahrenheit – the accord ratified positions taken earlier in 2009 by the G8 and the so-called Major Economies Forum. It was as conventional as conventional wisdom gets. The number first gained prominence, in fact, at a 1995 climate conference chaired by Angela Merkel, then the German minister of the environment and now the center-right chancellor of the nation.

Some context: So far, we've raised the average temperature of the planet just under 0.8 degrees Celsius, and that has caused far more damage than most scientists expected. (A third of summer sea ice in the Arctic is gone, the oceans are 30 percent more acidic, and since warm air holds more water vapor than cold, the atmosphere over the oceans is a shocking five percent wetter, loading the dice for devastating floods.) Given those impacts, in fact, many scientists have come to think that two degrees is far too lenient a target. "Any number much above one degree involves a gamble," writes Kerry Emanuel of MIT, a leading authority on hurricanes, "and the odds become less and less favorable as the temperature goes up." Thomas Lovejoy, once the World Bank's chief biodiversity adviser, puts it like this: "If we're seeing what we're seeing today at 0.8 degrees Celsius, two degrees is simply too much." NASA scientist James Hansen, the planet's most prominent climatologist, is even blunter: "The target that has been talked about in international negotiations for two degrees of warming is actually a prescription for long-term disaster." At the Copenhagen summit, a spokesman for small island nations warned that many would not survive a two-degree rise: "Some countries will flat-out disappear." When delegates from developing nations were warned that two degrees would represent a "suicide pact" for drought-stricken Africa, many of them started chanting, "One degree, one Africa."

Despite such well-founded misgivings, political realism bested scientific data, and the world settled on the two-degree target – indeed, it's fair to say that it's the only thing about climate change the world has settled on. All told, 167 countries responsible for more than 87 percent of the world's carbon emissions have signed on to the Copenhagen Accord, endorsing the two-degree target. Only a few dozen countries have rejected it, including Kuwait, Nicaragua and Venezuela. Even the United Arab Emirates, which makes most of its money exporting oil and gas, signed on. The official position of planet Earth at the moment is that we can't raise the temperature more than two degrees Celsius – it's become the bottomest of bottom lines. Two degrees.

by Bill McKibben, Rolling Stone |  Read more:
Illustration by Edel Rodriguez

Thursday, July 19, 2012

Michelle Jenneke, Junior World Championships, Barcelona 2012


The Future of Manufacturing Is in America, Not China


A furor broke out last week after it was reported that the uniforms of U.S. Olympians would be manufactured in China. "They should take all the uniforms, put them in a big pile, and burn them," said an apoplectic Sen. Harry Reid. The story tapped into the anger -- and fear -- that Americans feel about the loss of manufacturing to China. Seduced by government subsidies, cheap labor, lax regulations, and a rigged currency, U.S. industry has rushed to China in recent decades, with millions of American jobs lost. It is these fears, rather than the Olympic uniforms themselves, that triggered last week's congressional uproar.

But Ralph Lauren berets aside, the larger trends show that the tide has turned, and it is China's turn to worry. Many CEOs, including Dow Chemical's Andrew Liveris, have declared their intentions to bring manufacturing back to the United States. What is going to accelerate the trend isn't, as many believe, the rising cost of Chinese labor or a rising yuan. The real threat to China comes from technology. Technical advances will soon hollow out China's manufacturing industry, just as they have hollowed out U.S. industry over the past two decades.

Several advancing and converging technologies will cause this.

First, robotics. The robots of today aren't the androids or Cylons that we are used to seeing in science fiction movies, but specialized electromechanical devices run by software and remote control. As computers become more powerful, so do the abilities of these devices. Robots are now capable of performing surgery, milking cows, doing military reconnaissance and combat, and flying fighter jets. Several companies, such as Willow Garage, iRobot, and 9th Sense, sell robot-development kits for which university students and open-source communities are developing ever more sophisticated applications.

The factory assembly that China currently performs is child's play for the next generation of robots -- which will soon become cheaper than human labor. One of China's largest manufacturers, Taiwan-based Foxconn Technology Group, announced last August that it plans to install one million robots within three years to do the work that its workers in China presently do. It has found even low-cost Chinese labor to be too expensive and demanding.

Then there is artificial intelligence (AI) -- software that makes computers, if not intelligent in the human sense, at least good enough to fake it. This is the basic technology that IBM's Deep Blue computer used to beat chess grandmaster Garry Kasparov in 1997 and that enabled IBM's Watson to beat TV-show Jeopardy champions in 2011. AI is making it possible to develop self-driving cars, voice-recognition systems such as the iPhone's Siri, and Face.com, the face-recognition software Facebook recently acquired.

by Vivek Wadhwa, Foreign Policy |  Read more:
Stephen Brashear/Getty Images

The Triumph of the Family Farm


We buried my grandfather last spring. He had died in his sleep in his own bed at 95, so, as funerals go, it wasn’t a grim occasion. But it was a historic one for our small rural community. My great-grandparents were early settlers, arriving in 1913 and farming the land throughout their lives. My grandfather continued that tradition, and now rests next to them on a hillside overlooking the family homestead.

If you’re a part of the roughly 99 percent of the North American population that doesn’t work on a farm, you might guess at what comes next—many a lament has been written about the passing of the good old days in rural areas, the family farm’s decline, and the inevitable loss of the homestead. But in many respects, that narrative itself is obsolete. That’s certainly true in my family’s case: The Freeland farm is still being cultivated by my father. And it is bigger and more prosperous than ever.

My dad farms 3,200 acres of his own, and rents another 2,400—all told, a territory seven times the size of Central Park. Last year, he produced 3,900 tonnes (or metric tons) of wheat, 2,500 tonnes of canola, and 1,400 tonnes of barley. (That’s enough to produce 13 million loaves of bread, 1.2 million liters of vegetable oil, and 40,000 barrels of beer.) His revenue last year was more than $2 million, and he admits to having made “a good profit,” but won’t reveal more than that. The farm has just three workers, my dad and his two hired men, who farm with him nine months of the year. For the two or three weeks of seeding and harvest, my dad usually hires a few friends to help out, too.

My father farms in northern Alberta, but his story is typical of large-scale family farmers across North America. Urbanites may picture farmers as hip heritage-pig breeders returning to the land, or a struggling rural underclass waging a doomed battle to hang on to their patrimony as agribusiness moves in. But these stereotypes are misleading. In 2010, of all the farms in the United States with at least $1 million in revenues, 88 percent were family farms, and they accounted for 79 percent of production. Large-scale farmers today are sophisticated businesspeople who use GPS equipment to guide their combines, biotechnology to boost their yields, and futures contracts to hedge their risk. They are also pretty rich. (...)

Big Money has noticed these trends, and is beginning to pile in. “We are seeing a tremendous uptick in allocations and interest in farmland,” says Chris Erickson of HighQuest Partners, an agricultural consultancy and investor. Erickson told me that big institutional investors—pension funds, insurance companies—have recently been making investments in farmland ranging from “the several hundred millions to the billions.” Erickson said this broad interest is new, and is driven by the fact that “the fundamentals are changing dramatically.”

by Chrystia Freeland, The Atlantic |  Read more:
Photo: David Johnston

Robert Glasper Experiment - Afro Blue (Feat. Erykah Badu)


Wednesday, July 18, 2012

Into the Wild

Marko Cheseto is almost late to class. He enters the lobby of the social sciences building at 9:58 a.m., two minutes before his public speaking lecture begins. He is in no rush, plodding slowly amid the blur of backpacks and students. He stands out: 28 years old, long and spindly, a black man on the mostly white campus of the University of Alaska Anchorage, a Kenyan among mostly in-state students. His skin is as dark as an Alaskan winter morning; patches of frostbite char his cheeks like eyeblack. His lips are dry and crevassed. He is the most famous person on campus, a star runner. And he's pushing a two-wheeled walker.

A blond girl stops him. "Marko!" she says.

"Hellll-oooo!" he replies, voice arching.

"Can I give you a hug?"

"Okay, just don't push me!" he says in fast, accented English. She moves in gently. Marko embraces her with his left arm, his right hand steadying himself. For two months, Marko has envisioned this January morning: First day of spring semester senior year, a chance to prove that he's still the same old sweet, sarcastic, eager-to-entertain Marko. A few nights ago at a UAA basketball game, girls had hugged him in droves. Three former teammates surrounded him for a picture and posted it on Facebook. Marko had ambled around without his walker, showing off, perhaps too much.

Now Marko says goodbye to the blonde and rolls into an elevator. Before the doors close, an older woman whom Marko doesn't know juts toward the narrowing window and whispers, "We love you." The elevator rings open on the second floor, and Marko pushes to Room 251. He rolls toward the desks, then stops like a car that's halfway through a wrong turn.

Those desks -- the normal desks -- aren't for him anymore. He turns toward the lone handicap table, twists and falls into his seat straight-legged, then glances down at the shiny black shoes covering his new plastic stubs.

Those used to be his feet.

During an August night in 2008, Marko Cheseto walked onto a plane in Nairobi bound for Alaska. His feet were his own. He had only $100 in his pockets. His luggage totaled one bag containing two outfits. He was raised in Ptop, a village of 1,000 in the western Kenyan mountains, elevation 8,000 feet -- a foggy, damp region without running water or electricity or roads, where the Pokot dialect of Swahili was spoken. His father, Dickson, farmed, built houses and herded animals, many of which he sold to help purchase a one-way ticket to Anchorage, where the third oldest of his 11 children would attend college on a cross-country and track scholarship.

Nobody from Marko's village had ever left to go to school in America, never mind Alaska. Running was not the route out of Ptop as it was in so many other poor villages in Kenya's highlands. But running was something he always did well. After he graduated from a Nairobi two-year college in 2006 and was earning a modest living as a teacher, he noticed that runners -- inferior runners, he felt -- were leaving on scholarship for U.S. colleges. America meant money, and those who left were expected to share it to help back home.

So Marko chased a new life in hopes of improving his family's old one. He wanted, in the words of his cousin Nicholas Atudonyang, "to be a role model for the guys in his village." He enrolled in one of the running academies in Eldoret, training twice daily at 6,000 feet of elevation, and had moderate success in local races. That got his name on American recruiters' prospect lists. Michael Friess, the track and cross-country coach at Alaska Anchorage, already had one star Kenyan on his roster, David Kiplagat, and wanted to add more. Friess, a loving hard-ass who's been UAA's head coach for 22 of his 50 years, offered Marko a full scholarship without even meeting him.

At first, his parents didn't want Marko to leave, fearing that they'd have to support him again. But he argued that although his teaching job was fine for him, his father could desperately use extra income to supplement his typical earnings of $200 a year. In Alaska, Marko said, he'd work part time and send home a few hundred dollars a year. His parents acquiesced, selling farm animals and asking members of their extended family to help cover Marko's expenses. So Marko, seated in the rear, a few rows behind another runner bound for UAA, Alfred Kangogo, flew from Nairobi to Amsterdam to Minneapolis to Anchorage. All he'd heard about Alaska was that it was dark 24 hours a day. But when they arrived in the evening, the sun shining, Alfred turned to Marko and said, "Just like home."  (...)

But the ease with which Marko and his fellow Kenyans got along with other students belied the fact that getting beyond the surface was difficult. The Kenyans were too busy being unspoken breadwinners to date much. Friess, worried that they were stretched too thin, told them they couldn't begin work at 6 a.m. anymore. They adjusted by working later. They simply carried on, each handling the pressure in his own way. David was driven, eventually graduating with a degree in finance and economics. Alfred was relentless, earning the nickname Bulldog. And Marko tried to be perfect, putting on a positive front even during the occasional month when he didn't earn enough to send any money home. After he paid rent and his school expenses, much of his $450 take-home was spoken for. Usually he was able to save up and wire $100 every few months.

by Seth Wickersham, ESPN |  Read more:
Photo: Jose Mandojana for ESPN The Magazine

Dying in Court

Gloria Taylor, a Canadian, has amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig’s disease. Over a period of a few years, her muscles will weaken until she can no longer walk, use her hands, chew, swallow, speak, and ultimately, breathe. Then she will die. Taylor does not want to go through all of that. She wants to die at a time of her own choosing.

Suicide is not a crime in Canada, so, as Taylor put it: “I simply cannot understand why the law holds that the able-bodied who are terminally ill are allowed to shoot themselves when they have had enough because they are able to hold a gun steady, but because my illness affects my ability to move and control my body, I cannot be allowed compassionate help to allow me to commit an equivalent act using lethal medication.”

Taylor sees the law as offering her a cruel choice: either end her life when she still finds it enjoyable, but is capable of killing herself, or give up the right that others have to end their lives when they choose. She went to court, arguing that the provisions of the Criminal Code that prevent her from receiving assistance in dying are inconsistent with the Canadian Charter of Rights and Freedoms, which gives Canadians rights to life, liberty, personal security, and equality.

The court hearing was remarkable for the thoroughness with which Justice Lynn Smith examined the ethical questions before her. She received expert opinions from leading figures on both sides of the issue, not only Canadians, but also authorities in Australia, Belgium, the Netherlands, New Zealand, Switzerland, the United Kingdom, and the United States. The range of expertise included general medicine, palliative care, neurology, disability studies, gerontology, psychiatry, psychology, law, philosophy, and bioethics. Many of these experts were cross-examined in court. Along with Taylor's right to die, decades of debate about assistance in dying came under scrutiny.

Last month, Smith issued her judgment. The case, Carter v. Canada, could serve as a textbook on the facts, law, and ethics of assistance in dying.

by Peter Singer, Project Syndicate |  Read more:
Illustration: Margaret Scott

Yellow Bather, Keith Vaughan. English (1912 - 1977)

32 Innovations That Will Change Your Tomorrow


The electric light was a failure.

Invented by the British chemist Humphry Davy in the early 1800s, it spent nearly 80 years being passed from one initially hopeful researcher to another, like some not-quite-housebroken puppy. In 1879, Thomas Edison finally figured out how to make an incandescent light bulb that people would buy. But that didn’t mean the technology immediately became successful. It took another 40 years, into the 1920s, for electric utilities to become stable, profitable businesses. And even then, success happened only because the utilities created other reasons to consume electricity. They invented the electric toaster and the electric curling iron and found lots of uses for electric motors. They built Coney Island. They installed electric streetcar lines in any place large enough to call itself a town. All of this, these frivolous gadgets and pleasurable diversions, gave us the light bulb.

We tend to rewrite the histories of technological innovation, making myths about a guy who had a great idea that changed the world. In reality, though, innovation isn’t the goal; it’s everything that gets you there. It’s bad financial decisions and blueprints for machines that weren’t built until decades later. It’s the important leaps forward that synthesize lots of ideas, and it’s the belly-up failures that teach us what not to do.

When we ignore how innovation actually works, we make it hard to see what’s happening right in front of us today. If you don’t know that the incandescent light was a failure before it was a success, it’s easy to write off some modern energy innovations — like solar panels — because they haven’t hit the big time fast enough.

Worse, the fairy-tale view of history implies that innovation has an end. It doesn't. What we want and what we need keeps changing. The incandescent light was a 19th-century failure and a 20th-century success. Now it's a failure again, edged out by new technologies, like LEDs, that were, themselves, failures for many years.

That’s what this issue is about: all the little failures, trivialities and not-quite-solved mysteries that make the successes possible. This is what innovation looks like. It’s messy, and it’s awesome.

by Maggie Koerth-Baker, NY Times |  Read more:
Illustration: Chris Nosenzo

Don’t Indulge. Be Happy.


How much money do you need to be happy? Think about it. What’s your number?

Many of us aren’t satisfied with how much we have now. That’s why we’re constantly angling for a raise at work, befriending aged relatives and springing, despite long odds, for lottery scratch tickets.

Is it crazy to question how much money you need to be happy? The notion that money can’t buy happiness has been around a long time — even before yoga came into vogue. But it turns out there is a measurable connection between income and happiness; not surprisingly, people with a comfortable living standard are happier than people living in poverty.

The catch is that additional income doesn’t buy us any additional happiness on a typical day once we reach that comfortable standard. The magic number that defines this “comfortable standard” varies across individuals and countries, but in the United States, it seems to fall somewhere around $75,000. Using Gallup data collected from almost half a million Americans, researchers at Princeton found that higher household incomes were associated with better moods on a daily basis — but the beneficial effects of money tapered off entirely after the $75,000 mark.

Why, then, do so many of us bother to work so hard long after we have reached an income level sufficient to make most of us happy? One reason is that our ideas about the relationship between money and happiness are misguided. In research we conducted with a national sample of Americans, people thought that their life satisfaction would double if they made $55,000 instead of $25,000: more than twice as much money, twice as much happiness. But our data showed that people who earned $55,000 were just 9 percent more satisfied than those making $25,000. Nine percent beats zero percent, but it’s still kind of a letdown when you were expecting a 100 percent return.

Interestingly, and usefully, it turns out that what we do with our money plays a far more important role than how much money we make. Imagine three people each win $1 million in the lottery. Suppose one person attempts to buy every single thing he has ever wanted; one puts it all in the bank and uses the money only sparingly, for special occasions; and one gives it all to charity. At the end of the year, they all would report an additional $1 million of income. Many of us would follow the first person’s strategy, but the latter two winners are likely to get the bigger happiness bang for their buck.

by Elizabeth Dunn and Michael Norton, NY Times |  Read more:
Illustration: Brock Davis

Tuesday, July 17, 2012



Amy Bennett
Losing It, oil on panel, 16 x 20 inches

The Shopping Mall Turns 60 (and Prepares to Retire)


The enclosed suburban shopping mall has become so synonymous with the American landscape that it’s hard to imagine the original idea for it ever springing from some particular person's imagination. Now the scheme seems obvious: of course Americans want to amble indoors in a million square feet of air-conditioned retail, of course we will need a food court because so much shopping can’t be done without meal breaks, and of course we will require 10,000 parking spaces ringing the whole thing to accommodate all our cars.

The classic indoor mall, however, is widely credited with having an inventor. And when the Vienna-born architect Victor Gruen first outlined his vision for it in a 1952 article in the magazine Progressive Architecture, the plan was a shocker. Most Americans were still shopping downtown, and suburban "shopping centers," to the extent they existed, were most definitely not enclosed in indoor mega-destinations.

Gruen’s idea transformed American consumption patterns and much of the environment around us. At age 60, however, the enclosed regional shopping mall also appears to be an idea that has run its course (OK, maybe not in China, but among Gruen’s original clientele). He opened the first prototype in Edina, Minnesota, in 1956, and the concept spread from there (this also means the earliest examples of the archetypal American mall are now of age for historic designation, if anyone wants to make that argument).

At the mall’s peak popularity, in 1990, America opened 19 of them. But we haven’t cut the ribbon on a new one since 2006, for reasons that go beyond the recession. As we imagine ways to repurpose these aging monoliths and what the next generation of retail should look like, it’s worth recalling Gruen’s odd legacy. He hated suburbia. He thought his ideas would revitalize cities. He wanted to bring urban density to the suburbs. And he envisioned shopping malls as our best chance at containing sprawl.

"He said great quotes on suburbia being 'soulless' and 'in search of a heart,'" says Jeff Hardwick, who wrote the Gruen biography Mall Maker. "He just goes on and on with these critiques. And they occur really early in his writing as well. So it’s not as if he ends up bemoaning suburbia later. He’s critiquing suburbia pretty much from the get-go, and of course the remedy he offers is the shopping mall."

by Emily Badger, The Atlantic | Read more:
Photo: Reuters


KOBAYASHI Kiyochika (小林 清親, Japanese, 1847-1915)
The Winter Moon above the Sumida River, 1915
Woodblock

The Joke’s on You

Among the hacks who staff our factories of conventional wisdom, evidence abounds that we are living in a golden age of political comedy. The New York Times nominates Jon Stewart, beloved host of Comedy Central’s Daily Show, as the “most trusted man in America.” His protégé, Stephen Colbert, enjoys the sort of slavish media coverage reserved for philanthropic rock stars. Bill Maher does double duty as HBO’s resident provocateur and a regular on the cable news circuit. The Onion, once a satirical broadsheet published by starving college students, is now a mini-empire with its own news channel. Stewart and Colbert, in particular, have assumed the role of secular saints whose nightly shtick restores sanity to a world gone mad.

But their sanctification is not evidence of a world gone mad so much as an audience gone to lard morally, ignorant of the comic impulse’s more radical virtues. Over the past decade, political humor has proliferated not as a daring form of social commentary, but as a reliable profit source. Our high-tech jesters serve as smirking adjuncts to the dysfunctional institutions of modern media and politics, from which all their routines derive. Their net effect is almost entirely therapeutic: they congratulate viewers for their fine habits of thought and feeling while remaining careful never to question the corrupt precepts of the status quo too vigorously.

Our lazy embrace of Stewart and Colbert is a testament to our own impoverished comic standards. We have come to accept coy mockery as genuine subversion and snarky mimesis as originality. It would be more accurate to describe our golden age of political comedy as the peak output of a lucrative corporate plantation whose chief export is a cheap and powerful opiate for progressive angst and rage.

Fans will find this assessment offensive. Stewart and Colbert, they will argue, are comedians, offering late-night entertainment in the vein of David Letterman or Jay Leno, but with a topical twist. To expect them to do anything more than make us laugh is unfair. Besides, Stewart and Colbert do play a vital civic role—they’re a dependable news source for their mostly young viewers, and de facto watchdogs against media hype and political hypocrisy.

Michiko Kakutani of the New York Times offered a summation of the majority opinion in a 2008 profile of Stewart that doubled as his highbrow coronation. “Mr. Stewart describes his job as ‘throwing spitballs’ from the back of the room,” she wrote. “Still, he and his writers have energetically tackled the big issues of the day . . . in ways that straight news programs cannot: speaking truth to power in blunt, sometimes profane language, while using satire and playful looniness to ensure that their political analysis never becomes solemn or pretentious.”

Putting aside the obvious objection that poking fun at the powerful isn’t the same as bluntly confronting them, it’s important to give Stewart and Colbert their due. They are both superlative comedians with brilliant writing staffs. They represent a quantum improvement over the aphoristic pabulum of the thirties satirist Will Rogers or the musical schmaltz of Beltway balladeer Mark Russell. Stewart and Colbert have, on occasion, aimed their barbs squarely at the seats of power. (...)

What’s notable about these episodes, though, is how uncharacteristic they are. What Stewart and Colbert do most nights is convert civic villainy into disposable laughs. They prefer Horatian satire to Juvenalian, and thus treat the ills of modern media and politics as matters of folly, not concerted evil. Rather than targeting the obscene cruelties borne of greed and fostered by apathy, they harp on a rogues’ gallery of hypocrites familiar to anyone with a TiVo or a functioning memory. Wit, exaggeration, and gentle mockery trump ridicule and invective. The goal is to mollify people, not incite them.

In Kakutani’s adoring New York Times profile, Stewart spoke of his comedic mission as though it were an upscale antidepressant: “It’s a wonderful feeling to have this toxin in your body in the morning, that little cup of sadness, and feel by 7 or 7:30 that night, you’ve released it in sweat equity and can move on to the next day.” What’s missing from this formulation is the idea that comedy might, you know, change something other than your mood.

by Steve Almond, The Baffler | Read more:
Illustration: Steven Kroninger

Ask Someone Who Recently Traveled Around Ireland

A couple weeks ago I took a 10-day trip through Ireland, with long-to-very-short stops in Dublin, Malahide, Kilkenny, Killarney, around the Ring of Kerry and the Skellig Ring, Dingle, Portmagee, Athlone, Galway, and Belfast. It was great.

Question: I have wanted to do such a trip for a while! But was chicken about driving there. How difficult did you find it? (I looked into bus schedules but it seemed ... like I should get over it and just rent a car.)

The cars were hard! Probably the most unexpectedly stressful part, actually. The roads are about this wide ||, and the driving-on-the-left-side thing is not an instant natural fit for everyone [nervous laughter], and I spent a lot of the time staring at the road (on the passenger side) clutching my hands in catatonic terror. But then I loosened up [a very little bit], and it was fine! (My travel partner deserves lots and lots of credit for both never crashing and for putting up with me the whole time. "Careful." "Careful!" "CAREFUL." "Careful." "CAREFUL." "Careful.") Way more fun, in a different way, were the trains! Especially when you bring/buy booze on them. I think — I think — if I did it again, I might just go everywhere by bus and train, although that'd sacrifice the roamy, up-and-down-the-Irish-hills, windows-open part of it. But both options are great. Unless you crash and die. It bewilders me that everyone in Ireland hasn't already crashed and died. (...)

My question is what kind of food does one eat in Ireland? Also did you see any of those really pretty people that have blue eyes and dark hair?

There was a REALLY beautiful girl with shiny black hair, pale blue eyes, and peaches-and-cream skin (ugh!) who worked at the car rental place at the Dublin airport, and she was so pretty it made me angry! I'm still angry at her! And at the terrifying car she rented me. No, she was lovely, and the car was great. It was just very, very small, which ultimately was good for the micro-roads. Wait, let me talk more about the cars and roads! That's a joke, but now that we're here again, the roads were actually really smooth, so that was nice. More on the roads shortly, I'm sure.

And the food was pretty surprisingly great everywhere, despite its less-than-glowing reputation. Although it depends a lot on whether you like fish. There's so much good fish! (Best I had: Out of the Blue, in Dingle. Get the John Dory.) Vegetarians might have a little trouble, though. And eating as much local Irish beef as I did fish seems tasty but maybe not as digestively pleasant, although that's just me. As long as you don't eat too much of anything, when you do have fish and chips, and beef-and-Guinness stew, and rustic brown bread slathered with ridiculously delicious butter and jam, and poached eggs, and black pudding, and Willy Wonka-style chocolates with hilarious names like "Pimbly Mimblies" and "Mumbly Pumblies," and swirly soft-serve ice cream cones that look like they came out of a cartoon, and everything else, it doesn't hurt your stomach too badly. Or do whatever!

How depressing is it over there, generally? Like, do people just seem sadder than people in America? Or are my assumptions about what the food, weather, and economy do to the national mood way off?

Hmm, I don't know. Everyone seemed pretty good, but I was only there for 10 days, tourist-ing it up. But what's definitely true is that everyone — truly every person I spoke with or otherwise observed — was incredibly nice. Went out of their way to be warm, helpful, open. So friendly, even when you ask stupid questions or drive embarrassingly Americanly (sorry, nice man on that bike!). If you go, talk to people! The people and the scenery are really the best parts of the whole experience, and the reason to visit. Also the beer and whiskey. (But the Jameson distillery is skippable. Kind of. Actually it's pretty hilarious, if only for the strangely pro-America, propaganda-style video screened at the beginning.)

There was no cloud of depression, although they have a good, dark sense of humor about things.

Also they made fun of the country and its weather constantly, asking, jokingly [ish], why we even came there in the first place. But no one seemed glum. Also I loved the weather! But I love rain and darkness. There was plenty of sunshine, too. Just not for too-too many hours at a time, usually. But the weather was always in flux. SHALL I GO ON?

by Edith Zimmerman, The Hairpin | Read more:

How Much Has Citizens United Changed the Political Game?

“A hundred million dollars is nothing,” the venture capitalist Andy Rappaport told me back in the summer of 2004. This was at a moment when wealthy liberals like George Soros and Peter Lewis were looking to influence national politics by financing their own voter-turnout machine and TV ads and by creating an investment fund for start-ups. Rappaport’s statement struck me as an expression of supreme hubris. In American politics at that time, $100 million really meant something.

Eight years later, of course, his pronouncement seems quaint. Conservative groups alone, including a super PAC led by Karl Rove and another group backed by the brothers Charles and David Koch, will likely spend more than a billion dollars trying to take down Barack Obama by the time November rolls around.

The reason for this exponential leap in political spending, if you talk to most Democrats or read most news reports, comes down to two words: Citizens United. The term is shorthand for a Supreme Court decision that gave corporations much of the same right to political speech as individuals have, thus removing virtually any restriction on corporate money in politics. The oft-repeated narrative of 2012 goes like this: Citizens United unleashed a torrent of money from businesses and the multimillionaires who run them, and as a result we are now seeing the corporate takeover of American politics.

As a matter of political strategy, this is a useful story to tell, appealing to liberals and independent voters who aren’t necessarily enthusiastic about the administration but who are concerned about societal inequality, which is why President Obama has made it a rallying cry almost from the moment the Citizens United ruling was made. But if you’re trying to understand what’s really going on with politics and money, the accepted narrative around Citizens United is, at best, overly simplistic. And in some respects, it’s just plain wrong.

It helps first to understand what Citizens United did and didn’t do to change the opaque rules governing outside money. Go back to, say, 2007, and pretend you’re a conservative donor. At this moment, you would still have been free to write a check for any amount to a 527 — so named because of the shadowy provision in the tax code that made such groups legal. (America Coming Together and the infamous Swift Boat Veterans for Truth were both 527s.) Even corporations, though they couldn’t contribute to a candidate or a party, were free to write unlimited checks to something called a social-welfare group, whose principal purpose, ostensibly, is issue advocacy rather than political activity. The anti-tax Club for Growth, for instance, is a social-welfare group. So, remarkably, is the Koch brothers’ Americans for Prosperity and Karl Rove’s Crossroads GPS.

There were, however, a few caveats when it came to the way these groups could spend their money. Neither a 527 nor a social-welfare group could engage in “express advocacy” — that is, overtly making the case for one candidate over another. Nor could they use corporate money for “electioneering communications” — a category defined as radio or television advertising that even mentions a candidate’s name within 30 days of a primary or 60 days of a general election. So under the old rules, the Club for Growth couldn’t broadcast an ad that said “Vote Against Barack Obama,” but it could spend that money on as many ads as it wanted that said “Barack Obama has ruined America — call and tell him to stop!” as long as it did so more than 60 days before an election. (The distinction between those two ads may sound silly and arcane to you, but that’s why you don’t sit on the Federal Election Commission.)

by Matt Bai, NY Times | Read more:
Illustration by Other Means