Sunday, July 7, 2013

Why I Donated My Stool

This spring I saved a friend from a terrible illness, maybe even death. No, I didn’t donate a kidney or a piece of my lung. I did it with my stool.

About 18 months ago, my friend, whom I’ll call Gene to protect his privacy, fell sick with stomach pain, intestinal cramps and copious bloody diarrhea. He had ulcerative colitis, a colon riddled with bleeding ulcers.

His gastroenterologist started him on steroids and anti-inflammatories — standard treatment for these ulcers. He felt better and within a few weeks was able to taper off the steroids, which can be dangerous if used over the long term. But a month later, the bleeding and diarrhea were back. He was in horrible pain that worsened when he ate or drank. He couldn’t sleep at night.

The doctor put him back on the steroids, but this time the symptoms weren’t held in check. For the next excruciating year, my friend went through episodes where he could do nothing but lie writhing in bed in pain. He lost frightening amounts of weight, became anemic from the blood loss and was forced to take medical leave from a job he loved.

According to his doctors, he was left with two options: powerful immunosuppressant drugs (the kind they give people after organ transplants) or a total colectomy (the removal of the colon). The drugs might not be effective, and they raised the risk of lymphoma or fatal infections, while with the surgical option, the tissue left behind could and often did eventually become ulcerated itself.

That’s when Gene started reading about a procedure called fecal microbiota transplant, or F.M.T.

Transplanting the stool from one person into the digestive tract of another seems, well, repulsive, but it also makes sense. The majority of the matter in stool — roughly 60 percent — is bacteria, dead and alive, but mostly alive. While bacteria can make us sick, they also constitute a large part of who we are; the hundreds of trillions of cells in an individual’s microbiome, as this collective is known, outnumber human cells 10 to 1. The bacteria serve many functions, including in metabolism, hormone regulation and the immune system.

The microbiome of the digestive system is particularly important. At least a thousand strains of bacteria coexist in a healthy human bowel, and beneficial bacteria are involved in vitamin production, digestion and keeping “bad” bacteria in check. Thus, changes to the gut microbiome can precipitate disease. For instance, taking a powerful antibiotic wipes out both good and bad gut flora, which can lead to opportunistic bacteria taking over and causing infection.

Many people who suffer from Clostridium difficile, a dangerous bacterium that is becoming epidemic in hospitals and nursing homes, got it this way. The idea behind fecal transfers is that restoring colonies of healthy bacteria can either dilute or crowd out these harmful strains. And it seems to work: in January, The New England Journal of Medicine reported that the first randomized clinical trial of F.M.T.’s for Clostridium difficile had been halted because the treatment worked so well that it was unethical to withhold it from the control group.

The causes of ulcerative colitis are more mysterious than those of Clostridium difficile (doctors in Gene’s case did not hazard a guess), but there is some speculation that the condition can also be traced to pathogenic bacteria. A small study of children with ulcerative colitis, published this spring in The Journal of Pediatric Gastroenterology and Nutrition, found that 78 percent had a reduction in symptoms within a week of being treated with fecal transfers. (...)

Today, around 3,000 F.M.T.’s have been performed worldwide. No significant adverse reactions have been definitively attributed to the procedure (though there have been two F.M.T.’s that may have led to the transmission of the norovirus stomach bug, both of which cleared on their own within days).

Convinced that the potential benefits outweighed the risks, Gene decided, early this year, to try F.M.T. However, this turned out to be harder than he’d expected. There are only about 16 centers in the country that even offer the treatment. Gene finally secured an appointment with Dr. Lawrence Brandt, one of the most experienced F.M.T. practitioners, only to find out, just before his visit, that Dr. Brandt was suspending his F.M.T. practice for ulcerative colitis on the advice of the hospital’s lawyers, in order to comply with a new Food and Drug Administration decision. In April, the F.D.A. decided to classify human stool that is used therapeutically as a drug, one that could therefore be used only within an F.D.A.-approved clinical study.

Gene tried tracking down other doctors, but found to his frustration that almost all of them had stopped doing F.M.T.’s as a result of the agency’s somewhat ambiguous restrictions. He found one remaining gastroenterologist, R. David Shepard, who had an excellent record of treating ulcerative colitis with fecal transfers and was still doing them. But Dr. Shepard was in Florida, and Gene was now too sick to travel.

Dr. Shepard, however, had a solution: he would help Gene with the mechanics of performing a do-it-yourself F.M.T., something he’d done successfully with a handful of other patients. Gene just had to find a donor.

by Marie Myung-Ok Lee, NY Times | Read more:
Image: Katie Scott

Saturday, July 6, 2013


Woman in Green with a Carnation, 1909, by Henri Matisse
via:

Ahmad Jamal


A Matter of Life and Death

[ed. Repost. One of the best essays on cancer and dying you will ever read. Heartbreaking, yet fiercely life-affirming.]

It was cancer—a brutally sudden death sentence: the doctors told the author she probably had less than six months. For a woman with two young children and a full life, that prognosis was devastating, but also, in some ways, oddly liberating. And so began more than three years of horror, hope, and grace, as she learned to live, and even laugh, on borrowed time.

The beast first showed its face benignly, in the late-June warmth of a California swimming pool, and it would take me more than a year to know it for what it was. Willie and I were lolling happily in the sunny shallow end of my in-laws’ pool when he—then only seven—said, “Mommy, you’re getting thinner.”

It was true, I realized with some pleasure. Those intractable 10 or 15 pounds that had settled in over the course of two pregnancies: hadn’t they seemed, lately, to be melting away? I had never gained enough weight to think about trying very hard to lose it, except for sporadic, failed commitments to the health club. But I’d carried—for so many years I hardly noticed it—an unpleasant sensation of being more cushiony than I wanted to be. And now, without trying, I’d lost at least five pounds, perhaps even eight.

I suppose I fell into the smug assumption that I had magically restored the lucky metabolism of my 20s and 30s, when it had been easy for me to carry between 110 and 120 pounds on a frame of five feet six inches. True, in the months before Willie’s observation, I’d been working harder, and more happily, than I had in years—burning more fuel through later nights and busier days. I’d also been smoking, an old habit I’d fallen into again two years earlier, bouncing back and forth between quitting and succumbing, working up to something like eight cigarettes a day.

Of course Willie noticed it first, I now think: children major in the study of their mothers, and Willie has the elder child’s umbilical awareness of me. But how is it that I didn’t even question a weight loss striking enough for a child to speak up about? I was too happy enjoying this unexpected gift to question it even briefly: the American woman’s yearning for thinness is so deeply a part of me that it never crossed my mind that a weight loss could herald something other than good fortune.

As it happened, I took up running about a month later, in concert with quitting smoking for good. By the end of the summer I was running about four miles a day, at least five days a week. And with all that exercise I found I could eat pretty much anything I wanted without worrying about my weight. So more weight melted away, and the steady weight loss that might have warned me something was going badly wrong disguised itself instead as the reward for all those pounding steps I was taking through the chill of early fall, the sting of winter, the beauty of spring’s beginning. I went from around 126 pounds, in the spring of 2000, to about 109 a year later.

Somewhere in there my period became irregular—first it was late, then it stopped altogether. Well, I’d heard of this: women who exercise heavily sometimes do become amenorrheic. I discussed it with my gynecologist in January, and he agreed it was no real cause for alarm. He checked my hormone levels and found I definitely hadn’t hit perimenopause, but what I most remember about that visit is the amazed approval with which he commented on the good shape I was in.

Around that time—I can’t pinpoint exactly when—I began to have hot flashes, almost unnoticeable at first, gradually increasing in intensity. Well, I said to myself, I must be perimenopausal after all; a gynecologist friend told me that hormone levels can fluctuate so much that the test my doctor had done wasn’t necessarily the last word on the subject.

Then one day in April I was lying on my back, talking idly on the telephone (strangely, I don’t remember to whom), and running my hand up and down my now deliciously scrawny stomach. And just like that I felt it: a mass, about the size of a small apricot, on the lower right side of my abdomen. My mind swung sharply into focus: Have I ever felt this thing before, this lump? Well, who knows, maybe this is a part of my anatomy I was just never aware of before—I had always had a little layer of fat between my skin and the mysteries of the innards. Maybe there was some part of the intestine that felt that way, and I had just never been thin enough to notice it before.

You know how you’ve always wondered about it: Would you notice if you had a sudden lump? Would you be sensible enough to do something about it? How would your mind react? For all of us, those wonderings have a luxuriantly melodramatic quality. Because surely that isn’t really how it works; you don’t just stumble onto the fact that you have a lethal cancer while you’re gabbing on the phone like a teenager. Surely you can’t have a death sentence so close to the surface, just resting there, without your being in some other way aware of it.

I thought about calling my doctor, but then remembered that I had a full checkup scheduled in about three weeks anyway; I would bring it up then. In the intervening weeks I often reached down to find this odd bump: sometimes it wasn’t there, and at other times it was. Once, I even thought it had moved—could I possibly be feeling it three inches up and two inches to the left, nearly underneath my belly button? Surely not. This must be just another sign that I was imagining things.

by Marjorie Williams, Vanity Fair |  Read more:
Image: uncredited


Margarita Sikorskaia
via:


Miroslava Rakovic, like a mountain 02
via:

Ultramundane

If we are what we consume, then nothing tells us more about who we are in the early 21st century than the energy drink. Caffeine delivery systems in a can, energy drinks promise to make us perform, whether this is on the bureaucratic terrain of spreadsheets and performance targets or the sports courts where we thrash out our fantasies of being all we can be. There is a multibillion-dollar industry built on the commodification of latent energy. For the Protestant subtext of capitalism means that we can question our talent but never our work rate. A can of Relentless, a sugary stimulant sold under a gothic logo, is a step towards a life of “no half measures”. Relentless is the corner shop’s quick fix for flagging performance, the equivalent of a Duracell suppository. Relentless is a one-word manifesto.

The great cliché of late capitalism is that ours is an age of excess. But an excess of what? Marc Augé defined what he called “supermodernity” as an excess of space — of airports, shopping malls and other spatial wastes. But if architecture is plagued by non-places and junkspace, then design indulges in a different kind of excess. Design is drowning in a surplus of performance. It is caught in a 2,000-rpm spin cycle set in motion by modernism. Relentless. The modernists gave us function as a credo, and once we got a taste for it the market took over and did what it does best. It took modernism’s obsession with function to its absurd conclusion. Brands heaped function on top of function, they honed and enhanced, they boosted and superseded, they spent millions on R&D in pursuit of infinitesimal advancements. We, the consumers, tried to keep up, but we were always one step behind.

Condemned to longing and jealousy, we watched rapt as consumer products took on a spectral brilliance. Goods were no longer good, they were incredible. They didn’t just perform, they over-performed. Ours is the age of the ultramundane. Far from meaning what it sounds like it means (i.e. very boring), “ultramundane” refers to the otherworldly. Ultramundane products are both banal and yet too good for this world. Belonging to the realm of hyper-performance, they test the limits of our understanding. Where we fail, they succeed. They are the reification of all that we wish we could do, of our longing to perform. (...)

In furniture, hyper-performance achieved its apotheosis in the Aeron chair. What began as a process to design a comfortable chair for the elderly ended as a trophy of the boardroom and the dotcom boom. The Aeron’s unashamed technicality was its selling point. It mechanised comfort, stripping away the layers of foam and leather to leave mesh and a machinery of levers and tilt mechanisms. In fact, this technical system was never meant to result in just a chair. Its designers were developing a concept called Metaforms, which was supposed to produce a furniture system that could support any task, any human behaviour — but it was reduced to just a chair. As such, it did its job too well. It made it possible to sit for hours on end without interruption, rendering you a slave to your work. Cruelly anticipating your every move, it tied you to a life of hyper-performance.

Mechanisation was supposed to be liberating. The problem with technological determinism is that technology determines our behaviour as much as we determine its. Siegfried Giedion warned us about this 65 years ago. Mechanization Takes Command is full of “purely technical solutions” that “found no response in the emotional temper of the time”. In it, you’ll find the 19th-century ancestor of the Aeron, the Invalid Chair of 1838, a multi-hinged reclining number that is basically a less obese La-Z-Boy. (The La-Z-Boy is of course anything but lazy, but by over-performing it allows us to be.) And yet Giedion was clear that mechanisation was progress.

by Justin McGuirk, Domus | Read more:
Illustration by Danilo Agutoli

Friday, July 5, 2013

To Galt’s Gulch They Go

There was a time when Atlas would frown and the world of nations would tremble. He was as mighty as Zeus and as petulant as a teenager. His wrath was irresistible, and he was easily provoked. Badmouth him and he might just drop his burden and walk away. Elect someone he didn’t approve of and he’d put a lightning bolt up your ass.

Chile learned the hard way about minding the feelings of the business-class god. In 1970 that country selected as president one Salvador Allende, a socialist of the old school who quickly set about nationalizing banks, telecom concerns, and so on. American companies naturally feared these developments and laid plans to push the country down a different path. They would withdraw investments, executives mused; they would halt purchases of Chilean goods; and they would persuade others to do the same. President Richard Nixon, who was clearly thinking along the same lines, told his CIA director to “make the economy scream.”

And scream it did. Still, these were the early days of collective capitalist action, and there was a certain brutality and clumsiness to the proceedings. Not every American firm doing business in Chile went along with the program—the high-minded banks, for example, squealed about their policy of “non-involvement in the political affairs of the countries where they do business.” And in the end, Atlas’s goals for the Southern Cone were achieved only by means of an ugly military coup.

In later years, Atlas would grow more subtle in expressing himself, more refined. When François Mitterrand was elected president of France in 1981—another socialist pursuing an array of nationalizations and expanded rights for labor—there was no need for a junta of generals to intervene. Mitterrand pumped the depressed French economy full of Keynesian stimulus, but his nationalizations were too much to take: the private sector simply refused to play along. The New York Times spoke of an “investment strike,” rich Frenchmen moved abroad, and Mitterrand himself moaned about a guerre sociale conducted by the bosses. This socialist was no Salvador Allende: he came into office at the head of a good-sized majority, he presided over one of the largest economies in the world, and he was fully committed to the American-led security program of the era. But none of that mattered to peevish Atlas.

It took only two years for Mitterrand to capitulate. In 1983 he embarked on his famous economic U-turn, one of the most depressing episodes in the entire gloomy history of the neoliberal conquest. Economic orthodoxy returned to France in triumph. Entrepreneurs were celebrated. Labor unions went into a decline from which they have never recovered.

A similar episode took place in those days in Jamaica, where the socialist prime minister, Michael Manley, pleaded with the business community to invest, but without result: their mistrust was simply too great. Another unfolded in Canada, where large national corporations, according to one witness, threatened to pick up their marbles and go home unless Pierre Trudeau’s government abandoned plans to close certain tax loopholes.

And finally America itself got a taste of Atlas’s power. The immortal remark Bill Clinton addressed to his economic advisers shortly after being elected president in 1992—“You mean to tell me that the success of the program and my reelection hinges on the Federal Reserve and a bunch of fucking bond traders?”—will stand forever as testimony to the power of the visible hand. Seven years later, the administration had been converted to the cause so utterly that it now rationalized the things Atlas did to states that dared to regulate: “In a global economy where capital can be invested anywhere,” quoth Vice President Al Gore in 1999, “red tape is like an economic noose that says: if you send your investments here, we’re going to strangle them with bureaucracy, inefficiency, and forms, fees, and requirements you can barely even understand.” Even for Americans, certain conventional acts of public administration were now beyond the horizon of the permissible. By 1999, not even a red-baiter like Richard Nixon would have been able to escape the wrath of the business god, thanks to his worshipful hours at the altar of Keynesianism. Just let the infidel try his wage and price controls in the decade of “globalization,” and it’d be his economy that would scream.

by Thomas Frank, Baffler |  Read more:
Image: David Suter


Jeffrey Smart (1921-2013)
via:

Thursday, July 4, 2013

James Clapper, Play-acting, and Political Priorities

The NSA revelations continue to expose far more than just the ongoing operations of that sprawling and unaccountable spying agency. Let's examine what we have learned this week about the US political and media class and then certain EU leaders.

The first NSA story to be reported was our June 6 article, which exposed the bulk, indiscriminate collection by the US Government of the telephone records of tens of millions of Americans. Ever since then, it has been undeniably clear that James Clapper, the Director of National Intelligence, outright lied to the US Senate - specifically to the Intelligence Committee, the body charged with oversight over surveillance programs - when he said "no, sir" in response to this question from Democratic Sen. Ron Wyden: "Does the NSA collect any type of data at all on millions or hundreds of millions of Americans?"

That Clapper fundamentally misled Congress is beyond dispute. The DNI himself has now been forced by our stories to admit that his statement was, in his words, "clearly erroneous" and to apologize. But he did this only once our front-page revelations forced him to do so: in other words, what he's sorry about is that he got caught lying to the Senate. And as Salon's David Sirota adeptly documented on Friday, Clapper is still spouting falsehoods as he apologizes and attempts to explain why he did it.

How is this not a huge scandal? Intentionally deceiving Congress is a felony, punishable by up to 5 years in prison for each offense. Reagan administration officials were convicted of misleading Congress as part of the Iran-contra scandal and other controversies, and sports stars have been prosecuted by the Obama DOJ based on allegations they have done so.

Beyond its criminality, lying to Congress destroys the pretense of oversight. Obviously, members of Congress cannot exercise any actual oversight over programs which are being concealed by deceitful national security officials.

In response to our first week of NSA stories, Wyden issued a statement denouncing these misleading statements, explaining that the Senate's oversight function "cannot be done responsibly if senators aren't getting straight answers to direct questions", and calling for "public hearings" to "address the recent disclosures," arguing that "the American people have the right to expect straight answers from the intelligence leadership to the questions asked by their representatives." Those people who have been defending the NSA programs by claiming there is robust Congressional oversight should be leading the chorus against Clapper, given that his deceit prevents the very oversight they invoke to justify these programs.

But Clapper isn't the only top national security official who has been proven by our NSA stories to be fundamentally misleading the public and the Congress about surveillance programs. As an outstanding Washington Post article by Greg Miller this week documented:
"[D]etails that have emerged from the exposure of hundreds of pages of previously classified NSA documents indicate that public assertions about these programs by senior US officials have also often been misleading, erroneous or simply false."
Please re-read that sentence. It's not just Clapper, but multiple "senior US officials", whose statements have been proven false by our reporting and Edward Snowden's disclosures. Indeed, the Guardian previously published top secret documents disproving the claims of NSA Director Gen. Keith Alexander that the agency is incapable of stating how many Americans are having their calls and emails invaded without warrants, as well as the oft-repeated claim from President Barack Obama that the NSA is not listening in on Americans' calls without warrants. Both of those assertions, as our prior reporting and Miller's article this week demonstrates, are indisputably false.

Beyond that, the NSA got caught spreading falsehoods even in its own public talking points about its surveillance programs, and was forced by our disclosures to quietly delete those inaccuracies. Wyden and another Democratic Senator, Mark Udall, wrote a letter to the NSA identifying multiple inaccuracies in its public claims about its domestic spying activities.

Defending the Obama administration, Paul Krugman pronounced that "the NSA stuff is a policy dispute, not the kind of scandal the right wing wants." Really? In what conceivable sense is this not a serious scandal? If you, as an American citizen, let alone a journalist, don't find it deeply objectionable when top national security officials systematically mislead your representatives in Congress about how the government is spying on you, and repeatedly lie publicly about resulting political controversies over that spying, what is objectionable? If having the NSA engage in secret, indiscriminate domestic spying that warps if not outright violates legal limits isn't a "scandal", then what is?

For many media and political elites, the answer to that question seems clear: what's truly objectionable to them is when powerless individuals blow the whistle on deceitful national security state officials. Hence the endless fixation on Edward Snowden's tone and choice of asylum providers, the flamboyant denunciations of this "29-year-old hacker" for the crime of exposing what our government leaders are doing in the dark, and all sorts of mockery over the drama that resulted from the due-process-free revocation of his passport. This is what our media stars and progressive columnists, pundits and bloggers are obsessing over in the hope of distracting attention away from the surveillance misconduct of top-level Obama officials and their serial deceit about it.

What kind of journalist - or citizen - would focus more on Edward Snowden's tonal oddities and travel drama than on the fact that top US officials have been deceitfully concealing a massive, worldwide spying apparatus being constructed with virtually no accountability or oversight? Just ponder what it says about someone who cares more about, and is angrier about, Edward Snowden's exposure of these facts than they are about James Clapper's falsehoods and the NSA's excesses.

by Glenn Greenwald, The Guardian |  Read more:
Image: J. Scott Applewhite/AP

Happy 4th



Lucile, Deception Pass, WA
photo: markk

Amigos

If there was one thing Sandra knew well, it was hair. She knew hair from root to split end. In beauty school, she had learned the shape of the human head and how the best thing to do when trimming its hair was to section the skull into eighths. Her long nails shone red as she held her soft hands in front of her to demonstrate on an imaginary client. Her gold rings glinted. When she tired of hair-cutting techniques, she waved her hands quickly and her fingers sparked through the thick night like fireworks.

Sandra, like other girls who hung out where we sat on Havana’s waist-high seawall, the malecón, where it hit Paseo, wore fashionable clothes of the barely there variety: diminutive shorts with interlocking C’s on back pockets, glittery heels, bras that peeked from tops, halters leaving midriffs bare. She dyed her own long, straight hair blue-black and lined her lips with the same dark pencil that she used around her eyes because shops hadn’t carried red in months. Her plastic nails were thick and whispery along the tips; she grabbed my forearm as we crossed the street on our way to the bathroom at a nearby gas station, dodging the cars that sped around the curve at Paseo. We went the long way to avoid the police who hung in the shadows on the intersection’s traffic island, keeping an eye on the strip. “The cars here, they’ll hit you. And if it’s him”—Sandra flicked her chin and pulled her hand down to mime a beard, the universal gesture for Fidel Castro—“they won’t stop. They’ll run you over and keep on going.”

There were clubs and bars at the hotels that hulked over the crossroads—the mod Riviera, the shimmery Meliá Cohiba, the Jazz Café—but since few locals could afford drinks there, the tourists who wanted to meet real Cubanos hung out by the sea. Everyone, Cuban and foreign, loved the malecón, to sit facing the ocean and Miami and feel the spray on bare shins, or to turn toward the city and watch old cars roar slowly by, or, after a long night at the bars, to see the brightening sky pull itself away from the sea. On nights when there was no moon, you could nod approvingly at the fish that men in mesh tank tops caught on sheer line stretched from coils on the sidewalk. On hot days, you watched kids who leapt from the wall into high tide, their arms pinwheeling past the rocks that cragged up from the ocean.

So young men toted bongo drums and guitars, imitating the Buena Vista Social Club for a few dollars’ tip. Gentlemen in frayed straw fedoras asked tourists to pick up an extra beer at the gas station kiosk. Tired-looking women in Lycra shorts sang out the names of cones of roasted peanuts, cucuruchos de maní, and popcorn, rositas de maíz. Nonchalant girls cocked hips at the foreign men who walked past. Sandra had been taught the art of artifice to serve the Cuban revolution through its beauty parlors, but she’d given up on hair. By the time she was twenty-one, she’d been working as a prostitute for around five years. The dates changed every time I asked her. Either way, she made about three times in one night what she’d have been paid monthly at any of the government-owned salons. (...)

Sometimes it’s hard to discern who’s selling sex and who’s just trying to wear as little fabric as possible in Havana’s oppressive heat. The mainstays of jinetera fashion—miniskirts, transparent fabrics, cleavage- and shoulder-baring tops—appear on most women, including foreigners, who feel freer to be sexy in permissive Cuba than at home. At clubs, I saw foreign women with bikini-strap marks sunburned around their necks look left, right, then pull their necklines down before dancing with slim Cuban men in tight jeans and big silver belt buckles. These women lapped up the sensual aura, as if just breathing would send tiny cells of sexy through their bodies, the infusion pushing and pulling hips back and forth, transforming walks into sashays, planting dry one-liners in mouths.

Sandra had long since mastered these feminine tricks. Everything about her physical appearance was calibrated to entice: the tops that looked almost about to slip off, the hair that twisted around her neck, her long, soft, red nails. I had just five years on Sandra, but I felt large, clumsy, and dusty around her in my flats and loose dresses. I was a tattered stuffed animal next to her as we sat, the second time we met, in the backseat of a cab that took us from the malecĂłn out to her house.

by Julia Cooke, VQR |  Read more:
Image: Jason Florio

Wednesday, July 3, 2013


Paris bookstore, 1904, by Tavik František Šimon, Czech (1877-1942)
via:

Your Student Loan Isn’t Really a Loan


It’s becoming an annual ritual. Every June, Congress debates what to do about the interest rate on federally subsidized student loans to avert an imminent increase; this year, the rate is set to double from 3.4 percent to 6.8 percent. But interest rates alone don’t tell the whole story.

At a time when overall student debt approaches $1 trillion, the facts reveal that student loans aren’t loans, not in the traditional sense. They exhibit none of the qualities of modern consumer financial instruments, and are often sold under false pretenses, with the promise of a lifelong benefit that never materializes. We need to change how these loans work and have a broader conversation about what we should be doing — including bankruptcy and refinancing — to help future generations obtain a quality, affordable education, which is critical to our economic future.

The roughly two-thirds of U.S. students who take out loans to finance their college education can end up in a situation most resembling the historical concept of indenture. In medieval times, peasants would sign deeds to work land, which would then get cut in a jagged line (looking like teeth, or “dentures”). Each party would get half, and rejoining them would prove the authenticity of the contract. Colonial indentures would trade years of labor for the opportunity of transportation to the New World. The indentured could not alter the terms of the contract, no matter their circumstances. One way or another, the debt would get paid.

This is basically how student loans work. A college student might remember freshman orientation, when an instructor told them to look to their left and right, explaining, “One of you won’t graduate.” But student loans aren’t extinguished for those who don’t finish college; instead, the debt becomes a burdensome reminder of this early mistake in life. This is also true for students snookered into matriculating at sketchy for-profit colleges, which offer almost no marketable skills or career preparedness to justify the cost. And it further describes recent college graduates who, through an accident of timing, entered the real world during the Great Recession and its aftermath, finding it difficult to obtain work in their field of study.

Due to these combined factors, delinquency rates for student loans – unlike auto, credit card or even mortgage debt – have risen the past two years, according to the Federal Reserve Bank of New York. But student debt is something you carry for the rest of your life. It’s nearly impossible to refinance student loans, despite the current low-rate environment, primarily because of the high credit risk and lack of collateral. And unlike most other loans, you cannot get rid of student debt through bankruptcy.

This happened almost by accident. Before 1976, student debt was treated the same as any other in the bankruptcy process. Amid rising default rates – yes, even back then – Congress got it in its head that people were ripping off the government for a free education and then shedding the loans (a couple of well-placed stories about doctors declaring bankruptcy after graduating from medical school added to the panic). In an effort to stop this, Congress passed a law permitting students only to discharge loans in bankruptcy five years after origination, unless they demonstrated undue hardship.

In 1990 the five-year rule was extended to seven years, and then in 1998 Congress dropped that requirement altogether, making undue hardship the only way to discharge student loans in bankruptcy. And undue hardship is a very large chore to prove, according to Bob Lawless, law professor at the University of Illinois. “The courts require proof of an inability to get by without a modification,” Lawless told Salon. “They’re reluctant to allow a discharge if someone just has a lower-paying job and can’t afford the payment.” So the bankruptcy law has become harsher at the same time that college tuition has ballooned, increasing demand for student loans. As then-law professor Elizabeth Warren said in 2007, “Why should students who are trying to finance an education be treated more harshly than someone … who racked up tens of thousands of dollars gambling?”

In addition to having no escape from their loans, students must deal with aggressive creditors that can get to virtually any income source to secure payment – paychecks and tax refunds included. The Department of Education uses an “army of private debt collectors,” some of the most notorious financial operators out there, to intimidate and harass student borrowers. These collectors earned $1 billion in commissions from taxpayers in 2011. They get paid bonuses for extracting higher payments, and they can also rack up additional fees virtually endlessly. That’s because student debt has no statute of limitations on collectors, unlike most other forms of debt. The government can even collect student loan payments from Social Security checks, thanks to a 1996 law (this is not theoretical, as growing numbers of seniors are entering retirement with student debt).

So, through a series of bad laws, student debt has become an inescapable trap, a terrible burden on those whose higher education dreams don’t pan out, and a significant burden even on successful graduates. Loan debt now averages $26,000 per student, up 40 percent in seven years, with a significant share of borrowers owing $50,000 or $100,000. Princeton professor Jesse Rothstein argued in a recent working paper that graduates burdened by debt will choose higher-paying jobs to pay off the loans, draining the talent pool for lower-paid, but critical, “public interest” job sectors like education, government or nonprofits. This further erodes the nation’s seed corn and funnels the best and brightest into the financial industry or other higher-paying power centers, reducing entrepreneurship in the bargain. Student debtors also put off major purchases like houses or cars, and the Federal Reserve believes this is having a serious negative effect on our economy.

How do we quit loading up 18-year-olds with a risky gamble that could impact the rest of their lives? It will take more than just a low interest rate, though a variable rate that changes over time (the House Republican plan) or one that isn’t capped (the Obama administration plan) will just make things worse. Obviously a lower rate translates into lower payments, but it’s time to dismantle the treatment afforded student loans, and to eliminate the extreme dependence on them.

by David Dayen, Salon |  Read more:
Image: hxdbzxy via Shutterstock/Salon


Marco Arduini, At Golf, 1969
via:

I was a Manic Pixie Dream Girl

Like scabies and syphilis, Manic Pixie Dream Girls were with us long before they were accurately named. It was the critic Nathan Rabin who coined the term in a review of the film Elizabethtown, explaining that the character of the Manic Pixie Dream Girl "exists solely in the fevered imaginations of sensitive writer-directors to teach broodingly soulful young men to embrace life and its infinite mysteries and adventures". She pops up everywhere these days, in films and comics and novels and television, fascinating lonely geek dudes with her magical joie de vivre and boring the hell out of anybody who likes their women to exist in all four dimensions.

Writing about Doctor Who this week got me thinking about sexism in storytelling, and how we rely on lazy character creation in life just as we do in fiction. The Doctor has become the ultimate soulful brooding hero in need of a Manic Pixie Dream Girl to save him from the vortex of self-pity usually brought on by the death, disappearance or alternate-universe-abandonment of the last girl. We cannot have the Doctor brooding. A planet might explode somewhere, or he might decide to use his powers for evil, or his bow-tie might need adjusting. The companions of the past three years, since the most recent series reboot, have been the ultimate in lazy sexist tropification, any attempt at actually creating interesting female characters replaced by... That Girl. (...)

Manic Pixies, like other female archetypes, crop up in real life partly because fiction creates real life, particularly for those of us who grow up immersed in it. Women behave in ways that they find sanctioned in stories written by men who know better, and men and women seek out friends and partners who remind them of a girl they met in a book one day when they were young and longing.

For me, Manic Pixie Dream Girl was the story that fit. Of course, I didn't think of it in those terms; all I saw was that in the books and series I loved - mainly science fiction, comics and offbeat literature, not the mainstream films that would later make the MPDG trope famous - there were certain kinds of girl you could be, and if you weren't a busty bombshell, if you were maybe a bit weird and clever and brunette, there was another option.

And that's how I became a Manic Pixie Dream Girl. The basic physical and personality traits were already there, and some of it was doubtless honed by that learned girlish desire to please - because the posture does please people, particularly the kind of sad, bright, bookish young men who have often been my friends and lovers. I had the raw materials: I’m five feet nothing, petite and small-featured with skin the color of something left on the bottom of a pond for too long and messy hair that’s sometimes dyed a shocking shade of red or pink. At least, it was before I washed all the dye out last year, partly to stop soulful Zach-Braff-a-likes following me to the shops, and partly to stop myself getting smeary technicolour splotches all over the bathroom, as if a muppet had been horribly murdered.  (...)

I’m fascinated by this character and what she means to people, because the experience of being her - of playing her - is so wildly different than it seems to appear from the outside. In recent weeks I’ve filled in the gaps of classic Manic Pixie Dream Girl films I hadn’t already sat through, and I’m struck by how many of them claim to be ironic re-imaginings of a character trope that they fail to actually interrogate in any way. Irony is, of course, the last vestige of modern crypto-misogyny: all those lazy stereotypes and hurtful put-downs are definitely a joke, right up until they aren’t, and clearly you need a man to tell you when and if you’re supposed to take sexism seriously.

by Laurie Penny, TNS |  Read more:
Image: Zooey Deschanel, uncredited

The Rock ’n’ Roll Casualty Who Became a War Hero

I asked if he ever talked about it. Jason shook his head no. Did they find out anyway? “Always.”

The first time was at Fort Benning in 1994, in the middle of the hell of basic training. The ex-cop recruits in boot camp with him said that prisoners had more freedom than they did. There were guys who faked suicide attempts to get out of basic. But Everman never had any doubts. “I was 100 percent,” he told me. “If I wasn’t, there was no way I’d get through it.”

He had three drill sergeants, two of whom were sadists. Thank God it was the easygoing one who saw it. He was reading a magazine, when he slowly looked up and stared at Everman. Then the sergeant walked over, pointing to a page in the magazine. “Is this you?” It was a photo of the biggest band in the world, Nirvana. Kurt Cobain had just killed himself, and this was a story about his suicide. Next to Cobain was the band’s onetime second guitarist. A guy with long, strawberry blond curls. “Is this you?”

Everman exhaled. “Yes, Drill Sergeant.”

And that was only half of it. Jason Everman has the unique distinction of being the guy who was kicked out of Nirvana and Soundgarden, two rock bands that would sell roughly 100 million records combined. At 26, he wasn’t just Pete Best, the guy the Beatles left behind. He was Pete Best twice.

Then again, he wasn’t remotely. What Everman did afterward put him far outside the category of rock ’n’ roll footnote. He became an elite member of the U.S. Army Special Forces, one of those bearded guys riding around on horseback in Afghanistan fighting the Taliban.

I’ve known Jason Everman since we played rock shows together nearly 25 years ago. What happened to him was almost inexplicable, a cruel combination of good luck, bad luck and the kind of disappointment that would have overwhelmed me even at my most brashly defiant. After having not seen him since the early ’90s, I ended up hanging out with him in his apartment in Brooklyn last summer. We had drinks, retraced steps. We once were in the same place in our lives. But mine had since quietly transitioned from rock to parenthood. My changes were glacial. His were violent.

None of it is easy for him to talk about. Jason is one of the most guarded people I have ever met. But when I pulled up to his remote A-frame cabin near Puget Sound last winter, there he was, a sturdy, tall figure in a Black Flag sweatshirt holding a glass of red wine. This was his private place, and he was letting me into it.

Books and action figures covered one wall. Guitars and drums were scattered on the floor. But the far wall almost looked like a memorial: medals, artifacts, war photos. I took it all in, asking about a hand-decorated gun on the fireplace. “That’s how the Taliban trick out their weapons,” he said. Then I picked up his Army helmet. It seemed heavy to me. “Dude, that’s light,” he said. “That’s state of the art.” It had his blood type still written on the side: O positive.

by Clay Tarver, NY Times |  Read more:
Image: Ian Tilton

Bush and Mountain Flying in Alaska