Friday, September 27, 2013

Memento Mori

“Death… the most awful of evils,” says Epicurus, “is nothing to us, seeing that when we are, death is not yet, and when death comes, we are not.” My experience in the New Haven hospital demonstrated the worth of the hypothesis; the books I read in college formed the thought as precept; my paternal grandfather, Roger D. Lapham, taught the lesson by example.

In the summer of 1918, then a captain of infantry with the American Expeditionary Force in World War I, he had been reported missing and presumed dead after his battalion had been overwhelmed by German poison gas during the Oise-Aisne offensive. Nearly everybody else in the battalion had been promptly killed, and it was six weeks before the Army found him in the hayloft of a French barn. A farmer had retrieved him, unconscious but otherwise more or less intact, from the pigsty into which he had fallen, by happy accident, on the day of what had been planned as a swift and sure advance.

The farmer’s wife nursed him back to life with soup and soap and Calvados, and by the time he was strong enough to walk, he had lost half his body weight and undergone a change in outlook. He had been born in 1883, descended from a family of New England Quakers, and before going to Europe in the spring of 1918 was said to have been almost solemnly conservative in both his thought and his behavior, shy in conversation, cautious in his dealings with money. He returned from France reconfigured in a character akin to Shakespeare’s Sir John Falstaff, extravagant in his consumption of wine and roses, passionate in his love of high-stakes gambling on the golf course and at the card table, persuaded that the object of life was nothing other than its fierce and close embrace.

Which is how I found him in the autumn of 1957, when I returned to San Francisco to look for work on a newspaper. He was then a man in his middle seventies (i.e., of an age that now surprises me to discover as my own), but he was the same vivid presence (round red face like Santa Claus, boisterous sense of humor, unable to contain his emotions) that I had known as a boy growing up in the 1940s in the city of which he was then the mayor.

A guest in his house on Jackson Street for three months before finding a room of my own, most mornings I sat with him while he presided over his breakfast (one scrambled egg, two scraps of Melba toast, pot of coffee, glass of Scotch) listening to him talk about what he had seen of a world in which he knew that all present (committee chairman, lettuce leaf, and Norfolk terrier) were granted a very short stay. Although beset by a good many biological systems failures, he regarded them as nuisances not worth mention in dispatches. He thought it inadvisable to quit drinking brandy, much less the whiskey, the rum punch, and the gin. At the bridge table he continued to think it unsporting to look at his cards before bidding the hand.

My grandfather’s refusal to consult doctors no doubt shortened his length of days on Earth, but he didn’t think the Fates were doing him an injustice. He died in 1966 at the age of 82 on terms that he would have considered sporting. The grand staircase in his house on Jackson Street was curved in a semicircle rising 30 feet from the entrance hall to a second-floor landing framed by a decorative wooden railing. Having climbed the long flight of stairs after a morning in the office and the afternoon on a golf course, Roger Dearborn Lapham paused to catch his breath. It wasn’t forthcoming. He plunged head first through the railing and was dead -- so said the autopsy -- before his body collided and combined with the potted palm at the base of the stairwell. He had suffered a massive heart attack, and his death had come to him in a way he would have hoped it would, as a surprise.

An Immortal Human Head in the Clouds

About the presence of death and dying I don’t remember the society in the 1950s being so skittish as it has since become. People still died at home, among relatives and friends, often in the care of a family physician. Death was still to be seen sitting in the parlor, hanging in a butcher shop, sometimes lying in the street. By the generations antecedent to my own, survivors of the Great Depression or one of the nation’s foreign wars, it seemed to be more or less well understood, as it had been by Montaigne, that one’s own death “was a part of the order of the universe… a part of the life of the world.”

For the last 60 or 70 years, the consensus of decent American opinion (cultural, political, and existential) has begged to differ, making no such outlandish concession. To do so would be weak-minded, offensive, and wrong, contrary to the doctrine of American exceptionalism that entered the nation’s bloodstream subsequent to its emergence from the Second World War crowned in victory, draped in virtue.

Military and economic command on the world stage fostered the belief that America was therefore exempt from the laws of nature, held harmless against the evils, death chief among them, inflicted on the lesser peoples of the Earth. The wonders of medical science raked from the ashes of the war gave notice of the likelihood that soon, maybe next month but probably no later than next year, death would be reclassified as a preventable disease.

That article of faith sustained the bright hopes and fond expectations of both the 1960s countercultural revolution (incited by a generation that didn’t wish to grow up) and the Republican Risorgimento of the 1980s (sponsored by a generation that didn’t choose to grow old). Joint signatories to the manifesto of Peter Pan, both generations shifted the question from “Why do I have to die?” to the more upbeat “Why can’t I live forever?”

The substituting of the promise of technology for the consolations of philosophy had been foreseen by John Stuart Mill as the inevitable consequence of the nineteenth century’s marching ever upward on the roads of social and political reform. Suffering in 1854 from a severe pulmonary disease, Mill noted in his diary on April 15, “The remedies for all our diseases will be discovered long after we are dead, and the world will be made a fit place to live in after the death of most of those by whose exertions it will have been made so.”

His premonition is now the just-over-the-horizon prospect of life everlasting bankrolled by Dmitry Itskov, a Russian multimillionaire, vouched for by the Dalai Lama and a synod of Silicon Valley visionaries, among them Hiroshi Ishiguro and Ray Kurzweil. As presented to the Global Future 2045 conference at Lincoln Center in New York City in June 2013, Itskov’s Avatar Project proposes to reproduce the functions of human life and mind on “nonbiological substrates,” do away with the “limited mortal protein-based carrier” and replace it with cybernetic bodies and holograms, a “neohumanity” that will “change the bodily nature of a human being, and make them immortal, free, playful, independent of limitations of space and time.” In plain English, lifelike human heads to which digital copies of the contents of a human brain can be downloaded from the cloud.

The question “Why must I die?” and its implied follow-up, “How then do I live my life?,” both admit of an answer by and for and of oneself. Learning how to die, as Montaigne goes on to rightly say, is unlearning how to be a slave. The question “Why can’t I live forever?” assigns the custody of one’s death to powers that make it their business to promote and instill the fear of it -- to church or state, to an alchemist or an engineer.

by Lewis Lapham, TomDispatch |  Read more:
Image: Norman Parkinson

Craig Cole, No junk or bills, 2013. Oil on canvas, 1.2 x 1.8.

Bologna.

The Truth About GMOs

Mama Moses has been growing bananas on her farm in southwestern Uganda for twenty years. She farms only bananas, which is typical of subsistence farmers in Sanga, the impoverished village where she lives. Last year, when she saw the flowers on her banana plants begin to shrivel and yellow bacteria ooze from the cut stems, she knew her crop was doomed. Within months the bacterial infection turned her healthy crop into a black, wilted mess.

Banana Xanthomonas wilt disease (BXW) is one of the greatest threats to banana production in Eastern Africa. Cultural practices provide some control, but they are ineffective during epidemics. More than a thousand kinds of banana can be found worldwide, but none has robust resistance to BXW. Even if resistance were identified, most scientists believe that breeding a new variety using conventional methods would take decades, assuming it is even possible.

BXW creates precisely the sort of food insecurity that affects the world’s poorest people. Bananas and plantains are the fourth most valuable food crop after rice, wheat, and maize. Approximately one-third of the bananas produced globally are grown in sub-Saharan Africa, where bananas provide more than 25 percent of the food energy requirements for more than 100 million people.

For anyone worried about the future of global agriculture, Mama Moses’s story is instructive. The world faces an enormous challenge: with changing diets and population growth of 2–3 billion over the next 40 years, the UN Food and Agriculture Organization predicts that food production will need to rise by 70 percent by 2050. Many pests and diseases cannot, however, be controlled using conventional breeding methods. Moreover, subsistence farmers cannot afford most pesticides, which are often ineffective or harmful to the environment.

Yet many emerging agricultural catastrophes can almost certainly be avoided thanks to a modern form of plant breeding that uses genetic engineering (GE), a process that has led to reduced insecticide use and enhanced productivity of farms large and small.

In spite of these benefits, genetic engineering is anathema to many people. In the United States, we’ve seen attempts to force labeling of genetically modified organisms (GMOs). In much of Europe, farmers are prohibited from growing genetically engineered crops and so must import grain from the United States. And “GMO-free” zones are expanding in Japan.

The strong distrust of GE foods is curious. Opponents typically profess a high degree of concern for human welfare and the environment. They want the same things that scientists, farmers, food security experts, and environmentalists want: ecologically sound food production accessible to a growing global population. But their opposition threatens the great strides that have been made toward these goals through deployment of new technologies.

For 10,000 years, we have altered the genetic makeup of our crops. Conventional approaches are often crude, resulting in new varieties through a combination of trial and error, without knowledge of the precise function of the genes being moved around. Such methods include grafting or mixing genes of distantly related species through forced pollinations, as well as radiation treatments to induce random mutations in seeds. Today virtually everything we eat is produced from seeds that we have genetically altered in one way or another.

Over the last twenty years, scientists and breeders have used GE to create crop varieties that thrive in extreme environments or can withstand attacks by pests and disease. Like the older conventional varieties, GE crops are genetically altered, but in a manner that introduces fewer genetic changes. Genetic engineering can also be used to insert genes from distantly related species, such as bacteria, directly into a plant.

Given that modern genetic engineering is similar to techniques that have served humanity well for thousands of years and that the risks of unintended consequences are similar whether the variety is derived from the processes of GE or conventional gene alteration, it should come as no surprise that the GE crops currently on the market are as safe to eat and safe for the environment as organic or conventional foods. That is the conclusion reached by diverse agricultural and food experts. There is broad consensus on this point among highly regarded science-based organizations in the United States and abroad, including the American Medical Association, the National Academy of Sciences, the World Health Organization, and the European Commission Joint Research Centre. In the seventeen years since GE crops were first grown commercially, not a single instance of adverse health or environmental effects has been documented.

by Pamela Ronald, Boston Review |  Read more:
Image: uncredited

The Eagles' Greatest Hit


[ed. See also: The Tao of Joe Walsh.]

I never put much thought into the Eagles. In high school, my friends and I assumed they were just another famous '70s band that splintered, then found an extended afterlife on classic rock stations. They stood out only because they sold a remarkable number of "Greatest Hits" albums. Everyone — and I mean everyone — had the first one. Their songs popped up consistently at our parties, but so did the Steve Miller Band and the Allman Brothers and 12 other groups from that era. I don't remember arguing about the Eagles, debating the meaning of "Hotel California," or even joking about Glenn Frey being pissed about Don Henley's then-scorching solo career.

Did I know that music critics picked them apart for being more successful than they should have been? Absolutely not. I never knew the band abused their bodies and went through groupies like they were Marlboro Reds. I never knew three different Eagles guitarists left the band for stereotypically awesome reasons: jealousy, infighting, warring creative visions, credit jockeying, even a beer that was derisively poured on Frey's head. I never knew when the Eagles split up, much less why, or if it mattered. That ubiquitous classic rock format kept every '70s band relevant. The Eagles were broken up, but really, they weren't.

Two years after I graduated college, they reunited for 1994's "Hell Freezes Over" tour, a shameless money grab disguised as their long-awaited reunion. Nostalgia rock had been generating big bucks for every past-its-prime act. Pink Floyd, Aerosmith and the Rolling Stones sold out stadiums like it was 1975. Billy Joel and Elton John toured America together, even overlapping for a few songs every show. And now, the Eagles were freezing hell. I remember having a chance to see them and quickly passing. Take it easy, Eagles.


From that point on, I never thought about them unless Chris Berman was involved. That changed this spring, right after Showtime started showing Alison Ellwood's documentary about them. I have watched The History of the Eagles, Part One five times, not counting all the other times it sucked me in for 15-minute stretches. I have participated in multiple Eagles-related e-mail chains that I may or may not have started. I have gone down Eagles-related rabbit holes on Google so cavernous that I once typed the words "Stevie Nicks Don Henley abortion." (Yes, things come up.) Two different times, a friend e-mailed me just to say, "I was talking about the Eagles doc with [fill in our mutual friend]. I had no idea you loved it, too!" (...)

You know what else? The Eagles were significantly bigger than I ever realized. Really, there wasn't a more successful, popular or famous American band in the 1970s. Even today, their first greatest hits album (released in 1976, almost one year before Hotel California came out) is still battling neck and neck with Thriller as the highest-selling album of all time. That dumbfounding fact alone made the Eagles worthy of a documentary, even if a 215-minute treatment was unquestionably overboard. Part One handles their creation and ascent, their battles with fame and cocaine (since when were those six words anything but awesome?), every major fight they ever had (ditto), every possible reason they broke up (ditto), and then their actual breakup after an acrimonious concert highlighted by Glenn Frey repeatedly threatening to kick a band mate's ass (even though Frey probably weighed a buck fifty at the time).

In my humble opinion, it's the finest documentary ever made about the rise and fall of a memorable rock band, as well as a superb commentary on the dangers of fame and excess. You'll recognize pieces of Almost Famous in it, and that's not by accident — Cameron Crowe covered them for Rolling Stone, eventually creating Stillwater as a hybrid of the Allman Brothers and the Eagles (with a little Led Zep mixed in). There's more than a little Frey and Henley in Jeff Bebe and Russell Hammond.

The film should have ended there. But since the band wanted something covering their entire history from 1971 until today, Part Two sprawlingly covers their post-breakup careers and their reunion. It's excessive, to say the least. I would have been fine with an eight-minute epilogue. Although I did enjoy Part Two's attempt to make Frey's acting career seem successful, as well as any Eagle pretending they returned for any reason other than "gobs and gobs of money." There's one unintentionally hilarious part: Henley and Frey painstakingly rehashing the creative process for "Get Over It" as if they're discussing "Hotel California" or something. I also enjoyed guitarist Don Felder bitching about reunion royalties; Felder believed he should be earning as much money as Henley and Frey when, again, he was Don Felder. It was like the 1993 Bulls reuniting, then Horace Grant fighting to be paid as much as Michael and Scottie.

Fine, you got me — I've watched Part Two twice even though it's 70 minutes too long. I can't help it. But Part One? Part One is magnificent. It's one of my favorite documentaries ever. Without further ado, my 20 favorite things about The History of the Eagles, Part One.

by Bill Simmons, Grantland |  Read more:
Image: Ken Garduno

The Shadow Commander

Last February, some of Iran’s most influential leaders gathered at the Amir al-Momenin Mosque, in northeast Tehran, inside a gated community reserved for officers of the Revolutionary Guard. They had come to pay their last respects to a fallen comrade. Hassan Shateri, a veteran of Iran’s covert wars throughout the Middle East and South Asia, was a senior commander in a powerful, élite branch of the Revolutionary Guard called the Quds Force. The force is the sharp instrument of Iranian foreign policy, roughly analogous to a combined C.I.A. and Special Forces; its name comes from the Persian word for Jerusalem, which its fighters have promised to liberate. Since 1979, its goal has been to subvert Iran’s enemies and extend the country’s influence across the Middle East. Shateri had spent much of his career abroad, first in Afghanistan and then in Iraq, where the Quds Force helped Shiite militias kill American soldiers.

Shateri had been killed two days before, on the road that runs between Damascus and Beirut. He had gone to Syria, along with thousands of other members of the Quds Force, to rescue the country’s besieged President, Bashar al-Assad, a crucial ally of Iran. In the past few years, Shateri had worked under an alias as the Quds Force’s chief in Lebanon; there he had helped sustain the armed group Hezbollah, which at the time of the funeral had begun to pour men into Syria to fight for the regime. The circumstances of his death were unclear: one Iranian official said that Shateri had been “directly targeted” by “the Zionist regime,” as Iranians habitually refer to Israel.

At the funeral, the mourners sobbed, and some beat their chests in the Shiite way. Shateri’s casket was wrapped in an Iranian flag, and gathered around it were the commander of the Revolutionary Guard, dressed in green fatigues; a member of the plot to murder four exiled opposition leaders in a Berlin restaurant in 1992; and the father of Imad Mughniyeh, the Hezbollah commander believed to be responsible for the bombings that killed more than two hundred and fifty Americans in Beirut in 1983. Mughniyeh was assassinated in 2008, purportedly by Israeli agents. In the ethos of the Iranian revolution, to die was to serve. Before Shateri’s funeral, Ayatollah Ali Khamenei, the country’s Supreme Leader, released a note of praise: “In the end, he drank the sweet syrup of martyrdom.”

Kneeling in the second row on the mosque’s carpeted floor was Major General Qassem Suleimani, the Quds Force’s leader: a small man of fifty-six, with silver hair, a close-cropped beard, and a look of intense self-containment. It was Suleimani who had sent Shateri, an old and trusted friend, to his death. As Revolutionary Guard commanders, he and Shateri belonged to a small fraternity formed during the Sacred Defense, the name given to the Iran-Iraq War, which lasted from 1980 to 1988 and left as many as a million people dead. It was a catastrophic fight, but for Iran it was the beginning of a three-decade project to build a Shiite sphere of influence, stretching across Iraq and Syria to the Mediterranean. Along with its allies in Syria and Lebanon, Iran forms an Axis of Resistance, arrayed against the region’s dominant Sunni powers and the West. In Syria, the project hung in the balance, and Suleimani was mounting a desperate fight, even if the price of victory was a sectarian conflict that engulfed the region for years.

Suleimani took command of the Quds Force fifteen years ago, and in that time he has sought to reshape the Middle East in Iran’s favor, working as a power broker and as a military force: assassinating rivals, arming allies, and, for most of a decade, directing a network of militant groups that killed hundreds of Americans in Iraq. The U.S. Department of the Treasury has sanctioned Suleimani for his role in supporting the Assad regime, and for abetting terrorism. And yet he has remained mostly invisible to the outside world, even as he runs agents and directs operations. “Suleimani is the single most powerful operative in the Middle East today,” John Maguire, a former C.I.A. officer in Iraq, told me, “and no one’s ever heard of him.”

When Suleimani appears in public—often to speak at veterans’ events or to meet with Khamenei—he carries himself inconspicuously and rarely raises his voice, exhibiting a trait that Arabs call khilib, or understated charisma. “He is so short, but he has this presence,” a former senior Iraqi official told me. “There will be ten people in a room, and when Suleimani walks in he doesn’t come and sit with you. He sits over there on the other side of room, by himself, in a very quiet way. Doesn’t speak, doesn’t comment, just sits and listens. And so of course everyone is thinking only about him.”

At the funeral, Suleimani was dressed in a black jacket and a black shirt with no tie, in the Iranian style; his long, angular face and his arched eyebrows were twisted with pain. The Quds Force had never lost such a high-ranking officer abroad. The day before the funeral, Suleimani had travelled to Shateri’s home to offer condolences to his family. He has a fierce attachment to martyred soldiers, and often visits their families; in a recent interview with Iranian media, he said, “When I see the children of the martyrs, I want to smell their scent, and I lose myself.” As the funeral continued, he and the other mourners bent forward to pray, pressing their foreheads to the carpet. “One of the rarest people, who brought the revolution and the whole world to you, is gone,” Alireza Panahian, the imam, told the mourners. Suleimani cradled his head in his palm and began to weep.

by Dexter Filkins, New Yorker |  Read more:
Image: Krzysztof Domaradzki.

Thursday, September 26, 2013

Nukes of Hazard

On January 25, 1995, at 9:28 a.m. Moscow time, an aide handed a briefcase to Boris Yeltsin, the President of Russia. A small light near the handle was on, and inside was a screen displaying information indicating that a missile had been launched four minutes earlier from somewhere in the vicinity of the Norwegian Sea, and that it appeared to be headed toward Moscow. Below the screen was a row of buttons. This was the Russian “nuclear football.” By pressing the buttons, Yeltsin could launch an immediate nuclear strike against targets around the world. Russian nuclear missiles, submarines, and bombers were on full alert. Yeltsin had forty-seven hundred nuclear warheads ready to go.

The Chief of the General Staff, General Mikhail Kolesnikov, had a football, too, and he was monitoring the flight of the missile. Radar showed that stages of the rocket were falling away as it ascended, which suggested that it was an intermediate-range missile similar to the Pershing II, the missile deployed by NATO across Western Europe. The launch site was also in the most likely corridor for an attack on Moscow by American submarines. Kolesnikov was put on a hot line with Yeltsin, whose prerogative it was to launch a nuclear response. Yeltsin had less than six minutes to make a decision.

The Cold War had been over for four years. Mikhail Gorbachev had resigned on December 25, 1991, and had handed over the football and the launch codes to Yeltsin. The next day, the Soviet Union voted itself out of existence. By 1995, though, Yeltsin’s popularity in the West was in decline; there was tension over plans to expand NATO; and Russia was bogged down in a war in Chechnya. In the context of nuclear war, these were minor troubles, but there was also the fact, very much alive in Russian memory, that seven and a half years earlier, in May, 1987, a slightly kooky eighteen-year-old German named Mathias Rust had flown a rented Cessna, an airplane about the size of a Piper Cub, from Helsinki to Moscow and landed it a hundred yards from Red Square. The humiliation had led to a mini-purge of the air-defense leadership. Those people did not want to get burned twice. (...)

But most of the danger that human beings faced from nuclear weapons after the destruction of Hiroshima and Nagasaki had to do with inadvertence—with bombs dropped by mistake, bombers catching on fire or crashing, missiles exploding, and computers miscalculating and people jumping to the wrong conclusion. On most days, the probability of a nuclear explosion happening by accident was far greater than the probability that someone would deliberately start a war. (...)

A study run by Sandia National Laboratories, which oversees the production and security of American nuclear-weapons systems, discovered that between 1950 and 1968 at least twelve hundred nuclear weapons had been involved in “significant” accidents. Even bombs that worked didn’t work quite as planned. In Little Boy, the bomb dropped on Hiroshima on August 6, 1945, only 1.38 per cent of the nuclear core, less than a kilogram of uranium, fissioned (although the bomb killed eighty thousand people). The bomb dropped on Nagasaki, three days later, was a mile off target (and killed forty thousand people). A test of the hydrogen bomb in the Bikini atoll, in 1954, produced a yield of fifteen megatons, three times as great as scientists had predicted, and spread lethal radioactive fallout over hundreds of square miles in the Pacific, some of it affecting American observers miles away from the blast site.

These stories, and many more, can be found in Eric Schlosser’s “Command and Control” (Penguin), an excellent journalistic investigation of the efforts made since the first atomic bomb was exploded, outside Alamogordo, New Mexico, on July 16, 1945, to put some kind of harness on nuclear weaponry. By a miracle of information management, Schlosser has synthesized a huge archive of material, including government reports, scientific papers, and a substantial historical and polemical literature on nukes, and transformed it into a crisp narrative covering more than fifty years of scientific and political change. And he has interwoven that narrative with a hair-raising, minute-by-minute account of an accident at a Titan II missile silo in Arkansas, in 1980, which he renders in the manner of a techno-thriller:
Plumb watched the nine-pound socket slip through the narrow gap between the platform and the missile, fall about seventy feet, hit the thrust mount, and then ricochet off the Titan II. It seemed to happen in slow motion. A moment later, fuel sprayed from a hole in the missile like water from a garden hose. 
“Oh man,” Plumb thought. “This is not good.”
“Command and Control” is how nonfiction should be written.

Schlosser is known for two popular books, “Fast Food Nation,” published in 2001, and “Reefer Madness,” an investigative report on black markets in marijuana, pornography, and illegal immigrants that came out in 2003. Readers of those books, and of Schlosser’s occasional writings in The Nation, are likely to associate him with progressive politics. They may be surprised to learn that, insofar as “Command and Control” has any heroes, those heroes are Curtis LeMay, Robert McNamara, and Ronald Reagan (plus an Air Force sergeant named Jeff Kennedy, who was involved in responding to the wounded missile in the Arkansas silo). Those men understood the risks of just having these things on the planet, and they tried to keep them from blowing up in our faces.

by Louis Menand, New Yorker |  Read more:
Image: Shout

Wednesday, September 25, 2013

Sarah Jarosz



No Communication No Love (by mutablend)

Earl Horter, Still Life. 1939.

Fried Mozzarella, Basil & Nectarine Stacks with Balsamic Glaze


Fried Mozzarella, Basil & Nectarine Stacks with Balsamic Glaze

Cook time: 5 mins | Total time: 15 mins | Yield: 4 stacks

Ingredients

4 large nectarines or peaches, sliced into rounds
12 large basil leaves
12 ounces fresh mozzarella, sliced into 8 thick round slices
1 cup panko bread crumbs
1/4 cup flour
1/3 cup parmesan cheese, grated
2 eggs, beaten
1/2 teaspoon salt
1/2 teaspoon pepper
1/4 teaspoon cayenne
Olive oil, for searing and frying
Balsamic Glaze
1/2 cup balsamic vinegar
1 teaspoon brown sugar (optional)

Instructions
  1. Add vinegar and brown sugar, if using, to a small saucepan and bring to a boil. Reduce to a very low simmer and cook for 10-15 minutes, until the liquid reduces by about half and is slightly syrupy. Remove from heat, pour the vinegar into a bowl or glass, and set aside to cool and thicken.
  2. In a large bowl, combine panko, flour, parmesan, salt, pepper and cayenne, mixing thoroughly to combine. In a small bowl, lightly beat the eggs. Take each slice of fresh mozzarella and coat it in the beaten egg, then dredge it through the bread crumb mix, pressing on both sides to adhere. Repeat with the remaining slices.
  3. Heat a large skillet over high heat. Add 1 teaspoon of olive oil and sear both sides of the nectarines for 1 to 2 minutes, until just warmed but still somewhat firm. Alternatively, you can grill the nectarines directly on the grill. Keep the nectarines warm.
  4. Add the remaining olive oil to the skillet and when hot, fry coated mozzarella, turning carefully once or twice, until golden and cheese starts to melt but still retains its shape, about 1 minute on each side. Drain on paper towels.
  5. To assemble, place one nectarine slice on a plate, top with 1 slice of fried mozzarella and then a basil leaf. Repeat the layers one more time and finish with a nectarine slice. Garnish with basil and freshly ground pepper. Drizzle on the balsamic glaze.

A woman surveys a treacherous mountain pass in the Pyrenees of France, 1956. Justin Locke, National Geographic.

Buy a House, Make Your Payments, Then Discover You've Been Foreclosed On Without Your Knowledge

A few months ago, Ceith and Louise Sinclair of Altadena, California, were told that their home had been sold. It was the first time they’d heard that it was for sale.

Their mortgage servicer, Nationstar, foreclosed on them without their knowledge, and sold the house to an investment company. Had the Sinclairs not taken their horror story to a local ABC affiliate, they would have been thrown out on the street, despite never missing a mortgage payment. It’s impossible to know how many homeowners who didn’t get the media to pick up their tale have dealt with a similar catastrophe, and eventually lost their home.

As finance writer Barry Ritholtz has explained, home purchases involve a series of precise safeguards, designed to protect property rights and prevent situations where borrowers who are perfect on their payments get evicted. “In a nation of laws, contract and property rights, there is no room for errors,” Ritholtz writes. “The only way these errors could have occurred is if several people involved in the process committed criminal fraud.”

Any observer of the mortgage industry since 2009 is no stranger to foreclosure fraud, and the fact that virtually nobody has paid the price for this crime. But the case of the Sinclairs involves a new player in that rotten game: Nationstar. Unheralded just a few years ago, the firm, owned by a private equity behemoth, has been buying up the rights to service mortgages, accepting monthly payments and distributing the proceeds to the owners of the loan, taking a little off the top for itself.

Nationstar has racked up an impressively horrible customer service record in its short life, failing to honor prior agreements with borrowers and pursuing illegal foreclosures. The fact that Nationstar and other corrupt companies like it are beginning to corner the market for mortgage servicing should trouble not only homeowners, but the regulators tasked with looking out for them. It didn’t seem possible that a broken mortgage servicing industry could get worse, but it has.

Nationstar is at the forefront of a massive shift in mortgage servicing. In the past few years, the largest servicers were arms of major banks, like JPMorgan Chase, Wells Fargo, Bank of America, Citi and Ally Bank. Those were the “big five” servicers sanctioned for an array of fraudulent conduct in the National Mortgage Settlement, which mandated specific standards for servicers to follow, like providing a single point of contact for customers and an end to “dual tracking,” when a servicer offers a trial modification to a borrower and pursues foreclosure at the same time.

The banks realized that they could sell the servicing rights and evade these standards, along with the higher labor costs associated with implementing them. What’s more, they would avoid new, higher capital requirements associated with holding servicing assets, allowing them to give bigger dividends to shareholders and bigger bonuses to executives.

So the big banks started selling off their servicing rights, not to other banks, but to specialty financial services firms like Green Tree, Nationstar, Walter Investment Management and Ocwen, all of whom are in kind of an arms race to become the biggest servicer.

Last October, Ocwen purchased the entire servicing portfolio of Ally Bank, covering about $329 billion in loans. Ocwen has also purchased part of JPMorgan Chase’s servicing, as well as a slice from OneWest Bank; it is attempting to dominate the market.

Nationstar acquired business from Bank of America and Aurora Bank in 2012, and more in 2013. Wells Fargo is poised to sell some servicing rights as well, and Nationstar will surely bid for those rights. As of June 30 of this year, Nationstar has the right to collect on $318 billion worth of home loans—growing three-fold in under two years—and it will seek to add even more in the future. The company, majority owned by the private equity firm Fortress Investment Group, recently raised $1.1 billion in capital to buy up more servicing rights from banks around the country.

This means that homeowners victimized by big-bank servicers, who were supposed to get a commitment to honest treatment as part of the National Mortgage Settlement, instead got their servicing rights sold to companies no longer bound by the terms of that settlement. So homeowners lose all of their protections, and often have to start back at square one with their new servicer. For example, if a borrower was in process on a loan modification with their old servicer, the new servicer can choose to simply not recognize that modification, and demand the full monthly payment under threat of foreclosure. This is a very common practice.

by David Dayen, Alternet |  Read more:
Image: Shutterstock.com/iQoncept

Tuesday, September 24, 2013


Shinichiro Saka, Pearl Fuji 

Melody Gardot


A Teacher and Her Student

[ed. Gilead, one of my favorite books.]

Marilynne Robinson was my fourth and final workshop instructor at the Iowa Writers’ Workshop. She is an intimidating intellectual presence—she once told us that to improve characterization, we should read Descartes. When I asked her to sign my copy of Gilead, she admitted she had recently become fascinated by ancient cuneiform script. But she is also generous and quick to laugh—when she offered to have us to her house for dinner, and I asked if we ought to bring food, she replied, “Or perhaps I will make some loaves and fishes appear!” Then she burst into giggles.

After receiving my MFA this May, I left Iowa believing that there’s no good way to be taught how to write, to tell a story. But there is also no denying that Marilynne has made me a better writer. Her demands are deceptively simple: to be true to human consciousness and to honor the complexities of the mind and its memory. Marilynne has said in other interviews that she doesn’t read much contemporary fiction because it would take too much of her time, but I suspect it’s also because she spends a fair amount of her mental resources on her students.

Our interview was held on one of the last days of the spring semester. The final traces of the bitter winter had disappeared, and light filled the classroom, which now felt empty with just the two of us. My two years at Iowa were over, and I selfishly wanted to stretch the interview for as long as possible.

You recently told the class you had discovered the ending to your new novel—or so you hoped. How does that happen for you? How do you know?

A lot of the experience of the novel—after the beginning—is being in the novel. You set yourself with a complex problem. If it’s a good problem or one that really engages you, then your mind works on it all the time. A novel by its nature is new. The great struggle, conscious or unconscious, is to make sure that it is new. That it actually has raised issues that deserve to be dealt with in their own terms. They’re not terms that you have seen elsewhere. It’s sort of like composing music. There are options that open and options that disappear, depending on how you develop the guidelines. You think about it over time. And then something will appear, something that is the most elegant response to the question that you’ve asked yourself. And it can absorb the most in terms of the complexities that you’ve created.

It struck me when you said we must “trust the peripheral vision of our mind.” It seems like a muscle in your body that you have to develop by training some other part of you.

One reaches for analogies. I think it’s probably a lot like meditation—which I have never practiced. But from what I understand, it is a capacity that develops itself and that people who practice it successfully have access to aspects of consciousness that they would not otherwise have. They find these large and authoritative experiences. I think that, by the same discipline of introspection, you have access to a much greater part of your awareness than you would otherwise. Things come to mind. Your mind makes selections—this deeper mind—on other terms than your front-office mind. You will remember that once, in some time, in some place, you saw a person standing alone, and their posture suggested to you an enormous narrative around them. And you never spoke to them, you don’t know them, you were never within ten feet of them. But at the same time, you discover that your mind privileges them over something like the Tour d’Eiffel. There’s a very pleasant consequence of that, which is the most ordinary experience can be the most valuable experience. If you’re philosophically attentive you don’t need to seek these things out.

In a way, it seems more difficult. Like trying to look beautiful without makeup.

Harder in some cases than others. It is hard. Frankly, I think most people would think that if you look beautiful without makeup, you’re more truly beautiful than if you’re beautiful with makeup. Although that’s an argument in and of itself. If it were simply discipline, like learning to juggle, or something like that, that’s one thing. But it’s finding access into your life more deeply than you would otherwise. Consider this incredibly brief, incredibly strange experience that we have as this hypersensitive creature on a tiny planet in the middle of somewhere that looks a lot like nowhere. It’s assigning an appropriate value to the uniqueness of our situation and every individual situation.

by Thessaly La Force, Vice | Read more:
Image: Denise Nestor

It's Hip to be Hip, Too


Those of us in our 30s and younger have come of age during a time of incessant media-based self-reflection. Not of the meaningful, “Where do I fit into the universe?” kind that might've passed for existential maturation in a more philosophical era, but of a more superficial stripe. “What is my personal brand?” we ask ourselves. It's a question that was a lot easier to answer in the past, when there were only so many brands to choose from, and when a career or class did most of the heavy lifting for you. Today the perpetually splintering brackets of contemporary demographic specificity engender an eternal anxiety of self, one in which we're meant to renew our vows of identity with regularity. And the choices are many. Identifying as bros, tech nerds, foodies, gamers, health-conscious types, fashionistas, politicos, or the sports-obsessed are all viable branding options. There's just one type that we're not supposed to assume for ourselves, which is strange, because we're all obsessed with it: the hipster.

This overarching identity dilemma is one born of aggressive social-media expressions, singing songs of ourselves each day and launching them unto the world to either coalesce in harmony with our peers, or to serve as a jarring counterpoint. Nowhere is this type of perpetually refreshing navel-gazing better illustrated than in our repeated investigation into the idea of hipsterhood. Barely a week goes by where we're not confronted by it—28 Signs You're A Hipster, What Was the Hipster?, and so on. More often than not, these come, for some reason, in the paper of record. This past weekend Steven Kurutz contemplated his own unexpected metamorphosis into this most picked-clean carcass of identity. “My initial surprise was replaced by a stark realization: as a 30-something skinnyish urban male there’s almost nothing I can wear that won’t make me look like a hipster,” he wrote, surprised to find himself enlisted into a community he never volunteered for. “Such is the pervasiveness of hipster culture that virtually every aspect of male fashion and grooming has been colonized.”

The versatility of the hipster signifier is what makes it such an empty avenue of exploration in goofy listicles and trend pieces, while also engendering skeptics' frustration with its dogged refusal to go away. Public approval of hipsters is at 16 percent, according to a recent poll—Congress looks good in comparison. As Kurutz notes, almost everything can be woven into the hipster fabric now; it's a choose-your-own-ending story where every option leads to the same page, you standing there in some silly hat or other. White guy with a beard? Hipster. Black dude on a skateboard? Hipster. Just a sort-of-skinny cop? Hipster. Woman riding a bike? Hipster. You can play either a mandolin or a turntable and somehow still be a hipster. No rules! As a result, hipsters have become both an object of incessant scorn and one of endless fascination. When a hipster can be defined as anything, the label essentially means nothing—that's an undeniably appealing paradox to poke at.

One thing that seems universally agreed upon, however, as most of these pieces about what constitutes hipsterhood point out—and the thing that makes Kurutz such an obvious candidate for hipsterhood himself—is that no one, even the most self-evidently hipster among us, wants to admit to fitting the description. The only rule of hipster club is don't admit you're a member of hipster club. Nothing could be seen as less hip than actually wanting to align oneself with a superficial demo. That's exactly the wrong attitude, it seems to me, if we're to pin down this mercurial concept. The original hipster was someone who bucked the status quo and jumped out ahead of the curve. So, unlike Kurutz and the thousands who have come before him bending themselves into logical pretzels trying to shrug off the designation, I'd like to affirm my hipsterhood—with pride.

by Luke O'Neil, Slate |  Read more:
Image: Luke O'Neil