Sunday, December 7, 2014

Do the Most Hipster Thing Possible—Move to Des Moines

[ed. I could live in Des Moines.]

Des Moines, Iowa—This is too nice a place to spawn a war cry. But if the city had one, it would be the sentiment heard across a downtown populated by baristas, tech start-up founders, musicians, and nonprofit professionals alike: "It's Des Moines against the world."

Young people here know what you think of this city. It doesn't need repeating. But ambitious minds are in the process of building a new Des Moines, a tech hub in Silicon Prairie, an artistic center in the Heartland, a destination for people who want to create something meaningful outside of the limits imposed by an oversaturated city like Chicago or New York.

That's exactly what former Brooklynite Zachary Mannheimer sought seven years ago. Mannheimer, 36, had launched restaurants and theater projects in New York, but he wanted to find a city where he could tap local artistic talent and revitalize a stagnant urban community. He visited 22 cities in eight weeks during the summer of 2007, and fell for this Midwestern capital, where he founded the Des Moines Social Club, a nonprofit center for the arts. The Social Club is now lodged in an old firehouse built in 1937, and has a theater, classrooms, bars, art gallery, and adjoining restaurant—and it hosts events every night of the week. An average of 20,000 visitors come through every month, perhaps for a WWE-style wrestling match or an aerial arts class or a punk show.

Mannheimer created something that would have taken the rest of his life and $300 million to complete if he'd stayed in New York. It took him seven years and $12 million. He also left his crappy, expensive apartment in Brooklyn for comparatively lavish digs in Des Moines. Now, he wants people living in New York or Chicago or Washington to think about doing the same.

"How much are you working every day? How much are you being paid? How much is your cost of living?" Mannheimer asks. "What if I told you we have per capita the same amount of cultural amenities here that you do in New York? Get over your, 'How do we even pronounce Des Moines?' and 'Where is it?' and 'Why should I even care about it?' Get over it, and come out here and visit."

Besides, he says, "In the world of hipsters, is there anything more ironic than coming to live in Des Moines, as opposed to living in Brooklyn?"

On paper, Des Moines has the assets to back up Mannheimer's pitch: Cost of living is six percentage points below the national average, median salary is $51,200, job growth is 2.9 percent, there is one company with 500 or more employees for every 612 people, and millennials are pouring into Des Moines at a higher rate than they are nationally. Forbes even lists it as the best city for young professionals. (...)

"We always joke that Des Moines is a big small town," says Heggen, a project manager for a firm that transforms old art deco buildings into new apartments. "But really, Des Moines is a large living room. There's this homey feel. What I most want is everybody around me to be successful. And I believe that everyone wants that for me, as well."

Sanchez, too, moved to Des Moines "to start building things, to do something bigger than yourself." Her hope in starting a chapter, she says, was that maybe more young professionals would move to Des Moines. Or to borrow a line from a movie set in Iowa: If you build it, they will come.

Talking Heads frontman and Des Moines fan David Byrne touched on that idea at the Social Club's launch party in this same courtyard, where he pondered why a music scene or an artistic scene or a theater scene develops in any city. "What makes it happen?" he asked the crowd of 500. "It's hard to say. There's no guarantees, but it is possible and it's certainly not going to happen unless there are places like this. And, sad for me to say, it's not going to happen in Manhattan anymore, which means it's up to you guys."

by Matt Vasilogambros and Mauro Whiteman, National Journal |  Read more: 
Image: Wikipedia

Is It Selfish to Want an Afterlife?

“Do I really want it, this self, these scattered fingerprints on the air, to persist forever, to outlast the atomic universe?

Those who scoff at the Christian hope of an afterlife have on their side not only a mass of biological evidence knitting the self-conscious mind tight to the perishing body but a certain moral superiority as well: isn’t it terribly, well, selfish, and grotesquely egocentric, to hope for more than our animal walk in the sun, from eager blind infancy through the productive and procreative years into a senescence that, by the laws of biological instinct as well as by the premeditated precepts of stoic virtue, will submit to eternal sleep gratefully? Where, indeed, in the vast spaces disclosed by modern astronomy, would our disembodied spirit go, and, once there, what would it do?

In fact we do not try to picture the afterlife, nor is it our selves in our nervous tics and optical flecks that we wish to perpetuate; it is the self as window on the world that we can’t bear to think of shutting. My mind when I was a boy of ten or eleven sent up its silent screams at the thought of future aeons – at the thought of the cosmic party going on without me.

The yearning for an afterlife is the opposite of selfish: it is love and praise for the world that we are privileged, in this complex interval of light, to witness and experience. Though some believers may think of the afterlife as a place of retribution, where lives of poverty, distress, and illness will be compensated for, and where renunciations will be rewarded – where the last shall be first, in other words, and those that hunger and thirst shall be filled – the basic desire, as Unamuno says in his Tragic Sense of Life, is not for some otherworld but for this world, for life more or less as we know it to go on forever: ‘The immortality that we crave is a phenomenal immortality – it is the continuation of this present life.'”

by John Updike, Self-Consciousness: Memoirs, TBP | Read more:
Image: uncredited

Friday, December 5, 2014


photo: markk

Will The Torture Report Be Buried After All?

This is an outrage:
Secretary of State John Kerry personally phoned Dianne Feinstein, chairman of the Senate Select Committee on Intelligence, Friday morning to ask her to delay the imminent release of her committee’s report on CIA torture and rendition during the George W. Bush administration, according to administration and Congressional officials. Kerry was not going rogue — his call came after an interagency process that decided the release of the report early next week, as Feinstein had been planning, could complicate relationships with foreign countries at a sensitive time and posed an unacceptable risk to U.S. personnel and facilities abroad.
First, the Obama administration set up a whitewash, in the form of the Durham investigation; then they sat back as the CIA tried to sabotage the Senate Select Committee on Intelligence; then Obama’s chief of staff prevented the report’s publication for months by insisting on redactions that left it nearly unintelligible; and now, with mere days to go, the administration suddenly concludes that a factual accounting of this country’s descent into barbarism poses “an unacceptable risk” to US personnel abroad.

Now, after this report has been stymied for two years; now, just days before its scheduled publication; now comes this phone call, because if the administration can prevent its publication this month, they know full well that the Republicans who will control the committee in January will bury the evidence of grotesque and widespread torture by the US forever.

Of course this complicates relationships with foreign countries; of course it guts any remaining credibility on human rights the US has; of course the staggering brutality endorsed by the highest echelons in American government will inflame American enemies and provoke disbelief across the civilized world. But that’s not the fault of the report; it’s the fault of the torture regime and its architects, many of whom have continued to operate with total impunity under president Obama.

Make no mistake about it: if this report is buried, it will be this president who made that call, and this president who has allowed this vital and minimal piece of accountability to be slow-walked to death and burial, and backed the CIA every inch of the way. But notice also the way in which Kerry’s phone-call effectively cuts the report off at its knees. If it is released, Obama will be able to say he tried to stop it, and to prevent the purported damage to US interests and personnel abroad. He will have found a way to distance himself from the core task of releasing this essential accounting. And he will have ensured that the debate over it will be about whether the report is endangering Americans, just as the Republican talking points have spelled out, rather than a first step to come to terms with the appalling, devastating truth of what the American government has done.

by Andrew Sullivan, The Dish |  Read more:
Image: Charles Ommanney/Getty Images

Companions in Misery


I had just arrived home from my summer vacation — a week in a Minnesota cabin whose brochure warned “no crabbiness allowed” — when I came upon a study that declared New York the “unhappiest city in America.” I doubt many people were surprised by the results — New Yorkers, both in lore and reality, can be hard to please, and famously outspoken about their grievances — but as a born-and-raised New Yorker, and as a philosopher, I was suspicious of how the study defined happiness.

The survey in question, conducted by the Centers for Disease Control and Prevention, asked how “satisfied” Americans were with their lives — very satisfied, satisfied, dissatisfied or very dissatisfied. But the National Bureau of Economic Research used the data to conclude things about their “happiness.” Some might not have minded that the terms satisfaction and happiness were used interchangeably, but I did. The study was titled “Unhappy Cities,” and the headlines that followed it came out swinging against New Yorkers.

I was certain that a person (even a New Yorker) could be both dissatisfied and happy at once, and that the act of complaining was not in fact evidence of unhappiness, but something that could in its own way lead to greater happiness.

At times like this I appreciate philosophers’ respect for words, and a number of them have argued to keep happiness separate from satisfaction. In his 1861 essay “Utilitarianism,” John Stuart Mill carefully distinguished between the two, saying that a person can be satisfied by giving the body what it craves, but that human happiness also involves motivating the intellect. This means that happiness and satisfaction will sometimes conflict, and that those of us who seek happiness, and even attain it, may still be dissatisfied. Mill considered this a good thing: “It is better to be a human being dissatisfied than a pig satisfied, better to be Socrates dissatisfied than a fool satisfied.”

The 19th-century German philosopher Arthur Schopenhauer, one of history’s best-known pessimists, also believed there was more to life than satisfaction. Better to honestly describe a negative world, he believed, than to conceal it with beautiful lies. That sounds very New York.

There’s plenty to complain about when living in a big city: overcrowding, potholes, high prices, train delays, cyclists, bees. When I was growing up in Rockaway and schlepping to school in Brooklyn, it was perfectly normal to complain, and almost everyone I knew did. Our complaining was not an indicator of our level of happiness. In my experience outside the city, however, people routinely misinterpret my casual expressions of dissatisfaction as unhappiness. They consider complaining to be a sign of negativity, which they think should be replaced with positivity in order to be happy. “If you don’t have something nice to say, don’t say anything at all” is an example of this ubiquitous, if banal, attitude. (...)

The 20th-century Spanish philosopher Miguel de Unamuno didn’t recommend banishing the negative emotions or “keeping on the sunny side of life.” In “The Tragic Sense of Life” he described his anxiety over the prospect that there might be no afterlife, adding that he failed to understand people who had not once been similarly tormented by this or by the certainty of their own death.

Unamuno believed that a life worth living consists in communing with others, and that this happens most genuinely through negativity. In “My Religion,” Unamuno wrote: “Whenever I have felt a pain I have shouted and I have done it publicly” in order to “start the grieving chords of others’ hearts playing.” For Unamuno, authentic love is found in suffering with others, and negativity is necessary for compassion and understanding. If we try to deny, hide or eradicate the negative from our lives, we will be ill-equipped to deal with people who are suffering.

Complaining is useful, but we must first shatter and rebuild what “useful” means. My son is not crying in the car to get home faster; he is crying because he is trapped. When I get trapped in crummy situations I too cry, whine, complain. I get it out. I vent. I do these things because they are useful, but not the kind of useful that people usually have in mind. Usefulness doesn’t exclusively mean undoing what we don’t like about our situation; it can also mean dealing with our situation creatively. I use negativity both to change myself — to release disappointment, anger and frustration — and more important, to connect with others.

by Mariana Alessandri, NY Times |  Read more:
Image: Brecht Vandenbroucke

Thursday, December 4, 2014


Louise Peterson, Lethal Weapons
via:

Tanigami Kounan (1879-1928) 谷上廣南 Laelia, 1917
via:

Jackie's Goodbye

[ed. Public service announcement: Please. I know it's a bummer, but if you have an aging parent, read this. It provides an excellent account of nearly everything you need to know about caring for someone with Alzheimer's, or any other form of dementia - financial, emotional, bureaucratic. It parallels almost exactly the learning and decision-making process I had to go through when my mom caught pneumonia and her dementia suddenly red-lined. Pay close attention: Medicare vs. Medicaid (program restrictions associated with each and required documentation); VA benefits if applicable (time involved to get them and financial requirements - short answer, up to a year or more and in the end not worth it); nursing and assisted care facilities - their various qualifications, services and costs; home health care options; and hospital discharge policies. I'd also add: be prepared with health care directives and power of attorney, and expect strong differences of opinion (between siblings or surviving spouses over the necessity and costs of whatever care is involved). It is a nightmare. Mom, we tried our best. I still miss you so much.]

I became an Alzheimer's caregiver the week of my 29th birthday. It was August 2012, and I was standing at my kitchen counter in Washington when I got a call from a family friend telling me, "We have a problem." My father had been hospitalized with congestive heart failure. For seven years, he'd been the primary caregiver for his older sister, who had Alzheimer's disease. Without his oversight, she had followed his hospitalization with one of her own after collapsing in her bedroom from dehydration, or low blood sugar, or both. My 66-year-old aunt was a widow with no children. My father was a divorced bachelor, and I was an only child. They were my responsibility.

I had thought I would drive the eight hours to my hometown in South Carolina to get my aunt, Jackie Belcoe, settled back at home, and perhaps hire a nurse to come help out during the day. But when I got there, I found a much graver situation than I had expected.

Tucked into her hospital bed at Lexington Medical Center, Jackie looked so frail and sick that it was heartbreaking. She had been a hairdresser for many years and once owned her own salon. She was the woman who taught me to wear lipstick, who never left the house without her mascara on and her blond bob perfectly styled. Now, her hair was matted and unkempt. She needed a bath and her teeth brushed.

In the emergency room, a nurse had cut the urine-soaked T-shirt off her body. When the paramedics found her, she told them she was 19 and lived at home with her parents.

Though her parents were no longer alive, it was true that she lived in the house where she had grown up. I soon learned that conditions there were as deplorable as the state she was found in. Her bed and sheets were soiled, and dirty laundry had been left beside the washer. A trail of feces stained the carpet from the bed to the bathroom. It was clear that Jackie, like many late-stage dementia patients, had become incontinent—a fact that perhaps a caregiver who was also a brother was too ashamed to admit. Full trash bags were piled in the kitchen. Shards of broken cups were scattered on the floor. The mess had attracted pests, and mice and flies had invaded the brick ranch house.

I traced the chaos to my father's own declining health. That spring, after years with a weak heart, he took leave from work. He tried to stay upbeat and not worry me. I had stopped by to see them in recent months, but kept my visits short. It was so hard, seeing Jackie the way she was. Now, I wondered, how had I missed that something was terribly wrong? Or had I just not wanted to see?

For years, I had been pressuring my dad to think about the long-term plan. What would we do if Jackie needed more support than we could provide at home? Should we decide on a facility where we could place her if the time came?

We had to consider a nursing home, I assumed. My hand had been forced. Naively, at first I didn't think about the money involved. It had been a relief when Jackie reached age 65, with all its attendant public benefits. Surely, I thought, Medicare would cover the kind of care she needed.

"There is nothing medically wrong with her," the hospital social worker told me.

I was incredulous. "What do you mean nothing is wrong with her?" I implored. "Her brain is decaying. If she was left alone, she would die."

The social worker informed me that there was nothing wrong with Jackie that warranted a longer hospital stay or a transfer to a skilled nursing home. What she meant was that Jackie needed custodial care—help with eating, dressing, and bathing. She needed a watchful eye, the adult equivalent of day care. She did not need the assistance of a registered nurse or another medical professional who could administer IVs or monitor complicated equipment and treatments.

Medicare pays for hospital stays and short-term, skilled nursing care for older Americans. It does not cover the kind of custodial care Jackie required, and it generally does not pay for long-term stays in a nursing home or a dementia care unit, a fact nearly 40 percent of Americans over 40 don't fully realize, according to a poll from the Associated Press-NORC Center for Public Affairs Research. Medicaid, designed to provide health care to the poorest Americans, can pay for nursing home residence and long-term care. However, in some states, such as South Carolina, it cannot be used to cover room and board in assisted living or an assisted-living facility's dementia care unit—that is, the kinds of places that provide custodial care to those who don't qualify for nursing homes. Medicaid supports some at-home services, but only if states apply for waivers. (There is also a program in South Carolina and other states that can supplement payments to assisted-living facilities for Medicaid-eligible residents, but Jackie, like many other seniors, did not meet its stringent income and resource limits.)

Jackie did not qualify for Medicaid outright: Her assets and her monthly Social Security income of $1,223 disqualified her from South Carolina's basic 2012 Medicaid limits of $2,000 in resources and a monthly income of just over $900 (the limit is now $973). Given her needs, she could have possibly qualified for some in-home benefits—such as visits from a nursing aide—through the state's Medicaid programs, some of which have higher income thresholds. But it would have taken months to get through the paperwork, and even with some Medicaid supports, I knew she needed full-time caregiving, a role my father could no longer fulfill.

My dad fought me when I first suggested moving Jackie to an assisted-living facility. He didn't want to institutionalize her. He also didn't know how we were going to pay for it.

I soon learned what my father already knew: Brochure after brochure in his files showed glossy photos of luxury dementia care units in our area with 24-hour supervision, secured access to prevent patients from wandering, and life-enrichment programs for the memory-impaired. I called them. A family member toured many of them. Most cost between $4,000 and $6,000 a month out-of-pocket. My dad made just under $29,000 a year working as a welder in the maintenance department at the University of South Carolina. His house was on the brink of foreclosure. Years of health crises had left him and Jackie with very little savings. The annual cost of an assisted-living facility with dementia care was more than double what my father made annually and nearly four times Jackie's income. What I had assumed was procrastination or denial on my father's part was really paralysis.

Hospitals, though, do not like you to overstay your welcome, and Jackie had not been a model patient. More than once, she had gotten out of bed and wandered down the hall and into other patients' rooms. She got agitated, and the staff had to physically restrain her, wrapping her in a vest so she could no longer move. She grew so fitful that a doctor prescribed her a regimen of anti-anxiety drugs, hoping she would just go to sleep.

On the day Jackie was discharged, it was nearly impossible to wake her. The combination of medication and a new environment had made her sleeping patterns even more erratic than usual. She appeared lethargic to the point of being catatonic.

Soon, however, she became more aggressive. She gripped the sheets tighter each time I tried to remove them. When I finally had her sitting up, she didn't want to put on the pair of pants I had brought her. I lifted her hospital gown to put them on her myself. "Stop! Stop!" she yelled. "What are you doing?" In a flash of anger, she pulled her fist back to hit me. She relented when I grabbed her hand.

"What are you doing?" she kept repeating, as we fought over getting dressed."What are you doing?"

The truth was I didn't know what I was doing. And I didn't know what we were going to do.

by Tiffany Stanley, National Journal |  Read more:
Images: Adrià Fruitós and Stanley Family

Lin Fengmian, Standing
via:

Tiny House


A 280 sq. ft. tiny house in Aurora, Oregon. More here:
via:

Nikolas Gambaroff, Untitled, 2010
via:

Shell 'Art' Made 300,000 Years Before Humans Evolved


The artist – if she or he can be called that – was right-handed and used a shark's tooth. They had a remarkably steady hand and a strong arm. Half a million years ago, on the banks of a calm river in central Java, they scored a deep zigzag into a clam shell.

We will never know what was going on inside its maker's head, but the tidy, purposeful line (pictured above right) has opened a new window into the origins of our modern creative mind.

It was found etched into the shell of a fossilised freshwater clam, and is around half a million years old – making the line by far the oldest engraving ever found. The date also means it was made two to three hundred thousand years before our own species evolved, by a more ancient hominin, Homo erectus.

"It is a fascinating discovery," says Colin Renfrew, an archaeologist at the University of Cambridge. "The earliest abstract decoration in the world is really big news."

The shell was dug up in Trinil, Indonesia, in the 1890s by Dutch geologist Eugene Dubois, and was one of many fossil finds in the area, including bones of Homo erectus and several animals.

The shell collection sat in a museum in Leiden, the Netherlands, for over a century. Seven years ago, PhD student Stephen Munro, now at the Australian National University in Canberra, was in the country for a few days and stayed with archaeologist Josephine Joordens of the University of Leiden. She was re-exploring the Dubois collection at the time, and as Munro was also studying ancient molluscs, Joordens encouraged him to take a look. Pressed for time, he photographed each one before heading back to Australia.

"A week later I received an email," Joordens recalls. "He wrote that there was something strange on one of the shells and did I know what it was?"

by Catherine Brahic, New Scientist |  Read more:
Image: uncredited

How Exxon Helped Make Iraqi Kurdistan

[ed. Spreading freedom throughout the Middle East, one oil field at a time.]

In January 2011, Exxon hired one of the best connected men in Iraq: Ali Khedery, an American of Iraqi descent who had served in Baghdad as a special assistant to five U.S. ambassadors and a senior adviser to three U.S. generals.

At a meeting with Exxon a few months later to analyze Iraq's future, Khedery laid out his thoughts.

Iraq under Prime Minister Nouri al-Maliki was moving toward dictatorship and civil war, he said he told the session. "We will see a rise in violence and a total paralysis in Baghdad," he recalled saying. Iraq was likely to align itself more closely with Iran, which will "have an adverse impact on U.S. companies."

The gloomy scenario grabbed the attention of Exxon executives. Just two years earlier, they had signed a $25 billion deal with Iraq to develop West Qurna, one of the largest oil fields in the country.

"No one wanted to hear that they had negotiated a multi-billion dollar deal in a country which will soon implode," said Khedery, who has detailed to Reuters the meeting and subsequent events for the first time.

He suggested an alternative: Kurdistan, a semi-autonomous region in northern Iraq that was politically stable, far from the chaos in the south, and had, by some estimates, oil reserves of 45 billion barrels.

Less than a year later, Exxon signed a deal with Kurdistan. The story of how that happened explains much about the would-be nation's growing power.

Interviews with key players in the secret 2011 negotiations - the talks involved not just Exxon but also fellow Western oil giant Royal Dutch Shell - show how Exxon's decision to invest infuriated both Washington and Baghdad, and helped propel Kurdistan closer to its long-held goal of independence.

Kurds like to say they are the world's largest ethnic group without a state. Numbering some 35 million, they inhabit a band that stretches from Syria across southern Turkey and northern Iraq and into Iran. Most follow Sunni Islam and speak their own distinct languages.

The Exxon deal fueled Kurdish self-belief. The presence of the biggest U.S. oil company has helped not just financially but also politically and even psychologically.

by Dmitry Zhdannikov, Isabel Coles and Ned Parker, Reuters |  Read more:
Image: Reuters/Brendan Smialowski

The Golden Quarter

We live in a golden age of technological, medical, scientific and social progress. Look at our computers! Look at our phones! Twenty years ago, the internet was a creaky machine for geeks. Now we can’t imagine life without it. We are on the verge of medical breakthroughs that would have seemed like magic only half a century ago: cloned organs, stem-cell therapies to repair our very DNA. Even now, life expectancy in some rich countries is improving by five hours a day. A day! Surely immortality, or something very like it, is just around the corner.

The notion that our 21st-century world is one of accelerating advances is so dominant that it seems churlish to challenge it. Almost every week we read about ‘new hopes’ for cancer sufferers, developments in the lab that might lead to new cures, talk of a new era of space tourism and super-jets that can fly round the world in a few hours. Yet a moment’s thought tells us that this vision of unparalleled innovation can’t be right, that many of these breathless reports of progress are in fact mere hype, speculation – even fantasy.

Yet there once was an age when speculation matched reality. It spluttered to a halt more than 40 years ago. Most of what has happened since has been merely incremental improvements upon what came before. That true age of innovation – I’ll call it the Golden Quarter – ran from approximately 1945 to 1971. Just about everything that defines the modern world either came about, or had its seeds sown, during this time. The Pill. Electronics. Computers and the birth of the internet. Nuclear power. Television. Antibiotics. Space travel. Civil rights.

There is more. Feminism. Teenagers. The Green Revolution in agriculture. Decolonisation. Popular music. Mass aviation. The birth of the gay rights movement. Cheap, reliable and safe automobiles. High-speed trains. We put a man on the Moon, sent a probe to Mars, beat smallpox and discovered the double-spiral key of life. The Golden Quarter was a unique period of less than a single human generation, a time when innovation appeared to be running on a mix of dragster fuel and dilithium crystals.

Today, progress is defined almost entirely by consumer-driven, often banal improvements in information technology. The US economist Tyler Cowen, in his essay The Great Stagnation (2011), argues that, in the US at least, a technological plateau has been reached. Sure, our phones are great, but that’s not the same as being able to fly across the Atlantic in eight hours or eliminating smallpox. As the US technologist Peter Thiel once put it: ‘We wanted flying cars, we got 140 characters.’ (...)

But surely progress today is real? Well, take a look around. Look up and the airliners you see are basically updated versions of the ones flying in the 1960s – slightly quieter Tristars with better avionics. In 1971, a regular airliner took eight hours to fly from London to New York; it still does. And in 1971, there was one airliner that could do the trip in three hours. Now, Concorde is dead. Our cars are faster, safer and use less fuel than they did in 1971, but there has been no paradigm shift.

And yes, we are living longer, but this has disappointingly little to do with any recent breakthroughs. Since 1970, the US Federal Government has spent more than $100 billion in what President Richard Nixon dubbed the ‘War on Cancer’. Far more has been spent globally, with most wealthy nations boasting well-funded cancer-research bodies. Despite these billions in investment, this war has been a spectacular failure. In the US, the death rates for all kinds of cancer dropped by only 5 per cent in the period 1950-2005, according to the National Center for Health Statistics. Even if you strip out confounding variables such as age (more people are living long enough to get cancer) and better diagnosis, the blunt fact is that, with most kinds of cancer, your chances in 2014 are not much better than they were in 1974. In many cases, your treatment will be pretty much the same.

For the past 20 years, as a science writer, I have covered such extraordinary medical advances as gene therapy, cloned replacement organs, stem-cell therapy, life-extension technologies, the promised spin-offs from genomics and tailored medicine. None of these new treatments is yet routinely available. The paralyzed still cannot walk, the blind still cannot see. The human genome was decoded (one post-Golden Quarter triumph) nearly 15 years ago and we’re still waiting to see the benefits that, at the time, were confidently asserted to be ‘a decade away’. We still have no real idea how to treat chronic addiction or dementia. The recent history of psychiatric medicine is, according to one eminent British psychiatrist I spoke to, ‘the history of ever-better placebos’. And most recent advances in longevity have come about by the simple expedient of getting people to give up smoking, eat better, and take drugs to control blood pressure.

There has been no new Green Revolution. We still drive steel cars powered by burning petroleum spirit or, worse, diesel. There has been no new materials revolution since the Golden Quarter’s advances in plastics, semi-conductors, new alloys and composite materials. After the dizzying breakthroughs of the early- to mid-20th century, physics seems (Higgs boson aside) to have ground to a halt. String Theory is apparently our best hope of reconciling Albert Einstein with the Quantum world, but as yet, no one has any idea if it is even testable. And nobody has been to the Moon for 42 years.

Why has progress stopped? Why, for that matter, did it start when it did, in the dying embers of the Second World War?

by Michael Hanlon, Aeon |  Read more:
Image: courtesy Dick Swanson/U.S. National Archives

Wednesday, December 3, 2014


Jonas Wood
via:

Mansplanation Nation

There's something endearing about people who loudly proclaim their love of books. Forget the suspicions kicked up by trumpeting something as universal as “books” as one’s true love (also loves: baby animals, pizza, oxygen); forget the anachronism of loving physical objects in space and not some “long read” floating in the ether; forget the self-congratulatory tone that hints at a closetful of book-festival tote bags emblazoned with Shakespeare’s face. Proudly championing books still counts as a true act of courage, a way of raging against the dying of the page.

In embracing the book as an object, a concept, a signifier, and a religion, though, one often forgets the texts that answer to the name of “book” these days. A perusal of the best-seller lists of the past two decades indicates that the most popular books might more accurately be described as billionaire-themed smut, extended blast of own-horn tooting, Sociology 101 textbook with sexy one-word title, unfocused partisan rant, 250-page-long stand-up routine, text version of Muppets Most Wanted with self-serious humans where the Muppets should be, folksy Christian sci-fi/fantasy, pseudohistorical rambling by non-historian, and simpleton wisdom trussed up in overpriced yoga pants.

And if we narrow our focus to the No. 1 spot on the New York Times’ hardcover-nonfiction best-seller list in the twenty years since Bookforum was first published, we discover an increasingly shrill, two-decade-long cry for help from the American people. As I Want to Tell You by O. J. Simpson (1995) and The Royals by Kitty Kelley (1997) yield to Dude, Where’s My Country? by Michael Moore (2003) and Plan of Attack by Bob Woodward (2004), you can almost see the support beams of the American dream tumbling sideways, the illusions of endless peace and rapidly compounding prosperity crumbling along with it. The leisurely service-economy daydreams of the late ’90s left us plenty of time to spend Tuesdays with Morrie and muse about The Millionaire Next Door or get worked up about The Day Diana Died. But such luxe distractions gave way to The Age of Turbulence, as our smug belief in the good life was crushed under the weight of 9/11, the Great Recession, and several murky and seemingly endless wars. Suddenly the world looked Hot, Flat, and Crowded, with the aggressively nostalgic waging an all-out Assault on Reason. In such a Culture of Corruption, if you weren’t Going Rogue you inevitably found yourself Arguing with Idiots.

Tracking this borderline-hysterical parade of titles can feel like watching America lose its religion in slow motion. Except, of course, this also meant there was a boom industry in patiently teaching faith-shaken Americans precisely how to believe again. Since the new century began, the top spot on the best-seller list most years has been all but reserved for these morale-boosting bromides: Seemingly every politician, blowhard, and mouthpiece willing to instruct us on how to reclaim our threadbare security blanket of patriotism, cultural supremacy, and never-ending growth and prosperity has turned up in that prestigious limelight. If that list is any indication, we’re desperate for something to ease our fears—or to feed directly into those fears with the kind of angry rhetoric that plays so well on cable news. (...)

And while it’s impossible to argue that people aren’t purchasing books by Laura Ingraham and Sean Hannity and Sarah Palin and Dick Cheney, it’s also difficult to imagine that people are actually reading these books from cover to cover. These are identity accessories more than books—the red-state equivalent of what a funky watch or rakishly arranged silk scarf might be in our notoriously shallow and decadent coastal metropolises. It also says something about the current state of liberal culture that the most popular blue-state authors—Franken, Jon Stewart, Stephen Colbert—are, accessory-wise, more like a bow tie that squirts water in your face. Solemn, poorly written tomes on everything that’s wrong with this country are on one side of the fence, ironic detachment, incredulity, and clown cars on the other. Or as Bill O’Reilly put it when he appeared on the Daily Show in 2004, “You’ve got stoned slackers watching your dopey show every night, and they can vote.”

Bemoaning the stoned slackers of the world might not be a wise choice for the author responsible for what we might term the “Mansplanations of History for Stoned Slackers” series: Killing Lincoln (2011), Killing Kennedy (2012), Killing Jesus (2013), and Killing Patton (2014), every single one of which rested comfortably in the No. 1 spot for weeks at a time. In fact, over the past two decades, O’Reilly has held the No. 1 spot with seven different books for forty-eight weeks total. How does he do it?

Here’s a clue: If mansplaining means “to comment on or explain something to a woman in a condescending, overconfident, and often inaccurate or oversimplified manner,” then O’Reilly clearly sees America as a suggestible (though fortunately profligate) woman in desperate need of a seemingly limitless amount of remedial mansplanation. And to be fair, if the most popular nonfiction books are a reliable guide, Americans crave mansplaining the way starving rats crave half-eaten hamburgers. We’d like Beck—not an education professor—to mansplain the Common Core to us. We want Malcolm Gladwell—not a neuroscientist or a sociologist or psychologist—to mansplain everything from the laws of romantic attraction to epidemiology. And we want O’Reilly—not an actual historian—to mansplain Lincoln, Kennedy, Jesus, and all of the other great mansplaining icons of history. We want mansplainers mansplaining other mansplainers. We dig hot mansplainer-on-mansplainer action.

by Heather Havrilesky, Bookforum |  Read more:
Image: The Princess Bride with uncredited modifications

Winning the Breakup in the Age of Instagram

[ed. How does stuff get so complicated.]

“Brett was there,” I Gchatted my friend Holly after running into a man who’d broken my heart six months earlier. “We ­actually had a nice chat. He was a mess though. Like, unshowered, smelled weird, was carrying an iPad in the waistband of his pants because he had nowhere to put it.” She asked me what I’d been wearing. Lipstick and heels, I replied. I’d been waiting for my new boyfriend, who picked me up and briefly met Brett.

“Oh my God,” Holly replied. “That is the ultimate ex encounter? He’s nice but looks like a mess. You look awesome and are with a new guy. You won.”

“Winning the breakup” may be a petty concept, but everyone who exits relationships regularly (or maybe just exited one very memorably) knows exactly what it means. The winner is the ex whose career skyrockets after the split; whose new wife is a supermodel; who looks better; who dates better; who has bouncier hair. It’s getting over your ex before she gets over you and leading a demonstratively successful life without her — but doing so in ways that at least look casual, just for yourself, definitely not just to rub it in her face, because you’re so over her, remember? And therein lies the Catch-22 of winning the breakup: To care about winning, you are forced to care about not caring about someone. Asked about her weekend plans, my 26-year-old friend Sam once replied, “I’m assembling a team of hotties to torture my ex on Instagram.”

To date actively is to be in a perpetual state of breakup. (Even in a best-case scenario, you are spared the breakup only once.) I’m 30, but already I feel like I’ve surpassed my lifetime limit for breakups — starting at age 18, hooking up in the dorms, I was already cohabitating with my significant others. In the past decade and change, I’ve had multiple multiyear relationships, which among my peers is a typical track record. For a time, social theorists believed my generation’s defining romantic feature was the hookup. But as hooking up rapidly expanded into a series of miniature marriages — and miniature divorces made more confounding by social-media omnipresence and cell-phone butt dials — I’ve come to think millennial romances are defined not by their casual beginnings but their disastrous ends. We aren’t the hookup generation; we’re the breakup generation. Today I find myself entering each subsequent relationship already anticipating its end — but is breakup dread a sign that the relationship is doomed, or does the dread actually cause the doom?

Inevitably, no two people ever can desire a breakup exactly equally. Which means at least one person comes out of it feeling like a loser — and as any résumé-padding overachiever knows, where there are losers there are also winners.

by Maureen O'Connor, NY Magazine | Read more:
Image: Islandpaps/Splash News