Thursday, November 5, 2015

What's the Best Way to Die?

[ed. See also: One in a million.]

After a particularly gruesome news story — ISIS beheadings, a multicar pileup, a family burnt in their beds during a house fire — I usually get to wondering whether that particular tragic end would be the worst way to go. The surprise, the pain, the fear of impending darkness.

But lately, I’ve been thinking that it’s the opposite question that begs to be asked: what’s the best way to die? Given hypothetical, anything-goes permission to choose from a creepy, unlimited vending machine of endings, what would you select?

If it helps, put yourself in that mindset that comes after a few glasses of wine with friends — your pal asks something dreamy, like where in the whole world you’d love to travel, or, if you could sleep with any celebrity, who would it be? Except this answer is even more personal.

There are lots of ways to look at the query. Would I want to know when I’m going to die, or be taken by surprise? (I mean, as surprising as such an inevitable event can be.) Would I want to be cognizant, so I can really experience dying as a process? Or might it be better to drowse my way through it?

Many surveys suggest that about three-quarters of Americans want to die at home, though the reality is that most Americans, upwards of 68 percent, will die in a hospital or other medicalized environment. Many also say they want to die in bed, but consider what that actually means: just lying there while your heart ticks away, your lungs heave to a stop. Lying around for too long also gets rather uncomfortable — as anyone who’s spent a lazy weekend in bed can tell you — and this raises a further question: should we expect comfort as we exit this life?

Sometimes I think getting sniped while walking down the street is the best way to go. Short, sweet, surprising; no worries, no time for pain. Sure, it’d be traumatic as hell for the people nearby, but who knows — your death might spark a social movement, a yearlong news story that launches media, legal, and criminal justice careers. What a death! It might mean something. Does that matter to you — that your death helps or otherwise changes other people’s lives? If there’s not a point to your death, you might wonder, was there a point to your life?

These are heavy questions — ahem, vital ones — that don’t seem to come up very often.

I got curious about how other people would answer this question, so I started asking colleagues and friends for their ideal death scenarios (yes, I’m a blast at parties). I heard a wide variety of answers. Skydiving while high on heroin for the second time (because you want to have fun the first time, according to a colleague). Drowning, because one friend had heard it was fairly peaceful once the panic clears. Storming a castle and felling enemies with a sword to save a woman, whom he then has appreciative sex with, just as he’s taking his dying breaths. (That poor gal!) An ex-boyfriend of mine used to say that the first time he lost bowel control, he’d drive to the Grand Canyon and jump off.

My own non-serious answer is to be tickled to death, sheerly for the punniness of it.

Anecdotally, young men were more fancy-free about their answers, while the older folks and women I spoke with gave more measured answers or sat quietly. Wait, what did you ask? I’d repeat the question. A pause. Hmm.

One old standby came up quite a lot: dying of old age in my bed, surrounded by family. The hospital nurses I asked had a twist on that trope: in bed, surrounded by family, and dying of kidney failure. Among nurses, there was consensus that this is the best way to go if you’re near death and in intensive care — you just fade out and pass, one ICU nurse told me. In the medical community, there’s debate about how calm death by kidney failure actually is, but really, who can you ask?

These answers are all interesting, but my nurse friend got me wondering about people who deal with death on the regular — what do they think about the best death? Do they think about it? Surely hospice workers, physicians, oncologists, “right-to-die” advocates, cancer-cell biologists, bioethicists, and the like have a special view on dying. What might their more-informed criteria be for my “best death” query?

I started with a concept that I think most can agree with — an ideal death should be painless.

Turns out, a painless death is a pretty American way to think about dying.

by Robyn K. Coggins, Wilson Quarterly | Read more:
Image: via:

Wednesday, November 4, 2015

How to Compliment a Guy

These are things I have said to women I don’t know:

“That bag is incroyable!” (Instead of incredible.)

“Your eyebrows look so amazing today. I feel like dying!”

“Wait, you put avocado on your face? Is it hard to be a genius?”

In order to be equal in all my doings, I recently set out to compliment men more. Do men even like compliments? On one hand, everyone always seems to appreciate the books, movies, and skyscrapers men make. But does anyone ever tell them nice things about their clothes and hair? It was time for someone (me) to go up to some men and try it. Without objectifying them, because that would be bad for their self-esteem.

The first guy I complimented was in my grocery store. I was buying coffee, and he had on some cool Elvis Costello–style glasses. “Nice glasses!” I said. He smiled and nudged the girl next to him, who was talking to some other people. She turned around, and I could see that she was wearing awesome glasses, even better than his. Then the girl said to me, “He’s happy because everyone always compliments my glasses.” I felt bad about not complimenting her, so I left.

I kept looking for men to compliment. I saw one dude wearing cool sneakers and another guy carrying a really cute yellow-and-white-striped beach bag (?!), but it’s difficult to just yell compliments at a man while he’s walking down the street. That could make him feel unsafe!

The next time I complimented a man, I was in a pharmacy and I did actually yell at him, but it was mostly because of circumstances. This man had a gigantic watch on that literally was covering his entire wrist. He was also wearing a tremendously tight blue cashmere V-neck sweater over a plaid shirt, legginglike jeans, and little brown slippers like a Viennese dancer. To my mind, he was really asking for the attention.

“Nice watch!” I screamed, because he was walking away from me. He turned toward me with a look of disgust.

This, however, was my only negative reaction. Most men seem extremely pleased when you compliment them on anything that has to do with their clothing or hair. It’s uncanny: When you say anything even slightly nice to a man, his face will melt into a grateful sunburst of a grin, as if he had never been complimented before. I complimented a guy at my gym on his sneakers, and he looked as if he was going to weep.

But, of course, these are not the only kind of compliments there are. You can also compliment a man on his ability to do activities. I must say, this other type of compliment didn’t even occur to me at first because, in general, girls don’t really compliment one another about how well they are doing tasks. They’re just like, “What an amazing pair of boots. Are you French?”

When I started to compliment men on their abilities, it engendered a slightly different reaction. Instead of weeping with gratitude, they started to act all proud and almost annoyed. For example, I rented a Zipcar to go see the demolished Great Gatsby mansion on Sands Point, and the guy working in the parking garage had to maneuver the car out of a very confined space. I was impressed. “You are an incredible driver!” I said, not even thinking about my man-complimenting project. The guy walked away as if he didn’t hear me. Later, a different man expertly mounted my TV. “That’s the most beautiful TV I’ve ever seen,” I told him. “It looks like a picture on a wall.” He simply nodded.

by Rebecca Harrington, NY Magazine | Read more:
Image: Philip Gendreau/Bettmann/Corbis

The Lure of Luxury

Why would anyone spend thousands of dollars on a Prada handbag, an Armani suit, or a Rolex watch? If you really need to know the time, buy a cheap Timex or just look at your phone and send the money you have saved to Oxfam. Certain consumer behaviors seem irrational, wasteful, even evil. What drives people to possess so much more than they need?

Maybe they have good taste. In her wonderful 2003 book The Substance of Style, Virginia Postrel argues that our reaction to many consumer items is “immediate, perceptual, and emotional.” We want these things because of the pleasure we get from looking at and interacting with high-quality products—and there is nothing wrong with this. “Decoration and adornment are neither higher nor lower than ‘real’ life,” she writes. “They are part of it.”

Postrel is pushing back against a more cynical theory held by many sociologists, economists, and evolutionary theorists. Building from the insights of Thorstein Veblen, they argue that we buy such things as status symbols. Though we are often unaware of it and might angrily deny it, we are driven to accumulate ostentatious goods to impress others. Evolutionary psychologist Geoffrey Miller gives this theory an adaptationist twist, arguing that the hunger for these luxury goods is a modern expression of the evolved desire to signal attractive traits—such as intelligence, ambition, and power—to entice mates: Charles Darwin’s sexual selection meets Veblen’s conspicuous consumption.

Signaling is a theory with broad scope—it has been applied to everything from self-mutilating behavior to the fact that the best private schools teach dead languages—but it is most blatant in the consumer world. Advertisements are often pure signaling fantasies. Your neighbors gasp as your car drives by; the attractive stranger in a bar is aroused by your choice of beer; your spouse and children love you because you bought the right brand of frozen pizza. Consistent with this, neuroscience studies reveal that when people look at products they judge to be “cool,” brain areas associated with praise and social approval are activated.

If such purchases are motivated by status enhancement, they become positional goods: their value is determined by what other people possess. This inspires a powerful critique of consumerism. Status is a zero-sum game, and just as countries in a literal arms race have to strip away resources from domestic priorities, the figurative arms race that economist Robert H. Frank calls “luxury fever” takes away from individual consumers money that would be better spent on more substantial goods, such as socializing and travel. It is hard for people to opt out. To say that an individual can simply refuse to participate is like saying that countries in a literal arms race can choose to stop buying all those fighter planes and put the money into school lunches and Shakespeare in the Park. Sure they can—if they don’t mind being invaded. If everyone else buys fancy suits for their job interviews, then I risk unemployment by choosing not to.

We would be better off, then, if some Leviathan could force us to disarm, so Miller, Frank, and others argue that the government should step in. A policy aimed at curbing luxury shopping might involve higher marginal tax rates or, as a more targeted intervention, a consumption tax. As it becomes harder to afford a Rolex, people will devote more money to pleasures that really matter. Less waste, more happiness.

Now, only a philistine would deny Postrel’s point that some consumer preferences are aesthetic, even sensual. And only a rube would doubt that some people buy some luxury items to impress colleagues, competitors, spouses, and lovers. Perhaps we can divvy up the consumer world. An appreciation of beauty explains certain accessible and universal consumer pleasures—Postrel begins her book in Kabul after the Taliban fell, describing how the women there reveled in their freedom to possess burkas of different colors and to paint their nails—while signaling theory applies to the more extravagant purchases. A crimson burka? Aesthetics. A $30,000 watch? Signaling. Aristotle Onassis’s choice to upholster the bar stools in his yacht with whale foreskin? Definitely signaling.

I don’t think any of this is mistaken. But it is seriously incomplete. There is a further explanation for our love of such goods, which draws upon one of the most interesting ideas in the cognitive sciences: that humans are not primarily sensory creatures. Rather, we respond to what we believe are objects’ deeper properties, including their histories. Sensory properties are relevant and so is signaling, but the pleasure we get from the right sort of history explains much of the lure of luxury items—and of more mundane consumer items as well.

by Paul Bloom, Boston Review |  Read more:
Image: scion_cho

The Decay of Twitter


[ed. See also: Twitter unfaves itself]

On Tuesday, Twitter Inc. announced another dreary set of quarterly earnings. While the company beat investor expectations, it’s still running at a loss of $132 million after taxes. Its fourth-quarter projections seem low. Worst of all, its namesake product has essentially failed to add any active American users in 2015.

Twitter stock fell more than 10 percent after the announcement.

Since it went public two years ago, investors have rarely considered Twitter’s prospects rosy. The sliver of Twitter’s users who are interested in how it fares as a corporation have gotten used to this, I think: There’s an idea you see floating around that, beyond avoiding bankruptcy, Twitter’s financial success has little bearing on its social utility. Maybe there are only 320 million humans interested in seeing 140-character updates from their friends every day after all. If you make a website that 4 percent of the world’s population finds interesting enough to peek at every month, you shouldn’t exactly feel embarrassed.

Yet the two entities that are called “Twitter”—the San Francisco-based corporation and the character-delimited social network—are not entirely disconnected. And similarly, no matter how many features Twitter-the-company tacks on to draw in new people, it’s still captive to the whims of Twitter-the-network. Recently, I’ve started to wonder if the corporation is trapped in more than a nominal way. What if the network is one of the company’s greatest obstacles, especially when it comes to growth?

Talking about Twitter the Network is hard. I’ve tried it once before, when my colleague Adrienne LaFrance and I tried to describe how English-language, U.S. Twitter of April 2014 differed from the equivalent Twitter of two years prior. Eighteen months on, I think our effort missed a lot. But I do think we noticed that the social network was slipping into something like midlife. It sometimes feels like Instagram, for instance, is the only social network that people actually still love to use.

But I’m still talking in terms of feel: a biased, decidedly non-precise way of discussing something which emerges from more than 300 million minds. And that’s why I like one theory of what’s changed about Twitter from the Canadian academic Bonnie Stewart. I think it makes clear why Twitter the Company is finding such difficulty attracting new users, especially in the United States. And I think it even helps answer the Instagram question, namely: Why is Instagram (or Vine, or Pinterest) so much more fun than Twitter?

The only problem: To talk about Stewart’s theory, you have to first tackle the ideas of the 20th-century philosopher of media, Walter J. Ong.

by Robinson Meyer, The Atlantic |  Read more:
Image:  Esaias van de Velde

Inverted Totalitarianism

[ed. See also: The Gathering Financial Storm Is Just One Effect of Corporate Power Unbound]

Sheldon Wolin, our most important contemporary political theorist, died Oct. 21 at the age of 93. In his books “Democracy Incorporated: Managed Democracy and the Specter of Inverted Totalitarianism” and “Politics and Vision,” a massive survey of Western political thought that his former student Cornel West calls “magisterial,” Wolin lays bare the realities of our bankrupt democracy, the causes behind the decline of American empire and the rise of a new and terrifying configuration of corporate power he calls “inverted totalitarianism.” (...)

Wolin throughout his scholarship charted the devolution of American democracy and in his last book, “Democracy Incorporated,” details our peculiar form of corporate totalitarianism. “One cannot point to any national institution[s] that can accurately be described as democratic,” he writes in that book, “surely not in the highly managed, money-saturated elections, the lobby-infested Congress, the imperial presidency, the class-biased judicial and penal system, or, least of all, the media.”

Inverted totalitarianism is different from classical forms of totalitarianism. It does not find its expression in a demagogue or charismatic leader but in the faceless anonymity of the corporate state. Our inverted totalitarianism pays outward fealty to the facade of electoral politics, the Constitution, civil liberties, freedom of the press, the independence of the judiciary, and the iconography, traditions and language of American patriotism, but it has effectively seized all of the mechanisms of power to render the citizen impotent.

“Unlike the Nazis, who made life uncertain for the wealthy and privileged while providing social programs for the working class and poor, inverted totalitarianism exploits the poor, reducing or weakening health programs and social services, regimenting mass education for an insecure workforce threatened by the importation of low-wage workers,” Wolin writes. “Employment in a high-tech, volatile, and globalized economy is normally as precarious as during an old-fashioned depression. The result is that citizenship, or what remains of it, is practiced amidst a continuing state of worry. Hobbes had it right: when citizens are insecure and at the same time driven by competitive aspirations, they yearn for political stability rather than civic engagement, protection rather than political involvement.”

Inverted totalitarianism, Wolin said when we met at his home in Salem, Ore., in 2014 to film a nearly three-hour interview, constantly “projects power upwards.” It is “the antithesis of constitutional power.” It is designed to create instability to keep a citizenry off balance and passive.

He writes, “Downsizing, reorganization, bubbles bursting, unions busted, quickly outdated skills, and transfer of jobs abroad create not just fear but an economy of fear, a system of control whose power feeds on uncertainty, yet a system that, according to its analysts, is eminently rational.”

Inverted totalitarianism also “perpetuates politics all the time,” Wolin said when we spoke, “but a politics that is not political.” The endless and extravagant election cycles, he said, are an example of politics without politics.

“Instead of participating in power,” he writes, “the virtual citizen is invited to have ‘opinions’: measurable responses to questions predesigned to elicit them.”

Political campaigns rarely discuss substantive issues. They center on manufactured political personalities, empty rhetoric, sophisticated public relations, slick advertising, propaganda and the constant use of focus groups and opinion polls to loop back to voters what they want to hear. Money has effectively replaced the vote. Every current presidential candidate—including Bernie Sanders—understands, to use Wolin’s words, that “the subject of empire is taboo in electoral debates.” The citizen is irrelevant. He or she is nothing more than a spectator, allowed to vote and then forgotten once the carnival of elections ends and corporations and their lobbyists get back to the business of ruling.

“If the main purpose of elections is to serve up pliant legislators for lobbyists to shape, such a system deserves to be called ‘misrepresentative or clientry government,’ ” Wolin writes. “It is, at one and the same time, a powerful contributing factor to the depoliticization of the citizenry, as well as reason for characterizing the system as one of antidemocracy.”

The result, he writes, is that the public is “denied the use of state power.” Wolin deplores the trivialization of political discourse, a tactic used to leave the public fragmented, antagonistic and emotionally charged while leaving corporate power and empire unchallenged.

“Cultural wars might seem an indication of strong political involvements,” he writes. “Actually they are a substitute. The notoriety they receive from the media and from politicians eager to take firm stands on nonsubstantive issues serves to distract attention and contribute to a cant politics of the inconsequential.”

“The ruling groups can now operate on the assumption that they don’t need the traditional notion of something called a public in the broad sense of a coherent whole,” he said in our meeting. “They now have the tools to deal with the very disparities and differences that they have themselves helped to create. It’s a game in which you manage to undermine the cohesiveness that the public requires if they [the public] are to be politically effective. And at the same time, you create these different, distinct groups that inevitably find themselves in tension or at odds or in competition with other groups, so that it becomes more of a melee than it does become a way of fashioning majorities.”

In classical totalitarian regimes, such as those of Nazi fascism or Soviet communism, economics was subordinate to politics. But “under inverted totalitarianism the reverse is true,” Wolin writes. “Economics dominates politics—and with that domination comes different forms of ruthlessness.”

He continues: “The United States has become the showcase of how democracy can be managed without appearing to be suppressed.”

The corporate state, Wolin told me, is “legitimated by elections it controls.” To extinguish democracy, it rewrites and distorts laws and legislation that once protected democracy. Basic rights are, in essence, revoked by judicial and legislative fiat. Courts and legislative bodies, in the service of corporate power, reinterpret laws to strip them of their original meaning in order to strengthen corporate control and abolish corporate oversight.

He writes: “Why negate a constitution, as the Nazis did, if it is possible simultaneously to exploit porosity and legitimate power by means of judicial interpretations that declare huge campaign contributions to be protected speech under the First Amendment, or that treat heavily financed and organized lobbying by large corporations as a simple application of the people’s right to petition their government?”

Our system of inverted totalitarianism will avoid harsh and violent measures of control “as long as ... dissent remains ineffectual,” he told me. “The government does not need to stamp out dissent. The uniformity of imposed public opinion through the corporate media does a very effective job.”

And the elites, especially the intellectual class, have been bought off. “Through a combination of governmental contracts, corporate and foundation funds, joint projects involving university and corporate researchers, and wealthy individual donors, universities (especially so-called research universities), intellectuals, scholars, and researchers have been seamlessly integrated into the system,” Wolin writes. “No books burned, no refugee Einsteins.”

But, he warns, should the population—steadily stripped of its most basic rights, including the right to privacy, and increasingly impoverished and bereft of hope—become restive, inverted totalitarianism will become as brutal and violent as past totalitarian states. “The war on terrorism, with its accompanying emphasis upon ‘homeland security,’ presumes that state power, now inflated by doctrines of preemptive war and released from treaty obligations and the potential constraints of international judicial bodies, can turn inwards,” he writes, “confident that in its domestic pursuit of terrorists the powers it claimed, like the powers projected abroad, would be measured, not by ordinary constitutional standards, but by the shadowy and ubiquitous character of terrorism as officially defined.”

by Chris Hedges, Truthdig |  Read more:
Image: Democracy Inc. Amazon

Tuesday, November 3, 2015


Robert Crumb
via:

Neil Young

Streaming Wars

TV executives see cord-cutters as a strange, exotic species that must be observed and scrutinized. I learned this while seated between several high-ranking executives from the country’s largest broadcast and cable companies one evening this summer.

Once I outed myself as a cord-cutter (actually worse, a cord-never!), my purchase decisions became the dominant topic of conversation. I may as well have said I hunt zombies in my spare time. They bombarded me with questions.

What do I watch? How do I watch? I must not like sports, they correctly noted. What do I subscribe to? Hulu and Netflix?! And HBO Now? And Spotify? And Amazon Prime Video?! Don’t I know that the price of all those subscriptions adds up to much more than a basic cable package? Didn’t I know how irrational I was?

It was maddening to them, and I understand why. But consumers, myself included, can be irrational. Just ask JCPenney, which in 2013 stopped inflating prices for the charade of coupons and deep discounts, and nearly tanked its business in the process. Turns out customers preferred the charade.

Likewise, it’s increasingly common for cord-cutters like myself to cobble together an array of separate on-demand subscription services in lieu of a traditional cable subscription. Media companies, including 21st Century Fox, Viacom, CBS, Time Warner, Discovery, and Walt Disney, forecasted decreases in cable subscriptions this summer, and their stock prices were hammered as a result.

While TV execs criticize cord-cutters for irrational purchase decisions, the digital streaming services are in an all-out war for their money. The competition means more and better choices, making a basic cable package less attractive by the day.

by Erin Griffith, Fortune |  Read more:
Image: Beck Diefenbach — Reuters

Death Rates Rising for Middle-Aged White Americans

Something startling is happening to middle-aged white Americans. Unlike every other age group, unlike every other racial and ethnic group, unlike their counterparts in other rich countries, death rates in this group have been rising, not falling.

That finding was reported Monday by two Princeton economists, Angus Deaton, who last month won the 2015 Nobel Memorial Prize in Economic Science, and Anne Case. Analyzing health and mortality data from the Centers for Disease Control and Prevention and from other sources, they concluded that rising annual death rates among this group are being driven not by the big killers like heart disease and diabetes but by an epidemic of suicides and afflictions stemming from substance abuse: alcoholic liver disease and overdoses of heroin and prescription opioids.

The analysis by Dr. Deaton and Dr. Case may offer the most rigorous evidence to date of both the causes and implications of a development that has been puzzling demographers in recent years: the declining health and fortunes of poorly educated American whites. In middle age, they are dying at such a high rate that they are increasing the death rate for the entire group of middle-aged white Americans, Dr. Deaton and Dr. Case found.

The mortality rate for whites 45 to 54 years old with no more than a high school education increased by 134 deaths per 100,000 people from 1999 to 2014.

“It is difficult to find modern settings with survival losses of this magnitude,” wrote two Dartmouth economists, Ellen Meara and Jonathan S. Skinner, in a commentary to the Deaton-Case analysis to be published in Proceedings of the National Academy of Sciences.

“Wow,” said Samuel Preston, a professor of sociology at the University of Pennsylvania and an expert on mortality trends and the health of populations, who was not involved in the research. “This is a vivid indication that something is awry in these American households.”

Dr. Deaton had but one parallel. “Only H.I.V./AIDS in contemporary times has done anything like this,” he said.

by Gina Kolata, NY Times |  Read more:
Image: Ben Solomon

Atsuo (Dazai Atsuo) and S. Riyo
via: here and here

Monday, November 2, 2015

Depression Modern

[ed. Haven't seen this one yet but now I'm intrigued.]

The second season of “The Leftovers,” on HBO, begins with a mostly silent eight-minute sequence, set in a prehistoric era. We hear a crackle, then see red-and-black flames and bodies sleeping around a fire; among them is a pregnant woman, nearly naked. She rises, stumbles from her cave, then squats and pisses beneath the moon—only to be startled by a terrifying rumble, an earthquake that buries her home. When she gives birth, we see everything: the flood of amniotic fluid, the head crowning, teeth biting the umbilical cord. For weeks, she struggles to survive, until finally she dies in agony, bitten by a snake that she pulled off her child. Another woman rescues the baby, her face hovering like the moon. Only then does the camera glide down the river, to where teen-age girls splash and laugh. We are suddenly in the present, with no idea how we got there.

It takes serious brass to start your new season this way: the main characters don’t even show up until midway through the hour. With no captions or dialogue, and no clear link to the first season’s story, it’s a gambit that might easily veer into self-indulgence, or come off as second-rate Terrence Malick. Instead, almost magically, the sequence is ravishing and poetic, sensual and philosophical, dilating the show’s vision outward like a telescope’s lens. That’s the way it so often has been with this peculiar, divisive, deeply affecting television series, Damon Lindelof’s first since “Lost.” Lindelof, the co-creator, and his team (which includes Tom Perrotta, the other co-creator, who wrote the novel on which the show is based; the religious scholar Reza Aslan, a consultant; and directors such as Mimi Leder) persist in dramatizing the grandest of philosophical notions and addressing existential mysteries—like the origins of maternal love and loss—without shame, thus giving the audience permission to react to them in equally vulnerable ways. They’re willing to risk the ridiculous in search of something profound.

At heart, “The Leftovers” is about grief, an emotion that is particularly hard to dramatize, if only because it can be so burdensome and static. The show, like the novel, is set a few years after the Departure, a mysterious event in which, with no warning, two per cent of the world’s population disappears. Celebrities go; so do babies. Some people lose their whole family, others don’t know anyone who has “departed.” The entire cast of “Perfect Strangers” blinks out (though, in a rare moment of hilarity, Mark Linn-Baker turns out to have faked his death). Conspiracy theories fly, people lose their religion or become fundamentalists—and no one knows how to feel. The show’s central family, the Garveys, who live in Mapleton, New York, appear to have lost no one, yet they’re emotionally shattered. Among other things, the mother, Laurie (an amazing Amy Brenneman, her features furrowed with disgust), joins a cult called the Guilty Remnant, whose members dress in white, chain-smoke, and do not speak. They stalk the bereaved, refusing to let anyone move on from the tragedy. Her estranged husband, Kevin (Justin Theroux), the chief of police, has flashes of violent instability; their teen-age children drift away, confused and alarmed.

That’s the plot, but the series is often as much about images (a girl locked in a refrigerator, a dog that won’t stop barking) and feelings (fury, suicidal alienation) as about events; it dives into melancholy and the underwater intensity of the grieving mind without any of the usual relief of caperlike breakthroughs. Other cable dramas, however ambitious, fuel themselves on the familiar story satisfactions of brilliant iconoclasts taking risks: cops, mobsters, surgeons, spies. “The Leftovers” is structured more like explorations of domestic intimacy such as “Friday Night Lights,” but marinated in anguish and rendered surreal. The Departure itself is a simple but highly effective metaphor. In the real world, of course, people disappear all the time: the most ordinary death can feel totally bizarre and inexplicable, dividing the bereaved as often as it brings them closer. But “The Leftovers” is more expansive than that, evoking, at various moments, New York after 9/11, and also Sandy Hook, Charleston, Indonesia, Haiti, and every other red-stringed pin on our pre-apocalyptic map of trauma. At its eeriest, the show manages to feel both intimate and world-historical: it’s a fable about a social catastrophe threaded into the story of a lacerating midlife divorce.

The first season of “The Leftovers” rose and fell in waves: a few elements (like a plot about the Garveys’ son, who becomes a soldier in a separate cult) felt contrived, while others (especially the violent clashes between the Guilty Remnant and the bereaved residents of Mapleton) were so raw that the show could feel hard to watch. But halfway through Season 1 “The Leftovers” spiked into greatness, with a small masterpiece of an episode. In “Guest,” a seemingly minor character named Nora Durst (Carrie Coon), a Mapleton resident who has the frightening distinction of having lost her entire family—a husband and two young children—stepped to the story’s center. In one hour, we learned everything about her: what she does for work (collects “survivor” questionnaires for an organization searching for patterns), what she does at home (obsessively replaces cereal boxes, as if her family were still alive), and what she does for catharsis (hires a prostitute to shoot her in the chest while she’s wearing a bulletproof vest). She travels to New York for a conference, where her identity gets stripped away in bizarre fashion. But, as with that prehistoric opener, the revelations are delivered through montages, which drag, then speed up, revealing without overexplaining, grounded in Coon’s brilliantly unsentimental, largely silent performance. When the episode was over, I was weeping, which happens a lot with “The Leftovers.” It may be the whole point.

by Emily Nussbaum, New Yorker | Read more:
Image: Emiliano Ponzi 

A 'Huge Milestone' in Cancer Treatment

A new cancer treatment strategy is on the horizon that experts say could be a game-changer and spare patients the extreme side effects of existing options such as chemotherapy.

Chemotherapy and other current cancer treatments are brutal, scorched-earth affairs that work because cancer cells are slightly – but not much – more susceptible to the havoc they wreak than the rest of the body. Their side effects are legion, and in many cases horrifying – from hair loss and internal bleeding to chronic nausea and even death.

But last week the Food and Drug Administration (FDA) for the first time approved a single treatment that can intelligently target cancer cells while leaving healthy ones alone, and simultaneously stimulate the immune system to fight the cancer itself.

The treatment, which is called T-VEC (for talimogene laherparepvec) but will be sold under the brand name Imlygic, uses a modified virus to hunt cancer cells in what experts said was an important and significant step in the battle against the deadly disease.

It works by introducing a specially modified form of the herpes virus by injection directly into a tumour – specifically skin cancer, the indication for which the drug has been cleared for use.

It was developed by the Massachusetts-based biotech company BioVex, which was acquired in 2011 by biotech behemoth Amgen for $1bn. The genetic code of the virus – which was originally taken from the cold sore of a BioVex employee – has been modified so it can kill only cancer cells.

Cancer-hunting viruses have long been thought of as a potential source of a more humane and targeted treatment for cancer. Unlike current oncological treatments like chemotherapy and radiotherapy, which kill cancer cells but also damage the rest of the body, viruses can be programmed to attack only the cancer cells, leaving patients to suffer the equivalent of just a day or two’s flu.

Treatments such as Imlygic have two modes of action: first, the virus directly attacks the cancer cells; and second, it triggers the body’s immune system to attack the rogue cells too once it detects the virus’s presence.

Dr Stephen Russell, a researcher at the Mayo Clinic who specialises in oncolytic virotherapy – as these treatments are known – says that the FDA’s clearance of Imlygic represents “a huge milestone” in cancer treatment development.

Viruses are “nature’s last untapped bioresource”, Russell said. Imlygic’s official results from its clinical studies are fairly modest – an average lifespan increase of less than five months. But beneath that data, Russell said anecdotally that in his Mayo Clinic studies in mice, some programmable viruses have led to “large tumours completely disappearing”.

The goal, he said, was to get to the point where the clinical trials would see similarly dramatic outcomes, so that chemotherapy and radiotherapy could finally be consigned to medical history.

by Nicky Woolf, The Guardian |  Read more:
Image: Stocktrek Images, Inc./Alamy Stock Photo

The Folklore Of Our Times

I was born in 1949. I started high school in 1963 and went to college in 1967. And so it was amid the crazy, confused uproar of 1968 that I saw in my otherwise auspicious twentieth year. Which, I guess, makes me a typical child of the sixties. It was the most vulnerable, most formative, and therefore most important period in my life, and there I was, breathing in deep lungfuls of abandon and quite naturally getting high on it all. I kicked in a few deserving doors - and what a thrill it was whenever a door that deserved kicking in presented itself before me, as Jim Morrison, the Beatles and Bob Dylan played in the background. The whole shebang.

Even now, looking back on it all, I think that those years were special. I'm sure that if you were to examine the attributes of the time one by one, you wouldn't discover anything all that noteworthy. Just the heat generated by the engine of history, that limited gleam that certain things give off in certain places at certain times - that and a kind of inexplicable antsiness, as if we were viewing everything through the wrong end of a telescope. Heroics and villainy, rapture and disillusionment, martyrdom and revisionism, silence and eloquence, etcetera, etcetera... the stuff of any age. Only, in our day - if you'll forgive the overblown expression - it was all so colourful somehow, so very reach-out-and-grab-it palpable. There were no gimmicks, no discount coupons, no hidden advertising, no keep-'em-coming point-card schemes, no insidious, loopholing paper trails. Cause and effect shook hands; theory and reality embraced with aplomb. A prehistory to high capitalism: that's what I personally call those years.

But as to whether the era brought us - my generation, that is - any special radiance, well, I'm not so sure. In the final analysis, perhaps we simply passed through it as if we were watching an exciting movie: we experienced it as real - our hearts pounded, our palms sweated - but when the lights came on we just walked out of the cinema and picked up where we'd left off. For whatever reason, we neglected to learn any truly valuable lesson from it all. Don't ask me why. I am much too deeply bound up in those years to answer the question. There's just one thing I'd like you to understand: I'm not the least bit proud that I came of age then; I'm simply reporting the facts.

Now let me tell you about the girls. About the mixed-up sexual relations between us boys, with our brand new genitals, and the girls, who at the time were, well, still girls.

But, first, about virginity. In the sixties, virginity held a greater significance than it does today. As I see it - not that I've ever conducted a survey - about 50% of the girls of my generation were no longer virgins by the age of 20. Or, at least, that seemed to be the ratio in my general vicinity. Which means that, consciously or not, about half the girls around still revered this thing called virginity.

Looking back now, I'd say that a large portion of the girls of my generation, whether virgins or not, had their share of inner conflicts about sex. It all depended on the circumstances, on the partner. Sandwiching this relatively silent majority were the liberals, who thought of sex as a kind of sport, and the conservatives, who were adamant that girls should stay virgins until they were married.

Among the boys, there were also those who thought that the girl they married should be a virgin.

People differ, values differ. That much is constant, no matter what the period. But the thing about the sixties that was totally unlike any other time is that we believed that those differences could be resolved.

This is the story of someone I knew. He was in my class during my senior year of high school in Kobe, and, frankly, he was the kind of guy who could do it all. His grades were good, he was athletic, he was considerate, he had leadership qualities. He wasn't outstandingly handsome, but he was good-looking in a clean-cut sort of way. He could even sing. A forceful speaker, he was always the one to mobilise opinion in our classroom discussions. This didn't mean that he was much of an original thinker - but who expects originality in a classroom discussion? All we ever wanted was for it to be over as quickly as possible, and if he opened his mouth we were sure to be done on time. In that sense, you could say that he was a real friend.

There was no faulting him. But then again I could never begin to imagine what went on in his mind. Sometimes I felt like unscrewing his head and shaking it, just to see what kind of sound it would make. Still, he was very popular with the girls. Whenever he stood up to say something in class, all the girls would gaze at him admiringly. Any maths problem they didn't understand, they'd take to him. He must have been twenty-seven times more popular than I was. He was just that kind of guy.

We all learn our share of lessons from the textbook of life, and one piece of wisdom I've picked up along the way is that you just have to accept that in any collective body there will be such types. Needless to say, though, I personally wasn't too keen on his type. I guess I preferred, I don't know, someone more flawed, someone with a more unusual presence. So in the course of an entire year in the same class I never once hung out with the guy. I doubt that I even spoke to him. The first time I ever had a proper conversation with him was during the summer vacation after my freshman year of college. We happened to be attending the same driving school, and we'd chat now and then, or have coffee together during the breaks. That driving school was such a bore that I'd have been happy to kill time with any acquaintance I ran into. I don't remember much about our conversations; whatever we talked about, it left no impression, good or bad.

The other thing I remember about him is that he had a girlfriend. She was in a different class, and she was hands down the prettiest girl in the school. She got good grades, but she was also an athlete, and she was a leader - like him, she had the last word in every class discussion. The two of them were simply made for each other: Mr and Miss Clean, like something out of a toothpaste commercial.

I'd see them around. Every lunch hour, they sat in a corner of the schoolyard, talking. After school, they rode the train home together, getting off at different stations. He was on the soccer team, and she was in the English conversation club. When their extra-curricular activities weren't over at the same time, the one who finished first would go and study in the library. Any free time they had they spent together.

None of us - in my crowd - had anything against them. We didn't make fun of them, we never gave them a hard time; in fact, we hardly paid any attention to them at all. They really didn't give us much to speculate about. They were like the weather - just there, a physical fact. Inevitably, we spent our time talking about the things that interested us more: sex and rock and roll and Jean-Luc Godard films, political movements and Kenzaburo Oe novels, things like that. But especially sex.

OK, we were ignorant and full of ourselves. We didn't have a clue about life. But, for us, Mr and Miss Clean existed only in their Clean world. Which probably means that the illusions we entertained back then and the illusions they embraced were, to some extent, interchangeable.

This is their story. It's not a particularly happy story, nor, by this point in time, is it one with much of a moral. But no matter: it's our story as much as theirs. Which, I guess, makes it a form of cultural history. Suitable material for me to collect and relate here - me, the insensitive folklorist.

by Haruki Murakami, The Guardian |  Read more:
Image: Wikipedia

The Big Bush Question

Poor Jeb. Or I should say, Poor Jeb! (I’m not given to exclamation points, but Jeb! is so magnetic.) It’s unfathomable how he thought that he could run for the Republican nomination without having to wrestle with his brother’s record as president.   [ed. Jeb! has a new slogan: Jeb Can Fix It.  lol]

Soon enough, he was so entangled in the question of whether he would have gone into Iraq, knowing what we know now, that it took him four tries to come up with the currently politically acceptable answer: No. But while the war in Iraq is widely accepted to have been a disastrous mistake, another crucial event during the George W. Bush administration has long been considered unfit for political discussion: President Bush’s conduct, in the face of numerous warnings of a major terrorist plot, in the months leading up to September 11, 2001.

The general consensus seems to have been that the 9/11 attacks were so horrible, so tragic, that to even suggest that the president at the time might bear any responsibility for not taking enough action to try to prevent them is to play “politics,” and to upset the public. And so we had a bipartisan commission examine the event and write a report; we built memorials at the spots where the Twin Towers had come down and the Pentagon was attacked; and that was to be that. And then along came Donald Trump, to whom “political correctness” is a relic of an antiquated, stuffy, political system he’s determined to overwhelm. In an interview on October 16, he violated the longstanding taboo by saying, “When you talk about George Bush—I mean, say what you want, the World Trade Center came down during his time.”

Trump’s comments set up a back and forth between him and Jeb Bush—who, as Trump undoubtedly anticipated, can’t let a blow against him by the frontrunner go by without response—but the real point is that with a simple declaration by Trump, there it was: the subject of George W. Bush’s handling of the warnings about the 9/11 attacks was out there.

Jeb Bush had already left himself open to this charge by saying that his brother had “kept us safe.” Now he has insisted on this as his response to Trump. But the two men were talking about different periods of time. As Jeb Bush said later, “We were attacked, my brother kept us safe.” That’s true enough in Jeb’s framing of the issue as what happened after the attacks—and if one’s concept of safe means fighting two terrible wars whose effects continue to play out in the Middle East; continual reports of terrorist plots and panicked responses to them; invasive searches at airports; and greatly expanded surveillance.

But that’s not the heart of the matter. The heretofore hushed-up public policy question that Trump stumbled into is: Did George W. Bush do what he could have to try to disrupt the terrorist attacks on September 11, 2001? It’s not simply a question of whether he could have stopped the devastation—that’s unknowable. But did he do all he could given the various warnings that al-Qaeda was planning a major attack somewhere on US territory, most likely New York or Washington? The unpleasant, almost unbearable conclusion—one that was not to be discussed within the political realm—is that in the face of numerous warnings of an impending attack, Bush did nothing.

Osama bin Laden was already a wanted man when the Bush administration took office. The Clinton administration had identified him as the prime suspect in the 1998 bombings of two US embassies in Africa, and it took a few steps to capture or kill him that came to naught. Outgoing Clinton officials warned the incoming administration about al-Qaeda, but the repeated and more specific warnings by Richard Clarke, who stayed on from one administration to the next as the chief terrorism adviser, were ignored. In a White House meeting on July 5, 2001, Clarke said, “Something really spectacular is going to happen here, and it’s going to happen soon.” By this time top Bush officials regarded Clarke as a pain, who kept going on about terrorist plots against the US.

But Clarke wasn’t the only senior official sounding an alarm. On July 10, CIA Director George Tenet, having just received a briefing from a deputy that “literally made my hair stand on end,” phoned National Security Adviser Condoleezza Rice to ask for a special meeting at the White House. “I can recall no other time in my seven years as DCI that I sought such an urgent meeting at the White House,” Tenet later wrote in his book, At the Center of the Storm. Tenet and aides laid out for Rice what they described as irrefutable evidence that, as the lead briefer put it at that meeting, “There will be a significant terrorist attack in the coming weeks or months” and that the attack would be “spectacular.” Tenet believed that the US was going to get hit, and soon. But the intelligence authorities, including covert action, that the CIA officials told Rice they needed, and had been asking for since March, weren’t granted until September 17.

Then came the now-famous August 6 Presidential Daily Brief (PDB) intelligence memorandum to the president, headlined, “Bin Laden Determined To Strike in US.” Bush was at his ranch in Crawford, Texas, on what was to be one of the longest summer vacations any president has taken; none of his senior aides was present for the briefing. Rice later described this PDB as “very vague” and “very non-specific” and “mostly historical.” It was only after a great struggle that the 9/11 commission got it declassified and the truth was learned. In its final report, the commission noted that this was the thirty-sixth Presidential Daily Brief so far that year related to al-Qaeda and bin Laden, though the first one that specifically warned of an attack on the US itself.

While the title of the memo has become somewhat familiar, less known are its contents, including the following: “Clandestine, foreign government, and media reports indicate bin Laden since 1997 has wanted to conduct terrorist attacks in the US. Bin Laden implied in U.S. television interviews in 1997 and 1998 that his followers would follow the example of World Trade Center bomber Ramzi Yousef and ‘bring the fighting to America.’” And: “FBI information since that time indicates patterns of suspicious activity in this country consistent with preparations for hijackings or other types of attacks, including recent surveillance of federal buildings in New York.” Having received this alarming warning, the president did nothing.

As August went on, Tenet was so agitated by the chatter he was picking up and Bush’s lack of attention to the matter that he arranged for another CIA briefing of the president later in August, with Bush still at his ranch, to try to get his attention to what Tenet believed was an impending danger. According to Ron Suskind, in the introduction to his book The One Percent Doctrine, when the CIA agents finished their briefing of the president in Crawford, the president said, “All right. You’ve covered your ass now.” And that was the end of it.

What might a president do upon receiving notice that the world’s number one terrorist was “determined to strike in US”? The most obvious thing was to direct Rice or Vice President Cheney to convene a special meeting of the heads of any agencies that might have information about possible terror threats, and order them to shake their agencies down to the roots to find out what they had that might involve such a plot, then put the information together. As it happened, they had quite a bit:

by Elizabeth Drew, NY Review of Books |  Read more:
Image: David Levine

Catherine Murphy, Still Life with Envelopes, 1976
via:

The Truth About Ninjas

If you do anything for Halloween this weekend, chances are pretty good you might see a child (or an adult) trick-or-treating (or partying) dressed as a ninja. Maybe it’ll be a generic ninja, or maybe a specific one, like a Naruto character or a Ninja Turtle.

Today, ninjas are all around us. They’re in our movies and comics and video games; they’re even in our everyday language (“I can’t believe you ninja’d that in there at the last second!” “Come join our team of elite code ninjas!”). Far from their origins in medieval Japan, ninjas are now arguably that country’s most famous warrior type. We talk about pirates versus ninjas, after all, not pirates versus samurai.

There’s a huge divergence between historical ninja and the fantasy ninjas of popular culture. For example, everyone knows that ninjas were warriors who stuck to the shadows and never revealed their secrets— yet watch some anime or play a video game and you’re likely to see ninjas portrayed as the flashiest, most conspicuous characters around.

Like a lot of well-known fantasy archetypes, the ninja have a real history, but aside from some basic core attributes, writers and artists around the world feel free to interpret the word however they want.

The two strains of ninja—“real” ninja versus pop-culture ninjas—aren’t as separate as you might assume. In fact, the tension between the two is one of the things I love most about them. Ninjas as we know them today are a complex mixture of historical inspiration and modern imagination, defined by the intersection of two seemingly contradictory identities.

The true story of the ninja is fascinating. The people known today as ninja (they pronounced it “shinobi” then) rose out of small villages in the Iga and Kōga regions of Japan. By necessity, they became experts in navigating and utilizing the resources of the dense mountain forests around them. Because of their relative isolation, they served no lord and ruled themselves through a council of village chiefs. In the Warring States period (c. 1467 – c. 1603), people from these areas frequently found work as spies and agents of espionage, work that made good use of their skills in navigation, observation, and escape.

The villages of Iga and Kōga were eventually attacked by one of the greatest warlords of the era, Oda Nobunaga (an event that forms the loose inspiration for, among many other things, the underrated Neo Geo fighting game Ninja Master’s [sic], by World Heroes developer ADK).

The villages banded together and fought the invading armies with guerilla techniques—techniques enabled by their superior knowledge and mastery of the terrain. That’s pretty much textbook ninja action, right there.

By the end of the Warring States period, the ninja were enfranchised and integrated into the government’s systems of power. Their most famous leader, Hattori Hanzō, received an official salary equivalent to millions of dollars today. He became so much a part of the establishment that they named a gate in the Shogun’s palace after him, and today there’s a train line named after that gate: Tokyo’s Hanzōmon Line.

Serious researchers and students of ninja history and practice often take pains to remind us that the real-life ninjas they study were decidedly not cartoon characters. The real story of the ninja, they often say, is better than anything that’s been made up about them. That’s true in some sense: the history of the ninja is definitely worth understanding. It weaves together many threads of Japan’s culture, its philosophy, and even its spirituality.

But I have to admit: I love the goofy pop-culture version of ninjas, too.

by Matthew S. Burns, Kotaku | Read more:
Image: uncredited