Friday, November 6, 2015


Ryota Hayashi
via:

The Poop's in the Mail

Clostridium difficile (C. diff) is a terrible way to die. It's an antibiotic-resistant gut bug that causes painful diarrhea, fever and kidney failure. Almost half a million Americans are diagnosed with C. diff a year, the Centers for Disease Control and Prevention announced in a February press release. Of those people, around 29,000 die within 30 days of diagnosis.

To put those numbers into perspective, here are a few more. In 2013, 32,719 people were killed in car crashes in the U.S., according to the Insurance Institute for Highway Safety. Every year, according to the CDC, 33,000 people are killed by guns in the U.S. This year, about 40,000 women in the U.S. are expected to die from breast cancer.

The intestinal infection comes from a spore-forming bacterium that neither soap nor alcohol can effectively kill and that can live on surfaces for months. Though it's largely preventable with proper hygiene, it's usually spread in hospitals and nursing homes.

C. diff is generally treated in two ways: antibiotics and fecal microbiota transplantation (FMT), also known as a fecal transplant — the "microbial equivalent of a blood transfusion," according to MinuteEarth. Doctors take a fecal sample from a healthy person and transplant it into the patient's intestines through a colonoscopy, enema or nasoenteric tube, which goes up the nose and down into the stomach. FMT is highly effective in treating C. diff cases when antibiotics fail, but it's an invasive procedure and is considered an experimental treatment. Currently, it's used only on patients for whom antibiotics have proven ineffective.

Now there's a much easier cure for this deadly infection: the poop pill.

On Oct. 28, OpenBiome announced that it had started production of the first fecal transplant pill.

Even though FMT has been extremely successful in treating C. diff, it has still been difficult to make these transplants available to the patients who need them. That difficulty is what inspired Mark Smith to start OpenBiome, the world's first poop bank.

Since 2013, the Boston-based lab has been collecting healthy fecal matter, packaging it and sending it out to doctors around the world. In that time, OpenBiome has been involved in more than 7,500 treatments in roughly 460 hospitals, Carolyn Edelstein, OpenBiome's director of policy and global partnerships, told Mic.

"Despite the underlying simplicity and efficacy of FMT, prior to OpenBiome, it had become difficult for clinicians to offer FMT at a scale that matched patient needs because of the challenges of identifying and screening donors and processing stool material," OpenBiome's website says.

C. diff has traditionally been treated with antibiotics, and around 80% of C. diff patients will be cured by antibiotics, according to OpenBiome's clinical primer.

A study performed from 2008 to 2013 by the Academic Medical Center at the University of Amsterdam compared the efficacy of FMT and antibiotics in treating C. diff. Researchers found that fecal transplants were far more effective than antibiotics — so conclusively that they stopped the study early. Ninety-four percent of the participants who received FMT as treatment were cured, compared to 31% and 23% cured by different antibiotics. (...)

FMT is arguably the best alternative to antibiotic treatment for infections like C. diff. It's actually a little odd that C. diff is treated with antibiotics, because "almost all cases of C. diff are associated with antibiotic use," Smith told Mic.

"Imagine that there were terrorists on the loose in New York City, and we decided the way to deal with it was nuking New York." Smith said. "That's what antibiotics are like. C. diff is like a sleeper cell; it's waiting for a disturbance caused by antibiotics."

by Alexis Kleinman, Mic |  Read more:
Image: OpenBiome

When Does 'Eating Clean' Become an Eating Disorder?

The idea of an eating disorder that didn't involve a loss of appetite or the desire to purge began hitting the zeitgeist a year and a half ago. The disease was called orthorexia, a term coined by Dr. Steven Bratman in 1997. "Orthorexia is defined as an unhealthy obsession with healthy food," Dr. Bratman tells Broadly. "It's not the diet that is orthorexia, it's the diet that could lead to it. The more extreme or restrictive the diet, the more likely it could lead to orthorexia."

After coining the term, Dr. Bratman went on to publish several books about orthorexia and healthy living. He has since created a formal scientific definition of the disorder and is working to get it published and accepted by the medical community. But Dr. Bratman was not the one who brought orthorexia into the mainstream a year and a half ago. Jordan Younger, a 25-year-old lifestyle blogger from California, was.

Younger was a devout raw vegan who had built an online following of tens of thousands by writing about veganism and her virtuous diet on her then-blog The Blonde Vegan. To Younger, veganism was the cure-all she was hoping for—no longer did she suffer from chronic indigestion or feelings of bloating and discomfort. As she preached about the benefits of a plant-based diet alongside photos of bright green smoothies, mason jars brimming with chia seeds, and chopped kale salads, the popularity of her vegan persona grew.

Soon vegan cleanse companies sought her out to try their pricey cleanses for free. Younger started cleansing religiously—for a minimum of three days a week, eventually finding that every time she finished a cleanse and reintroduced solid food, her stomach problems returned, making her feel even worse than before. But Younger was resolute in turning to vegan cleanses as the answer. Soon the cycle of cleansing, getting too hungry, binging on solid food, feeling guilty, and cleansing again became the norm. Instead of looking outside of veganism to feel better, Younger started fearing vegan foods that weren't as healthy as she'd like them to be, and became riddled with anxiety about the food she ate.

Eventually, Younger came to understand that she had a problem. But hers wasn't a classic eating disorder that people were familiar with; hers was a fixation on the virtue of food. She introduced the term orthorexia to her following, saying that she was suffering and was going to get help. The response she got was overwhelming: "Once I started talking about my experience with orthorexia on my blog and national news picked up on it, a flood of people came forward saying they identified with me," Younger tells Broadly. "We're talking tens of thousands of messages. It's been a year and a half and I haven't stopped hearing from people. It's not that number anymore; it's a couple people a day now, but it showed me how many people feel inadequate and feel that living a balanced life is not enough." (...)

People have died of orthorexia because they haven't been properly diagnosed. And, as Younger's floodgate of messages can attest, there are an enormous number of people suffering from orthorexic symptoms today. Nutritional therapist Dr. Karin Kratina, who has specialized in treating eating disorders for over 30 years and authored a paper about orthorexia on NationalEatingDisorders.org, tells Broadly: "I have absolutely seen a rise in orthorexic patients as a nutrition therapist. It's almost rising exponentially. Now I get a new client every week with orthorexic symptoms. It is a serious problem."

One of the reasons Dr. Kratina believes orthorexia is on the rise is our fixation on health. "There is nothing wrong with eating local or being a vegetarian or vegan," she says. "I think a lot of those diets are inherently valuable. The problem is that we have moralized eating, weight, food, and exercise. Food has become presented—more and more—as the answer."

We see this moral fixation on the virtues of food thrown back into our faces on a daily basis. Instagram can often seem like ground zero for a grotesque display of morally just food choices. Food bloggers like Deliciously Ella—whose vegan food blog has attracted hundreds of thousands of Instagram followers and multiple book deals—are attractive to us because they provide a clear answer: eating healthy will make you good. This answer, regularly served in the convenient form of an easily digestible #eatclean picture, feels so nice on our eyes.

"I think the images of all the really beautiful food—the joke for me is the kale smoothie—the endless kale smoothies are very pretty," says Dr. Bratman. "A lot of it is wonderful food photography. I think this type of media is definitely causing orthorexia to reach a larger audience and a younger audience."

by Claudia McNeilly, Broadly |  Read more:
Image: Stocksy

Interviews With People That Have Interesting Or Unusual Jobs

Q: What is your job?
A: I am currently a golf caddie. I’ve been doing it for four years.

Q: How often do you work?
A: Seven days a week. I work as much as I can because it’s a very weather-dependent job.

Q: Did you know how to play golf when you started being a caddie?
A: I’ve been playing golf since I was 11 years old.

Q: Who do you caddie for?
A: I deal with extremely rich, extremely white people. People that have a lot of money and this is their escape. A round of golf takes like four hours. They can turn their cell phones off. You’re not allowed to talk on cell phones while on the course.

Q: What do you do as a caddie? Just carry the clubs?
A: There are three types of caddies.

The first is VIP, like what I do. You’re a psychologist, a coach, and a physical laborer all at the same time. They ask me, “Should I use the seven iron or the eight iron?” The higher the number, the shorter the ball goes. So if you’re even thinking about that question, you go with the one that goes further.

The second type is a bag caddie, where everybody walks and I take two golfers and carry one bag on each shoulder. It’s approximately 70 pounds and the course is approximately seven miles in length so you have to be in good shape. I run a lot. It’s an extremely physical job.

Third is a fore caddie, where you always stay in front of them. The players are in the carts, and you have to just stay up with the carts. Golf carts go 14 mph, which is like a four minute 22 second mile, and you have to stay ahead of them.

Q: If they’re taking a cart, why can’t you take a cart too?
A: I guess I would say, we’re not getting paid to take a cart.

You can’t argue with the money though. It’s around four and a half, maybe five hours of work and you get $250-$300 a day. And that’s seven days a week. It pays very well. Sometimes it doesn’t. It could be $80. It depends on the tips.

Q: How did you find this job?
A: I used to be a high school English teacher. I was helping one of my students look at labor gigs on Craigslist and golf caddie popped up. He applied but I also put in an application for myself. I started one day a week, but I realized I was making twice what I made as a teacher.

Q: Were you in good shape when you started working as a caddie?
A: I’ve always been in pretty good shape. I’m 5’9” and 140 pounds and have two percent body fat. I know this because I got nailed by a golf ball last week and they did x-rays and stuff.

Q: Where did you get hit?
A: It was a half inch from my right nipple. I don’t know how it would’ve felt if it hit my nipple.

Q: How many times have you been hit?
A: I’ve been hit twice in four years.

Q: Have you ever had clients that you couldn’t get along with?
A: Only twice I’ve had a problem. Twice in about 800 rounds.

One time I just walked off. I said, “Have a nice round, I’m done with you.” They were abusive… mean… like plantation slave-owner mean.

The other time I had a single woman come down. I deal with mostly men. I’ve seen people that are drunk or have smoked weed but she was high on something—I have no idea. LSD?

She didn’t have golf attire on. She came down in a red cocktail dress with a rented set of clubs. I did the first couple of holes. I asked her her name and she said, “Umm… Jenny.”

She had no bra, no panties, and her boobs were falling out of the sides of her dress. She kept saying, “Prince Harry is my fiancĂ©. Can you go and find Prince Harry for me?”

I did an hour and a half and then I went to my boss and they took her off the course. Later that night she was wandering around the hotel looking for Prince Harry again.

by Suzanne Yeagley, McSweeney's |  Read more:
Image: Caddyshack

Thursday, November 5, 2015


Monica Barengo, Japan Series
via:

What's the Best Way to Die?

[ed. See also: One in a million.]

After a particularly gruesome news story — ISIS beheadings, a multicar pileup, a family burnt in their beds during a house fire — I usually get to wondering whether that particular tragic end would be the worst way to go. The surprise, the pain, the fear of impending darkness.

But lately, I’ve been thinking that it’s the opposite question that begs to be asked: what’s the best way to die? Given hypothetical, anything-goes permission to choose from a creepy, unlimited vending machine of endings, what would you select?

If it helps, put yourself in that mindset that comes after a few glasses of wine with friends — your pal asks something dreamy, like where in the whole world you’d love to travel, or, if you could sleep with any celebrity, who would it be? Except this answer is even more personal.

There are lots of ways to look at the query. Would I want to know when I’m going to die, or be taken by surprise? (I mean, as surprising as such an inevitable event can be.) Would I want to be cognizant, so I can really experience dying as a process? Or might it be better to drowse my way through it?

Many surveys suggest that about three-quarters of Americans want to die at home, though the reality is that most Americans, upwards of 68 percent, will die in a hospital or other medicalized environment. Many also say they want to die in bed, but consider what that actually means: just lying there while your heart ticks away, your lungs heave to a stop. Lying around for too long also gets rather uncomfortable — as anyone who’s spent a lazy weekend in bed can tell you — and this raises a further question: should we expect comfort as we exit this life?

Sometimes I think getting sniped while walking down the street is the best way to go. Short, sweet, surprising; no worries, no time for pain. Sure, it’d be traumatic as hell for the people nearby, but who knows — your death might spark a social movement, a yearlong news story that launches media, legal, and criminal justice careers. What a death! It might mean something. Does that matter to you — that your death helps or otherwise changes other people’s lives? If there’s not a point to your death, you might wonder, was there a point to your life?

These are heavy questions — ahem, vital ones — that don’t seem to come up very often.

I got curious about how other people would answer this question, so I started asking colleagues and friends for their ideal death scenarios (yes, I’m a blast at parties). I heard a wide variety of answers. Skydiving while high on heroin for the second time (because you want to have fun the first time, according to a colleague). Drowning, because he’d heard it was fairly peaceful once the panic clears. Storming a castle and felling enemies with a sword to save a woman, whom he then has appreciative sex with, just as he’s taking his dying breaths. (That poor gal!) An ex-boyfriend of mine used to say that the first time he lost bowel control, he’d drive to the Grand Canyon and jump off.

My own non-serious answer is to be tickled to death, sheerly for the punniness of it.

Anecdotally, young men were more fancy-free about their answers, while the older folks and women I spoke with gave more measured answers or sat quietly. Wait, what did you ask? I’d repeat the question. A pause. Hmm.

One old standby came up quite a lot: dying of old age in my bed, surrounded by family. The hospital nurses I asked had a twist on that trope: in bed, surrounded by family, and dying of kidney failure. Among nurses, there was consensus that this is the best way to go if you’re near death and in intensive care — you just fade out and pass, one ICU nurse told me. In the medical community, there’s debate about how calm death by kidney failure actually is, but really, who can you ask?

These answers are all interesting, but my nurse friend got me wondering about people who deal with death on the regular — what do they think about the best death? Do they think about it? Surely hospice workers, physicians, oncologists, “right-to-die” advocates, cancer-cell biologists, bioethicists, and the like have a special view on dying. What might their more-informed criteria be for my “best death” query?

I started with a concept that I think most can agree with — an ideal death should be painless.

Turns out, a painless death is a pretty American way to think about dying.

by Robyn K. Coggins, Wilson Quarterly | Read more:
Image: via:

Wednesday, November 4, 2015

How to Compliment a Guy

These are things I have said to women I don’t know:

“That bag is incroyable!” (Instead of incredible.)

“Your eyebrows look so amazing today. I feel like dying!”

“Wait, you put avocado on your face? Is it hard to be a genius?”

In order to be equal in all my doings, I recently set out to compliment men more. Do men even like compliments? On one hand, everyone always seems to appreciate the books, movies, and skyscrapers men make. But does anyone ever tell them nice things about their clothes and hair? It was time for someone (me) to go up to some men and try it. Without objectifying them, because that would be bad for their self-esteem.

The first guy I complimented was in my grocery store. I was buying coffee, and he had on some cool Elvis Costello–style glasses. “Nice glasses!” I said. He smiled and nudged the girl next to him, who was talking to some other people. She turned around, and I could see that she was wearing awesome glasses, even better than his. Then the girl said to me, “He’s happy because everyone always compliments my glasses.” I felt bad about not complimenting her, so I left.

I kept looking for men to compliment. I saw one dude wearing cool sneakers and another guy carrying a really cute yellow-and-white-striped beach bag (?!), but it’s difficult to just yell compliments at a man while he’s walking down the street. That could make him feel unsafe!

The next time I complimented a man, I was in a pharmacy and I did actually yell at him, but it was mostly because of circumstances. This man had a gigantic watch on that literally was covering his entire wrist. He was also wearing a tremendously tight blue cashmere V-neck sweater over a plaid shirt, legginglike jeans, and little brown slippers like a Viennese dancer. To my mind, he was really asking for the attention.

“Nice watch!” I screamed, because he was walking away from me. He turned toward me with a look of disgust.

This, however, was my only negative reaction. Most men seem extremely pleased when you compliment them on anything that has to do with their clothing or hair. It’s uncanny: When you say anything even slightly nice to a man, his face will melt into a grateful sunburst of a grin, as if he had never been complimented before. I complimented a guy at my gym on his sneakers, and he looked as if he was going to weep.

But, of course, these are not the only kind of compliments there are. You can also compliment a man on his ability to do activities. I must say, this other type of compliment didn’t even occur to me at first because, in general, girls don’t really compliment one another about how well they are doing tasks. They’re just like, “What an amazing pair of boots. Are you French?”

When I started to compliment men on their abilities, it engendered a slightly different reaction. Instead of weeping with gratitude, they started to act all proud and almost annoyed. For example, I rented a Zipcar to go see the demolished Great Gatsby mansion on Sands Point, and the guy working in the parking garage had to maneuver the car out of a very confined space. I was impressed. “You are an incredible driver!” I said, not even thinking about my man-complimenting project. The guy walked away as if he didn’t hear me. Later, a different man expertly mounted my TV. “That’s the most beautiful TV I’ve ever seen,” I told him. “It looks like a picture on a wall.” He simply nodded.

by Rebecca Harrington, NY Magazine | Read more:
Image: Philip Gendreau/Bettmann/Corbis

The Lure of Luxury

Why would anyone spend thousands of dollars on a Prada handbag, an Armani suit, or a Rolex watch? If you really need to know the time, buy a cheap Timex or just look at your phone and send the money you have saved to Oxfam. Certain consumer behaviors seem irrational, wasteful, even evil. What drives people to possess so much more than they need?

Maybe they have good taste. In her wonderful 2003 book The Substance of Style, Virginia Postrel argues that our reaction to many consumer items is “immediate, perceptual, and emotional.” We want these things because of the pleasure we get from looking at and interacting with high-quality products—and there is nothing wrong with this. “Decoration and adornment are neither higher nor lower than ‘real’ life,” she writes. “They are part of it.”

Postrel is pushing back against a more cynical theory held by many sociologists, economists, and evolutionary theorists. Building from the insights of Thorstein Veblen, they argue that we buy such things as status symbols. Though we are often unaware of it and might angrily deny it, we are driven to accumulate ostentatious goods to impress others. Evolutionary psychologist Geoffrey Miller gives this theory an adaptationist twist, arguing that the hunger for these luxury goods is a modern expression of the evolved desire to signal attractive traits—such as intelligence, ambition, and power—to entice mates: Charles Darwin’s sexual selection meets Veblen’s conspicuous consumption.

Signaling is a theory with broad scope—it has been applied to everything from self-mutilating behavior to the fact that the best private schools teach dead languages—but it is most blatant in the consumer world. Advertisements are often pure signaling fantasies. Your neighbors gasp as your car drives by; the attractive stranger in a bar is aroused by your choice of beer; your spouse and children love you because you bought the right brand of frozen pizza. Consistent with this, neuroscience studies reveal that when people look at products they judge to be “cool,” brain areas associated with praise and social approval are activated.

If such purchases are motivated by status enhancement, they become positional goods: their value is determined by what other people possess. This inspires a powerful critique of consumerism. Status is a zero-sum game, and just as countries in a literal arms race have to strip away resources from domestic priorities, the figurative arms race that economist Robert H. Frank calls “luxury fever” takes away from individual consumers money that would be better spent on more substantial goods, such as socializing and travel. It is hard for people to opt out. To say that an individual can simply refuse to participate is like saying that countries in a literal arms race can choose to stop buying all those fighter planes and put the money into school lunches and Shakespeare in the Park. Sure they can—if they don’t mind being invaded. If everyone else buys fancy suits for their job interviews, then I risk unemployment by choosing not to.

We would be better off, then, if some Leviathan could force us to disarm, so Miller, Frank, and others argue that the government should step in. A policy aimed at curbing luxury shopping might involve higher marginal tax rates or, as a more targeted intervention, a consumption tax. As it becomes harder to afford a Rolex, people will devote more money to pleasures that really matter. Less waste, more happiness.

Now, only a philistine would deny Postrel’s point that some consumer preferences are aesthetic, even sensual. And only a rube would doubt that some people buy some luxury items to impress colleagues, competitors, spouses, and lovers. Perhaps we can divvy up the consumer world. An appreciation of beauty explains certain accessible and universal consumer pleasures—Postrel begins her book in Kabul after the Taliban fell, describing how the women there reveled in their freedom to possess burkas of different colors and to paint their nails—while signaling theory applies to the more extravagant purchases. A crimson burka? Aesthetics. A $30,000 watch? Signaling. Aristotle Onassis’s choice to upholster the bar stools in his yacht with whale foreskin? Definitely signaling.

I don’t think any of this is mistaken. But it is seriously incomplete. There is a further explanation for our love of such goods, which draws upon one of the most interesting ideas in the cognitive sciences: that humans are not primarily sensory creatures. Rather, we respond to what we believe are objects’ deeper properties, including their histories. Sensory properties are relevant and so is signaling, but the pleasure we get from the right sort of history explains much of the lure of luxury items—and of more mundane consumer items as well.

by Paul Bloom, Boston Review |  Read more:
Image: scion_cho

The Decay of Twitter


[ed. See also: Twitter unfaves itself]

On Tuesday, Twitter Inc. announced another dreary set of quarterly earnings. While the company beat investor expectations, it’s still running at a loss of $132 million after taxes. Its fourth-quarter projections seem low. Worst of all, its namesake product has essentially failed to add any active American users in 2015.

Twitter stock fell more than 10 percent after the announcement.

Since it went public two years ago, investors have rarely considered Twitter’s prospects rosy. The sliver of Twitter’s users who are interested in how it fares as a corporation have gotten used to this, I think: There’s an idea you see floating around that, beyond avoiding bankruptcy, Twitter’s financial success has little bearing on its social utility. Maybe there are only 320 million humans interested in seeing 140-character updates from their friends every day after all. If you make a website that 4 percent of the world’s population finds interesting enough to peek at every month, you shouldn’t exactly feel embarrassed.

Yet the two entities that are called “Twitter”—the San Francisco-based corporation and the character-delimited social network—are not entirely disconnected. And similarly, no matter how many features Twitter-the-company tacks on to draw in new people, it’s still captive to the whims of Twitter-the-network. Recently, I’ve started to wonder if the corporation is trapped in more than a nominal way. What if the network is one of the company’s greatest obstacles, especially when it comes to growth?

Talking about Twitter the Network is hard. I’ve tried it once before, when my colleague Adrienne LaFrance and I tried to describe how English-language, U.S. Twitter of April 2014 differed from the equivalent Twitter of two years prior. Eighteen months on, I think our effort missed a lot. But I do think we noticed that the social network was slipping into something like midlife. It sometimes feels like Instagram, for instance, is the only social network that people actually still love to use.

But I’m still talking in terms of feel: a biased, decidedly non-precise way of discussing something which emerges from more than 300 million minds. And that’s why I like one theory of what’s changed about Twitter from the Canadian academic Bonnie Stewart. I think it makes clear why Twitter the Company is finding such difficulty attracting new users, especially in the United States. And I think it even helps answer the Instagram question, namely: Why is Instagram (or Vine, or Pinterest) so much more fun than Twitter?

The only problem: To talk about Stewart’s theory, you have to first tackle the ideas of the 20th-century philosopher of media, Walter J. Ong.

by Robinson Meyer, The Atlantic |  Read more:
Image:  Esaias van de Velde

Inverted Totalitarianism

[ed. See also: The Gathering Financial Storm Is Just One Effect of Corporate Power Unbound]

Sheldon Wolin, our most important contemporary political theorist, died Oct. 21 at the age of 93. In his books “Democracy Incorporated: Managed Democracy and the Specter of Inverted Totalitarianism” and “Politics and Vision,” a massive survey of Western political thought that his former student Cornel West calls “magisterial,” Wolin lays bare the realities of our bankrupt democracy, the causes behind the decline of American empire and the rise of a new and terrifying configuration of corporate power he calls “inverted totalitarianism.” (...)

Wolin throughout his scholarship charted the devolution of American democracy and in his last book, “Democracy Incorporated,” details our peculiar form of corporate totalitarianism. “One cannot point to any national institution[s] that can accurately be described as democratic,” he writes in that book, “surely not in the highly managed, money-saturated elections, the lobby-infested Congress, the imperial presidency, the class-biased judicial and penal system, or, least of all, the media.”

Inverted totalitarianism is different from classical forms of totalitarianism. It does not find its expression in a demagogue or charismatic leader but in the faceless anonymity of the corporate state. Our inverted totalitarianism pays outward fealty to the facade of electoral politics, the Constitution, civil liberties, freedom of the press, the independence of the judiciary, and the iconography, traditions and language of American patriotism, but it has effectively seized all of the mechanisms of power to render the citizen impotent.

“Unlike the Nazis, who made life uncertain for the wealthy and privileged while providing social programs for the working class and poor, inverted totalitarianism exploits the poor, reducing or weakening health programs and social services, regimenting mass education for an insecure workforce threatened by the importation of low-wage workers,” Wolin writes. “Employment in a high-tech, volatile, and globalized economy is normally as precarious as during an old-fashioned depression. The result is that citizenship, or what remains of it, is practiced amidst a continuing state of worry. Hobbes had it right: when citizens are insecure and at the same time driven by competitive aspirations, they yearn for political stability rather than civic engagement, protection rather than political involvement.”

Inverted totalitarianism, Wolin said when we met at his home in Salem, Ore., in 2014 to film a nearly three-hour interview, constantly “projects power upwards.” It is “the antithesis of constitutional power.” It is designed to create instability to keep a citizenry off balance and passive.

He writes, “Downsizing, reorganization, bubbles bursting, unions busted, quickly outdated skills, and transfer of jobs abroad create not just fear but an economy of fear, a system of control whose power feeds on uncertainty, yet a system that, according to its analysts, is eminently rational.”

Inverted totalitarianism also “perpetuates politics all the time,” Wolin said when we spoke, “but a politics that is not political.” The endless and extravagant election cycles, he said, are an example of politics without politics.

“Instead of participating in power,” he writes, “the virtual citizen is invited to have ‘opinions’: measurable responses to questions predesigned to elicit them.”

Political campaigns rarely discuss substantive issues. They center on manufactured political personalities, empty rhetoric, sophisticated public relations, slick advertising, propaganda and the constant use of focus groups and opinion polls to loop back to voters what they want to hear. Money has effectively replaced the vote. Every current presidential candidate—including Bernie Sanders—understands, to use Wolin’s words, that “the subject of empire is taboo in electoral debates.” The citizen is irrelevant. He or she is nothing more than a spectator, allowed to vote and then forgotten once the carnival of elections ends and corporations and their lobbyists get back to the business of ruling.

“If the main purpose of elections is to serve up pliant legislators for lobbyists to shape, such a system deserves to be called ‘misrepresentative or clientry government,’ ” Wolin writes. “It is, at one and the same time, a powerful contributing factor to the depoliticization of the citizenry, as well as reason for characterizing the system as one of antidemocracy.”

The result, he writes, is that the public is “denied the use of state power.” Wolin deplores the trivialization of political discourse, a tactic used to leave the public fragmented, antagonistic and emotionally charged while leaving corporate power and empire unchallenged.

“Cultural wars might seem an indication of strong political involvements,” he writes. “Actually they are a substitute. The notoriety they receive from the media and from politicians eager to take firm stands on nonsubstantive issues serves to distract attention and contribute to a cant politics of the inconsequential.”

“The ruling groups can now operate on the assumption that they don’t need the traditional notion of something called a public in the broad sense of a coherent whole,” he said in our meeting. “They now have the tools to deal with the very disparities and differences that they have themselves helped to create. It’s a game in which you manage to undermine the cohesiveness that the public requires if they [the public] are to be politically effective. And at the same time, you create these different, distinct groups that inevitably find themselves in tension or at odds or in competition with other groups, so that it becomes more of a melee than it does become a way of fashioning majorities.”

In classical totalitarian regimes, such as those of Nazi fascism or Soviet communism, economics was subordinate to politics. But “under inverted totalitarianism the reverse is true,” Wolin writes. “Economics dominates politics—and with that domination comes different forms of ruthlessness.”

He continues: “The United States has become the showcase of how democracy can be managed without appearing to be suppressed.”

The corporate state, Wolin told me, is “legitimated by elections it controls.” To extinguish democracy, it rewrites and distorts laws and legislation that once protected democracy. Basic rights are, in essence, revoked by judicial and legislative fiat. Courts and legislative bodies, in the service of corporate power, reinterpret laws to strip them of their original meaning in order to strengthen corporate control and abolish corporate oversight.

He writes: “Why negate a constitution, as the Nazis did, if it is possible simultaneously to exploit porosity and legitimate power by means of judicial interpretations that declare huge campaign contributions to be protected speech under the First Amendment, or that treat heavily financed and organized lobbying by large corporations as a simple application of the people’s right to petition their government?”

Our system of inverted totalitarianism will avoid harsh and violent measures of control “as long as ... dissent remains ineffectual,” he told me. “The government does not need to stamp out dissent. The uniformity of imposed public opinion through the corporate media does a very effective job.”

And the elites, especially the intellectual class, have been bought off. “Through a combination of governmental contracts, corporate and foundation funds, joint projects involving university and corporate researchers, and wealthy individual donors, universities (especially so-called research universities), intellectuals, scholars, and researchers have been seamlessly integrated into the system,” Wolin writes. “No books burned, no refugee Einsteins.”

But, he warns, should the population—steadily stripped of its most basic rights, including the right to privacy, and increasingly impoverished and bereft of hope—become restive, inverted totalitarianism will become as brutal and violent as past totalitarian states. “The war on terrorism, with its accompanying emphasis upon ‘homeland security,’ presumes that state power, now inflated by doctrines of preemptive war and released from treaty obligations and the potential constraints of international judicial bodies, can turn inwards,” he writes, “confident that in its domestic pursuit of terrorists the powers it claimed, like the powers projected abroad, would be measured, not by ordinary constitutional standards, but by the shadowy and ubiquitous character of terrorism as officially defined.”

by Chris Hedges, Truthdig |  Read more:
Image: Democracy Inc. Amazon

Tuesday, November 3, 2015


Robert Crumb
via:

Neil Young

Streaming Wars

TV executives see cord-cutters as a strange, exotic species that must be observed and scrutinized. I learned this while seated between several high-ranking executives from the country’s largest broadcast and cable companies one evening this summer.

Once I outed myself as a cord-cutter (actually worse, a cord-never!), my purchase decisions became the dominant topic of conversation. I may as well have said I hunt zombies in my spare time. They bombarded me with questions.

What do I watch? How do I watch? I must not like sports, they correctly noted. What do I subscribe to? Hulu and Netflix?! And HBO Now? And Spotify? And Amazon Prime Video?! Don’t I know that the price of all those subscriptions adds up to much more than a basic cable package? Didn’t I know how irrational I was?

It was maddening to them, and I understand why. But consumers, myself included, can be irrational. Just ask JCPenney, which in 2013 stopped inflating prices for the charade of coupons and deep discounts, and nearly tanked its business in the process. Turns out customers preferred the charade.

Likewise, it’s increasingly common for cord-cutters like myself to cobble together an array of separate on-demand subscription services in lieu of a traditional cable subscription. Media companies, including 21st Century Fox, Viacom, CBS, Time Warner, Discovery, and Walt Disney, forecasted decreases in cable subscriptions this summer, and their stock prices were hammered as a result.

While TV execs criticize cord-cutters for irrational purchase decisions, the digital streaming services are in an all-out war for their money. The competition means more and better choices, making a basic cable package less attractive by the day.

by Erin Griffith, Fortune |  Read more:
Image: Beck Diefenbach — Reuters

Death Rates Rising for Middle-Aged White Americans

Something startling is happening to middle-aged white Americans. Unlike every other age group, unlike every other racial and ethnic group, unlike their counterparts in other rich countries, death rates in this group have been rising, not falling.

That finding was reported Monday by two Princeton economists, Angus Deaton, who last month won the 2015 Nobel Memorial Prize in Economic Science, and Anne Case. Analyzing health and mortality data from the Centers for Disease Control and Prevention and from other sources, they concluded that rising annual death rates among this group are being driven not by the big killers like heart disease and diabetes but by an epidemic of suicides and afflictions stemming from substance abuse: alcoholic liver disease and overdoses of heroin and prescription opioids.

The analysis by Dr. Deaton and Dr. Case may offer the most rigorous evidence to date of both the causes and implications of a development that has been puzzling demographers in recent years: the declining health and fortunes of poorly educated American whites. In middle age, they are dying at such a high rate that they are increasing the death rate for the entire group of middle-aged white Americans, Dr. Deaton and Dr. Case found.

The mortality rate for whites 45 to 54 years old with no more than a high school education increased by 134 deaths per 100,000 people from 1999 to 2014.

“It is difficult to find modern settings with survival losses of this magnitude,” wrote two Dartmouth economists, Ellen Meara and Jonathan S. Skinner, in a commentary to the Deaton-Case analysis to be published in Proceedings of the National Academy of Sciences.

“Wow,” said Samuel Preston, a professor of sociology at the University of Pennsylvania and an expert on mortality trends and the health of populations, who was not involved in the research. “This is a vivid indication that something is awry in these American households.”

Dr. Deaton had but one parallel. “Only H.I.V./AIDS in contemporary times has done anything like this,” he said.

by Gina Kolata, NY Times |  Read more:
Image: Ben Solomon

Atsuo (Dazai Atsuo) and S. Riyo
via: here and here

Monday, November 2, 2015

Depression Modern

[ed. Haven't seen this one yet but now I'm intrigued.]

The second season of “The Leftovers,” on HBO, begins with a mostly silent eight-minute sequence, set in a prehistoric era. We hear a crackle, then see red-and-black flames and bodies sleeping around a fire; among them is a pregnant woman, nearly naked. She rises, stumbles from her cave, then squats and pisses beneath the moon—only to be startled by a terrifying rumble, an earthquake that buries her home. When she gives birth, we see everything: the flood of amniotic fluid, the head crowning, teeth biting the umbilical cord. For weeks, she struggles to survive, until finally she dies in agony, bitten by a snake that she pulled off her child. Another woman rescues the baby, her face hovering like the moon. Only then does the camera glide down the river, to where teen-age girls splash and laugh. We are suddenly in the present, with no idea how we got there.

It takes serious brass to start your new season this way: the main characters don’t even show up until midway through the hour. With no captions or dialogue, and no clear link to the first season’s story, it’s a gambit that might easily veer into self-indulgence, or come off as second-rate Terrence Malick. Instead, almost magically, the sequence is ravishing and poetic, sensual and philosophical, dilating the show’s vision outward like a telescope’s lens. That’s the way it so often has been with this peculiar, divisive, deeply affecting television series, Damon Lindelof’s first since “Lost.” Lindelof, the co-creator, and his team (which includes Tom Perrotta, the other co-creator, who wrote the novel on which the show is based; the religious scholar Reza Aslan, a consultant; and directors such as Mimi Leder) persist in dramatizing the grandest of philosophical notions and addressing existential mysteries—like the origins of maternal love and loss—without shame, thus giving the audience permission to react to them in equally vulnerable ways. They’re willing to risk the ridiculous in search of something profound.

At heart, “The Leftovers” is about grief, an emotion that is particularly hard to dramatize, if only because it can be so burdensome and static. The show, like the novel, is set a few years after the Departure, a mysterious event in which, with no warning, two per cent of the world’s population disappears. Celebrities go; so do babies. Some people lose their whole family, others don’t know anyone who has “departed.” The entire cast of “Perfect Strangers” blinks out (though, in a rare moment of hilarity, Mark Linn-Baker turns out to have faked his death). Conspiracy theories fly, people lose their religion or become fundamentalists—and no one knows how to feel. The show’s central family, the Garveys, who live in Mapleton, New York, appear to have lost no one, yet they’re emotionally shattered. Among other things, the mother, Laurie (an amazing Amy Brenneman, her features furrowed with disgust), joins a cult called the Guilty Remnant, whose members dress in white, chain-smoke, and do not speak. They stalk the bereaved, refusing to let anyone move on from the tragedy. Her estranged husband, Kevin (Justin Theroux), the chief of police, has flashes of violent instability; their teen-age children drift away, confused and alarmed.

That’s the plot, but the series is often as much about images (a girl locked in a refrigerator, a dog that won’t stop barking) and feelings (fury, suicidal alienation) as about events; it dives into melancholy and the underwater intensity of the grieving mind without any of the usual relief of caperlike breakthroughs. Other cable dramas, however ambitious, fuel themselves on the familiar story satisfactions of brilliant iconoclasts taking risks: cops, mobsters, surgeons, spies. “The Leftovers” is structured more like explorations of domestic intimacy such as “Friday Night Lights,” but marinated in anguish and rendered surreal. The Departure itself is a simple but highly effective metaphor. In the real world, of course, people disappear all the time: the most ordinary death can feel totally bizarre and inexplicable, dividing the bereaved as often as it brings them closer. But “The Leftovers” is more expansive than that, evoking, at various moments, New York after 9/11, and also Sandy Hook, Charleston, Indonesia, Haiti, and every other red-stringed pin on our pre-apocalyptic map of trauma. At its eeriest, the show manages to feel both intimate and world-historical: it’s a fable about a social catastrophe threaded into the story of a lacerating midlife divorce.

The first season of “The Leftovers” rose and fell in waves: a few elements (like a plot about the Garveys’ son, who becomes a soldier in a separate cult) felt contrived, while others (especially the violent clashes between the Guilty Remnant and the bereaved residents of Mapleton) were so raw that the show could feel hard to watch. But halfway through Season 1 “The Leftovers” spiked into greatness, with a small masterpiece of an episode. In “Guest,” a seemingly minor character named Nora Durst (Carrie Coon), a Mapleton resident who has the frightening distinction of having lost her entire family—a husband and two young children—stepped to the story’s center. In one hour, we learned everything about her: what she does for work (collects “survivor” questionnaires for an organization searching for patterns), what she does at home (obsessively replaces cereal boxes, as if her family were still alive), and what she does for catharsis (hires a prostitute to shoot her in the chest while she’s wearing a bulletproof vest). She travels to New York for a conference, where her identity gets stripped away in bizarre fashion. But, as with that prehistoric opener, the revelations are delivered through montages, which drag, then speed up, revealing without overexplaining, grounded in Coon’s brilliantly unsentimental, largely silent performance. When the episode was over, I was weeping, which happens a lot with “The Leftovers.” It may be the whole point.

by Emily Nussbaum, New Yorker | Read more:
Image: Emiliano Ponzi