Thursday, December 4, 2014

Jackie's Goodbye

[ed. Public service announcement: Please. I know it's a bummer but if you have an aging parent, read this. It provides an excellent account of nearly everything you need to know about caring for someone with Alzheimer's, or any other form of dementia - financial, emotional, bureaucratic. It parallels almost exactly the learning and decision-making process I had to go through when my mom caught pneumonia and her dementia suddenly red-lined. Pay close attention: Medicare vs. Medicaid (program restrictions associated with each and required documentation); VA benefits if applicable (time involved to get them and financial requirements - short answer, up to a year or more and in the end not worth it); nursing and assisted care facilities - their various qualifications, services and costs; home health care options; and hospital discharge policies. I'd also add: be prepared with health care directives and power of attorney, and expect strong differences of opinion (between siblings or surviving spouses) over the necessity and costs of whatever care is involved. It is a nightmare. Mom, we tried our best. I still miss you so much.]

I became an Alzheimer's caregiver the week of my 29th birthday. It was August 2012, and I was standing at my kitchen counter in Washington when I got a call from a family friend telling me, "We have a problem." My father had been hospitalized with congestive heart failure. For seven years, he'd been the primary caregiver for his older sister, who had Alzheimer's disease. Without his oversight, she had followed his hospitalization with one of her own after collapsing in her bedroom from dehydration, or low blood sugar, or both. My 66-year-old aunt was a widow with no children. My father was a divorced bachelor, and I was an only child. They were my responsibility.

I had thought I would drive the eight hours to my hometown in South Carolina to get my aunt, Jackie Belcoe, settled back at home, and perhaps hire a nurse to come help out during the day. But when I got there, I found a much graver situation than I had expected.

Tucked into her hospital bed at Lexington Medical Center, Jackie looked so frail and sick that it was heartbreaking. She had been a hairdresser for many years and once owned her own salon. She was the woman who taught me to wear lipstick, who never left the house without her mascara on and her blond bob perfectly styled. Now, her hair was matted and unkempt. She needed a bath and her teeth brushed.

In the emergency room, a nurse had cut the urine-soaked T-shirt off her body. When the paramedics found her, she told them she was 19 and lived at home with her parents.

Though her parents were no longer alive, it was true that she lived in the house where she had grown up. I soon learned that conditions there were as deplorable as the state she was found in. Her bed and sheets were soiled, and dirty laundry had been left beside the washer. A trail of feces stained the carpet from the bed to the bathroom. It was clear that Jackie, like many late-stage dementia patients, had become incontinent—a fact that perhaps a caregiver who was also a brother was too ashamed to admit. Full trash bags were piled in the kitchen. Shards of broken cups were scattered on the floor. The mess had attracted pests, and mice and flies had invaded the brick ranch house.

I traced the chaos to my father's own declining health. That spring, after years with a weak heart, he took leave from work. He tried to stay upbeat and not worry me. I had stopped by to see them in recent months, but kept my visit short. It was so hard, seeing Jackie the way she was. Now, I wondered, how had I missed that something was terribly wrong? Or had I just not wanted to see?

For years, I had been pressuring my dad to think about the long-term plan. What would we do if Jackie needed more support than we could provide at home? Should we decide on a facility where we could place her if the time came?

We had to consider a nursing home, I assumed. My hand had been forced. Naively, at first I didn't think about the money involved. It had been a relief when Jackie reached age 65, with all its attendant public benefits. Surely, I thought, Medicare would cover the kind of care she needed.

"There is nothing medically wrong with her," the hospital social worker told me.

I was incredulous. "What do you mean nothing is wrong with her?" I implored. "Her brain is decaying. If she was left alone, she would die."

The social worker informed me that there was nothing wrong with Jackie that warranted a longer hospital stay or a transfer to a skilled nursing home. What she meant was that Jackie needed custodial care—help with eating, dressing, and bathing. She needed a watchful eye, the adult equivalent of day care. She did not need the assistance of a registered nurse or another medical professional who could administer IVs or monitor complicated equipment and treatments.

Medicare pays for hospital stays and short-term, skilled nursing care for older Americans. It does not cover the kind of custodial care Jackie required, and it generally does not pay for long-term stays in a nursing home or a dementia care unit, a fact nearly 40 percent of Americans over 40 don't fully realize, according to a poll from the Associated Press-NORC Center for Public Affairs Research. Medicaid, designed to provide health care to the poorest Americans, can pay for nursing home residence and long-term care. However, in some states, such as South Carolina, it cannot be used to cover room and board in assisted living or an assisted-living facility's dementia care unit—that is, the kinds of places that provide custodial care to those who don't qualify for nursing homes. Medicaid supports some at-home services, but only if states apply for waivers. (There is also a program in South Carolina and other states that can supplement payments to assisted-living facilities for Medicaid-eligible residents, but Jackie, like many other seniors, did not meet its stringent income and resource limits.)

Jackie did not qualify for Medicaid outright: Her assets and her monthly Social Security income of $1,223 disqualified her from South Carolina's basic 2012 Medicaid limits of $2,000 in resources and a monthly income of just over $900 (the limit is now $973). Given her needs, she could have possibly qualified for some in-home benefits—such as visits from a nursing aide—through the state's Medicaid programs, some of which have higher income thresholds. But it would have taken months to get through the paperwork, and even with some Medicaid supports, I knew she needed full-time caregiving, a role my father could no longer fulfill.

My dad fought me when I first suggested moving Jackie to an assisted-living facility. He didn't want to institutionalize her. He also didn't know how we were going to pay for it.

I soon learned what my father already knew: Brochure after brochure in his files showed glossy photos of luxury dementia care units in our area with 24-hour supervision, secured access to prevent patients from wandering, and life-enrichment programs for the memory-impaired. I called them. A family member toured many of them. Most cost between $4,000 and $6,000 a month out-of-pocket. My dad made just under $29,000 a year working as a welder in the maintenance department at the University of South Carolina. His house was on the brink of foreclosure. Years of health crises had left him and Jackie with very little savings. The annual cost of an assisted-living facility with dementia care was more than double what my father made annually and nearly four times Jackie's income. What I had assumed was procrastination or denial on my father's part was really paralysis.

Hospitals, though, do not like you to overstay your welcome, and Jackie had not been a model patient. More than once, she had gotten out of bed and wandered down the hall and into other patients' rooms. She got agitated, and the staff had to physically restrain her, wrapping her in a vest so she could no longer move. She grew so fitful that a doctor prescribed her a regimen of anti-anxiety drugs, hoping she would just go to sleep.

On the day Jackie was discharged, it was nearly impossible to wake her. The combination of medication and a new environment had made her sleeping patterns even more erratic than usual. She appeared lethargic to the point of being catatonic.

Soon, however, she became more aggressive. She gripped the sheets tighter each time I tried to remove them. When I finally had her sitting up, she didn't want to put on the pair of pants I had brought her. I lifted her hospital gown to put them on her myself. "Stop! Stop!" she yelled. "What are you doing?" In a flash of anger, she pulled her fist back to hit me. She relented when I grabbed her hand.

"What are you doing?" she kept repeating, as we fought over getting dressed."What are you doing?"

The truth was I didn't know what I was doing. And I didn't know what we were going to do.

by Tiffany Stanley, National Journal |  Read more:
Images: Adrià Fruitós and Stanley Family

Lin Fengmian, Standing
via:

Tiny House


A 280 sq. ft. tiny house in Aurora, Oregon. More here:
via:

Nikolas Gambaroff, Untitled, 2010
via:

Shell 'Art' Made 300,000 Years Before Humans Evolved


The artist – if she or he can be called that – was right-handed and used a shark's tooth. They had a remarkably steady hand and a strong arm. Half a million years ago, on the banks of a calm river in central Java, they scored a deep zigzag into a clam shell.

We will never know what was going on inside its maker's head, but the tidy, purposeful line (pictured above right) has opened a new window into the origins of our modern creative mind.

It was found etched into the shell of a fossilised freshwater clam, and is around half a million years old – making the line by far the oldest engraving ever found. The date also means it was made two to three hundred thousand years before our own species evolved, by a more ancient hominin, Homo erectus.

"It is a fascinating discovery," says Colin Renfrew, an archaeologist at the University of Cambridge. "The earliest abstract decoration in the world is really big news."

The shell was dug up in Trinil, Indonesia, in the 1890s by Dutch geologist Eugene Dubois, and was one of many fossil finds in the area, including bones of Homo erectus and several animals.

The shell collection sat in a museum in Leiden, the Netherlands, for over a century. Seven years ago, PhD student Stephen Munro, now at the Australian National University in Canberra, was in the country for a few days and stayed with archaeologist Josephine Joordens of the University of Leiden. She was re-exploring the Dubois collection at the time, and as Munro was also studying ancient molluscs, Joordens encouraged him to take a look. Pressed for time, he photographed each one before heading back to Australia.

"A week later I received an email," Joordens recalls. "He wrote that there was something strange on one of the shells and did I know what it was?"

by Catherine Brahic, New Scientist |  Read more:
Image: uncredited

How Exxon Helped Make Iraqi Kurdistan

[ed. Spreading freedom throughout the Middle East, one oil field at a time.]

In January 2011, Exxon hired one of the best connected men in Iraq: Ali Khedery, an American of Iraqi descent who had served in Baghdad as a special assistant to five U.S. ambassadors and a senior adviser to three U.S. generals.

At a meeting with Exxon a few months later to analyze Iraq's future, Khedery laid out his thoughts.

Iraq under Prime Minister Nouri al-Maliki was moving toward dictatorship and civil war, he said he told the session. "We will see a rise in violence and a total paralysis in Baghdad," he recalled saying. Iraq was likely to align itself more closely with Iran, which will "have an adverse impact on U.S. companies."

The gloomy scenario grabbed the attention of Exxon executives. Just two years earlier, they had signed a $25 billion deal with Iraq to develop West Qurna, one of the largest oil fields in the country.

"No one wanted to hear that they had negotiated a multi-billion dollar deal in a country which will soon implode," said Khedery, who has detailed to Reuters the meeting and subsequent events for the first time.

He suggested an alternative: Kurdistan, a semi-autonomous region in northern Iraq that was politically stable, far from the chaos in the south, and had, by some estimates, oil reserves of 45 billion barrels.

Less than a year later, Exxon signed a deal with Kurdistan. The story of how that happened explains much about the would-be nation's growing power.

Interviews with key players in the secret 2011 negotiations - the talks involved not just Exxon but also fellow Western oil giant Royal Dutch Shell - show how Exxon's decision to invest infuriated both Washington and Baghdad, and helped propel Kurdistan closer to its long-held goal of independence.

Kurds like to say they are the world's largest ethnic group without a state. Numbering some 35 million, they inhabit a band that stretches from Syria across southern Turkey and northern Iraq and into Iran. Most follow Sunni Islam and speak their own distinct languages.

The Exxon deal fueled Kurdish self-belief. The presence of the biggest U.S. oil company has helped not just financially but also politically and even psychologically.

by Dmitry Zhdannikov, Isabel Coles and Ned Parker, Reuters |  Read more:
Image: Reuters/Brendan Smialowski

The Golden Quarter

We live in a golden age of technological, medical, scientific and social progress. Look at our computers! Look at our phones! Twenty years ago, the internet was a creaky machine for geeks. Now we can’t imagine life without it. We are on the verge of medical breakthroughs that would have seemed like magic only half a century ago: cloned organs, stem-cell therapies to repair our very DNA. Even now, life expectancy in some rich countries is improving by five hours a day. A day! Surely immortality, or something very like it, is just around the corner.

The notion that our 21st-century world is one of accelerating advances is so dominant that it seems churlish to challenge it. Almost every week we read about ‘new hopes’ for cancer sufferers, developments in the lab that might lead to new cures, talk of a new era of space tourism and super-jets that can fly round the world in a few hours. Yet a moment’s thought tells us that this vision of unparalleled innovation can’t be right, that many of these breathless reports of progress are in fact mere hype, speculation – even fantasy.

Yet there once was an age when speculation matched reality. It spluttered to a halt more than 40 years ago. Most of what has happened since has been merely incremental improvements upon what came before. That true age of innovation – I’ll call it the Golden Quarter – ran from approximately 1945 to 1971. Just about everything that defines the modern world either came about, or had its seeds sown, during this time. The Pill. Electronics. Computers and the birth of the internet. Nuclear power. Television. Antibiotics. Space travel. Civil rights.

There is more. Feminism. Teenagers. The Green Revolution in agriculture. Decolonisation. Popular music. Mass aviation. The birth of the gay rights movement. Cheap, reliable and safe automobiles. High-speed trains. We put a man on the Moon, sent a probe to Mars, beat smallpox and discovered the double-spiral key of life. The Golden Quarter was a unique period of less than a single human generation, a time when innovation appeared to be running on a mix of dragster fuel and dilithium crystals.

Today, progress is defined almost entirely by consumer-driven, often banal improvements in information technology. The US economist Tyler Cowen, in his essay The Great Stagnation (2011), argues that, in the US at least, a technological plateau has been reached. Sure, our phones are great, but that’s not the same as being able to fly across the Atlantic in eight hours or eliminating smallpox. As the US technologist Peter Thiel once put it: ‘We wanted flying cars, we got 140 characters.’ (...)

But surely progress today is real? Well, take a look around. Look up and the airliners you see are basically updated versions of the ones flying in the 1960s – slightly quieter Tristars with better avionics. In 1971, a regular airliner took eight hours to fly from London to New York; it still does. And in 1971, there was one airliner that could do the trip in three hours. Now, Concorde is dead. Our cars are faster, safer and use less fuel than they did in 1971, but there has been no paradigm shift.

And yes, we are living longer, but this has disappointingly little to do with any recent breakthroughs. Since 1970, the US Federal Government has spent more than $100 billion in what President Richard Nixon dubbed the ‘War on Cancer’. Far more has been spent globally, with most wealthy nations boasting well-funded cancer‑research bodies. Despite these billions of investment, this war has been a spectacular failure. In the US, the death rates for all kinds of cancer dropped by only 5 per cent in the period 1950-2005, according to the National Center for Health Statistics. Even if you strip out confounding variables such as age (more people are living long enough to get cancer) and better diagnosis, the blunt fact is that, with most kinds of cancer, your chances in 2014 are not much better than they were in 1974. In many cases, your treatment will be pretty much the same.

For the past 20 years, as a science writer, I have covered such extraordinary medical advances as gene therapy, cloned replacement organs, stem-cell therapy, life-extension technologies, the promised spin-offs from genomics and tailored medicine. None of these new treatments is yet routinely available. The paralyzed still cannot walk, the blind still cannot see. The human genome was decoded (one post-Golden Quarter triumph) nearly 15 years ago and we’re still waiting to see the benefits that, at the time, were confidently asserted to be ‘a decade away’. We still have no real idea how to treat chronic addiction or dementia. The recent history of psychiatric medicine is, according to one eminent British psychiatrist I spoke to, ‘the history of ever-better placebos’. And most recent advances in longevity have come about by the simple expedient of getting people to give up smoking, eat better, and take drugs to control blood pressure.

There has been no new Green Revolution. We still drive steel cars powered by burning petroleum spirit or, worse, diesel. There has been no new materials revolution since the Golden Quarter’s advances in plastics, semi-conductors, new alloys and composite materials. After the dizzying breakthroughs of the early- to mid-20th century, physics seems (Higgs boson aside) to have ground to a halt. String Theory is apparently our best hope of reconciling Albert Einstein with the Quantum world, but as yet, no one has any idea if it is even testable. And nobody has been to the Moon for 42 years.

Why has progress stopped? Why, for that matter, did it start when it did, in the dying embers of the Second World War?

by Michael Hanlon, Aeon |  Read more:
Image: courtesy Dick Swanson/U.S. National Archives

Wednesday, December 3, 2014


Jonas Wood
via:

Mansplanation Nation

There's something endearing about people who loudly proclaim their love of books. Forget the suspicions kicked up by trumpeting something as universal as “books” as one’s true love (also loves: baby animals, pizza, oxygen); forget the anachronism of loving physical objects in space and not some “long read” floating in the ether; forget the self-congratulatory tone that hints at a closetful of book-festival tote bags emblazoned with Shakespeare’s face. Proudly championing books still counts as a true act of courage, a way of raging against the dying of the page.

In embracing the book as an object, a concept, a signifier, and a religion, though, one often forgets the texts that answer to the name of “book” these days. A perusal of the best-seller lists of the past two decades indicates that the most popular books might more accurately be described as billionaire-themed smut, extended blast of own-horn tooting, Sociology 101 textbook with sexy one-word title, unfocused partisan rant, 250-page-long stand-up routine, text version of Muppets Most Wanted with self-serious humans where the Muppets should be, folksy Christian sci-fi/fantasy, pseudohistorical rambling by non-historian, and simpleton wisdom trussed up in overpriced yoga pants.

And if we narrow our focus to the No. 1 spot on the New York Times’ hardcover-nonfiction best-seller list in the twenty years since Bookforum was first published, we discover an increasingly shrill, two-decade-long cry for help from the American people. As I Want to Tell You by O. J. Simpson (1995) and The Royals by Kitty Kelley (1997) yield to Dude, Where’s My Country? by Michael Moore (2003) and Plan of Attack by Bob Woodward (2004), you can almost see the support beams of the American dream tumbling sideways, the illusions of endless peace and rapidly compounding prosperity crumbling along with it. The leisurely service-economy daydreams of the late ’90s left us plenty of time to spend Tuesdays with Morrie and muse about The Millionaire Next Door or get worked up about The Day Diana Died. But such luxe distractions gave way to The Age of Turbulence, as our smug belief in the good life was crushed under the weight of 9/11, the Great Recession, and several murky and seemingly endless wars. Suddenly the world looked Hot, Flat, and Crowded, with the aggressively nostalgic waging an all-out Assault on Reason. In such a Culture of Corruption, if you weren’t Going Rogue you inevitably found yourself Arguing with Idiots.

Tracking this borderline-hysterical parade of titles can feel like watching America lose its religion in slow motion. Except, of course, this also meant there was a boom industry in patiently teaching faith-shaken Americans precisely how to believe again. Since the new century began, the top spot on the best-seller list most years has been all but reserved for these morale-boosting bromides: Seemingly every politician, blowhard, and mouthpiece willing to instruct us on how to reclaim our threadbare security blanket of patriotism, cultural supremacy, and never-ending growth and prosperity has turned up in that prestigious limelight. If that list is any indication, we’re desperate for something to ease our fears—or to feed directly into those fears with the kind of angry rhetoric that plays so well on cable news. (...)

And while it’s impossible to argue that people aren’t purchasing books by Laura Ingraham and Sean Hannity and Sarah Palin and Dick Cheney, it’s also difficult to imagine that people are actually reading these books from cover to cover. These are identity accessories more than books—the red-state equivalent of what a funky watch or rakishly arranged silk scarf might be in our notoriously shallow and decadent coastal metropolises. It also says something about the current state of liberal culture that the most popular blue-state authors—Franken, Jon Stewart, Stephen Colbert—are, accessory-wise, more like a bow tie that squirts water in your face. Solemn, poorly written tomes on everything that’s wrong with this country are on one side of the fence, ironic detachment, incredulity, and clown cars on the other. Or as Bill O’Reilly put it when he appeared on the Daily Show in 2004, “You’ve got stoned slackers watching your dopey show every night, and they can vote.”

Bemoaning the stoned slackers of the world might not be a wise choice for the author responsible for what we might term the “Mansplanations of History for Stoned Slackers” series: Killing Lincoln (2011), Killing Kennedy (2012), Killing Jesus (2013), and Killing Patton (2014), every single one of which rested comfortably in the No. 1 spot for weeks at a time. In fact, over the past two decades, O’Reilly has held the No. 1 position with seven different books for forty-eight weeks total. How does he do it?

Here’s a clue: If mansplaining means “to comment on or explain something to a woman in a condescending, overconfident, and often inaccurate or oversimplified manner,” then O’Reilly clearly sees America as a suggestible (though fortunately profligate) woman in desperate need of a seemingly limitless amount of remedial mansplanation. And to be fair, if the most popular nonfiction books are a reliable guide, Americans crave mansplaining the way starving rats crave half-eaten hamburgers. We’d like Beck—not an education professor—to mansplain the Common Core to us. We want Malcolm Gladwell—not a neuroscientist or a sociologist or psychologist—to mansplain everything from the laws of romantic attraction to epidemiology. And we want O’Reilly—not an actual historian—to mansplain Lincoln, Kennedy, Jesus, and all of the other great mansplaining icons of history. We want mansplainers mansplaining other mansplainers. We dig hot mansplainer-on-mansplainer action.

by Heather Havrilesky, Bookforum |  Read more:
Image: The Princess Bride with uncredited modifications

Winning the Breakup in the Age of Instagram

[ed. How does stuff get so complicated.]

“Brett was there,” I Gchatted my friend Holly after running into a man who’d broken my heart six months earlier. “We ­actually had a nice chat. He was a mess though. Like, unshowered, smelled weird, was carrying an iPad in the waistband of his pants because he had nowhere to put it.” She asked me what I’d been wearing. Lipstick and heels, I replied. I’d been waiting for my new boyfriend, who picked me up and briefly met Brett.

“Oh my God,” Holly replied. “That is the ultimate ex encounter. He’s nice but looks like a mess. You look awesome and are with a new guy. You won.”

“Winning the breakup” may be a petty concept, but everyone who exits relationships regularly (or maybe just exited one very memorably) knows exactly what it means. The winner is the ex whose career skyrockets after the split; whose new wife is a ­supermodel; who looks better; who dates better; who has bouncier hair. It’s getting over your ex before she gets over you and leading a demonstratively successful life without her — but doing so in ways that at least look casual, just for yourself, definitely not just to rub it in her face, because you’re so over her, remember? And therein lies the Catch-22 of winning the breakup: To care about winning, you are forced to care about not caring about someone. Asked about her weekend plans, my 26-year-old friend Sam once replied, “I’m assembling a team of hotties to torture my ex on Instagram.”

Dating actively is to be in a perpetual state of breakup. (Even in a best-case scenario, you are spared the breakup only once.) I’m 30, but already I feel like I’ve surpassed my lifetime limit for breakups — starting at age 18, hooking up in the dorms, I was already cohabitating with my significant others. In the past decade and change, I’ve had multiple multiyear relationships, which among my peers is a typical track record. For a time, social theorists believed my generation’s defining romantic feature was the hookup. But as hooking up rapidly expanded into a series of miniature ­marriages — and miniature divorces made more confounding by social-media omnipresence and cell-phone butt dials — I’ve come to think millennial romances are defined not by their casual beginnings but their disastrous ends. We aren’t the hookup generation; we’re the breakup generation. Today I find myself entering each subsequent relationship already anticipating its end — but is breakup dread a sign that the relationship is doomed, or does the dread actually cause the doom?

Inevitably, no two people ever can desire a breakup exactly equally. Which means at least one person comes out of it feeling like a loser — and as any résumé-padding overachiever knows, where there are losers there are also winners.

by Maureen O'Connor, NY Magazine | Read more:
Image: Islandpaps/Splash News

Mystery Humans

Updated genome sequences from two extinct relatives of modern humans suggest that these ‘archaic’ groups bred with humans and with each other more extensively than was previously known.

The ancient genomes, one from a Neanderthal and one from a member of an archaic human group called the Denisovans, were presented on 18 November at a meeting on ancient DNA at the Royal Society in London. The results suggest that interbreeding went on between the members of several ancient human-like groups in Europe and Asia more than 30,000 years ago, including an as-yet-unknown human ancestor from Asia.

“What it begins to suggest is that we’re looking at a Lord of the Rings-type world — that there were many hominid populations,” says Mark Thomas, an evolutionary geneticist at University College London who was at the meeting but was not involved in the work.

The first published Neanderthal and Denisovan genome sequences revolutionized the study of ancient human history, not least because they showed that these groups bred with anatomically modern humans, contributing to the genetic diversity of many people alive today. (...)

The Denisovan genome indicates that the population got around: Reich said at the meeting that as well as interbreeding with the ancestors of Oceanians, they also bred with Neanderthals and the ancestors of modern humans in China and other parts of East Asia. Most surprisingly, Reich said, the genomes indicate that Denisovans interbred with yet another extinct population of archaic humans that lived in Asia more than 30,000 years ago — one that is neither human nor Neanderthal.

by Ewen Callaway, Nature |  Read more:
Image: Ria Novosti/SPL

Tuesday, December 2, 2014

James Brown

How He and His Cronies Stole Russia

For twenty years now, the Western politicians, journalists, businessmen, and academics who observe and describe the post-Soviet evolution of Russia have almost all followed the same narrative. We begin with the assumption that the Soviet Union ended in 1991, when Mikhail Gorbachev handed over power to Boris Yeltsin and Russia, Ukraine, and the rest of the Soviet republics became independent states. We continue with an account of the early 1990s, an era of “reform,” when some Russian leaders tried to create a democratic political system and a liberal capitalist economy. We follow the trials and tribulations of the reformers, analyze the attempts at privatization, discuss the ebb and flow of political parties and the growth and decline of an independent media.

Mostly we agree that those reforms failed, and sometimes we blame ourselves for those failures: we gave the wrong advice, we sent naive Harvard economists who should have known better, we didn’t have a Marshall Plan. Sometimes we blame the Russians: the economists didn’t follow our advice, the public was apathetic, President Yeltsin was indecisive, then drunk, then ill. Sometimes we hope that reforms will return, as many believed they might during the short reign of President Dmitry Medvedev.

Whatever their conclusion, almost all of these analysts seek an explanation in the reform process itself, asking whether it was effective, or whether it was flawed, or whether it could have been designed differently. But what if it never mattered at all? What if it made no difference which mistakes were made, which privatization plans were sidetracked, which piece of advice was not followed? What if “reform” was never the most important story of the past twenty years in Russia at all?

Karen Dawisha’s Putin’s Kleptocracy is not the first book to ask this question. (...) In her introduction, Dawisha, a professor of political science at Miami University in Ohio, explains:
Instead of seeing Russian politics as an inchoate democratic system being pulled down by history, accidental autocrats, popular inertia, bureaucratic incompetence, or poor Western advice, I conclude that from the beginning Putin and his circle sought to create an authoritarian regime ruled by a close-knit cabal…who used democracy for decoration rather than direction.
In other words, the most important story of the past twenty years might not, in fact, have been the failure of democracy, but the rise of a new form of Russian authoritarianism. Instead of attempting to explain the failures of the reformers and intellectuals who tried to carry out radical change, we ought instead to focus on the remarkable story of one group of unrepentant, single-minded, revanchist KGB officers who were horrified by the collapse of the Soviet Union and the prospect of their own loss of influence. In league with Russian organized crime, starting at the end of the 1980s, they successfully plotted a return to power. Assisted by the unscrupulous international offshore banking industry, they stole money that belonged to the Russian state, took it abroad for safety, reinvested it in Russia, and then, piece by piece, took over the state themselves. Once in charge, they brought back Soviet methods of political control—the only ones they knew—updated for the modern era.

That corruption was part of the Russian system from the beginning is something we’ve known for a long time, of course. In her book Sale of the Century (2000), Chrystia Freeland memorably describes the moment when she realized that the confusing regulations and contradictory laws that hog-tied Russian business in the 1990s were not a temporary problem that would soon be cleaned up by some competent administrator. On the contrary, they existed for a purpose: the Russian elite wanted everybody to operate in violation of one law or another, because that meant that everybody was liable at any time to arrest. The contradictory regulations were not a mistake, they were a form of control.

Dawisha takes Freeland’s realization one step further. She is arguing, in effect, that even before those nefarious rules were written, the system had already been rigged to favor particular people and interest groups. No “even playing field” was ever created in Russia, and the power of competitive markets was never unleashed. Nobody became rich by building a better mousetrap or by pulling himself up by his bootstraps. Instead, those who succeeded did so thanks to favors granted by—or stolen from—the state. And when the dust settled, Vladimir Putin emerged as king of the thieves.

by Anne Applebaum, NY Review of Books |  Read more:
Image: Yuri Maltsev/Reuters

Why “BoJack Horseman” is Like Nothing You’ve Ever Seen

[ed. This is indeed a very bizarre show. And funny.]

It’s hard to explain “BoJack Horseman.” It’s an animated show about a guy, who is also a horse, who is also a washed-up sitcom actor who makes poor personal and professional decisions. His ex-girlfriend and agent is a cat lady. Not a lady who has cats; a lady who is a cat. The editor who wants to publish his memoirs is a penguin. (Naturally, he works for Penguin.) In the title sequence, BoJack sleepwalks through various scenes of D-list celebrity with a dazed, absent look on his face; in one of the last moments, he sinks underwater without struggling, evoking “The Graduate’s” Benjamin Braddock on his horsey face.

“BoJack Horseman” is a show that tries for many things: raunchy comedy that engages with the disaffection of the bored and wealthy; the pathos of being old and forgotten in a culture that constantly rewards the new; and the bizarre, unremarked-upon reality of a Hollywood populated by animal-people and people-people. (Princess Carolyn, BoJack’s agent, energetically claws at a huge scratching post when she goes to the gym.)

It’s more weird than entertaining, at least at first. But sticking with the curiosity long enough reaps surprising rewards. BoJack’s story starts as an animated comedy about an asshole — in the vein of “Family Guy” — and turns into a time-jumping drama about an asshole in the process of painfully acknowledging that he is, well, an asshole — à la “Mad Men.” BoJack Horseman does the difficult work of transforming from something like Peter Griffin to something like Don Draper in 12 half-hour episodes. (...)

And most crucially: “BoJack Horseman” is a strange creation, one that could not have existed just five years ago, one that feels avant-garde even now. This wanton combining of genres, borrowing of styles, and moneyed experimentation with expensive celebrity guests would be nearly impossible to sell to any major network except maybe HBO (which did something like this with “Curb Your Enthusiasm”). All the more surprising considering that “BoJack Horseman” is not a program from a veteran showrunner of a beloved sitcom, like Larry David, or a wealthy scion of a Hollywood family, like any number of the Coppola kids’ projects. The show is instead the brainchild of a novice showrunner, Raphael Bob-Waksberg, and the distinctive visual style of production designer Lisa Hanawalt, also a newcomer to television. The show is a pastiche of styles that borrows from the dark, surreal comedy of Adult Swim shows and the family drama of “Bob’s Burgers” and “The Simpsons,” but it also has the demonstrated freedom to use two back-to-back episodes to tell the same story from different perspectives, as it does with “Say Anything” and “The Telescope.” It isn’t afraid to end things on, as it describes, “A Downer Ending,” as it does in the penultimate episode of the season. And it’s willing to tell the convoluted story of BoJack stealing the “D” from the Hollywood sign and then pinning it on his rival Mr. Peanutbutter and then starring in the film adaptation of the real event, but not playing himself, playing Mr. Peanutbutter, all while Mr. Peanutbutter is preparing to marry Diane, BoJack’s ghostwriter, whom they both are in love with. (It is more than a little confusing, and revels in its silly complexity.)

Zach Sharf at Indiewire made the case that “BoJack Horseman” is the most Netflix-friendly of the studio’s current originals, arguing that like Season 4 of “Arrested Development,” “BoJack Horseman” “has made the streaming service’s binge-watching platform vital to experiencing the show as a whole.” What strikes me about “BoJack Horseman” is that it’s a show that demonstrates not just what Netflix can do well, comedically, but also how much the mold for animated shows can be further broken. “BoJack Horseman” is so much weirder than I thought a show could get while still being fun and moving; I wouldn’t have known that if Netflix hadn’t thrown money at this particular (and likely quite expensive) project.

by Sonia Saraiya, Salon |  Read more:
Image: Netflix

Not Dead Yet: How Some Video Stores are Thriving in the Age of Netflix

[ed. Not so fast. See also: Nine years working at one of the last Indie video stores in America.]

That the number of video stores around the world is on the decline isn’t exactly breaking news. For the past decade, a range of options—DVD-by-mail, video on demand, standalone rental boxes, and online streaming among them—have posed major challenges to the viability of video stores, rendering the phrase itself an anachronism. But just because Blockbuster couldn’t keep its iconic blue awnings hanging doesn’t mean there aren’t some intrepid entrepreneurs (and diehard cinephiles) taking a cue from their digital counterparts and finding ways to not just survive in the age of Netflix, but thrive.

Hop the F train from Kim’s final location to Brooklyn and you’ll find Video Free Brooklyn, a tiny storefront touting the tagline that “Video stores didn’t die, they just had to evolve.” Originally opened in 2002, the space was taken over by the husband-and-wife team of Aaron Hillis and Jennifer Loeber in 2012.

While the decision to purchase a video store at the height of streaming’s assault on the traditional rental industry seemed counterintuitive, Hillis calls it “a labor of love that, surprisingly, also made economic sense.” Part of that is location: Video Free Brooklyn resides on Smith Street, a main thoroughfare of the borough’s Cobble Hill neighborhood, which means steady walk-in traffic. And once people find it, they tend to come back. “The neighborhood tends to be more educated and media-savvy,” Hillis says, “which translates to more discerning tastes.”

Though the store measures just 375 square feet, basement storage allows Hillis—a noted film critic in his own right—to keep approximately 10,000 discs on hand. But rather than compete against the same wide-release films and television series that one can watch with the click of a button and an $8.99 streaming subscription, Hillis is curating a library of hard-to-find fare. “After the floodgates of the Internet opened, we’re now drowning in content and especially mediocrity,” Hillis says. “Video Free Brooklyn’s model is almost a no-brainer: My inventory is heavily curated, but so is my staff, all of whom work in the film industry and have extensive, nerdy knowledge about cinema. Coming into the shop is about nostalgia, the joy of discovery, and getting catered recommendations from passionate cinephiles. It’s a hangout, like the record store in High Fidelity.” (...)

Fisher admits that the convenience and on-demand nature of digital entertainment always will pose a challenge to physical retail outlets, but believes the issue goes beyond the idea of instant gratification. “There is room for all these things,” he says, “but it’s dangerous if people reach the mindset of, ‘If it’s not on Netflix it’s not worth watching.’ Because the selection is so small. It’s the same with cable television and on-demand services—even if you subscribe to all of these different avenues, you’re missing out.”

It’s not that streaming platforms aren’t being curated—it’s who’s doing the curating. “What’s passively happening, whether people realize it or not, is that corporations are deciding what we should watch,” adds Barr. “The thing that made VHS catch on in the ’80s was this great sense of emancipation; prior to that, the only way you were seeing a movie was just by going to a theater. With streaming we are regressing a little bit, because once again the sacrifice we are making in order to have the ease of streaming is that we are putting that decision-making process in the hands of Netflix, Amazon, or whatever service.” And more often than not, those decisions are financially motivated—which is fine for the company’s coffers, but can also lead to that all-too-familiar fatigue that comes with scrolling past endless straight-to-video schlock and movies you’ve already seen but keep getting recommendations for.

by Jennifer M. Wood, Wired |  Read more:
Image: Scarecrow Video