Friday, June 29, 2018

Harlan Ellison (May 1934 – June 2018)

Harlan Ellison is dead. He was 375 years old. He died fighting alien space bears.

Harlan is dead. He exploded in his living room, in his favorite chair, apoplectic over the absolute garbage fire this world has become. He's dead, gone missing under mysterious circumstances, leaving behind many suspects. He went down arguing over the law of gravity with a small plane in which he was flying. Harlan took the contrary position. He won.

Harlan Ellison, science fiction writer and legendarily angry man, died Thursday. He exited peacefully (as far as such things go) at home and in his sleep. He was 84 years old.

Any one of those first lies seems to me more likely than the truth of the last one. Hard enough to believe that Ellison is gone — that something out there finally stilled that great and furious spirit and pried those pecking fingers from the keyboard of his Olympia typewriter (without, apparently, the aid of explosives). But a quiet farewell to this life that he loved so largely and this world that he excoriated so beautifully? If someone had asked me, I would've bet on the space bears.

Harlan Ellison was, after all, one of the most interesting humans on Earth. He was one of the greatest and most influential science fiction writers alive (until yesterday), and now is one of the best dead ones. He was a complete jerk, mostly unapologetically, and a purely American creation — short, loud, furious, outnumbered but never outmatched — who came up in Cleveland, went to LA and lived like some kind of darkside Forrest Gump; a man who, however improbably, however weirdly, inserted himself into history simply by dint of being out in it, brass knuckles in his pocket, and always looking for trouble.

In his youth, he claims to have been, among other things, "a tuna fisherman off the coast of Galveston, itinerant crop-picker down in New Orleans, hired gun for a wealthy neurotic, nitroglycerine truck driver in North Carolina, short-order cook, cab driver." He was the kid who ran off and joined the circus. Bought the circus. Burned the whole circus down one night just to see the pretty lights.

Stone fact: He marched with Martin Luther King, Jr. in Selma, lectured to college kids, visited with death row inmates, and once mailed a dead gopher to a publisher. He got into it with Frank Sinatra one night in Beverly Hills. Omar Sharif and Peter Falk were there. Ellison was shooting pool, and in walks Sinatra, who laid into Ellison because he didn't like the kid's boots.

And look, this is Sinatra in '65. Sinatra at the height of his power and glory. A Sinatra who could wreck anyone he felt like. But Ellison simply did not care. He went nose-to-nose with Sinatra, shouting, ready to scrap. Gay Talese was there, working on a story, so Ellison became a tiny part of what, among magazine geeks, stands as the single greatest magazine profile of all time: "Frank Sinatra Has A Cold." "Sinatra probably forgot about it at once," Talese wrote, "but Ellison will remember it all his life."

And that was absolutely true.

But that moment? It encapsulated Ellison. His luck, his deviltry, his style and violence. He lived like he had nothing to lose, and he wrote the same way. Twenty hours a day sometimes, hunched over a typewriter, just pounding. He published something like 1,800 stories in his life and some of them (not just one of them or two of them, but a lot of them) are among the best, most important things ever put down on paper.

Ellison brought a literary sensibility to sci-fi at a time when the entire establishment was allergic to any notion of art, won awards for it, and held those who'd doubted him early in a state of perpetual contempt. He wrote "I Have No Mouth and I Must Scream" and "'Repent, Harlequin!' Said the Ticktockman." But everyone knows that, right? He wrote "A Boy and His Dog," which became the movie of the same name and still stands as one of the darkest, most disturbing, most gorgeously weird examples of post-apocalyptica on the shelves.

His anthology, Dangerous Visions, gave weight and seriousness to the New Wave movement that revitalized sci-fi in the '70s. That kicked open the door for everyone who came after and the scene we have today. He wrote a flamethrower essay about hating Christmas and the script for "City on the Edge of Forever," the Star Trek episode that most nerds who lean in that direction will tell you was the best of the series. He wrote for comics, for videogames, for Hollywood, got fired from Disney on his first day for making jokes about Disney porn.

He was ... science fiction's Hemingway. Its Picasso. Talented and conflicted, both, and with a fire in him that sometimes came out as genius and sometimes as violence and no one ever knew which one they'd get.

"My work is foursquare for chaos," he once told Stephen King. "I spend my life personally, and my work professionally, keeping the soup boiling. Gadfly is what they call you when you are no longer dangerous; I much prefer troublemaker, malcontent, desperado. I see myself as a combination of Zorro and Jiminy Cricket. My stories go out from here and raise hell."

And he followed those stories right out the door. Did he get in fights? He did. And bragged about every one of them. Filed lawsuits like they were greeting cards. He assaulted book people with frightening regularity, went to story meetings with a baseball bat back in the day. He groped the author Connie Willis on stage during a Hugo Award ceremony, for which some people never forgave him.

And there's nothing to say to normalize that. He wasn't just some curmudgeon or crank to wave off. I once called him "America's weird uncle," but that almost seems too gentle because he was more than that. He was an all-American a**hole, born and bred. Science fiction's Hemingway. Its Picasso. Talented and conflicted, both, and with a fire in him that sometimes came out as genius and sometimes as violence and no one ever knew which one they'd get.

by Jason Sheehan, NPR |  Read more:
Image: Barbara Alper/Getty Images
[ed. I read the Glass Teat in college and have been an Ellison fan ever since. See also: Controversial Author Harlan Ellison Remade Sci-Fi.]

Thursday, June 28, 2018

Mom, in Touch

Every morning, just before heading out into the predawn light to her job as a dentist for the Texas Department of Criminal Justice, my mom would hunch over the laminate countertop in our dimly lit kitchen and scribble a note for me. She would neatly place it in my lunch box, which she also packed each day. Later, when I would plop down on a long bench in the cafeteria at Huntsville’s Gibbs Elementary School, I would rummage through my smooshed turkey sandwich, Dole fruit cup, Ruffles potato chips, and single Hershey’s Kiss, and—without fail—find an extra napkin marked with her unkempt cursive.

Often she wrote to tell me, as she always did before I left the house, to be “gentle, sweet, and kind.” Other times, prompted by my teachers, she urged me to stay quiet in class, despite the fact that she adored my tendency to randomly break out in song. On Fridays she would remind me of our weekly ritual: an after-school trip to King’s Candies, downtown on the Huntsville square, where we celebrated the beginning of the weekend with a BLT and a bag of pastel-colored mints we plucked out of rows of giant glass candy jars. It didn’t much matter what the note said, though. It was my mom’s way of reminding me that she was always there.

Of course, other kids got occasional notes in their lunch boxes: encouragement for a math test, a simple “I love you” in flowery script, goofy renderings of smiley faces, the occasional lip-mark in a shade of red that could only have belonged in an East Texas cosmetics bag. But for my mother, who found order in quotidian rituals, the extra napkin wasn’t simply an afterthought; it was as much a part of my lunch box as the meal itself.

My mom had a particular attachment to handwritten notes. The joke among the blue-haired ladies at First Baptist Church was that if you ever did anything nice for my family, you’d better hurry home if you wanted to beat Cinde Johnston’s thank-you card. Years ago I found a note tucked into one of my dad’s college textbooks, A History of Soviet Russia. It was an apology written by Mom, then a newlywed, for losing her temper the night before.

So it wasn’t all that surprising when a letter arrived in our mailbox a few weeks after her funeral. It was addressed to me, a third-grader at the time. It was a note from Mom.

My first memory is of my mom. I was around four, spending the day in Houston with my paternal grandparents, who had traveled from Arlington to help my dad during one of Mom’s extended stays at MD Anderson Cancer Center. We made our way to the top of a multistory parking garage so I could glide along the empty expanse of concrete on my tricycle and roller skates. Granddad, who had been staring off into the distance as Mimi watched me turn circles, eventually called me over to the edge. Mimi followed, bracing herself against the concrete barrier as Granddad hoisted me onto his hip and pointed. It took several moments for me to spot a woman waving—her short, curly, dark-brown hair gone, completely shaved—behind one of the hundreds of shiny glass squares in front of me. It was the first time I’d seen my mother in weeks. I don’t know how long both of us lingered like that, waving to each other, but in my memory I can still hear my grandmother start to cry.

Many of my childhood memories are set in Houston, an hour and a half from home. I made regular trips there with my mother, and I looked forward to those car rides to the city. When we left, hours before the sun came up, it felt like we were somehow sneaking away from something, embarking on an adventure together, just Mom and me. She would stop and buy me doughnut holes at the first open bakery she spotted, and I would happily munch on them as I gazed at the stars still in the morning sky, wondering how the dark could persist if we had entered a new day.

She often scheduled her hospital appointments on weekends so we could visit the Houston Zoo or Six Flags AstroWorld (my mother, even during her weakest moments, remained a roller coaster enthusiast). And she would frequently go out of her way to drive by Rice University. She firmly insisted that one day I would be among the throngs of students strolling across its verdant campus, suggesting more practical majors than the French and journalism degrees I would eventually pursue.

Other memories of Mom—those outside the context of an endless succession of doctor’s visits and chemo treatments and breast scans—are harder to come by. They exist in short, blurry flashes, like an old home movie that fades and flickers over time. I remember sitting with her on the banks of the Comal River in New Braunfels when I was seven years old, watching inner tubes tumble down the Prince Solms Park chute, the waterslide carved into the side of the city’s dam. During our annual trips to the river, we typically rode the chute together, me in her lap, but that year the port in her chest, which had been implanted for chemo treatments, kept her from getting into the water. Instead, as children of family friends rode the rapids over and over, we were both content to watch from the sidelines. I buried my face in her neck, taking in the scent of sunscreen on her tanned skin, and asked if next year we’d be able to ride the chute again. Mom, her hair only just starting to sprout back, could only offer me, “Maybe, honey. We’ll have to see.”

Another memory: we’re wading through a creekbed on the outskirts of Huntsville, dappled sunlight filtering through the towering East Texas pines. As I bounded ahead, Mom, whose body was starting to slow, shouted after me, reminding me to be vigilant in watching for water moccasins. I recoiled in horror, rushing back and clinging to her waist, but she took my hand and said, “I’ll look out for them too.” We spent most of the day splashing and lounging along sandy stretches of the creek, my mind completely at ease, making whistles out of blades of grass.

I have managed to hold on to some other specifics. I remember her clear alto ringing out both in church (particularly prominent on her favorite hymn, “On Eagle’s Wings”) and in the car (she preferred the Beatles and the blues outside of the sanctuary). I remember her absurd hats lining a wall of our guest bedroom—my favorite was an electric-blue sequined beret. I can feel the comfort of her arms wrapped around me after getting my finger pricked by a cactus in the basin of Big Bend National Park. And I can still replicate our special whistle—starting high, then a quick slide down the scale before leaping back up to the original note (if she wanted my attention when I was small, it was as good as a dog whistle).

But these memories are difficult to preserve. The main truth that I hold about Mom, which will inevitably come up in every conversation I ever have about her, was that she was very sick. And then she died.

On May 12, 1999, a week after my ninth birthday and three days after Mother’s Day, my mom left this world at age 43. Breast cancer, which had slowly spread through her body for seven years, finally overtook her, and she died at home, in her bed, with my father next to her. Of all of the things I struggle to call to mind about her, it seems cruel that I remember that first morning without her so clearly. My dad lightly tousled my hair, the lights still off in my bedroom, a soft glow coming from the hallway. I blearily glanced at the clock—it was around seven in the morning—and then bolted upright, panicked that I was late for school. He grabbed me by the shoulders as if bracing me against something. His eyes were filled with tears for the first time I could ever recall. He leaned in and whispered, “She’s gone.”

When the letter from Mom arrived a few weeks later, it was postmarked from heaven. I imagined that after she’d written it, stuffed it into an envelope, and let it slip from high above, the letter had dodged fluffy cumulus clouds on its way to our home. It took me years to understand that my dad had placed it in the mailbox. (At least, I assume it was my dad. Nearly two decades later, he’s still not one to talk about these things.)

In the first days and weeks and months after Mom’s death, I did my best to maintain a shred of normalcy amid funeral planning, the parade of family members and friends, and navigating my first summer without her. The note from Mom, not unlike the ones she stuffed in my lunch box, helped. The need to impose order is something my mom would have understood instinctively. After a troubled childhood and a cancer-stricken adulthood, she regimented every possible aspect of her life to try to maintain some sense of control. Looking back, I realize it’s a coping mechanism that I developed as well.

In the letter, Mom assured me that she was no longer hurting, that her death had been a release. It’s precisely the thing that our friends and family had told themselves—and me—to make her death feel like less of an injustice. At the time, I readily accepted her explanation. My frustration came when I realized I had no way to write her back.

That was the only time a letter from her showed up in the mailbox, but I gradually began to stumble across handwritten notes elsewhere. In the year leading up to her death, Mom, ever the fastidious planner, had strategically hidden them all over the house. I found one that December, my first Christmas without her—the same year Dad got me a puppy, something Mom had always forbidden—hidden in a tiny mailbox ornament. (She must have known the holidays were going to be especially hard.) Over the years I found them secreted away in jewelry boxes and stashed among toys. There were many in my favorite books; Mom knew it was better to hide them there than in my neglected box of Barbies. When I was fifteen, while searching for childhood photos of myself for a project with my dance team, I found another tucked inside a photo album that had been sitting at the top of my closet for years. The notes came unexpectedly, and with them, electric shocks of realization: they often turned up when I wasn’t searching for anything at all. The few times I tore through my possessions in moments of desperation or longing, trying to apply logic to her hiding spots, they never surfaced.

Yet they appeared frequently enough that, as a child, I never lost faith that another note awaited me somewhere, that Mom had left one in yet another random nook I’d never think to search. After all, until she died, Mom wrote to me in some form every day, even if it was a command scrawled on a church bulletin to stop fidgeting as I sat on the pew next to her. The hidden notes felt like a natural extension of that communication.

But as I grew older, as more birthdays passed without her, I came to the realization that the notes eventually had to end. In my teenage years, I was gripped with anxiety each time I encountered one, fearing that it was the last vestige of our shared history. I’d grown increasingly conscious of the fact that my memories of Mom were slipping, like a foreign language fading in stagnation. I’d forgotten what her perfume smelled like. I could no longer map the constellation of moles on the back of her neck, and even the details of her face began to blur in my mind’s eye.

At the same time, I was morphing from an oval-faced kid into something resembling an adult, one who looked and behaved much like my dad (I still chuckle every time I see my parents’ wedding photos—my twenty-year-old father was, essentially, me in a tux). At every milestone I was becoming ever more distant from the girl that my mom knew. Those notes felt like our only remaining connection.

I discovered the last of them as I packed for graduate school, in 2013. I was headed to study journalism at the University of Missouri, a far cry from the law or medical degree she had hoped for. In my South Austin apartment—my first address after graduating from the University of Texas at Austin—I sorted through the parts of my life that I wanted boxed up and hauled, along with my cat, to Columbia, Missouri. I’m both hopelessly sentimental and a pack rat, so moving is a particularly painful process—every tchotchke was a reminder of family, friends, or the home state I was about to leave behind. As I turned out the contents of a plastic storage box filled with Christmas ornaments, a tiny scrap of paper floated to the floor, alongside the gaudy felt doves and nutcrackers made by my great-grandmother. It had been ripped from a small spiral memo pad and adorned with the soft ink of an already-failing pen. It was creased and folded many times over, but I recognized the handwriting instantly:

Hi!

by Abby Johnston, Texas Monthly |  Read more:
Image: Abby Johnston

Why Women’s Friendships Are So Complicated

When Deborah Tannen, a linguistics professor at Georgetown University, was in grade school, one of her best friends abruptly stopped talking to her. Tannen and the friend, Susan, had done everything together: They had lunch together, made trips to the library together, did afterschool activities in their New York City neighborhood of Greenwich Village together. Then, one day, Susan cut her off. They wouldn’t speak again until more than half a century later.

Tannen recounted this story as part of a talk Tuesday at the Aspen Ideas Festival, which the Aspen Institute co-hosts with The Atlantic, about the sociology of friendships. Specifically, her lecture was about the gender differences that inform how people relate to and engage with others close to them, based on her new book You're the Only One I Can Tell: Inside the Language of Women's Friendships.

As distressing as it was, Tannen’s estrangement from Susan—and, namely, the mysteriousness that surrounded it—wasn’t unusual. Women, Tannen has found in her research, are far less inclined than men to explain their reason for breaking up with a friend. Women are more likely to avoid confrontation; they don’t want to give their friend the opportunity to defend herself.

This is where women’s friendships—which because of their emotional intimacy can, according to Tannen, be far more gratifying than those between men—get especially complicated: Not knowing why a friend is ending her relationship with you, she said, “is really hurtful because knowing what’s going on is a big part of friendship.”

It’s a common belief that men are more competitive than women, but Tannen’s findings suggest that the reality is less clear-cut. Women are simply competitive in a way that’s less obvious—they’re competitive about connection. Among women, what’s prized is the degree to which one is privy to the details of her friends’ lives.

This, Tannen says, makes them more prone to “gossip,” but it also means they can serve as immense, unmatched sources of support for someone who is going through something difficult and needs to vent or seek help. For example, say a woman gets into a series of disagreements with her roommate that culminate in an explosive falling-out; now she’s debating whether to break her lease and move out. If she were to confide in a male friend, chances are he’d respond by giving his advice right off the bat; he might not know how to engage with her emotionally. If the woman were to vent to a female friend instead, though, that friend would likely request more context and ask how the issue makes her feel before jumping into her feedback. That willingness to take the time to hear her out first sends her what Tannen calls a “meta-message”—it tells her that her friend cares. One of Tannen’s interview subjects described this dynamic when reflecting on how she mourned the death of a close girlfriend: The hardest part of her dying is that “I can’t call her and tell her how terrible I feel about her dying.”

by Alia Wong, The Atlantic |  Read more:
Image: Sol De Zuasnabar Brebbia/Getty

Cy Twombly, Souvenir 1992

Calvinballing

Mitch McConnell’s Politics of Shamelessness Have Won

When Justice Antonin Scalia died unexpectedly, Senate Majority Leader Mitch McConnell pulled a new rule of American politics out of thin air and said there could be no vote on a replacement during a hotly contested election year. When Justice Anthony Kennedy announced his retirement Wednesday afternoon, McConnell pulled a distinction out of thin air and said that the autumn of a midterm election was a perfect time to confirm a new Supreme Court justice.

It is, yes, hypocritical.

And McConnell’s great strength as a politician is that he doesn’t care. He doesn’t care that it’s hypocritical, he doesn’t care that I think it’s hypocritical, and he doesn’t care that Chuck Todd thinks it’s hypocritical. He just waves the objection away with a sniff and a sneer and on we go.

It works for McConnell because he’s not interested in being thought of as a high-minded guy or in being well-regarded by high-minded people. He wants to be thought of as an effective party politician, and he is an effective politician. Ask me, ask Chuck Todd, ask anyone.

There’s a perfect alignment between the reputation he wants, the reputation he has, and the reputation he deserves, in a way that’s unequaled among American politicians and that allows him to conduct himself with an even greater degree of shamelessness than Donald Trump himself. Unlike the all-id Trump, McConnell isn’t out of control; he’s just willing to be utterly ruthless in pursuit of his political objectives.

It would be wrong to see this as a zero-cost strategy. Most people who get into electoral politics do it, on some level, because they want to be liked and admired, and McConnell does not. A 2018 paper by Vanderbilt political scientist Larry Bartels shows that McConnell is strikingly disliked by both Democrats and Republicans, with Dems rating him about on par with the National Rifle Association and Republicans liking him less than college professors, environmentalists, or people on food stamps. But nevertheless, he persisted.

McConnell pioneered the unprecedented use of obstructionist tactics in 2009 and 2010 — going so far as to block action on even measures Republicans didn’t disagree with — in order to make American politics as contentious as possible, knowing that an ineffective policy response to the Great Recession would redound to Republicans’ benefit. He blocked a bipartisan statement on Russian interference in the 2016 presidential campaign, implicitly partnering with Vladimir Putin to put Trump in the White House. He held Scalia’s seat open until it could be filled with Neil Gorsuch, and he greeted Gorsuch’s vote to uphold Trump’s discriminatory travel ban with a sneer.

Now the odds are overwhelming that not only will he get to replace the slightly moderate Justice Kennedy with another hardline right-winger, but that he’ll be able to reap some partisan gain for his trouble.

Part of the genius of his shameless Calvinballing is that it not only blocks the opposition party, it frustrates them. Angry and frightened by the prospect of the Supreme Court moving further rightward, much of the progressive base is inevitably going to take out their rage not on Trump, McConnell, and vulnerable Senate Republicans like Dean Heller (R-NV) but on Democrats for not being able to make the right tactical choices to block him — just as much of the progressive rank-and-file reacted to disappointment with Democrats’ legislative productivity in 2009-2010 by sitting out the midterms.

by Matthew Yglesias, Vox | Read more:
Image: Twitter
[ed. The slimiest of the slime. See also: This Is the World Mitch McConnell Gave Us]

Wednesday, June 27, 2018

Unlocked And Loaded

Families Confront Dementia And Guns

With a bullet in her gut, her voice choked with pain, Dee Hill pleaded with the 911 dispatcher for help.

“My husband accidentally shot me,” Hill, 75, of The Dalles, Ore., groaned on the May 16, 2015, call. “In the stomach, and he can’t talk, please …”

Less than four feet away, Hill’s husband, Darrell Hill, a former local police chief and two-term county sheriff, sat in his wheelchair with a discharged Glock handgun on the table in front of him, unaware that he’d nearly killed his wife of almost 57 years.

The 76-year-old lawman had been diagnosed two years earlier with a form of rapidly progressive dementia, a disease that quickly stripped him of reasoning and memory.

“He didn’t understand,” said Dee, who needed 30 pints of blood, three surgeries and seven weeks in the hospital to survive her injuries.

As America copes with an epidemic of gun violence that kills 96 people each day, there has been vigorous debate about how to prevent people with mental illness from acquiring weapons. But a little-known problem is what to do about the vast cache of firearms in the homes of aging Americans with impaired or declining mental faculties.

Darrell Hill, who died in 2016, was among the estimated 9 percent of Americans 65 and older diagnosed with dementia, a group of terminal diseases marked by mental decline and personality changes. Many, like the Hills, are gun owners and supporters of Second Amendment rights. Forty-five percent of people 65 and older have guns in their household, according to a 2017 Pew Research Center survey.

But no one tracks the potentially deadly intersection of those groups.

A four-month Kaiser Health News investigation has uncovered dozens of cases across the U.S. in which people with dementia used guns to kill or injure themselves or others.

From news reports, court records, hospital data and public death records, KHN found 15 homicides and more than 60 suicides since 2012, although there are likely many more. The shooters often acted during bouts of confusion, paranoia, delusion or aggression — common symptoms of dementia. They killed people closest to them — their caretaker, wife, son or daughter. They shot at people they happened to encounter — a mailman, a police officer, a train conductor. At least four men with dementia who brandished guns were fatally shot by police. In cases where charges were brought, many assailants were deemed incompetent to stand trial.

Many killed themselves. Among men in the U.S., the suicide rate is highest among those 65 and older; firearms are the most common method, according to the Centers for Disease Control and Prevention.

These statistics do not begin to tally incidents in which a person with dementia waves a gun at an unsuspecting neighbor or a terrified home health aide.

Volunteers with Alzheimer’s San Diego, a nonprofit group, became alarmed when they visited people with dementia to give caregivers a break — and found 25 to 30 percent of those homes had guns, said Jessica Empeño, the group’s vice president.

“We made a decision as an organization not to send volunteers into the homes with weapons,” she said.

At the same time, an analysis of government survey data in Washington state found that about 5 percent of respondents 65 and older reported both some cognitive decline and having firearms in their home. The assessment, conducted for KHN by a state epidemiologist, suggests that about 54,000 of the state’s more than 1 million residents 65 and older say they have worsening memory and confusion — and access to weapons.

About 1.4 percent of those respondents 65 and older — representing about 15,000 people — reported both cognitive decline and that they stored their guns unlocked and loaded, according to data from the state’s 2016 Behavioral Risk Factor Surveillance System survey. Washington is the only state to track those dual trends, according to the CDC.

In a politically polarized nation, where gun control is a divisive topic, even raising concerns about the safety of cognitively impaired gun owners and their families is controversial. Relatives can take away car keys far more easily than they can remove a firearm, which is protected by the Second Amendment. Only five states have laws allowing families to petition a court to temporarily seize weapons from people who exhibit dangerous behavior.

But in a country where 10,000 people a day turn 65, the potential for harm is growing, said Dr. Emmy Betz, associate research director at the University of Colorado School of Medicine, a leading researcher on gun access and violence. Even as rates of dementia fall, the sheer number of older people is soaring, and the number of dementia cases is expected to soar as well.

By 2050, the number of people with dementia who live in U.S. homes with guns could reach between roughly 8 million and 12 million, according to a May study by Betz and her colleagues.

“You can’t just pretend it’s not going to come up,” Betz said. “It’s going to be an issue.”

Polling conducted by the Kaiser Family Foundation for this story suggests that few Americans are concerned about the potential dangers of elders and firearms. Nearly half of people queried in a nationally representative poll in June said they had relatives over 65 who have guns. Of those, more than 80 percent said they were “not at all worried” about a gun-related accident. (Kaiser Health News is an editorially independent program of the foundation.)

Dee Hill had ignored her husband’s demands and sold Darrell’s car when it became too dangerous for him to drive. But guns were another matter.

“He was just almost obsessive about seeing his guns,” Dee said. He worried that the weapons were dirty, that they weren’t being maintained. Though she’d locked them in a vault in the carport, she relented after Darrell had asked, repeatedly, to check on the guns he’d carried every day of his nearly 50-year law enforcement career.

She intended to briefly show him two of his six firearms, the Glock handgun and a Smith & Wesson .357 Magnum revolver. But after he saw the weapons, Darrell accidentally knocked the empty pouch that had held the revolver to the floor. When Dee bent to pick it up, he somehow grabbed the Glock and fired.

“My concern [had been] that someone was going to get hurt,” she said. “I didn’t in my wildest dreams think it was going to be me.”

by JoNel Aleccia and Melissa Bailey, Kaiser Health News via: Naked Capitalism | Read more:
Image: uncredited

Alao Yokogi, 1998

Jimmy Carter for Higher Office

Sunday mornings in Plains, Georgia, Mr. Jimmy wakes in his unchanged ranch home with the '70s appliances and same old Formica countertops at the usual hour of 5 A.M., and inevitably scribbles some Bible-lesson notes that he mostly never refers to, and then, after his ablutions and 7 A.M. breakfast with Rosalynn—oatmeal is a favorite—the Secret Service ferries him through town in a black car, past the gas station that was once his brother's, past his old campaign headquarters in a little warehouse, past the home Rosalynn was born in, to Maranatha Baptist Church.

The church is the bull's-eye of his stomping grounds—the verdant flatland upon which Plains sits, where he hunts and fishes. He receives vegetables from the farm where he grew up, a few miles outside town, which is now on the National Register of Historic Places. He visits every so often, and there's the old bedroom that belonged to Mr. Jimmy, with a model wooden ship and weathered copy of War and Peace, and there's the dining-room table at which the Carter family—two girls and two boys—sat to eat, or sat to read and eat, as Mr. Jimmy's mother, Lillian, insisted that her children always be reading. And there's the scrubby red-dirt tennis court built by Mr. Jimmy's daddy, Earl, a Sunday-school teacher himself, who employed a wicked slice to always beat his son.

About 40 Sundays a year, Mr. Jimmy materializes from thin air, flickering before us at Maranatha to lead Bible study, to say, No, the world's not going to end. Not just yet. Though he's elfin with age, you'd still instantly recognize him as our 39th president: with those same hooded ice-blue eyes, the same rectangular head, the same famous 1,000-watt smile. But when he teaches like this, he transforms from whatever your vision of Jimmy Carter is into someone different, some kind of 93-year-old Yoda-like knower, who in his tenth decade on earth still possesses that rarest of airy commodities: hope.

Hope is something that Mr. Jimmy thinks about a lot—and faith, too, from which hope rises in the first place. It's something that you're born with, faith, but also something you must re-apply every day, like a gel or cream. He says first you have faith in your mama, when you're suckling at her breast. And then you have faith in your people: the tight-knit circle of kin and neighbors in your town. Then—your country. He says what might be most important, though, is faith in a creator of some sort. Mr. Jimmy says you can fill in the blank: Muhammad, Buddha, Jesus…Gaia, Martians, T'Challa, king of Wakanda…

In front of the congregation—in spring and summer, autumn and winter—he perambulates the green carpet like old people sometimes do, as if on the deck of a ship on a rolling sea. He wears a turquoise bolo, somewhere between groovy and huh? His face is still elastic, the zygomatic muscles reflexively drawing his mouth into that smile, but his voice sometimes turns phlegmy without notice—and he starts coughing. His mind is a churning thing of wonder. His recall is sharp, his barbs of humor unexpected.

“So you didn't put 'em to sleep, right?” he says to the pastor, Brandon Patterson, on one April morning when he steps up to the lectern, flashing a mischievous grin. He turns to the overflow crowd. “You like our pastor okay?” Murmurings of assent. “Well, he just passed his 24th birthday and he got married and he got a dahwg.” Mr. Jimmy hangs on the word, and bends it, to laughter. “He and I used to argue about who loved their wife more, but now he's divided his love between a dahwg and his wife, so I think I'm ahead of him!”

Among ex-presidents, Mr. Jimmy blazes on. Gerald Ford and Ronald Reagan, both men who sandwiched him in office, are dead; George senior teeters; Bill Clinton tremors when tired and, at 71, has begun to fade before our eyes. (Meanwhile, Bill and Hillary have pocketed over $150 million from speeches.) George W. has retreated to a more low-profile, patrician life of painting and occasional aid trips to Africa, while Obama is just beginning a post-presidency that some have projected could be worth roughly $250 million in personal gain and includes the recent announcement of a multi-year production deal with Netflix.

With Carter now in his fourth decade as ex-president, his actual presidency feels more like a footnote, an aberration in the life of a holy man. The public servant in him, the impulse that led him to the presidency in the first place, has thrived in the aftermath of his former Beltway imprisonment. While he rejects pay-to-play speechmaking and appearances, his net worth—reportedly $7 million to $8 million—has come from the 30-plus books he's written, many of them spiritual in nature. His activism and advocacy across the globe—in particular his success in eradicating Guinea worm in Africa and Asia, from 3.5 million estimated cases in 1986 to 30 last year—led to the 2002 Nobel Peace Prize. (...)

If one were to judge by the sustained Sunday crowds, Maranatha Baptist Church has turned into an unlikely American pilgrimage site. Perhaps we're afflicted by a deeper national need, or lack, the kind that inspires searchers to travel hundreds, sometimes thousands, of miles and begin lining up in the dark, but it raises a bunch of personal and collective questions. After all, soul-searching is a by-product of having temporarily lost one's soul.

On one of my Sunday visits, last November, I arrived around 4:30 A.M. and was handed a scrap of paper with the number 15 scrawled on it, meaning I was 15th in line. The man doing the handing out was named George, dressed in blue slacks and a checked shirt with a red ball cap on his head. He said that some Sundays, if you're not there four or five hours ahead of time, you don't get in. The Sunday after Mr. Jimmy announced he had cancer, in August 2015, 1,800 people came to Plains, beginning to queue on Saturday night. (The highway patrol shut down the road out front; Mr. Jimmy did two lessons, one in a nearby auditorium, and still they turned people away.)

Today, George had arrived at his usual time, around 4 A.M., to find a young man—twitchy, half-awake, and chilled—in a suit, no tie, white shirt, standing out there on the front porch of the church in the dark, here to see Mr. Jimmy before it's too late. Everything at that hour seemed special in Plains to someone coming from the North, as I had. George's honeyed drawl, for one. And the silence was special, in the hour when the Muscogee ghosts give up night on the southern coastal plain, and everything is deep and still. George had his eye on the special sky now, scanning the canopy of stars.

“Last week we saw it three times,” he said. “We won't see it again for about ten days.”

He was talking about the International Space Station, which you could track with an app on your phone. On his phone, too, George showed some pictures of yesterday's fishing trip with Mr. Jimmy. No artifice, no braggadocio: just another fishing trip with Mr. Jimmy—who you can almost forget is ex-president of the United States down here—to add to the others. (...)

Our America, as summed up recently by The New York Times, is a place where “life expectancy has declined, suicide rates have risen, the opioid crisis has worsened, inequality has grown, and confidence in government has fallen.” But our democracy has survived ragged, if not broken, times before. In 1971 the Times asked in a headline, IS AMERICA FALLING APART? Then put a fine point on it: “America is a prewar country, psychologically unprepared for one thing to go wrong,” wrote Anthony Burgess. “Now everything [seems] to be going wrong. Hence the neurosis, despair, the Kafka feeling that the whole marvelous fabric of American life is coming apart at the seams.”

If we were fully unraveling in 1971, what was 1974, then? What were 1776 and 1862? We were coming apart at the seams in 1929 and 1942, 1963 and 2001. It's possible we've been coming apart since our inception. Perhaps it's a shortcoming of our American imagination, or national narcolepsy—and part of our volcanic creation story, too—to believe that this moment, right now, may be the worst moment ever, over and over and over again. If we forget other dysfunctional presidents, from John Quincy Adams and John Tyler to LBJ and Nixon, we might believe that this president is the most irrational, unstable, and narcissistic of all. The potential split atom of our democracy forever threatens to be our annihilation.

But it doesn't mitigate these times to say there have been times like them before. It only raises the question: To whom might we appeal, or where might we find not just a voice of reason but one to remind us that—despite division and gun violence, deep-seated issues of race and class—the experiment is still worthy and vital?

Perhaps this is why people come to Plains. Because to gaze upon Jimmy Carter, to look upon a face marked by time—the charismatic handsomeness of his 50s has softened, hollowed, and transformed into the weatherworn visage of his 90s—is to see someone shorn of ambition, trying to tell a truth, or his truth. Somewhere inside the man we knew as president, there's always been Mr. Jimmy, the seeker, who over time grew in concentration, no longer caring for our approval but, in a weird way, for the state of our national soul. If he was once criticized as a politician for being egomaniacal or sanctimonious, it's easier, with his presidency in the deep past, to accept Jimmy Carter as a human being whose heart might have always been in the right place. In church, teaching from the Bible, Mr. Jimmy becomes to his followers the purest distillation, then, of some post-presidency ideal, some secular saint. On the hallway bulletin board are pinned pictures of community events, the Carters beaming with locals. The butterfly garden out back was built by Rosalynn. And at these Sunday-school meetings, her husband steadies our twitchiness in singsong tones, with a personal psalm of history, Bible study, current events, and autobiography.

“There's no way to separate completely the responsibilities of public service and also some basic moral and ethical principles on which we base the finest aspects of our life,” Mr. Jimmy says, “and we cure the problems in our society.” He likes to quote a favorite theologian, Reinhold Niebuhr, who said, “The sad duty of politics is to establish justice in a sinful world.” Carter says, for better or worse, he tried. In the Oval Office every morning before beginning his day's work, he would stand before the huge globe situated by the Resolute desk and touch his finger on Moscow, trying to put himself in Brezhnev's shoes. He would think: How can I not provoke him today?

“We never shot a bullet, we never dropped a bomb, we never launched a missile,” says Mr. Jimmy of his time in office. It's a fact he's proud of, especially given that since World War II, America's been at war with about 20 countries. China, on the other hand, hasn't been in a major war since 1979. “What they have done is to use their enormous resources to benefit their own people,” he says. “China has 14,000 miles of fast-speed rail.”

Look at the Universal Declaration of Human Rights, he tells us one Sunday morning: “Thirty little, tiny paragraphs that you can read over in five minutes.… A lot of them are not being honored by our country in particular.… That's why we have wars today. All of those 30 paragraphs guarantee that women and men should have equal pay and equal opportunity for advancement and equal rights.… We have a long way to go.”

If we as a nation suffer from thin-skinned righteousness, or ideological arrogance, he says he himself has suffered the sins of pride, thinking himself superior at times—to women, to those of different background. He's not proud of this in the least, but he has the courage to admit it.

“Who decided whether you'd be kind or filled with hatred?” he asks on another morning. “Who decided whether you would forgive other people or not? Who decided whether you'd be honest and tell the truth or not? Who decided whether you would be generous or not? Who decided whether you'd be filled with love or not?”

Then he answers his own questions: “Every one of us,” says Mr. Jimmy, “has our own free decision to answer the question This is the kind of person I'm going to be. It's not a decision that your parents can make for you, or your wife can make for you, or your husband or your friends. Everybody in here has the right to decide This is the kind of person I'm going to be. And if you haven't been the kind of person that you are proud of so far, you're free from now on the rest of your life to correct your mistakes.”

by Michael Paterniti, GQ |  Read more:
Image: Matt Martin
[ed. A decent man in an indecent profession.]

Donald Hall’s Life Work

In 2011, the poet Donald Hall—who passed away at his home in rural New Hampshire, on Saturday, at age eighty-nine—received the National Medal of Arts from President Obama, the highest award granted to an artist by the American government. For several years, I have kept the above photo of the ceremony, which takes place in the East Room of the White House, saved to my phone. Looking at it makes me instantly glad: Hall appears pleasingly nuts, with scraggly wisps of gray hair sticking out every which way. Obama has such a wide and earnest grin. They are both laughing, as if to say, “What a wild and beautiful thing!”

Hall, who was the first poetry editor of The Paris Review, beginning in 1953, and the Poet Laureate of the United States, from 2006 to 2007, published more than fifty books in his lifetime, across several genres. He had fixations: baseball, loss, devotion, New England. He spent much of his later life at Eagle Pond Farm in Wilmot, New Hampshire, in a white clapboard house with green shutters—the same house where his grandmother was born, in 1878, and his mother, in 1903. “Only in the rural south, and in rural New England, do American houses willfully contain the history of a family,” he wrote in the introduction to “Eagle Pond,” a tender and elegiac book.

Much like the poet and essayist Wendell Berry, Hall had deeply held beliefs about how our attachments to our native places were being systematically cleaved, at great spiritual and practical cost. Hall, too, could be a little crabby. I always loved this about him—the earned misanthropy of a person who has witnessed too many absurdities, and endured the relentless commodification of what was once pure. He was especially agitated by the existence of Vermont. (I understood this to be a delightful interstate spat, in the grand tradition of New York vs. New Jersey, or any other number of semi-inscrutable regional feuds.) In an essay titled “Reasons for Hating Vermont,” he cites the state’s urban pilgrims—born-rich city-dwellers who voyage north on long weekends to get their hands dirty, but not really—as loathsome, a plague on the landscape. “In Vermont when inchling trout are released into streams, a state law requires that they be preboned and stuffed with wild rice delicately flavored with garlic and thyme,” he writes. Touché, Hall!

Hall’s essays are often polemical, and frequently very funny. His book “Life Work,” from 1993, is a meditation on the nature and practice of work (“I’ve never worked a day in my life,” it opens) and the strange rituals that humans devise to navigate our days (he subscribes to Baudelaire’s notion that work is actually less boring than having to amuse yourself). The goal of work—the bliss of writing, for Hall—is in the way it collapses time. “In the best part of the best day, absorbedness occupies me from footsole to skulltop. Hours or minutes or days—who cares?—lapse without signifying.” He couldn’t much abide reading “junk prose,” or watching television or movies, but he was mesmerized and thrilled by sports, and especially the Red Sox: “I sit with my mouth open, witlessly enraptured.”

Hall was born in Connecticut, in 1928, and moved to New Hampshire in 1975, with his second wife, the poet Jane Kenyon. She died at Eagle Pond, in 1995, from leukemia. They met at the University of Michigan (she was a student there, and he was a professor, nineteen years her senior) and married shortly thereafter. Following her death, Hall wrote deeply and endlessly about his grief, the way it erased everything. “The year endures without punctuation,” he writes in “Without,” a poem from 1995, “the body is a nation tribe dug into stone / assaulted white blood broken to fragments.” Years ago, I pressed a bit of a New Hampshire fern—the green frond of an Interrupted Fern, to be more precise—into my copy of “Without.” Each time I open it, the fern falls out, and I inevitably think of Kenyon’s poem “Let Evening Come,” in which she suggests that we should submit to endings with grace and assurance:
Let it come, as it will, and don’t
be afraid. God does not leave us
comfortless, so let evening come
It’s good advice, if difficult to metabolize, and harder still to follow. I have certainly bucked, wildly and without dignity, against loss. How to quiet the panic that arises when you believe you’ve been involuntarily divested of love?

by Amanda Petrusich, New Yorker | Read more:
Image: Charles Dharapak / AP / REX / Shutterstock

Tuesday, June 26, 2018

Secret Settings Hidden in Your Android Phone Will Make It Feel Twice as Fast

Modern Android phones are so much faster and smoother than they used to be even a few short years ago. Of course, speed on a smartphone is just like battery life: you can never have enough. The latest-generation Qualcomm Snapdragon processor has been combined with extra RAM in 2018 Android flagship phones to create the smoothest and most powerful Android experience to date. In fact, technology has progressed so much that a couple of recent Android flagship phones have been found to be even faster than current-generation iPhone models running iOS 11. That’s right, the Samsung Galaxy S9 and OnePlus 6 both actually managed to beat the iPhone X in real-life speed tests on YouTube.

Phones like the OnePlus 6, Samsung’s latest flagships, the LG G7 ThinQ, and the new HTC U12+ are all wonderfully fast right out of the box. But what you might not know is that there’s a simple secret setting hiding inside all of these phones that can make them even faster. And the best part is this hidden setting can be found on every single Android phone out there. So whether you have a brand new OnePlus 6 or an old Galaxy S7, there’s a wonderfully easy way to speed up your phone — and all it takes is a few seconds.

We’ve covered this trick before here on the site. In fact, we cover it at least a couple of times each year. Why? Because more and more people learn about these secret settings each time we write about them, and we receive tons of thankful feedback each and every time. People can’t believe how much of an impact tweaking three simple settings on a smartphone can have. But believe us when we tell you, the impact is huge.

Here’s how it works: each time you open an app, close an app, switch between apps, and so on, your phone displays a transition animation to take you from one screen to the next. It’s the sort of thing that fades to the background as you use your phone, so you probably don’t even notice these transition animations anymore. But playing these transition animations on your phone’s display takes time, so speeding up the animations results in faster app loads.

We’re sure that you see where we’re going by now.

Inside every Android phone is a secret Settings menu called “Developer options” that’s filled with all sorts of advanced options. There’s a simple trick to enable this special section of the Settings app, and here’s how to do it:
  1. Open the Settings app on your phone
  2. If your handset runs Android 8.0+, tap System (skip this step if you’re on an earlier version of Android)
  3. Scroll down and tap About phone
  4. Scroll down again and tap Build number 7 times consecutively
Important note: if you’re not a savvy user, we recommend that you don’t mess with any of the settings in this section other than the ones we’re about to show you. You can’t do any real damage to your phone, but better safe than sorry.

Once you’ve enabled the Developer options section, open it and scroll down until you see the following three settings:
  • Window animation scale
  • Transition animation scale
  • Animator duration scale
You’ll see a “1x” next to each of those settings by default, and that’s what we’re going to adjust to speed up your phone. Simply tap each of those three settings and change “1x” to “.5x,” then exit the Developer options section. To be clear, you want to select “.5x” and not “5x.” Each animation will then take half as long to play, so things like opening and closing apps will feel twice as fast. And since we spend a ton of time opening and closing apps on our phones, the impact is huge.
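If you'd rather script the change than tap through menus, the same three values can be written from a computer over adb (this assumes USB debugging is enabled and the `adb` tool is installed; the keys below are the standard global settings behind the Developer options toggles):

```shell
# Halve all three animation scales — equivalent to choosing ".5x" in Developer options
adb shell settings put global window_animation_scale 0.5
adb shell settings put global transition_animation_scale 0.5
adb shell settings put global animator_duration_scale 0.5

# Read a value back to confirm it took effect
adb shell settings get global window_animation_scale
```

Setting the values back to 1.0 (or to 0 to disable animations entirely) undoes the change.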

by Zach Epstein, BGR | Read more:
Image: Zach Epstein, BGR
[ed. This really works.]

Monday, June 25, 2018

Ways to Think About Machine Learning

We're now four or five years into the current explosion of machine learning, and pretty much everyone has heard of it. It's not just that startups are forming every day or that the big tech platform companies are rebuilding themselves around it - everyone outside tech has read the Economist or BusinessWeek cover story, and many big companies have some projects underway. We know this is a Next Big Thing.

Going a step further, we mostly understand what neural networks might be, in theory, and we get that this might be about patterns and data. Machine learning lets us find patterns or structures in data that are implicit and probabilistic (hence ‘inferred’) rather than explicit, that previously only people and not computers could find. These techniques address a class of questions that were previously ‘hard for computers and easy for people’, or, perhaps more usefully, ‘hard for people to describe to computers’. And we’ve seen some cool (or worrying, depending on your perspective) speech and vision demos.

I don't think, though, that we yet have a settled sense of quite what machine learning means - what it will mean for tech companies or for companies in the broader economy, how to think structurally about what new things it could enable, or what machine learning means for all the rest of us, and what important problems it might actually be able to solve.

This isn't helped by the term 'artificial intelligence', which tends to end any conversation as soon as it's begun. As soon as we say 'AI', it's as though the black monolith from the beginning of 2001 has appeared, and we all become apes screaming at it and shaking our fists. You can’t analyze ‘AI’.

Indeed, I think one could propose a whole list of unhelpful ways of talking about current developments in machine learning. For example:
  • Data is the new oil
  • Google and China (or Facebook, or Amazon, or BAT) have all the data
  • AI will take all the jobs
  • And, of course, saying AI itself.
More useful things to talk about, perhaps, might be:
  • Automation
  • Enabling technology layers
  • Relational databases.
Why relational databases? They were a new fundamental enabling layer that changed what computing could do. Before relational databases appeared in the late 1970s, if you wanted your database to show you, say, 'all customers who bought this product and live in this city', that would generally need a custom engineering project. Databases were not built with structure such that any arbitrary cross-referenced query was an easy, routine thing to do. If you wanted to ask a question, someone would have to build it. Databases were record-keeping systems; relational databases turned them into business intelligence systems.
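Evans's point is easy to demonstrate with any modern relational database. Here is a small hypothetical sketch using Python's built-in sqlite3 module (the schema and data are invented for illustration): the once-custom question — 'all customers who bought this product and live in this city' — is now a routine one-line join.

```python
import sqlite3

# An in-memory database with two related tables (hypothetical schema)
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    CREATE TABLE purchases (customer_id INTEGER, product TEXT);
""")
db.executemany("INSERT INTO customers VALUES (?, ?, ?)",
               [(1, "Ada", "Cleveland"), (2, "Ben", "Cleveland"), (3, "Cy", "Austin")])
db.executemany("INSERT INTO purchases VALUES (?, ?)",
               [(1, "typewriter"), (2, "radio"), (3, "typewriter")])

# The ad-hoc cross-referenced query: customers who bought this product AND live in this city.
# No custom engineering project required -- just a join.
rows = db.execute("""
    SELECT c.name
    FROM customers c
    JOIN purchases p ON p.customer_id = c.id
    WHERE p.product = 'typewriter' AND c.city = 'Cleveland'
""").fetchall()
print(rows)  # -> [('Ada',)]
```

Any other cross-reference — by city, by product, by both — is just a different WHERE clause, which is exactly the shift from record-keeping to business intelligence.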

This changed what databases could be used for in important ways, and so created new use cases and new billion dollar companies. Relational databases gave us Oracle, but they also gave us SAP, and SAP and its peers gave us global just-in-time supply chains - they gave us Apple and Starbucks. By the 1990s, pretty much all enterprise software was a relational database - PeopleSoft and CRM and SuccessFactors and dozens more all ran on relational databases. No-one looked at SuccessFactors or Salesforce and said "that will never work because Oracle has all the database" - rather, this technology became an enabling layer that was part of everything.

So, this is a good grounding way to think about ML today - it’s a step change in what we can do with computers, and that will be part of many different products for many different companies. Eventually, pretty much everything will have ML somewhere inside and no-one will care.

An important parallel here is that though relational databases had economy of scale effects, there were limited network or ‘winner takes all’ effects. The database being used by company A doesn't get better if company B buys the same database software from the same vendor: Safeway's database doesn't get better if Caterpillar buys the same one. Much the same actually applies to machine learning: machine learning is all about data, but data is highly specific to particular applications. More handwriting data will make a handwriting recognizer better, and more gas turbine data will also make a system that predicts failures in gas turbines better, but the one doesn't help with the other. Data isn’t fungible.

This gets to the heart of the most common misconception that comes up in talking about machine learning - that it is in some way a single, general purpose thing, on a path to HAL 9000, and that Google or Microsoft have each built *one*, or that Google 'has all the data', or that IBM has an actual thing called ‘Watson’. Really, this is always the mistake in looking at automation: with each wave of automation, we imagine we're creating something anthropomorphic or something with general intelligence. In the 1920s and 30s we imagined steel men walking around factories holding hammers, and in the 1950s we imagined humanoid robots walking around the kitchen doing the housework. We didn't get robot servants - we got washing machines.

Washing machines are robots, but they're not ‘intelligent’. They don't know what water or clothes are. Moreover, they're not general purpose even in the narrow domain of washing - you can't put dishes in a washing machine, nor clothes in a dishwasher (or rather, you can, but you won’t get the result you want). They're just another kind of automation, no different conceptually to a conveyor belt or a pick-and-place machine. Equally, machine learning lets us solve classes of problem that computers could not usefully address before, but each of those problems will require a different implementation, and different data, a different route to market, and often a different company. Each of them is a piece of automation. Each of them is a washing machine.

by Benedict Evans |  Read more:
Image: uncredited

Sunday, June 24, 2018

Mardi Gras in Theory and Practice


Mardi Gras did not seem like it would be my kind of holiday. It is characterized, in popular stereotypes, by three things: beads, beer, and breasts. As a teetotaler, I do not drink beer. As a person of taste, I am disinclined to cover myself in plastic beads. And while I am theoretically pro-breasts, I feel no particular need to see them publicly displayed from second-floor balconies. Making the prospect even less appealing, my apartment is in the heart of the French Quarter, a place with unpleasantly high quantities of debauchery even in the off-season. (A man peed on my house the other week.) Mardi Gras promised to be a loud, messy spectacle, the worst of New Orleans magnified and multiplied. I had friends who were leaving town to escape it. They seemed wise.

I also quickly began to realize what everyone from here knows already: it is not just a single day, “Fat Tuesday.” It is Carnival Season, a month-long celebration beginning in early January on Twelfth Night and lasting through Ash Wednesday. We are not talking about an afternoon of unusually heavy drinking by a throng of tourists on Bourbon Street. We are talking about over 50 parades, an influx of visitors that multiplies the city’s population by four, and a billion dollars in Mardi Gras related spending. A few weeks before everything descended into chaos, local news reported that the sewage department had extracted 92,000 pounds of leftover Mardi Gras beads from the city’s catch basins. I honestly did not understand how that many pounds of Mardi Gras beads could end up in the drainage system. I would soon understand.

by Nathan J. Robinson, Current Affairs |  Read more:
Image: Nathan J. Robinson
[ed. Wonderful essay!]

Saturday, June 23, 2018

A Walk to Kobe



I gazed at Kobe harbour, sparkling leadenly far below, and listened carefully, hoping to pick up some echoes from the past, but nothing came to me. Just the sounds of silence. That’s all. But what are you going to do? We’re talking about things that happened over thirty years ago.

Over thirty years ago. There is one thing I can say for certain: the older a person gets, the lonelier he becomes. It’s true for everyone. But maybe that isn’t wrong. What I mean is, in a sense our lives are nothing more than a series of stages to help us get used to loneliness. That being the case, there’s no reason to complain. And besides, who would we complain to, anyway?

by Haruki Murakami, Granta |  Read more:
Image: uncredited

Mind Control

Barbara Ehrenreich cuts an unusual figure in American culture. A prominent radical who never became a liberal, a celebrity, or a reactionary, who built a successful career around socialist-feminist writing and activism, she embodies an opportunity that was lost when the New Left went down to defeat. Since the mid-1970s she has devoted her work to an unsparing examination of what she viewed as the self-involvement of her professional, middle-class peers: from their narcissism and superiority in Fear of Falling and Nickel and Dimed to their misplaced faith in positive thinking in Bright-Sided. Again and again, she has offered a critique of the world they were making and leaving behind them. She is, in other words, both a boomer and the opposite.

At first glance, her new book, Natural Causes, is a polemic against wellness culture and the institutions that sustain it. What makes the argument unusual is its embrace of that great humbler, the end of life. “You can think of death bitterly or with resignation ... and take every possible measure to postpone it,” she offers at the beginning of the book. “Or, more realistically, you can think of life as an interruption of an eternity of personal nonexistence, and seize it as a brief opportunity to observe and interact with the living, ever-surprising world around us.” With a winning shrug, she declares herself “old enough to die” and have her obituary simply list “natural causes.”

Ehrenreich contemplates with some satisfaction not just the approach of her own death but also the passing of her generation. As the boomers have aged, denial of death, she argues, has moved to the center of American culture, and a vast industrial ecosystem has bloomed to capitalize on it. Across twelve chapters, Ehrenreich surveys the health care system, the culture of old age, the world of “mindfulness,” and the interior workings of the body itself, and finds a fixation on controlling the body, encouraged by cynical and self-interested professionals in the name of “wellness.” Without opposing reasonable, routine maintenance, Ehrenreich observes that the care of the self has become a coercive and exploitative obligation: a string of endless medical tests, drugs, wellness practices, and exercise fads that threaten to become the point of life rather than its sustenance. Someone, obviously, is profiting from all this.

While innumerable think pieces have impugned millennials’ culture of “self-care”—and argued that the generation born in the 1980s and ’90s is fragile, consumerist, and distracted—Ehrenreich redirects such criticisms toward an older crowd. Her book sets out to refute the idea that it’s possible to control the course and shape of one’s own biological or emotional life, and dissects the desire to do so. “Agency is not concentrated in humans or their gods or favorite animals,” she writes. “It is dispersed throughout the universe, right down to the smallest imaginable scale.” We are not, that is, in charge of ourselves. (...)

Natural Causes opens with her decision to reject a series of medical interventions. Ehrenreich is in her seventies and has survived breast cancer but, “in the last few years,” she writes, she has “given up on the many medical measures—cancer screenings, annual exams, Pap smears, for example—expected of a responsible person with health insurance.” She describes making this choice after a series of troubling experiences: First, her primary care doctor talked her into a bone scan, then diagnosed her with osteopenia—thinning of the bones—“a condition that might have been alarming if I hadn’t found out that it is shared by nearly all women over the age of thirty-five.” Bone scans, though, have been heavily promoted by the manufacturer of the osteopenia drug, which itself turns out to cause bone thinning. Next, she got a false positive on a mammogram and decided never to get another.

Even though she showed no signs of sleep apnea, her dentist wanted her to get a test for it, “after which I could buy the treatment from her: a terrifying skull-shaped mask that would supposedly prevent sleep apnea and definitely extinguish any last possibility of sexual activity.” The risk of sudden death in her sleep, she decides, is tolerable. She turns down colonoscopies, certain that she’ll die of something else before colon cancer kills her anyway. She fires her doctor after he suspends his ordinary practice and offers “concierge care” instead—pricey, constant access and a heightened testing regime.

Ehrenreich, who has a Ph.D. in cell biology, isn’t opposed to scientific medicine. But she is alert to the power dynamics that characterize a patient-doctor relationship and the ways those dynamics can influence patients’ decisions: Some will seek or accept treatments that won’t help with their condition, simply because so much power is invested in the doctor. Ehrenreich quotes at length from a 1956 article titled “Body Ritual Among the Nacirema” (“American” backwards), which describes an American hospital through an ethnographer’s eye:
Few supplicants [patients] in the temple are well enough to do anything but lie on their hard beds. The daily ceremonies, like the rites of the holy-mouth-men [dentists], involve discomfort and torture. With ritual precision, the vestals awaken their miserable charges each dawn and roll them about on their beds of pain while performing ablutions, in the formal movements of which the maidens are highly trained. At other times they insert magic wands in the supplicant’s mouth or force him to eat substances which are supposed to be healing. From time to time the medicine men come to their clients and jab magically treated needles into their flesh.
Stripped of the authority of Western medicine, the treatments the article describes sound like cruel rituals. “The fact that these temple ceremonies may not cure, and may even kill the neophyte,” the article goes on, “in no way decreases the people’s faith in the medicine men.”

A bit wryly, Ehrenreich points out that she’s not the anti-empirical one in this debate. Doctors have been quite resistant to so-called “evidence-based medicine”—the dispensing of treatment according to quantitative evidence rather than medical discretion. And, accustomed to the present system, many patients now worry that anything less than constant testing and maximal intervention would leave them at risk: “An internist in Burlington, North Carolina, reports that when he told a 72-year-old patient that she did not need many of the tests she was expecting in her annual physical, she wrote a letter to the local paper about him as an example of ‘socialized medicine.’ ” Doctors and hospitals use these expectations to drive up demand and prices, and patients, afraid and intimidated, submit.

The way Americans assent to such treatments fits more broadly into a culture of arduous self-improvement regimens. Here, Ehrenreich speaks as an inveterate gym rat, a participant in the astonishing rise of the workout since the 1970s. She sees the ascent of exercise culture in part as a continuation of women’s reclamation of their bodies in the 1970s, and in part as an example of the retreat from public concerns and move toward individualism that many of her peers made around the same time. “I may not be able to do much about grievous injustice in the world, at least not by myself or in very short order, but I can decide to increase the weight on the leg press machine by twenty pounds and achieve that within a few weeks,” she writes. “The gym, which once looked so alien and forbidding to me, became one of the few sites where I could reliably exert control.” What was a consolation, however, quickly evolved into a prize. Working out became a status symbol, a form of conspicuous consumption for a professional middle class bereft of purpose; and it became a disciplinary device, part of a culture that inflicts “steep penalties for being overweight.”

Once associated with play, exercise is now closer to a form of labor: measured, timed, and financially incentivized by employers and insurers. Like any kind of alienated labor, it assumes and intensifies the division between mind and body—indeed, it involves a kind of violence by the mind against the body. Ehrenreich is tired of being told to “crush your workout,” of being urged to develop “explosive strength” through a “warrior” routine. She cites the copy from an advertisement for a home fitness machine: “A moment of silence please, for my body has no idea what I’m about to put it through.” Exercise, for some reason, has become a struggle to the death. As Oscar Pistorius—the amputee and Olympic runner convicted of murder in 2015—has tattooed on his back, “I beat my body and make it my slave / I bring it under my complete subjection.”

While workout culture requires the strict ordering of the body, mindfulness culture has emerged to subject the brain to similarly stringent routines. Mindfulness gurus often begin from the assumption that our mental capacities have been warped and attenuated by the distractions of our age. We need re-centering. Mindfulness teaches that it is possible through discipline and practice to gain a sense of tranquility and focus. Such spiritual discipline, often taking the form of a faux-Buddhist meditation program, can of course be managed through an app on your phone, or, with increasing frequency, might be offered by your employer. Google, for example, keeps on staff a “chief motivator,” who specializes in “fitness for the mind,” while Adobe’s “Project Breathe” program allocates 15 minutes per day for employees to “recharge their batteries.” This fantastical hybrid of exertion and mysticism promises that with enough effort, you too can bend your mind back into shape.

“Whichever prevails in the mind-body duality, the hope, the goal—the cherished assumption,” Ehrenreich summarizes, “is that by working together, the mind and the body can act as a perfectly self-regulating machine.” In this vision, the self is a clockwork mechanism, ideally adapted by natural selection to its circumstances and needing upkeep only in the form of juice cleanses, meditation, CrossFit, and so on. Monitor your data forever and hope to live forever. Like workout culture, wellness is a form of conspicuous consumption. It is only the wealthy who have the resources to maintain the illusion of an integral and bounded self, capable of responsible self-care and thus worthy of social status. The same logic says that those who smoke (read: poor), or don’t eat right (poor again), or don’t exercise enough (also poor) have personally failed and somehow deserve their health problems and low life expectancy.

Of course, the body cannot really be mastered this way. For Ehrenreich, in fact, the body is not even a single thing, but rather a continuous, contradictory process. Immunology—her academic specialty—hinges on an essentially military metaphor of distinction between self and nonself: The immune system protects the homeland by destroying invaders. What, then, are we to make of routine episodes of intrabody conflict? There are obvious cases, such as cancer and autoimmune disorders. But Ehrenreich points out that even something as ordinary as menstruation appears to be the product of the adaptive struggle over resources between mother and fetus, an “arms race ... between the human endometrium and the human embryo/placental combination.” The body, like the body politic pictured on the frontispiece of Hobbes’s Leviathan, only gives the appearance of unity: It’s made of a “collection of tiny selves.” And for that matter, there’s not really a king to impose order. (...)

But Ehrenreich’s universe hums with life and activity. It’s warm, not cold. She wants to join it in her final years, not leave it behind by cloistering herself in the clinic, the gym, or the spa. For the elderly today, “the price of survival is endless toil” to keep fit, along with incessant trips to the doctor and avoiding all good food, right up till death. She’s not interested. She still works out, though less intensely than before, and she stretches every day—some of it even “might qualify as yoga.” “Other than that, I pretty much eat what I want and indulge my vices, from butter to wine. Life is too short to forgo these pleasures, and would be far too long without them.”

by Gabriel Winant, TNR | Read more:
Image: uncredited

Michiko Kakutani on 'The Death of Truth'

Forget Michael Wolff’s Fire and Fury, forget Hillary Clinton’s and James Comey’s ridiculously self-serving memoirs. Former chief book critic for The New York Times Michiko Kakutani has written the first great book of the Trump administration. The Death of Truth: Notes on Falsehood in the Age of Trump (out July 17th) is a fiery polemic against the president and should go down as essential reading.

In nine exquisitely crafted broadsides, the 63-year-old Pulitzer winner calls upon her vast knowledge of literature, philosophy and politics to serve up a damning state of the union. She cites those you might expect from the authoritarian canon (George Orwell, Aldous Huxley, Hannah Arendt) but easily pivots to David Foster Wallace, Spike Jonze and Tom Wolfe. She deftly traces the history of leftist postmodern academics who helped usher in relativism and led us away from objective truths, but saves some of her most withering attacks for the right-wing media (Fox News, Breitbart, et al.) that set the stage for a dangerous demagogue like Trump.

It’s the fluidity and grace of her prose, however, that leave the reader amazed by Kakutani's virtuoso talent and command. "Trump’s ridiculousness, his narcissistic ability to make everything about himself, the outrageousness of his lies, and the profundity of his ignorance," she writes, "can easily distract attention from the more lasting implications of his story: how easily Republicans in Congress enabled him, undermining the whole concept of checks and balances set in place by the founders; how a third of the country passively accepted his assaults on the Constitution; how easily Russian disinformation took root in a culture where the teaching of history and civics had seriously atrophied.”

The Death of Truth is a clear-eyed, if dismal, blueprint for how we got here and why our society has been pushed to the very brink. Rolling Stone reached Kakutani by email for the following exchange:

What was the genesis of this book?

Like many people, I became increasingly alarmed during the 2016 campaign and the first year of the Trump administration by the full-on war being waged on the very idea of truth. The Washington Post estimated that President Trump emits nearly six false or misleading claims a day. And it's not just the liar-in-chief who is spreading "alternative facts" and assailing reason and science; it's also his political and media enablers, aided and abetted by Russian trolls. The consequences for our democracy are grave: The lies spewed forth by Trump and company are promoting division and discord in the country at large, inflaming bigotry and hatred and elevating partisanship and tribal politics over shared values and the democratic ideals embodied in the Constitution. With the erosion of truth, we are made susceptible to propaganda (from the Russians, the White House and the likes of the NRA), our institutions are undermined, and rational public discourse is imperiled.

One of the things I wanted to do in The Death of Truth was explore some of the larger social and political dynamics that fueled the rise of Trump and brought America to the point where a third of the country will casually shrug off hard facts about everything from the size of inaugural crowds to the crime rate among immigrants. Those broader dynamics include the toxic partisanship that increasingly afflicts our politics; the merging of news and politics with entertainment; the growing populist disdain for expertise; the embrace of subjectivity and relativism by both the right and left; the growth of online filter bubbles that segregate us into silos of like-minded users; and the viral spread of misinformation and conspiracy theories on the web. Trump is both a bizarro-world apotheosis of many larger trends undermining truth today, and a flame-thrower who is accelerating these alarming attitudes.

One of your last pieces for the Times was seen as comparing Trump to Hitler. This book takes that case further. Did you hesitate to go there?

There are personality traits in common – toxic narcissism, a fondness for superlatives, an instinct for lying, bullying and manipulation. And parallels can be drawn between Hitler's ascent and the rise of Trump: from his translation of his own mendacity into a shameless propaganda machine, to his Machiavellian exploitation of his audiences' fears and resentments, to other politicians' craven failure to stand up to him.

This is not to draw a direct analogy between today's circumstances and the overwhelming horrors of the World War II era, but to look at some of the conditions and attitudes – what Margaret Atwood has called the "danger flags" – that make a people susceptible to demagoguery, and nations easy prey for would-be autocrats. And to remind readers of the fragility of democracy – of how quickly the rule of law can be broken and how rapidly civil liberties can erode.

America is being bombarded with disinformation from the White House and its allies – designed, you write, to keep the population not only misled but paranoid and off-balance. Do you see any solutions or ways to combat this crisis?

The role of a free and independent press has never been more important, and investigative reporters – working for newspapers, magazines, online outlets, radio and television – have been doing vital, necessary work, trying to untangle Trump and his campaign's relationship with Russia, and expose the culture of lying and corruption that has flourished under his administration, while sounding warning bells about the consequences of his assault on truth.

The problem is that such reports do not reach many of the president's most ardent supporters, who live in Fox News and Sinclair Broadcasting silos, and who shrug off any news that does not ratify their pre-existing beliefs. At the same time, the volume and velocity of Trump's lies, his multiplying scandals and violations of norms threaten to overwhelm the public, resulting in numbness and cynicism – the very traits that autocrats (like Vladimir Putin) rely upon to sabotage dissent and strengthen their own hold on power. (...)

Did the media fail in its most basic duty during the 2016 campaign?

In pursuit of the clicks and eyeballs that Trump generated, the media gave the former reality-TV star an estimated $5 billion in free campaign coverage. Many outlets paid more attention to scandals and questions of personality than to substantive matters of policy (like the consequences a Trump administration would have on, say, national security, health care, immigration, the budget), and more attention to Hillary Clinton's emails than to the Trump campaign's entanglements with Russia. Like James Comey, much of the press assumed that Clinton was going to win the election, and that assumption wasn't only wrong – it also affected coverage.

H.L. Mencken wrote: "Democracy is the theory that the common people know what they want, and deserve to get it good and hard." Isn't that exactly what we are seeing with Trump? Is this our own reckoning of 50-odd years of partisan fighting and a failure of our political class?

Trump tapped into a lot of middle-class and working-class disillusion with the political establishment, and into economic worries and resentments that ballooned in the wake of the 2008 financial crash. His promises to "drain the swamp" and reduce taxes on the middle class, however, turned out to be lies: Since taking office, he has made the swamp deeper and wider than ever, presiding over an administration filled with grifters and dark money – an administration that's delivered tax cuts not to ordinary people, but to corporations and the very rich. His surprise election blindsided the political and media establishment, which underestimated the anti-elitist sentiment in the country and the toxic efficacy of Trump's fear-mongering, and which was also slow to recognize the dangerous levels of misinformation being spread by the alt-right and Russia on the web.

by Sean Woods, Rolling Stone | Read more:
Image: Petr Hlinomaz