Wednesday, May 17, 2017

How Boomers Ruined the World

The day before I finished reading A Generation of Sociopaths, who should pop up to prove Bruce Cannon Gibney’s point, as if he had been paid to do so, but the notorious Joe Walsh (born 1961), former congressman and Obama denigrator. In answer to talkshow host Jimmy Kimmel’s plea for merciful health insurance, using his newborn son’s heart defect as an example, Walsh tweeted: “Sorry Jimmy Kimmel: your sad story doesn’t obligate me or anyone else to pay for somebody else’s health care.” Gibney’s essential point, thus proved, is that boomers are selfish to the core, among other failings, and as a boomer myself, I feel the “you got me” pain that we all ought to feel but so few of us do.

Gibney is about my daughter’s age – born in the late 1970s – and admits that one of his parents is a boomer. He has a wry, amusing style (“As the Boomers became Washington’s most lethal invasive species … ”) and plenty of well-parsed statistics to back him up. His essential point is that by refusing to make the most basic (and fairly minimal) sacrifices to manage infrastructure, address climate change and provide decent education and healthcare, the boomers have bequeathed their children a mess of daunting proportions. Through such government programmes as social security and other entitlements, they have run up huge debts that the US government cannot pay except by, eventually, soaking the young. One of his most affecting chapters is about how failing schools feed mostly African American youth into the huge for-profit prison system. Someday, they will get out. There will be no structures in place to employ or take care of them.

The boomers have made sure that they themselves will live long and prosper, but only at the expense of their offspring. That we are skating on thin ice is no solace: “Because the problems Boomers created, from entitlements on, grow not so much in linear as exponential terms, the crisis that feels distant today will, when it comes, seem to have arrived overnight.” As one who has been raging against the American right since the election of Ronald Reagan, as someone with plenty of boomer friends who have done the same, I would like to let myself off the hook, but Gibney points out that while “not all Boomers directly participated, almost all benefited; they are, as the law would have it, jointly and severally liable”.

Gibney’s theories about how we boomers got to be sociopaths (inclined to “deceit, selfishness, imprudence, remorselessness, hostility”) are a little light: no experience of the second world war, unlike the Europeans; coddled childhoods owing to 1950s prosperity; and TV – “a training and reinforcement mechanism for deceit”, not to mention softening viewers up for ever more consumption of goods.

My own theories are based on my experience of the cold war. I think that the constant danger of nuclear annihilation and the drumbeat on TV and radio of the Soviet threat raised our fight-flight instincts so that some of us became overly cautious (me) and others overly aggressive (Dick Cheney). I also think that our parents were not “permissive”, but that they produced too many children in an era when there was nothing much for the children to do but get out of the house and into trouble – few time-consuming tasks around the house or on the farm, plus bored mothers and absent fathers, who felt a sense of despair when they compared themselves with the shiny advertisements of middle-class perfection they saw everywhere, not just on TV. This was what America had to offer – washing machines, high heels, perfect hairdos, Corn Flakes, TV dinners, patriotism and imminent destruction.

by Jane Smiley, The Guardian |  Read more:
Image: Lambert/Getty Images

My Family’s Slave

The ashes filled a black plastic box about the size of a toaster. It weighed three and a half pounds. I put it in a canvas tote bag and packed it in my suitcase this past July for the transpacific flight to Manila. From there I would travel by car to a rural village. When I arrived, I would hand over all that was left of the woman who had spent 56 years as a slave in my family’s household.

Her name was Eudocia Tomas Pulido. We called her Lola. She was 4 foot 11, with mocha-brown skin and almond eyes that I can still see looking into mine—my first memory. She was 18 years old when my grandfather gave her to my mother as a gift, and when my family moved to the United States, we brought her with us. No other word but slave encompassed the life she lived. Her days began before everyone else woke and ended after we went to bed. She prepared three meals a day, cleaned the house, waited on my parents, and took care of my four siblings and me. My parents never paid her, and they scolded her constantly. She wasn’t kept in leg irons, but she might as well have been. So many nights, on my way to the bathroom, I’d spot her sleeping in a corner, slumped against a mound of laundry, her fingers clutching a garment she was in the middle of folding.

To our American neighbors, we were model immigrants, a poster family. They told us so. My father had a law degree, my mother was on her way to becoming a doctor, and my siblings and I got good grades and always said “please” and “thank you.” We never talked about Lola. Our secret went to the core of who we were and, at least for us kids, who we wanted to be.

After my mother died of leukemia, in 1999, Lola came to live with me in a small town north of Seattle. I had a family, a career, a house in the suburbs—the American dream. And then I had a slave.

by Alex Tizon, The Atlantic |  Read more:
Image: Alex Tizon

Tuesday, May 16, 2017

The Gospel According to Mitch

It will surprise no one to hear that politicians are hypocrites. Even the word “politics” today works as a de facto synonym for not practicing what you preach. To be a “skilled politician” means you’re good at saying all the right things while hiding your intent to do the opposite most of the time. Only within the morally corrupt confines of the Beltway is the phrase regarded as a compliment.

For millennia now, moralists have assailed hypocrisy not only as a despicable personal trait, but also as a stain on one’s soul. Even if you could fool other people into believing what you say, even if they never caught on to your self-serving and double-crossing, God could see straight through you. One way or another, you will be judged. People in general, and voters in particular, despise hypocrisy. Actions speak louder than words, and empty promises will come back to bite you. You can only bullshit your fellow humans so much—eventually they will catch on and hold you accountable.

The problem, though, is that none of this is true. The cup of political history overfloweth with proof that reliably rank moral dishonesty pays off in public life—one of the most glaring cases in point is Senior Kentucky Senator and Senate Majority Leader Mitch McConnell.

I have developed an unhealthy obsession with McConnell’s political career, not because there’s anything interesting about him personally (the only universally shared opinion about McConnell seems to be that he’s got the charisma of a tub of Vaseline), but because he’s the purest embodiment of some of our most significant political contradictions. And the baseline contradiction from which all the others flow is this: if hypocrisy is such a unanimously despised trait, then how did someone like McConnell become one of the most powerful people in the country?

Having read numerous lengthy profiles of one of the most outwardly boring people in the galaxy, I’ll spare you the long yarns about how a Southern boy who contracted polio at age two ascended the local and national ranks of government without ever losing a single electoral race. For a thorough account of McConnell’s career that simultaneously traces the evolution of the GOP over the past four decades, read Alec MacGillis’s sharp biography The Cynic: The Political Education of Mitch McConnell. If you have the stomach for it, compare MacGillis’s book with McConnell’s own propagandistic memoir, The Long Game.

Perhaps the single most perplexing feature of McConnell’s life as a professional politician is the most painfully obvious one: the guy is the epitome of unlikeability. From the beginning of his political career, friends and competitors alike have remarked on McConnell’s coldness and lack of basic amiability, his astoundingly bland and awkward bearing as an orator, and, of course, his flaccid physical demeanor, like a turtle without a shell. In his 1977 campaign for county judge in Jefferson County, Kentucky, McConnell raised enough money to hire the (very expensive) pollster and strategist Tully Plesser along with the ad producer Robert Goodman. Goodman himself said of McConnell, his own client, “He isn’t interesting. He doesn’t have an aura, an air of mystery about him.” Mitch McConnell is the human equivalent of eggshell paint. He’s a bowl of porridge whose girlfriend dumped him for gruel. You get the picture.

More perplexing still is this reptilian nonentity’s nugatory track record of doing anything to incrementally improve our shared public life. You’d be hard pressed to find someone outside of D.C. who knows or remembers McConnell for remotely good reasons. For liberals and leftists, McConnell’s impeccably punchable face has been the symbol of cynical Republican obstructionism over the last eight years. And for the terminally aggrieved conservative base that Trump stole away from the GOP establishment, McConnell was often painted as too ready to reach compromises with the Obama administration, especially on the showdowns over raising the debt ceiling (2011) and avoiding the fiscal cliff (2013). On the far right, McConnell’s a spineless “cuckservative” puppet of corporate interests, plain and simple.

These latter complaints will, no doubt, baffle anyone left of center. After all, this is the same man who famously declared in 2010 that “The single most important thing we want to achieve is for President Obama to be a one-term president.” This is also the man who followed through on that pledge by leading the GOP’s congressional charge to throw sand in the gears of government at every turn during Obama’s presidency.

Beyond outright petulance, the logic behind McConnell’s strategy was clear. As Michael Grunwald explained in Politico, “Republican leaders simply did not want their fingerprints on the Obama agenda; as McConnell explained, if Americans thought D.C. politicians were working together, they would credit the president, and if they thought D.C. seemed as ugly and messy as always, they would blame the president.” (That this strategy did not, in fact, make Obama a one-term president had little to do with McConnell’s search-and-destroy legislative philosophy, and almost everything to do with the GOP’s nomination of private-equity Fauntleroy Mitt Romney as the president’s 2012 challenger.)

Obama stoked the hopes of voters in 2008 for a “post-partisan” way of doing politics that would allegedly put country over party differences. And in the wake of a disastrous Bush presidency, capped off by a crippling economic recession, it appeared that the buoyant Obama wave was pushing the modern GOP closer and closer to oblivion. If the American people began to sense that things were, indeed, getting better under Obama, that would be the death knell for the modern Republican party.

When others in the party began to panic, though, McConnell buckled down. Harkening back to the infamous tactics of Newt Gingrich, McConnell followed this fathomlessly cynical logic to its culmination, weaponizing his branch of Congress to deny the Obama administration any chance whatsoever to claim post-partisanism was working, even if that meant torpedoing the public’s faith in government entirely.

McConnell’s plan proved a (quite literal) smashing success. After eight years of intentionally driving the government into crippling gridlock, McConnell at last has everything he ever wanted—Obama’s gone, Republicans control every branch of government, and he’s fastened his turtle chompers onto the job he’s obsessed over for most of his adult life. In The Long Game, McConnell confesses that, while just about every ambitious senator on the Hill is gunning for the ultimate prize of one day commanding the Oval Office, this was never his goal. “When it came to what I most desired,” he writes, “and the place from which I thought I could make the greatest difference, I knew deep down it was the majority leader’s desk I hoped to occupy one day.” That day came in January of 2015.

There was one big unforeseen consequence, though. As one of the chief architects of the GOP’s scorched-earth strategy during the Obama years, McConnell had created the basic conditions for the Senate’s—and indeed, the GOP’s—own public immolation. Even if it meant filibustering their own proposals, Republicans wanted to expose the useless guts of a broken system to the public and try to pin as much of the blame on Obama as possible. In the process of burning down Washington, though, they cleared a path for the anti-Obama, a loud-mouthed beast who would capitalize on the collective lost faith in the government establishment they themselves had used to fuel a fire that was now burning beyond their control.

This is what makes McConnell such an easy target now. After years of intransigent, uncompromising warfare with the Obama vision, he now must figure out some way to jumpstart the same machine he’s tried so hard to drive into the dirt. It is thus with a peculiar mixture of schadenfreude and fury that we are now treated to the ongoing spectacle of Mitch’s hypocrisy—Mitch-pocrisy, if you will—laid bare. As with Trump, the law of digital irony continuously seems to affirm that, for every injunction McConnell makes during the current administration, there’s a clip somewhere of him saying the exact opposite during the Obama years. (...)

These recent examples of McConnell’s outlandish hypocrisy are just the tip of the iceberg; he has spent his entire career flip-flopping. As John Yarmuth, Kentucky’s only Democratic congressman, told a union crowd in 2014, “Mitch McConnell has been the same cold-hearted, power-hungry politician for the entire forty-six years I’ve known him . . . He’s like a windmill—whichever way the wind blows, he goes. He doesn’t have any core values. He just wants to be something. He doesn’t want to do anything.” Perhaps what’s most distressing about this is that virtually none of us register it as anything resembling news. Everyone knows McConnell is a slimy hypocrite. What’s worse, everyone knows that his hypocrisy works.

by Maximillian Alvarez, The Baffler | Read more:
Image: Gage Skidmore

Health Insurers Bilk Medicare for Billions

When Medicare was facing an impossible $13 trillion funding gap, Congress opted for a bold fix: It handed over part of the program to insurance companies, expecting them to provide better care at a lower cost. The new program was named Medicare Advantage.

Nearly 15 years later, a third of all Americans who receive some form of Medicare have chosen the insurer-provided version, which, by most accounts, has been a success.

But now a whistle-blower, a former well-placed official at UnitedHealth Group, asserts that the big insurance companies have been systematically bilking Medicare Advantage for years, reaping billions of taxpayer dollars from the program by gaming the payment system.

The Justice Department takes the whistle-blower’s claims so seriously that it has said it intends to sue the whistle-blower’s former employer, UnitedHealth Group, even as it investigates other Medicare Advantage participants. The agency has until the end of Tuesday to take action against UnitedHealth.

In the first interview since his allegations were made public, the whistle-blower, Benjamin Poehling of Bloomington, Minn., described in detail how his company and others like it — in his view — gamed the system: Finance directors like him monitored projects that UnitedHealth had designed to make patients look sicker than they were, by scouring patients’ health records electronically and finding ways to goose the diagnosis codes.

The sicker the patient, the more UnitedHealth was paid by Medicare Advantage — and the bigger the bonuses people earned, including Mr. Poehling.

In February, a federal judge unsealed the lawsuit that Mr. Poehling filed against UnitedHealth and 14 other companies involved in Medicare Advantage.

“They’ve set up a perfect scheme here,” Mr. Poehling said in an interview. “It was rigged so there was no way they could lose.”

A spokesman for UnitedHealth, Matthew A. Burns, said the company rejected Mr. Poehling’s allegations and would contest them vigorously. (...)

Mr. Poehling’s suit, filed under the False Claims Act, seeks to recover excess payments, and big penalties, for the Centers for Medicare and Medicaid Services. (Mr. Poehling would earn a percentage of any money recovered.) The amounts in question industrywide are mind-boggling: Some analysts estimate improper Medicare Advantage payments at $10 billion a year or more.

At the heart of the dispute: The government pays insurers extra to enroll people with more serious medical problems, to discourage them from cherry-picking healthy people for their Medicare Advantage plans. The higher payments are determined by a complicated risk scoring system, which has nothing to do with the treatments people get from their doctors; rather, it is all about diagnoses. (...)

Mr. Poehling said the data-mining projects that he had monitored could raise the government’s payments to UnitedHealth by nearly $3,000 per new diagnosis found. The company, he said, did not bother looking for conditions like high blood pressure, which, though dangerous, do not raise risk scores.

He included in his complaint an email message from Jerry J. Knutson, the chief financial officer of his division, in which Mr. Knutson urged Mr. Poehling’s team “to really go after the potential risk scoring you have consistently indicated is out there.”

“You mentioned vasculatory disease opportunities, screening opportunities, etc., with huge $ opportunities,” Mr. Knutson wrote. “Let’s turn on the gas!”

by Mary Williams Walsh, NY Times |  Read more:
Image: NY Times 
[ed. Anyone surprised? What will be surprising is if Congress and the Justice Department under Jeff Sessions actually do anything.]

Ladies Who Jam

“Jazz has the power to make men forget their differences and come together.” These are the words with which Quincy Jones inaugurated the first UNESCO International Jazz Day exactly five years ago.

Broadcasting on April 30 from Havana, Cuba, this year’s headliners include Herbie Hancock, Chucho Valdés, Carl Allen, Marc Antoine, Till Brönner, Antonio Hart, Marcus Miller, Kurt Elling, Gonzalo Rubalcaba, Ben Williams, Pancho Amat, César López, Ivan Lins, Igor Butman, Julio Padrón, Richard Bona, and Bobby Carcasses, plus three notable jazzwomen: Cassandra Wilson, Esperanza Spalding, and Regina Carter.

If the X-Y energy sounds disproportionate in that lineup, just consider that Wynton Marsalis’s renowned Jazz at Lincoln Center Orchestra — among the best-paying gigs for an American jazz musician — has never once hired a permanent female member. This is all too common a story. While female jazz vocalists like Wilson and Spalding, who also plays bass, are somewhat de rigueur, the instrument section is overwhelmingly a masculine domain, which historically prizes aggressive self-confidence on the bandstand; it’s a job that requires frequent absences from home and family, and punishes women — particularly horn players — for being “unattractive” while “blowing hot.”

What’s more, research shows that the trumpet, trombone, and drums are still perceived as “masculine” instruments, while the flute, clarinet, and piano are considered feminine. In other words, sexual stereotyping of band instruments helps explain why boys are more likely to play the trombone, and girls the flute. For a long time, in fact, girls were prohibited from playing saxophones and percussion.

Of course, a penis is no prerequisite for playing jazz. It’s a social art. But as a freelance, ensemble-based industry, it remains largely a musical boys’ club whose members typically get a foot in the door by referrals through buddies. There are rarely any formal hiring procedures in place, or any public postings of openings in big bands or jazz ensembles. The jazz gender gap extends beyond the music — as the mastheads of leading jazz magazines show, less than 10 percent of jazz critics and journalists are women, and a player’s promotion hinges on mostly male-run booking agencies and jazz festival programmers.

In kicking off that first International Jazz Day, Jones described jazz as “the personification of transforming overwhelmingly negative circumstances into freedom, friendship, hope, and dignity.” A nice, inclusive interpretation of the music. But as a commercial business, jazz is among the most sexist sectors of the music industry.

Classical music, while not typically incubated in jazz’s red-light classrooms of bars and clubs, offers an intriguing comparison.

In the 1970s, women accounted for less than five percent of classical musicians. Then a musicians’ union mandated “blind audition” policies, which conceal the identity of performance candidates from the jury and decrease bias, be it conscious or unconscious. Today, 48 percent of symphony musicians in metropolitan areas are women, says Ellen Seeling, a professional trumpet player and chairperson of JazzWomen and Girls Advocates, the first and largest organization dedicated to promoting “the visibility of women and girl instrumentalists of all ethnicities in jazz” and advocating “for their inclusion in all aspects of the art form.”

The group’s mission poses the question: If pressure were applied to the hiring tactics of jazz orchestras, could women’s representation in the genre undergo a sea change similar to that in the classical world? Seeling hopes so.

She made headlines a couple years ago when summoning hundreds of musicians and a female-led band to stage a rally outside Jazz at Lincoln Center during a high-ticket donors’ gala to advocate for blind auditions. But Seeling contends the very nature of jazz makes things a little more complicated.

“Jazz is cool, it’s rogue,” says Seeling, making air quotes. “It’s rogue and totally unregulated and misogynistic — even more so than rock ’n’ roll. Look at the Grammys house band, the SNL band, any of them. How many women do you see there?”

by Katie O’Reilly, LARB | Read more:
Image: uncredited

Monday, May 15, 2017

The Startup Industry’s Toxic “Side Hustle” Fixation

A handsome man gazes at the camera. “These days, everyone needs a side hustle,” he shrugs. We cut to scenes from his well-lit life, and it’s a mix of pleasant chauffeur jaunts, and hangout sessions with his daughter, dog, and pals. “Earning, chilling, earning, chilling,” the man sing-songs, a prosperous avatar for enviable work-life balance. His existence is so delightfully calibrated, I could play the scene for my therapist to best explain what I’m aiming for, except I won’t do that, because the man is an actor and he’s in an ad for Uber. The transit company has embraced the concept of “side hustle” to entice people to become contractor-taxis, spinning the idea of having a second job as a form of freedom, a salvation from drudgery. “Get your side hustle on,” Uber’s website beckons new drivers.

Uber is the most prominent business in startup culture to explicitly use the term as a way to sell piecemeal labor as a savvy lifestyle choice, but the phrase is frequently deployed within the startup industry to hype all sorts of gig-economy work. Websites like Side Hustle Academy and books like Side Hustle to Success and Side Hustle Blueprint promise readers they’ll explain how to build wealth as an extracurricular habit. A marketing consultant who refers to herself as a “digital nomad” published a self-help book called The Side Hustle Gal. It’s like those spambot comments at the bottom of blog posts — make extra money working from home — were interpreted credulously and turned into an economic game plan by a cadre of self-published wannabe Suze Ormans. But the way Uber and startup culture have co-opted and bowdlerized the phrase into an anodyne signifier of entrepreneurialism is gallingly hollow. The side hustle is a survival mechanism, not an aspirational career track.

Two definitions of the “side hustle” are hyped by startup culture. One is the Uber interpretation, and it’s simple: side hustle as second gig. The other, what I’d call the “life coach” definition, is a little more specific: the side hustle as second gig that is also a passion project. While these two definitions are distinct, they are not so distant from one another. Both imagine that the side hustler is on track to a better life through ceaseless piecemeal labor rather than 9-to-5 employment. Even in Uber’s estimation, the “side hustle” is a sanguine endeavor, something that makes life easier, a way to grease one’s most ambitious life track. It’s a captivating tale. The idea that success depends on after-hours striving speaks to an archetypically American combination of values — the national preoccupation with work ethic and individualism. It also misconstrues economic reality to make companies like Uber look like benevolent job creators rather than businesses tailored to maximize profit while shifting as much financial risk onto contractors as possible.

Companies like Uber want to spit-shine the concept of “side hustle” so it looks like a better alternative to steady, gainful employment. If people see gig-economy labor as a flexible stepping stone to a better life, they’re less likely to also see it as a force eroding a work culture with protections and benefits for employees in favor of a low-ball freelance marketplace. (Also, one that will eventually be automated, making their jobs obsolete.) Uber is selling a fantasy of economic advancement through the corrosion of employment benefits and stability, pitching increased subjugation to the corporation as some sort of salve. That Uber is promoting itself as a solution for the financially flailing is particularly eye-popping when one considers the company’s ultimate goal to eliminate the job of driving in favor of large-scale automation. (...)

I recently wrote about how startup multilevel marketing companies, like LuLaRoe and It Works!, are growing in popularity on social media. Many of these businesses push the idea that people can get rich by selling wares as a type of side hustle, but the reality is that the majority of contractors shilling for these companies make little to no money. This does not mean they are indolent or obtuse. Many startup gigs that are breathlessly pitched as ways to transcend the grind are, in fact, often more wearying, riskier, and less financially rewarding than salaried employment.

CNBC recently reported a story about how a college student earned $10,000 using a “side hustle app” called JoyRun. That sounds impressive until the figures are broken down. She worked around 12–20 hours a week for a year. That’s a classic part-time job making around $10–15 an hour. So it’s slightly better than the average wage at McDonald’s. The student’s “success” on this app was apparently rare enough to warrant media attention; what a closer look at the numbers reveals is how easily low-rung employment can get ginned up into a success story by slapping a hyperbolic “side hustle” narrative on it.

by Kate Knibbs, The Ringer |  Read more:
Image: Getty/The Ringer

How Noncompete Clauses Keep Workers Locked In

Keith Bollinger’s paycheck as a factory manager had shriveled after the 2008 financial crisis, but then he got a chance to pull himself out of recession’s hole. A rival textile company offered him a better job — and a big raise.

When he said yes, it set off a three-year legal battle that concluded this past week but wiped out his savings along the way.

“I tried to get a better life for my wife and my son, and it backfired,” said Mr. Bollinger, who is 53. “Now I’m in my mid-50s, and I’m ruined.”

Mr. Bollinger had signed a noncompete agreement, designed to prevent him from leaving his previous employer for a competitor. These contracts have long been routine among senior executives. But they are rapidly spreading to employees like Mr. Bollinger, who do the kind of blue-collar work that President Trump has promised to create more of.

The growth of noncompete agreements is part of a broad shift in which companies assert ownership over work experience as well as work. A recent survey by economists including Evan Starr, a management professor at the University of Maryland, showed that about one in five employees was bound by a noncompete clause in 2014.

Employment lawyers say their use has exploded. Russell Beck, a partner at the Boston law firm Beck Reed Riden who does an annual survey of noncompete litigation, said the most recent data showed that noncompete and trade-secret lawsuits had roughly tripled since 2000.

“Companies of all sorts use them for people at all levels,” he said. “That’s a change.”

Employment lawyers know this, but workers are often astonished to learn that they’ve signed away their right to leave for a competitor. Timothy Gonzalez, an hourly laborer who shoveled dirt for a fast-food-level wage, was sued after leaving one environmental drilling company for another. Phillip Barone, a midlevel salesman and Air Force veteran, was let go from his job after his old company sent a cease-and-desist letter saying he had signed a noncompete. (...)

Alan B. Krueger, a Princeton economics professor who was chairman of President Barack Obama’s Council of Economic Advisers, recently described noncompetes and other restrictive employment contracts — along with outright collusion — as part of a “rigged” labor market in which employers “act to prevent the forces of competition.”

By giving companies huge power to dictate where and for whom their employees can work next, noncompetes take a person’s greatest professional assets — years of hard work and earned skills — and turn them into a liability.

“It’s one thing to have a bump in the road and be in between jobs for a little while; it’s another thing to be prevented from doing the only thing you know how to do,” said Max Burton Wahrhaftig, an arborist in Doylestown, Pa., who in 2013 was threatened by his former employer after leaving for a better-paying job with a rival tree service. He was able to avoid a full-blown lawsuit.

Noncompetes are but one factor atop a great mountain of challenges making it harder for employees to get ahead. Globalization and automation have put American workers in competition with overseas labor and machines. The rise of contract employment has made it harder to find a steady job. The decline of unions has made it tougher to negotiate.

But the move to tie workers down with noncompete agreements falls in line with the decades-long trend in which their mobility and bargaining power have steadily declined, and with them their share of company earnings.

When a noncompete agreement is litigated to the letter, a worker can be barred or ousted from a new job by court order. Even if that never happens, the threat alone can create a chilling effect that reduces wages throughout the work force.

“People can’t negotiate when their company knows they won’t leave,” said Sandra E. Black, an economics professor at the University of Texas at Austin.

by Conor Dougherty, NY Times | Read more:
Image: Travis Dove

How Untreated Depression Contributes to the Opioid Epidemic

It can sometimes seem strange that so much of the country got hooked on opioids within just a few years. Deaths from prescription drugs like oxycodone, hydrocodone, and methadone have more than quadrupled since 1999, according to the CDC. But pain doesn’t seem to be the only culprit: About one-third of Americans have chronic pain, but not all of them take prescription painkillers for it. Of those who do take prescription opioids, not all become addicted.

Several researchers now believe depression, one of the most common medical diagnoses in the U.S., might be one underlying cause that’s driving some patients to seek out prescription opioids and to use them improperly.

People with depression show abnormalities in the body’s release of its own, endogenous, opioid chemicals. Depression tends to exacerbate pain—it makes chronic pain last longer and hurts the recovery process after surgery.

“Depressed people are in a state of alarm,” said Mark Sullivan, a professor of psychiatry at the University of Washington. “They’re fearful, or frozen in place. There’s a heightened sense of threat.” That increased threat sensitivity might also be what heightens sensations of pain.

Not only do people with depression tend to be more pain-sensitive, the effect of opioids can, for some, feel as mood-elevating as an antidepressant.

“Depression is a mixed bag,” Sullivan said. “People can feel sluggish and uninterested, but they can also feel agitated, irritated, and anxious. They feel both unrelaxed and really unmotivated at the same time.”

Opioids might, at least temporarily, feel soothing and sedating. Indeed, several studies have found that buprenorphine, an opioid that is typically used to wean people off of heroin, has some antidepressant properties.

Sullivan and other researchers from Washington and California found in 2012 that depressed people were about twice as likely as non-depressed ones to misuse their painkillers for non-pain symptoms, and depressed individuals were between two and three times more likely to ramp up their own doses of painkillers. Adolescents with depression were also more likely, in one study, to use prescription painkillers for non-medical reasons and to become addicted.

In 2015, a different group of researchers found that depressed people were likely to keep using opioids, even when their pain had subsided and when they were more functional. “If the emotional pain, the depression, is never properly diagnosed or treated, the patient might continue taking the opioid because it’s treating something,” said Jenna Goesling, an assistant professor in the department of anesthesiology at the University of Michigan and an author of that study.

by Olga Khazan, The Atlantic |  Read more:
Image: Jonathan Ernst/Reuters

Sunday, May 14, 2017


Craig Ferguson, Yakitori alley, Tokyo
via:

Ash Wednesday

I’m worried about my trip up to New York to attend a party.

I worry that I am not traveling with a young assistant who is far more skillful at pushing the buttons on my iPhone (or laptop, if I hadn’t drowned the keyboard in coffee and lost the damned thing even before I became that comfortable using it) than I am. I worry about the latest Theory of Everything (this decade it’s ADD) which does such a good job of holding people in their various social tracks, so that someone who is dyslexic (like me) is also said to have “a form of ADD.” My friend Bob Woof tells me a third of the people who will be at the party have it too. Our mutual friend Eric says Bob is practically a hoarder, which makes it likely he has a touch of it himself. But despite both of those worries, I’m on the bus and headed north.

Bob has invited me to one of his Prime Timers parties on Sunday evening, March 5, 2017. He’s been inviting me to these gatherings for more than a year, but this time I’ve decided to accept and write a few notes on it as well. (In a notebook. I can’t handle them any other way.)

I’m combining the trip with another visit I’ve been wanting to make for several years, to see my old fuck buddy, Maison, and his husband, Fred, who live further upstate near Poughkeepsie: I’ll continue by the Metro-North train and stay with him and Fred Monday night, March 6, and Tuesday night, March 7, before returning to New York City by train and, after a walk across town from Grand Central Station to Port Authority with my grey plastic rollaway and my grubby white Zabar’s bag, back to Philadelphia on the morning of Wednesday, March 8—on a Peter Pan bus.

But that’s getting ahead of things.

The Prime Timers is a group of older New York City–based gay men who have a sex party every month. This time it is at the DoubleTree hotel on the southeast corner of Forty-Seventh and Seventh. The party is in room 3905—two rooms actually, both given over to sex from 5:30 pretty much till midnight.

While I was not particularly nervous sexually about what would happen, there was my worsening ADD: the shattering of my self-confidence last year had left me with exactly the kind of uncertainties that Bob prided himself on being able to take care of in the elderly men who came to his parties. Would I arrive with phone and luggage intact? Would I be able to get back with everything I started out with? Would I be able to negotiate my medications, food? Sleep? With ADD wreaking havoc on logic and focus, would I be able to document the trip as I hoped?

About a year ago, Bob brought a car full of guys to have lunch with me out at a mall restaurant in Wynnewood, Pennsylvania, where Dennis, my partner of twenty-seven years, and I were living with my daughter and son-in-law. The guys Bob brought were civilized, seemingly well off, and friendly. One big fellow in jeans and a jean jacket was driving the group back to New York City from somewhere.

One man, John, in a navy pea jacket remarked on what a nice guy I was. Bob sucked my fingertip at the restaurant table. Nobody else in the restaurant seemed particularly interested in us. Dennis didn’t come that day, I remember, for whatever reason.

I’d met Bob at an academic convention on gay comic art, at the CUNY Graduate Center, where he’d walked up to me, put his arms around me, and began to kiss me. He was fifty-six and I was seventy-two. He told me that he was really mad over “silver daddy bears.” He was a guy with glasses and a short white beard, who traveled in jeans and plaid shirts, as I did. My beard was notably longer, and white.

Through the rest of the program, he hung out with me even though I had come with three younger friends (Mia Wolff, Ann Matsuchi, and Alex Lozupone); it was the day I met Alison Bechdel, when we discussed my part in the formation of “The Bechdel Test,” and I met a number of other folks. While Bob verged on the annoying, his brazenly direct sexual come-on was intriguing.

What has always interested me about gay male society is the way it seems to operate differently from what one might call normative heterosexual society.

I learned that Bob ran a group for men such as myself—the Prime Timers: gay and over fifty. (What this had to do with gay comic books, I never really understood; but, well, there was some connection. . . .) For better or worse, however, I felt I could learn something from him. He seemed naturally kind, concerned and caring.

I’m known as a “sex radical, Afrofuturist, and grand master of science fiction,” but the fact is, I am nowhere near as sexually radical as many, and for all my interest lots of things have passed me by. I felt there was a world of experience that had been slipping away. I wanted at least to know something about it, to write about it.

by Samuel R. Delany, Boston Review |  Read more:
Image: Samuel R. Delany

Winners and Losers of the Recent Nuclear Holocaust

The nation was recently rocked by retaliatory nuclear blasts that have turned much of America into a barren wasteland, decimating the population, triggering the rise of firestorms and supervolcanoes, and generally bringing civilization to the brink of collapse. Let’s take a look at the political fallout.

Winners
  • Congressional Republicans: Widespread destruction aside, this was a kumbaya moment for a caucus that has had its share of family spats of late. For the first time since coming together to narrowly pass the American Health Care Act in May, Speaker Paul Ryan wonkily persuaded the House GOP’s version of the Hatfields and McCoys — the principled hardliners of the Freedom Caucus on one hand, and the reasonable moderates of the Tuesday Group on the other — to set their bickering aside just long enough to squeak through a resolution in support of President Trump’s plan, tweeted out at 3:29 a.m. on Thursday morning, to “FRANCE IS LOOKING FOR TROUBLE. Sick country that won’t solve its own problems. Maybe nucluar?” Concerns that a more deliberative Senate would splash cold water on a rare show of Republican unity proved unfounded when Senator Susan Collins (R-ME), the human fulcrum perched stoically at the precise center of American politics, revealed in a nationally televised special that she would vote to authorize nuclear war to balance out the fact that she had recently broken ranks with her party on an agriculture appropriations bill.
  • CNN: As every news producer knows, nothing makes for better theater than war — and nothing makes for better CNN than theater. Right up until the moment when the first blast’s electromagnetic pulse wiped out all of the technology on the eastern seaboard, the cable giant was in fine form, drawing record viewership to a number of its weekday staples. The roiling debate over whether or not to abruptly drop hydrogen bombs on traditional allies proved to be compelling fodder; one particularly juicy squabble between contributors Jeffrey Lord and Lanny Davis will likely go down in history as the second-to-last thing to go viral. Time will tell whether Ari Fleischer’s observation that a nuclear conflict “could be the victory that Donald Trump needs to right the ship of this administration” holds true, but one thing’s for certain — this moment was CNN as it was meant to be: a grand arena where intellectual titans come to match wits and battle it out over issues with no clear answer.
  • Donald Trump: Sure, the verdict may not be in just yet. But when the radioactive dust settles, we could be looking at a game-changing moment for a young presidency. Trump may have ruffled some feathers with less-than-sensitive remarks to the New York Times’ Maggie Haberman that the nuclear holocaust would be “way bigger than the old Holocaust,” but let’s be clear — political correctness has never been this man’s game. For a president with his eye on 2020, an uncertain path to reelection just got a whole lot more manageable, with the threshold for victory in the Electoral College now down from 270 votes to 14. While thermonuclear annihilation may be an inelegant solution, it burnishes the public impression of Trump as a man of action — eccentric, perhaps, but someone who at the end of the day isn’t afraid to get his hands dirty or seek out unorthodox solutions. Those who are still parsing whether the first wave of mortal attacks were justified are asking all the wrong questions. The truth is, it doesn’t matter — this president will be remembered as The Great Disruptor for taking strong and decisive action again and again. Goodbye Armageddon. Hello, Arma-mentum.
Losers
  • Hillary Clinton: The former Secretary of State was spared from the vast and merciless extermination due to scheduled travel. To Wisconsin, you might ask? Of course not. Instead, the one-time Democratic nominee had jetted off to Tanzania to take part in a symposium on empowering women and girls in the world’s fastest-growing economies — an excursion that is sure to raise new questions about her ability to connect with everyday Americans. It’s the same old story: as ever, a politician notorious for being out-of-touch with regular people goes out of her way to prove it once again, this time by failing to relate to the now-quintessential American experience of being instantaneously vaporized into ash by a 500 kiloton wall of unsparing white light that — unlike some people we know — actually deigns to visit blue collar communities in every state.
by Dan Cluchey, McSweeney's |  Read more:
Image: uncredited
via:

Saturday, May 13, 2017

The American Obsession with Lawns

Warmer weather in the northern states means more time outside, and more time to garden. While urban gardeners may be planning their container gardens, in the suburbs, homeowners are thinking about their lawns. It’s the time of year when the buzz of landscaping equipment begins to fill the air, and people begin to scrutinize their curb appeal.

The goal—as confirmed by the efforts of Abraham Levitt in his sweeping exercise in conformity (although it had been established well before that)—is to attain a patch of green grass of a singular type with no weeds that is attached to your home. It should be no more than an inch and a half tall, and neatly edged. This means you must be willing to care for it. It must be watered, mowed, repaired, and cultivated. Lawns are expensive—and some regard them as boring in their uniformity—but they are a hallmark of homeownership. Why do Americans place so much importance on lawn maintenance?

In The Great Gatsby, when Nick Carraway rents his house in West Egg, he apparently spends little time on lawn care. The disparity between his patch of greenery and the immaculately manicured grounds of Jay Gatsby's mansion is clear: “We both looked at the grass—there was a sharp line where my ragged lawn ended and the darker, well-kept expanse of his began,” reports Carraway. In preparation for his luncheon with Daisy, Gatsby is so troubled by this difference that he sends his own gardeners to take care of the offensive strip of grass.

This concern is not limited to fiction. The state of a homeowner’s lawn signals their status within the community and reflects on the status of the community at large. Lawns connect neighbors and neighborhoods; they’re viewed as an indicator of socio-economic character, which translates into property and resale values. Lawns are indicative of success; they are a physical manifestation of the American Dream of home ownership. To have a well-maintained lawn is a sign to others that you have the time and/or the money to support this attraction. It signifies that you care about belonging and want others to see that you are like them. A properly maintained lawn tells others you are a good neighbor. Many homeowner associations have regulations specifying how often a lawn must be maintained, and so important is this physical representation of a desired status that fines can be levied if it is not. It’s no wonder that Gatsby wanted Carraway’s lawn addressed: it would reflect on him in a variety of ways if it were not.

by Krystal D'Costa, Scientific American |  Read more:
Image: Oliur Rahman, Pexels

Our Mothers as We Never Saw Them

In one of my favorite photographs of my mother, she’s about 18 and very tan, with long, blond hair. It’s the 1970s and she’s wearing a white midriff and cutoffs. My dad is there, too, hugging her from behind, and from the looks of it, they’re somewhere rural — maybe some pastoral patch of small-town New Jersey where they met.

I haven’t seen this photo for years, I have no idea where it is now, but I still think of it — and, specifically, my mom in it. She looks really sexy; wars have been waged over less impressive waist-to-hip ratios. And she is so young and innocent. She hasn’t yet dropped out of college, or gotten married. The young woman in this photo has no idea that life will bring her five children and five grandchildren, a conversion to Judaism, one divorce, two marriages, a move across the country.

For me, as for many daughters, the time before my mother became a mother is a string of stories, told and retold: the time she got hit by a car and had amnesia; the time she sold her childhood Barbie to buy a ticket to Woodstock; the time she worked as a waitress at Howard Johnson’s, struggling to pay her way through her first year at Rutgers. The old photos of her are even more compelling than the stories because they’re a historical record, carrying the weight of fact, even if the truth there is slippery: the trick of an image, and so much left outside the frame. These photos serve as a visual accompaniment to the myths. Because any story about your mother is part myth, isn’t it?

After finishing my most recent novel, in part about mother-daughter relationships, I put out a call on social media for photos from women of their mothers before they were mothers. A character in the book, a young artist, does something similar, so I’d thought a lot about what the process might be like. I wasn’t prepared, however, for how powerful the images I received would be.

The young women in these pictures are beautiful, fierce, sassy, goofy, cool, sweet — sometimes all at once. I asked contributors to tell me about their moms or the photo submitted, and they often wrote that something specific and special about their present-day mother — her smile, say, or her posture — was present in this earlier version. What solace to know that time, aging and motherhood cannot take away a woman’s essential identity. For daughters who closely resemble their moms, it must be an even bigger comfort; these mothers and daughters are twins, separated by a generation, and an old photo serves as a kind of mirror: How do I look? Even if there isn’t a resemblance, we can’t help but compare ourselves to our young mothers before they were mothers. (...)

The photos women sent me offer a key to how we, as daughters, want to perceive young womanhood. Pluck, sex appeal, power, kindness, persistence: We admire and celebrate these characteristics, and we long for the past versions of our moms to embody them. But if these characteristics are a prerequisite for a properly executed womanhood, does becoming a mother divest a woman of such qualities? In studying these photos, and each daughter’s interpretation of them, I’ve come to wonder what traits we allow our mothers to have, and which ones we view as temporary, expiring with age and the beginning of motherhood. Can a woman be both sexual and maternal, daring and responsible, innocent and wise? Mothers are either held up as paragons of selflessness, or they’re discounted and parodied. We often don’t see them in all their complexity.

by Edan Lepucki, NY Times |  Read more:
Image: Edan Lepucki

Raf Cruz, Working Class Zero, Collage 2015
via:

New Gene Tests Pose a Threat to Insurers

Pat Reilly had good reason to worry about Alzheimer’s disease: Her mother had it, and she saw firsthand the havoc it could wreak on a family, much of it financial.

So Ms. Reilly, 77, a retired social worker in Ann Arbor, Mich., applied for a long-term care insurance policy. Wary of enrolling people at risk for dementia, the insurance company tested her memory three times before issuing the policy.

But Ms. Reilly knew something the insurer did not: She has inherited the ApoE4 gene, which increases the lifetime risk of developing Alzheimer’s. “I decided I’d best get long-term care insurance,” she said.

An estimated 5.5 million people in the United States have Alzheimer’s disease, and these patients constitute half of all nursing home residents. Yet very few people in the United States have been tested for the ApoE4 gene.

But last month, with the approval of the Food and Drug Administration, the gene testing company 23andMe began offering tests that reveal whether people have the variant, as well as assessing their risks for developing such conditions as Parkinson’s and celiac disease.

Other genetics companies are planning to offer similar tests, and soon millions of people will have a better idea what their medical futures might be. Recent research has found that many, like Ms. Reilly, are likely to begin preparing for the worst.

But for companies selling long-term care insurance, these tests could be a disaster, sending risky patients in search of policies even as those with fewer risks shy away, damaging an already fragile business. “There is a question about whether the industry is in a death spiral anyway,” said Robert Hunter, director of insurance at the Consumer Federation of America. “This could make it worse.”

The tests are simple: All people have to do is send away a saliva sample and pay $199. Their disease risks, if they say they want to know them, will be delivered with a report on ancestry and on how their genes influence such traits as flushing when they drink alcohol or having straight hair.

The company will not reveal how many people have received disease-risk data, but it says that in Britain and Canada, where it has offered such testing for several years, about three-quarters of its customers have asked for it. 23andMe has sold its genetic services to more than two million people worldwide since 2007.

The issue for now is with long-term care insurance, not employment and not — at least so far — health insurance.

Under the Genetic Information Nondiscrimination Act, companies cannot ask employees to take gene tests and cannot use any such results in employment decisions; insurers are not permitted to require gene tests or to use the results in coverage decisions.

But legislation proposed in the House would exempt corporate “wellness” programs from some of these requirements. And the American Health Care Act, passed by the House, would permit states to waive some insurance safeguards regarding pre-existing conditions.

At the moment, companies selling long-term care insurance — unlike medical insurers — are permitted to ask about health status and take future health into consideration when deciding whom to insure and how much to charge.

The 23andMe test results will not appear in people’s medical records, and the company promises not to disclose identifiable findings to third parties. It is up to the customers to reveal them — and the fear for insurers is that many will not.

Two-thirds of nursing home residents are on Medicaid, and the remaining private insurers are already struggling. In the early 2000s, more than 100 firms offered long-term care insurance, according to the Treasury Department. By the end of 2015, only 12 firms offered it, and new enrollees fell from 171,000 to 104,000.

The insurers charged too little for these policies, experts say; policyholders have turned out to be much sicker than anticipated. To pay for an unanticipated increase in policyholders who develop Alzheimer’s, insurers would have to raise prices, said Don Taylor, a professor of public policy at Duke University who has studied the issue.

Increasing numbers of people at low risk might decide the insurance was not worth the rising price. Even many at high risk would eventually find the policies unaffordable. It is the definition of an insurance death spiral.
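As a back-of-the-envelope illustration of that spiral, here is a minimal Python sketch of the adverse-selection dynamic. Every number in it (claim costs, willingness to pay, the 20% pricing load) is an invented assumption, not a figure from the article.

```python
# Toy insurance "death spiral": premiums track the pool's average cost,
# low-risk members exit when the price exceeds what they'll pay, and the
# remaining pool gets costlier. All numbers are illustrative assumptions.
def simulate_death_spiral(rounds=3):
    # Each member: (annual expected claim cost, max premium they'll pay).
    # 80 low-risk members and 20 high-risk members to start.
    members = [(2000, 4000)] * 80 + [(15000, 30000)] * 20
    history = []
    for _ in range(rounds):
        if not members:
            break
        # Premium covers the pool's average expected cost plus a 20% load.
        premium = 1.2 * sum(cost for cost, _ in members) / len(members)
        history.append((len(members), round(premium)))
        # Anyone whose willingness to pay falls below the premium drops out.
        members = [m for m in members if m[1] >= premium]
    return history

for n, p in simulate_death_spiral():
    print(f"{n} enrollees, premium ${p}")
# → 100 enrollees, premium $5520
# → 20 enrollees, premium $18000
# → 20 enrollees, premium $18000
```

One round of repricing is enough to drive out every low-risk member, which more than triples the premium for everyone left, the dynamic the article's experts describe.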

by Gina Kolata, NY Times |  Read more:

Ten Year Futures

Now that mobile is maturing and its growth is slowing, everyone in tech turns to thinking about what the Next Big Thing will be. It's easy to say that 'machine learning is the new mobile' (and everyone does), but there are other things going on too.

On one hand, we have a set of profound changes coming as a result of new primary technology. Electric and autonomous cars will change cities, virtual and mixed reality will change the entire computing experience, and machine learning is changing the kind of questions that computers can answer. But each of these is also just beginning, especially relative to their potential - they are at the bottom of the S-Curve where smartphones are now getting towards the top. On the other hand, I think we can see a set of changes that come not so much from any new technology as from shifts in consumer behaviour and operating economics. These changes are potentially just as big, and might be starting sooner.

Electric and autonomous cars are just beginning - electric is happening now but will take time to grow, and autonomy is 5-10 years away from the first real launches. As they happen, each of these destabilises the car industry, changing what it means to make or own a car, and what it means to drive. Gasoline is half of global oil demand and car accidents kill 1.25m people a year, and each of those could go away. But as I explored here, that's just the start: if autonomy ends accidents, removes parking and transforms what congestion looks like, then we should try to imagine changes to cities on the same scale as those that came with cars themselves. How do cities change if some or all of their parking space is now available for new needs, or dumped on the market, or moved to completely different places? Where are you willing to live if 'access to public transport' is 'anywhere' and there are no traffic jams on your commute? How willing are people to go from their home in a suburb to dinner or a bar in a city centre on a dark cold wet night if they don't have to park and an on-demand ride is the cost of a coffee? And how does law enforcement change when every passing car is watching everything?

Then, virtual reality and mixed reality are also some years away from mass-market adoption. We have some VR products in market today and some very early MR, but for both, it feels as though we are in the 2005-2006 phase of multitouch smartphones - almost, but not yet. Once these really come to market, they may change the world just as much as the iPhone. Mixed reality in particular could change things a great deal, if we all have a pair of glasses that can place something in the world in front of you as though it was really there. Predicting what this could be today reminds me of trying to predict the mobile internet not in 2007 but in 1999 - "stock tips, news headlines and the weather" don’t really capture what has happened since then.

Machine learning is happening right now, and rolls through or perhaps underneath the entire tech industry as a new fundamental computer science capability - and of course enables both mixed reality and autonomous cars. Like, perhaps, relational databases or (in a smaller way) smartphone location, machine learning is a building block that will be part of everything, making many things better and enabling some new and surprising companies and products. I don't think we quite understand what it means to say that computers will be able to read images, video or speech in the way that they've been able to read text and numbers since the 1970s or earlier. But though we are creating machine learning now, again, it's still very early to see all of the implications. It's at the beginning of the S-Curve.
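A quick sketch of what "bottom of the S-curve" means quantitatively: adoption under a logistic curve barely moves for years, then snaps upward around the midpoint. The midpoint and steepness below are made-up parameters for illustration, not Evans's figures.

```python
import math

def adoption(year, midpoint=10, steepness=0.8):
    """Fraction of the market adopted `year` years after launch (logistic)."""
    return 1 / (1 + math.exp(-steepness * (year - midpoint)))

# Early years look like nothing is happening, then growth arrives "overnight".
for y in (0, 5, 10, 15, 20):
    print(f"year {y:2d}: {adoption(y):.0%}")
# → year  0: 0%
# → year  5: 2%
# → year 10: 50%
# → year 15: 98%
# → year 20: 100%
```

The same shape explains why a technology can look overhyped in its first few years and then underestimated a decade later.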

So, we have these hugely important new technologies coming, but not quite here yet. At the same time, though, we have a set of more immediate changes that have much more to do with consumer behaviour, company strategy and economic 'tipping points' than with primary, frontier technology of the kind that Magic Leap or Waymo are building.

by Benedict Evans |  Read more: