Saturday, May 20, 2017
Maine Is Drowning in Lobsters
In his famous 1968 essay "The Tragedy of the Commons," biologist Garrett Hardin singled out ocean fishing as a prime example of self-interested individuals short-sightedly depleting shared resources:
Professing to believe in the "inexhaustible resources of the oceans," they bring species after species of fish and whales closer to extinction.

The whales have actually been doing a lot better lately. Fish in general, not so much.
Then there's the Maine lobster. As University of Maine anthropologist James M. Acheson put it in his 2003 book "Capturing the Commons: Devising Institutions to Manage the Maine Lobster Industry":
Since the late 1980s, catches have been at record-high levels despite decades of intense exploitation. We have never produced so many lobsters. Even more interesting to managers is the fact that catch levels remained relatively stable from 1947 to the late 1980s. While scientists do not agree on the reason for these high catches, there is a growing consensus that they are due, in some measure, to the long history of effective regulations that the lobster industry has played a key role in developing.

Two of the most prominent and straightforward regulations are that lobsters must be thrown back in the water not only if they are too small but also if they are too big (because mature lobsters produce the most offspring), and that egg-bearing females must not only be thrown back but also marked (by notches cut in their tails) as off-limits for life. Acheson calls this "parametric management" -- the rules "control 'how' fishing is done," not how many lobsters are caught -- and concludes that "Although this approach is not supported by fisheries scientists in general, it appears to work well in the lobster fishery."
It's a seafood sustainability success story! But there's been an interesting twist since Acheson wrote those words in 2003. That already-record-setting Maine lobster harvest has more than doubled:
Sustainable fisheries practices alone can't really explain why today's lobster take is more than seven times the pre-2000 average. What can? The most widely accepted answer seems to be that depletion of the fish that used to eat young lobsters (mainly cod, landings of which peaked in Maine in 1991 and have fallen 99.2 percent since) has allowed a lot more lobsters to grow big enough for people to catch and eat them. The tragedy of one commons has brought unprecedented bounty to another.

Warming ocean temperatures have also improved lobster survival rates. Canada's Atlantic provinces have experienced a lobster boom similar to Maine's. Not so in the New England states to the south and west of Maine, where the water is now apparently a little too warm and lobster harvests peaked in the 1990s. Within Maine, which now accounts for more than 80 percent of U.S. lobsters, the sweet spot for lobstering has moved from the state's southern coast to the cooler northeast. (...)
This leaves the Maine (and Canadian) lobster industry with another interesting challenge: how to find enough buyers for all those lobsters so that prices don't collapse. As you can see from the chart below, they've mostly succeeded:
Affluent Chinese diners have been one reason. This January, five chartered 747s full of live lobsters flew from Halifax, Nova Scotia, to China to supply Chinese New Year feasts. Maine's lobsters tend to make the voyage less dramatically, in regularly scheduled flights from Boston, but $27 million worth of them were shipped to China in 2016.
The national and even global spread of the lobster roll has also helped a lot. I came to Maine on a trip organized by Luke's Lobster, a fast-casual restaurant chain that now has 21 "shacks" in the U.S. and eight more scheduled to open this year, along with six licensed locations in Japan. Founder Luke Holden was an investment banker in New York when he and former food writer Ben Conniff opened the first restaurant in the East Village in 2009, but he's also the son of a Maine lobsterman who owned the state's first lobster-processing plant.
Luke's Lobster now has its own plant in Saco, Maine, that processes between 4 and 5 percent of the state's lobster harvest. Processing, in this case, means cooking and picking the meat out of the claws and knuckles for Luke's lobster rolls, while cleaning and freezing the raw tails and clawless "bullet" lobsters for sale to restaurants, groceries and such.
Holden's father, Jeff, says that tails used to sell for much more than claw meat. Now lobster rolls, for which tail meat is generally too chewy, have flipped the price equation.
All in all, it's a fascinating tale of adaptation, marketing and lobster logistics. There is one big catch, though, beyond the vague fears that the lobsters can't be this abundant forever. It's that the bait used to lure the lobsters into traps -- herring -- isn't as abundant as they are. Herring stocks along the Maine coast haven't collapsed as some other fisheries have, but the catch has fallen in recent years, to 77 million pounds in 2016 from 103 million in 2014 and more than 150 million some years in the 1950s and 1960s.
On average, it takes about a pound of herring to catch a pound of lobster.
by Justin Fox, Bloomberg | Read more:
Image: Justin Sullivan/Getty Images

Friday, May 19, 2017
Roger Ailes Was One of the Worst Americans Ever
On the Internet today you will find thousands, perhaps even millions, of people gloating about the death of elephantine Fox News founder Roger Ailes. The happy face emojis are getting a workout on Twitter, which is also bursting with biting one-liners.
When I mentioned to one of my relatives that I was writing about the death of Ailes, the response was, "Say that you hope he's reborn as a woman in Saudi Arabia."
Ailes has no one but his fast-stiffening self to blame for this treatment. He is on the short list of people most responsible for modern America's vicious and bloodthirsty character.
We are a hate-filled, paranoid, untrusting, book-dumb and bilious people whose chief source of recreation is slinging insults and threats at each other online, and we're that way in large part because of the hyper-divisive media environment he discovered.
Ailes was the Christopher Columbus of hate. When the former daytime TV executive and political strategist looked across the American continent, he saw money lying around in giant piles. He knew all that was needed to pick it up was a) the total abandonment of any sense of decency or civic duty in the news business, and b) the factory-like production of news stories that spoke to Americans' worst fantasies about each other.
Like many con artists, he reflexively targeted the elderly – "I created a TV network for people from 55 to dead," he told Joan Walsh – where he saw billions could be made mining terrifying storylines about the collapse of the simpler America such viewers remembered, correctly or (more often) incorrectly, from their childhoods.
In this sense, his Fox News broadcasts were just extended versions of the old "ring around the collar" ad – scare stories about contagion. Wisk was pitched as the cure for sweat stains creeping onto your crisp white collar; Fox was sold as the cure for atheists, feminists, terrorists and minorities crawling over your white picket fence. (...)
Ailes grew out of the entertainment world – his first experience was in daytime variety TV via The Mike Douglas Show – but he later advised a series of Republican campaigns, from Ronald Reagan to George H.W. Bush to Trump.
So when he created Fox, he merged his expertise from those two worlds, mixing entertainment and political stagecraft.
The effect was to politicize the media, a characteristic of banana republics everywhere. When Ailes decided to cordon off Republican audiences and craft news programming targeted specifically to them, he began the process of atomizing the entire media landscape into political fiefdoms – Fox for the right, MSNBC for the left, etc.
Ailes trained Americans to shop for the news as a commodity. Not just on the right but across the political spectrum now, Americans have learned to view the news as a consumer product.
What most of us are buying when we tune in to this or that channel or read this or that newspaper is a reassuring take on the changes in the world that most frighten us. We buy the version of the world that pleases us and live in little bubbles where we get to nurse resentments all day long and no one ever tells us we're wrong about anything. Ailes invented those bubbles.
by Matt Taibbi, Rolling Stone | Read more:
Image: Jim Cooper/AP
We Aren’t Built to Live in the Moment
We are misnamed. We call ourselves Homo sapiens, the “wise man,” but that’s more of a boast than a description. What makes us wise? What sets us apart from other animals? Various answers have been proposed — language, tools, cooperation, culture, tasting bad to predators — but none is unique to humans.
What best distinguishes our species is an ability that scientists are just beginning to appreciate: We contemplate the future. Our singular foresight created civilization and sustains society. It usually lifts our spirits, but it’s also the source of most depression and anxiety, whether we’re evaluating our own lives or worrying about the nation. Other animals have springtime rituals for educating the young, but only we subject them to “commencement” speeches grandly informing them that today is the first day of the rest of their lives.
A more apt name for our species would be Homo prospectus, because we thrive by considering our prospects. The power of prospection is what makes us wise. Looking into the future, consciously and unconsciously, is a central function of our large brain, as psychologists and neuroscientists have discovered — rather belatedly, because for the past century most researchers have assumed that we’re prisoners of the past and the present.
Behaviorists thought of animal learning as the ingraining of habit by repetition. Psychoanalysts believed that treating patients was a matter of unearthing and confronting the past. Even when cognitive psychology emerged, it focused on the past and present — on memory and perception.
But it is increasingly clear that the mind is mainly drawn to the future, not driven by the past. Behavior, memory and perception can’t be understood without appreciating the central role of prospection. We learn not by storing static records but by continually retouching memories and imagining future possibilities. Our brain sees the world not by processing every pixel in a scene but by focusing on the unexpected.
Our emotions are less reactions to the present than guides to future behavior. Therapists are exploring new ways to treat depression now that they see it as stemming primarily not from past traumas and present stresses but from skewed visions of what lies ahead.
Prospection enables us to become wise not just from our own experiences but also by learning from others. We are social animals like no others, living and working in very large groups of strangers, because we have jointly constructed the future. Human culture — our language, our division of labor, our knowledge, our laws and technology — is possible only because we can anticipate what fellow humans will do in the distant future. We make sacrifices today to earn rewards tomorrow, whether in this life or in the afterlife promised by so many religions. (...)
The central role of prospection has emerged in recent studies of both conscious and unconscious mental processes, like one in Chicago that pinged nearly 500 adults during the day to record their immediate thoughts and moods. If traditional psychological theory had been correct, these people would have spent a lot of time ruminating. But they actually thought about the future three times more often than the past, and even those few thoughts about a past event typically involved consideration of its future implications.
When making plans, they reported higher levels of happiness and lower levels of stress than at other times, presumably because planning turns a chaotic mass of concerns into an organized sequence. Although they sometimes feared what might go wrong, on average there were twice as many thoughts of what they hoped would happen.
While most people tend to be optimistic, those suffering from depression and anxiety have a bleak view of the future — and that in fact seems to be the chief cause of their problems, not their past traumas nor their view of the present. While traumas do have a lasting impact, most people actually emerge stronger afterward. Others continue struggling because they over-predict failure and rejection. Studies have shown depressed people are distinguished from the norm by their tendency to imagine fewer positive scenarios while overestimating future risks. (...)
The brain’s long-term memory has often been compared to an archive, but that’s not its primary purpose. Instead of faithfully recording the past, it keeps rewriting history. Recalling an event in a new context can lead to new information being inserted in the memory. Coaching of eyewitnesses can cause people to reconstruct their memory so that no trace of the original is left.
The fluidity of memory may seem like a defect, especially to a jury, but it serves a larger purpose. It’s a feature, not a bug, because the point of memory is to improve our ability to face the present and the future. To exploit the past, we metabolize it by extracting and recombining relevant information to fit novel situations.
by Martin E.P. Seligman and John Tierney, NY Times | Read more:
Image: Maxwell Holyoke-Hirsch

Thursday, May 18, 2017
This New App Wants to Be the Uber of Camping
According to a report by the Outdoor Foundation, Americans log 598 million nights a year under the stars. At an average of $40 in expenses and fees per night, that’s $24 billion spent on campsites alone. Add in all the related costs—gear, transportation, food—and the Outdoor Industry Association figures the industry generates closer to $167 billion annually.
But former investment banker Michael D’Agostino, who grew up camping on a farm in Litchfield, Conn., still calls the industry a broken business.
The tipping point came a few summers ago, when D’Agostino found himself on vacation “directly across from a campsite of 40 people at a Wiccan convention: robes and UFO spotters and streaking and all.” It wasn’t what he’d imagined as a quiet weekend with his wife—counting stars, listening to crickets, bellies full from prime steaks grilled over a man-made fire. “We definitely took them up on some mead,” he said of the Wiccans, “but we had to keep the dog in the tent—she was going bonkers—and it was kind of like camping in Times Square.”
The experience led him to create Tentrr, a free iPhone app that takes the guesswork out of camping. It lets users find and instantly book fully private campsites in vetted, bucolic settings, all within a few hours’ drive of major cities. The sites themselves are all custom-designed by D’Agostino and follow a standardized footprint: They consist of hand-sewn canvas expedition tents from Colorado, set on an elevated deck with Adirondack chairs. You’re also guaranteed to find Brazilian wood picnic tables and sun showers strewn around the campsites, as well as portable camping toilets, fire pits, cookware, and grills. As for the sleeping arrangements? Air mattresses with featherbed toppers, not sleeping bags, are the name of the game.
Tentrr beta-launched last summer with just 50 campsites in New York state, while D’Agostino figured out how to get liability insurers on board with his slice of the sharing economy. Despite the soft opening, the app has already logged $4 million in funding and 1,500 bookings—40 percent of them by people who’d never gone camping before.
In the days leading up to Memorial Day, Tentrr will move past its beta phase with a newly expanded collection of roughly 150 campsites spread across the U.S. Northeast. By July 4 an additional 100 sites will gradually come online, not including a 50-site expansion into the Pacific Northwest. Next year, D’Agostino plans to tackle the “San Francisco-Yosemite corridor, the American Southwest, and counterclockwise around the perimeter of the U.S., all within a few hours of major metropolitan cities, until all of the country’s top-50 hubs are served.” His ultimate vision, however, is global.
The trick, said D’Agostino, is shifting campers away from national or state parks and working instead with private landowners. Among his campsite keepers are a set of fourth-generation dairy farmers, a contractor who runs a recording studio in his barn, and an “unnamed” actress with expansive property in New York’s Hudson Valley. All have dozens, if not hundreds, of acres to spare—making them perfect for Tentrr’s semipermanent campsites. (The tents are heated by cylinder stoves through November; after the camping season ends, either the tent keepers or Tentrr employees dismantle the sites and put them into weatherproof storage.)
It sounds limited, but Tentrr is setting up 10 to 20 campsites per week, with tent keepers paying a one-time, $1,500 membership fee to join. (It covers the setup of their site, which itself is valued at $6,000.) “We’ve been spreading by word-of-mouth like wildfire,” said D’Agostino. “We set up one camp, and one turns into 30.” But he’s wary of expanding too quickly and is limiting his company’s growth to no more than 35 new campsites per week—an effort to ensure demand continually outpaces supply.
by Nikki Ekstein, Bloomberg | Read more:
Image: Tertius Bune for Tentrr
Wednesday, May 17, 2017
Rod Rosenstein Saves the Republican Party From Itself
With the stroke of a pen, Rod Rosenstein redeemed his reputation, preserved the justice system, pulled American politics back from the brink — and, just possibly, saved the Republican Party and President Trump from themselves.
The deputy attorney general’s memo Wednesday night announcing that he had appointed Robert Mueller as special prosecutor to investigate the Trump administration’s ties to Russia was pitch perfect in its simple justification: While he has not determined that any crime has been committed, he wrote that “based upon the unique circumstances, the public interest requires me to place this investigation under the authority of a person who exercises a degree of independence from the normal chain of command.”
This is precisely what Rosenstein needed to do for all parties, but particularly for his own honor. Rosenstein, just two weeks into the job, had trashed the reputation he had built over the years as a fair-minded and above-the-fray prosecutor by allowing Trump to use him as cover for Trump’s own decision to sack FBI Director James Comey. Many who admired Rosenstein were stunned that he would let himself be used this way; I argued last week that “if he cares at all about rehabilitating the reputation he built, Rosenstein has one option: He can appoint a serious, independent and above-reproach special counsel — the sort of person Rosenstein was seen as, until this week — to continue the Russia probe.” In tapping Mueller — a solid figure who served ably as FBI director under two presidents — that’s what Rosenstein did.
Rosenstein also restores some confidence in a justice system that has been much abused by Trump’s assaults on “so-called” judges. That system was gravely wounded by Comey’s firing, ordered by Trump and overseen by Attorney General Jeff Sessions, who was supposed to have recused himself from the Russia probe but decided it was just fine to recommend the firing of the man overseeing that investigation and choose his replacement.
The deputy attorney general’s decision also reduces partisan pressures that were very clearly harming the national interest. Republicans had gone into a crouch to protect against any suggestion that Trump and his advisers colluded with the Russians. Democrats were often leaping to conclude that there was high-level collusion. And nearly everybody had lost track of the most important issue: Russia, arguably our leading global adversary, had successfully meddled in a U.S. presidential election — undermining confidence in our system of government — and was ready to do it again. (...)
At a news conference Wednesday morning, Ryan, reading from a typewritten statement, gave what amounted to a generous Trump defense. Ryan alleged that “there are some people out there who want to harm the president,” and said of Comey: “If this happened as he allegedly describes, why didn’t he take action at the time?” Ryan dismissed “speculation and innuendo,” saying “there’s clearly a lot of politics being played.” He cited the acting FBI director as saying “no one has tried to impede” the FBI probe. “There is plenty of oversight that is being done,” Ryan assured all. Walking out, he was asked if he had “full confidence” in Trump. Ryan paused briefly mid-stride and said, softly, “I do.”
It was a huge gamble by the top Republican in Congress. Ryan’s defense of Trump is a calculation that Trump will ride out the troubles. He is betting his political fortunes — and perhaps his party’s hold on the House — on a man who has provided very little justification for trust.
Rosenstein’s action rescues Ryan, McConnell and other GOP leaders from their own cowardice in refusing to demand more accountability from Trump.
Trump no doubt will feel betrayed by Rosenstein, as he felt betrayed by Comey. He was already feeling awfully sorry for himself, saying Wednesday that no politician “has been treated worse or more unfairly.”
Now he is to be treated to the luxury of his own, personal special prosecutor. If past is prologue, Mueller’s investigation will be a huge distraction for the White House as everybody “lawyers up” and attention shifts from what remains of Trump’s agenda to the latest twists and turns that can be discerned. I covered the Clinton White House during the Monica Lewinsky investigation, and I don’t doubt that this probe, like that one, could prove debilitating to Trump.
But Trump’s agenda was already moribund. A mere 117 days into his presidency, Trump has already amassed a collection of scandals and failures that most presidents take years to acquire. Even before the latest debacles over Comey’s firing, his memos and Trump’s handing secrets to Russia, Trump’s political capital had been drained by the health-care woes. It’s hard to see how legislative momentum can be restored now that Washington has settled into scandal mode. Trump has the waddle of a prematurely lame duck. The Mueller appointment, at least, gives the Trump White House a chance to compartmentalize the scandals. And, crucially, it provides one more watchdog keeping Trump’s autocratic instincts from getting the better of him — and the rest of us. (...)
It’s often said that all that is necessary for the triumph of evil is for good men to do nothing. On Wednesday night, Rod Rosenstein did something.
[ed. UPDATE: Except, there may be one small problem.]
by Dana Milbank, Washington Post | Read more:
Image: Aaron Bernstein/Reuters
Writing About Charlie Brown Feels Like Writing About Myself
I can’t write objectively about Charlie Brown. It feels like I’m writing about myself.
This, I realize, is no accident.
I know that Charlie Brown is the type of character consciously designed to make people feel like they’re looking at an image of themselves. If you can’t empathize with Charlie Brown, you likely lack an ability to empathize with any fictional character. Here is a child continually humiliated for desiring nothing more than normalcy—the opportunity to kick a football, the aptitude to fly a kite, the freedom to walk down the sidewalk without having a random acquaintance compare his skull to a block of lumber. He wants glory, but not an excessive amount (one baseball victory would be more than enough). He has the coolest dog in town, but that plays to his disadvantage. He’s an eight-year-old who needs a psychiatrist, and he has to pay the bill himself (only five cents, but still). Charlie Brown knows his life is a contradictory struggle, and sometimes his only option is to lie in a dark room, alone with his thoughts. He will never win. He will never win.
Yet here’s the paradox: Charlie Brown is still happy. He still has friends. He still gets excited about all the projects that are destined to fail. Very often, young Americans are simultaneously pessimistic about the world and optimistic about themselves—they assume everyone’s future is bleak, except for their own. Charlie is the opposite. He knows he’s doomed, but that doesn’t stop him from trying anything and everything. He believes existence is amazing, despite his own personal experience. It’s the quality that makes him so infinitely likable: He does not see the world as cruel. He believes the world is good, even if everything that’s ever happened to him suggests otherwise. All he wants are the things everyone else seems to get without trying. He aspires to be average, which—for him—is an impossible dream.
I suppose nobody feels this way all the time. But everybody feels this way occasionally.
Charles M. Schulz died on February 12, 2000. The final Peanuts strip ran the very next day, a coincidence noted by virtually everyone who cared about the man and his work. In the years since his passing, I’ve noticed a curious trend: For whatever reason, it’s become popular to assert that the spiritual center of the Peanuts universe is not Charlie Brown. The postmodern answer to that puzzle is Snoopy—dynamic, indefatigable, and hyperimaginative. Perception has drifted toward what the public prefers to celebrate. It’s a little like what happened on the TV show Happy Days: A sitcom originally focused on milquetoast Richie Cunningham rapidly evolved into a vehicle for the super‑coolness of Fonzie. Obviously, this type of paradigm shift is no crime against humanity, and I love Snoopy almost as much as his owner (he’s a wonderful dancer and my all-time favorite novelist). But Snoopy is not the emotional vortex of Peanuts. That’s simply wrong. The linchpin to Peanuts will always be Charlie Brown. It can be no one else. And this is because Charlie Brown effortlessly embodies what Peanuts truly is: an introduction to adult problems, explained by children.
The probable (read: inevitable) death of daily newspapers will have a lot of collateral damage, to varying degrees of impact. I don’t know where the gradual disappearance of the Sunday comics falls on this continuum, or even if it belongs at all. I assume something else will come to occupy its role in the culture, and the notion of bemoaning such a loss will be categorized as nostalgia for a period when the media was controlled by dinosaurs who refused to accept that the purpose of every news story was to provide random people the opportunity to publicly comment on how they felt about it. But I will miss the Sunday comics. I miss them already.
As a kid, I loved the idea that there was at least one section of the newspaper directly targeted at my brain; as an adult, it was reassuring to read something that was still the exact same product I remembered from the past. It was static in the best possible way. Like most people, I moved through various adolescent phases where different strips temporarily became my obsession: Garfield in fifth grade, The Far Side throughout high school, Calvin and Hobbes as a college boozehound. But I always considered Peanuts the most “important” comic strip, and the one that all other strips were measured against. The fact that Peanuts was the first strip on the very top of the Sunday comics’ front page verified this subjective belief—if comics were rock bands, it seemed obvious that Peanuts was the Beatles. (...)
“It’s depressing to realize that you’re so insignificant you haven’t got a chance ever to become president,” Charlie Brown tells Lucy on page 76 (it’s the second week of June, 1957). “It wouldn’t be so bad if I thought I had some chance.” Like so much of the classic Peanuts banter, he makes these remarks apropos of nothing—it’s just something he’s suddenly worried about, for no clear reason. Lucy, of course, obliterates Charlie for voicing this trepidation, mocking him with a tsunami of faint praise, almost as if he had somehow claimed he was destined for political greatness. Is her response amusing? I suppose it’s a little amusing. But it’s mostly dark (and entirely true). At the age of eight, Charlie Brown is considering a reality that most people don’t confront until much later: a realization that the future is limited. It’s not that he desperately wants to become Dwight Eisenhower—it’s the simple recognition that this couldn’t happen even if he did. He’s confronting the central myth of childhood, which is that anyone can be anything. Charlie Brown represents the downside of adult consciousness. And what does Lucy represent? Lucy represents the world itself. Lucy responds the way society always responds to any sudden insight of existential despair: How did you not know this already, Blockhead?
It doesn’t matter how many times this sort of thing has happened before. It will never stop happening, to Charlie Brown or anyone else. Like I said—Charlie Brown knows he’s doomed. He absolutely, irrefutably knows it. But a little part of his mind always thinks, “Maybe not this time, though.” That glimmer of hope is his Achilles’ heel. It’s also the quality that makes him so eminently relatable. The joke is not that Charlie Brown is hopeless. The joke is that Charlie Brown knows he’s hopeless, but he doesn’t trust the infallibility of his own insecurity. If he’s always wrong about everything, perhaps he’s wrong about this, too.
When Charlie mentions the impossibility of his own presidential fantasy, there’s a vague sense that he wants Lucy to tell him he’s mistaken. And at first (of course), Lucy does exactly that. She says “maybe.” And then (of course) she does what she always does. She reminds Charlie Brown that he is Charlie Brown. Which is how I suspect Charles M. Schulz felt about himself, up until the very end: “No matter what I do or what I try, I’m always going to be myself.”
by Chuck Klosterman, LitHub | Read more:
Image: via:
How Boomers Ruined the World
The day before I finished reading A Generation of Sociopaths, who should pop up to prove Bruce Cannon Gibney’s point, as if he had been paid to do so, but the notorious Joe Walsh (born 1961), former congressman and Obama denigrator. In answer to talkshow host Jimmy Kimmel’s plea for merciful health insurance, using his newborn son’s heart defect as an example, Walsh tweeted: “Sorry Jimmy Kimmel: your sad story doesn’t obligate me or anyone else to pay for somebody else’s health care.” Gibney’s essential point, thus proved, is that boomers are selfish to the core, among other failings, and as a boomer myself, I feel the “you got me” pain that we all ought to feel but so few of us do.
Gibney is about my daughter’s age – born in the late 1970s – and admits that one of his parents is a boomer. He has a wry, amusing style (“As the Boomers became Washington’s most lethal invasive species … ”) and plenty of well parsed statistics to back him up. His essential point is that by refusing to make the most basic (and fairly minimal) sacrifices to manage infrastructure, address climate change and provide decent education and healthcare, the boomers have bequeathed their children a mess of daunting proportions. Through such government programmes as social security and other entitlements, they have run up huge debts that the US government cannot pay except by, eventually, soaking the young. One of his most affecting chapters is about how failing schools feed mostly African American youth into the huge for-profit prison system. Someday, they will get out. There will be no structures in place to employ or take care of them.
The boomers have made sure that they themselves will live long and prosper, but only at the expense of their offspring. That we are skating on thin ice is no solace: “Because the problems Boomers created, from entitlements on, grow not so much in linear as exponential terms, the crisis that feels distant today will, when it comes, seem to have arrived overnight.” As one who has been raging against the American right since the election of Ronald Reagan, as someone with plenty of boomer friends who have done the same, I would like to let myself off the hook, but Gibney points out that while “not all Boomers directly participated, almost all benefited; they are, as the law would have it, jointly and severally liable”.
Gibney’s theories about how we boomers got to be sociopaths (inclined to “deceit, selfishness, imprudence, remorselessness, hostility”) are a little light: no experience of the second world war, unlike the Europeans; coddled childhoods owing to 1950s prosperity; and TV – “a training and reinforcement mechanism for deceit”, not to mention softening viewers up for ever more consumption of goods.
My own theories are based on my experience of the cold war. I think that the constant danger of nuclear annihilation and the drumbeat on TV and radio of the Soviet threat raised our fight-flight instincts so that some of us became overly cautious (me) and others overly aggressive (Dick Cheney). I also think that our parents were not “permissive”, but that they produced too many children in an era when there was nothing much for the children to do but get out of the house and into trouble – few time-consuming tasks around the house or on the farm, plus bored mothers and absent fathers, who felt a sense of despair when they compared themselves with the shiny advertisements of middle-class perfection they saw everywhere, not just on TV. This was what America had to offer – washing machines, high heels, perfect hairdos, Corn Flakes, TV dinners, patriotism and imminent destruction.
by Jane Smiley, The Guardian | Read more:
Image: Lambert/Getty Images
Politics 101
When the World Is Led by a Child
It’s Chicken or Fish
This Isn’t Smoke. It’s Fire.
This Is How Trump’s NatSec Aides Get Him To Pay Attention To His Briefings
Chelsea Manning Is a Free Woman: Her Heroism Has Expanded Beyond Her Initial Whistleblowing
Will Sean Spicer Take the Fall?
Former FBI head Robert Mueller to oversee Trump-Russia investigation
Rod Rosenstein saves the Republican Party from itself
[ed. Stay tuned. It's only Wednesday.]
My Family’s Slave
The ashes filled a black plastic box about the size of a toaster. It weighed three and a half pounds. I put it in a canvas tote bag and packed it in my suitcase this past July for the transpacific flight to Manila. From there I would travel by car to a rural village. When I arrived, I would hand over all that was left of the woman who had spent 56 years as a slave in my family’s household.
Her name was Eudocia Tomas Pulido. We called her Lola. She was 4 foot 11, with mocha-brown skin and almond eyes that I can still see looking into mine—my first memory. She was 18 years old when my grandfather gave her to my mother as a gift, and when my family moved to the United States, we brought her with us. No other word but slave encompassed the life she lived. Her days began before everyone else woke and ended after we went to bed. She prepared three meals a day, cleaned the house, waited on my parents, and took care of my four siblings and me. My parents never paid her, and they scolded her constantly. She wasn’t kept in leg irons, but she might as well have been. So many nights, on my way to the bathroom, I’d spot her sleeping in a corner, slumped against a mound of laundry, her fingers clutching a garment she was in the middle of folding.
To our American neighbors, we were model immigrants, a poster family. They told us so. My father had a law degree, my mother was on her way to becoming a doctor, and my siblings and I got good grades and always said “please” and “thank you.” We never talked about Lola. Our secret went to the core of who we were and, at least for us kids, who we wanted to be.
After my mother died of leukemia, in 1999, Lola came to live with me in a small town north of Seattle. I had a family, a career, a house in the suburbs—the American dream. And then I had a slave.
by Alex Tizon, The Atlantic | Read more:
Image: Alex Tizon
Tuesday, May 16, 2017
The Gospel According to Mitch
It will surprise no one to hear that politicians are hypocrites. Even the word “politics” today works as a de facto synonym for not practicing what you preach. To be a “skilled politician” means you’re good at saying all the right things while hiding your intent to do the opposite most of the time. Only within the morally corrupt confines of the Beltway is the phrase regarded as a compliment.
For millennia now, moralists have assailed hypocrisy not only as a despicable personal trait, but also a stain on one’s soul. Even if you could fool other people into believing what you say, even if they never caught on to your self-serving and double-crossing, God could see straight through you. One way or another, you will be judged. People in general, and voters in particular, despise hypocrisy. Actions speak louder than words, and empty promises will come back to bite you. You can only bullshit your fellow humans so much—eventually they will catch on and hold you accountable.
The problem, though, is that none of this is true. The cup of political history overfloweth with proof that reliably rank moral dishonesty pays off in public life—and one of the most glaring cases in point is the senior Kentucky senator and Senate majority leader, Mitch McConnell.
I have developed an unhealthy obsession with McConnell’s political career, not because there’s anything interesting about him personally (the only universally shared opinion about McConnell seems to be that he’s got the charisma of a tub of Vaseline), but because he’s the purest embodiment of some of our most significant political contradictions. And the baseline contradiction from which all the others flow is this: if hypocrisy is such a unanimously despised trait, then how did someone like McConnell become one of the most powerful people in the country?
Having read numerous lengthy profiles of one of the most outwardly boring people in the galaxy, I’ll spare you the long yarns about how a Southern boy who contracted polio at age two ascended the local and national ranks of government without ever losing a single electoral race. For a thorough account of McConnell’s career that simultaneously traces the evolution of the GOP over the past four decades, read Alec MacGillis’s sharp biography The Cynic: The Political Education of Mitch McConnell. If you have the stomach for it, compare MacGillis’s book with McConnell’s own propagandistic memoir, The Long Game.
Perhaps the single most perplexing feature of McConnell’s life as a professional politician is the most painfully obvious one: the guy is the epitome of unlikeability. From the beginning of his political career, friends and competitors alike have remarked on McConnell’s coldness and lack of basic amiability, his astoundingly bland and awkward bearing as an orator, and, of course, his flaccid physical demeanor, like a turtle without a shell. In his 1977 campaign for county judge in Jefferson County, Kentucky, McConnell raised enough money to hire the (very expensive) pollster and strategist Tully Plesser along with the ad producer Robert Goodman. Goodman himself said of McConnell, his own client, “He isn’t interesting. He doesn’t have an aura, an air of mystery about him.” Mitch McConnell is the human equivalent of eggshell paint. He’s a bowl of porridge whose girlfriend dumped him for gruel. You get the picture.
More perplexing still is this reptilian nonentity’s nugatory track record in terms of doing anything to incrementally improve our shared public life. You’d be hard pressed to find someone outside of D.C. who knows or remembers McConnell for remotely good reasons. For liberals and leftists, McConnell’s impeccably punchable face has been the symbol of cynical Republican obstructionism over the last eight years. And for the terminally aggrieved conservative base that Trump stole away from the GOP establishment, McConnell was often painted as too ready to reach compromises with the Obama administration, especially on the showdowns over raising the debt ceiling (2011) and avoiding the fiscal cliff (2013). On the far right, McConnell’s a spineless “cuckservative” puppet of corporate interests, plain and simple.
These latter complaints will, no doubt, baffle anyone left of center. After all, this is the same man who famously declared in 2010 that “The single most important thing we want to achieve is for President Obama to be a one-term president.” This is also the man who followed through on that pledge by leading the GOP’s congressional charge to throw sand in the gears of government at every turn during Obama’s presidency.
Beyond outright petulance, the logic behind McConnell’s strategy was clear. As Michael Grunwald explained in Politico, “Republican leaders simply did not want their fingerprints on the Obama agenda; as McConnell explained, if Americans thought D.C. politicians were working together, they would credit the president, and if they thought D.C. seemed as ugly and messy as always, they would blame the president.” (That this strategy did not, in fact, make Obama a one-term president had little to do with McConnell’s search-and-destroy legislative philosophy, and almost everything to do with the GOP’s nomination of private-equity Fauntleroy Mitt Romney as the president’s 2012 challenger.)
Obama stoked the hopes of voters in 2008 for a “post-partisan” way of doing politics that would allegedly put country over party differences. And in the wake of a disastrous Bush presidency, capped off by a crippling economic recession, it appeared that the buoyant Obama wave was pushing the modern GOP closer and closer to oblivion. If the American people began to sense that things were, indeed, getting better under Obama, that would be the death knell for the modern Republican party.
When others in the party began to panic, though, McConnell buckled down. Harkening back to the infamous tactics of Newt Gingrich, McConnell followed this fathomlessly cynical logic to its culmination, weaponizing his branch of Congress to deny the Obama administration any chance whatsoever to claim post-partisanism was working, even if that meant torpedoing the public’s faith in government entirely.
McConnell’s plan proved a (quite literal) smashing success. After eight years of intentionally driving the government into crippling gridlock, McConnell at last has everything he ever wanted—Obama’s gone, Republicans control every branch of government, and he’s fastened his turtle chompers onto the job he’s obsessed over for most of his adult life. In The Long Game, McConnell confesses that, while just about every ambitious senator on the Hill is gunning for the ultimate prize of one day commanding the Oval Office, this was never his goal. “When it came to what I most desired,” he writes, “and the place from which I thought I could make the greatest difference, I knew deep down it was the majority leader’s desk I hoped to occupy one day.” That day came in January of 2015.
There was one big unforeseen consequence, though. As one of the chief architects of the GOP’s scorched-earth strategy during the Obama years, McConnell had created the basic conditions for the Senate’s—and indeed, the GOP’s—own public immolation. Even if it meant filibustering their own proposals, Republicans wanted to expose the useless guts of a broken system to the public and try to pin as much of the blame on Obama as possible. In the process of burning down Washington, though, they cleared a path for the anti-Obama, a loud-mouthed beast who would capitalize on the collective lost faith in the government establishment they themselves had used to fuel a fire that was now burning beyond their control.
This is what makes McConnell such an easy target now. After years of intransigent, uncompromising warfare with the Obama vision, he now must figure out some way to jumpstart the same machine he’s tried so hard to drive into the dirt. It is thus with a peculiar mixture of schadenfreude and fury that we are now treated to the ongoing spectacle of Mitch’s hypocrisy—Mitch-pocrisy, if you will—laid bare. As with Trump, the law of digital irony continuously seems to affirm that, for every injunction McConnell makes during the current administration, there’s a clip somewhere of him saying the exact opposite during the Obama years. (...)
These recent examples of McConnell’s outlandish hypocrisy are just the tip of the iceberg; he has spent his entire career flip-flopping. As John Yarmuth, Kentucky’s only Democratic congressman, told a union crowd in 2014, “Mitch McConnell has been the same cold-hearted, power-hungry politician for the entire forty-six years I’ve known him . . . He’s like a windmill—whichever way the wind blows, he goes. He doesn’t have any core values. He just wants to be something. He doesn’t want to do anything.” Perhaps what’s most distressing about this is that virtually none of us register it as anything resembling news. Everyone knows McConnell is a slimy hypocrite. What’s worse, everyone knows that his hypocrisy works.
Health Insurers Bilk Medicare for Billions
When Medicare was facing an impossible $13 trillion funding gap, Congress opted for a bold fix: It handed over part of the program to insurance companies, expecting them to provide better care at a lower cost. The new program was named Medicare Advantage.
Nearly 15 years later, a third of all Americans who receive some form of Medicare have chosen the insurer-provided version, which, by most accounts, has been a success.
But now a whistle-blower, a former well-placed official at UnitedHealth Group, asserts that the big insurance companies have been systematically bilking Medicare Advantage for years, reaping billions of taxpayer dollars from the program by gaming the payment system.
The Justice Department takes the whistle-blower’s claims so seriously that it has said it intends to sue the whistle-blower’s former employer, UnitedHealth Group, even as it investigates other Medicare Advantage participants. The agency has until the end of Tuesday to take action against UnitedHealth.
In the first interview since his allegations were made public, the whistle-blower, Benjamin Poehling of Bloomington, Minn., described in detail how his company and others like it — in his view — gamed the system: Finance directors like him monitored projects that UnitedHealth had designed to make patients look sicker than they were, by scouring patients’ health records electronically and finding ways to goose the diagnosis codes.
The sicker the patient, the more UnitedHealth was paid by Medicare Advantage — and the bigger the bonuses people earned, including Mr. Poehling.
In February, a federal judge unsealed the lawsuit that Mr. Poehling filed against UnitedHealth and 14 other companies involved in Medicare Advantage.
“They’ve set up a perfect scheme here,” Mr. Poehling said in an interview. “It was rigged so there was no way they could lose.”
A spokesman for UnitedHealth, Matthew A. Burns, said the company rejected Mr. Poehling’s allegations and would contest them vigorously. (...)
Mr. Poehling’s suit, filed under the False Claims Act, seeks to recover excess payments, and big penalties, for the Centers for Medicare and Medicaid Services. (Mr. Poehling would earn a percentage of any money recovered.) The amounts in question industrywide are mind-boggling: Some analysts estimate improper Medicare Advantage payments at $10 billion a year or more.
At the heart of the dispute: The government pays insurers extra to enroll people with more serious medical problems, to discourage them from cherry-picking healthy people for their Medicare Advantage plans. The higher payments are determined by a complicated risk scoring system, which has nothing to do with the treatments people get from their doctors; rather, it is all about diagnoses. (...)
Mr. Poehling said the data-mining projects that he had monitored could raise the government’s payments to UnitedHealth by nearly $3,000 per new diagnosis found. The company, he said, did not bother looking for conditions like high blood pressure, which, though dangerous, do not raise risk scores.
He included in his complaint an email message from Jerry J. Knutson, the chief financial officer of his division, in which Mr. Knutson urged Mr. Poehling’s team “to really go after the potential risk scoring you have consistently indicated is out there.”
“You mentioned vasculatory disease opportunities, screening opportunities, etc., with huge $ opportunities,” Mr. Knutson wrote. “Let’s turn on the gas!”
by Mary Williams Walsh, NY Times | Read more:
Image: NY Times
[ed. Anyone surprised? What will be surprising is if Congress and the Justice Department under Jeff Sessions actually do anything.]
Ladies Who Jam
"Jazz has the power to make men forget their differences and come together.” These are the words with which Quincy Jones inaugurated the first UNESCO International Jazz Day exactly five years ago.
Broadcasting on April 30 from Havana, Cuba, this year’s headliners include Herbie Hancock, Chucho Valdés, Carl Allen, Marc Antoine, Till Brönner, Antonio Hart, Marcus Miller, Kurt Elling, Gonzalo Rubalcaba, Ben Williams, Pancho Amat, César López, Ivan Lins, Igor Burman, Julio Padón, Richard Bona, and Bobby Carcasses, plus three notable jazzwomen: Cassandra Wilson, Esperanza Spalding, and Regina Carter.
If the X-Y energy sounds disproportionate in that lineup, just consider that Wynton Marsalis’s renowned Jazz at Lincoln Center Orchestra — among the best-paying gigs for an American jazz musician — has never once hired a permanent female member. This is all too common a story. While female jazz vocalists like Wilson and Spalding, who also plays bass, are somewhat de rigueur, the instrument section is overwhelmingly a masculine domain, which historically prizes aggressive self-confidence on the bandstand; it’s a job that requires frequent absences from home and family, and punishes women — particularly horn players — for being “unattractive” while “blowing hot.”
What’s more, research shows that the trumpet, trombone, and drums are still perceived as “masculine” instruments, while the flute, clarinet, and piano are considered feminine. In other words, sexual stereotyping of band instruments helps explain why boys are more likely to play the trombone, and girls the flute. For a long time, in fact, girls were prohibited from playing saxophones and percussion.
Of course, a penis is no prerequisite for playing jazz. It’s a social art. But as a freelance, ensemble-based industry, it remains largely a musical boys’ club whose members typically get a foot in the door by referrals through buddies. There are rarely any formal hiring procedures in place, or any public postings of openings in big bands or jazz ensembles. The jazz gender gap extends beyond the music — as the mastheads of leading jazz magazines show, less than 10 percent of jazz critics and journalists are women, and a player’s promotion hinges on mostly male-run booking agencies and jazz festival programmers.
In kicking off that first International Jazz Day, Jones described jazz as “the personification of transforming overwhelmingly negative circumstances into freedom, friendship, hope, and dignity.” A nice, inclusive interpretation of the music. But as a commercial business, jazz is among the most sexist sectors of the music industry.
Classical music, while not typically incubated in jazz’s red-light classrooms of bars and clubs, offers an intriguing comparison.
In the 1970s, women accounted for less than five percent of classical musicians. Then a musicians’ union mandated “blind audition” policies, which conceal the identity of performance candidates from the jury and decrease bias, be it conscious or unconscious. Today, 48 percent of symphony musicians in metropolitan areas are women, says Ellen Seeling, a professional trumpet player and chairperson of JazzWomen and Girls Advocates, the first and largest organization dedicated to promoting “the visibility of women and girl instrumentalists of all ethnicities in jazz” and advocating “for their inclusion in all aspects of the art form.”
The group’s mission poses the question: If pressure were applied to the hiring tactics of jazz orchestras, could women’s representation in the genre undergo a sea change similar to that in the classical world? Seeling hopes so.
She made headlines a couple years ago when summoning hundreds of musicians and a female-led band to stage a rally outside Jazz at Lincoln Center during a high-ticketed donors’ gala to advocate for blind auditions. But Seeling contends the very nature of jazz makes things a little more complicated.
“Jazz is cool, it’s rogue,” says Seeling, making air quotes. “It’s rogue and totally unregulated and misogynistic — even more so than rock ’n’ roll. Look at the Grammys house band, the SNL band, any of them. How many women do you see there?”
by Katie O’Reilly, LARB | Read more:
Image: uncredited
Broadcast on April 30 from Havana, Cuba, this year’s event features headliners Herbie Hancock, Chucho Valdés, Carl Allen, Marc Antoine, Till Brönner, Antonio Hart, Marcus Miller, Kurt Elling, Gonzalo Rubalcaba, Ben Williams, Pancho Amat, César López, Ivan Lins, Igor Butman, Julio Padrón, Richard Bona, and Bobby Carcassés, plus three notable jazzwomen: Cassandra Wilson, Esperanza Spalding, and Regina Carter.
If the X-Y energy sounds disproportionate in that lineup, just consider that Wynton Marsalis’s renowned Jazz at Lincoln Center Orchestra — among the best-paying gigs for an American jazz musician — has never once hired a permanent female member. This is all too common a story. While female jazz vocalists like Wilson and Spalding, who also plays bass, are somewhat de rigueur, the instrument section is overwhelmingly a masculine domain, which historically prizes aggressive self-confidence on the bandstand; it’s a job that requires frequent absences from home and family, and punishes women — particularly horn players — for being “unattractive” while “blowing hot.”
What’s more, research shows that the trumpet, trombone, and drums are still perceived as “masculine” instruments, while the flute, clarinet, and piano are considered feminine. In other words, sexual stereotyping of band instruments helps explain why boys are more likely to play the trombone, and girls the flute. For a long time, in fact, girls were prohibited from playing saxophones and percussion.
Of course, a penis is no prerequisite for playing jazz. It’s a social art. But as a freelance, ensemble-based industry, it remains largely a musical boys’ club whose members typically get a foot in the door by referrals through buddies. There are rarely any formal hiring procedures in place, or any public postings of openings in big bands or jazz ensembles. The jazz gender gap extends beyond the music — as the mastheads of leading jazz magazines show, less than 10 percent of jazz critics and journalists are women, and a player’s promotion hinges on mostly male-run booking agencies and jazz festival programmers.
In kicking off that first International Jazz Day, Jones described jazz as “the personification of transforming overwhelmingly negative circumstances into freedom, friendship, hope, and dignity.” A nice, inclusive interpretation of the music. But as a commercial business, jazz is among the most sexist sectors of the music industry.
Classical music, while not typically incubated in jazz’s red-light classrooms of bars and clubs, offers an intriguing comparison.
In the 1970s, women accounted for less than five percent of classical musicians. Then a musicians’ union mandated “blind audition” policies, which conceal the identity of performance candidates from the jury and decrease bias, be it conscious or unconscious. Today, 48 percent of symphony musicians in metropolitan areas are women, says Ellen Seeling, a professional trumpet player and chairperson of JazzWomen and Girls Advocates, the first and largest organization dedicated to promoting “the visibility of women and girl instrumentalists of all ethnicities in jazz” and advocating “for their inclusion in all aspects of the art form.”
The group’s mission poses the question: If pressure were applied to the hiring tactics of jazz orchestras, could women’s representation in the genre undergo a sea change similar to that in the classical world? Seeling hopes so.
She made headlines a couple of years ago when she summoned hundreds of musicians and a female-led band to stage a rally outside Jazz at Lincoln Center during a high-ticket donors’ gala to advocate for blind auditions. But Seeling contends the very nature of jazz makes things a little more complicated.
“Jazz is cool, it’s rogue,” says Seeling, making air quotes. “It’s rogue and totally unregulated and misogynistic — even more so than rock ’n’ roll. Look at the Grammys house band, the SNL band, any of them. How many women do you see there?”
by Katie O’Reilly, LARB | Read more:
Image: uncredited
Monday, May 15, 2017
The Startup Industry’s Toxic “Side Hustle” Fixation
A handsome man gazes at the camera. “These days, everyone needs a side hustle,” he shrugs. We cut to scenes from his well-lit life, and it’s a mix of pleasant chauffeur jaunts, and hangout sessions with his daughter, dog, and pals. “Earning, chilling, earning, chilling,” the man sing-songs, a prosperous avatar for enviable work-life balance. His existence is so delightfully calibrated, I could play the scene for my therapist to best explain what I’m aiming for, except I won’t do that, because the man is an actor and he’s in an ad for Uber. The transit company has embraced the concept of “side hustle” to entice people to become contractor-taxis, spinning the idea of having a second job as a form of freedom, a salvation from drudgery. “Get your side hustle on,” Uber’s website beckons new drivers.
Uber is the most prominent business in startup culture to explicitly use the term as a way to sell piecemeal labor as a savvy lifestyle choice, but the phrase is frequently deployed within the startup industry to hype all sorts of gig-economy work. Websites like Side Hustle Academy and books like Side Hustle to Success and Side Hustle Blueprint promise readers they’ll explain how to build wealth as an extracurricular habit. A marketing consultant who refers to herself as a “digital nomad” published a self-help book called The Side Hustle Gal. It’s like those spambot comments at the bottom of blog posts — make extra money working from home — were interpreted credulously and turned into an economic game plan by a cadre of self-published wannabe Suze Ormans. But the way Uber and startup culture have co-opted and bowdlerized the phrase into an anodyne signifier of entrepreneurialism is gallingly hollow. The side hustle is a survival mechanism, not an aspirational career track.
Two definitions of the “side hustle” are hyped by startup culture. One is the Uber interpretation, and it’s simple: side hustle as second gig. The other, what I’d call the “life coach” definition, is a little more specific: the side hustle as second gig that is also a passion project. While these two definitions are distinct, they are not so distant from one another. Both imagine that the side hustler is on track to a better life through ceaseless piecemeal labor rather than 9-to-5 employment. Even in Uber’s estimation, the “side hustle” is a sanguine endeavor, something that makes life easier, a way to grease one’s most ambitious life track. It’s a captivating tale. The idea that success depends on after-hours striving speaks to an archetypically American combination of values — the national preoccupation with work ethic and individualism. It also misconstrues economic reality to make companies like Uber look like benevolent job creators rather than businesses tailored to maximize profit while shifting as much financial risk onto contractors as possible.
Companies like Uber want to spit-shine the concept of “side hustle” so it looks like a better alternative to steady, gainful employment. If people see gig-economy labor as a flexible stepping stone to a better life, they’re less likely to also see it as a force eroding a work culture with protections and benefits for employees in favor of a low-ball freelance marketplace. (Also, one that will eventually be automated, making their jobs obsolete.) Uber is selling a fantasy of economic advancement through the corrosion of employment benefits and stability, pitching increased subjugation to the corporation as some sort of salve. That Uber is promoting itself as a solution for the financially flailing is particularly eye-popping when one considers the company’s ultimate goal to eliminate the job of driving in favor of large-scale automation. (...)
I recently wrote about how startup multilevel marketing companies, like LuLaRoe and It Works!, are growing in popularity on social media. Many of these businesses push the idea that people can get rich by selling wares as a type of side hustle, but the reality is that the majority of contractors shilling for these companies make little to no money. This does not mean they are indolent or obtuse. Many startup gigs that are breathlessly pitched as ways to transcend the grind are, in fact, often more wearying, riskier, and less financially rewarding than salaried employment.
CNBC recently reported a story about how a college student earned $10,000 using a “side hustle app” called JoyRun. That sounds impressive until the figures are broken down. She worked around 12–20 hours a week for a year. That’s a classic part-time job making around $10–15 an hour. So it’s slightly better than the average wage at McDonald’s. The student’s “success” on this app was apparently rare enough to warrant media attention; what a closer look at the numbers reveals is how easily low-rung employment can get ginned up into a success story by slapping a hyperbolic “side hustle” narrative on it.
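The implied wage is easy to check with a back-of-envelope calculation. A minimal sketch, assuming a 52-week year (the article gives only the $10,000 total and the 12–20 hours-per-week range):

```python
# Implied hourly wage from the JoyRun figures: $10,000 over one year
# at 12-20 hours per week. A 52-week year is an assumption here.
total_earned = 10_000  # dollars over one year
weeks = 52

# Busier end of the range implies a lower hourly rate, and vice versa.
hourly_at_20h = total_earned / (20 * weeks)  # roughly $9.62/hr
hourly_at_12h = total_earned / (12 * weeks)  # roughly $16.03/hr

print(f"implied hourly wage: ${hourly_at_20h:.2f} to ${hourly_at_12h:.2f}")
```

At the busier end of the stated range, the implied rate actually dips below $10 an hour, which is exactly the article’s point.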
by Kate Knibbs, The Ringer | Read more:
Image: Getty/The Ringer
How Noncompete Clauses Keep Workers Locked In
Keith Bollinger’s paycheck as a factory manager had shriveled after the 2008 financial crisis, but then he got a chance to pull himself out of recession’s hole. A rival textile company offered him a better job — and a big raise.
When he said yes, it set off a three-year legal battle that concluded this past week but wiped out his savings along the way.
“I tried to get a better life for my wife and my son, and it backfired,” said Mr. Bollinger, who is 53. “Now I’m in my mid-50s, and I’m ruined.”
Mr. Bollinger had signed a noncompete agreement, designed to prevent him from leaving his previous employer for a competitor. These contracts have long been routine among senior executives. But they are rapidly spreading to employees like Mr. Bollinger, who do the kind of blue-collar work that President Trump has promised to create more of.
The growth of noncompete agreements is part of a broad shift in which companies assert ownership over work experience as well as work. A recent survey by economists including Evan Starr, a management professor at the University of Maryland, showed that about one in five employees was bound by a noncompete clause in 2014.
Employment lawyers say their use has exploded. Russell Beck, a partner at the Boston law firm Beck Reed Riden who does an annual survey of noncompete litigation, said the most recent data showed that noncompete and trade-secret lawsuits had roughly tripled since 2000.
“Companies of all sorts use them for people at all levels,” he said. “That’s a change.”
Employment lawyers know this, but workers are often astonished to learn that they’ve signed away their right to leave for a competitor. Timothy Gonzalez, an hourly laborer who shoveled dirt for a fast-food-level wage, was sued after leaving one environmental drilling company for another. Phillip Barone, a midlevel salesman and Air Force veteran, was let go from his job after his old company sent a cease-and-desist letter saying he had signed a noncompete. (...)
Alan B. Krueger, a Princeton economics professor who was chairman of President Barack Obama’s Council of Economic Advisers, recently described noncompetes and other restrictive employment contracts — along with outright collusion — as part of a “rigged” labor market in which employers “act to prevent the forces of competition.”
By giving companies huge power to dictate where and for whom their employees can work next, noncompetes take a person’s greatest professional assets — years of hard work and earned skills — and turn them into a liability.
“It’s one thing to have a bump in the road and be in between jobs for a little while; it’s another thing to be prevented from doing the only thing you know how to do,” said Max Burton Wahrhaftig, an arborist in Doylestown, Pa., who in 2013 was threatened by his former employer after leaving for a better-paying job with a rival tree service. He was able to avoid a full-blown lawsuit.
Noncompetes are but one factor atop a great mountain of challenges making it harder for employees to get ahead. Globalization and automation have put American workers in competition with overseas labor and machines. The rise of contract employment has made it harder to find a steady job. The decline of unions has made it tougher to negotiate.
But the move to tie workers down with noncompete agreements falls in line with the decades-long trend in which their mobility and bargaining power have steadily declined, and with it their share of company earnings.
When a noncompete agreement is litigated to the letter, a worker can be barred or ousted from a new job by court order. Even if that never happens, the threat alone can create a chilling effect that reduces wages throughout the work force.
“People can’t negotiate when their company knows they won’t leave,” said Sandra E. Black, an economics professor at the University of Texas at Austin.
How Untreated Depression Contributes to the Opioid Epidemic
It can sometimes seem strange how so much of the country got hooked on opioids within just a few years. Deaths from prescription drugs like oxycodone, hydrocodone, and methadone have more than quadrupled since 1999, according to the CDC. But pain doesn’t seem to be the only culprit: About one-third of Americans have chronic pain, but not all of them take prescription painkillers for it. Of those who do take prescription opioids, not all become addicted.
Several researchers now believe depression, one of the most common medical diagnoses in the U.S., might be one underlying cause that’s driving some patients to seek out prescription opioids and to use them improperly.
People with depression show abnormalities in the body’s release of its own endogenous opioid chemicals. Depression also tends to exacerbate pain: it makes chronic pain last longer and impedes recovery after surgery.
“Depressed people are in a state of alarm,” said Mark Sullivan, a professor of psychiatry at the University of Washington. “They’re fearful, or frozen in place. There’s a heightened sense of threat.” That increased threat sensitivity might also be what heightens sensations of pain.
Not only do people with depression tend to be more pain-sensitive, but the effect of opioids can, for some, feel as mood-elevating as an antidepressant.
“Depression is a mixed bag,” Sullivan said. “People can feel sluggish and uninterested, but they can also feel agitated, irritated, and anxious. They feel both unrelaxed and really unmotivated at the same time.”
Opioids might, at least temporarily, feel soothing and sedating. Indeed, several studies have found that buprenorphine, an opioid that is typically used to wean people off of heroin, has some antidepressant properties.
Sullivan and other researchers from Washington and California found in 2012 that depressed people were about twice as likely as non-depressed ones to misuse their painkillers for non-pain symptoms, and depressed individuals were between two and three times more likely to ramp up their own doses of painkillers. Adolescents with depression were also more likely, in one study, to use prescription painkillers for non-medical reasons and to become addicted.
In 2015, a different group of researchers found that depressed people were likely to keep using opioids, even when their pain had subsided and when they were more functional. “If the emotional pain, the depression, is never properly diagnosed or treated, the patient might continue taking the opioid because it’s treating something,” said Jenna Goesling, an assistant professor in the department of anesthesiology at the University of Michigan and an author of that study.
by Olga Khazan, The Atlantic | Read more:
Image: Jonathan Ernst/Reuters