Tuesday, October 9, 2018

The Distinction Between ‘High’ and ‘Low’ Pleasures

Parents often say that they don’t mind what their children do in life just as long as they are happy. Happiness and pleasure are almost universally seen as among the most precious human goods; only the most curmudgeonly would question whether benign enjoyment is anything other than a good thing. Disagreement soon creeps in, however, if you ask whether some forms of pleasure are better than others. Does it matter whether our pleasures are spiritual or carnal, intellectual or stupid? Or are all pleasures pretty much the same?

Utilitarianism, as a moral philosophy, puts pleasure at the centre of its concerns, arguing that actions are right to the extent that they increase happiness and decrease suffering, wrong to the extent that they cause the opposite. Yet even the early Utilitarians couldn’t agree about whether pleasures should be ranked. Jeremy Bentham believed that all sources of pleasure are of equal quality. ‘Prejudice apart,’ he wrote in The Rationale of Reward (1825), ‘the game of push-pin is of equal value with the arts and sciences of music and poetry.’ His protégé John Stuart Mill disagreed, arguing in Utilitarianism (1863) that: ‘It is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied.’

Mill argued for a distinction between ‘higher’ and ‘lower’ pleasures. His distinction is difficult to pin down, but it more or less tracks the distinction between capacities thought to be unique to humans and those we share with other animals. Higher pleasures depend on distinctively human capacities, which have a more complex cognitive element, requiring abilities such as rational thought, self-awareness or language use. Lower pleasures, in contrast, require mere sentience. Humans and other animals alike enjoy basking in the sun, eating something tasty or having sex. Only humans engage in art, philosophy and so on.

Mill was certainly not the first to make this distinction. Aristotle, among others, thought that the senses of touch and taste were ‘servile and brutish’; the pleasures of eating were ones that ‘brutes also share in’, and so less valuable than those that used the more developed human mind. Yet many would continue to side with Bentham, arguing that we are really not so intellectual and high-minded as all that, and we might as well accept ourselves for the brutes that we are, shaped by biochemistry and animal drives.

The difficulty with resolving this disagreement about the kinds of pleasure is not that we struggle to agree on the right answer. It’s that we’re asking the wrong question. The entire debate assumes a clear divide between the intellectual and bodily, the human and the animal, which is no longer tenable. These days, few of us are card-carrying dualists who believe that we are made of immaterial minds and material bodies. We have plenty of scientific evidence for the importance of biochemistry and hormones in all that we do and think. Nonetheless, dualistic assumptions still inform our thinking. So, what happens if we take seriously the idea that the physical and the mental are inseparable, that we are fully embodied beings? What would it mean for our ideas about pleasure?

The dining table is a good place to start. Along with sex, food is usually considered to be the quintessential lower pleasure. All animals eat, using the senses of smell and taste. It doesn’t require any complex cognition to conclude that something is delicious. Philosophers have generally assumed that to take pleasure in eating is simply to sate a primitive desire. So, for instance, Plato believed that cookery could never be a form of art, because it ‘never regards either the nature or reason of that pleasure to which she devotes herself, but goes straight to her end’.

Plato and his successors, however, failed to appreciate something that the French food writer Jean Anthelme Brillat-Savarin captured pithily in The Physiology of Taste (1825): ‘Animals feed; man eats; only the man of intellect knows how to eat.’ Brillat-Savarin drew a distinction between mere animal feeding, which is the ingestion of food as fuel, and human eating, which can and should engage more than just our most basic carnal desires. Eating is a complex act. Simply gathering the ingredients takes thought, since what we buy not only requires planning but affects the wellbeing of growers, producers, animals and the planet. Cooking involves knowledge of ingredients, the application of skills, the balancing of different flavours and textures, considerations of nutrition, care for the ordering of courses or the place of the dish in the rhythm of the day. Eating, at its best, brings all these things together, adding an attentive aesthetic appreciation of the end result.

Eating illustrates how the difference between higher and lower pleasures is not what you enjoy but how you enjoy it. Wolfing down your food like a pig at a trough is a lower kind of pleasure. Preparing and eating it using the powers of reflection and attention that only a human being possesses turns it into a higher pleasure. This form of higher pleasure need not be intellectual in the academic sense. An accomplished chef might be judging the balance of flavours and textures intuitively; a home cook might simply be thinking about what his guests are most likely to enjoy. What makes the pleasure higher is that it engages our more complex human abilities. It expresses more than just the brute desire to satisfy a craving.

It should not be difficult to see that, for every pleasure, the how matters more than the what. Furthermore, the highest pleasures do not merely use our distinctively human capacities, they use them for a valuable end. Someone who goes to the opera to be seen in a new dress is not experiencing the higher pleasures of music but indulging the lower pleasures of vanity. Someone who reads Dr Seuss with a careful ear for language gets a higher pleasure than someone who mechanically recites The Waste Land (1922) without any understanding of what T S Eliot was doing.

by Julian Baggini, Aeon | Read more:
Image: Ramen heaven. From Juzo Itami’s 1985 noodle-western Tampopo. Courtesy Criterion Collection

The Story of My Tits


The Story of My Tits, Jennifer Hayden

Named one of the Best Books of the Year by: The New York Times, NPR, Library Journal, Amazon, GQ, Comic Book Resources, Paste, Mental Floss, Forbes, and many more! (...)

"The Story of My Tits details three different women’s battle with breast cancer… and all the cultural and emotional weight that gets carried by a bra. What leavens the story is Hayden’s drawing, packed with wiggly detail, and her voice, irreverent and blunt... it’s a voice that faces heartbreak head-on and comes out swinging." — The Village Voice

"A living, breathing monument to the ways in which illness doesn’t just rewrite your life but allows you, in some fundamental way, to begin seeing, writing, and rewriting the shape and pattern of your own story." — Public Books

When Jennifer Hayden was diagnosed with breast cancer at the age of 43, she realized that her tits told a story. Across a lifetime, they’d held so many meanings: hope and fear, pride and embarrassment, life and death. And then they were gone.

Now, their story has become a way of understanding her story: a journey from the innocence of youth to the chaos of adulthood, through her mother’s mastectomy, her father’s mistress, her husband’s music, and the endlessly evolving definition of family.

As cancer strikes three different lives, some relationships crumble while others emerge even stronger, and this sarcastic child of the ‘70s finally finds a goddess she can believe in.

For everyone who’s faced cancer personally, or watched a loved one fight that battle, Hayden’s story is a much-needed breath of fresh air, an irresistible blend of sweetness and skepticism. Rich with both symbolism & humor, The Story of My Tits will leave you laughing, weeping, and feeling grateful for every day. -- a 352-page softcover graphic novel with French flaps (B&W Interiors), 8” x 8”

Joni Mitchell


“I don’t know if I’ve learned anything yet! I did learn how to have a happy home, but I consider myself fortunate in that regard because I could’ve rolled right by it. Everybody has a superficial side and a deep side, but this culture doesn’t place much value on depth — we don’t have shamans or soothsayers, and depth isn’t encouraged or understood. Surrounded by this shallow, glossy society we develop a shallow side, too, and we become attracted to fluff. That’s reflected in the fact that this culture sets up an addiction to romance based on insecurity — the uncertainty of whether or not you’re truly united with the object of your obsession is the rush people get hooked on. I’ve seen this pattern so much in myself and my friends and some people never get off that line.

“But along with developing my superficial side, I always nurtured a deeper longing, so even when I was falling into the trap of that other kind of love, I was hip to what I was doing. I recently read an article in Esquire magazine called ‘The End of Sex’ that said something that struck me as very true. It said: ‘If you want endless repetition, see a lot of different people. If you want infinite variety, stay with one.’ What happens when you date is you run all your best moves and tell all your best stories.

“You can’t do that with a longtime mate because he knows all that old material. With a long relationship, things die then are rekindled, and that shared process of rebirth deepens the love. It’s hard work, though, and a lot of people run at the first sign of trouble. You’re with this person, and suddenly you look like an asshole to them or they look like an asshole to you — it’s unpleasant, but if you can get through it you get closer and you learn a way of loving that’s different from the neurotic love enshrined in movies. It’s warmer and has more padding to it.” — Joni Mitchell

via:

How Much Power Do ‘Millennials’ Actually Have?

Whatever future civilization picks over the ruins of our own will find a curious ritual preserved in the archaeological record of the late 2010s — an outlet, they may suspect, for the anxieties of a society in free-fall. First comes a report revealing that young people have abandoned and destroyed yet another cornerstone of postwar American life, like country clubs or breakfast cereal or mayonnaise. Then young people read the report, for all the same reasons you pick at a scab, and completely lose their minds.

This future archaeologist might note how apt it was that these young people were given such a portentous, end-of-days name: millennials. At this point, “millennials” stand accused of “killing” so many industries that there is a whole meta-genre of articles mocking the cliché: Mashable’s “R.I.P.: Here Are 70 Things Millennials Have Killed,” BuzzFeed’s “Here Are 28 Things Millennials Are Accused of Killing in Cold Blood,” Broadly’s “I Did All the Things Millennials Are Accused of Killing.” Business Insider keeps something like an authoritative list of the body count, which includes casual-dining chains, golf, “breastaurants” like Hooters, diamonds, starter homes, homeownership in general, designer handbags and banks.

Millennials, clearly, are not living the lives of easy abundance bestowed on generations past — no fighting over the check at Outback Steakhouse, no need (or budget) for a station wagon. What young people seem to find galling is the implication that they’ve had any choice in this. Structural shifts in the economy — stagnant wages, the skyrocketing cost of housing, colossal student debt — have put millennials on the path to a lower quality of life than their parents. So when some Australian guy claims that the reason they can’t buy homes is that they spend too much on avocado toast, as the millionaire developer Tim Gurner did last year, it’s easy to see why young people would explode.

But these claims to abject poverty might actually miss the point. Taken as a whole, millennials do wield an incredible amount of economic and cultural power. This is why they terrify moribund industries, and why those industries are so desperate to fit themselves into young people’s curious lifestyles. (It’s hard to remember, but just a few years ago you could not yet use your phone to hire a stranger to go to Chipotle for you and bring your burrito bowl to your door, all so you can watch a cartoon about a depressed horse without interruption.) By virtue of their sheer numbers, millennials are revealing that much of the American way of life is no more permanent than the baby boomers who codified it. If you were one of those geriatric businesses, you would rightly see millennials as an existential threat — even as they continued to see themselves as powerless, completely battered by the world their elders built for them. At the heart of every online dust-up about millennials is a possibly unanswerable question: To what extent does a generation shape history, and to what extent is it shaped by history?

by Willy Staley, NY Magazine |  Read more:
Image: Derek Brahney/New Studio
[ed. This might be one reason why millennials (and everyone else) don't exert more political power: First Comes the ‘Shocking’ News. Then Comes the Navel-Gazing.]

BoJack Horseman and Better Call Saul

[ed. Note: spoiler alert]


BoJack Horseman’s Brilliant Crack-Up


It’s hard to think of a show currently on air that could make me want to watch a single character speak in one long, despairing stream for nearly a whole episode. Prolonged expressions of angst can sink live-action drama, which thrives on eventfulness and conflict. But BoJack Horseman—a cartoon sitcom whose title character is a melancholic, middle-aged stallion—inhabits a genre of its own, somewhere between slapstick and theater of the absurd. Midway through the show’s new season, BoJack (voiced by Will Arnett) wears a charcoal suit and stands at a pulpit next to a coffin. His mother has died. For over 20 full minutes, with no interruption, he delivers a brilliant, pained, rambling eulogy.

Written by the show’s creator, Raphael Bob-Waksberg, with brilliant art direction by Lisa Hanawalt, the monologue careens between pathos and black humor, delusion and acceptance—and is totally transfixing. BoJack doesn’t miss his mother so much as he despises her; he is angry that she’s left him without a sense of closure. He begins his story by saying that when he went to a fast-food place and said that his mother had died, the person behind the counter gave him a free churro. Later, he ties this anecdote up in a joke: “My mother died, and all I got was this free churro.” Then he adds, “That small act of kindness showed more compassion than my mother gave me her entire goddamn life.” His voice starts to break, as he finally confronts a lifetime of abuse from his mother. It is an aria of abjection and resentment. I’m still thinking about it, days later.

If this seems like heavy stuff for a cartoon, BoJack has earned it. Over five seasons, Bob-Waksberg and Hanawalt crafted a truly goofy world (there’s a spider who works as a playwright, multitasking with eight limbs, and an ingenue deer who has literal doe-eyes) that allows them to slip in and out of surreal, sometimes dark subject matter. In one episode, a pop star named Sextina Aquafina (a leotard-wearing dolphin) has a cynical hit song about abortion; in another, BoJack is present when one of his young mentees overdoses on heroin in a planetarium. In true Darwinian fashion, BoJack Horseman has evolved from an easy joke about a horse to one of the most complex and empathetic shows on television. (...)

BoJack has become, more than anything, a show about how hurt people hurt people. It is about generational trauma, and how abuse trickles down until someone works out how to stop the train. In his eulogy, BoJack muses on the nature of sitcoms as a metaphor for life. He says that in television writing, you can never have a happy ending, because then the show would be over: “There is always more show, I guess, until there isn’t.” His mother’s story may be over, but he is still living with the trauma of her life, still acting out its major scenes. He is caught in a loop—a fact underscored by the eerie sense that BoJack may not be delivering this speech to anyone at all, but may be standing in an empty room, or perhaps inventing the macabre setting in his mind. He often cues an off-screen drummer to play a snare riff after his jokes, which makes the episode feel like a dream sequence, a kind of nonsensical vaudeville act. (...)

There is a sticky cohesion to this episode, which is the apex of the season—it both stands alone and works as a mortar for the other characters’ stories (Diane travels to Vietnam in the numb wake of her divorce, Princess Carolyn is desperately trying to adopt a baby, the feckless Todd rockets to the top of the corporate ladder in a position he can neither handle nor control). This is what BoJack Horseman has been building up to for several seasons—it is a cathartic release and a cruel joke. The last words BoJack’s mother ever said to him were “I see you” from her hospital bed. It was “not a statement of judgment or disappointment,” he says, “just acceptance and the simple recognition of another person in a room. Hello there, you are a person, and I see you. Let me tell you, it is a weird thing to feel at 54 years old that for the first time in your life, your mother sees you.”

By the end of his speech, BoJack realizes that Beatrice was in the intensive care unit, and she was probably just reading the words “ICU” from a wall. He steels himself against this knowledge and says that he is relieved to finally know that, like all other creatures slithering and trotting and flapping their way through Hollywoo, he is truly on his own. Then, he looks up, and we finally see his audience: a confused-looking room full of reptiles, flicking their tongues. He is in the wrong funeral parlor. The ordeal sends him on a long bender, a dizzying descent toward tragedy. But for a moment, the show conveys all the ache of another person’s loss, whether he is man or beast.

by Rachel Syme, TNR | Read more:
Image: Netflix
***

Better Call Saul Ends a Bleak, Beautiful Season

“S’all good, man.”

That is the parting shot of Better Call Saul’s fourth season, as Jimmy McGill (Bob Odenkirk) is led away from his partner Kim Wexler (Rhea Seehorn) to sign papers that will return him to the field. “Good” is a concept that Jimmy’s gotten further away from during the run of the show, and there is nothing good about the way he achieves his victory here. Kim’s reaction to his revelation of insincerity — the same quality that caused a different tribunal to reject him the previous week — is wrenching in part because it stands in for viewers who like Jimmy, and who want to continue to see redeemable qualities along with his amazing facility for con games and improvised bullshit. It’s hard to see Jimmy going the other way between now and the end of this show, whenever that turns out to be. The look on Kim’s face at the end is heartbreaking: deep disappointment shading into nausea at what an empty-hearted manipulator Jimmy has become. (...)

The other characters end the season in morally precarious places, too — Mike (Jonathan Banks) more than anybody, after executing the runaway German engineer Werner Ziegler (Rainer Bock). This is the first killing we’ve seen Mike commit that was cold-blooded — housekeeping with a gun instead of a mop. He was on the gray scale already when we met him, but things are looking a lot darker now. Meanwhile, Mike’s boss, the canny individualist Gus Fring (Giancarlo Esposito), is on the brink of being fully absorbed into the Salamanca empire — I’d rather not say how that story resolves in Breaking Bad, as increasing numbers of people tell me that they’re watching BCS despite never having seen a frame of the other series — and Salamanca captain Nacho Varga (Michael Mando) is facing what looks like a purgatorial sentence, somewhat mirroring Jimmy’s season of doing without a law license. He’s forced to deal with a new Salamanca, Eduardo a.k.a. Lalo (Tony Dalton), not long after trying and failing to kill another one (Mark Margolis’s Don Hector, to whom Lalo presents the desk bell that will become his aural signature).

But for now, Kim is still the closest thing Better Call Saul has to a voice of conscience, her attraction to Jimmy an Achilles heel inseparable from her attraction to danger as well as her sentimental attachment to winning seemingly unwinnable cases. The switcheroo that she pulled with the architectural plans for Mesa Verde’s El Paso branch could also get her in trouble, thrilling as it was to team up with Jimmy again in what could’ve been a deleted scene from The Sting. Her hands are dirty for sure, but she hasn’t smeared herself from head to toe in muck like Jimmy, or knowingly slithered deeper into the swamp like Mike. But in one respect, Kim’s position might be the saddest of all, because she gets to watch somebody she cares about and believes in turn colder and more manipulative over time, with no reasonable hope of pulling him back in the other direction.

It’s a gut punch even if you knew that Jimmy McGill had to become Saul, that his corruption would be a subtle process, and that — as on Breaking Bad, the series that created Jimmy/Saul, as well as Mike, Gus, and other BCS regulars — it wouldn’t be the sort you could analyze like a soil sample and then display with each layer named and tagged. Saul was always present in Jimmy, just as Heisenberg was always present in Walter White. To the credit of both shows, the writing, directing, and acting gave you a lot of information to process but tended to stop short of telling you exactly how you were supposed to read it.

None of that numbs the bruising sadness of Better Call Saul’s fourth season finale, the capper to the best and bleakest season of this excellent comedy-drama. This batch of episodes embraced the bifurcated nature of the show, which always spent roughly half of its time in the high-dollar world of white-collar crime and the other half among physically violent drug dealers and street criminals — embraced it so fully, in fact, that it often felt as if we were watching two shows in one, starring three central protagonists (Jimmy, Mike, and Kim) and many major supporting players, all sliding on the moral spectrum between pure and corrupt, each landing farther along by the end.

Series creators Peter Gould and Vince Gilligan and their peerless writing staff (which includes Gennifer Hutchison, Alison Tatlock, Ann Cherkis, Thomas Schnauz, Heather Marion, and Gordon Smith) have created a series that’s nearly immaculate in its construction, with every story beat taking the form of a clearly laid-out montage, sequence, or theater-style scene — one that usually unfolds at a much slower pace than TV’s usual, leaving room for pregnant pauses, entrances and exits, and moments where we get to just stare at a character’s face as they contemplate what they’re about to do, what they’ve just done, or what’s about to happen to them. 

by Matt Zoller Seitz, Vulture | Read more:
Image: Nicole Wilder/AMC/Sony Pictures Television
[ed. I haven't seen BCS's fourth season yet and will have to wait until it's released in its entirety on Netflix. However, I did finally watch Bojack deliver his eulogy in the "Free Churro" episode last night, and it's both brilliant and riveting. Both shows are incredibly nuanced and intelligent. If you haven't seen one or the other, do yourself a favor and check them out. TV doesn't get much better than this.] 

Monday, October 8, 2018

What Sarah Palin Saw Clearly

Ten years ago, we first met Sarah Palin, then the governor of Alaska, for a series of TV interviews. At the time, Palin might not have been able to name a single newspaper or magazine—but she did read where the electorate, at least a significant part of it, was moving. Her candidacy revealed that long-standing political norms were being pushed aside by a new style of divisive, personality-driven populism. A decade later, it’s clear that Palin was more than a historical footnote; she was the harbinger of things to come.

In 2008, John McCain wanted to change politics with his selection of a running mate; his idea was to pick Joe Lieberman, an independent senator who caucused with the Democrats. According to aides, McCain wanted to confront extreme partisanship and forge a kind of national-unity government built on comity and compromise, pledging to serve a single term. But after Senator Lindsey Graham floated the idea, the hard-core party faithful rejected the notion out of hand. Faced with the choice of picking a fight with the most loyal (and ideological) Republican voters, or picking a more doctrinaire candidate, McCain decided to appease the base.

Historically, vice-presidential nominees have been selected for a variety of reasons, a combination of campaign politics and compatible skill sets. Despite the slogan “Country first,” the McCain team focused exclusively on politics—desperate for an edge that would help the GOP win a rare third term, even in a weak economy. (A month later, the economy would go from problematic to cataclysmic.)

We recently spoke with top McCain and Obama aides for our podcast marking 10 years since the Palin interviews. According to Steve Schmidt, a senior McCain adviser, Palin’s vetting did not include asking any questions like “Do you understand the U.S. tax system?” or “Do you know where Iraq is?” Schmidt said they simply assumed that a governor would be knowledgeable about public policy.

Fundamentally, it was the priority the campaign placed on optics—Palin’s outsider image and undeniable charisma—that led to the selection of a politician who believed that Saddam Hussein attacked the U.S. on 9/11 and that the British government was run by Queen Elizabeth. (...)

As the campaign went on, Palin bridled at the tone McCain set. When a McCain supporter said “I don’t trust Obama. I have read about him and he’s an Arab,” McCain responded, “No ma’am, no ma’am … He's [a] citizen that I just happen to have disagreements with on fundamental issues.” When one man said he was scared of Obama, McCain replied that “[Obama] is a decent person, and a person that you do not have to be scared [of] as president of the United States.” The crowd booed. McCain also said, “I admire Senator Obama and his accomplishments. I will respect him and I want everyone to be respectful, and let’s make sure we are.”

Palin took the opposite tack: She stoked her supporters’ fears—and won their cheers. At her rallies, Palin said, “I am just so fearful that this is not a man who sees America the way that you and I see America.” At one, a man shouted “Treason” and Palin said nothing. At another, Palin’s anti-Obama diatribe led a man to yell out, “Kill him!” Palin did not push back against her often-angry crowds. In the strongest echo of today’s Trump rallies, she instead used her speeches to go after the free press (or the “lamestream media”), reserving particular scorn for elite publications. Palin’s supporters then started verbally attacking her traveling press corps, including hurling a racial epithet at an African American journalist. Again, Palin not only refused to lower the temperature, she seemed to bask in that kind of heat.

This is not to say that McCain or other “old school” politicians were unwilling to go negative or attack their political opponents. They would and did. It’s that there were lines they wouldn’t cross—especially when it came to respecting the legitimacy of their opponents and of journalists. These are lines that politicians like Palin and President Trump won’t even acknowledge. And in a big, diverse democracy, where power is transferred peacefully, where compromise and consensus are required to get things done, those boundaries matter.

Another corrosive trend: Palin’s contempt for experts and elites. The then-governor didn’t study policy journals or even follow national news. She resisted when McCain aides tried to get her to focus on preparing for our interviews. But she thrilled her supporters with attacks on coastal liberals and support for “normal Joe Six Pack Americans.” Among some conservatives, a disdain for the liberal intelligentsia morphed into a disdain for the highly educated or for facts that contradicted their worldview. That has led to the current environment, where no matter what evidence the experts have brought to bear—against Brexit, against Trump’s trade and tax policies—it doesn’t matter to many voters. These elites, and the arguments they make, are dismissed out of hand.

by Katie Couric and Brian Goldsmith, The Atlantic |  Read more:
Image: Jonathan Ernst
[ed. What she saw most clearly was opportunity. A dimwitted Alaska hillbilly (who briefly was my governor). Does anyone remember that "high profile" meeting she had with Donald Trump ahead of the 2012 elections? She must be really burned she didn't get any reward once he was elected.]

Saturday, October 6, 2018

NFL Quarterbacks Are Breaking Records: What's Behind the Passing Explosion?

The NFL is in the middle of an offensive explosion, and it is the jump in passing production that has been most jarring. Teams have thrown more passing touchdowns in the first four weeks – 228 – than ever before, shattering the previous record of 205, which was set in 2003. The average yards per attempt has also skyrocketed. In 1977, teams averaged 6.5 yards per pass attempt (YPA). In 2001, it was 6.8. That figure has jumped to 7.5 in 2018.

Below we’ll look at the reasons for the astonishing rise (the statistics below do not include Thursday night’s game between the Colts and Patriots).

Penalties are helping the passing game

This is the one fans instinctively point to. Rules have changed up and down the field: roughing the passer; hand fighting; pass interference calls; freedom for skill talent to go over the middle of the field without fear their head will be dislodged from their neck. It’s tough to be a defender these days.

The effect is undeniable. First, there’s the fact that more penalties lead to extended drives, which leads to more opportunities to catapult the ball all over the field. Further, offensive coaches know the rules are in their favor and that breeds bravery. Coaches, by their nature, often coach to minimise risks. But this new breed of head honcho seems a little different. They go for it on fourth down. They’re unafraid to drop their quarterback back 50 times. Why? All the advantages are in their favor.

Throwing the ball deep in the NFL is a gigantic market inefficiency, particularly because pass interference is a spot foul. The reward far outweighs the risk: any semblance of contact and the call is going to the offense. Teams still are not going far enough with deep shots.

Think of this as similar to when the NBA discovered the efficiency of the three-point shot. For years, slow, methodical, pound-the-ball-inside offenses dominated basketball. And it was nonsense! There was a more valuable shot a couple of feet away. Sure, you’d miss more, but the ones you did make more than made up for that fact.

And the more draconian interpretation of the roughing-the-passer rule has changed things too. Why would quarterbacks be intimidated about standing in the pocket in this era? They have every protection imaginable. Defenders have to play a tick slower, wondering whether they’re hitting at the right angle or whether they can keep their body weight from tumbling on the poor, helpless quarterback (sarcasm intended). It’s a major psychological edge.

Teams are passing more often

It seems the league as a whole has cottoned on to the idea that passing is more efficient than running. Teams are slinging the ball on first down – traditionally a run down – more than ever before.

That’s partly because fossilized coaches have made way for a slew of wunderkinds, and a fresh batch of offensive coordinators has pushed the quarterback-driven league into overdrive, down-and-distance be damned.

It seems that every down is now a passing down. Kirk Cousins is on pace to attempt 756 passes, despite having an exciting, explosive running back accompanying him in the backfield, and the very worst offensive line through four weeks in league history.

Meanwhile, teams such as Miami and Atlanta use the quick passing game as an extension of the run game. They rely on yards after the catch. They use similar run-game principles, get the ball to the perimeter and force the defense to drive downhill and pursue. Only the approach is different to slamming the ball into one guy’s belly in the backfield. Any eligible player could get the ball, not one. Screen passes continue to steadily rise year on year.

There’s an intangible effect to that, too: quarterbacks juice their efficiency numbers, completing more balls and lowering their interception and sack rates. That breeds confidence.

Quarterbacks are getting better

At this point, you may be wondering whether the league-wide trends are being pumped up by a couple of pass-happy teams. That’s not the case. True, the Chiefs and Rams have been at the forefront of the bombs away movement, but this evolution is taking place everywhere.

If you erase the Rams and Chiefs passing offense, the NFL league average quarterback rating drops from 94.5 to 90.5, according to Mike Tanier of Football Outsiders. That’s still four points higher than the league average quarterback a year ago. (...)

Offensive and defensive schemes are evolving

The NFL can only use the talent that college provides. That’s always been true: the professionals like to think of themselves as the wise ones. They send down football decrees from on high and the rest of the football universe takes note.

The reverse is actually true. High school coaches innovate. College guys steal and evolve those ideas. And the NFL folks do the same, pinching players and ideas from college, before twisting and contorting them to fit their own needs. Evolution in high school takes weeks. It’s a genius born of desperation. Evolution in the NFL, meanwhile, takes decades.

We’re finally seeing the fruits of the changes at the high school and college level paying dividends in the NFL – if you like high-scoring games, that is.

College style offenses – which feature multi-receiver sets, an emphasis on spacing the field, switch releases to attack man-coverage (receivers crisscrossing at the line of scrimmage), spread formations, and all manner of pre-snap deception – make the game easier for quarterbacks. There’s no other way to say it. It’s how schools routinely chuck out 4,000-yard passers, regardless of the individual player’s talent.

by Oliver Connolly, The Guardian | Read more:
Image: Mark J Terrill/AP
[ed. Football is dying. See also: Why the Seahawks' Earl Thomas flipped off his own team after fracturing his leg.]

Harry Nilsson

Goat Yoga


Goat Yoga Is 'Preposterous,' Says Goat Yoga Teacher. It's Also ... Terrific!


[ed. A friend told me about goat yoga last week and I thought he was kidding (pun intended), but apparently it's a real thing. And popular (here's a video). How do they get those little goats to stand on people?]
Image via:

Banksy Painting Self-Destructs

A New Kind of Economy

Andrew Yang is a 43-year-old American entrepreneur who is seeking the Democratic Party’s nomination for president in 2020. His campaign focuses on solving the problem of job losses to automation—an issue many politicians seem happy to ignore. Starting right now, Yang wants to create a whole new kind of economy from the ground up, in which automation is transformed from a threat into the foundation for widespread human flourishing.

Briefly, his policy proposals include implementing a form of Universal Basic Income (also known as UBI, or what he calls the “Freedom Dividend”), universal healthcare, a “digital social currency,” and a redefinition of GDP that more accurately reflects the health of the nation. If this sounds like socialism then, according to Yang, your thinking about the economy might be antiquated. He contends that the capitalism/socialism spectrum is no longer relevant or useful if we take an honest look at the modern world.

The following is a transcription of my phone conversation with Andrew Yang, lightly edited for length and clarity.
* * *
Peter Clarke: Let’s say Donald Trump wins again in 2020 and the government continues on its current path of ignoring automation. What can we expect to happen in the near future?

Andrew Yang: You would expect the current trends that we’re seeing to accelerate. Many of the trends I’m most concerned about will accelerate with either a Democrat or a Republican in the White House, because we’re talking about how technology is going to displace millions of retail workers, call center workers, fast food workers, and truck drivers. And there’s no dramatic halting of that trend that would occur if a different political party were in office.

Now, if I were president, my goal would be to accelerate meaningful countermeasures and solutions. That does not mean putting a stop to Artificial Intelligence (AI) and autonomous vehicles, but that we need to dramatically reshape the way that both value and work are experienced in our society. And that’s a generational challenge. It’s not going to happen overnight.

What I’m most concerned about is the trends we’ve seen of the automation of four million manufacturing jobs in the U.S. between 2000 and 2015. When that gets applied to retail workers and truck drivers and fast food workers, which are some of the most common jobs in the U.S. economy, we’ll witness a continued disintegration of American society, which we can see in the numbers right now.

A lot of the automation is happening more quickly than almost anyone projected. I think I just read this week that Waymo is releasing its autonomous taxis in 2019. Do you think that this is going to sneak up on everyone in the next couple of years?

Well, I’m going to use call centers as an example. There are about 2.5 million call center workers in the United States right now making $14 an hour—typically high school graduates. So, if you’re reading this right now, how long is it going to be before Artificial Intelligence can outperform the average call center worker?

Let’s say that timeframe is two or three years. How many call center workers will that affect? How many will be out of a job shortly thereafter? And so that’s not speculative at all. That’s something that we know Google and other companies are working on right now.

If you take that one fact pattern and apply it over and over again in the economy, you’ll wind up with a massive displacement of workers. And it will sneak up on us quite quickly because that replacement of call center workers won’t affect five or ten thousand workers; it may well affect 500,000 or a million.

I know that it might take a while, even in the best case scenario, to implement Universal Basic Income or some of the other measures you’re proposing. So, is it already too late? Are we already going to see a massive dip in jobs because of automation and then huge swaths of the country are out of work?

It’s a little late in the day, truly. If you look at the labor force participation rate in the U.S., it peaked around 2000 and has declined ever since over the last 18 years—to a point where now it’s 62.9 percent, which is the same level as El Salvador and the Dominican Republic. And almost one out of five prime working-age men—between the ages of 21 and 30—have not worked in the last 12 months.

So, this is already with us. If you wait until the truckers start to riot and the taxi drivers start to riot—then it is late in the day. And that’s one of the reasons I’m running for president now. If I can get to the Oval Office and make this happen in 2021, then we can at least be able to prevent some of the disintegration that accompanies loss of work.

By the numbers, when men in particular get idle, we tend to degenerate into self-destructive and antisocial behaviors. You can see that in the surge of suicides among middle-aged Americans around the country that have brought down our country’s life expectancy over the last two years—and the fact that eight Americans are dying of opiates every hour. Again, if you look beneath the surface, all of these trends are already here with us. (...)

I am curious about how Democrats are addressing this—or not addressing this. Just this week Bernie Sanders was on Facebook saying that workers at Whole Foods, owned by Amazon, need to unionize so that they can keep their jobs and not be displaced by robots. To me this seems possibly shortsighted, but do you see any role for unionizing jobs to keep them around?

There are a few different approaches to this. And one of the things I disagree with Bernie Sanders on is that I believe he has a vision of the economy that functions like it did decades ago, where the path to prosperity is to get fair treatment by employers for workers. That relies upon a notion of the economy where, in order for a company to succeed and grow, it needs to hire more and more people and it needs to treat them well.

Unfortunately, we’re increasingly entering an age where companies can become very, very successful and profitable without hiring lots of people. And then when it does hire people, the most efficient way for them to do so is as temporary or gig workers or contract workers or outsourced workers. And so, trying to force companies to change their employment models, and then empowering workers through unions to do so, might be the right thing to do in some contexts; but in my opinion, it’s highly unlikely to solve the problem because we’ve been heading in this direction for decades, and in some ways Bernie Sanders’ solution is an attempt to turn back the clock.

As an example, let’s say that you were a fast food restaurant, and you’re paying your employees $10 an hour. Then, fast food workers quite rightfully say, hey, we can’t live on that; we need to be paid $15 an hour. So, one approach could be to say, the fast food workers should unionize and then bargain for $15 an hour. Another approach might be for the fast food companies to say—and they would do this if they had to pay $15 an hour in many instances—that maybe we can make our locations work with fewer workers.

At that point, you have to ask yourself whether you would purposefully want the fast food company to not automate its locations for the purpose of having more people in jobs that pay them between $10 and $15 an hour. And that becomes a very interesting question about what you think the purpose of jobs is.

If the purpose of jobs is to get a certain task done, then you would obviously want to automate that task because if the fast food company can serve the food with fewer workers, then that would be a good thing. If you think jobs are a way to maintain social order and make sure that someone has to be somewhere for certain shifts of the day—and that, without that, that person would struggle to find a degree of structure or purpose—then maybe you say, let’s make these fast food companies employ people just for the sake of it. That to me is a really fundamental question that we have to ask ourselves.

Outside politics, I do see a lot of intellectuals talking about how we need to redefine jobs. I know Steven Pinker recently said that we need to protect the interests of people, not the interests of jobs. Do you think it’s possible for the country at large to ever shift their perspective on jobs like this, where we don’t worry about loss of jobs, we worry about loss of human wellbeing?

I completely believe it is possible. And I think that the Freedom Dividend—the Universal Basic Income—that I’m proposing and will implement as president would enable that shift in a real way for millions of Americans quite quickly.

I will say that if you dig into the data, you find that men and women experience idleness differently. …Women who are idle, I believe, would very, very naturally adopt this project-based approach that you’re talking about. The data shows that women who are out of work get involved in the community and go back to school and do things that are quite productive and pro-social. Whereas, men who are out of work spend 75 percent of their time on the computer playing videogames and surfing porn—and then tend to devolve into substance abuse and self-destructive behaviors. Men who are out of work volunteer less than employed men, even though they have more time. And so, men and women seem to experience idleness differently.

When you talk about this project-based approach to work—for women it would be entirely natural and attainable, in my opinion, for many, many women. And for many men it would be as well. But for some men it might be less natural. …The providing of structure and purpose and fulfillment to millions of relatively unskilled men who are making transitions over the next number of years is one of the great projects of this age. (...)

You hold yourself out as a strong capitalist, which separates your campaign from Bernie Sanders, who embraces the term ‘democratic socialism.’ Do you have any strong feelings about the term socialism? Do you think it’s ever something that you’ll incorporate into the branding of your campaign, or are you shying away from that?

My honest feeling is that the entire capitalism/socialism framing is decades old and unproductive. So, what I’m suggesting is that we need to evolve to the next stage of capitalism, which prioritizes human wellbeing and development. If someone were to say to me, for example, hey, you’re for universal health care, and that’s an idea I associate with socialists…I would shrug and say, sure. [Laughs.] You know? I just think the labels are unfortunate. People have very strong associations with each one.

A friend of mine, Eric Weinstein, said a couple of things that I thought were very profound. First, he said we never knew that capitalism was going to be eaten by its son—technology. Second, we have to become both radically capitalist and radically socialist in different aspects of American life and the economy. And I think both of those things are true.

I just don’t think it’s constructive to try and pick a spot in this arbitrary capitalism/socialism spectrum. What I believe is we have to redefine our economy and re-write the rules so that it centers around us. Capitalism’s efficiency and GDP are going to have an increasingly nonexistent relationship to how most Americans are doing.

by Peter Clarke, Quillette |  Read more:
Image: Stephen McCarthy/Collision
[ed. See also: America has become a gerontocracy. We must change that.]

The Big Hack: How China Used a Tiny Chip to Infiltrate U.S. Companies


In 2015, Amazon.com Inc. began quietly evaluating a startup called Elemental Technologies, a potential acquisition to help with a major expansion of its streaming video service, known today as Amazon Prime Video. Based in Portland, Ore., Elemental made software for compressing massive video files and formatting them for different devices. Its technology had helped stream the Olympic Games online, communicate with the International Space Station, and funnel drone footage to the Central Intelligence Agency. Elemental’s national security contracts weren’t the main reason for the proposed acquisition, but they fit nicely with Amazon’s government businesses, such as the highly secure cloud that Amazon Web Services (AWS) was building for the CIA.

To help with due diligence, AWS, which was overseeing the prospective acquisition, hired a third-party company to scrutinize Elemental’s security, according to one person familiar with the process. The first pass uncovered troubling issues, prompting AWS to take a closer look at Elemental’s main product: the expensive servers that customers installed in their networks to handle the video compression. These servers were assembled for Elemental by Super Micro Computer Inc., a San Jose-based company (commonly known as Supermicro) that’s also one of the world’s biggest suppliers of server motherboards, the fiberglass-mounted clusters of chips and capacitors that act as the neurons of data centers large and small. In late spring of 2015, Elemental’s staff boxed up several servers and sent them to Ontario, Canada, for the third-party security company to test, the person says.

Nested on the servers’ motherboards, the testers found a tiny microchip, not much bigger than a grain of rice, that wasn’t part of the boards’ original design. Amazon reported the discovery to U.S. authorities, sending a shudder through the intelligence community. Elemental’s servers could be found in Department of Defense data centers, the CIA’s drone operations, and the onboard networks of Navy warships. And Elemental was just one of hundreds of Supermicro customers.

During the ensuing top-secret probe, which remains open more than three years later, investigators determined that the chips allowed the attackers to create a stealth doorway into any network that included the altered machines. Multiple people familiar with the matter say investigators found that the chips had been inserted at factories run by manufacturing subcontractors in China.

This attack was something graver than the software-based incidents the world has grown accustomed to seeing. Hardware hacks are more difficult to pull off and potentially more devastating, promising the kind of long-term, stealth access that spy agencies are willing to invest millions of dollars and many years to get.

There are two ways for spies to alter the guts of computer equipment. One, known as interdiction, consists of manipulating devices as they’re in transit from manufacturer to customer. This approach is favored by U.S. spy agencies, according to documents leaked by former National Security Agency contractor Edward Snowden. The other method involves seeding changes from the very beginning.

One country in particular has an advantage executing this kind of attack: China, which by some estimates makes 75 percent of the world’s mobile phones and 90 percent of its PCs. Still, to actually accomplish a seeding attack would mean developing a deep understanding of a product’s design, manipulating components at the factory, and ensuring that the doctored devices made it through the global logistics chain to the desired location—a feat akin to throwing a stick in the Yangtze River upstream from Shanghai and ensuring that it washes ashore in Seattle. “Having a well-done, nation-state-level hardware implant surface would be like witnessing a unicorn jumping over a rainbow,” says Joe Grand, a hardware hacker and the founder of Grand Idea Studio Inc. “Hardware is just so far off the radar, it’s almost treated like black magic.”

But that’s just what U.S. investigators found: The chips had been inserted during the manufacturing process, two officials say, by operatives from a unit of the People’s Liberation Army. In Supermicro, China’s spies appear to have found a perfect conduit for what U.S. officials now describe as the most significant supply chain attack known to have been carried out against American companies.

by Jordan Robertson and Michael Riley, Bloomberg | Read more:
Image: Scott Gelber

Sunday, September 30, 2018

Out of Office Reply


Bill Watterson
via:
[ed. Taking a short vacation and will be back soon. Enjoy the archives. Update: Returning tomorrow (Oct. 6).]

Saturday, September 29, 2018

Smart Reply

Last week I got an email from my boss about a recent piece I’d written on Donald Trump’s penis (just out here doing the family name proud). Surprisingly, it was a kind email as opposed to a notification of the termination of my employment for besmirching the paper of CP Scott with wisecracks about Trump’s junk, but, kind or otherwise, I’ve never known how to respond to messages from my editors.

The informality of the email form clashes with my natural instinct, which is – as New Yorker writer Anthony Lane once wrote of his former editor Tina Brown – to stand to attention every time they call me on the phone. Is email the internet equivalent of going out for drinks with someone after work? Are you expected to be casual with each other in a way you aren’t in the office? Or should I express myself in a way that reflects my true feelings? “Dear Boss, I humbly thank you. Really sorry about all the dick jokes. Respectfully yours, Hadley K Freeman.” After more than 20 years of using email, I have not figured this out.

Unexpectedly, at the bottom of this email, Google itself stepped in to help me. Beneath my boss’s message were three suggested responses: “Love it!” “Haha that’s awesome!” “That’s a good one!” Turns out “respectfully yours” is just not very Silicon Valley 2018.

Once I’d recovered from the hernia caused by laughing for 72 hours at Google’s noble effort to make me sound like a fembot (“Haha, that’s a good one, Hadley!”), I investigated what was going on here. It turned out that my Gmail had updated itself, which is kind of creepy in itself, though not nearly as creepy as its suggested responses, a feature Google has dubbed “Smart Reply” and what I dub “Hell”. Smart Reply is conclusive proof that Google does, as we already kinda knew, read our emails; now it has decided it can answer them better than we do. No need to read any sci-fi books about the dystopian future, kids, because we’re already here. Love it!

I could talk about how furious I am about this bizarrely open disregard of privacy (though in today’s world, the only thing that marks one out as more of an oldster than starting an email with “Dear” are concerns about privacy) because, sure, of course I am. But I have to be honest: I enjoy the banality of these automated answers, so much so that I’ve started reading them before the actual emails. When my mother wrote to ask if we were meeting up over the weekend, the suggested responses were “Let me get back to you on that!” “Amazing!” and “No way!” For the record, Google, if I ever replied “Let me get back to you on that!” to my mother, she would call the police to say my body had been possessed by an alien.

These responses confirm something anyone who’s ever been on social media already knows: online, there is no middle ground. Everything is either “Amazing!” or “No way!” Meanwhile the exclamation mark continues its deadening march to become as ubiquitous as the “x” for kiss: a once almost ironic stylistic extra that was strictly reserved for close friends is now the downright, earnest norm.

I think what tickles me most about these suggested replies is the way they lay bare some of the most irksome elements of our age. Most of us have become inured to the point of obliviousness to those jarring, algorithm-driven adverts, all faux chumminess mixed with creepy surveillance, topped off with spectacularly unhelpful help: “Hey! We noticed you once bought a red coat! Here are some other red coats! To add to your red coat collection! We’re just being helpful!” (...)

And there is something of this soulless mentality in Gmail’s Smart Responses. On the one hand, email is supposed to make it easier for all of us to keep in touch. On the other, we are now being urged to outsource that correspondence to an exclamation-happy bot that makes us sound more robot than human.

by Hadley Freeman, The Guardian |  Read more:
Image: The Project Twins/Synergy

How Chefs Have Reinvented the Dishes They Envy Most
Image (and Scotch egg recipe): via (Bon Appétit)