Wednesday, April 14, 2021

Afghanistan: An End to America's "Forever War"

Joe Biden has decided that 20 years is enough for America’s longest war, and has ordered the remaining troops out no matter what happens between now and September.

Biden’s withdrawal is one area of continuity with his predecessor, although unlike Donald Trump, this administration consulted the Afghans, US allies and its own agencies before announcing the decision. But both presidents were responding to a national weariness of “forever wars”.

To the surprise of no one, the Republican party that acquiesced in Trump’s order to get the troops out by May is now launching attacks on Biden’s “reckless” decision. The political attacks will mount if, as many expect, the current peace initiative fails and the Taliban steps up its offensive.

In Afghanistan, any US president is damned if you do and damned if you don’t. Biden has plainly decided that, in that case, “don’t” is the better option.

In the Obama administration, Biden was a consistent voice of scepticism over the utility of military force in foreign policy, sometimes in opposition to advocates of humanitarian intervention.

He bluntly told a television interviewer on the campaign trail that he would feel “zero responsibility” if the status of Afghan women and other human rights suffered as a consequence of a US withdrawal.

“Are you telling me that we should go into China, go to war with China because of what they’re doing to the Uyghurs?” he asked his CBS interviewer.

Safeguarding Afghan women and civil society has never been an official aim of the vestigial US military presence, but in the absence of a clearly defined goal, it became part of the de facto rationale.

“There are things that American officials have said over time to encourage that kind of thinking,” said Laurel Miller, who served as US special representative for Afghanistan and Pakistan, and now runs the Asia programme of the International Crisis Group.

“I’ll admit to – when I was in government – not feeling comfortable with some of those statements of enduring commitment, because I didn’t think it was believable.”

In making this decision, Biden has made clear he is setting aside Colin Powell’s famous “Pottery Barn rule”: if you break it, you own it. The quote comes from 2002 when the then secretary of state cited the fictional rule (which is not the policy of that furniture store) to warn George W Bush of the implications of invading Iraq. In Afghanistan, the US has part-owned the store for two decades now, and in reality, people and their livelihoods are still getting smashed.

by Julian Borger, The Guardian |  Read more:
Image: Kim Jae-Hwan/AFP/Getty Images
[ed. Finally. The problem being there was never a Plan B to start with. Just making stuff up as we went along (a textbook example of mission creep). See also: What Did the U.S. Get for $2 Trillion in Afghanistan? (NYT)]

The Daily Grind


Quite how long it takes a woman to grind for a family, apart from the time husking and shucking the maize, collecting the cooking water, and shaping and cooking the tortillas, depends on her skill and strength, the age and number of family members, the type of masa, and the quality of the metate. My estimate is that it takes about five hours a day to make enough masa for a family of five. This may seem incredible but it is in line with other estimates for contemporary Mexico and Guatemala collected by Michael Searcy, with Arnold Bauer’s estimate for Mexico, and experimental estimates for Europe collected in David Peacock’s The Stone of Life (2013), p. 127. Since five hours is about as much as anyone can grind, the labor of one in five adults has to be devoted to making the staple bread.
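The “one in five” figure follows directly from the article’s own numbers; a minimal back-of-envelope sketch (the variable names, and the framing of the limit as five hours per grinder per day, are assumptions drawn from the text, not the author’s own calculation):

```python
# Back-of-envelope check of the grinding-labor estimate.
# Figures taken from the text: ~5 hours of grinding feeds a family of
# five, and ~5 hours/day is about as much as one person can grind.
hours_to_feed_family = 5.0      # total daily grinding for a family of five
max_hours_per_grinder = 5.0     # physical limit per person per day
family_size = 5

grinders_needed = hours_to_feed_family / max_hours_per_grinder
share_of_people = grinders_needed / family_size

print(grinders_needed)   # 1.0 -> one full-time grinder per household
print(share_of_people)   # 0.2 -> the labor of one in five
```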

The Daily Grind (Works in Progress)
Image: Magnus Ingvar Agustsson

How Trader Joe’s $2 "Two-Buck Chuck" Became a Best-Seller

Walk into almost any Trader Joe’s store and you’ll spot a behemoth display of Charles Shaw wine — or, as it’s more affectionately known, “Two Buck Chuck.”

Priced at a mere $1.99 to $3.79 per bottle, this magical ether is cheaper than most bottled water. It’s been knighted as the “darling of the discount wine world” by critics, and boasts a cult following among price-minded consumers.

For Trader Joe’s, the wine is also a gold mine.

The grocery chain has sold 1B+ bottles of Two Buck Chuck since debuting the beverage in 2002. Today, some locations sell as many as 6k bottles/day — or ~16% of the average store’s daily sales.

How is a supposedly decent wine sold at such a low price point? Where does it come from? And how did it rise to prominence?

This is the tale of one wine brand, two vintners, and the unlikely democratization of a historically snobby industry. (...)

The box wine baron

Fred Franzia did not share Shaw’s air d’élégance.

He was unrefined and heavyset, with a body shape the New Yorker likened to a “gourmet marshmallow.” Reclusive and gruff, he shied away from public appearances. He referred to winemakers as “bozos” and didn’t care for France.

Nonetheless, Franzia came from a long lineage of winemakers: His great-grandfather, Giuseppe, had immigrated to California’s Central Valley in 1893 and set up Franzia Brothers Winery (later sold to Coca-Cola); his uncle, Ernest Gallo, had built the largest wine exporter in California.

In 1973, Franzia launched his own wine company, Bronco Wine Co.

In a rickety wood-paneled trailer held together with duct tape, he set out to produce extremely cheap, high-quality “super-value” wines — wines that rejected the pretentiousness of Napa Valley.

Initially, Bronco operated as a wholesaler, buying bulk wine and selling it to larger wineries at a profit.

But soon, Franzia saw an opportunity to produce his own cheap wines — wines, as he later put it, that “yuppies would feel comfortable drinking.”

Through a legal loophole, he could say his wines were “Cellared and Bottled in Napa” if the brand was founded prior to 1986. So, he developed a strategy of buying out distressed wineries with distinguished-sounding names — Napa Ridge, Napa Creek, Domaine Napa — and using them to sell his stock of less-desirable Central Valley wines.

On a summer day in 1995, a few years after Charles F. Shaw Winery went bust, Franzia purchased the winery’s brand, label, and name for a mere $27k.

“We buy wineries from guys from Stanford who go bankrupt,” he later boasted. “Some real dumb-asses from there.”

Unbeknownst to the real Charles Shaw, Franzia was about to transform the once-fancy wine brand into an impossibly cheap everyman’s juice.

And in the process, he’d change the wine industry forever.

by Zachary Crockett, The Hustle |  Read more:
Image: Stephen Osman/Los Angeles Times via Getty Images

Tuesday, April 13, 2021

Charlie Musselwhite


[ed. See also: Ben Harper, Charlie Musselwhite - I'm In I'm Out And I'm Gone.]

What Is Kafkaesque? - The 'Philosophy' of Franz Kafka

Safety is Fatal

Humans need closeness and belonging but any society that closes its gates is doomed to atrophy. How do we stay open?

Many of us will recall Petri dishes from our first biology class – those shallow glass vessels containing a nutrient gel into which a microbe sample is injected. In this sea of nutrients, the cells grow and multiply, allowing the colony to flourish, its cells dividing again and again. But just as interesting is how these cells die. Cell death in a colony occurs in two ways, essentially. One is through an active process of programmed elimination; in this so-called ‘apoptotic’ death, cells die across the colony, ‘sacrificing’ themselves in an apparent attempt to keep the colony going. Though the mechanisms underlying apoptotic death are not well understood, it’s clear that some cells benefit from the local nutrient deposits of dying cells in their midst, while others seek nutrition at the colony’s edges. The other kind of colony cell death is the result of nutrient depletion – a death induced by the impact of decreased resources on the structure of the waning colony.

Both kinds of cell death have social parallels in the human world, but the second type is less often studied, because any colony’s focus is on sustainable development; and because a colony is disarmed in a crisis by suddenly having to focus on hoarding resources. At such times, the cells in a colony huddle together at the centre to preserve energy (they even develop protective spores to conserve heat). While individual cells at the centre slow down, become less mobile and eventually die – not from any outside threat, but from their own dynamic decline – life at the edges of such colonies remains, by contrast, dynamic. Are such peripheral cells seeking nourishment, or perhaps, in desperation, an alternative means to live?

But how far can we really push this metaphor: are human societies the same? As they age under confinement, do they become less resilient? Do they slow down as resources dwindle, and develop their own kinds of protective ‘spores’? And do these patterns of dying occur because we’ve built our social networks – like cells growing together with sufficient nutrients – on the naive notion that resources are guaranteed and infinite? Finally, do human colonies on the wane also become increasingly less capable of differentiation? We know that, when human societies feel threatened, they protect themselves: they zero in on short-term gains, even at the cost of their long-term futures. And they scale up their ‘inclusion criteria’. They value sameness over difference; stasis over change; and they privilege selfish advantage over civic sacrifice.

Viewed this way, the comparison seems compelling. In crisis, the colony introverts; collapsing inwards as inequalities escalate and there’s not enough to go around. In a crisis, as we’ve seen during the COVID-19 pandemic, people define ‘culture’ more aggressively, looking for alliances in the very places where they can invest their threatened social trust; for the centre is threatened and perhaps ‘cannot hold’.

Human cultures, like cell cultures, are not steady states. They can have split purposes as their expanding and contracting concepts of insiders and outsiders shift, depending on levels of trust, and on the relationship between available resources and how many people need them. Trust, in other words, is not only related to moral engagement, or the health of a moral economy. It’s also dependent on the dynamics of sharing, and the relationship of sharing practices to group size – this last being a subject that fascinates anthropologists.

In recent years, there’s been growing attention to what drives group size – and what the implications are for how we build alliances, how we see ourselves and others, and who ‘belongs’ and who doesn’t. Of course, with the advent of social media, our understanding of what a group is has fundamentally changed.

The British anthropologist Robin Dunbar popularised the question of group size in his book How Many Friends Does One Person Need? (2010). In that study, he took on the challenge of relating the question of group size to our understanding of social relationships. His interest was based on his early studies of group behaviour in animal primates, and his comparison of group sizes among tribal clans. Dunbar realised that, in groups of more than 150 people, clans tend to split. Averaging sizes of some 20 clan groups, he arrived at 153 members as their generalised limit.

However, as we all know, ‘sympathy groups’ (those built on meaningful relationships and emotional connections) are much smaller. Studies of grieving, for example, show that our number of deep relationships (as measured by extended grieving following the death of a sympathy group member) reaches its upward limit at around 15 people, though some put that number even lower, at 10, while others focus on close support groups that average around five people.

For Dunbar, 150 is the optimal size of a personal network (even if Facebook thinks we have more like 500 ‘friends’), while management specialists think that this number represents the higher limits of cooperation. In tribal contexts, where agrarian or hunting skills might be distributed across a small population, the limiting number is taken to indicate the point after which hierarchy and specialisation emerge. Indeed, military units, small egalitarian companies and innovative think-tanks seem to top out somewhere between 150 and 200 people, depending on the strength of shared conventional understandings.

Though it’s tempting to think that 150 represents both the limits of what our brains can accommodate in assuring common purpose, and the place where complexity emerges, the truth is different; for the actual size of a group successfully working together is, it turns out, less important than our being aware of what those around us are doing. In other words, 150 might be an artefact of social agreement and trust, rather than a biologically determined structural management goal, as Dunbar and so many others think. We know this because it’s the limit after which hierarchy develops in already well-ordered contexts. But we also know this because of the way that group size shrinks radically in the absence of social trust. When people aren’t confident about what proximate others are mutually engaged in, the relevant question quickly turns from numbers of people in a functioning network to numbers of potential relationships in a group. So, while 153 people might constitute a maximum ideal clan size, based on brain capacity, 153 relationships exist in a much smaller group – in fact, 153 relationships exist exactly among only 18 people.
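The jump from 153 relationships down to 18 people is just the handshake formula: a group of n people contains n(n−1)/2 distinct pairs. A quick sketch to verify (standard library only; the function name is mine):

```python
from math import comb

def pairwise_relationships(n: int) -> int:
    """Number of distinct two-person relationships in a group of n people."""
    return comb(n, 2)  # equivalent to n * (n - 1) // 2

print(pairwise_relationships(18))   # 153 -- the clan-size figure, recast as pairs
print(pairwise_relationships(150))  # 11175 -- why 150 people cannot all track one another
```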

Dunbar’s number should actually be 18, since, under stress, the quality of your relationships matters much more than the number of people in your network. The real question is not how many friends a person can have, but how many people with unknown ideas can be put together and manage themselves in creating a common purpose, bolstered by social rules or cultures of practice (such as the need to live or work together). Once considered this way, anyone can understand why certain small elite groups devoted to creative thinking are sized so similarly.

Take small North American colleges. Increasingly, they vie with big-name universities such as Harvard and Stanford not only because they’re considered safer environments by worried parents, but because their smaller size facilitates growing trust among strangers, making for better educational experiences. Their smaller size matters. Plus, it’s no accident that the best of these colleges on average have about 150 teaching staff (Dunbar’s number) and that (as any teacher will know) a seminar in which you expect everyone to talk tops out at around 18 people.

But what do we learn from these facts? Well, we can learn quite a bit. While charismatic speakers can wow a crowd, even the most gifted seminar leader will tell you that his or her ability to involve everyone starts to come undone as you approach 20 people. And if any of those people require special attention (or can’t tolerate ideological uncertainty) that number will quickly shrink.

In the end, therefore, what matters much more than group size is social integration and social trust. As for Facebook’s or Dunbar’s question of how many ‘friends’ we can manage, the real question ought to be: how healthy is the Petri dish? To determine this, we need to assess not how strong the dish’s bastions are (an indicator of what it fears) but its ability, as with the small North American college, to engage productively and creatively in extroverted risk. And that’s a question that some other cultures have embraced much better than even North American colleges.

by David Napier, Aeon |  Read more:
Image: Fiddlesticks Country Club, a gated community in Fort Myers, Florida. Photo by Michael Siluk/UIG/Getty

Novel HIV Vaccine Approach Shows Promise in “Landmark” First-in-Human Trial

A novel vaccine approach for the prevention of HIV has shown promise in Phase I trials, reported IAVI and Scripps Research. According to the organisations, the vaccine successfully stimulated the production of the rare immune cells needed to generate antibodies against HIV in 97 percent of participants.

The vaccine is being developed to act as an immune primer, to trigger the activation of naïve B cells via a process called germline-targeting, as the first stage in a multi-step vaccine regimen to elicit the production of many different types of broadly neutralizing antibodies (bnAbs). Stimulating the production of bnAbs has been pursued as a holy grail in HIV research for decades. It is hoped that these specialised blood proteins could attach to HIV surface proteins called spikes, which allow the virus to enter human cells, and disable them via a difficult-to-access region that does not vary much from strain to strain.

“We and others postulated many years ago that in order to induce bnAbs, you must start the process by triggering the right B cells – cells that have special properties giving them potential to develop into bnAb-secreting cells,” explained Dr William Schief, a professor and immunologist at Scripps Research and executive director of vaccine design at IAVI’s Neutralizing Antibody Center, whose laboratory developed the vaccine. “In this trial, the targeted cells were only about one in a million of all naïve B cells. To get the right antibody response, we first need to prime the right B cells. The data from this trial affirms the ability of the vaccine immunogen to do this.” (...)

One of the lead investigators on the trial, Dr Julie McElrath, senior vice president and director of Fred Hutch’s Vaccine and Infectious Disease Division, said the trial was “a landmark study in the HIV vaccine field,” adding that they had demonstrated “success in the first step of a pathway to induce broad neutralising antibodies against HIV-1.”

HIV affects more than 38 million people globally and is among the most difficult viruses to target with a vaccine, in large part because of its unusually fast mutation rate which allows it to constantly evolve and evade the immune system.

Dr Schief commented: “This study demonstrates proof of principle for a new vaccine concept for HIV, a concept that could be applied to other pathogens as well. With our many collaborators on the study team, we showed that vaccines can be designed to stimulate rare immune cells with specific properties and this targeted stimulation can be very efficient in humans. We believe this approach will be key to making an HIV vaccine and possibly important for making vaccines against other pathogens.”

The organisations said this study sets the stage for additional clinical trials that will seek to refine and extend the approach, with the long-term goal of creating a safe and effective HIV vaccine. As a next step, the collaborators are partnering with the biotechnology company Moderna to develop and test an mRNA-based vaccine that harnesses the approach to produce the same beneficial immune cells. According to the team, using mRNA technology could significantly accelerate the pace of HIV vaccine development, as it did with vaccines for COVID-19.

by Hannah Balfour, European Pharmaceutical Review | Read more:
Image: uncredited
[ed. The holy grail.]

Awful but Lawful

The courtroom is a stage. The rules of who says what and when are carefully determined, sometimes after prolonged legal contention. The truth that emerges is viewed as legitimate because it is the product of process. The prosecution and the defense present two competing versions of the truth, and the jury is set the task of selecting one. Punishment is allotted, or the defendant acquitted, based on which version is selected.

One assumption behind the courtroom theater is that both versions of the truth, the one presented by the prosecution and the one suggested by the defense, are equal, in that either one could be selected by the jury as the winning version of what happened. What is left unanswered in this system of procedural justice is whether both sides are equally worthy if one version of these truths depends on its connection to racial prejudices that a jury of ordinary people may have.

Derek Chauvin’s trial for the killing of George Floyd is an example of this. The lawyers for former Minneapolis police officer Chauvin are basing their defense on racist notions of Black men as angry and uncontrollable, and Black communities as inherently threatening and menacing. Beliefs that would be considered overtly racist in other contexts are thus drawn into the courtroom without any scrutiny of their foundations. In the opening argument, the defense counsel specifically notes that “Mr. Chauvin stands five foot nine, 140 pounds. George Floyd is six three and 223 pounds.” Not only that, but in these first two weeks of the trial, the defense has resorted to another stereotype—that of the drug-addicted Black man. In the defense’s opening statement, Chauvin’s lawyer described the initial call to police. Floyd, it was reported, “was under the influence of something. . . He’s not acting right. He’s six to six-and-a-half feet tall.”

The racist seed of the uncontrollable black man, planted early, has been nourished ever since. Defense counsel Eric Nelson insisted that video preceding Floyd’s death shows “the police squad car rocking back and forth” to highlight just how strong and wild Floyd was. So significant was the danger posed by the large and out-of-control Floyd, the defense wants the jury (and everyone else watching) to believe, that it justified the use of “maximum restraint technique,” what used to be called the “hobble or the hog tie.” It is in the justified and appropriate process of hog tying George Floyd that the defendant Derek Chauvin used “[one] knee to pin Mr. Floyd’s left shoulder blade and back to the ground and his right knee to pin Mr. Floyd’s left arm to the ground.”

As these details are tossed about in the courtroom, the damage to the larger discourse about race is already done. The Black man has been framed as an animal, a “hog” that must be tied up so that it won’t thrash and flail at having been apprehended. In the choreographed environment of the courtroom, where the focus is upon discerning a truth in the specific case at hand, there is no room to draw connections to America’s larger racist history. (...)

Having stated that a Black man can be hog tied and held to the ground, the defense also set out to prove that the Black crowd at Cup Foods was belligerent, hostile, and menacing. In his opening statement, Nelson told the jury that “as the crowd grew in size, similarly so too did their anger.” In the defense narrative, there is no effort to individuate the members of the crowd, to note that the “crowd” included a seventeen-year-old and her nine-year-old cousin out to get a snack at Cup Foods, an off-duty EMT who begged police to let her help Floyd, an MMA fighter who had trained alongside Minneapolis police officers, and an old man who could not help but cry when he took the stand. The defense deployment of the “angry black man” became literal in the cross-examination of Donald Wynn Williams II, the mixed martial artist. “It’s fair to say you got angrier and angrier?” Nelson needled Williams, until he replied, “I grew professional and professional. I stayed in my body. You can’t paint me out to be angry.” (...)

There is no doubt that the murder of George Floyd has provoked a racial reckoning in the United States. In the anti-racist work and conversations that have taken place since, efforts have been made to expose surreptitious and systemic racism. But if the Derek Chauvin trial is any illustration, this work has not yet reached the courtroom. Regardless of the outcome of the trial itself and whether there is any justice for George Floyd, the direct appeal to racist prejudice, featuring the uncontrollable and angry Black man, the rough and lawless world of the inner city, the mostly Black crowd as menacing, have all been normalized by the defense as plausible explanations of what happened. Having these ideas form such an integral and overt centerpiece of the trial in a courtroom, where they are not critically examined, suggests to those watching that such a narrative is permissible and possibly true.

by Rafia Zakaria, The Baffler | Read more:
Image: Defense attorney Eric Nelson delivers his opening statement at the Derek Chauvin trial. | AJC

Monday, April 12, 2021

Florida GOP Introduces Ballotless Voting In Disenfranchised Communities


TALLAHASSEE, FL—In an effort to streamline the state’s electoral process, Florida Republicans introduced a new bill to the legislature Thursday that would establish ballotless voting in disenfranchised communities. “We’ve eliminated the complex and insecure process of casting a ballot so that voters from underserved communities don’t have to worry about going to the polls or mailing anything in,” said co-sponsor Rep. Chris Sprowls of the popular proposal, which had already garnered unanimous support among Republicans in the House and Senate. “Come voting day, voters will be able to walk right up to the doors of their polling place, then turn around. No lines, no worry. We’ve listened to your concerns, and are confident that ballotless voting will address them.” At press time, Sprowls added that the bill would also help fight voter fraud by eliminating the likelihood of votes being erroneously counted.

by The Onion |  Read more:
Image: uncredited
[ed. See also: Georgia Lawmakers Warn Stricter Gun Regulation Could Cause Mass Shooters To Move To Other States; Man Opposes Taxing Rich Because He Knows One Day He Could Find $20 Bill On Ground; and, Report: Today Not One You Will Remember (Onion). And, more seriously: Republicans Are Making 4 Key Mistakes (The Atlantic): 

"Arizona Republicans propose to reduce the number of days for early voting. They want to purge voter rolls of people who missed the previous election. They want to cut off mail-in balloting five days before Election Day. And they want to require that affidavits of identity accompany any ballot that is mailed in.

Texas Republicans are pushing a bill to limit early voting, prohibit drive-through voting, limit the number of ballot drop-off locations, and restrict local officials’ ability to publicize voting by mail."]

Against Timarchus

When Aeschines, one of the ten Attic orators and member of the peace embassy dispatched to Philip of Macedon, was accused by Demosthenes and Timarchus of intriguing against Athens on behalf of the same, his defense before the Assembly was swift and straightforward: “Timarchus can’t accuse anyone of betraying Athens, because I heard he’s a fucking skank.”

It worked. Timarchus was stripped of citizenship and vanished from public life, while Aeschines went on to commission the Fourth Sacred War under Philip’s aegis, found a school of rhetoric, and eventually retire to the winemaking island of Samos, where he died well into his seventies. Being a dizzy little bitch who hates fun pays off sometimes.
[Full text of Against Timarchus here.]

Gentlemen and themtlemen! You know I never have a bad word to say about anyone. It’s probably the second-best-known thing about me, my quiet and peaceful approach to conflict. If I were to guess what the first-best-known thing about me is, I honestly couldn’t even begin to guess, because I just don’t think about human relationships in that way, you know? But this isn’t about me, which is such a relief, because I’m so uncomfortable when things are about me, so I’m really glad to be able to say that this has nothing to do with me and everything to do with Athens. It’s the city I’m here to talk about, not me. Honestly, if I could talk about the city without being here at all, like if there was some way I could talk without talking, or being myself, or being in any way perceived by all of you, I would do that. But as I already said, this isn’t about me, so even though it makes me really uncomfortable to address you all publicly like this, I don’t even care, my discomfort at being the center of attention is just not as important as the dignity and safety of Athens, which always comes first, at least for me, and I hope for you guys, haha!

God!! Athens! Athens, you know? It’s just like — Athens! Okay? Like, what does that even mean, but also, it kind of means everything, right? It says it all. Athens!! I just think, for me personally, that Athens is so important, especially for all of us as Athenians, which I truly believe that we are, every one of us in our own very special way, that it would be such a shame if Athens ever came to harm because someone among us was representing themselves as a friend to Athens when in actual fact, like in honest-to-God real life, in a very tangible way, they were not a friend to Athens, and were actually making her look bad to other people, Macedonians for example.

And I’m just going to come out and say it, which is that this so-called “friend,” this person who has actually really done a lot of damage to Athens, is, I’m sorry to say, but it’s Timarchus. I don’t even care that he’s trying to hurt me, because I’m just whatever, but insofar as I am a representative of Athens, a legitimate member of her assemblies and jury pools, a member of the greater Athenian body, that’s the issue here, not me, personally, Aeschines. You know me, you know I absolutely do not hold grudges or even care what happens to this bag of meat that I call my “body.” But I do care, like really care, about my friends, and I honestly do consider Athens a friend. I really do. And when someone hurts her? Okay, then it’s like, let’s go. So let’s go.

I know this is probably not the first time some of you have heard about Timarchus. If I were to guess, I’d say you probably have been hearing a lot of really troubling shit about him over the years, because I’ve been hearing it too, even though I never passed it along or said anything about him myself. But like, you guys, you guys, we live in a democracy, right? Like we live in a society, yes, but more importantly a democratic society, so we have laws and rules and so on, and we do ask of our citizens a certain commitment to excellence that not just anybody can abide by. And I don’t think we should have to apologize for having high standards. Do you? Okay, good, I’m really glad to hear that. Honestly, I’m really relieved to hear that, because I was worried it was going to be just me, but I’m so glad that we can all agree that if someone violates those laws, or doesn’t live up to our admittedly very high standards (but that’s why Athens is so great, you guys, and just to throw this in for detail, I don’t want to get too far off topic but I do think it’s important, I also think this is why it’s actually completely fair to consider Macedonia like fully Greek, like absolutely there’s a shared commitment to Athenian values, which is why I sometimes call Macedonia Athens II, just for short), but our very high standards, then they should just like….they should just go! Away, and have to live somewhere else, and I don’t even care where, like be well, okay, be safe and healthy, good fortune go with you, I absolutely wish you the best wherever your journey takes you, but you just cannot stay here, because everyone here is already pulling their weight and frankly doing more than their fair share to begin with.

You are all probably aware that prostitution — sorry, “sex work,” I mean I want to be as nice about this as I can, and it’s perfectly fine to do sex work, that’s a totally legitimate option if that’s all you want out of life, I’ve known some really great prostitutes who I would absolutely invite to a dinner party if the vibe was right, but it’s not like being a judge, or a general, I think we can all admit that — is not a compatible side gig for an Athenian citizen, right? We actually have a full, actually-written-down law about that. If you’re, like….I don’t know, a really friendly Thracian of uncertain parentage and you want to be like a fun courtesan, you should go for it and really with my blessing, but it’s just not appropriate for a freeborn man of Athens tasked with safeguarding our citizenry. Right? You have to pick one. You can be a sex worker all you want, God bless, but then you have to stick with that, and you can’t try to become one of the nine archons, or apply to the priesthood, or hold office.

So don’t you think it’s kind of fucked up that Timarchus had the gall to address this assembly as a citizen of Athens even though he absolutely fucked for cash when he was in medical school? Like don’t you think the fact that he lied to us all about it is also a problem? We probably could have made an exception for him if he just asked. But he didn’t ask. It’s honestly not even the sex work, for me personally that causes the problem, but that he lied about it, because it like begs the question — ahaha sorry, that was just a little joke for some of you rhetoricians — it like raises the question of what else has he lied about? Also I know a lot of you were really uncomfortable last week when he took his cloak off during his address, like we were all just in the gymnasium or something, like it was no big deal, and we shouldn’t have to feel uncomfortable when we’re just trying to assemble.

To be clear, I’m not trying to shame Timarchus by bringing all of this old shit up, even though a lot of it isn’t even that old. Partly because I’m honestly not even sure he can feel shame? Like I just don’t think he registers emotions on that scale, at all. So it’s not even worth it. But also I don’t want to make a big deal out of this. I just want us all to agree to abide by the rules we already agreed on!

by Daniel Lavery, Shatner Chatner |  Read more:
Image: Aeschines, copy of Herculaneum original in the National Museum, Naples, early to mid 1800s, marble via
[ed. If you're not familiar with Daniel's (formerly, Mallory's) charming work, see here and here. See also: Why We’re Freaking Out About Substack (NYT).]

Masters 2021 Champion: Hideki Matsuyama


Masters 2021: Hideki Matsuyama, quiet star, makes a loud statement for his nation and for himself (Golf Digest)

Saturday, April 10, 2021

The Universal Warrior


The oldest way of war was what Native North Americans called – evocatively – the ‘cutting off’ way of war (a phrase I am borrowing from W. Lee, “The Military Revolution of Native North America” in Empires and Indigines, ed. W. Lee (2011)), but which was common among non-state peoples everywhere in the world for the vast stretch of human history (and one may easily argue much of modern insurgency and terrorism is merely this same toolkit, updated with modern weapons). The goal of such warfare was not to subjugate a population but to drive them off, forcing them to vacate resource-rich land which could then be exploited by your group. To do this, you wanted to inflict maximum damage (casualties inflicted, animals rustled, goods stolen, people captured) at minimum risk, until the lopsided balance of pain you inflicted forced the enemy to simply move away from you to get out of your operational range.

The main tool of this form of warfare (detailed more extensively in A. Gat, War in Human Civilization (2006) and L. Keeley, War Before Civilization (1996)) was the raid. Rather than announcing your movements, a war party would attempt to advance into enemy territory in secret, hoping (in the best case) to catch an enemy village or camp unawares (typically by night) so that the population could be killed or captured (mostly killed; these are mostly non-specialized societies with limited ability to incorporate large numbers of subjugated captives) safely. Then you quickly get out of enemy territory before villages or camps allied to your target can retaliate. If you detected an incoming raid, you might rally up your allied villages or camps and ambush the ambusher in an equally lopsided engagement.

Only rarely in this did a battle result – typically when both the surprise of the raid and the surprise of the counter-raid ambush failed. At that point, with the chance for surprise utterly lost, both sides might line up and exchange missile fire (arrows, javelins) at fairly long range. Casualties in these battles were generally very low – instead the battle served both as a display of valor and a signal of resolve by both sides to continue the conflict. That isn’t to say these wars were bloodless – indeed the overall level of military mortality was much higher than in ‘pitched battle’ cultures, but the killing was done almost entirely in the ambush and the raid.

We may call this the first system of war. It is the oldest, but as noted above, never entirely goes away. We tend to call this style ‘asymmetric’ or ‘unconventional’ war, but it is the most conventional war – it was the first convention, after all. It is also sometimes denigrated as primitive, but should not be judged so quickly – first system armies have managed to frustrate far stronger opponents when terrain and politics were favorable.

What changed? Very briefly, agriculture, cities and the state. Agriculture created a stationary population that wouldn’t move and which could be dominated, subjugated and have its production extracted from it. Their wealth was clustered in towns which could be fortified with walls that would resist any quick raid, but control of that fortified town center (and its administrative apparatus of taxation) meant control of the countryside and its resources. Taking such a town meant a siege – delivering a large body of troops and keeping them there long enough to either breach the walls or starve out the town into surrender. This created a war where territorial control was defined by the taking of fixed points.

In such war, the goal was to deliver the siege. But delivering the siege meant a large army which might now be confronted in the field (for it was unlikely to move by stealth, being that it had to be large enough to take the town). And so to prevent the siege from being delivered, defenders might march out and meet the attackers in the field for that pitched battle. In certain periods, siegecraft or army size had so outpaced fortress design that everyone rather understood that after the outcome of the pitched battle, the siege would be a foregone conclusion – it is that unusual state of affairs which gives us the ‘decisive battle’ where a war might potentially be ended in a stroke (though they rarely were).

We may term this the second system of war. It is the system that most modern industrial and post-industrial cultures are focused on. Our cultural products are filled with such pitched battles, placed in every sort of era of our past or speculative future. It is how we imagine war. Except that it isn’t the sort of war we wage, is it?

Because in the early 1900s, the industrial revolution resulted in armies possessing both amounts of resources and levels of industrial firepower which precluded open pitched battles. All of those staples of our cultural fiction of battles, developed from the second system – surveying the enemy army drawn up in battle array, the tense wait, then the furious charge, coming to grips with the enemy in masses close up – none of that could survive modern machine guns and artillery.

What replaced it we may term the third system of war, though longer readers may know it by Biddle’s term, the Modern System (more here). Armies in this modern system still aim to control territory, as with second-system war, but they no longer square off in open fields. Rather, relying on cover and concealment to mitigate the overwhelming firepower of a modern battlefield covered with machine guns, artillery and airpower, they aim to disorient and overwhelm the decision-making capabilities of their enemy with lightning mechanized offensives.

What happens when two current-day modern systems meet? We don’t really know, though there is a lot of speculation. One of the things which made the conflict between Azerbaijan and Armenia so closely watched last year (in 2020, for those reading this later) was that it provided a chance to see two sides both with (sometimes incomplete) access to the full modern kit of war – not only tanks, jets and artillery, but cyber warfare, drones and so on. The results remain to be much discussed and analyzed, but it may well be that a fourth system of war is in the offing, defined by the way that drone-based airpower combined with electronic surveillance and cyber-warfare redefined the battle-space and allowed Azerbaijan in particular to project firepower deep into areas where Armenian forces considered themselves safe.

But I shouldn’t get too off track. The point of all of this is that these systems of war are not merely different, they are so radically different that armies created in one system often fundamentally fail to understand the others (thus the tendency for second and third system armies to treat first system war as some strange new innovation in war, when it is in fact the oldest system by far). As we’re going to see, the aims, experiences and outcomes of these systems are often very different. They demand and inculcate different values and condition societies differently as well.

Collections:

via: A Collection of Unmitigated Pedantry
Image: Via Wikipedia, a Mesolithic painting of a battle from Morella la Vella, Spain (c. 10,000BP), showing what looks to be an ambush, a normal occurrence in first system war.

Monty Python

Amazon Union Vote Fails

Earlier today the National Labor Relations Board announced the results of the vote on whether workers at the Amazon warehouse in Bessemer, Ala., would join a union. The vote was 738 in favor to 1,798 against. It’s bad news, but it doesn’t mean workers in future Amazon campaigns won’t or can’t win. They can. The results were not surprising, however, for reasons that have more to do with the approach used in the campaign itself than any other factor.

The stories of horrific working conditions at Amazon are well-known. Long before the campaign at Bessemer, anyone paying even scant attention would be aware that workers toil at such a grueling pace that they resort to urinating in bottles so as not to get disciplined for taking too much time to use the facilities, which the company calls “time off task.” Christian Smalls was fired a year ago for speaking publicly about people not getting personal protective equipment in his Amazon facility, in bright-blue state New York. Jennifer Bates, the Amazon employee from the Bessemer warehouse, delivered testimony to Congress that would make your stomach turn. Workers at Amazon desperately need to unionize, in Alabama, Germany—and any other place where the high-tech, futuristic employer with medieval attitudes about employees sets up a job site of any kind. With conditions so bad, what explains the defeat in Bessemer?

Three factors weigh heavily in any unionization election: the outrageously vicious behavior of employers—some of it illegal, most fully legal—including harassing and intimidating workers, and telling bold lies (which, outside of countries with openly repressive governments, is unique to the United States); the strategies and tactics used in the campaign by the organizers; and the broader social-political context in which the union election is being held.

Blowout in Bessemer: A Postmortem on the Amazon Campaign (The Nation)

[ed. What it was all about:]

Amazon is the second-largest private employer in the US, with 800,000 employees, and it has fiercely resisted attempts at worker organizing. The only other unionization effort to make it to a vote was in 2014, with a small group of repair technicians in Delaware, and it failed after an aggressive anti-union campaign. More recently, the NLRB found that Amazon threatened and fired workers who protested the company’s handling of COVID-19. While the Bessemer effort would only organize a single warehouse, it would show that it can be done. Already, employees at other Amazon facilities have expressed interest in following in BHM1’s footsteps.

“There’s a basic principle of organizing work that success breeds success, and that organizing often happens in self-reinforcing cycles of victory,” said Benjamin Sachs, a professor at Harvard Law School. “Organizing requires workers taking a risk, and the workers are more likely to take a risk when they see that the risk is going to pay off.”

Such a chain reaction could do more than change the conditions that hundreds of thousands of Amazon employees work under. Because of its size and the sprawling geographic scope of its logistics network, the quality and pay of Amazon’s jobs have a powerful effect on the quality and pay of other jobs. Amazon itself has been touting this effect in its ads lobbying for a $15 minimum wage, and indeed, a recent study found that when Amazon raised its starting wage to $15 an hour in 2018, wages at nearby employers also rose.

But when Amazon jobs are compared to similar types of work, they come off much worse. Logistics jobs were historically a path to the middle class, and unionized warehouses typically pay double what Amazon does. When Amazon opens a warehouse, a Bloomberg analysis found, wages at other nearby warehouses often drop. Amazon’s methods for worker tracking and enforcing productivity — aspects of the job that prompted BHM1 to unionize — have also spread across the logistics industry and other sectors as companies attempt to compete with Amazon.

Sachs calls Amazon a bellwether employer, for its outsize role in shaping the labor market and defining the future of work, similar to the role the auto industry played in the early 20th century. “The unionization of that industry, which had a lot to do with labor law reform, was a defining moment for the labor market for decades,” he said.

Why the Amazon union vote is bigger than Amazon (The Verge)
Image: Patrick T. Fallon/AFP via Getty

Vlogging and Fishing in the Cascade Mountains


There it was again – an all-too-familiar splash in the shallow, rocky portion of the lake, maybe 200 feet along the shore from where I was standing. I had heard it twice already, and seen nothing but circular ripples on the glasslike surface of the water. But this time, I was watching. Just as I’d identified the torpedo-shaped, thrashing object launching from the surface of the water as a massive trout, a second one leapt into the air and snatched at an unfortunate insect.

I was backpacking and fishing deep in the Cascade mountains of Washington state, in search of alpine trout to catch and eat, and to film another adventure for my YouTube channel, NW Fishing Secrets. I’d started my fishing show as a hobby in April 2019, filming instructional videos on how to catch local fish, but as the audience grew rapidly, I realised viewers wanted more than that.

They wanted to feel what it was like to actually be in the wilds. NWFS had become more than a tutorial series – it was now a fishing adventure show, bringing the outdoors into people’s living rooms, allowing viewers who might not be able to visit these remote places to experience them as if they were there with me.

The evening before, I had driven 100 miles from my home near Seattle into the mountains in my self-converted, 1998 yellow campervan. The remote trailhead was another 12 miles up a gravel road crossed by several small creeks. I wanted to get as far away from urban life as I could, to be in a place where it was unlikely I’d see another soul. I wanted to be alone in the mountains. That night I filmed time-lapses of the bright starry sky while getting my camera gear ready for a four-day mountain backpacking and fishing adventure.

The next morning, after a good night’s sleep on the small bed in the back of my van, and a cup of freshly brewed coffee made in my little side-door mounted kitchen, I set off on the trail. I was travelling as light as possible – in my backpack were my pole-less tent, sleeping bag, butane stove, water purifier, first aid kit, fishing rod and reel, lures and various other bits of tackle. However, my video equipment – five cameras, batteries, tripods and a solar charger – must have brought the weight to around 60lbs. “See you in four days,” I mumbled to my van before disappearing into the forest.
by Leif Steffny, The Guardian |  Read more:
Image: YouTube/NW Fishing Secrets
[ed. For my grandson. Gotta give the guy an 'A' for enthusiasm (catching an 8" fish, carving a spoon, driving across a little stream...). Also packing as light as possible (plus the 60 lbs of video gear), then pulling out an avocado, tortilla, bottle of chipotle sauce, etc. : ) Strangely relaxing. See also: here and here.]

Paranoia’s Pleasures

I am a paranoid person, which, if we’re not going to be fussy about clinical definitions, means I feel a constant unreasonable fear, one ruled by no overarching logic or taxonomy. I am paranoid about my relationships and my work. I am paranoid about rising sea levels, air pollutants, tap water, dark parking lots, and the back seat of my car. I am paranoid about whether I’ve locked the door—really, properly locked the door. I experience frequent bouts of paranoia in regards to the men in my life—what do they get up to when I’m not around?—as well as to many men I do not know. I realize I don’t look like the paranoid type, which is culturally coded as someone white and male, so I am also paranoid about other paranoiacs, what they make of my face and my monosyllabic last name.

In other words, I am fixated on what I must regularly confront yet cannot control. It is a very human condition, if not the human condition. Philip K. Dick once said that “the ultimate in paranoia is not when everyone is against you but when everything is against you.” Who has not experienced this? It is, in a basic way that has to do with subjectivity and the limits of individual free will, the premise of existence, its inaugural bad deal. We wake up each day to find our environments aligned against us; whatever lies outside our bodies’ jurisdiction is evidence of the world’s ongoing disregard for our inner wishes and designs. We cannot assert our will on life, or move the furniture with our minds; we can only feel unease about the baroque and bewildering prearrangements of both. Why are things like this? And: Who put that chair there?

As Dick noted, everyone recognizes at some point that “objects sometimes seem to possess a will of their own.” Paranoia turns this recognition into enmity, soaking the world in malignant animism, turning all the tables—literal and figurative—against us. This is what Thomas Pynchon acknowledges in the opening of his 1966 novel The Crying of Lot 49, a staple of paranoiac literature. Protagonist Oedipa Maas, a reasonably prosperous housewife who lives among all the comforts of Californian suburbia, has a dark thought about a deceased ex-boyfriend. She’s alone in her house, and the thought makes her laugh out loud: “You’re so sick, Oedipa, she told herself, or the room, which knew.”

The claustrophobia of “the room that knows” might connect Oedipa to her forebears, the mad wives and mistresses of Gothic fiction, but her paranoia is a product of her political moment—the social collapse of postwar California. It emerges when she’s asked to execute the will of Pierce Inverarity, the aforementioned ex-boyfriend (her dark thought was about his death; it was indeed pretty funny). Because Pierce was a rich and evil man with hearty investments in California’s defense industry, his accumulated estate is considerable. Sifting through it requires Oedipa to take a ranging tour of 1960s California, where she meets the state’s preeminent outcasts and kooks: John Birchers, retired anarchists, a suicidal playwright—all the people stuck, wriggling, to the underside of the rock of respectable society. The novel’s atmosphere strikes the dreamy balance between eccentricity and artifice peculiar to the West Coast. The college campuses heave with youthful radicalism, while the gay bars of San Francisco suffer, even then, from busloads of out-of-towners looking for some pre-authenticated thrills. Traveling through these circles, Oedipa learns of a secret, privatized postal service called WASTE; like a dream or an algorithm, her world auto-populates with symbols and messages in a “malignant, deliberate replication.”

Oedipa is one of the few well-known paranoiacs of literature who is also a woman, and for her, WASTE is not merely an abstract conundrum. Her apophenia begins and ends at the personal terminus of her ex-lover. Every sign of or clue about the network’s existence is connected to Pierce Inverarity, to an industry or investment he once touched. Did Inverarity concoct WASTE to torment Oedipa, as “some grandiose practical joke” from beyond the grave? Paranoia, here, leads not to the government or the World Bank but back to Pierce’s bed. It bears the still-warm imprint of a single human body, which makes it all the more terrifying.

If we’re tempted to say that Oedipa represents a female brand of paranoia, that’s only to emphasize that hers is actually a realistic version of the condition. American culture so often construes paranoia as an intellectual—and thus masculine—problem, rather than an emotional one. The truth is that it’s both. Rather than the dry thought exercises we associate with male conspiracy theorists, Pynchon gives us, in Oedipa, a view of what true paranoia is: a gut response to society’s collapse, to the deadening force of American capitalism—a way, maybe, of thrashing about in one’s loneliness and alienation. Knowledge under paranoia takes on emotional dimensions: it feels bad; it feels addictive. This becomes especially true for paranoiacs who aren’t white men, because for them, conspiracies—if you define a conspiracy as “the targeted wielding of systemic violence by the powerful against the powerless”—are often real, which makes unraveling them all the more imperative. What does it feel like to be constantly educating yourself about your own precarity, your own proximity to violence and death?

The theorist Eve Kosofsky Sedgwick once wrote that the paranoiac has a facile “faith in exposure”—a belief that revealing the contours of a vast regime of cruelty is the same thing as eradicating it. Non-white and non-male paranoiacs know it’s not that easy. Knowledge can save or kill us; sometimes, over the long slog of trying to outwit a murderous system, it’ll do both. The only way to survive is to seek out other people who know this truth, and to find solidarity in the shared condition of knowing, intimately, all the problems but none of the solutions.

This is what Oedipa does. Her paranoia pushes her not inward but outward, into America’s tattered public sphere, into whatever social life is possible in a jealously individualistic country. Maybe Oedipa is susceptible to paranoia because of this individualism—its isolating and suburban rhythms—but the sickness, in its full bloom, starts looking like salvation. In searching for and hoarding information, Oedipa swings from one conversation to another; she rides municipal buses all night, talks to strangers in cafés. She meets a drunken sailor and, finding herself “overcome all at once by a need to touch him,” puts him to bed. These small acts of generosity are what remain when the answers don’t come. Paranoia presents an excuse to delve into the social: the people Oedipa speaks to are also suspicious and insane, but they still speak to one another. WASTE, which is ultimately a communications network, could be a ruse. But that network—and, by implication, Oedipa’s search for it—could also be “a real alternative to the exitlessness, to the absence of surprise to life, that harrows the head of everybody American you know.”

by Zoe Hu, The Believer |  Read more:
Image: via
[ed. Social bonding and paranoia seem to co-exist rather easily in many conspiratorial and domestic terrorism groups (and in fact, are probably membership drivers). A good example can be found in Richard Powers's novel The Overstory.]

Making Music Theory Entertaining

To me, music theory is as foreign a concept as the intricacies of cooking meth, and I don’t think I’m alone in that. Sure, I have a basic grasp of scales, and I can read sheet music from my piano lesson days, but as soon as someone mentions the Dorian scale and E-flat-diminished-seventh chords, my eyes start to glaze over. So, when I found myself on the other side of an Adam Neely YouTube wormhole, I was a little surprised — not only had I just sunk four hours of my life into content discussing those aforementioned topics, but I was also thoroughly entertained by it. And I’m not alone.

Neely, 33, has garnered 1.25 million subscribers with his music theory-drenched video content. It’s all highly polished, cleverly edited and consistently funny. Some of his videos are simple Q&As or tongue-in-cheek deep dives into viral moments like the TikTok sea shanty fad, but Adam really shines in his on-camera essays. The wit and editing are still there, accompanied by lush original backing tracks composed by Adam and his colleagues.

I asked his friend and collaborator, Ben Levin, what he thought had garnered Adam such a big following. “I think the key to Adam’s work, and it’s something maybe he takes for granted and can’t put his finger on, but he loves music in a very special way. I don’t think he realizes how contagious that is. Everything he does comes off as incredibly genuine, and I think it makes people realize, ‘dang, music is just so much bigger than I thought it was.’” (...)

Student becomes the teacher

Neely grew up with music. His mother is a singer who performed avant-garde contemporary classical music and taught students in her house. Yet, it wasn’t until high school that Adam became enthralled with it.

“I did the thing that almost every other bassist does which is, ‘I want to join a band. Oh, they already have a guitarist, I guess I’ll play bass.’”

He soon found it wasn’t the short straw he thought it was. His mother brought him to a performance by the late Dave Brubeck and jazz bassist Christian McBride at the Library of Congress. “I knew about jazz. I thought it was fun, but this was the first time I realized that it was really fun. Just watching these two masters going back and forth laughing like there was some sort of inside joke I wasn’t a part of. I didn’t realize until then how life affirming music could be.”(...)

“The plan was to play ‘cool’ jazz music, whatever that is, and teach,” he says. And for Adam, that was teaching at a university. “I was very much ready to be in academia. I knew what it meant to be a working musician from my family, it’s a grind, and they were able to support themselves through teaching, so I had my plan set. Stuff turned out a little differently, but I am teaching jazz and playing it.”

After a stint of playing gigs ranging from weddings to musical theater, Adam was beginning to burn out, and then suddenly a big portion of his paying work fell through. He was unsure what to do with all of his newly found free time, or where he’d find the next check to pay his rent. YouTube didn’t occur to him until a friend recommended it to him.

Neely knew a bit of the video editing program Final Cut, bought himself a DSLR camera, and in the same way he threw himself into bass, he began producing videos. “I would categorize it as edutainment, or curiosity content, letting people discover things they don’t know,” he says, being sure to distinguish himself from other musical education channels that teach theory fundamentals like keys, notation, or the circle of fifths. Neely is too modest to make this comparison, but as Neil DeGrasse Tyson is to astrophysics, Neely is to music theory. He’s a communicator. (...)

This quality of production is consistent throughout much of Neely’s content. It’s all supremely edited and cohesively plotted for a one-man army. “There are very few music YouTubers that work with editors because you want someone who’s adept at editing, but also has an understanding of music theory. There is a very specific timing to how the edit should reflect the music that is in the video.”

You can feel that in his video on “The Girl From Ipanema” where he needs to cut between multiple arrangements of the storied bossa nova track. “Editing is very musical, working in Final Cut reminds me of working in Ableton Live (a digital audio workstation popular with electronic music producers) — it’s about creating a rhythmic flow. One cut of video should flow into the next just as one section of music should lead into the next.” Instead of bleeping curses, Neely plays a characteristic clip of a man yelling “bass,” which feels much like a little flair that an improvising trumpeter might play over a jazz arrangement.

In this video Neely again shares a narrative that isn’t taught in music school. He points out in the beginning of the video that the version of “The Girl from Ipanema” that’s taught in The Real Book, the veritable jazz standards Bible, is a watered-down and white-washed version of the song. Through an in-depth dive into the context and history of bossa nova and the song itself, Neely shows that the original track from Brazil is actually more ambiguous and interesting than the Americanized version that would later be adopted into textbooks.

When asked about these videos, Neely says with a smirk, “I’ve always had kind of a shit-stirrer persona. I’m just doing my best to use that for good.”

by Lukas Harnisch, Spin | Read more:
Image: Liz Maney

Thursday, April 8, 2021


Yamaha XSR 155 Scrambler
via:

Four Ways of Looking at the Radicalism of Joe Biden

Joe Biden didn’t wake up one day and realize he’d been wrong for 30 years.

I covered him in the Senate, in the Obama White House, in the Democratic Party’s post-Trump reckoning. Biden was rarely, if ever, the voice calling for transformational change or go-it-alone ambition.

But you’d never know it from his presidency. The standard explanation for all this is the advent of the coronavirus. The country is in crisis, and Biden is rising to meet the moment. But I don’t buy it. That may explain the American Rescue Plan. But the American Jobs Plan, and the forthcoming American Family Plan, go far beyond the virus. Put together, they are a sweeping indictment of the prepandemic status quo as a disaster for both people and the planet — a status quo that in many cases Biden helped build and certainly never seemed eager to upend.

Over the past few months, I’ve been talking to White House staff members, to congressional Democrats, to policy experts and to the Biden administration’s critics to better understand why President Biden is making such a sharp break with Joe Biden. Here are a few of them, though this is by no means a complete list.

The collapse of the Republican Party as a negotiating partner. Most discussions of the renewed ambitions of the Democratic Party focus on ideological trends on the left. The real starting point, however, is the institutional collapse of the right. Before Biden, Democratic presidents designed policy with one eye on attracting Republican votes, or at least mollifying Republican critics. That’s why a third of the 2009 stimulus was made up of tax cuts, why the Affordable Care Act was built atop the Romneycare framework, why President Bill Clinton’s first budget included sharp spending cuts. Both as a senator and a vice president, Biden backed this approach. He always thought a bipartisan deal could be made and usually believed he was the guy who could make it.

But over the past decade, congressional Republicans slowly but completely disabused Democrats of these hopes. The long campaign against the ideological compromise that was the Affordable Care Act is central here, but so too was then-Speaker John Boehner’s inability to sell his members on the budget bargain he’d negotiated with President Barack Obama, followed by his refusal to allow so much as a vote in the House on the 2013 immigration bill. And it’s impossible to overstate the damage that Mitch McConnell’s stonewalling of Merrick Garland, followed by his swift action to replace Justice Ruth Bader Ginsburg, did to the belief among Senate Democrats that McConnell was in any way, in any context, a good-faith actor. They gave up on him completely.

The result is that Obama, Biden, the key political strategists who advise Biden and almost the entire Democratic congressional caucus simply stopped believing Republicans would ever vote for major Democratic bills. They listened to McConnell when he said that “the only way the American people would know that a great debate was going on was if the measures were not bipartisan.” And so Democrats stopped devising compromise bills in a bid to win Republican votes.

This has transformed policy design: These are now negotiations among Democrats, done with the intention of finding policies popular enough that Republican voters will back them, even if Republican politicians will not. Biden still talks like he believes bipartisanship is possible in Congress, but his administration has put the onus on Republicans to prove it, and to do so on the administration’s terms. That, more than any other single factor, has unleashed Democrats’ legislative ambitions.

A new generation of crises created a new generation of staffers. I’ve been struck by the generational divide within the Democratic Party. Washington is run by 20- and 30-somethings who run the numbers, draft the bills, brief the principals. And there is a marked difference between the staffers and even the politicians whose formative years were defined by stagflation, the rise of Reaganism and the relief of the Clinton boom, and those who came of age during financial crises, skyrocketing personal debt, racial reckonings and the climate emergency. There are exceptions to every rule, of course — see Sanders, Bernie — but in general, the younger generation has sharply different views on the role of government, the worth of markets and the risks worth taking seriously.

I put this observation to Brian Deese, the 43-year-old head of the National Economic Council. Deese was a young economic policy prodigy in the Obama administration. Now he’s the guy running the N.E.C., and he agreed that the new generation of staff members see the world very differently. “There has been a lot more work done to try to understand what the roots of economic inequality are over the course of the last decade, and openness to thinking about power and power dynamics,” he told me. (...)

Biden has less trust in economists, and so does everyone else. Obama’s constant frustration was that politicians didn’t understand economics. Biden’s constant frustration is that economists don’t understand politics.

Multiple economists, both inside and outside the Biden administration, told me that this is an administration in which economists and financiers are simply far less influential than they were in past administrations. Some were frustrated by the change, others thought it a proper rebalancing of roles. But there is nothing like the axis of influence held by Summers, Tim Geithner and Peter Orszag at the dawn of the Obama administration, or that Robert Rubin and Summers held in the Clinton administration. Janet Yellen, the Treasury secretary, holds real weight in internal discussions, and so do some others, but economists are one of many voices at the table, not the dominant voices. This partly reflects Biden himself: he’s less academically minded, and more naturally skeptical of the way economists view the world and human behavior, than either Obama or Clinton. But it goes deeper than that.

The backdrop for this administration is the failures of the past generation of economic advice. Fifteen years of financial crises, yawning inequality and repeated debt panics that never showed up in interest rates have taken the shine off economic expertise. But the core of this story is climate. “Many mainstream economists, even in the 1980s, recognized that the market wouldn’t cover everyone’s needs so you’d need some modest amount of public support to correct for that moderate market failure,” Felicia Wong, the president of the Roosevelt Institute, said. “But they never envisioned the climate crisis. This is not a failure of the market at the margins. This is the market incentivizing destruction.” (...)

Economists have their ideas for solving climate change — a hefty carbon tax chief among them — but Biden and his team see this as fundamentally a political problem. They view the idea that a carbon tax is the essential answer to the problem of climate change as being so divorced from political reality as to be actively dangerous. Deese gets animated on this point. “I want to double down on that and say, it’s not just a messaging and narrative imperative,” he told me. “It has to be that Americans see and experience that the investments in building out a more resilient power grid actually improve their lives and create job opportunities for them, or their neighbors.”

Even beyond climate, political risks weigh more heavily on the Biden administration than they did on past administrations. This is another lesson learned from the Obama years. The Obama team had real policy successes: They prevented another Great Depression, they re-regulated the financial sector, they expanded health insurance to more than 20 million people. But Democrats lost the House in 2010, effectively ending Obama’s legislative agenda, and then they lost the Senate in 2014, and then Donald Trump won the White House in 2016, and then Democrats lost the Supreme Court for a generation.

Many who served under Obama, and who now serve under Biden, believe that they were so focused on economic risks that they missed the political risks — and you can’t make good economic policy if you lose political power. The Biden team is haunted by the fear that if they fail, a Trump-like strongman could recapture power. This helps explain why, for instance, they’re unmoved by arguments that the $1,400 stimulus checks, though wildly popular, were poorly targeted. As one of Biden’s economic advisers put it to me, “if we don’t show people we’re helping the dickens out of them, this country could be back to Trump way too quickly,” only he used an earthier word than “dickens.”

Biden is a politician, in the truest sense of the word. Biden sees his role, in part, as sensing what the country wants, intuiting what people will and won’t accept, and then working within those boundaries. In America, that’s often treated as a dirty business. We like the aesthetics of conviction, we believe leaders should follow their own counsel, we use “politician” as an epithet.

But Biden’s more traditional understanding of the politician’s job has given him the flexibility to change alongside the country. When the mood was more conservative, when the idea of big government frightened people and the virtues of private enterprise gleamed, Biden reflected those politics, calling for balanced budget amendments and warning of “welfare mothers driving luxury cars.” Then the country changed, and so did he.

A younger generation revived the American left, and Bernie Sanders’s two campaigns proved the potency of its politics. Republicans abandoned any pretense of fiscal conservatism, and Trump raised — but did not follow through on — the fearful possibility of a populist conservatism, one that would combine xenophobia and resentment with popular economic policies. Stagnating wages and a warming world and Hurricane Katrina and a pandemic virus proved that there were scarier words in the English language than “I’m from the government, and I’m here to help,” as Ronald Reagan famously put it.

Even when Biden was running as the moderate in the Democratic primary, his agenda had moved well to the left of anything he’d supported before. But then he did something unusual: Rather than swinging to the center in the general election, he went further left. And the same thing happened after he won the election. He’s moved away from work requirements and complex targeting in policy design. He’s emphasizing the irresponsibility of allowing social and economic problems to fester, as opposed to the irresponsibility of spending money on social and economic problems. His administration is defined by the fear that the government isn’t doing enough, not that it’s doing too much.

by Ezra Klein, NY Times |  Read more:
Image: Amr Alfiky/The New York Times
[ed. Exactly.]