
Wednesday, October 29, 2025

Please Do Not Ban Autonomous Vehicles In Your City

I was listening with horror to a Boston City Council meeting today where many council members made it clear that they’re interested in effectively banning autonomous vehicles (AVs) in the city.

A speaker said that Waymo (the AV company requesting clearance to run in Boston) was only interested in not paying human drivers (Waymo is a new company that has never had human drivers in the first place) and then referred to the ‘notion that somehow our cities are unsafe because people are driving cars’ as if this were a crazy idea. A council member strongly implied that valuable new technology always causes us to value people less. One speaker associated Waymo with the Trump administration. There were a lot of implications that AVs couldn’t possibly be as good as human drivers, despite lots of evidence to the contrary. Some speeches included criticisms that applied equally well to what Uber did to taxis, but were now deployed to defend Uber.

AVs are ridiculously safe compared to human drivers

The most obvious reason to allow AVs in your city is that every time a rider takes one over driving a car themselves or getting in a ride share, their odds of being in a crash that causes serious injury or worse drop by about 90%. I’d strongly recommend this deep dive on every single crash Waymo has had so far:

[Very few of Waymo’s most serious crashes were Waymo’s fault (Understanding AI).]

This is based on public police records rather than Waymo’s self-reported crashes. It doesn’t seem like there have been any serious crashes Waymo’s been involved in where the AV itself was at fault. This is wild, because Waymo’s driven over 100 million miles. These statistics were brought up out of context in the hearing to imply that Waymo is dangerous. By any normal metric it’s much safer than human drivers.

40,000 people die in car accidents in America each year. This is as many deaths as 9/11 every single month. We should be treating this as more of an emergency than we do. Our first thought in making any policy related to cars should be “How can we do everything we can to stop so many people from being killed?” Everything else is secondary to that. Dropping the rate of serious crashes by even 50% would save 20,000 people a year. Here’s 20,000 dots:


The more people choose to ride AVs over human-driven cars, the fewer total crashes will happen.

One common argument is that Waymos are very safe compared to everyday drivers, but not professional drivers. I can’t find super reliable data, but ride share accidents seem to occur at a rate of about 40 per 100 million miles traveled. Waymo, in comparison, was involved in 34 crashes where airbags deployed over its 100 million miles, and 45 crashes altogether. Crucially, it seems like the AV was only at fault for one of these, when a wheel fell off. There’s no similar data for how many Uber and Lyft crashes were the driver’s fault, but they’re competing with what looks like effectively zero at-fault crashes per 100 million miles.
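To make the comparison concrete, here is a minimal back-of-the-envelope sketch in Python, using only the rough figures quoted above (the ride-share rate and Waymo's crash counts are the article's approximations, not official statistics):

```python
# Rough crash-rate comparison per 100 million miles, using the article's
# approximate figures (not official data from Waymo, Uber, or Lyft).

RIDE_SHARE_CRASHES_PER_100M_MILES = 40   # estimated ride-share crash rate
WAYMO_MILES = 100_000_000                # roughly 100 million driverless miles
WAYMO_TOTAL_CRASHES = 45                 # all reported Waymo crashes
WAYMO_AT_FAULT_CRASHES = 1               # the single at-fault incident (a wheel fell off)


def per_100m_miles(crashes: int) -> float:
    """Normalize a crash count to a rate per 100 million miles."""
    return crashes / (WAYMO_MILES / 100_000_000)


print(f"Ride share, all crashes:  ~{RIDE_SHARE_CRASHES_PER_100M_MILES} per 100M miles")
print(f"Waymo, all crashes:       ~{per_100m_miles(WAYMO_TOTAL_CRASHES):.0f} per 100M miles")
print(f"Waymo, at-fault crashes:  ~{per_100m_miles(WAYMO_AT_FAULT_CRASHES):.0f} per 100M miles")
```

On those assumptions, Waymo's overall crash involvement is in the same ballpark as ride-share rates, but its at-fault rate works out to roughly one per 100 million miles, which is the gap the paragraph above is pointing at.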

by Andy Masley, The Weird Turn Pro |  Read more:
Image: Smith Collection/Gado/Getty Images

Tuesday, October 28, 2025

The Uncool: A Memoir

Who Is Cameron Crowe Kidding With the Title of His Memoir?

One of the greatest tricks cool people play on the rest of us is convincing us in their memoirs that they were and are profoundly uncool. Cameron Crowe comes right out with the pandering on his book’s cover: “The Uncool: A Memoir.”

The title refers to a scene in “Almost Famous” (2000), the tender film he wrote and directed. The headstrong rock critic Lester Bangs (Philip Seymour Hoffman) is consoling the Crowe-like hero, a floppy-haired teenage rock journalist, over the telephone at a low moment. Bangs says, “The only true currency in this bankrupt world is what you share with someone else when you’re uncool.” It’s a good line. Call me anytime, Bangs adds: “I’m always home. I’m uncool.”

Never mind whether Lester Bangs was plausibly uncool. How about Crowe? Here’s a man who spent his adolescence in the 1970s careening around the United States for Rolling Stone magazine, a boy wonder in the intimate and extended company of David Bowie, Led Zeppelin, Gram Parsons, the Allman Brothers, Fleetwood Mac, Emmylou Harris, Kris Kristofferson, the Eagles, Todd Rundgren and Yes, about whom he was writing profiles and cover stories.

Occasionally, he’d fly home to see his mother, check out high school for a day or two, then blearily type up his road memories and interview notes. Sounds uncool to me.

The second act of Crowe’s career began when, in his early 20s, he went undercover for a year, posing as a high school student in San Diego, and wrote the experience up in a book called “Fast Times at Ridgemont High.” Crowe and the director Amy Heckerling turned it into a wide-awake 1982 movie that provided rocket fuel for Sean Penn, who played the perpetually stoned surfer Jeff Spicoli.

Crowe, who burned out young as a journalist, pivoted to film. He wrote and directed “Say Anything” (1989), with John Cusack, Ione Skye and a famous boombox; “Singles” (1992), a romantic early look at the Seattle grunge scene; and “Jerry Maguire” (1996), with Tom Cruise and RenĂ©e Zellweger, before winning an Oscar for his “Almost Famous” screenplay. All this while married to Nancy Wilson, the guitarist in Heart. No sane person would trade their allotment of experience for this man’s. Omnidirectionally uncool.

When you read Crowe’s memoir, though, you begin to see things from his unhip point of view. He had no interest in drink and drugs while on the road, though Gregg Allman tried to hook him up with a speedball. He seems to have mostly abstained from sex, too, though there’s something about his adoration in the presence of his rock heroes that makes it seem he’s losing his virginity every few pages.

His editors at Rolling Stone thought he was uncool, increasingly as time went on, because the acolyte in him overrode the journalist. He Forrest Gumped along. Bands liked having Crowe around because he was adorable and a bit servile; he’d often leave out the bits they wanted left out. (...)

Crowe thought rock writers were snobs. He moved in with Glenn Frey and Don Henley of the Eagles while profiling them, for example, and he was in the room when they wrote “One of These Nights” and “Lyin’ Eyes.” It bugged him to see them put down:
A collection of rock writers at a party would challenge each other on their musical taste, each one going further and further into the world of the obscure until they’d collectively decided that “Self Portrait” was Bob Dylan’s greatest album and the Eagles barely deserved a record contract.
He especially liked Frey, because his message to the world seemed to be: “Lead with your optimism.” This was Crowe’s mother’s ethos, as well, and it chimed with his own. It’s a worldview that has worked for him in his best movies, though he’s also made gooey flops. The world needs its Paul McCartneys as much as it needs its Lou Reeds. It makes sense that Reed only sneered when he met Crowe. (...)

The crucial thing to know about this book is that it overlaps almost exactly with the story Crowe tells in “Almost Famous.” If you remember the phrases “It’s all happening” and “Don’t take drugs,” or the young woman — a “Band-Aid” in the movie’s argot — who is offered for a case of Heineken, or the rock star who briefly kills an important story, or Crowe’s flight-attendant sister, or the group sex scene that seems like a series of flickering veils, or the L.A. hotel known as the Riot House, or Lester Bangs acting out in a glassed-in first-floor radio studio, it’s all here and more.

The book reads like a novelization of the movie, so much so that it makes you consider the nature of memory. I’m not suggesting Crowe is making things up in this memoir. I’m merely suggesting that the stories he wrote for the movie may have been so reverberant that they began to subtly bleed into his own.

The secret to the movie, one that most people miss, Crowe says, is the empty chair at the family’s dining-room table. It belonged to Crowe’s older sister, Cathy, who was troubled from birth and died by suicide at 19. This detail reminds you how relatively sanitized this book otherwise is. There is little that’s grainy or truly revelatory about his own life and loves. The book ends before his directing career has begun, thus leaving room for a sequel. Everything is a bit gauzy, soft-core.

God help me, I read this book quickly and enjoyed it anyway: The backstage details alone keep this kite afloat. It got to me in the same way “Almost Famous” always gets to me, despite the way that movie sets off my entire bank of incoming sentimentality detectors. If you can watch the “Tiny Dancer” scene without blinking back a tear, you’re a stronger person than me. 

by Dwight Garner, NY Times |  Read more:
Image: Neal Preston

Wednesday, October 22, 2025

Norman Greenbaum

Norman Greenbaum, singer, guitarist, songwriter

Spirit in the Sky started as an old blues riff I’d been playing since my college days in Boston, but I didn’t know what to do with it. After I moved to LA, a guy I knew came up with a way of putting a fuzzbox inside my Fender Telecaster, which created the distinctive sound on Spirit in the Sky.

I’d come across a greeting card with a picture of some Native Americans praying to the “spirit in the sky”. The phrase stuck in my head. One night I was watching country music on TV and the singer Porter Wagoner sang a gospel song, which gave me the idea to write religious lyrics. Although I came from a semi-religious Jewish family, I wasn’t religious, but found myself writing Christian lyrics such as “When I die and they lay me to rest, I’m going to the place that’s the best” and “Gotta have a friend in Jesus”. It came together very quickly.

Soon after that, I was playing the Troubadour club in LA when the Lovin’ Spoonful’s producer Erik Jacobsen walked in. He said he had a production deal with Warner Brothers and was interested in signing me. When we recorded Spirit in the Sky for my debut album, the finished mix sent shivers up my spine. Initially, Warner said a four-minute single containing lyrics about Jesus would never get played on pop radio, but eventually they relented. In 1969, it sold two million copies. But I couldn’t recreate the success.

In 1986, I was working as a cook when Dr and the Medics took it back to No 1 in the UK. Then Gareth Gates’s 2003 version meant it was No 1 in three different decades. It’s been in countless movies, including Apollo 13, Ocean’s 11 and Guardians of the Galaxy. I’m 82. A few years ago, I was a passenger in a car crash and spent three weeks in a coma. I feel like I was granted another life. So now every day, I pray and give thanks to the spirit in the sky.

Erik Jacobsen, producer

I saw Norman at a hootenanny at the Troubadour singing one song, School for Sweet Talk, but he said: “I’ve got a million songs I’d love to play for ya.” It turned out he’d had a minor hit called The Eggplant That Ate Chicago with a group called Dr West’s Medicine Show and Junk Band and had a whole raft of crazy songs about goats, chickens or a Chinese guy who ate some acid. I said: “Let’s make some records that somebody might like.”

I put him together with Norman Mayell, the drummer from San Francisco psychedelic group Sopwith Camel, and Doug Killmer, a bassist, who’d played a lot of black music. The Spirit in the Sky riff originated in an old John Lee Hooker tune called Boogie Chillen’ and set the tone for where the song went, but the rhythm track sounded too loose. I got Norman to bring his acoustic guitar in and we recorded two performances – each slightly different – and made it stereo. Then we brought in gospel singers the Stovall Singers and their church-type clapping became a key part of the groove. A guitarist called Russell DaShiell played a hell of a solo. By now, the track was sounding immense, but when I heard Norman’s little vocal, my heart sank. It just wasn’t heavy enough, so once again I recorded two performances and combined the two together. I thought: “Thank God!” It sounded amazing. (...)

The funny thing is that when we went in to record it, my engineer was sick but we went ahead anyway with just a handful of little mics, no headphones and no sound baffling. Every sound was coming in on every mic, but it sounded great. For years people asked: “How in the world did you get that sound?” I said: “I just pointed the amps right at the drums. I had no idea what I was doing.”

by Dave Simpson, The Guardian |  Read more:
Image: Henry Diltz/Corbis/Getty Images
[ed. I think Norman (Iron Butterfly and Jimi) did more to invent the term "heavy" back in the late-60s than anybody else - along with this new thing called a fuzz box. See also: The Uncool by Cameron Crowe - Inside Rock's Wildest Decade (Guardian).]

Sunday, October 19, 2025

Gerrymandering - Past, Present, Future

‘I think we’ll get five,’ President Trump said, and five was what he got. At his prompting, the Republican-dominated Texas legislature remapped the districts to be used in next year’s elections to the federal House of Representatives. Their map includes five new seats that are likely to be won by the Republicans, who already hold 25 of the state’s 38 seats. Until this year, the Democrat Al Green’s Ninth Congressional District covered Democrat-leaning south and south-western Houston. Now, it ranges east over Republican-leaning Harris County and Liberty County, with most of the former constituency reallocated to other districts. Green has accused Trump and his allies in Texas of infusing ‘racism into Texas redistricting’ by targeting Black representatives like him and diluting the Black vote. ‘I did not take race into consideration when drawing this map,’ Phil King, the state senator responsible for the redistricting legislation, claimed. ‘I drew it based on what would better perform for Republican candidates.’ His colleague Todd Hunter, who introduced the redistricting bill, agreed. ‘The underlying goal of this plan is straightforward: improve Republican political performance.’


King and Hunter can say these things because there is no judicial remedy for designing a redistricting map that sews up the outcome of a congressional election. In 2019, Chief Justice John Roberts declared that although the Supreme Court ‘does not condone excessive partisan gerrymandering’, any court-mandated intervention in district maps would inevitably look partisan and impugn the court’s neutrality. In 2017, during arguments in a different case, Roberts contrasted the ‘sociological gobbledygook’ of political science on gerrymandering with the formal and objective science of American constitutional law.

‘Sociological gobbledygook’ teaches that the drawing of the boundaries of single-member districts can all but determine the outcome of an election. Imagine a state with twenty blue and thirty red voters that must be sliced into five districts. A map that tracked the overall distribution of votes would have two blue and three red districts. But if you can put six red voters and four blue voters in each of the five boxes, you will end up with five relatively safe red districts. This is known as ‘cracking’ the blue electorate. Or you could create two districts with six blues and one with eight blues, making three safe blue districts by ‘packing’ red supporters – concentrating them in a smaller number of districts. The notion that democratic elections are supposed to allow voters to make a real choice between candidates, or even kick out the bums in power, sits uneasily with the combination of untrammelled redistricting power and predictable political preferences that characterise the US today. But if it is so easy for mapmakers to vitiate the democratic purpose of elections in single-member districts, doesn’t neutrality demand some constraint on the ability of incumbents to choose voters, rather than the other way round?
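The cracking-and-packing arithmetic is simple enough to check directly. Below is a small illustrative Python sketch of the article's toy state (20 blue voters, 30 red voters, five ten-voter districts); the specific district assignments are invented for illustration, not drawn from any real map:

```python
# Toy model of cracking and packing: 20 blue and 30 red voters split into
# five ten-voter districts. Each district is a (blue, red) voter count.

def seats(districts):
    """Count districts won outright by each side."""
    return {
        "blue": sum(1 for b, r in districts if b > r),
        "red": sum(1 for b, r in districts if r > b),
    }

# A map roughly tracking the statewide split: 2 blue seats, 3 red seats.
proportional = [(7, 3), (7, 3), (2, 8), (2, 8), (2, 8)]

# "Cracking": spread blue voters thinly so they are outvoted everywhere.
cracked = [(4, 6)] * 5

# "Packing": concentrate red voters in two conceded districts, leaving
# three safe blue districts elsewhere.
packed = [(6, 4), (6, 4), (8, 2), (0, 10), (0, 10)]

for name, districts in [("proportional", proportional),
                        ("cracked", cracked),
                        ("packed", packed)]:
    # Every map has the same statewide vote: 20 blue, 30 red.
    assert sum(b for b, _ in districts) == 20
    assert sum(r for _, r in districts) == 30
    print(name, seats(districts))
```

The statewide vote is identical in all three maps; only the lines move, and the seat count swings from two blue and three red, to zero blue and five red, to three blue and two red.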

After the Texas redistricting, Roberts’s belief that neutrality requires inaction appears even shakier. By adding five seats to the expected Texan Republican delegation, the Republican Party improves the odds it will retain, or even increase, its six-seat majority in the House in November 2026. Even a slight advantage gained through redistricting may have national implications because the Democrats’ lead in the polls is consistently small (around two points). Congressional maps are usually redrawn once every ten years, after each decennial census (the next one is in 2030). Mid-cycle redistricting does sometimes happen – Texas did the same thing two decades ago – but it is unusual. So is Trump’s open embrace of gerrymanders. In 1891, Benjamin Harrison condemned gerrymandering as ‘political robbery’. Sixty years later, Harry Truman called for federal legislation to end its use; a bill was introduced in the House but died in the Senate. In 1987, Ronald Reagan told a meeting of Republican governors that gerrymanders were ‘corrupt’. (...)

Democratic states have threatened to retaliate. In California, Governor Gavin Newsom has scheduled a special election on Proposition 50, which would temporarily suspend the state’s independent redistricting commission, making it possible for the Democratic legislature to flip five Republican seats (43 of California’s 52 seats are held by Democrats). Like California, New York has a bipartisan commission, which usually redraws its maps once a decade. The Democrats have brought in legislation allowing mid-decade changes, but new maps won’t be in place until 2028. Democrats who used to be fierce advocates of independent commissions are now asking themselves whether they’ve been too slow to fight back. From a party that has a habit of bringing a knife to a gunfight, the question answers itself.

In the late 20th century, there were only ten seats nationally that repeatedly changed hands as a result of partisan gerrymandering, with control of the House flipping on just one occasion, in 1954. But in 2012, Republicans started to change this. Michigan, North Carolina, Ohio, Pennsylvania and Virginia were all sliced up. The increase in gerrymanders was in part a result of Redmap, the Redistricting Majority Project, a Republican initiative set up in 2010 which invested in the races for the state legislatures, such as Texas’s, tasked with drawing district maps. In 1981, Democrats controlled the mapmaking process in 164 seats, while Republicans controlled it in 50. By 2021, the Republicans controlled line-drawing for 187 seats, the Democrats 49. At the same time, computers had made it cheaper and easier to design maps optimising one party’s performance without breaking the legal constraints on redistricting, such as the Voting Rights Act and the prohibition on districts drawn on the basis of race. In the 1980s, it cost $75,000 to buy software to do this; by the early 2000s, programs such as Maptitude for Redistricting cost $3000.

Just as in the late 19th century, urbanisation is now producing new political geography: migration from Democrat-leaning states such as California, New York, Pennsylvania and Illinois means they will lose House seats after the 2030 census. Meanwhile, Texas, Florida, Georgia and North Carolina, all of which lean Republican, are set to gain seats. Texas’s gerrymander, in other words, foreshadows a change in national political power that is coming anyway.

by Aziz Huq, London Review of Books |  Read more:
Image: The Ninth Congressional District in Texas, before and after this year’s remapping.
[ed. If you can't win fair and square, cheat. It looks almost certain that all national elections going forward will be a nightmare.]

Friday, October 17, 2025

The '3.5% Rule'

 How a small minority can change the world.

Nonviolent protests are twice as likely to succeed as armed conflicts – and those engaging a threshold of 3.5% of the population have never failed to bring about change.

In 1986, millions of Filipinos took to the streets of Manila in peaceful protest and prayer in the People Power movement. The Marcos regime folded on the fourth day.

In 2003, the people of Georgia ousted Eduard Shevardnadze through the bloodless Rose Revolution, in which protestors stormed the parliament building holding the flowers in their hands. And in 2019, the presidents of Sudan and Algeria both announced they would step aside after decades in office, thanks to peaceful campaigns of resistance.

In each case, civil resistance by ordinary members of the public trumped the political elite to achieve radical change.

There are, of course, many ethical reasons to use nonviolent strategies. But compelling research by Erica Chenoweth, a political scientist at Harvard University, confirms that civil disobedience is not only the moral choice; it is also the most powerful way of shaping world politics – by a long way.

Looking at hundreds of campaigns over the last century, Chenoweth found that nonviolent campaigns are twice as likely to achieve their goals as violent campaigns. And although the exact dynamics will depend on many factors, she has shown it takes around 3.5% of the population actively participating in the protests to ensure serious political change. (...)

Working with Maria Stephan, a researcher at the International Center on Nonviolent Conflict (ICNC), Chenoweth performed an extensive review of the literature on civil resistance and social movements from 1900 to 2006 – a data set then corroborated with other experts in the field. They primarily considered attempts to bring about regime change. A movement was considered a success if it fully achieved its goals both within a year of its peak engagement and as a direct result of its activities. A regime change resulting from foreign military intervention would not be considered a success, for instance. A campaign was considered violent, meanwhile, if it involved bombings, kidnappings, the destruction of infrastructure – or any other physical harm to people or property.

“We were trying to apply a pretty hard test to nonviolent resistance as a strategy,” Chenoweth says. (The criteria were so strict that India’s independence movement was not considered as evidence in favour of nonviolent protest in Chenoweth and Stephan’s analysis – since Britain’s dwindling military resources were considered to have been a deciding factor, even if the protests themselves were also a huge influence.)

By the end of this process, they had collected data from 323 violent and nonviolent campaigns. And their results – which were published in their book Why Civil Resistance Works: The Strategic Logic of Nonviolent Conflict – were striking.

Strength in numbers

Overall, nonviolent campaigns were twice as likely to succeed as violent campaigns: they led to political change 53% of the time compared to 26% for the violent protests.

This was partly the result of strength in numbers. Chenoweth argues that nonviolent campaigns are more likely to succeed because they can recruit many more participants from a much broader demographic, which can cause severe disruption that paralyses normal urban life and the functioning of society.

In fact, of the 25 largest campaigns that they studied, 20 were nonviolent, and 14 of these were outright successes. Overall, the nonviolent campaigns attracted around four times as many participants (200,000) as the average violent campaign (50,000).

The People Power campaign against the Marcos regime in the Philippines, for instance, attracted two million participants at its height, while the Brazilian uprising in 1984 and 1985 attracted one million, and the Velvet Revolution in Czechoslovakia in 1989 attracted 500,000 participants.

“Numbers really matter for building power in ways that can really pose a serious challenge or threat to entrenched authorities or occupations,” Chenoweth says – and nonviolent protest seems to be the best way to get that widespread support.

Once around 3.5% of the whole population has begun to participate actively, success appears to be inevitable. (...)
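For a sense of what that threshold means in absolute numbers, here is a quick illustrative calculation (the 3.5% figure is Chenoweth's; the population figures are rounded estimates I have assumed for illustration, not data from the article):

```python
# Approximate scale of the 3.5% participation threshold.
# Population figures are rounded estimates for illustration only.

THRESHOLD = 0.035

populations = {
    "United States (today)": 330_000_000,
    "Philippines (1986)": 55_000_000,
    "Czechoslovakia (1989)": 15_500_000,
}

for place, population in populations.items():
    active = THRESHOLD * population
    print(f"{place}: ~{active / 1_000_000:.1f} million active participants")
```

By this arithmetic, the roughly two million People Power demonstrators and 500,000 Velvet Revolution participants cited above both land near the 3.5% mark, while 3.5% of the present-day US would be on the order of 11 to 12 million people.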

Chenoweth admits that she was initially surprised by her results. But she now cites many reasons that nonviolent protests can garner such high levels of support. Perhaps most obviously, violent protests necessarily exclude people who abhor and fear bloodshed, whereas peaceful protesters maintain the moral high ground. (...)

“There are more options for engaging in nonviolent resistance that don’t place people in as much physical danger, particularly as the numbers grow, compared to armed activity,” Chenoweth says. “And the techniques of nonviolent resistance are often more visible, so that it's easier for people to find out how to participate directly, and how to coordinate their activities for maximum disruption.”

by David Robson, BBC |  Read more:
Images: Getty Images
[ed. I'll be at the No Kings 2.0 rally tomorrow. As a rule, I tend to avoid these things since they mostly seem performative in nature (goofy costumes, dumb signs, mugging for the media, etc.), or devolve into violence if a few bad actors aren't immediately reined in. But in this case, the issues threatening our constitution and democracy seem so great that merely voting every few years and writing letters isn't enough. I doubt it'll change anything this administration does or has planned, but maybe some other institutions (e.g. Congress) might actually be scared or emboldened enough to grow a spine. I only wish they'd named it something other than No Kings (many countries actually support constitutional monarchies - Britain, Netherlands, Sweden, Japan, Norway, Spain, etc. It's the absolute ones - now and throughout history - that give the term a bad name: think Saudi Arabia, Oman, North Korea, etc.). I'm especially concerned that we may never see an uncontested national election again if one party refuses to accept results (or reality).]

Enshittification: Why Everything Sucks Now

We all feel it: Our once-happy digital spaces have become increasingly less user-friendly and more toxic, cluttered with extras nobody asked for and hardly anybody wants. There’s even a word for it: “enshittification,” named 2023 Word of the Year by the American Dialect Society. The term was coined by tech journalist/science fiction author Cory Doctorow, a longtime advocate of digital rights. Doctorow has spun his analysis of what’s been ailing the tech industry into an eminently readable new book, Enshittification: Why Everything Suddenly Got Worse and What To Do About It. (...)

People generally use “enshittification” colloquially to mean “the degradation in the quality and experience of online platforms over time.” Doctorow’s definition is more specific, encompassing “why an online service gets worse, how that worsening unfolds,” and how this process spreads to other online services, such that everything is getting worse all at once.

For Doctorow, enshittification is a disease with symptoms, a mechanism, and an epidemiology. It has infected everything from Facebook, Twitter, Amazon, and Google, to Airbnb, dating apps, iPhones, and everything in between. “For me, the fact that there were a lot of platforms that were going through this at the same time is one of the most interesting and important factors in the critique,” he said. “It makes this a structural issue and not a series of individual issues.”

It starts with the creation of a new two-sided online product of high quality, initially offered at a loss to attract users—say, Facebook, to pick an obvious example. Once the users are hooked on the product, the vendor moves to the second stage: degrading the product in some way for the benefit of their business customers. This might include selling advertisements, scraping and/or selling user data, or tweaking algorithms to prioritize content the vendor wishes users to see rather than what those users actually want.

This locks in the business customers, who, in turn, invest heavily in that product, such as media companies that started Facebook pages to promote their published content. Once business customers are locked in, the vendor can degrade those services too—i.e., by de-emphasizing news and links away from Facebook—to maximize profits to shareholders. Voila! The product is now enshittified.

The four horsemen of the shitocalypse

Doctorow identifies four key factors that have played a role in ushering in an era that he has dubbed the “Enshittocene.” The first is competition (markets), in which companies are motivated to make good products at affordable prices, with good working conditions, because otherwise customers and workers will go to their competitors. The second is government regulation, such as antitrust laws that serve to keep corporate consolidation in check, or levying fines for dishonest practices, which makes it unprofitable to cheat.

The third is interoperability: the inherent flexibility of digital tools, which can play a useful adversarial role. “The fact that enshittification can always be reversed with a dis-enshittifiting counter-technology always acted as a brake on the worst impulses of tech companies,” Doctorow writes. Finally, there is labor power; in the case of the tech industry, highly skilled workers were scarce and thus had considerable leverage over employers.

All four factors, when functioning correctly, should serve as constraints to enshittification. However, “One by one each enshittification restraint was eroded until it dissolved, leaving the enshittification impulse unchecked,” Doctorow writes. Any “cure” will require reversing those well-established trends.

But isn’t all this just the nature of capitalism? Doctorow thinks it’s not, arguing that the aforementioned weakening of traditional constraints has resulted in the usual profit-seeking behavior producing very different, enshittified outcomes. “Adam Smith has this famous passage in Wealth of Nations about how it’s not due to the generosity of the baker that we get our bread but to his own self-regard,” said Doctorow. “It’s the fear that you’ll get your bread somewhere else that makes him keep prices low and keep quality high. It’s the fear of his employees leaving that makes him pay them a fair wage. It is the constraints that causes firms to behave better. You don’t have to believe that everything should be a capitalist or a for-profit enterprise to acknowledge that that’s true.”

Our wide-ranging conversation below has been edited for length to highlight the main points of discussion.

Ars Technica: I was intrigued by your choice of framing device, discussing enshittification as a form of contagion.

Cory Doctorow: I’m on a constant search for different framing devices for these complex arguments. I have talked about enshittification in lots of different ways. That frame was one that resonated with people. I’ve been a blogger for a quarter of a century, and instead of keeping notes to myself, I make notes in public, and I write up what I think is important about something that has entered my mind, for better or for worse. The downside is that you’re constantly getting feedback that can be a little overwhelming. The upside is that you’re constantly getting feedback, and if you pay attention, it tells you where to go next, what to double down on.

Another way of organizing this is the Galaxy Brain meme, where the tiny brain is “Oh, this is because consumers shopped wrong.” The medium brain is “This is because VCs are greedy.” The larger brain is “This is because tech bosses are assholes.” But the biggest brain of all is “This is because policymakers created the policy environment where greed can ruin our lives.” There’s probably never going to be just one way to talk about this stuff that lands with everyone. So I like using a variety of approaches. I suck at being on message. I’m not going to do Enshittification for the Soul and Mornings with Enshittifying Maury. I am restless, and my Myers-Briggs type is ADHD, and I want to have a lot of different ways of talking about this stuff.

Ars Technica: One site that hasn’t (yet) succumbed is Wikipedia. What has protected Wikipedia thus far?

Cory Doctorow: Wikipedia is an amazing example of what we at the Electronic Frontier Foundation (EFF) call the public interest Internet. Internet Archive is another one. Most of these public interest Internet services start off as one person’s labor of love, and that person ends up being what we affectionately call the benevolent dictator for life. Very few of these projects have seen the benevolent dictator for life say, “Actually, this is too important for one person to run. I cannot be the keeper of the soul of this project. I am prone to self-deception and folly just like every other person. This needs to belong to its community.” Wikipedia is one of them. The founder, my friend Jimmy Wales, woke up one day and said, “No individual should run Wikipedia. It should be a communal effort.”

There’s a much more durable and thick constraint on the decisions of anyone at Wikipedia to do something bad. For example, Jimmy had this idea that you could use AI in Wikipedia to help people make entries and navigate Wikipedia’s policies, which are daunting. The community evaluated his arguments and decided—not in a reactionary way, but in a really thoughtful way—that this was wrong. Jimmy didn’t get his way. It didn’t rule out something in the future, but that’s not happening now. That’s pretty cool.

Wikipedia is not just governed by a board; it’s also structured as a nonprofit. That doesn’t mean that there’s no way it could go bad. But it’s a source of friction against enshittification. Wikipedia has its entire corpus irrevocably licensed as the most open it can be without actually being in the public domain. Even if someone were to capture Wikipedia, there’s limits on what they could do to it.

There’s also a labor constraint in Wikipedia in that there’s very little that the leadership can do without bringing along a critical mass of a large and diffuse body of volunteers. That cuts against the volunteers working in unison—they’re not represented by a union; it’s hard for them to push back with one voice. But because they’re so diffuse and because there’s no paychecks involved, it’s really hard for management to do bad things. So if there are two people vying for the job of running the Wikimedia Foundation and one of them has got nefarious plans and the other doesn’t, the nefarious plan person, if they’re smart, is going to give it up—because if they try to squeeze Wikipedia, the harder they squeeze, the more it will slip through their grasp.

So these are structural defenses against enshittification of Wikipedia. I don’t know that it was in the mechanism design—I think they just got lucky—but it is a template for how to run such a project. It does raise this question: How do you build the community? But if you have a community of volunteers around a project, it’s a model of how to turn that project over to that community.

Ars Technica: Your case studies naturally include the decay of social media, notably Facebook and the social media site formerly known as Twitter. How might newer social media platforms resist the spiral into “platform decay”?

Cory Doctorow: What you want is a foundation in which people on social media face few switching costs. If the social media is interoperable, if it’s federatable, then it’s much harder for management to make decisions that are antithetical to the interests of users. If they do, users can escape. And it sets up an internal dynamic within the firm, where the people who have good ideas don’t get shouted down by the people who have bad but more profitable ideas, because it makes those bad ideas unprofitable. It creates both short and long-term risks to the bottom line.

There has to be a structure that stops their investors from pressurizing them into doing bad things, that stops them from rationalizing their way into complying. I think there’s this pathology where you start a company, you convince 150 of your friends to risk their kids’ college fund and their mortgage working for you. You make millions of users really happy, and your investors come along and say, “You have to destroy the life of 5 percent of your users with some change.” And you’re like, “Well, I guess the right thing to do here is to sacrifice those 5 percent, keep the other 95 percent happy, and live to fight another day, because I’m a good guy. If I quit over this, they’ll just put a bad guy in who’ll wreck things. I keep those 150 people working. Not only that, I’m kind of a martyr because everyone thinks I’m a dick for doing this. No one understands that I have taken the tough decision.”

I think that’s a common pattern among people who, in fact, are quite ethical but are also capable of rationalizing their way into bad things. I am very capable of rationalizing my way into bad things. This is not an indictment of someone’s character. But it’s why, before you go on a diet, you throw away the Oreos. It’s why you bind yourself to what behavioral economists call “Ulysses pacts”: You tie yourself to the mast before you go into the sea of sirens, not because you’re weak but because you’re strong enough now to know that you’ll be weak in the future.

I have what I would call the epistemic humility to say that I don’t know what makes a good social media network, but I do know what makes it so that when they go bad, you’re not stuck there. You and I might want totally different things out of our social media experience, but I think that you should 100 percent have the right to go somewhere else without losing anything. The easier it is for you to go without losing something, the better it is for all of us.

My dream is a social media universe where knowing what network someone is using is just a weird curiosity. It’d be like knowing which cell phone carrier your friend is using when you give them a call. It should just not matter. There might be regional or technical reasons to use one network or another, but it shouldn’t matter to anyone other than the user what network they’re using. A social media platform where it’s always easier for users to leave is much more future-proof and much more effective than trying to design characteristics of good social media.

by Jennifer Ouellette and Cory Doctorow, Ars Technica | Read more:
Image: Julia Galdo and Cody Cloud (JUCO)/CC-BY 3.0
[ed. Do a search on this site for much more by Mr. Doctorow, including copyright and right-to-repair issues. Further on in this interview:]
***
When we had a functional antitrust system for the last four years, we saw a bunch of telecoms mergers stopped because once you start enforcing antitrust, it’s like eating Pringles. You just can’t stop. You embolden a lot of people to start thinking about market structure as a source of either good or bad policy. The real thing that happened with [former FTC chair] Lina Khan doing all that merger scrutiny was that people just stopped planning mergers.

There are a lot of people who benefit from this. It’s not just tech workers or tech users; it’s not just media users. Hospital consolidation, pharmaceutical consolidation, has a lot of people who are very concerned about it. Mark Cuban is freaking out about pharmacy benefit manager consolidation and vertical integration with HMOs, as he should be. I don’t think that we’re just asking the anti-enshittification world to carry this weight.

Same with the other factors. The best progress we’ve seen on interoperability has been through right-to-repair. It hasn’t been through people who care about social media interoperability. One of the first really good state-level right-to-repair bills was the one that [Governor] Jared Polis signed in Colorado for powered wheelchairs. Those people have a story that is much more salient to normies.

"What do you mean you spent six months in bed because there’s only two powered wheelchair manufacturers and your chair broke and you weren’t allowed to get it fixed by a third party?” And they’ve slashed their repair department, so it takes six months for someone to show up and fix your chair. So you had bed sores and pneumonia because you couldn’t get your chair fixed. This is bullshit.

Thursday, October 16, 2025

The Lost Art Of Thinking Historically

On a sun-drenched November day in Dallas, 1963, as President John F. Kennedy’s motorcade rounded the corner onto Elm Street, a single, baffling figure stood out against the cheerful crowd: a man holding a black umbrella aloft against the cloudless sky. Seconds later, shots rang out, and the world changed forever.

In the chaotic aftermath, as a nation grappled with an incomprehensible act of violence, the image of the “Umbrella Man” became a fetish, as novelist John Updike would later write, dangling around history’s neck. The man was an anomaly, a detail that didn’t fit. In a world desperate for causal links, his presence seemed anything but benign. Was the umbrella a secret signaling device? A disguised flechette gun that fired the first, mysterious throat wound? For years, investigators and conspiracy theorists alike saw him as a key to a sinister underpinning, a puzzle piece in a grand, nefarious design.

The truth, when it finally emerged, was nearly absurd in its banality. Testifying before a House committee in 1978, a Dallas warehouse worker named Louie Steven Witt admitted he was the man. His motive was not assassination, but heckling. The umbrella was a symbolic protest against the Kennedy family, referencing the Nazi-appeasing policies of former British Prime Minister Neville Chamberlain — whose signature accessory was an umbrella — and his association with JFK’s father, Joseph P. Kennedy, who had been an ambassador to the U.K. It was, as the investigator Josiah Thompson noted, an explanation “just wacky enough to be true.”

The story of the Umbrella Man reveals our deep-seated human desire to make sense of a complex universe through tidy, airtight explanations. We crave certainty, especially in the face of tragedy, and are quick to weave disparate facts into a coherent, and often sinister, narrative. We see a man with an umbrella on a sunny day and assume conspiracy, because the alternative — that the world is a stage for random, idiosyncratic and often meaningless acts — is far more unsettling. (...)

Making consequential choices about an unknowable future is a profoundly challenging task. The world is not a laboratory. It is a vortex of ambiguity, contingency and competing perspectives, where motives are unclear, evidence is contradictory and the significance of events changes with the passage of time. No economic model or regression analysis can fully explain the Umbrella Man, nor can it provide the clarity we need to navigate the intricate challenges of our time.

What we have lost, and what we desperately need to reclaim, is a different mode of cognition, a historical sensibility. This is not about memorizing dates and facts. It is, as the historian Gordon S. Wood describes it, a “different consciousness,” a way of understanding that profoundly influences how we see the world. It is a temperament that is comfortable with uncertainty, sensitive to context and aware of the powerful, often unpredictable rhythms of the past. To cultivate this sensibility is to acquire the intellectual virtues of modesty, curiosity and empathy — an antidote to the hubris of rigid, monocausal thinking.

The Historian’s Audacious Act

The stereotypical image of a historian is a collector of dusty facts, obsessed with the archives, who then weaves them into a story. But this portrait misses the audacious intellectual act at the heart of the discipline. (...)

This is an ambitious, almost brazen attempt to impose a shared order on the infinite, confusing array of facts and causes that mark our existence. It offers an argument about causality and agency — about who and what matters, and how the world works and why. Does change come from great leaders, collective institutions or vast, impersonal structural forces? A historian’s narrative is never just a story; it is a theory of change.

This process is fundamentally different from that of many other disciplines. Where social sciences often seek to create generalizable, predictive and parsimonious theories — the simplest explanation for the largest number of things — history revels in complexity. A historical sensibility is skeptical of master ideas or unitary historical motors. It recognizes that different things happen for different reasons, that direct causal connections can be elusive, and that the world is rife with unintended consequences. It makes no claim to predict the future; rather, it seeks to deepen our understanding of how the past unfolded into our present, reminding us, as British historian Sir Llewellyn Woodward said, that “our ignorance is very deep.”

This sensibility compels us to reconsider concepts we take for granted. We use terms such as “capitalism” and “human rights” as if they are timeless and universal, when in fact they are concepts that emerged and evolved at particular historical moments, often identified and defined by historians. A historical consciousness demands that we seek the origins of things we thought we understood and empathize with the past in its own context. This is to imagine ourselves in the shoes of those who came before, wrestling with their dilemmas in their world. It doesn’t mean suspending moral judgment, but rather being less confident that we — here today — have a monopoly on timeless insight.

Why We Get History Wrong

Thinking historically is valuable but rare. Most of us encounter “history” in up to three ways, none of which cultivates this deeper consciousness. First, in school, where it is often presented as a dry chronology of dates and facts to be memorized with little connection to our lives. Second, through public history — museums, memorials, historical sites — which can inspire curiosity, but are themselves historical products, often reflecting the biases and blind spots of the era in which they were created. (A tour of Colonial Williamsburg may reveal more about the Rockefeller-funded restoration ethos of the 1930s than about the 18th-century reality it purports to represent.) Third, through bestselling books and documentaries, which may tell vivid, engaging stories, but can be hagiographic and anecdotal, oriented toward simple lessons and celebrating national myths rather than challenging our assumptions.

None of these is the same as developing a historical sensibility. They are more like comfort food, satisfying a deep urge to connect with the past but providing little real nourishment. At worst, they reinforce the very cognitive habits — the desire for certainty, simple narratives and clear heroes and villains — that a true historical sensibility seeks to question.

The academic discipline of history has, in recent decades, largely failed in its public duty. It has retreated from the consequential subjects of statecraft and strategy, seeing them as unworthy of scholarly pursuit. The rosters of tenured historians at major universities show a steep decline in scholars engaged with questions of war, peace and diplomacy. When they do address such topics, they often do so in a jargon-laden style that is inaccessible and unhelpful to decision-makers or the wider public.

This decline is a tragedy, especially at a time when leaders confronting complex global challenges are desperate for guidance. The field of history has become estranged from the very world of power and decision-making it is uniquely equipped to analyze. Historians and policymakers, who should be natural interlocutors, rarely engage one another. This has left a vacuum that is eagerly filled by other disciplines more confident in their ability to provide actionable advice — which is often dangerously simplistic. (...)

The Practice Of Thinking Historically

If a historical sensibility is the temperament, then thinking historically is the practice. It is the active deployment of that sensibility as a set of tools to assess the world and make more informed choices. It is a distinct epistemology, one that offers a powerful method for evaluating causality and agency, weighing competing narratives and navigating the dilemmas of decision-making without succumbing to what can be called “paralysis by analysis.” It offers not a crystal ball, but a more sophisticated lens — a historian’s microscope — through which to see the present.

Thinking historically begins by questioning vertical and horizontal time. The vertical axis asks: How did we get here? It is the rigorous construction of a chronology, not as a mere list of dates, but as a map of cause and effect. Where this timeline begins — with the Bolshevik Revolution of 1917, the end of World War II in 1945 or the rise of China in 1979 — fundamentally changes the story and its meaning. It reveals our own unspoken assumptions about what truly drives events.

The horizontal axis asks: What else is happening? It recognizes that history is not a single storyline but a thick tapestry of interwoven threads. The decision to escalate the war in Vietnam, for example, cannot be fully understood without examining the parallel, and seemingly contradictory, efforts by the same administration to cooperate with the Soviet Union on nuclear nonproliferation. Thinking historically is the act of integrating these divergent streams.

Crucially, this practice leads us to confront our own biases, particularly outcome bias. Because we know how the story ended — how the Cold War concluded or how the 2008 financial crisis resolved — we are tempted to construct a neat narrative of inevitability. Thinking historically resists this temptation. It demands that we try to see the world as the actors of the past saw it: through a foggy windshield, not a rearview mirror, facing a future of radical uncertainty. It restores a sense of contingency to the past, reminding us that choices mattered and that the world could have turned out differently.

Ultimately, thinking historically is about asking better, more probing questions. It is a disciplined curiosity that fosters an appreciation for the complex interplay of individual agency, structural forces and pure chance. Instead of offering easy answers, it provides the intellectual equipment to engage with hard questions, a skill indispensable for navigating a future that will surely be as unpredictable as the past.

by Francis Gavin, Noema |  Read more:
Image: Mr.Nelson design for Noema Magazine
[ed. Unfortunately, I'm not seeing a Renaissance in critical thinking anytime soon. See also: Believing misinformation is a “win” for some people, even when proven false (Ars Technica - below); and, Rescuing Democracy From The Quiet Rule Of AI (Noema).]

"Why do some people endorse claims that can easily be disproved? It’s one thing to believe false information, but another to actively stick with something that’s obviously wrong.

Our new research, published in the Journal of Social Psychology, suggests that some people consider it a “win” to lean in to known falsehoods. (...)

Rather than consider issues in light of actual facts, we suggest people with this mindset prioritize being independent from outside influence. It means you can justify espousing pretty much anything—the easier a statement is to disprove, the more of a power move it is to say it, as it symbolizes how far you’re willing to go... for some people, literal truth is not the point."

Mission Impossible

After the midair collision in January over the Potomac River between an Army helicopter and a regional jet packed with young figure skaters and their parents flying out of Wichita, Kansas, and considering the ongoing travails of the Boeing Company, which saw at least five of its airplanes crash last year, I was so concerned about the state of U.S. aviation that, when called on by this magazine to attend President Donald Trump’s military parade in Washington, on June 14, 2025, I decided to drive all the way from my home in Austin, Texas, even though it cost me two days behind the wheel and a gas bill as expensive as a plane ticket.

I was no less concerned about the prospect of standing on the National Mall on the day of the parade, a celebration of the two-hundred-fiftieth anniversary of the founding of the U.S. Army, which happened to coincide with Trump’s seventy-ninth birthday. The forecast predicted appropriately foul weather for the occasion, and there would be a number of helicopters, of both modern and Vietnam-era vintage, flying over the parade grounds. The Army’s recent track record didn’t bode well for those positioned under the flight path. In the past two years, there had been at least twenty-four serious accidents involving helicopters and nineteen fatalities, culminating with the collision over the Potomac, the deadliest incident in American commercial aviation since 2001.

A crash was not the only thing that I worried about. Acts of low-level domestic terrorism and random shootings take place routinely in this country, and although security at the parade would be tight, I wondered what the chance was of some sort of attack on the parade-goers, or even another attempt on Trump’s life. The probability seemed low, but considering the number of veterans who would be in attendance, I had occasion to recall a 2023 study that found that military service is the single strongest predictor of whether an American will commit a mass killing. (...)

Then there were the politics of the parade, the first procession of military forces past the White House since the end of the Gulf War. For weeks, opinion columnists and television pundits had been sounding the alarm over the controversial festivities, which they saw as another sign of America’s downward slide into authoritarianism, into fascism. Comparisons abounded to Mussolini’s Italy, Pinochet’s Chile, and Hitler’s Germany. A coalition of opposition groups had organized a day of protests under the slogan “No Kings,” and that morning, in thousands of cities across the United States, millions of demonstrators were assembling, waving signs that said things like stop fascism, resist fascism, and no to trump’s fascist military parade.

I was no more thrilled than they were about the idea of tanks and armored vehicles rolling down Constitution Avenue. Trump’s accelerationist instincts, the zeal of his fan base, and the complicity, cowardice, and inaction of the Democratic Party in the face of the governing Republican trifecta made the possibility of a military dictatorship in the United States seem borderline plausible. But in a reminder that Trump is not wildly popular with the electorate so much as unopposed by any effective political counterweight, groups of foreign tourists predominated among the parade’s early arrivals.

The first people I met in the surprisingly short line to pass through the security checkpoint were an affable pair of fun-loving Europeans. Jelena, a Slovenian, had come in hopes of meeting a husband. “If someone’s going to marry me,” she explained with a laugh, “it will be a Republican man.” Liberals were too elitist for her: “Democrats will ask what school I went to.” Her high-spirited wingman, a Bulgarian named Slavko, was drinking beer out of a plastic cup at eleven o’clock in the morning. He had come “to get fucking drunk and high all day long,” he told me, “and just hang out.”

There were a number of Trump voters in line, but they seemed muted, even reasonable, in their political views, far from the legions of MAGA faithful I had expected to encounter. David and Sandra Clark, a middle-aged couple from Carlisle, Pennsylvania, were divided in their opinions of the president. Sandra was not a fan, she said, and David described himself as a “marginal” Trump supporter. They had come to observe the Army’s semiquincentennial, a “momentous occasion,” he said. The day before, Israel had bombed Iran, opening yet another front in the apartheid state’s war against its Muslim neighbors, and the Clarks were concerned about the situation. “It seems like it could get out of hand,” he said. “I’m here to see the protesters,” Sandra put in. “I may join them.”

A few of the attendees trickling in had on red hats that said trump 2028 or make iran great again, but these slogans somehow lacked their intended provocative effect. I looked out over the Mall, where the second-rate exhibits that the Army had set up made a mockery of the parade’s $30 million price tag. Was this supposed to be a show of American military might? (...)

By midday, the heat was ungodly. Not a drop of the predicted rain fell, and not a breeze blew. Near a much-needed water station was an exhibit of military first-aid kits manned by a delegation from Fort Bragg’s 44th Medical Brigade, which recently saw three of its current or former soldiers convicted of federal drug-trafficking charges related to a racket smuggling ketamine out of Cameroon. After hydrating, I watched the 3rd Infantry Regiment, a ceremonial unit known as the Old Guard, spin and toss their rifles and bayonets to a smattering of languorous applause from a small crowd of South Asian tourists, aging veterans, and subdued MAGA fans.

What kind of fascism was this? Rather than the authoritarian spectacle that liberals had anticipated, the festivities seemed to be more a demonstration of political fatigue and civic apathy. And if Trump intended the parade to be an advertisement of America’s military strength, it would instead prove to be an inadvertent display of the armed forces’ creeping decrepitude, low morale, shrinking size, obsolescence, and dysfunction. (...)

During the speech, Trump touted his proposed trillion-dollar defense budget, taunted the reporters in attendance, warned of hordes of immigrants coming from “the Congo in Africa,” denounced the protesters in Los Angeles as “animals,” ridiculed transgender people, and promised the troops a pay raise, even as he repeatedly strayed from his prepared remarks to praise the good looks of handsome service members who caught his eye. “For two and a half centuries, our soldiers have marched into the raging fires of battle and obliterated America’s enemies,” Trump told the crowd. “Our Army has smashed foreign empires, humbled kings, toppled tyrants, and hunted terrorist savages through the very gates of hell,” he said. “They all fear us. And we have the greatest force anywhere on earth.” (...)

In point of fact, the modern American military is a much weaker and more debilitated force than Trump’s braggadocio, and the Defense Department’s gargantuan spending habits, might suggest. The United States has either failed to achieve its stated aims in, or outright lost, every major war it has waged since 1945—with the arguable exception of the Gulf War—and it only seems to be getting less effective as defense expenditures continue to rise. You don’t need to look back to U.S. defeats in Iraq or Afghanistan, much less Vietnam, to illustrate this point. Just one month before Trump’s parade, in May, our armed forces suffered a humiliating loss against a tiny but fearless adversary in Yemen, one of the poorest countries in the world.

The Houthi rebels, also known as Ansar Allah, have been defying the United States, Saudi Arabia, and Israel ever since they first emerged as a military force in 2004 protesting the U.S. invasion of Iraq, the Israeli occupation of Palestine, and the quisling Yemeni regime’s collaboration with the Bush Administration. After Hamas attacked Israel on October 7, 2023, the Houthis, who had endured nearly a decade of starvation under a U.S.-backed Saudi blockade of their ports, tried to force Israel and its allies to lift the siege of Gaza by using their scrappy speedboat navy and homemade arsenal of cheaply manufactured missiles, drones, and unmanned underwater vehicles to choke off maritime traffic in the Red Sea. In response, the Biden Administration, invoking the threat posed by the Houthis to freedom of navigation, launched a wave of air strikes on Yemen and dispatched a naval fleet to reopen the Bab el-Mandeb Strait. The campaign did not go well. A pair of Navy SEALs drowned while attempting to board a Houthi dhow, and the crew of the USS Gettysburg accidentally shot down an F/A-18F Super Hornet fighter jet after it took off from the USS Harry S. Truman, one of America’s premier aircraft carriers, which a short time later collided with an Egyptian merchant ship.

In January of this year, Trump declared the Houthis a terrorist organization and doubled down on Biden’s war. The administration replaced the commander of the Gettysburg and augmented U.S. assets in the region with another aircraft-carrier strike group, which costs $6.5 million a day to operate; B-2 bombers, which cost $90,000 per flight hour; and antimissile interceptors, which can cost $2.7 million apiece. In the span of a few weeks in March and April, the United States launched hundreds of air strikes on Yemen. The tough, ingenious (and dirt-poor) Houthis, protected by Yemen’s mountainous interior, fought back with the tenacity of drug-resistant microbes. They downed hundreds of millions of dollars’ worth of Reaper drones; nearly managed to shoot several F-16s and an F-35 out of the sky; and evaded air defenses to strike Israel with long-range drones, all the while continuing to harass commercial shipping in the Red Sea, which plummeted by 60 percent.

On April 28, American warplanes struck a migrant detention center in the northern Yemeni city of Sadah, then dropped more bombs on emergency workers who arrived in the aftermath. Sixty-eight people were killed. In retaliation, the Houthis launched a fusillade of ballistic missiles at the Truman, which turned tail and steamed away, causing another Super Hornet to slide off the deck into the ocean.

The loss of a second $67 million fighter jet was evidently a turning point for President Trump. In one month, the United States had used up much of its stockpile of guided missiles and lost a number of aircraft but failed to establish air superiority over a country with a per capita GDP one sixth the size of Haiti’s. To avoid further embarrassment, Trump officials declared Operation Rough Rider a success and ordered U.S. Central Command to “pause” operations, effectively capitulating to the Houthis. “We hit them very hard and they had a great ability to withstand punishment,” Trump conceded. “You could say there was a lot of bravery there.” The very same day, yet another $67 million Super Hornet slipped off the deck of the Truman and sank to the bottom of the sea. (...)

At last it was time for the parade. The thin crowd, which hadn’t thickened much over the course of the day, filtered through a secondary security checkpoint and took up positions along Constitution Avenue, angling for spots in the shade. I saw a woman changing a baby’s diaper at the base of a tree, and a shirtless old man in a cavalry hat standing atop an overflowing garbage can. With the sun still high in the sky at six o’clock, the heat had barely relented. Smoke from a wildfire in New Jersey had turned the overcast sky a dirty brown.

On the north side of the street, in front of the White House, a covered stage had been set up for the reviewing party, protected by bulletproof glass and flanked by tanks below. First to take his seat was the chairman of the Joint Chiefs of Staff, General Dan Caine, a “serial entrepreneur and investor,” according to his Air Force biography. The secretary of defense, former Fox News host Pete Hegseth, came out shortly after, wearing a blue suit and camouflage tie, followed by Vice President J. D. Vance, who garnered scattered claps and whistles from the crowd. More-enthusiastic applause greeted President Trump’s appearance onstage, accompanied by a jarring blast of trumpets, but the cheering was still rather sedate. First Lady Melania Trump stood beside him, looking down at the crowd with cold contempt. The whole perverse regime was onstage, including Kristi Noem and Marco Rubio. Seeing them seated there in such close proximity, I found myself wondering how long-range those Houthi drones really are.

Throughout the day, I had spoken to various Trump voters and tried to sound out their opinions on Trump’s brand of militarism and his foreign policy. Rather than any ethos or ideology that could support the renewal of National Socialism in the United States, I found them to be motivated mostly by tired cultural grudges, xenophobic resentment, social-media memes, and civic illiteracy. Few were enthusiastic about defending Trump’s complete capitulation to Israel and the neocons.

Trump voters know just as well as the rest of us that the terror wars were a mistake. We all know that they were based on lies. We are all well aware that our side lost, and that the defeats were costly, and indeed ruinous. We are going to keep starting new wars anyway, and losing them too. As President Biden said last year of his administration’s air strikes on Yemen: “Are they stopping the Houthis? No. Are they going to continue? Yes.”

This isn’t a sign of ascendant fascism so much as the nadir of late-stage capitalism, which depends on forever wars to juice corporate profits at a time of falling rates of return on investment. In its doddering senescence, the capitalist war machine is no less murderous than fascism was—witness the millions of Muslims killed by the United States and Israel since 2001—but it has considerably lower production values. In this soft dystopia, our military forces will not be destroyed in a cataclysmic confrontation with the armies of Communism, as befell Nazi Germany on the Eastern Front. Instead, the defense oligarchs who own Congress will go on pocketing the money allocated to the military, just as they have been for the past forty years, until nothing is left but a hollow shell, a shrinking and sclerotic military so debilitated by graft, suicides, overdoses, and violent crime that it’s incapable of fulfilling its mission, and suitable only for use in theatrical deployments at home beating up protesters and rounding up migrants and the homeless.

Mustering the last of my morale, I trudged back to Constitution Avenue and took my place among the remaining parade-goers. One of the last formations to march past was an Army weapons-testing platoon accompanied by a number of small quadcopter drones. Quadcopters like these have proved pivotal in Ukraine, but the United States hardly makes any. China can churn out an estimated hundred cheap, disposable drones for every one produced in America. In an effort to close the gap, Pete Hegseth has announced new initiatives to boost domestic manufacturing of the devices, but early results have not been promising. A recent report in the New York Times described an exercise in Alaska in which defense contractors and soldiers tested prototypes of U.S.-built “one-way” kamikaze drones with results so dismal they were almost comical. None of the tests described were successful. The drones failed to launch or missed their targets. One crashed into a mountain.

The quadcopters hovering over the testing platoon at the rear of the parade were the X10D model made by Skydio, the largest U.S. drone manufacturer. Not long ago, Skydio transitioned its business from consumer to military and police drones, targeting markets in Ukraine, Israel, and elsewhere. After Skydio sold drones to Taiwan, Beijing retaliated last year by cutting off the company’s access to Chinese batteries, prompting the company to ration them to only one per drone. I noticed that one of the Skydio quadcopters hovering over the parade had dropped out of view. I couldn’t see where it had gone. Then one of the soldiers in the testing platoon marched past, holding it up over his head, make-believing that it was still aloft.

by Seth Harp, Harper's |  Read more:
Images: uncredited 

Saturday, October 11, 2025

Mask of la Roche-Cotard

Also known as the “Mousterian Protofigurine,” the Mask of la Roche-Cotard is a purported artifact dated to around 75,000 years ago, in the Mousterian period. It was found in 1975 at the entrance of a cave named La Roche-Cotard, in the territory of the commune of Langeais (Indre-et-Loire), on the banks of the river Loire.

The artifact, possibly created by Neanderthal humans, is a piece of flat flint that has been shaped in a way that seems to resemble the upper part of a face. A piece of bone pushed through a hole in the stone has been interpreted as a representation of eyes.

Paul Bahn has suggested this “mask” is “highly inconvenient”, as “It makes a nonsense of the view that clueless Neanderthals could only copy their cultural superiors the Cro-Magnon”.

Though this may represent an example of artistic expression in Neanderthal humans, some archaeologists question whether the artifact represents a face, and some suggest that it may be practical rather than artistic.

In 2023, the oldest known Neanderthal engravings, dated to more than 57,000 years ago, were found in the La Roche-Cotard cave.

The Life and Death of the American Foodie

When food culture became pop culture, a new national persona was born. We regret to inform you, it’s probably you.

“When did you become such an adventurous eater?” my mom often asks me, after I’ve squealed about some meal involving jamón ibérico or numbing spices. The answer is, I don’t know, but I can think of moments throughout my life when food erupted as more than a mere meal: My cousin and his Ivy League rowing team hand-making pumpkin ravioli for me at Thanksgiving. Going to the pre-Amazon Whole Foods and giddily deciding to buy bison bacon for breakfast sandwiches assembled in a dorm kitchen. Eating paneer for the first time in India. Slurping a raw oyster in New Orleans.

What made me even want to try a raw oyster in 2004, despite everything about an oyster telling me NO, was an entire culture emerging promising me I’d be better for it. Food, I was beginning to understand from TV and magazines and whatever blogs existed then, was important. It could be an expression of culture or creativity or cachet, folk art or surrealism or science, but it was something to pay attention to. Mostly, I gleaned that to reject foodieism was to give up on a new and powerful form of social currency. I would, then, become a foodie.

To be a foodie in the mid-aughts meant it wasn’t enough to enjoy French wines and Michelin-starred restaurants. The pursuit of the “best” food, with the broadest definition possible, became a defining trait: a pastry deserving of a two-hour wait, an international trip worth taking just for a bowl of noodles. Knowing the name of a restaurant’s chef was good, but knowing the last four places he’d worked at was better — like knowing the specs of Prince’s guitars. This knowledge was meant to be shared. Foodies traded in Yelp reviews and Chowhound posts, offering tips on the most authentic tortillas and treatises on ramps. Ultimately, we foodies were fans, gleefully devoted to our subculture.

Which inevitably leads to some problems when, say, the celebrities the subculture has put on a pedestal are revealed to be less-than-honorable actors, or when values like authenticity and craft are challenged. What it’s historically meant to be a foodie, a fan, has shifted and cracked and been reborn.

And ultimately, it has died. Or at least the term has. To be called a “foodie” now is the equivalent of being hit with an “Okay, boomer.” But while the slang may have changed, the ideals the foodie embodied have been absorbed into all aspects of American culture. There may be different words now, or no words at all, but the story of American food over the past 20 years is one of a speedrun of cultural importance. At this point, who isn’t a foodie? (...)
***
How did we get to chefs-holding-squeeze-bottles as entertainment? The 1984 Cable Communications Policy Act deregulated the industry, and by 1992, more than 60 percent of American households had a cable subscription. Food Network launched in 1993, and compared to Julia Child or Joyce Chen drawing adoring viewers on public broadcasting programs, the channel was all killer, no filler, with shows for every mood. By the early 2000s, you could geek out with Alton Brown on Good Eats, experience Italian sensuality with Molto Mario or Everyday Italian, fantasize about a richer life with Barefoot Contessa, or have fun in your busy suburban kitchen with 30 Minute Meals. Anthony Bourdain’s A Cook’s Tour gave viewers an initial taste of his particular brand of smart-alecky wonder, and there were even competition shows, like the Japanese import Iron Chef.

The premiere of 2005’s The Next Food Network Star, which later gave us Guy Fieri, baron of the big bite, was the network’s first admission that we were ready to think of food shows in terms of entertainment, not just instruction and education. But Food Network was still a food network. The mid-aughts brought the revelation that food programming didn’t have to live just there, but could be popular primetime television — when that was an actual time and not just a saying.

Then came Top Chef, inspired by the success of Bravo’s other reality competition series, Project Runway. There is no overstating Top Chef’s lasting influence on food entertainment, but off the bat it did one thing that further cemented foodieism as a bona fide subculture: Its air of professionalism gave people a vocabulary. “The real pushback from the network was but the viewers can’t taste the food,” says Lauren Zalaznick, president of Bravo at the time. But just like the experts on Project Runway could explain good draping to someone who didn’t know how to sew, Top Chef “committed to telling the story of the food in such a way that it would become attainable no matter where you were,” she says.

This gave viewers a shared language to speak about food in their own lives. Now, people who would never taste these dishes had a visual and linguistic reference for molecular gastronomy, and could speculate about Marcel Vigneron’s foams. If you didn’t know what a scallop was, you learned, as Top Chef was awash in them. Yes, you could hear Tom Colicchio critique a classic beurre blanc, but also poke, al pastor, and laksa, and now that language was yours too. And you could hear chefs speak about their own influences and inspirations, learning why exactly they thought to pair watermelon and gnocchi.

The food scene then “was more bifurcated,” says Evan Kleiman, chef and longtime host of KCRW’s Good Food. “There were super-high-end restaurants that were expensive, maybe exclusive, and for the most part represented European cuisines. And then what was called ‘ethnic food’ was often relegated to casual, family-run kind of spots.” Top Chef may have been entertainment for the upwardly mobile foodie, but in 2005, Bourdain’s No Reservations premiered on the Travel Channel, similarly emphasizing storytelling and narrative. In his hands, the best meals often didn’t even require a plate. His was a romantic appreciation of the authentic, the hole-in-the-wall, the kind of stuff that would never be served in a dining room. It set off an entire generation of (often less respectful, less considered) foodie adventurism.

“No Reservations is what got me interested in the culture of eating,” says Elazar Sontag, currently the restaurant editor at Bon Appétit. Because it was about food as culture, not as profession. But there was programming for it all. Also in 2005, Hell’s Kitchen premiered on Fox, with an amped-up recreation of a dinner service in each night’s challenge. “Hell’s Kitchen’s high-octane, insane, intense environment of a restaurant kitchen is actually what made me think, when I was maybe 12 or 13, that I want to work in restaurants,” says Sontag.

All these shows were first and foremost about gathering knowledge, whether it was what, indeed, a gastrique was, or the history of boat noodles in Thailand. It didn’t matter if you’d ever been there. The point was that you knew. “Food was becoming a different kind of cultural currency,” says Sontag. “I didn’t clock that shift happening at the time, but it’s very much continued.”

Language is meant to be spoken; knowledge is meant to be shared. Now that everyone knew there were multiple styles of ramen, there was no better place to flex about it than with a new tool: the social internet. Online, “talking about restaurants and going to restaurants became something that people could have a shared identity about,” says Rosner. “There was this perfect storm of a national explosion of gastronomic vocabulary and a platform on which everybody could show off how much they knew, learn from each other, and engage in this discovery together.” Your opinion about your corner bagel shop suddenly had a much wider relevance.

by Jaya Saxena, Eater | Read more:
Image: Julia Duffosé

Frog Boiling 101: When Should a Frog Jump The Pot?

Fascism Can't Mean Both A Specific Ideology And A Legitimate Target

When Woody Guthrie famously wrote on his guitar that “This machine kills fascists” - a sentiment imitated and snowcloned by later generations of musicians and commentators - nobody worried this was a bad thing. Nobody demanded that somebody stop the machine before it killed again.

There’s no number of examples I could give which would absolutely prove I’m not cherry-picking. But I think it’s suggestive that even people who argue against casually killing fascists have to disclaim that they’re certainly not opposing all violence against fascists - just against jumping straight to murder before other forms of violence have been tried. Besides that, I can only appeal to a hope that you’ve experienced the same cultural currents that I have, and that this seems obviously true to you.

I’m not trying to normalize fascism, or claim that it isn’t extremely evil (I think it is, see here for more). I’m only saying, again, as a matter of basic logic, that the following things can’t all be true:

1. Many Americans are fascists

2. Fascists are an acceptable target for political violence

3. Political violence in America is morally unacceptable (at the current time)

And I don’t want to abandon 1, because it seems like a factual claim that might be true - even if you don’t think it’s true now, it obviously has the potential to be true in the future - and we shouldn’t ban people from asserting true claims.

And I don’t want to abandon 3, because political violence is extremely bad, the norm against it is the only thing restraining us from various forms of smoldering or overt civil war, and we’re still doing pretty well by the standards of most times and places.

So I think the natural conclusion is to abandon 2. Fascists, although evil, aren’t automatically a legitimate target for political violence.

The strongest objection is a slippery slope argument: political violence will always be inconvenient; it will always be tempting to put it off until some further red line is crossed. But if we always give into that impulse, nobody will ever resist dictatorship or start a revolution against an unjust government. Isn’t the tree of liberty naturally “fertilized with the blood of tyrants”?

There’s no simple answer to this concern. Nicholas Decker, who considers this question more thoughtfully than most, concludes that:
Your threshold may differ from mine, but you must have one. If the present administration should cancel elections; if it should engage in fraud in the electoral process; if it should suppress the speech of its opponents, and jail its political adversaries; if it ignores the will of Congress; if it should directly spurn the orders of the court; all these are reasons for revolution. It may be best to stave off, and wait for elections to throw out this scourge; but if it should threaten the ability to remove it, we shall have no choice.
But all of these are their own sorts of slippery slopes. Suppress the speech of their opponents? Should the Republicans have started a civil war when Democrats got social media to do woke content moderation? Ignore the will of Congress? Should Democrats have started a civil war when Trump refused to fund PEPFAR even after Congress allocated the money? Prosecute political opponents? Should the Republicans have started a civil war when New York prosecuted Trump for Stormy Daniels? Should the Democrats start one now that Trump is prosecuting James Comey for perjury? No particular form of any of these things ever feels like the cosmically significant version of these things where assassinations and armed uprisings become acceptable. But would-be dictators are masters of boundary-pushing and frog-boiling; there’s almost never one moment when they say outright “Today I will be cancelling democracy for no reason, sorry”.

I used to think that my bright line was contempt of the Supreme Court - when a leader echoes Andrew Jackson’s boast that “[the Court] has made its decision, now let them enforce it”. But the Trump administration briefly seemed to consider defying a Supreme Court order in the Kilmar Abrego Garcia case. In the end, they didn’t actually defy the order. And they were being subtle: less Jacksonian swagger, more special pleading about reasons why they thought the ruling didn’t mean what we thought it meant. But if they had actually defied the order - while still doing their best to maintain plausible deniability - would I have resorted to violence, or even felt in an abstract way that “it was time” for violence? I can’t imagine this would have felt convincing at the time.

Is violence justified when we get to FDR-level court packing threats? When we get to Orban? To Chavez? To Xi? To Putin? To Hitler? To Pol Pot? I think I land somewhere between Orban and Hitler, but I can’t say for sure, nor can I operationalize the distinction. And the last person to think about these questions in too much detail got a (mercifully polite) visit from the Secret Service, and even if we disagree with him it’s poor practice to hold a debate where it’s impermissible to assert one side. I will be punting on the deep cosmic question here, at least publicly. (...)

So as a bare minimum, I think people should reject premise (2) above and stop talking about fascists as if it’s okay to kill them. I don’t think this implies support for fascism, any more than saying that you shouldn’t kill communists implies support for communism. They’re both evil ideologies which are bad and which we should work hard to keep out of America - but which don’t, in and of themselves, justify killing the host.

What about going beyond the minimum? If fascist denotatively means “far-right nationalist authoritarian corporatist”, but connotatively “person whom it is okay to kill”, and we personally try not to worsen the connotation but other people still have that association, then should we avoid using it at all? Or is it permissible to still use it for its denotative meaning?

by Scott Alexander, Astral Codex Ten |  Read more:
Image: Woody Guthrie/uncredited
[ed. Predictably, staunch do-or-die Second Amendment defenders (with basements full of stockpiled weapons) who've been advocating exactly this kind of violence for years go apoplectic whenever the same rhetoric is used against them.  See also: I Stand with Nicholas Decker (US0E):]
***
Attempting to determine when it is appropriate to engage in political violence is, of course, a legitimate, legally protected — in fact, quintessentially American — and worthwhile endeavor. The United States was founded on the principle that if a government becomes tyrannical, “it is the Right of the People to alter or to abolish it,” including through revolutionary violence. As Thomas Jefferson famously wrote to William Stephens Smith, the son-in-law of John Adams, following Shays’ Rebellion in 1787, he believed it was essential for citizens to instill the fear of God in government by conducting a violent rebellion at least once every 20 years, and thereby “refreshing [the tree of liberty] from time to time with the blood of patriots and tyrants.” (...)

Decker’s point is obviously not that the American left (of which he does not consider himself a member) ought to initiate politicide, but that we’re closer to the sort of King George III tyranny that justifies revolution according to the American founding tradition than we’ve been at any point in recent memory. He illustrates this cunningly — evidently too cunningly for his critics — by establishing a parallelism between the conduct of the second Trump regime and the conduct of George III as it’s indicted by the Declaration of Independence. Below is the relevant passage by Jefferson, with lines bolded where Decker draws an analogy to Trump:
The history of the present King of Great Britain is a history of repeated injuries and usurpations, all having in direct object the establishment of an absolute Tyranny over these States. To prove this, let Facts be submitted to a candid world.

He has combined with others to subject us to a jurisdiction foreign to our constitution, and unacknowledged by our laws; giving his Assent to their Acts of pretended Legislation:

For Quartering large bodies of armed troops among us:

For protecting them, by a mock Trial, from punishment for any Murders which they should commit on the Inhabitants of these States:

For cutting off our Trade with all parts of the world:

For imposing Taxes on us without our Consent:

For depriving us in many cases, of the benefits of Trial by Jury:

For transporting us beyond Seas to be tried for pretended offences.
And here’s Decker — an astute reader might catch the similarities!
Evil has come to America. The present administration is engaged in barbarism; it has arbitrarily imprisoned its opponents, revoked the visas of thousands of students, imposed taxes upon us without our consent, and seeks to destroy the institutions which oppose it. Its leader has threatened those who produce unfavorable coverage, and suggested that their licenses be revoked. It has deprived us, in many cases, of trial by jury; it has subjected us to a jurisdiction foreign to our constitution, and has transported us beyond seas to be imprisoned for pretended offenses. It has scorned the orders of our courts, and threatens to alter fundamentally our form of government. It has pardoned its thugs, and extorted the lawyers who defended its opponents.
This alone doesn’t get you in trouble, of course. Unless you’re a partisan of the MAGA right, there’s nothing that contradicts the current moral fashion about identifying the tyrannical character of the Trump regime, or even comparing Trump to historical figures against whom it is widely accepted that revolutionary violence would have been justified. No more than a decade ago, even the mild-mannered, respectable, moderate conservative author and pop sociologist J.D. Vance was comparing Trump to Hitler!

Decker only gets in trouble when he follows these widely accepted facts and values to their logical conclusion: that it is not unreasonable to believe that at some point in the near future, it will become justifiable to engage in revolutionary (or, more accurately, counter-revolutionary) violence against the principals and agents of the Trump regime, so long as this violence is not conducted glibly or indiscriminately. Admittedly, Decker could have made these qualifications clearer. But the point should not be lost on someone who reads the essay in good faith.  (...)

It is nevertheless clear to me, having either been a part of or adjacent to Decker’s intellectual milieu for my entire adult life, based on the homage to the American revolution and the repeated references to the “present administration,” that the class of people being identified as potentially legitimate targets for violence is narrowly limited to regime decisionmakers and the agents who would execute their illegal and revisionary orders. This is also clear in the following paragraph where Decker identifies the conditions he believes would justify a resort to violence:
And when is that time? Your threshold may differ from mine, but you must have one. If the present administration should cancel elections; if it should engage in fraud in the electoral process; if it should suppress the speech of its opponents, and jail its political adversaries; if it ignores the will of Congress; if it should directly spurn the orders of the court; all these are reasons for revolution. It may be best to stave off, and wait for elections to throw out this scourge; but if it should threaten the ability to remove it, we shall have no choice. We will have to do the right thing. We will have to prepare ourselves to die.
Yet his critics all insist he’s calling for the death of anyone on the right “because he lost an election,” even when it’s explained to them why this is false. (...)

A more reasonable explanation is that the people who don’t understand Decker’s article are simply dumb and boring people. Like everyone else, they believe what they’re told — or at least what they want to believe, and then what they’re told to believe in whatever echo chamber they happened to end up in. Unlike Decker and other smart and interesting people, however, they’re pathologically incapable of also thinking for themselves. It’s okay to think you should kill Baby Hitler. It’s okay to admire the American founders and their values. It’s okay to think we need a Second Amendment to deter state tyranny. Hell, for most of these people, it’s okay to think you should murder the vice president if you’re convinced he’s complicit in helping the other side steal an election. [ed. Paging Mike Pence.]

Can you say the same thing about your own side? Of course not!

Why not? It doesn’t matter!

A smart and interesting person is someone who notices these inconsistencies and doesn’t simply paper them over. You don’t have to be precisely right about everything — you just have to make a well-reasoned, good-faith, unconventional argument and be willing to change your mind if someone gives you a good reason to do so. That might not seem like much of a challenge, but most people fail miserably. If telling inconvenient truths was popular, then it wouldn’t be very inconvenient, would it?

[ed. Watch this recent video from Chicago. Who are the ones engaged in political violence?]