
Thursday, December 4, 2025

Steve Cropper, Guitarist, Songwriter and Shaper of Memphis Soul Music, Dies at 84

Steve Cropper, the prodigious guitarist, songwriter and producer who played a pivotal role in shaping the lean gutbucket soul music made at Memphis’s Stax Records in the 1960s and ’70s, died on Wednesday in Nashville. He was 84.

His death, at a rehabilitation facility, was confirmed by his wife, Angel Cropper, who did not specify the cause.

As a member of Booker T. & the MG’s, the house rhythm section at Stax, Mr. Cropper played the snarling Fender Telecaster lick on “Green Onions,” the funky hit instrumental by the MG’s from 1962. He also contributed the ringing guitar figure that opened Sam & Dave’s gospel-steeped “Soul Man,” the 1967 single on which the singer Sam Moore shouted, “Play it, Steve!” to cue Mr. Cropper’s stinging single-string solo on the chorus. Both records were Top 10 pop hits and reached No. 1 on the R&B chart.

Mr. Cropper had an innate feel for a groove as well as a penchant for feeling over flash — gifts evident in his bell-toned guitar work on Otis Redding’s “(Sittin’ on) The Dock of the Bay.” In 2015, he was ranked 39th on Rolling Stone’s list of the 100 greatest guitarists of all time. Britain’s Mojo magazine slotted him second, behind only Jimi Hendrix, on a similar list of guitarists published in 1996.

“I’ve always thought of myself as a rhythm player,” Mr. Cropper said in an interview with Guitar.com in 2021. “I get off on the fact that I can play something over and over and over, while other guitar players don’t want to even know about that. They won’t even play the same riff or the same lick twice.”

Mr. Cropper was also a prolific songwriter. His credits, typically as a co-writer, include the epoch-defining likes of “Dock of the Bay,” Wilson Pickett’s “In the Midnight Hour” and Eddie Floyd’s “Knock on Wood.” All three were No. 1 R&B singles. Mr. Redding’s record topped the pop chart as well, and won Grammy Awards for best R&B song and best male R&B vocal performance in 1969.

In charge of artists and repertoire at Stax during the 1960s, Mr. Cropper produced the recordings of many of the songs he had a hand in writing. His website states that he was “involved in virtually every record issued by Stax from the fall of 1961 through year end 1970.” Judging by the testimony of the Stax co-founder Jim Stewart, it is not hard to imagine that this was the case.


“Steve was my right-hand man,” Mr. Stewart said of Mr. Cropper’s contributions to the label’s legacy in Peter Guralnick’s 1999 book, “Sweet Soul Music: Rhythm and Blues and the Southern Dream of Freedom.” “He would come to the studio and sit there and keep the doors open and take care of business; he was disciplined and responsible. Steve was the key.”

In the process, Mr. Cropper helped reimagine the Southern soul music of the era, imbuing it with a simultaneously urban and down-home feel — a bluesy mix of sinew and grit that was instantly recognizable over the radio airwaves. Widely sampled, the records he played on or produced influenced subsequent generations of musicians, especially in hip-hop and R&B.

Mr. Cropper achieved further acclaim in the late 1970s for his work with the Blues Brothers, the musical side project of the “Saturday Night Live” co-stars John Belushi and Dan Aykroyd. By then, Stax had closed, having fallen into insolvency in 1975, and Mr. Cropper had begun immersing himself in freelance session and production work with artists like Art Garfunkel and Ringo Starr.

“Briefcase Full of Blues,” the Blues Brothers’ first album, included a remake of “Soul Man,” complete with a reprise of the shout “Play it, Steve!” from Mr. Belushi on the chorus. The single reached No. 14 on the pop chart in 1979, anticipating the release of the 1980 movie “The Blues Brothers,” starring Mr. Belushi and Mr. Aykroyd and featuring Mr. Cropper as Steve “the Colonel” Cropper, who plays in a band called Murph and the Magic Tones. (Born of Mr. Cropper’s tendency to take charge of situations, the Colonel was a childhood nickname that stuck with him even after he established himself as a musician.) (...)

Mr. Cropper’s affiliation with the Blues Brothers spanned four decades. But back in 1978, when he and the bassist Donald “Duck” Dunn first joined the band, skeptics failed to understand why they would want to collaborate with the two comedians from “Saturday Night Live.”

“We got a lot of flak — Duck and I did — about playing with those guys,” Mr. Cropper told Guitar.com. “Folks said, ‘What are you guys doing with these two clowns from S.N.L.?’”

“But those guys were great musicians,” he went on. “John Belushi had played drums in a band for years before he ever went to Second City,” the Chicago improv comedy troupe. “And Aykroyd is actually playing the harmonica on everything we did.”

by Bill Friskics-Warren, NY Times | Read more:
Image: David Reed Archive/Alamy
[ed. Legendary, and widely respected by just about everyone. He created a whole new genre - the Stax Sound.]

Sunday, November 30, 2025

The High-Romantic Nightmare That Wasn’t

Someday someone will actually adapt Mary Shelley’s Frankenstein into a film. Until then, we will have to make do with filmmakers using Shelley’s ever-resilient scaffolding as a playground for their own obsessions. Guillermo del Toro’s newest treatment of the story has been marketed and blurbed by many critics as “the movie he was born to make.” More than anything, though, the film serves to prove how far we still are from realizing the depths of Shelley’s original vision. Del Toro’s achingly sincere and fitfully compelling version of the book has maintained only that — the mere scaffolding of the story. It has next to nothing in common with the spirit of Shelley’s High-Romantic nightmare, and far more to do with del Toro’s own interests, especially his perennially unilluminating and often ponderous dedication to the tone of fable and fairy tale.

It’s no accident that the only great Frankenstein films — James Whale’s two immortal Universal classics, Frankenstein (1931) and Bride of Frankenstein (1935) — didn’t even worry about the scaffolding. They are of course the bases for the Frankenstein of modern popular culture, films which jettisoned all but a few garbled scenarios from the book and erected the rest from a pure Hollywood riff on a century of other vague Gothic imagery and literature. Two of the funniest movies ever made — Abbott and Costello Meet Frankenstein (1948) and Mel Brooks’ Young Frankenstein (1974) — are themselves riffs on the Universal films, and only those films. And while there have been a few attempts to stage the proper Shelley version, nearly all of them, such as Kenneth Branagh’s awful and characteristically self-important 1994 film, have seen fit to mangle whole sections of Shelley’s work, and invent others from whole cloth.

So now we get the long-awaited version from the man who would seem the most obvious choice to make it — and yet, once again, here is a Frankenstein that finds nothing worth saving from the original besides that basic scenario. In the first, authoritative 1818 version of the text, Victor Frankenstein was a man from a happy family, betrothed to his cousin Elizabeth, who finds himself reading the works of alchemists like Paracelsus and Albertus Magnus while getting caught up in the fervor of late 18th-century Enlightenment science. This precise setting and period are key to the original story’s brilliance: Shelley evokes an almost beatific time in her own recent history where faith in medical, technological, and social progress was just beginning to achieve its modern velocity — a time in which the center of scientific study was shifting from physics to chemistry and biology.

English Romanticism was the great inheritor of this new concern for biological science, and thrived on metaphors of botany and organicism, just as it fed itself on the new psychology of the German philosophers. Frankenstein gets its power from this — and from its mordant, haunting sense of the old fairy tales furiously spinning into new, wretched life at the birth of the industrial world. In a way, to read Frankenstein is to read what the Romantics thought of the Enlightenment — and their thought was, in brief, that the new scientists had better read Paradise Lost. In fact, one could sum up the ambiguity of the original Shelley novel simply by saying that, in her Frankenstein, it is the Monster himself who reads and understands Milton, not his creator.

Del Toro, as expected, avoids almost all of Shelley’s original material. His primal obsession has always been the feeling of fairy tale itself, united with the trappings and settings of old Hollywood horror films. Even the subtler, Promethean horrors of the original are absent. Instead, he grafts all the whizbang technological set-dressing of the old Universal films onto an even more overtly Romantic, maximalist vision of the Shelley story, updating its setting to the mid-19th century — presumably to get in a few dull stereotypes of Victorian squalor and a tinge of punk Darwinism in the reanimation presentation to the Edinburgh dons, who revoke Frankenstein’s qualifications in horror.

Victor Frankenstein himself (Oscar Isaac, in an uncharacteristically hammy and misjudged performance) becomes, in effect, a grown-up Lord Bullingdon from Barry Lyndon: he’s a sour brooder with a tyrannical father (Charles Dance, in a Charles-Dance-type role) and a doomed pregnant mother (Mia Goth, who also pulls Oedipal double-duty as Elizabeth). The nature of Frankenstein’s work is changed from accidental discovery to a lifelong attempt at making up for the loss of his dead mother. Cousin Elizabeth is no longer the saintly pen pal and future wife, but a foil and an object of envy destined to marry Frankenstein’s brother (and, in a peculiar turn, a sort of angel for the Monster). She’s also the niece of the man who wants to bankroll Frankenstein’s experiment (Christoph Waltz). The private, tortured space of Frankenstein’s chambers in the book is transplanted to a huge, vertiginous castle on the edge of a sea — in case we had any doubts about just what height of Gothicism del Toro is going for.

The point of all this, of course, is not only to amp up the opera, but to give del Toro a chance to dream up a thousand gnarly details for the making of the Monster. Shelley herself barely spared a moment to describe the actual process of making the creature. But for del Toro, that’s the whole point. He delights in playing yet another turgid, whimsical Alexandre Desplat waltz while he lingers over Frankenstein’s vivisections, and makes sure to show us all the minute aspects of the building of the electrical apparatus, as well as the construction of the attic and underbelly of the tower. This sequence of the film is entertaining, even if his incessantly roaming, unfixed camera quickly grows exhausting. When it comes to the camera, del Toro is no great director: his Frankenstein has moments of beauty, but even the most arresting images are frequently undercut by the film’s waxy, shadowless look and by awkward framing that makes every other shot feel as if it’s coming from the corner of a too-wide room. (...)

In the end, there’s one main reason to see del Toro’s Frankenstein, and that is Jacob Elordi, who here proves himself to be what was mostly hidden underneath the pure exploitation schlock of Saltburn or Euphoria, and could be briefly glimpsed in his Elvis from Sofia Coppola’s Priscilla: a great physical actor trapped in the body of a beautiful man. Has any contemporary heartthrob so totally embraced such a complete privation of his trademark physique? There’s no room for vanity within the Monster. Elordi surely saw his chance to free himself of the burden of his looks — and what he chooses to do with it is pretty magnificent. His elegantly awkward, Butoh-inspired performance is the real glory of a film that would otherwise be a rather hollow experience. After its overheated Freudian first half, the film finally comes alive when it leaves behind Frankenstein the man and follows the Monster. This middle stretch, wherein the Monster hides out and watches the family of an old blind man, hews closest to the finest section of Shelley’s story and is the finest part of the film. And, thank god, this time the movie Monster actually does read Paradise Lost.

As the film goes on, and the Monster returns to wreak havoc, del Toro’s Frankenstein comes close to the heights of a tragic fairy tale. Though the contrivances del Toro relies on to get there are ridiculous, the pietà of Elizabeth’s death in the Monster’s arms is lovely (so sincere that there were snickers in my theater when I saw it), as is the Monster’s return to the ruins of the tower to discover the site of his creation. Even at the ending, when the Monster and Frankenstein have met out on the ice and come together in the cabins of the Scandinavian ship, the resolution of this particular father-son/God-Adam story is moving. Still, del Toro doesn’t quite earn the weight of the climax he worked so ponderously toward. It rests entirely on Jacob Elordi’s broad shoulders, and he does his best. Yet where Shelley’s Monster, left alone at his creator’s death, chooses to end his own life on a funeral pyre, del Toro, big earnest softy that he is, can’t help but let his Monster stare off into the sunset, ponder his apparent immortality, and conquer his desire to die.

by Sam Jennings, Metropolitan Review |  Read more:
Image: via
[ed. I read Mary Shelley's Frankenstein about a year or so ago, and it's true, much of the power and nuance (and tenderness) of that book is lost in this film. That said, it's still the best adaptation that I've seen to date. Did you know that her monster developed a deep literary and philosophical intelligence (from reading so many books in isolation), and that his main objective in pursuing and tormenting his creator was simply to have him create a female companion to share his life (which the doctor initially promised to do, then refused after having second thoughts)? Great book that really should be read to get a full appreciation of all its many themes (including technological hubris). See also: Analysis of Mary Shelley’s Frankenstein (Literary Theory and Criticism):]
***
The work and its monster-hero became such a popular subject for film and stage, in serious, comedic, and parodic productions, that many acquaint themselves with Victor Frankenstein’s monster long before encountering it in Shelley’s book. Many first-time readers discover with a shock that the monster remains unnamed, with his creator bearing the Frankenstein moniker. A second, stronger shock may occur when readers realize that the monster, in great contrast to the bumbling, murderous, wild-eyed, grunting, crazy-stitched object of film, proves the most rational and also the most eloquent of any of the novel’s characters. (...)

The monster was not “born” hating others; his hate was taught him by people who refused to see beyond his external appearance to the brilliant, warm nature existing just below the surface. While science might be expected to lack compassion, the same could not be said of religion, which should have prepared the public to be more accepting. That the monster possesses a quick intellect and a natural warmth and goodness that is corrupted only by his exposure to humans remains an indictment of shallow social values and a rigid class structure.

Friday, November 28, 2025

The Decline of Deviance

Where has all the weirdness gone?

People are less weird than they used to be. That might sound odd, but data from every sector of society is pointing strongly in the same direction: we’re in a recession of mischief, a crisis of conventionality, and an epidemic of the mundane. Deviance is on the decline.

I’m not the first to notice something strange going on—or, really, the lack of something strange going on. But so far, I think, each person has only pointed to a piece of the phenomenon. As a result, most of them have concluded that these trends are:

a) very recent, and therefore likely caused by the internet, when in fact most of them began long before

b) restricted to one segment of society (art, science, business), when in fact this is a culture-wide phenomenon, and

c) purely bad, when in fact they’re a mix of positive and negative.

When you put all the data together, you see a stark shift in society that is on the one hand miraculous, fantastic, worthy of a ticker-tape parade. And a shift that is, on the other hand, dismal, depressing, and in need of immediate intervention. Looking at these epoch-making events also suggests, I think, that they may all share a single cause.

by Adam Mastroianni, Experimental History |  Read more:
Images: Author and Alex Murrell
[ed. Interesting thesis. For example, architecture:]
***
The physical world, too, looks increasingly same-y. As Alex Murrell has documented, every cafe in the world now has the same bourgeois boho style, and every new apartment building looks like every other.

Thursday, November 27, 2025

Joni Mitchell - Joni's Jazz

[ed. What a welcome surprise. A new archival release focusing on Joni's jazz evolution, dedicated to Wayne Shorter, who died in 2023. So much good stuff here - 61 tracks (full album). See also: Joni’s Jazz Reviewed: Short on rarities but steeped in a love of the genre (Mojo).]

Saturday, November 22, 2025

What Does China Want?

Abstract

The conventional wisdom is that China is a rising hegemon eager to replace the United States, dominate international institutions, and re-create the liberal international order in its own image. Drawing on data from 12,000 articles and hundreds of speeches by Xi Jinping, we analyze three terms or phrases from Chinese rhetoric to discern China's intentions: “struggle” (斗争), “rise of the East, decline of the West” (东升西降), and “no intention to replace the United States” (无意取代美国). Our findings indicate that China is a status quo power concerned with regime stability and is more inwardly focused than externally oriented. China's aims are unambiguous, enduring, and limited: It cares about its borders, sovereignty, and foreign economic relations. China's main concerns are almost all regional and related to parts of China that the rest of the region has agreed are Chinese—Hong Kong, Taiwan, Tibet, and Xinjiang. Our argument has three main implications. First, China does not pose the type of military threat that the conventional wisdom claims it does. Thus, a hostile U.S. military posture in the Pacific is unwise and may unnecessarily create tensions. Second, the two countries could cooperate on several overlooked issue areas. Third, the conventional view of China plays down the economic and diplomatic arenas that a war-fighting approach is unsuited to address.

There is much about China that is disturbing for the West. China's gross domestic product grew from $1.2 trillion in 2000 to $17 trillion in 2023. Having modernized the People's Liberation Army over the past generation, China is also rapidly increasing its stockpile of nuclear warheads. China spends almost $300 billion annually on defense. Current leader Xi Jinping has consolidated power and appears set to rule the authoritarian Communist country indefinitely. Chinese firms often engage in questionable activities, such as restricting data, inadequately enforcing intellectual property rights, and engaging in cyber theft. The Chinese government violates human rights and restricts numerous personal freedoms for its citizens. In violation of the United Nations Convention on the Law of the Sea (UNCLOS), every country in the region, including China, is reclaiming land and militarizing islets in the disputed East and South China Seas. In short, China poses many potential problems to the United States and indeed to the world.

In U.S. academic and policymaking circles, the conventional wisdom is that China wants to dominate the world and expand its territory. For example, Elbridge Colby, deputy assistant secretary of defense during Donald Trump's first term and undersecretary of defense for Trump's second term, writes: “If China could subjugate Taiwan, it could then lift its gaze to targets farther afield … a natural next target for Beijing would be the Philippines … Vietnam, although not a U.S. ally, might also make a good target.” (...) The then–U.S. Secretary of State Antony Blinken said in 2022 that “China is the only country with both the intent to reshape the international order and, increasingly, the economic, diplomatic, military, and technological power to do it.” Trump's former U.S. trade representative, Robert Lighthizer, claims that “China to me is an existential threat to the United States…. China views itself as number one in the world and wants to be that way.”

These assessments of China's intentions lead mainstream U.S. scholars and policy analysts from both the Left and the Right to policy prescriptions that will take generations to unfold, and that are almost completely focused on war-fighting, deterrence, and decoupling from China. Those who believe in this China threat call for increasing U.S. military expenditures and showing “resolve” toward China. The conventional wisdom also advocates a regional expansion of alliances with any country, democratic or authoritarian, that could join the United States to contain China. As Colby writes, “This is a book about war.” Brands and Beckley argue that the United States should reinforce its efforts to deter China from invading Taiwan: “What is needed is a strategy to deter or perhaps win a conflict in the 2020s … the Pentagon can dramatically raise the costs of a Chinese invasion by turning the international waters of the Taiwan Strait into a death trap for attacking forces.” Doshi argues that the United States should arm countries such as “Taiwan, Japan, Vietnam, the Philippines, Indonesia, Malaysia, and India” with capabilities to contain China.

This leads to a key question: What does China want? To answer this question, this article examines contemporary China's goals and fears in words and deeds. In contrast to the conventional view, the evidence provided in this article leads to one overarching conclusion and three specific observations. Overall, China is a status quo power concerned with regime stability, and it remains more inwardly focused than externally oriented. More specifically: China's aims are unambiguous; China's aims are enduring; and China's aims are limited.

First, China's aims are unambiguous: China cares about its borders, its sovereignty, and its foreign economic relations. China cares about its unresolved borders in the East and South China Seas and with India. Almost all of its concerns are regional. Second, China deeply cares about its sovereign rights over various parts of China that the rest of the region has agreed are Chinese—Hong Kong, Taiwan, Tibet, and Xinjiang. Third, China has an increasingly clear economic strategy for its relations with both East Asia and the rest of the world that aims to expand trade and economic relations, not reduce them.

It is also clear what China does not want: There is little mention in Chinese discourse of expansive goals or ambitions for global leadership and hegemony. Furthermore, China is not exporting ideology. Significantly, the Chinese Communist Party's (CCP) emphasis on “socialism with Chinese characteristics” is not a generalized model for the world. In contrast, the United States claims to represent global values and norms. What China also does not want is to invade and conquer other countries; there is no evidence that China poses an existential threat to the countries on its borders or in its region that it does not already claim sovereignty over.

We explore how China views its own position and role in the region and globally. Recognizing that public statements vary in their level of authoritativeness, we examined three main sources: People's Daily, which represents not only the state but also the Central Committee of the CCP; Xi Jinping's and other senior officials' speeches; and Qiushi, a magazine publicizing the CCP's latest policy directions. We used computer-assisted text analysis to systematically assess China's stated goals over time. This method allowed us to more accurately track China's concerns and identify how they have changed. We also show that China's top leaders consistently reiterate that China does not seek regional hegemony or aim to compete with the United States for global supremacy. Instead, China views international relations as multilateral and cooperative.

Second, China's aims are inherited and enduring, not new. There is a “trans-dynastic” Chinese identity: Almost every major issue that the People's Republic of China (PRC) cares about today dates back to at least the nineteenth century during the Qing dynasty. These are not new goals that emerged after the Communist victory in 1949, and none of China's core interests were created by Xi. These are enduring Chinese concerns, even though the political authority governing China has changed dramatically and multiple times over the past two hundred years or more.

Third, what China wants is limited, even though its power has rapidly expanded over the past generation. China's claims and goals are either being resolved or remain static. This reality is in contrast to many of the expectations of U.S. policymakers and to the conventional wisdom of the international relations scholarly literature, which maintains that states' interests will grow as power grows. Rather, the evidence shows that the Chinese leadership is concerned about internal challenges more than external threats or expansion.

We find that China does not pose the type of military threat that the conventional wisdom claims it does. Consequently, there is no need for a hostile military posture in the Pacific, and indeed the United States may be unnecessarily creating tensions. Just as important, we suggest that there is room for the two countries to cooperate on a number of issue areas that are currently overlooked. Finally, the conventional view of China de-emphasizes the economic and diplomatic arenas that a war-fighting approach is unsuited to address. The conventional wisdom about U.S. grand strategy is problematic, and the vision of China that exists in Washington is dangerously wrong.

This article proceeds as follows. First, we discuss the conventional wisdom regarding China's goals as represented by top policymakers in the United States and in the existing scholarly literature. The second section examines Chinese rhetoric and points out nuances in how to read and interpret Chinese rhetoric. The third section uses quantitative methods to more systematically and accurately assess Chinese claims across time as reflected in the most authoritative Chinese pronouncements. The fourth section details how China's main priorities are enduring and trans-dynastic, and the fifth section shows how the most important of these claims are not expanding, even though China's power has grown rapidly over the past generation. We present the implications of our argument for the U.S.-China relationship in the conclusion.

by David C. Kang, Jackie S. H. Wong, Zenobia T. Chan, MIT Press | Read more:
Image: via
[ed. The Roman empire collapsed because it was overextended. China won't make that mistake. They'll just get stronger and more self-reliant - securing their borders, advancing technology, providing security for their citizens. Dominant because they have a strategy for advancing their country's long-term interests, not dominance for its own sake. Most US problems have been self-inflicted - militarily, economically, politically, technologically. We've been distracted and screwing around for decades, empire building and trying to rule the world.]

Friday, November 21, 2025

The Bookie at the Center of the Ohtani Betting Scandal

It was a round of poker, fittingly, that upended Mathew Bowyer’s life in spectacular fashion. While he preferred to sate his appetite for risk by playing baccarat, poker had served as his formative introduction to the pleasures and possibilities of gambling. Back in the early Nineties, as an enterprising high school student in Orange County, California, Bowyer ran a regular game out of his childhood home that provided a template for what he later organized his adult life around on a dizzying scale: the thrill of the wager, the intoxicant of fast money, and the ability to shimmy into worlds inaccessible to most. Unlike so many of Orange County’s native sons, for example, Bowyer wasn’t raised with access to bottomless funds. But his adolescent poker winnings netted him enough to buy a pickup, which he tricked out with a thunderous subwoofer that ensured that his presence was felt even when he wasn’t seen.

Thirty years later, on Sept. 8, 2021, Bowyer was behind the wheel of a very different vehicle, his white Bentley GT Continental, driving to a very different poker game. Held in a hotel conference room in San Diego, it was hosted by some players and staff of the L.A. Angels, who were in town for two games against the Padres. For Bowyer, then a 46-year-old father of five who could be mistaken for a retired slugger — confident gait, hulking arms mosaicked in tribal tattoos — attending was a no-brainer. These were the back rooms where he cultivated new clients to expand what he referred to, cryptically, as “my business.”

During the poker game, Bowyer and one of his friends, a stocky guy named Michael Greenberg who had been a fixture at those long-ago high school poker games, began talking to a man seated at the card table. Japanese, slight in build, sporting a gray T-shirt, with inky hair cut into a modish bowl, neither Greenberg nor Bowyer yet knew the man’s name — Ippei Mizuhara. But both were aware that he was the interpreter and close friend of a player being heralded as the most extraordinary in baseball history: Shohei Ohtani, the two-way phenomenon who was then in his third year with the Angels, and finishing up a transcendent season in which he would hit 46 home runs, strike out 156 batters, and be named the American League Most Valuable Player. This connection, however, was not the reason Bowyer was keen to talk to Mizuhara. Between hands at the poker table, the interpreter was obsessively placing bets on sports through his phone.

Bowyer sidled up for a brief conversation — one he’d later come to spend many sleepless nights replaying in his mind.

“What are you betting on?”

“Soccer,” replied the interpreter.

“I run my own site,” said Bowyer, speaking as he always did: polite tone, penetrating eye contact. “We do soccer — we do it all. And with me, you don’t need to use your credit card. I’ll give you credit.” He extended his hand. “My name’s Matt.”

“I’m Ippei.”

“Ippei, if you’re interested, hit me up.”

And that was that, an exchange of the sort that Bowyer had been finessing for the better part of two decades in constructing one of the largest and most audacious illegal bookmaking operations in the United States. He’d had versions of this talk on manicured golf courses, over $5,000 bottles of Macallan 30 scotch, while flying 41,000 feet above the Earth in private jets comped by casinos, and lounging poolside at his palatial Orange County home. He’d had the talk with celebrities, doctors, day traders, trial lawyers, trust-fund scions. Often nothing came of it. But sometimes it led to a new customer — or “player,” in his industry’s parlance — adding to a stable of nearly 1,000 bettors who placed millions in weekly wagers through Bowyer. He used the bulk of his earnings to fuel his own ferocious thirst for gambling and the attendant lifestyle, escaping often to villas at Las Vegas casinos for lavish sprees that earned him a reputation as one of the Strip’s more notorious whales — a high roller with an icy demeanor doted on by the top brass of numerous casinos.

In this case, however, the exchange with Mizuhara sent Bowyer down a different path. Shortly after the poker game, he set up Mizuhara with an account at AnyActionSports.com, the site Bowyer used for his operation, run through servers in Costa Rica. It was the start of a relationship that, while surreal in its bounty, would eventually come to attract the unwanted attention of the Department of Homeland Security, the criminal division of the Internal Revenue Service, Major League Baseball, the Nevada Gaming Control Board, and, as Bowyer’s illicit empire crumbled, the world at large.

‘Victim A’

Two years later, in December 2023, Shohei Ohtani signed what was then the largest contract in professional sports history with the Los Angeles Dodgers: 10 years, $700 million. The deal for “Shotime” dominated the sports media for months. But on March 20, 2024, news broke that threatened to derail the show just as it was beginning.

The revelation that millions of dollars had been transferred from Ohtani’s bank account to an illegal bookmaker surfaced in dueling reports from ESPN and the Los Angeles Times. Both centering on his then-39-year-old interpreter, Ippei Mizuhara, the dispatches were as confounding as they were explosive. In an interview with ESPN, Mizuhara initially presented himself as a problem gambler, declared that Ohtani was not involved in any betting, and explained the payments as Ohtani bailing out a friend, going so far as to describe the two of them sitting at Ohtani’s computer and wiring the money.

But the following morning, before ESPN went live, Mizuhara disavowed his earlier statements. The Dodgers immediately fired Mizuhara; investigations were launched by MLB and the IRS; and five days later, Ohtani issued a statement denying any role in a scandal that echoed unsavory chapters of the sport’s past. “I never bet on sports or have willfully sent money to the bookmaker,” Ohtani said. “I’m just beyond shocked.”

Given the whiplash of shifting narratives, the speculation that followed was inevitable. Flip on talk radio, or venture into a conspiratorial corner of the internet, and you were treated to bro-inflected theorizing as to what really happened, what Ohtani really knew. Equally intriguing was the timing. The scandal erupted at a moment when the longtime stigma surrounding sports betting had, following a 2018 Supreme Court ruling that paved the way for wider legalization, given way to a previously unfathomable landscape where pro athletes had become spokespeople for entities like DraftKings and FanDuel; where ESPN operated its own multimillion-dollar sportsbook; and where Las Vegas, a town historically shunned by professional sports leagues, had just celebrated its reinvention as a sporting mecca by hosting the Super Bowl. But if such factors tempered the public’s instinct to rush to the harshest judgments, the ordeal also revealed how the corporatization of sports betting had done little to snuff out a secretive underworld estimated to be responsible for $64 billion in illicit wagers annually. (California is one of 11 states where sports betting remains illegal.)

Yet perhaps most remarkable was the speed at which the matter was seemingly resolved. Acting with uncharacteristic swiftness, the federal government issued a scathing criminal complaint against Mizuhara just three weeks later — on April 11 — that supported Ohtani’s narrative. The numbers were vertigo-inducing. Over roughly 24 months, Mizuhara had placed more than $300 million in bets, running up a debt of $40.6 million to an illegal bookmaking operation. To service it, the government alleged, Mizuhara himself became a criminal, taking control of one of Ohtani’s bank accounts and siphoning almost $17 million from the superstar. In June, Mizuhara pleaded guilty to bank and tax fraud.

One person who was not shocked by any twist in this saga was a central character who, throughout, remained an enigma: Mathew Bowyer. Since meeting Mizuhara at that poker game in San Diego, he had received at least $16.25 million in wires directly from Ohtani’s account, had poured most of it into conspicuous escapades in Vegas, and had been braced for a reckoning since the previous October, when dozens of armed federal agents raided his home. While the raid inadvertently unearthed the Ohtani-Mizuhara ordeal, the mushrooming scandal obscured a more complex, far-reaching, and ongoing drama. The agents who descended upon Bowyer’s home were not interested in the private misfortunes of a baseball superstar, but rather in exposing something Bowyer understood more intimately than most: how Las Vegas casinos skirted laws — and reaped profits — by allowing major bookies to launder millions by gambling on the city’s supposedly cleaned-up Strip.

by David Amsden, Rolling Stone |  Read more:
Image: Philip Cheung/Kyodo AP/Matthew Bowyer

The Big Reveal

The Bible, as every Sunday-school student learns, has a Hollywood ending. Not a happy ending, certainly, but one where all the dramatic plot points left open earlier, to the whispered uncertainty of the audience (“I don’t get it—when did he say he was coming back?”), are resolved in a rush, and a final, climactic confrontation between the stern-lipped action hero and the really bad guys takes place. That ending—the Book of Revelation—has every element that Michael Bay could want: dragons, seven-headed sea beasts, double-horned land beasts, huge C.G.I.-style battles involving hundreds of thousands of angels and demons, and even, in Jezebel the temptress, a part for Megan Fox. (“And I gave her space to repent of her fornication; and she repented not.”) Although Revelation got into the canonical Bible only by the skin of its teeth—it did poorly in previews, and was buried by the Apostolic suits until one key exec favored its release—it has always been a pop hit. Everybody reads Revelation; everybody gets excited about it; and generations of readers have insisted that it might even be telling the truth about what’s coming for Christmas.

In a new book on those end pages, “Revelations: Visions, Prophecy, and Politics in the Book of Revelation” (Viking), Elaine Pagels sets out gently to bring their portents back to earth. She accepts that Revelation was probably written, toward the end of the first century C.E., by a refugee mystic named John on the little island of Patmos, just off the coast of modern Turkey. (Though this John was not, she insists, the disciple John of Zebedee, whom Jesus loved, or the author of the Gospel that bears the same name.) She neatly synopsizes the spectacular action. John, finding himself before the Throne of God, sees a lamb, an image of Christ, who receives a scroll sealed by seven seals. The seals are broken in order, each revealing a mystical vision: a hundred and forty-four thousand “firstfruits” eventually are saved as servants of God—the famous “rapture.” Seven trumpets then sound, signalling various catastrophes—stars fall, the sun darkens, mountains explode, those beasts appear. At the sound of the sixth trumpet, two hundred million horsemen annihilate a third of mankind. This all leads to the millennium—not the end of all things but the thousand-year reign of Christ on earth—which, in turn, finally leads to Satan’s end in a lake of fire and the true climax. The Heaven and Earth we know are destroyed, and replaced by better ones. (There are many subsidiary incidents along the way, involving strange bowls and that Whore of Babylon, but they can be saved, so to speak, for the director’s cut on the DVD.)

Pagels then shows that Revelation, far from being meant as a hallucinatory prophecy, is actually a coded account of events that were happening at the time John was writing. It’s essentially a political cartoon about the crisis in the Jesus movement in the late first century, with Jerusalem fallen and the Temple destroyed and the Saviour, despite his promises, still not back. All the imagery of the rapt and the raptured and the rest that the “Left Behind” books have made a staple for fundamentalist Christians represents contemporary people and events, and was well understood in those terms by the original audience. Revelation is really like one of those old-fashioned editorial drawings where Labor is a pair of overalls and a hammer, and Capital a bag of money in a tuxedo and top hat, and Economic Justice a woman in flowing robes, with a worried look. “When John says that ‘the beast that I saw was like a leopard, its feet were like a bear’s and its mouth was like a lion’s mouth,’ he revises Daniel’s vision to picture Rome as the worst empire of all,” Pagels writes. “When he says that the beast’s seven heads are ‘seven kings,’ John probably means the Roman emperors who ruled from the time of Augustus until his own time.” As for the creepy 666, the “number of the beast,” the original text adds, helpfully, “Let anyone with understanding calculate the number of the beast, for it is the number of a person.” This almost certainly refers—by way of Gematria, the Jewish numerological system—to the contemporary Emperor Nero. Even John’s vision of a great mountain exploding is a topical reference to the recent eruption of Vesuvius, in C.E. 79. Revelation is a highly colored picture of the present, not a prophecy of the future.

What’s more original to Pagels’s book is the view that Revelation is essentially an anti-Christian polemic. That is, it was written by an expatriate follower of Jesus who wanted the movement to remain within an entirely Jewish context, as opposed to the “Christianity” just then being invented by St. Paul, who welcomed uncircumcised and trayf-eating Gentiles into the sect. At a time when no one quite called himself “Christian,” in the modern sense, John is prophesying what would happen if people did. That’s the forward-looking worry in the book. “In retrospect, we can see that John stood on the cusp of an enormous change—one that eventually would transform the entire movement from a Jewish messianic sect into ‘Christianity,’ a new religion flooded with Gentiles,” Pagels writes. “But since this had not yet happened—not, at least, among the groups John addressed in Asia Minor—he took his stand as a Jewish prophet charged to keep God’s people holy, unpolluted by Roman culture. So, John says, Jesus twice warns his followers in Asia Minor to beware of ‘blasphemers’ among them, ‘who say they are Jews, and are not.’ They are, he says, a ‘synagogue of Satan.’ ” Balaam and Jezebel, named as satanic prophets in Revelation, are, in this view, caricatures of “Pauline” Christians, who blithely violated Jewish food and sexual laws while still claiming to be followers of the good rabbi Yeshua... The scarlet whores and mad beasts in Revelation are the Gentile followers of Paul—and so, in a neat irony, the spiritual ancestors of today’s Protestant evangelicals.

Pagels shows persuasively that the Jew/non-Jew argument over the future of the Jesus movement, the real subject of Revelation, was much fiercer than later Christianity wanted to admit. The first-century Jesus movement was torn apart between Paul’s mission to the Gentiles—who were allowed to follow Jesus without being circumcised or eating kosher—and the more strictly Jewish movement tended by Jesus’ brothers in Jerusalem. (...)

After decoding Revelation for us, Pagels turns away from the canonic texts to look at the alternative, long-lost “Gnostic” texts of the period that have turned up over the past sixty years or so, most notably in the buried Coptic library of Nag Hammadi. As in her earlier books (“The Johannine Gospel in Gnostic Exegesis”; “The Gnostic Paul: Gnostic Exegesis of the Pauline Letters”; “The Gnostic Gospels”), she shows us that revelations in the period were not limited to John’s militant, vengeful-minded one, and that mystic visions more provocative and many-sided were widespread in the early Jesus movement.

As an alternative revelation to John’s, she focusses on what must be the single most astonishing text of its time, the long feminist poem found at Nag Hammadi in 1945 and called “Thunder, Perfect Mind”—a poem so contemporary in feeling that one would swear it had been written by Ntozake Shange in a feminist collective in the nineteen-seventies, and then adapted as a Helen Reddy song. In a series of riddling antitheses, a divine feminine principle is celebrated as transcending all principles (the divine woman is both whore and sibyl) and opening the way toward a true revelation of the hidden, embracing goddess of perfect being who lies behind all things:

I am the whore and the holy one.
I am the wife and the virgin.
I am the mother and the daughter.
I am the members of my mother.
I am the barren one
and many are her sons.
I am she whose wedding is great,
and I have not taken a husband.
I am the midwife and she who does not bear.
I am the solace of my labor pains.
I am the bride and the bridegroom . . .
Why, you who hate me, do you love me,
and hate those who love me?
You who deny me, confess me,
and you who confess me, deny me.
You who tell the truth about me, lie about me,
and you who have lied about me, tell the truth about me.
Astonishingly, the text of this mystic masterpiece was—a bit of YouTube viewing reveals—recently used by Ridley Scott as the background narration for a gorgeous long-form ad for Prada perfumes. The Gnostic strophes, laid over the model’s busy life, are meant to suggest the Many Mystifying Moods of the Modern Woman, particularly while she’s changing from one Prada outfit to another in the back seat of a sedan. (One feels that one should disapprove, but surely the Gnostic idea of the eternal feminine antitheses is meant to speak to the complicated, this-and-that condition of actually being a woman at any moment, and why not in Prada as well as in a flowing white robe?)

Pagels’s essential point is convincing and instructive: there were revelations all over Asia Minor and the Holy Land; John’s was just one of many, and we should read it as such. How is it, then, that this strange one became canonic, while those other, to us more appealing ones had to be buried in the desert for safekeeping, lest they be destroyed as heretical? Revelation very nearly did not make the cut. In the early second century, a majority of bishops in Asia Minor voted to condemn the text as blasphemous. It was only in the three-sixties that the church council, under the control of the fiery Athanasius, inserted Revelation as the climax of the entire New Testament. As a belligerent controversialist himself, Pagels suggests, Athanasius liked its belligerently controversial qualities. (...)

Perhaps what most strikes the naïve reader of the Book of Revelation is what a close-run thing the battle is. When God finally gets tired of waiting it out and decides to end things, the back-and-forth between dragons and serpents and sea monsters and Jesus is less like a scouring of the stables than like a Giants-Patriots Super Bowl. It seems that Manichaeanism—bad god vs. good god—is the natural religion of mankind and that all faiths bend toward the Devil, to make sense of God’s furious impotence. A god omniscient and omnipotent and also powerless to stop evil remains a theological perplexity, even as it becomes a prop of faith. It gives you the advantage of clarity—only one guy worth worshipping—at the loss of lucidity: if he’s so great, why is he so weak?

You can’t help feeling, along with Pagels, a pang that the Gnostic poems, so much more affecting in their mystical, pantheistic rapture, got interred while Revelation lives on. But you also have to wonder if there ever was a likely alternative. Don’t squishy doctrines of transformation through personal illumination always get marginalized in mass movements? As Stephen Batchelor has recently shown, the open-minded, non-authoritarian side of Buddhism, too, quickly succumbed to its theocratic side, gasping under the weight of those heavy statues. The histories of faiths are all essentially the same: a vague and ambiguous millennial doctrine preached by a charismatic founder, Marx or Jesus; mystical variants held by the first generations of followers; and a militant consensus put firmly in place by the power-achieving generation. Bakunin, like the Essenes, never really had a chance. The truth is that punitive, hysterical religions thrive, while soft, mystical ones must hide their scriptures somewhere in the hot sand.

John of Patmos’s hatred for the pagan world extended from its cruelties to its beauties—the exquisite temple at nearby Pergamon was for him the Devil’s Altar, worthy only of destruction. For all that, Pagels tells us, many claim to have found in John “the promise, famously repeated by Martin Luther King Jr., that the ‘arc of the moral universe is long, but it bends toward justice.’ . . . This worst of all nightmares ends not in terror but in a glorious new world, radiant with the light of God’s presence, flowing with the water of life, abounding in joy and delight.” Well, yeah, but this happens only after all the millions of heretics, past and present, have been burned alive and the planet destroyed. That’s some long arc. It’s like the inevitable moment in an apocalyptic blockbuster, “Independence Day” or “Armageddon” or “2012,” when the stars embrace and celebrate their survival. The Hans Zimmer music swells, and we’re reassured that it’s O.K. to rejoice. Millions are annihilated, every major city has been destroyed, but nobody you really like has died. It’s a Hollywood ending in that way, too.

by Adam Gopnik, New Yorker | Read more:
Image: Ron Kurniawan

Wednesday, November 19, 2025

Ronald Reagan and the First MAGA Movement

“Hegel remarks somewhere that all great world-historic facts and personages appear, so to speak, twice. He forgot to add: the first time as tragedy, the second time as farce.”
—Karl Marx, The Eighteenth Brumaire of Louis Bonaparte

“Let’s Make America Great Again”
—Ronald Reagan campaign slogan, 1980

In 2011, a Gallup poll found that when Americans were asked who the greatest president in U.S. history was, they were most likely to say Ronald Reagan. Abraham Lincoln and Bill Clinton were next on the list. George Washington came in fifth, after JFK. Reagan is less fondly remembered among certain groups, like Black Americans and LGBTQ people, who recall what his presidency meant for them. But on the whole, Americans had, and still have, a positive impression of Reagan the man. 

His policies are a different story. They were unpopular then. They’re unpopular now. Americans didn’t want to see an upward redistribution of wealth, the bloating of the military budget, tax cuts for the rich, and the arming of Central American death squads. Reagan’s secret funding of the Nicaraguan Contras, in direct violation of U.S. law, proved staggeringly unpopular, with nearly 80 percent of the public disapproving. Yet even after the exposure of the Iran-Contra scandal, three-fourths of Americans still approved of Reagan “as a person.” It’s not hard to see why. Reagan’s public persona was avuncular and self-deprecating. He was a Hollywood actor, and he performed the role of president perfectly. He peppered his speech with humorous, folksy anecdotes and spoke in a warm, reassuring voice. He conveyed an impression of complete innocence, so that when repeated ethical scandals hit his administration, he was able to convince much of the public that he couldn’t possibly be responsible for anything nefarious—hence the moniker “Teflon president.” Watch clips of Reagan joshing with the press, or making light-hearted references to the attempt on his life, and we can easily see how the Reagan mystique was developed.

Yet the actual record of the Reagan administration is horrendous. As Peter Dreier wrote in the Nation in 2011,
During his two terms in the White House (1981–89), Reagan presided over a widening gap between the rich and everyone else, declining wages and living standards for working families, an assault on labor unions as a vehicle to lift Americans into the middle class, a dramatic increase in poverty and homelessness, and the consolidation and deregulation of the financial industry that led to the current mortgage meltdown, foreclosure epidemic and lingering recession. These trends were not caused by inevitable social and economic forces. They resulted from Reagan’s policy and political choices based on an underlying “you’re on your own” ideology.
Beneath Reagan’s “gee whiz” and “aw shucks” persona there was a cruelty, a belief that people were responsible for their own suffering and it wasn’t the job of government to help alleviate social misery. Reagan famously said that “the nine most terrifying words in the English language are: I’m from the Government, and I’m here to help.” This would be news to anyone who has ever been rescued by a firefighter or a park ranger or given a Social Security check. But Reagan didn’t try to make a factual, logical case that the government was incapable of doing anything but harm. Instead, he told stories that projected a vision of an idyllic small-town America where people bootstrapped their way to success. Beneath the stories, his actions were cruel and deadly. Reagan helped create many of the most devastating problems facing American society in 2024. 

Reagan always denied being in any way racist and claimed to have had a “hatred for bigotry and prejudice” from an early age. Nevertheless, in a private phone call with Richard Nixon, he called African United Nations delegates “monkeys,” and right-wing economist Thomas Sowell departed Reagan’s 1980 campaign after Reagan insisted on giving a “states’ rights” speech in Mississippi near the site of the infamous 1964 murders of three civil rights workers, a move that was widely interpreted (including by Sowell) as a dog-whistle to white supremacists. Reagan’s support for apartheid South Africa (and softness on white supremacist Rhodesia), his reluctance to approve a federal holiday honoring Martin Luther King Jr., and his initiation of the “war on drugs” all help to explain why Black Americans did not look back fondly at Reagan’s presidency once it was over.

Reagan never cared much if what he was saying was true. He would pass movie scenes off as historical fact and even told the prime minister of Israel that he had personally helped liberate Nazi death camps, when in fact he had edited footage of them in Culver City, California, while working on films for the War Department. As Jimmy Carter said, with characteristic understatement, “President Reagan doesn’t always check the facts before he makes statements, and the press accepts this as kind of amusing.” In On Bended Knee: The Press and the Reagan Presidency, journalist Mark Hertsgaard reports that eventually, the press just gave up on fact-checking Reagan, since so much of what he said was nonsense. The national news editor of Newsweek said that “I think everybody in the press corps is just a little bit astonished at how many times the President can make horrible mistakes in public[…] [F]or a long time we were writing practically every week a little box on what he said that wasn’t true. We ultimately just couldn’t stand doing it week after week after week because it seemed sort of unfair […] [I]t seemed like persecuting him or something.” 

Much of what Reagan said was ludicrous. In the words of his daughter Patti Davis, “he [had] the ability to make statements that are so far outside the parameters of logic that they leave you speechless.” (...)

Simon Hoggart noted in The Observer in 1986 the peculiar way in which Reagan’s “errors glide past unchallenged. At one point […] he alleged that almost half the population gets a free meal from the government each day. No one told him he was crazy. The general message of the American press is that, yes, while it is perfectly true that the emperor has no clothes, nudity is actually very acceptable this year.” Mark Green notes that Reagan’s rigid anti-government ideology, his belief that the state could do no right, led him to willfully misinterpret reality: “This loathing for government, this eagerness to prove that any program to aid the disadvantaged is nothing but a boondoggle and a money gobbler, leads him to contrive statistics and stories with unmatched vigor.” Those who interacted with Reagan up close were often shocked by his ignorance. “You sometimes wonder why it occurred to anyone that he should be president, or even governor,” commented Henry Kissinger. Richard Nixon called him a “man of limited mental capacity [who] simply doesn’t know what the Christ is going on in the foreign area.” House Speaker Tip O’Neill said that Reagan “knows less about the budget than any president in my lifetime. He can’t even carry on a conversation about the budget. It’s an absolute and utter disgrace.” Reagan was a hands-off and inattentive manager, nodding off in meetings or remaining silent and often leaving staff in the dark about what his administration’s actual policies were supposed to be. Many people have described Reagan as a mere figurehead or speculated that his Alzheimer’s symptoms began before his term in office was over.

But to treat Reagan as a vapid actor, a pleasant frontman for a rapacious oligarchy, is to underappreciate his talent and let him off the hook for his worst actions. Watch Reagan interacting with the press in 1987, and it’s clear that he’s fully lucid and engaged. Congressional leaders declined to impeach him over the Iran-Contra scandal that year, perhaps because he successfully conveyed the impression that he was a bewildered innocent. (Famously, he confessed: “I told the American people I did not trade arms for hostages. My heart and my best intentions still tell me that’s true, but the facts and the evidence tell me it is not.”) But he deserves to be given credit for his record. 

The record was disgraceful. It’s chilling to go back and look at how Reagan’s press secretary responded to questions about AIDS, for instance. As a terrifying epidemic began to decimate the gay community, Reagan’s spokesman cracked homophobic jokes in the press room: a reporter who asked about AIDS was “met with dismissive wisecracks questioning the reporter’s own sexual orientation.” Reagan himself showed no interest in the issue and even proposed cutting funding for AIDS research, until the death of his friend Rock Hudson spurred him to action. 

Despite being the only former labor leader ever to ascend to the presidency (he had been president of the Screen Actors Guild), Reagan did everything in his power to crush the American labor movement. In 1981, nearly 13,000 members of the Professional Air Traffic Controllers Organization (PATCO) went on strike for better pay and working conditions. Reagan gave them 48 hours to return to work, then simply fired the more than 11,000 who refused. As Richard Sharpe writes, 
The strikers were often working-class men and women who had achieved suburban middle class lives as air traffic controllers without having gone to college. Many were veterans of the US armed forces where they had learned their skills; their union had backed Reagan in his election campaign. Nevertheless, Reagan refused to back down. Several strikers were jailed; the union was fined and eventually made bankrupt. Only about 800 got their jobs back when Clinton lifted the ban on rehiring those who went on strike. Many of the strikers were forced into poverty as a result of being blacklisted for [U.S. government] employment. 
Reagan’s crushing of the union “was interpreted by many as a green light from the federal government for union-busting, and ushered in the vicious employer attacks of the 1980s.” The head of Reagan’s Office of Personnel Management said this explicitly, writing that with the strike, “American business leaders were given a lesson in managerial leadership that they could not and did not ignore. Many private sector executives have told me that they were able to cut the fat from their organizations and adopt more competitive work practices because of what the government did in those days.” Journalist Jon Schwarz dates the beginning of the 40-year-long “murder of the middle class” to Reagan’s firing of the air traffic controllers. [ed. I do too.]

Reagan began what journalist Mark Ames calls “one of the most shocking wealth transfers in the history of the world, all under the propaganda diversion of ‘making America competitive’ and ‘unleashing the creative energies of the American worker.’” With the aid of Congressional Democrats, he substantially cut taxes on the wealthy and attempted to undo both the New Deal and the Great Society. This included making more than $22 billion in cuts to social welfare programs, while still massively increasing the federal deficit, in part by bloating the military budget. Poverty, homelessness, and precarity all increased. 

It’s harder to measure the indirect cultural consequences of Reagan’s tenure, but he certainly did nothing to counteract the “greed is good” spirit of the times. As Mario Cuomo put it, Reagan “made the denial of compassion for the people who needed it most sound like a virtue.” Similarly, Cornel West says that “Reagan made it fashionable to be indifferent to the poor and gave permission to be greedy with little or no conscience.” 

In The Man Who Sold the World: Ronald Reagan and the Betrayal of Main Street America, William Kleinknecht summarizes the dire consequences of “Reaganomics”: 
He enacted policies that helped wipe out the high-paying jobs for the working class that were the real backbone of the country. This supposed guardian of traditional values was the architect of wrenching social change that swept across the country in the 1980s, the emergence of an eerie, overcommercialized, postmodern America that has left so much of the populace psychically adrift. Reagan propelled the transition to hypercapitalism, an epoch in which the forces of self-interest and profit seek to make a final rout of traditional human values. His legacy—mergers, deregulation, tax cuts for the wealthy, privatization, globalization—helped weaken the family and eradicate small-town life and the sense of community. 
Investigative journalist Greg Palast puts things even more bluntly: 
The New York Times, in its canned obit, wrote that Reagan projected, “faith in small town America” and “old-time values.” “Values” my ass. It was union-busting and a declaration of war on the poor and anyone who couldn’t buy designer dresses. It was the New Meanness, bringing starvation back to America so that every millionaire could get another million. “Small town” values? From the movie star of the Pacific Palisades, the Malibu mogul? I want to throw up. 
All of that’s just on the domestic front. Reagan’s foreign policy was a horror show. His administration supported Saddam Hussein as Iraq waged a brutal war of aggression against Iran, even covering up evidence of Hussein’s use of chemical weapons. Reagan violated both domestic and international law in his support for the Nicaraguan Contras. The Contras, according to Human Rights Watch, “were major and systematic violators of the most basic standards of the laws of armed conflict, including by launching indiscriminate attacks on civilians, selectively murdering non-combatants, and mistreating prisoners.” (Reagan repeatedly compared the Contras to the American Founding Fathers, labeling them “freedom fighters” and “our brothers.”) The Reagan administration funneled money to them through arms sales in explicit violation of U.S. law, while Reagan’s terrorism against Nicaragua (mining the country’s harbors and destroying civilian boats) was found to be illegal by the World Court, a ruling the administration simply ignored. 

Like other presidents, Reagan supported friendly despots around the world when it served “U.S. interests,” including not only Hussein but also Ferdinand Marcos in the Philippines, the deposed Pol Pot regime in Cambodia, Suharto in Indonesia, and the genocidal Guatemalan military leader Ríos Montt, whom he called “a man of great personal integrity and commitment.” Reagan freely violated international law, such as by invading Grenada without any authorization from the United Nations Security Council. Yet some Reagan policies look moderate and restrained by comparison with recent presidential actions. Reagan was willing to restrain Israel when its conduct became embarrassing, and he appears to have been sincerely committed to reducing nuclear weapons, going so far as to propose eliminating them altogether in one meeting with Mikhail Gorbachev. 

Unfortunately, Reagan was rigidly committed to his Strategic Defense Initiative (SDI, derisively known as “Star Wars”), a proposed system for intercepting incoming nuclear missiles before they could strike the United States. Reagan thought nothing could be objectionable about defending against a nuclear attack, but SDI disrupted the logic of deterrence (if the U.S. could defend itself from a nuclear attack and the Soviet Union could not, the U.S. had less reason to avoid attacking the Soviet Union), and the Soviets saw it as a serious threat, which contributed to one of the worst nuclear scares of the Cold War. 

The Reagan presidency was a giant fraud. He promised safety but brought us closer to Armageddon. He promised prosperity but crushed American workers. His kindly demeanor belied a nasty streak. (For instance, Jon Schwarz writes in the Intercept that “when Patty Hearst’s kidnappers demanded that her family start handing out free food to the poor, Reagan privately said, ‘It’s too bad we can’t have an epidemic of botulism.’”) TIME magazine called him “a Prospero of American memories, a magician who carries a bright, ideal America like a holograph in his mind and projects the image in the air.” Reagan, “master illusionist, is himself a kind of American dream.” Well, as George Carlin said, “they call it the American dream because you have to be asleep to believe it.” Reagan smiled at the country in a big cowboy hat while robbing people blind. 

From Ronald to Donald

An ignorant, deceitful entertainer bamboozling Americans into thinking that plutocracy is good for them. Does this sound familiar? We’ve had another one of those recently, one even more cartoonishly dishonest in his promises to “Make America Great Again” (a slogan Trump simply lifted and repurposed from Reagan). As Schwarz writes, Trump and Reagan share the “same political DNA”: “Reagan was Trump’s progenitor, and Trump is Reagan’s degenerate 21st-century descendant. Trump is to Reagan much like crack is to cocaine: cheaper, faster-acting, and less glamorous. Still, in their essence, they are the same thing.” 

There are some important differences. Reagan exuded positivity, even utopianism, promising that “America’s best days are yet to come. Our proudest moments are yet to be. Our most glorious achievements are just ahead.” He was capable of seeming reassuring and reasonable, as in his well-received address after the Challenger space shuttle disaster. Where Reagan’s tone was sunny, Trump’s is dark, hateful, and vindictive. But both carried out variations on the same fraud. 

Reagan promised to tame the worst excesses of government. But in office, he let corruption and abuse run rampant. His Department of Housing and Urban Development was “enveloped by influence peddling, favoritism, abuse, greed, fraud, embezzlement, and theft” according to the House Government Operations Committee. His presidency ultimately resulted “in the investigation, indictment or conviction of over 138 administration officials, the largest number for any president of the United States.” Reagan should plainly have been impeached and removed from office over the Iran-Contra scandal. The irony is that we only need to have a Reaganesque fear of government when people like Ronald Reagan are running the government.

“Let’s Make America Great Again,” Reagan said. Did he? Of course not. It was a fantasy, an image. Trump is the same, offering an appealing lie that desperate people would very much like to believe in. But if it’s trivial to point out that these men are selling snake oil, the question is: how do you convince people not to buy it? That’s much more difficult. Reagan won two landslide victories... Perhaps one lesson of Reagan is that because appealing visions and stories can be so powerful, we need one of our own. People voted for Reagan even though they disliked his policies, because he seemed personable and projected an image of forward-looking confidence. Trump does not seem kind or personable, but he has a powerful story to tell, one of a country being ruined and awaiting its redeemer. Counteracting salesmen like these requires a powerful alternative story, with a promise of a different, better future. Democrats since Barack Obama (himself an admirer of Reagan) have failed to offer such a message—consider Hillary Clinton’s “America is Already Great” or Joe Biden’s promise to his donors that “nothing will fundamentally change.” 

Ronald Reagan and Donald Trump are two of the greatest con men of the age, successfully convincing many people to do immense harm to themselves and their country. Their political talents, however, should not be underrated. Reagan has, incredibly, been successfully sold as one of the greatest presidents ever, with Republicans viewing him as something close to a saint, an achievement that Noam Chomsky says would have impressed Kim Il Sung. We need not just to puncture the myths, which is done well in both Kleinknecht’s The Man Who Sold the World and Will Bunch’s Tear Down This Myth, but to offer a more inspiring alternative that will keep people from falling for the pitches of vicious grifters.

by Nathan J. Robinson, Current Affairs |  Read more:
Image: uncredited
[ed. Good summary. I lived through it all, and in fact had to deal with a few of Reagan's policies directly (like the leasing of the entire U.S. coastline to oil and gas development under an accelerated five-year OCS leasing schedule). But it wasn't just Reagan. Equal blame (if not more) should fall on other ideologues of the time, including Pat Buchanan, Newt Gingrich, Grover Norquist (of 'drown government in the bathtub' fame), Alan Greenspan, and the numerous cabinet secretaries and others who faithfully, if not gleefully, set forth to carry out Reagan's agenda (like Interior Secretary James Watt and the EPA's Anne Gorsuch, mother of current Supreme Court Justice Neil Gorsuch). Never could have imagined it could get much worse (ha!), but then we ended up taking a few detours into Afghanistan, Iraq, legalized torture, and black sites. Connect the dots. Reagan lit the fire that hollowed out the middle class, deregulated government, crippled unions, jump-started neoliberalism (later supercharged by Clinton), and set us on the path to where we are now. At this point, with a completely useless Congress, a weaponized military and Justice Department, open corruption, unapologetic criminal pardons, and a DOGE-depleted government, things can't possibly get much worse, right? Right? (ha!)]

Only a Failing System Could Produce Chuck Grassley

Did you know that, right now, the person who sits third in line to the U.S. presidency is a deeply strange 92-year-old from Iowa? It’s one of those facts you forget about, until you look at the government website for “presidential succession” and get taken by surprise. But there it is: if anything happens to Donald Trump, JD Vance, and Mike Johnson, Senator Chuck Grassley would be our country’s Commander in Chief. He’s both the President pro tempore of the Senate and the chair of its Judiciary Committee, which makes him one of the most powerful people in Congress. This is alarming news for America, because Grassley is also the oldest member of Congress—he’s been in politics since the Eisenhower administration—and one of its foremost weirdos. On a regular basis, he puts things on the internet that make Trump look normal by comparison. He has a legislative track record a mile long, and most of it is awful. But the problem he represents is much bigger than one man. The fact that someone like Chuck Grassley has represented Iowa in the Senate for 45 years is a sign that American democracy is in a near-terminal state of dysfunction. What’s more, it’s the most damning indictment of the Democratic Party imaginable. If they can’t beat this guy, what are they good for?

When Chuck Grassley was born in 1933, Hitler and Stalin were both still alive, and the chocolate chip cookie had not yet been invented. When he was first elected to the Iowa state legislature in 1958, segregation and Jim Crow were still in full effect, and would be for another six years. When he became a U.S. senator in 1980, it was part of the “Reagan Revolution” that created the Republican Party as we know it today—and Grassley was endorsed by the Ku Klux Klan, who reportedly gave him “an eight out of ten for his voting record.” One of his first big decisions in Washington was to vote against the creation of Martin Luther King Jr. Day in 1983, although he insists he was just concerned about the expense of giving federal workers another day off. Simply put, this guy has been in Congress forever, outlasting six successive presidents. Now, at age 92, he visibly struggles to read statements on the Senate floor—but that hasn’t stopped him from filing the paperwork to run for yet another term in 2028, when he’d be 95. More likely, if the actuarial tables are anything to go by, he’ll follow in the footsteps of Senator Dianne Feinstein and Representative Gerry Connolly, and simply drop dead in office one of these days.

There’s a popular line of thinking, embodied in David Hogg’s “Leaders We Deserve” PAC and Samuel Moyn’s forthcoming book Gerontocracy in America, that says elderly, out-of-touch leaders like Chuck Grassley are behind a lot of the country’s problems. Certainly with people like Dianne Feinstein and Joe Biden, there’s a pattern of politicians staying in office long after it would have been sensible to retire. But you’ve got to be careful here, because the problem with these leaders is not only that they’re old. In general, age is a bad proxy for policy preferences, class allegiance, and even competence. The presumption behind the “gerontocracy” narrative is that younger means more progressive and more worker-friendly; that may be statistically likely, but it’s not always true in individual cases. Even basic on-the-job ability varies. Bernie Sanders is old, though eight years Grassley’s junior, and he’s still doing (mostly) solid work. Ritchie Torres and Marie Gluesenkamp Perez are young, and they’re terrible. In Grassley’s case, the real problem is a more insidious combination of things: he hasn’t just been hanging on to power like a barnacle for decades, he’s also been making policy choices that directly harm the people of Iowa, and he’s been exhibiting some truly bizarre behavior along the way.

Congress is, as we know, essentially a group home for cranks, perverts, and the deranged. But even among that crowd, Grassley stands out. Like Donald Trump, he loves to post, and every time he goes online, he gives the world a glimpse into a lifestyle that can only be described as baffling. Take his longstanding devotion to Beth the vacuum cleaner. This is a 1987 Hoover Concept Two upright vacuum, which presumably used to be white-and-red, but thanks to the passage of time is now more beige-and-red. Not only has Senator Grassley named this vacuum cleaner “Beth,” which is weird and vaguely sexist by itself, but he feels the need to tell the world about it on every major holiday, like clockwork. “Once again Beth has performed wonderfully for family reunion If u knew Beth like I know Beth u would know the dependability I know,” he posted this August. Or, in April 2022: “Grassley to Beth: Sunday we hv our Easter family gathering are u ready to roll ?” Or last December: “Beth going to get Grassley farm house ready for 32 guest Christmas Day.” The man is obsessed.

Like a lot of older people, Grassley posts in a style that is terse, full of abbreviations and run-on sentences, and somewhat incoherent. In a recent article, the Iowa-based Little Village described it as having “the start-stop, quiet-loud, herky-jerky quality of an E.E. Cummings poem.” The subjects, too, are odd. “Windsor Heights Dairy Queen is good place for u kno what,” the senator tweeted in 2014, causing a collective huh? to spread across the nation. He would repeat the sentiment the following year, writing that “I'm at the Jefferson Iowa DairyQueen doing ‘you know what’ !!!” Apparently, “you know what” just means “eating ice cream”—or at least, that’s the story he’s sticking to. (...)

Grassley is a fascinating figure, because you never know what you’re going to get next with him. And all of his corn and vacuum-related antics might be charming, if he didn’t have any political power, and was just somebody’s weird grandfather (or, at this point, great-grandfather). There’s an entire category of American political grotesques like this: figures who’ve been defined in the public eye by their personal strangeness and entertainment value, as much as their actual politics. Trump is another, with his constant stream of garbled utterances about the relative merits of death by shark vs. electrocution or how “nothing bad can happen, it can only good happen.” Or there’s RFK Jr. with his brain worms and quack cures, or even New York City’s favorite sons, Eric Adams and Curtis Sliwa. But the problem is, these people do have power. They control things like public health, the police, and the military, and they decide the outcomes of people’s lives. Like Sideshow Bob on The Simpsons, they’re a lot less funny when you realize they’re actually trying to harm you, and Chuck Grassley is no exception.

So what has Chuck Grassley done with his considerable power? When the curtain finally falls on his life and career, how will he be judged? Not well, if you’re an ordinary working-class Iowan. At every turn, Grassley has consistently made decisions that make their lives worse. (...)

Then, too, as head of the Senate Judiciary Committee, Grassley had a major role in converting the Supreme Court into the openly right-wing institution it is today. Back in 2016, when he first led the committee, it was Grassley who refused to hold confirmation hearings on Merrick Garland’s nomination to the Court, running out the clock until after the 2016 election and effectively stealing a seat from the outgoing Obama administration. Afterward, it was Grassley who was among the staunchest defenders of Brett Kavanaugh, even (and especially) after it became clear that Kavanaugh had lied to the American people about the sexual assault accusations brought against him by Christine Blasey Ford. So in a sense, all of the decisions that make up the Court’s post-2016 rightward turn—from the dismantling of women’s reproductive rights to the sweeping criminal immunity granted to Donald Trump—are Grassley’s handiwork.

Good news, though: if you’re a mentally ill person who wants to get a high-powered gun, Chuck Grassley is your best friend! One of his pet projects in 2017 was to repeal Obama-era regulations that prevented people from buying firearms if they had “mental impairments” so significant that they needed a third party to help them claim Social Security benefits. That seems like a rule even the most avid hunters and rifle collectors could agree with—if you can’t fill out a form unaided, you shouldn’t have a gun—but Grassley objected, claiming that the standards were too “vague” and that “if a specific individual is likely to be violent due to the nature of their mental illness, then the government should have to prove it” on a case-by-case basis. Never mind that, by the time the “proof” arrives, a school or a Walmart could be riddled with bullets and bloodstains.

This is who Chuck Grassley is. He makes decisions in Washington that ruin people’s lives, and then he flies back to Iowa to post incoherent gibberish about Dairy Queen online. The wacky-grandpa image is a cloak for the deeper depravity. And his constituents know it. In 2021, only 28 percent of Iowans wanted him to run for re-election, with “the age thing” cited as the most common reason. More recently, Grassley’s town hall events have become outpourings of frustration against Republican policy: “I’M PISSED!” one man yelled at him recently, after Grassley made a mumbling defense of the Trump administration’s shipping of people to a gulag in El Salvador without due process. That man spoke for millions.

Which leads to another, even grimmer question: why, in Grassley’s 45-year career in the Senate, have the Democrats never been able to unseat him? (...)

Lately, I’ve been thinking a lot about the kind of leaders a system of government throws up in its dying days. You probably remember them from your high school history books. Romulus Augustulus, the last emperor of Rome, who ruled for only ten months before being deposed by the barbarians (who found him so non-threatening they let him retire to a monastery). Kings Louis XV and XVI in France, swanning around Versailles in their fur capes while the revolution was brewing outside. Nicholas II in Russia, letting Rasputin whisper in his ear as more and more of his people got blown to bits in World War I, while Lenin and Trotsky drew up battle plans of their own. Later, President Boris Yeltsin, whose alcoholism was crippling even by Russian standards (he once “wandered into the street in his underwear” during a state visit with Bill Clinton), and who played a key role in the downfall of the Soviet Union. In each era, the pattern is the same. The people in power are incompetent, corrupt, and personally contemptible, pale shadows of the leaders the country or system had at its peak—and yet, there seems to be no way to get rid of them.

Contrary to the “great man” (or rather “weak man”) theory of history, it’s not that these leaders cause the downfall of their regimes through their personal failings. Just the opposite. They’re not catalysts of decline, but morbid symptoms. The fact that they ever got near power is proof that the system itself is no longer functional. The mechanisms that are supposed to produce strong, effective leaders, from education to military promotion to party leadership contests, are no longer doing so. The skills and attributes needed to reach the top of the hierarchy no longer have much, if anything, to do with the skills and attributes needed to actually rule. Nepotism, mutual back-slapping, and financial corruption have taken hold, like rust. In the early 1800s, Napoleon was able to sweep across the map of Europe like a holy terror, in part because the ancien régime was still choosing military officers based on their noble bloodlines, while Napoleon only cared about effectiveness and would promote any old commoner who could win battles for him. Monarchy was dying, and the last things it belched up as it expired were tenth-generation, third-rate Habsburg cousins, ripe for the slaughter. In the USSR, the bureaucracy elevated people based on how well they recited the Party line like a catechism, as much as on their actual abilities. Thus, it eventually produced a Yeltsin.

And today in the United States, we have Chuck Grassley.

by Alex Skopic, Current Affairs |  Read more:
Image: uncredited
[ed. Good point. In my experience, once an incumbent wins a couple of elections, they're almost impossible to unseat. Seen it all my life: out of sight, out of mind (in DC).]