Image: USS George H.W. Bush (CVN 77) sails in the Indian Ocean, April 23. CENTCOM/X
[ed. Updates from a variety of sources. Draw your own conclusions. See also: Iran War: Team Trump as Narrative War Captives? (NC).]
Duck Soup
...dog paddling through culture, technology, music and more.
Friday, April 24, 2026
Iran War Updates: April 24, 2026
Iran War: Trump Says Time Is on His Side, Iranian Leadership Is Divided, Iran Begs to Differ (Naked Capitalism)
Labels:
Crime,
Economics,
Government,
Journalism,
Media,
Military,
Politics,
Security,
Technology
What I Saw Inside the Kennedy Center
What I Saw Inside the Kennedy Center (The Atlantic)
Image: Jabin Botsford/The Washington Post/Getty
[ed. An order of magnitude worse than I imagined.]
Labels:
Architecture,
Art,
Culture,
Government,
Journalism,
Politics
We Haven’t Seen the Worst of What Gambling and Prediction Markets Will Do to America
Here are three stories about the state of gambling in America.
1. Baseball
In November 2025, two pitchers for the Cleveland Guardians, Emmanuel Clase and Luis Ortiz, were charged in a conspiracy for “rigging pitches.” Frankly, I had never heard of rigged pitches before, but the federal indictment describes a scheme so simple that it’s a miracle that this sort of thing doesn’t happen all the time. Three years ago, a few corrupt bettors approached the pitchers with a tantalizing deal: (1) We’ll bet that certain pitches will be balls; (2) you throw those pitches into the dirt; (3) we’ll win the bets and give you some money.
The plan worked. Why wouldn’t it? There are hundreds of pitches thrown in a baseball game, and nobody cares about one bad pitch. The bets were so deviously clever because they offered enormous rewards for bettors and only incidental inconvenience for players and viewers. Before their plan was snuffed out, the fraudsters won $450,000 from pitches that not even the most ardent Cleveland baseball fan would ever remember the next day. Nobody watching America’s pastime could have guessed that they were witnessing a six-figure fraud.
2. Bombs
On the morning of February 28th, someone logged onto the prediction market website Polymarket and made an unusually large bet. This bet wasn’t placed on a baseball game. It wasn’t placed on any sport. This was a bet that the United States would bomb Iran on a specific day, despite extremely low odds of such a thing happening.
A few hours later, bombs landed in Iran. This one bet was part of a $553,000 payday for a user named “Magamyman.” And it was just one of dozens of suspicious, perfectly timed wagers, totaling millions of dollars, placed in the hours before a war began.
It is almost impossible to believe that, whoever Magamyman is, he didn’t have inside information from members of the administration. The term war profiteering typically refers to arms dealers who get rich from war. But we now live in a world where not only do online bettors stand to profit from war, but key decision makers in government have the tantalizing option of making hundreds of thousands of dollars by synchronizing military engagements with their gambling positions.
3. Bombs, again
On March 10, several days into the Iran War, the journalist Emanuel Fabian reported that a warhead launched from Iran struck a site outside Jerusalem.
Meanwhile on Polymarket, users had placed bets on the precise location of missile strikes on March 10. Fabian’s article was therefore poised to determine payouts of $14 million in betting. As The Atlantic’s Charlie Warzel reported, bettors encouraged him to rewrite his story to produce the outcome that they’d bet on. Others threatened to make his life “miserable.”
A clever dystopian novelist might conceive of a future where poorly paid journalists for news wires are offered six-figure deals to report fictions that cash out bets from online prediction markets. But just how fanciful is that scenario when we have good reason to believe that journalists are already being pressured, bullied, and threatened to publish specific stories that align with multi-thousand-dollar bets about the future?
Put it all together: rigged pitches, rigged war bets, and attempts to rig wartime journalism. Without context, each story would sound like a wacky conspiracy theory. But these are not conspiracy theories. These are things that have happened. These are conspiracies—full stop.
“If you’re not paranoid, you’re not paying attention” has historically been one of those bumper stickers you find on the back of a car with so many other bumper stickers that you worry for the sanity of its occupants. But in this weird new reality where every event on the planet has a price, and behind every price is a shadowy counterparty, the jittery gambler’s paranoia—is what I’m watching happening because somebody more powerful than me bet on it?—is starting to seem, eerily, like a kind of perverse common sense.
From Laundromats to Airplanes
What’s remarkable is not just that online sportsbooks have taken over sports, or that betting markets have metastasized in politics and culture, but the speed with which both takeovers have happened.
For most of the last century, the major sports leagues were vehemently against gambling, as the Atlantic staff writer McKay Coppins explained in his recent feature. [...]
Following the 2018 Supreme Court decision Murphy v. NCAA, sports gambling was unleashed into the world, and the leagues haven’t looked back. Last year, the NFL saw $30 billion gambled on football games, and the league itself made half a billion dollars in advertising, licensing, and data deals.
Nine years ago, Americans bet less than $5 billion on sports. Last year, that number rose to at least $160 billion. Big numbers mean nothing to me, so let me put that statistic another way: $5 billion is roughly the amount Americans spend annually at coin-operated laundromats and $160 billion is nearly what Americans spent last year on domestic airline tickets. So, in a decade, the online sports gambling industry will have risen from the level of coin laundromats to rival the entire airline industry.
And now here come the prediction markets, such as Polymarket and Kalshi, whose combined 2025 revenue came in around $50 billion. “These predictive markets are the logical endpoint of the online gambling boom,” Coppins told me on my podcast Plain English. “We have taught the entire American population how to gamble with sports. We’ve made it frictionless and easy and put it on everybody’s phone. Why not extend the logic and culture of gambling to other segments of American life?” He continued:
Why not let people gamble on who’s going to win the Oscar, when Taylor Swift’s wedding will be, how many people will be deported from the United States next year, when the Iranian regime will fall, whether a nuclear weapon will be detonated in the year 2026, or whether there will be a famine in Gaza? These are not things that I’m making up. These are all bets that you can make on these predictive markets.
Indeed, why not let people gamble on whether there will be a famine in Gaza? The market logic is cold and simple: more bets means more information, and more information means more efficiency in the marketplace of all future happenings. But from another perspective—call it baseline morality—the transformation of a famine into a windfall event for prescient bettors seems so grotesque as to require no elaboration. One imagines a young man sending his 1099 documents to a tax accountant the following spring: “Right, so here are my dividends, these are the cap gains, and, oh yeah, here’s my $9,000 payout for totally nailing when all those kids would die.”
It is a comforting myth that dystopias happen when obviously bad ideas go too far. Comforting, because it plays to our naive hope that the world can be divided into static categories of good versus evil and that once we stigmatize all the bad people and ghettoize all the bad ideas, some utopia will spring into view. But I think dystopias more likely happen because seemingly good ideas go too far. “Pleasure is better than pain” is a sensible notion, and a society devoted to its implications created Brave New World. “Order is better than disorder” sounds alright to me, but a society devoted to the most grotesque vision of that principle takes us to 1984. Sports gambling is fun, and prediction markets can forecast future events. But extended without guardrails or limitations, those principles lead to a world where ubiquitous gambling leads to cheating, cheating leads to distrust, and distrust leads ultimately to cynicism or outright disengagement.
“The crisis of authority that has kind of already visited every other American institution in the last couple of decades has arrived at professional sports,” Coppins said. Two-thirds of Americans now believe that professional athletes sometimes change their performance to influence gambling outcomes. “Not to overstate it, but that’s a disaster,” he said. And not just for sports.
Four Ways to Lose (Or, What's a 'Rigged Pitch' in a War?)
There are four reasons to worry about the effect of gambling in sports and culture.
by Derek Thompson, Substack | Read more:
Image: Eyestetix Studio on Unsplash
Super Bird
Before it took off, the bird ate parts of its own liver, kidneys, and gut. That was the only way to be light enough to fly. Then it flew 8,425 miles from Alaska to Australia, in 11 days, without eating, drinking, or landing once.
The bird is called B6. It's a bar-tailed godwit, four months old, weighing about as much as a can of beans. In October 2022, scientists at the US Geological Survey tracked its flight from Alaska all the way to Tasmania. The trip took 11 days and 1 hour. It is still the longest non-stop flight of any animal on Earth.
For two weeks before takeoff, godwits eat until they almost double in weight. Fat ends up being 55% of their body, more than any bird ever measured. Then they shrink their own insides. About a quarter of their liver, kidneys, stomach, and intestines gets broken down and reused for fuel, making room for the extra fat and cutting weight. Their heart and wing muscles grow bigger at the same time.
They never drink along the way. The water they need comes out of burning fat, the same reaction their muscles use for energy. They also never really sleep. B6 flapped its wings for 264 straight hours, cruising around 35 miles per hour with help from storm tailwinds. By the time it landed, it had lost almost half its body weight. The shrunken organs grew back over the following weeks.
Scientists still cannot explain the navigation. B6 had never made this flight before. Adult godwits leave Alaska weeks earlier, so young birds fly alone with nobody to follow. How a four-month-old bird finds its way across 8,425 miles of open ocean to a place it has never seen is still an open question.
About 100,000 bar-tailed godwits leave Alaska every fall. Most of them land in New Zealand or Australia 10 or 11 days later, having eaten parts of themselves to get there.
by Anish Moonka, X | Read more:
Image: All Day Astronomy
Karl Ove Knausgaard’s Diabolic Realism
If you made it through the 3,600 pages of Karl Ove Knausgaard’s My Struggle (Min kamp, in the Norwegian), its conclusion could only inspire mixed feelings. Book Six — also known as “the Hitler one” due to its three hundred pages on the life of the dictator whose manifesto gave Knausgaard his title — records the precise moment (7:07 a.m., on September 2, 2011) that Karl Ove brought it to a close. “The novel is finally finished,” he writes. “In two hours Linda will be coming here, I will hug her and tell her I’ve finished, and I will never do anything like this to her and our children again.” They will go to a literature festival, where he will endure an interview and then his wife will, too, since her own book has just come out. “Afterwards we will catch the train to Malmö, where we will get in the car and drive back to our house, and the whole way I will revel in, truly revel in, the thought that I am no longer a writer.”
Beyond the physical relief of putting down the carpal-tunnel-inducing final tome (1,157 pages in all), you might have sighed with despair at the thought of post-Struggle existence. After all, you’d spent countless hours swimming through Karl Ove’s mind, seeing through his eyes as he smoked, chugged coffee, “trudged” through various forms of bad weather, tried to write and then wrote and wrote and wrote, took care of his children, felt ashamed of taking care of his children, painfully recalled his father’s drunken misbehavior and his own, fretted over his sexual imperfections and moral indiscretions, agonized about his overwhelming shyness but also his glaring narcissism, stared at himself in various reflections, and, on two occasions, sliced up his face with broken glass. How will I fill my time, you might have wondered, if not by reading Knausgaard? And if he was renouncing the vocation he struggled so hard to claim, what had it all been for?
But of course Knausgaard didn’t stop writing. In fact, just the opposite. My Struggle was released in Norway between 2009 and 2011; by the time the final installment of this Viking longship of a novel invaded the English-speaking world, in 2018, Knausgaard had already published five more books in his native country...
[ed. Like with Proust... two books and I'm good.]
Now the cycle continues with The School of Night (2023/2026), a bildungsroman about a young Norwegian photographer and the Faustian bargain that catapults him to artistic greatness. So far, we’re at 2,512 pages and counting. Two more tomes have already been published in Norway; Knausgaard told a Norwegian newspaper that the seventh will be the last, because, incredibly, “there is so much else I want to write.”
An attentive Struggler will identify bits and pieces that Knausgaard recycles in these novels: the aphrodisiac qualities of prawns, or a grandfather’s antisemitic quip, or the frequent appearance of hospitals and mental institutions. There is typically Knausgaardian attention paid to the precise color of piss (sometimes, like Knausgaard’s father’s, disturbingly dark) and the unevenly shared burdens of domestic life; much Pepsi Max is slurped, significant time is spent brooding on verandas, and the destructive desire for just one more drink is often satisfied. Narrators resemble Karl Ove at various points in My Struggle, like the alcoholic literature professor and aspiring novelist whose mentally unstable wife is hospitalized, as Linda was in Book Two; The School of Night’s young artist maps onto student Karl Ove in Book Five.
Yet the Star series is in many ways My Struggle’s opposite. Rather than the unrelenting voice of one man, we get an array of perspectives, and some of the most compelling characters are women. Whereas My Struggle somehow keeps you engaged despite its apparent formlessness, with little plot beyond the shaggy shape of an actual life, the Star series is structured around a series of more or less suspenseful mysteries. But the most obvious difference is the weirdness. While Knausgaard continues to beguile us with his trademark hyperrealist style, predictably observant down to the coffee granules dissolving inside a mug, what happens in these new novels transcends the real. One of the narrators — Egil, a trust-funded documentarian turned religious searcher who composes an essay on death that constitutes the last fifty or so pages of The Morning Star — helpfully informs us that the titular phrase is not just a literal translation of Lucifer, the name of the fallen angel who rebels against God, but also one of the ways Jesus describes himself. And the dark corners of these novels are illuminated by a gleam equal parts demonic and divine: hordes of crabs scuttle their way inland, a Sasquatch-like beast emerges from the woods and seemingly possesses an escaped mental patient, dreams start changing, dead bodies stop arriving at mortuaries, and people who should be dead seem somehow to keep living.
The struggle of My Struggle is, at heart, about what to believe in the face of death when religion is not an option, ideology has failed, and there’s nothing more than the life you’ve got. “Attaching meaning to the world is peculiar only to man,” Knausgaard writes in Book Six. “We are the givers of meaning, and this is not only our own responsibility but also our obligation.” Knausgaard sought a form that would not just describe but enact the process by which meaning is made in secular life. But in the Star books, secular lives — and seemingly mortality itself — are disrupted by the new star; characters and readers alike wonder whether it’s a sign to be interpreted or simply a phenomenon to be explained. Knausgaard widens his frame to encompass not just the banal and everyday, but the cosmic. He tries, in other words, to reenchant the secular world, and the secular novel, dramatizing a search for meaning beyond the self and beyond realism. But like his characters, we’re left wondering what it all means.
by Max Norman, The Drift | Read more:
Image: Maki Yamaguchi
Labels:
Critical Thought,
Fiction,
Literature,
Philosophy,
Psychology
Thursday, April 23, 2026
Suddenly Everyone Wants a Tailor. They’re in Short Supply.
As AI sweeps into white-collar workplaces, old-timey hands-on jobs are getting a new look—and some of those professions even have shortages.
Consider tailors. Sewing is a vanishing skill, much like lacemaking and watchmaking, putting tailors in short supply when big retailers like Nordstrom and Men’s Wearhouse, as well as fashion designers and local dry cleaners, say they need more of them.
The job, which can take years to master, can be a tough sell to younger generations more accustomed to instant gratification. But apprenticeships that offer pay to learn on the job and new training programs are helping entice more people.
“It’s not glamorous and not something you want to post about on social media,” says Khaleel Bennett, a 30-year-old who lives in Queens, N.Y. “But it’s a skill that will carry me for life.”
Bennett had been working as a technical designer for a fashion company, responsible for verifying that production met quality and construction standards. When he was laid off, he had trouble finding a new job. Then he came across a new Nordstrom-backed program at New York’s Fashion Institute of Technology that teaches custom alterations and tailoring.
Bennett completed the training late last year and is now a tailor’s apprentice at the department-store chain, where he is getting real-life experience on the intricacies of pant hems. (Denim requires a different technique than slacks. For denim, the original hem is cut, the pant leg is shortened, and the hem is reattached to give the jeans a worn-in look.)
For the first semester of its program, which concluded in December, FIT received more than 190 applications for 15 spots. The nine-week course requires prior sewing experience. Nordstrom hired seven students from the inaugural class.
“It’s increasingly becoming more challenging to find people to fill these alterations jobs,” said Marco Esquivel, the director of alterations and aftercare services at Nordstrom, which employs about 1,500 tailors. Similar to other high-end retailers, Nordstrom offers free basic tailoring for garments purchased at the department-store chain and charges a fee for those bought elsewhere.
Tailored Brands, which employs about 1,300 tailors at its Men’s Wearhouse, Jos. A. Bank and other chains, is updating its apprenticeship program to include more self-guided videos with the goal of moving people through the training faster.
“The pipeline has dwindled,” the company’s chief operating officer, Karla Gray, said.
Counterintuitive as it may seem, there is an acute need for tailoring even in the current age of casual dressing. Pants and cuffs still need to be hemmed, to say nothing of bridal, prom and other special-occasion clothes.
Decades of offshoring affected the American apparel industry, decimating the profession. Now most tailors who are working are starting to approach retirement age, so demand for them outstrips the supply of labor, industry executives say.
Other colliding factors have had an impact, too. As more women took traditional corporate jobs outside the home, schools eliminated home-economics programs, which were a steppingstone to becoming a professional tailor or seamstress. More recently, the explosion in popularity of resale clothing and the growing use of GLP-1 drugs for weight loss have created more need for nipping and tucking what is in people’s closets.
“These are all trends that require more tailored clothing,” Nordstrom’s Esquivel said.
U.S. tailors numbered about 18,500 in 2024, a nearly 30% drop from a decade ago, according to the Bureau of Labor Statistics. In 1997, there were almost twice as many. Federal data show the typical annual wage for a dressmaker is about $43,000, but some tailors and seamstresses can make more.
Jenny Robbins, 61 years old, recently joined Nordstrom after completing the Fashion Institute’s program. It is her latest reinvention after starting her career as a math teacher, working as a tutor for Princeton Review and then becoming a pattern maker for designer Anna Sui after taking a few sewing classes.
Robbins says she learned to operate industrial sewing machines, which stitch much faster than home machines, create blind hems where the stitching is essentially invisible, and can cuff a blazer.
“There is no shortage of work,” she said.
The lack of tailors and sewers has also been a blow to reviving apparel manufacturing in the U.S.
Cindie Husbands opened an apparel manufacturer in Las Vegas in 2013 but closed it in 2021 partly due to a lack of trained sewers, she said. [...]
In November, Husbands founded the American Tailors and Sewing Association, which aims to create a standardized, scalable training and certification model for the industry.
“Tailoring is one of the oldest skilled trades in the world,” she said. “Yet the pathway has almost vanished in a single generation.”
by Suzanne Kapner, Wall Street Journal | Read more:
Image: uncredited
[ed. No kidding, try finding a good tailor or seamstress these days. It's nearly impossible (or they're booked for weeks). What a lost art. My grandmother, aunties, mom... everyone used to sew (and awesomely well! I think they were all competing against each other), all kinds of clothes, and beautiful quilts and pillows, placemats, whatever... it was Art. Now those lessons seem to be fading, maybe not everywhere, but surely here in the US.]
Organs on Demand
The first heart transplant was not greeted with universal applause. Shortly after Dr. Christiaan Barnard performed the procedure in 1967, people bombarded his hospital in South Africa with letters that characterized the doctor as a butcher and a ghoul. A fellow cardiologist likened the operation to a form of cannibalism. Many people criticized Barnard for picking one life over another and playing God.
It did not take long for most of this criticism to dissipate. Within a couple of years, the public became accustomed to the idea of heart transplants, then came to welcome them. Last year, about 10,000 people worldwide had heart transplants, while nearly 165,000 people received a kidney, liver, lung or pancreas.
There would be far more organ transplants if there were more viable organs available. Which brings us to the next medical and ethical quandary that society may soon face.
A three-year-old startup named Kind Biotechnology has begun work on what it calls an integrated organ network, or ION. This acronym undersells what Kind is making, which is a collection of organs that can be grown inside of an animal’s womb and then harvested for transplantation. Cue the gasps from some and the cheers from others.
By creating a series of genetic edits, Kind can alter the development of an embryo so that it forms organs without also forming limbs, a central nervous system and brain. The result is a group of organs growing in the womb. It sounds like science fiction, but Kind has already done this hundreds of times in mice and now rats, according to Justin Rebo, the company’s founder and CEO.
In the months ahead, Kind plans to expand its technology to larger mammals like pigs and possibly sheep with the hopes of producing organs good enough to endure the transplantation process. One day, Rebo expects that humans might be able to use these animal-grown organs to deal with medical emergencies and to help people live longer.
“We’re working on a platform to build abundant organ medicine, which we believe is a path not only to treating organ failure, but eventually to being more broadly medically useful and even impacting human lifespan,” Rebo says. “The point of medicine is to make people live longer and healthier lives. That’s what it’s always been. And that’s what we’re working on.”
Tens of thousands of people languish waiting for viable organs each year. Scientists have been attempting to solve this problem for decades by trying to create individual organs in their labs. In some cases, they take the cells of an organ and then coax them into developing more fully to make, say, a lab-grown kidney or liver. Companies like United Therapeutics and eGenesis have also been editing the genes of pig organs to make them more suitable for human use.
While there has been some success with these approaches, Rebo considers them too basic and limited to produce the full complement of organs that humans need. He contends that you can’t create the best organs in isolation and that they need to develop alongside each other. “The heart relies on the kidney to modulate the system environment in the right way to allow it to live and grow,” he says. “And both rely on the lungs and the liver and so forth, and both need access to nutrients, which is provided by the intestines.”
Rebo is a doctor and scientist with a long history in the biotech and longevity fields. And he’s not alone on this quest to create organs inside of what could be called headless bodies. R3 Bio, co-founded by John Schloendorn and Alice Gilman, is pursuing similar technology, although without much detail as of yet. Gilman has talked about trying to create animal models that could be used for medical testing so that researchers would no longer need to experiment on living, conscious mammals like primates. RenewalBio in Israel is also believed to be working in this area, trying to build organs from a patient’s own cells. (Schloendorn and Rebo were previously collaborators.)
Before even getting to the ethical considerations of Kind’s technology, there are myriad practical, scientific matters to confront.
[ed. Not sure the technology is as advanced or straightforward as they'd have you to believe, but you can see where it's heading.]
by Ashlee Vance, Core Memory | Read more:
Image: uncredited
Power, Not Economic Theory, Created Neoliberalism
Neoliberalism didn’t win an intellectual argument — it won power. Vivek Chibber unpacks how employers and political elites in the 1970s and ’80s turned economic turmoil into an opportunity to reshape society on their terms.
Neoliberalism’s victory over Keynesianism wasn’t an intellectual revolution — it was a class offensive. To roll it back, the Left doesn’t need to win an argument so much as it needs to rebuild working-class institutions from the ground up. [...]
Melissa Naschek: Neoliberalism in general is a pretty hot topic right now among researchers, and one of the most common lenses is to focus on the role of ideas, theories, and thinkers in establishing neoliberalism.
The last time we talked about this topic, you dispelled a lot of common misconceptions about what it is and what it’s not. One of the questions that we’ve gotten a lot from listeners since then is, where does neoliberalism come from?
Vivek Chibber: Yeah, it’s very topical, but it’s also important for the Left, because getting to the crux of this helps us understand where and how important changes in economic regimes and models of accumulation come from. So it’s good for us to get into it in some more depth. [...]
* [ed. Historical discussion of Keynesianism vs. Neoliberalism.]
That little story tells you something. What it says is ideas that are going into the halls of power go through certain filters. And the filters are essentially the policy priorities that the politicians have already committed to. Now, what creates those priorities? It’s the balance of class power. Social forces are setting the agenda.
If the social forces, that is, say, trade unions and community organizations, have set the agenda for politicians such that they think the only rational thing to do is to institute a welfare state, then they will bring in economists who help them design a welfare state. That gives intellectual influence to those economists. Economists who are saying “Get rid of this whole thing” are cast out into the wilderness. That’s how it works. [...]
Melissa Naschek: How do theories that focus on this notion that ideas and thinkers caused neoliberalism suggest a certain set of solutions to neoliberalism?
Vivek Chibber: It’s a really good point and a very good question. It gets us back to the issue of, why should we care about this? What does it matter if you misunderstand the factors that go into a change in economic policies? What does it matter if you wrongly attribute influence to ideas, let’s say, over material interests? Well, it can lead you to propose wrong solutions.
This is a very good example of that. If you think that what’s behind dramatic shifts in policy is the influence of ideas per se, the brilliance of those ideas, then, if you think that neoliberalism is a catastrophe and we need to go back to social democracy, then your solution is going to be, “Let’s get some economists or political scientists who are really good theorists of social democracy and give them publicity — put them in newspapers, give them lots of op-eds, maybe try to get them a meeting in the White House or something like that.”
But if you think that what’s really driving these changes is the social balance of power — the power balance between capital and labor, between rich and poor — then you won’t pour your energies into getting the right people entrĂ©e into the halls of power. You’ll pour your energies into changing the class balance. That’s the difference between how people on what used to be called the Left approach these issues and the way in which mainstream theorists and thinkers approach these issues.
This kind of ideas-based analysis leads to a great man version of policy change, whereby you get the right person in the right place with the right ideas. And then, counterfactually, the reason we don’t have a desired change is that we haven’t managed to get the right people with the right ideas into the right places. That’s a great man theory of historical change.
But if you are a socialist on the Left, you know ideas get their salience because of the background conditions, the social context, and the power relations. They don’t get their influence because of simple brilliance, at least when it comes to politics. Science is a different matter. But in politics, they get their influence because some agency with social power gives them the platform.
Without that, I mean, if the power of ideas mattered and if the correctness mattered, we’d already have a social democratic government, and we would have had one for decades. Because not only are these good ideas, we think in our arrogance; they appeal to everybody.
Zohran Mamdani’s ideas, Bernie Sanders’s ideas, are not radical the way the New York Times is constantly hammering that these are radical fringe ideas. They’re mainstream as can be. They are ideas that appeal to the majority.
Why do they not have entrĂ©e? Why do they not have political influence right now? It’s because the balance of class power is such that even though they appeal to the largest number of people, those people have no political organization. They have no way of effectuating their demands. And so, their demands as encapsulated in Sanders and Mamdani don’t have a lot of political influence.
So ideas can matter, but they have to be made to matter.
Vivek Chibber: The mere fact that such ideas exist does not in any way give them influence. The question for us, for socialists and for the Left is, when do ideas gain influence?
It’s a profound methodological error, I think, when you ask the question, “Where did neoliberalism come from?” to look at the contemporary theorists or the contemporary advocates of neoliberalism and then, because they are influential today, trace the origins of their ideas back to where they first started and say, that is where the origins come from.
Melissa Naschek: How important was this debate in establishing or causing neoliberalism?
Vivek Chibber: Not even the least bit. It was largely irrelevant to it. In other words, even if this debate had never happened, even if Milton Friedman had not existed, even if Hayek had not existed, you would have still had a turn to neoliberalism, and that’s the key. This is what the Left needs to understand.
This does not in any way invalidate the intellectual project of tracing those ideas. It’s intellectually interesting. It’s an interesting fact that those ideas had been around for forty years, and they had no impact on policy. Some historians have done great work tracing these ideas back to their origin, but it’s quite another to say that it was the ideas themselves that in the 1970s and ’80s caused the turn to neoliberalism.
Now, it’s an easy mistake to make because when the change came, the change was justified with a highly technical economic apparatus, and people like Friedman were given the stage to say not just that these policies are desirable for political reasons, but that they make a lot of economic sense and that it’s rational to do it this way. That gives you the sense, then, that it’s these particular individuals and their intellectual influence on the politicians that makes the politicians make the changes.
But in fact, the order of causation is exactly the other way around. It’s the politicians who make the changes based on criteria that have nothing to do with the technical sophistication of the ideas or their scientific validity. They make the changes because of the political desirability of those changes, and then they seek out advice on a) justifying the changes so that the naked subservience to power is not visible or obvious — it makes it look like it was done for highfalutin’ reasons — and then b) of course, they do legitimately say, “OK, now that we’re committed to this, help us work it out.”
Melissa Naschek: Right, especially because as long as you’re still in capitalism, you’re going to be facing constant economic crises. Even if you’re instituting a new regime, you’re going to be constantly looking for new solutions.
Vivek Chibber: Yeah. And even short of crises, you’re going to look for ways of making the policies work smoothly. And you’re going to look for ways of coming up with the correct balance of instruments and policies within them. So you bring in Milton Friedman or you bring in somebody else.
Surface level, it looks like what’s driving the whole thing is these ideas. But I said to you that the ideas actually have no role to play in the turn itself. So that brings up the question, what does? Why did they do it then?
I just said a second ago that what drove it was political priorities, not intellectual feasibility. Well, what were the political priorities? Who were the politicians actually listening to? Ideas can matter, but they have to be made to matter.
There are only two key players when it comes to policy changes of this kind. The key players are the politicians, because they’re the ones who are pulling the levers. But then, it’s the key constituency that actually has influence over the politicians.
The least important part is intellectuals. You might say voters have some degree of influence, but really, in a money-driven system like the United States, it’s investors, it’s capitalists — it’s big capital. They’re the ones who are pushing for these changes.
That means that if you want to understand where neoliberalism comes from, or rather if you want to understand why it came about, the answer is, it came about because capitalists ceased to tolerate the welfare state.
Now, why did they tolerate the welfare state at all? Most people on the Left understand the welfare state was brought about through massive trade union mobilization and labor mobilizations and was kept in place as long as the trade union movement had some kind of presence within the Democratic Party, within the economy more generally, because those unions were powerful enough, employers had to figure out a way of living with them. Part of what they did to live with the trade unions was to agree to a certain measure of redistribution and a certain kind of welfare state. As long as that was the case, politicians kept the welfare state going.
This is why, in that era from the mid-1930s to the mid-1970s, Keynesianism or the economics of state intervention of some kind was the hegemonic economic theory. The theory became hegemonic because it was given respectability by virtue of the fact that everybody in power was using it. Because it’s being used by people in power, it has great respectability.
This is why, in the 1950s and ’60s, Milton Friedman was in the wilderness — same guy, same ideas, equally intellectually attractive, equally technically sophisticated, but he was in the wilderness.[...]
Labels:
Business,
Critical Thought,
Economics,
Education,
Government,
history,
Politics
I'm Just a Sound
One Sunday morning recently I listened, one after the other, to Monteverdi’s Selva morale e spirituale (1641) and the Beach Boys’ Pet Sounds (1966), and it wasn’t in any way jarring. I have to say, though, that it was by Pet Sounds that I felt truly transported. Between July 1965 and April 1966, the 23-year-old Brian Wilson wrote, arranged, produced and sang on songs including ‘Don’t Talk (Put Your Head on My Shoulder)’, ‘Caroline, No’, ‘I Just Wasn’t Made for These Times’ and ‘God Only Knows’. All around three minutes long or a bit less, they can make you feel as if you are standing alone in a cathedral, bathed in sound. Wilson was able to make pop music that was uplifting without ever being sickly. Secular hymns baited with pop hooks; heavy themes made exquisitely light. ‘His progressions are always going up, then pausing before they go up again, like they’re going towards God,’ says a musician quoted in David Leaf’s liner notes to The Pet Sounds Sessions (1997).
From ‘In My Room’ (1963) to songs like ‘’Til I Die’ (1971) and ‘Sail on, Sailor’ (1973), the Beach Boys made music that for some of us has become a kind of gospel. This may seem a large and baffling claim if what you see in your mind’s eye when someone mentions them is an image of leathery old guys in Hawaiian shirts, or if all you know of their music is zippy hits like ‘Fun Fun Fun’, ‘Barbara Ann’ and ‘I Get Around’. Yet there is a logic here. Rock’n’roll was born from the uneasy tension between Saturday night and Sunday morning, church pew and dance floor, showing out and making things right with God. After those beginnings, pop and rock would go on to supply plenty of carnal jolt, but far fewer intimations of the sacred.
To an extent rivalled only by the Beatles, the Beach Boys have become the tales told about them, the ever expanding archive, the cornucopia of box sets, the shelves of books. It’s easy enough to see why. This is a tale stuffed with unlikely heroes and monstrous villains, which moves back and forth between glorious sunshine and the depths of despair. Many of the people in it – abusers, exploiters, bad magi – are not rounded or sympathetic figures; it sometimes seems as if everyone is trying to become the worst possible version of themselves. Here are Eugene Landy, Charles Manson, Phil Spector, Murry Wilson. Then there are the scarcely believable transformations of the boy-child Brian Wilson. How did he jump through the hula hoops of novelty pop to arrive, in the blink of an ‘I’, at a place where it seemed perfectly natural to come up with the idea of writing a pop music suite embodying the four elements?
Everything in this story is multiple and contradictory. No fact is secure, no testimony certain: all is apocrypha, surmise and legend. Over the years, the principals have offered wildly different readings of the same events, none more so than the prodigy at the heart of it all. In his brisk, canny, entertaining book Surf’s Up – a summa theologica of Beach Boy lore and legend – Peter Doggett sums it up: ‘As ever with Brian and the past ... the details altered sharply in each new telling.’ Sometimes during the course of the same interview. As if this or that reminiscence were simply one more of the musical ‘feels’ he said flowed through his head. There are even two starkly different Brian Wilson memoirs. [...]
Murry Wilson, the father of Brian and his two brothers, fellow Beach Boys Carl and Dennis, was not by all accounts an easy man to love. He was a businessman, but his dream life was dominated by the siren call of music. It nagged at him that his talent as a composer and songwriter wasn’t getting its due. Why weren’t his melodies heard everywhere? He was snappish, sniping, volatile, and doled out violent punishments to his three sons. The only time he wasn’t angry was when he could be soothed by the syrupy sounds of easy listening music. Off the back of his sons’ success he would eventually release his own LP, The Many Moods of Murry Wilson (1967) – the title is apt. The middle Wilson, Dennis, took the brunt of Murry’s physical abuse, but Brian, first born and most gifted, was the one in the dangerous position of being able to realise his father’s dreams. The ire of a disappointed god: anything you do will be either too good or not good enough. In this eggshell atmosphere, while the boys’ mother, Audree, rustled up huge amounts of anaesthetic food – hyperactive Dennis was the only one who didn’t pile on the pounds – Brian taught Carl and Dennis to sing in harmony; this, he later reflected, ‘brought peace to us’.
Brian studied Bach and Beethoven, and learned to trust in the healing balm of counterpoint. And like Beethoven, who also had two brothers and a violent, overbearing father, he was slowly going deaf. ‘Before he entered his teens,’ Doggett explains, ‘Brian’s parents noticed that he tended to talk out of one side of his mouth and would turn his head around to pick up sounds or voices that came from the opposite side of his head. Tests were carried out, and it was determined that he enjoyed less than 20 per cent hearing in his right ear.’ There seems never to have been an official diagnosis. All we know is that Brian didn’t seem to hear like anyone else. As with Beethoven, his partial deafness and the ringing in his ears didn’t hinder his work as a composer, but it did make live performance a living hell and caused him to withdraw slowly from the hubbub of social life. The crossroads moment took place high up in the air: in December 1964 Wilson was flying to Houston to start a tour when he had some kind of convulsive breakdown. Too much pressure, in both senses. Things that make your head go pop. He no longer wanted to be up on stage with all the feedback and screaming.
The recording studio made possible new ways of listening. Tiny increments of syllable and sound to juggle. Listen to tapes of Wilson working in the studio and you can hear just how precise and in control he is: this is the one place where he knows who he is and what he wants. Did he ever sound more sensual than when he delivered the lines ‘I can hear so much/in your sighs’ and ‘Listen, listen, listen ...’ from ‘Don’t Talk (Put Your Head on My Shoulder)’? ‘Music became his language of choice,’ Doggett writes, ‘with which he was far more articulate than he ever was with words.’ You could say things to girls you could never say in real life. You could conjure up swells, plateaux, shivers; the sound of the sun coming up over the sea. Like many a Romantic man, his way of feeling intimacy is via something cloudy, oceanic, mountain-top. The nearest faraway place.
One of the mythic promises of rock’n’roll was escape to a place where the action was and where you could maybe find others like yourself. But there would be no such getaway for the Beach Boys – no big city salvation, no yellow brick road. They would live and die in LA. The Beach Boys didn’t scour snow-strafed city streets looking for old blues 78s. They idolised the very ‘square’ barbershop quartet the Four Freshmen; Wilson wrote a song called ‘Be True to Your School’. They were not, in a word, cool. They didn’t leave home, didn’t mooch, didn’t stray: they were already in the teen fantasy promised land. In Hawthorne, south-west Los Angeles, everything was on their doorstep, including their future bandmates: livewire cousin Mike Love; high-school classmate Al Jardine; long-time neighbour David Marks.
The Beach Boys, like many of the new bands of that era, sprang out of a local scene with its own heroes, slang, fashion. In this bright diurnal paradise, four of their early singles were hymns to a local leisure pursuit/metaphysical quest: ‘Surfin’’, ‘Surfin’ Safari’, ‘Surfin’ USA’, ‘Surfer Girl’. But then there suddenly appeared the achingly introspective ‘In My Room’. Co-written with Gary Usher, it’s a swerve away from the world of the drive-in, the burger place, the drag strip into a wholly/holy inner world where the singer can ‘lie awake and pray’. It’s a vulnerable song about the desire to be invulnerable. ‘You’re not afraid when you’re in your room,’ Wilson once said. The recording studio was his other panic room. It was somewhere you could explore a spectrum of emotional tones, as heard in early songs like ‘Lonely Sea’ (1963), ‘Don’t Worry Baby’ (1964), ‘The Warmth of the Sun’ (1964) and, most of all for me, the near-perfect pop record ‘Guess I’m Dumb’ (1965), written, arranged and produced by Wilson, sung by a young Glen Campbell.
Wilson would soon become notorious for how much time he took to record things, but at this early stage everything was a blur. There were ten Beach Boys studio albums between 1962 and 1965. There were no maps, no precedents; their de facto manager and ‘appropriate adult’ at this point was their irascible, interfering father. Brian Wilson may have had his mood swings but he was, in his own way, quite sturdy. Something you begin to notice, leafing through all the Beach Boys books, is how strapping the teenage Brian looks in high-school snaps and how sporty too; this was no neurasthenic squirt. He also had a reputation for being a bit of a cut-up. A twelfth-grade report card reveals that he got an A in Physical Education, a B in Senior Problems (Personal Psychology) and only a C for Piano and Harmony.
Wilson was famously not a surfer: he may have held business meetings in his swimming pool and set up his piano in a sand pit, but he had to be dragged into the sea as if he was undergoing aversion therapy. The Beach Boys’ early hits were the sound of everything to do with surfing, absent the sensation of surfing itself. Surfers try to control unpredictable swells and curls, seeking moments of transcendence, measured in seconds or a few short minutes – just like the pop music Wilson was about to unleash on the world.
by Ian Penman, London Review of Books | Read more:
Image: uncredited
Labels:
Culture,
history,
Music,
Psychology,
Relationships
Wednesday, April 22, 2026
The Secret History of Wakanda
The history of Wakanda is not, of course, an African history; it’s a history of Europe, and of Europe’s fantasies about Africa.
This hidden kingdom is first attested in Book V, Chapter VIII of Pliny the Elder’s Natural History, on the ‘countries on the other side of Africa.’ As Pliny ventures further from the known world of the Mediterranean, and into the depths of Africa, the peoples he describes are drawn with a lighter and lighter brush. He can’t quite say what these people are, but only what they lack. Nightmares live here, in the hot voids of the world: [...]
But then, after this list of fantastic degenerations, we meet something different. Pliny describes a kind of African Utopia:
At the centre of the region of Æthiopia we may find the source of the Nile, guarded by a kingdom called Vicindaria, so called for its many conquests. The Vicindariæ are ruled by their philosophers; and if Pelagon of Rhodes is to be believed their libraries contain all that can be known in the useful crafts. Among their marvels are flying chariots, drawn by certain spinning serpents; fine silks that protect the body like armour; trees bearing glowing fruit with which they light their houses; and great towers made of brass and iron. Their cities are arranged in circles, like those of the Etruscans; at the centre of each stands a library which is also a temple to their God and his son. In all their affairs they are orderly and virtuous; solemn are their laws and just are their judges, and all men live in amity with one another. The Vicindariæ are the ancestors of the Egyptians and the Numidians, and by some accounts, the fathers of all men. But Pelagon says that they have withdrawn from their troublesome children, have no intercourse with the peoples of the world, and no longer set off on voyages over the oceans or to the Moon; preferring to perfect their knowledge in seclusion, their kingdom can not be found by foreigners.

Where did this idea come from? And how did Pliny appear to describe helicopters, skyscrapers, and the electric lightbulb? Pelagon of Rhodes was a Greek geographer of the second century BC; frustratingly, one of our only surviving sources for his works is Pliny himself. Maybe the story stretches back further; maybe the Greeks had nursed this legend of a distant, magical kingdom for centuries. It’s been suggested that the army of Memmon in Arctinus Milesius’ lost Aethiopis might have some relation to the myth; so too might the Homeric gods’ repeated habit of flying off to visit Ethiopia. We will probably never know.
We do know that in Pliny’s time, Vicindaria was widely believed to be real. Sixteen years before the Natural History was published, the emperor Nero sent a praetorian expedition down the White Nile, to find its source and establish relations between Rome and Vicindaria, for future trade and possible conquest. Seneca, as Nero’s tutor, had commissioned the voyage, and he reports its findings in his Natural Questions:
There we found not towers of bronze or wondrous libraries, but only marshes, the limit of which even the natives did not know, and no one else could hope to know, so completely was the river entangled with vegetable growth, so impassable the waters by foot, or even by boat, since the muddy overgrown marsh would bear only a small boat containing one person.

Nero’s expedition may have reached present-day Uganda: the furthest Roman legions ever travelled into equatorial Africa. Europeans made no further efforts to contact the hidden kingdom of Vicindaria for another thousand years.
This is not to say that the story was forgotten. Pliny’s account was reproduced in the Etymologies of Isidore of Seville; among early medieval writers the most significant part of the narrative was the reference to ‘their God and his son.’ Centuries before Christ, these people were Christian. In 687 AD, the heresiarch Caelestius of Aquitaine was burned for insisting that Christ had been born twice, once to the Vicindariae and once to the rest of the world, but that the Vicindariae, being wise, had not killed him. Small communities of Caelestians survived in the Pyrenees for another two hundred years, claiming to follow a purer, African version of Christianity, in which redemption can be achieved without blood. (They rejected the name Caelestians, and preferred to call themselves the ‘Good Whites’ instead.)
by Sam Kriss, Numb at the Lodge | Read more:
Image: uncredited
Humanism in a Posthumanist Age
Should we be surprised that Oxford University Press picked rage bait as the 2025 Word of the Year? Defining the compound noun as “online content deliberately designed to elicit anger or outrage by being frustrating, provocative, or offensive,” the chair of the selection committee explained that the point of the annual exercise is “to encourage people to reflect on where we are as a culture, who we are at the moment, through the lens of words we use.”
Quite clearly, we are not in a very nice place. When anger is the prime motivator, you can be sure that it feeds on other dark emotions, including fear, suspicion, and resentment. The ubiquity of anger also speaks to the widespread demoralization of late-modern society, evident in the grim statistics pertaining to depression, addiction, suicide, and other deaths of despair. Perhaps the darkest fear bubbling up through our culture is that humans themselves are replaceable—or at least in need of drastic biotechnological upgrading if they hope to keep up with the cool efficiency of their machines. The causes of our unhappy cultural condition are, as social scientists say, multifactorial and overdetermined, with some researchers placing the onus on our highly unsocial social media and others more sensibly arguing that the new media only amplify and reinforce trends and pathologies long in the making. We see the signs everywhere, from the decay of basic good manners and civility to gratuitously violent and crude entertainments to the mistreatment of working people as disposable units of production to actual acts of unspeakable cruelty inflicted on those we deem to be lesser or other, particularly those strangers whom we were long ago enjoined to treat as our neighbors. If we were still capable of the emotion, we would be ashamed of ourselves. But the loss of shame is another hallmark of our current condition, and as columnist George Will recently observed, “A nation incapable of shame is dangerous, not least to itself.”
What we are witnessing today is less a degradation of politics—though it is also that—than a meta-political and profoundly cultural swerve away from the informing humanist idealism of the modern liberal democratic project. When Tomáš Garrigue Masaryk, the first president of Czechoslovakia, defined democracy as “the political form of the humane ideal,” he was emphasizing the inseparability of a set of political practices and institutions from a broader humanizing effort drawing on the richest ethical, intellectual, and religious traditions of the West. More important than the rivalrous claims of the partisan participants in a liberal democracy was a shared national commitment to the various goods that sustain a decent human life. Give that a thought. Masaryk was no proto-globalist, no “We Are the World” sentimentalist. He regarded the democratic nation as the indispensable crucible in which the humane ideal could be practically instantiated in the treatment of one’s fellow citizens. Though no utopian, he believed the betterment of the shared human condition was the raison d’être of the democratic nation-state.
Masaryk’s idealism, and the exuberant hopes of Czechoslovakia’s fledgling democracy, were for a time snuffed by another variety of nationalist whose goose-stepping troops marched into the Sudetenland in 1938, first annexing that rich northwestern region before moving on to absorb most of the remainder of the country within a year. Far from advancing the humane ideal, this conquering zero-sum nationalist would have no qualms about eliminating some three hundred thousand undesirable elements from his newly acquired territory. Here and elsewhere, he saw it as fundamental to his project of purifying the Reich, ridding it of human garbage, and making Germany great again.
Lessons learned, lessons forgotten. So today, when we utter words such as humanitarian, humane ideal, humanism, we may have trouble suppressing the ironic smirk or the dismissive yawn. Fine words, but what is their real purchase when so much is being done to diminish, transform, transcend, and even surpass the merely human? [...]
The cost of the obliteration of the humane ideal in our time is incalculable. The stakes are nothing less than civilizational—meaning the civilization of the West and other civilizations that value the sacredness and inviolability of the individual human person. The challenge is ultimately about resisting those authoritarians who, now empowered by the most advanced articulation of the Machine, aim to crush the merely human for the sake of absolute power and control.
Image: Human Figure (detail), 1921, by Vilmos Huszár (1884–1962)
by Jay Tolson, The Argument | Read more:
You'll Regret It
Human beings have manic episodes; when it happens to an entire nation we call it empire. The affliction is the same. You prance around town with your tits practically pouring out your top, demanding drinks from strangers, snatching cigarettes out their hands. Isn’t it funny how I can do absolutely anything I want? And everybody loves me? You know you have a special destiny in the world. It’s obvious; flowers turn their faces towards you whenever you walk past. You’re going to save the world by sniffing coke off a stranger’s frenulum. And other people don’t understand, they’re all such bummers, they take things so personally, when really it was just a joke. In fact the whole world is a joke, none of it’s really serious, this great primary-coloured playground built for your delight. Sometimes in the brief moments you’re alone you can hear laughter, not coming from anyone in particular, not laughing at anything you can name, just the manic chattering laughter of the entire universe, flooding the silence. Lately you’ve been getting in fights. You’ve been winning them all. You’ve been stumbling into casinos and putting it all on red, emptying out your bank account, taking unsecured loans, putting it all on red and winning every time. God loves you more than he loves other people, he loves you in a different way. Maybe in an erotic way. Maybe you’re interested. You’ve been buying precious stones, rubies and sapphires; you keep them in your pockets. Sometimes people tell you that one day you’re going to wake up in hospital again, or jail, again, or in a pool of your own blood and vomit, or maybe not at all. They’re wrong. That happens to other people. It will never, ever happen to you.
I like American optimism. Not everyone does. A lot of people from long-vanished empires claim to find it unbearable; it reminds them of what they no longer have. But I like it. There’s something ridiculous about an American who tries to hate their own country, like a dog trying to walk on two legs. They don’t know what it means to wake up and curse the grey skies and poisoned soil of Splugovina, this place that closes around you like a tomb. They can rage against the slavery and genocide, but it’s still with that bright, feverish, all-American gleam in the eye. The only way an American can really encounter pessimism is by hiring a British person to perform it for them. That’s what I do, basically. It’s a living.
One good thing about Europe is we’ve all already been through it all. Here, every miserable dirt-poor republic had its century in the sun. Today, Splugovina is a dreary landlocked country of eight million people that produces sunflower seeds, insulated cables, and zinc-bearing ores, but for a brief period in the fifteenth century the glorious Splug Empire stretched clear across the continent. The crowned heads of Europe came to kneel and give tribute. After that, it’s true, there was the War of the Quintuple Alliance, and all the cities were razed, and maybe forty percent of the population starved in the fields, but there are still some very impressive ruins in the hills. That time is never coming back, though. All you can do now is put up a bunch of gaudy statues to the conquering heroes, make genocidal chants at football games. Remember, with a kind of lazy black bitterness, the days when the world was made of sugar and you were mad. [...]
The problem, though, is the corollary to all this charming American exuberance, which is the repeated bouts of mass murder. It comes in cycles. A few years of screaming bloodlust until it all blows up in your face, and then you spend the next few years at home drinking wine out the bottle and wailing over the unfairness of the world, before finally straightening your back, giving one last sniff, and bravely stepping outside to once again club someone’s children to death. I used to think some kind of progress was possible here. I used to have something called the Iraq War Theory of Divorce in Hollywood Films. The theory says that if a film features a male lead character who gets divorced or separated from his main romantic interest, and it came out before 2005 or so, by the end he will have cajoled his ex back into bed and they’ll live happily ever after. Liar Liar, The Parent Trap, Eternal Sunshine of the Spotless Mind. If it came out after 2005, by the end he will have learned to accept the situation, moved on, and found someone new. A total bloodbath in the Middle East, maybe a million people shot or blown up or tortured to death with power tools, so you can learn that hey, sometimes things don’t work out the way you want them to, and hey, sometimes that’s ok. But all these things are temporary. Don Quixote got a decade of sanity between volumes before the rabbit poison started glittering in his eyes and he was babbling about knight errantry again. America got less than half.
Four years after the last American troops left Afghanistan under Taliban guard, war critic JD Vance was on the TV, saying that while he understood why people were put off by the last round of wars in the Middle East, ‘the difference is that back then we had dumb presidents, and now we have a president who actually knows how to accomplish America’s national security objectives.’ The dumb presidents, the ones who blundered around getting America into quagmires, still always held back from directly attacking Iran. The smart president is Donald Trump. [...]
So far, the war is going very well. It’s called Operation Epic Fury. Operation Epic Badass Ninja Pirate. Organs of state keep issuing public statements that say things like ‘Kill without hesitation, avenge without mercy’ and ‘You say death to America, we say America will be your death.’ They’re having no problems killing anyone they want to kill. Iran might be a proud and ancient civilisation with a historical memory stretching back six thousand years, but right now it’s an easily broken toy in the hands of an empire that can barely remember the day before yesterday. But somehow, the power to kill anyone at will isn’t enough. Things are not going according to plan. As far as I can tell, the plan was this. As soon as Israel and America eliminated the Supreme Leader, the entire Islamic Republic would disintegrate like an alien invasion fleet once the mothership’s been hit. At this point the Iranian people would fill the streets, overthrow the mullahs, and immediately start signing up for an OnlyFans account. Obviously these are early days, but it doesn’t look like things are going to plan. Something very different is happening. Decapitating the Islamic Republic has not shut it down. Instead, individual IRGC units are all operating autonomously, using their own mobile and highly fluid command structures. Instead of a single enemy, there’s now a swarm. No central authority to negotiate with even if you wanted to. A headless zombie Iran, the wreckage of a six-thousand-year-old state spewing ballistic missiles in every direction. Missiles falling on Saudi oil refineries, Bahraini radar installations, on the matcha labubu sexual slavery camps of Dubai. You thought all those CGI skyscrapers meant you were abstracted from geography, but this is still the Middle East. Meanwhile the revolutionaries have not yet shown up in the streets of Tehran. 
Possibly because the people most likely to overthrow the regime already tried that in January, and the regime killed or imprisoned them all. It might not happen. The Islamic Republic is a bad government, possibly the worst government anywhere on the face of the earth, but it’s being attacked by children making plane noises.
by Sam Kriss, Numb at the Lodge | Read more:
Image: uncredited
Labels:
Crime,
Culture,
Government,
History,
Journalism,
Military,
Politics,
Psychology,
Security
Tuesday, April 21, 2026