
Sunday, March 8, 2026

The China Vibe Shift

A year ago came what, for lack of a better term, we dubbed the DeepSeek moment. That was followed fairly quickly by the curious migration of “TikTok refugees” to Xiaohongshu, and not long after that by the first conversations Jeremy Goldkorn and I had about what felt like a changing American — or even Western — mood toward China.

Today, freshly back from Switzerland after covering the World Economic Forum (where the chatter was, not surprisingly, fixated on Trump’s covetous pronouncements on Greenland and Mark Carney’s “rupture” speech), with Keir Starmer now in Beijing to continue talks about restoring some version of the UK–China “Golden Age,” it feels like a decent moment to look back and ask what, if anything, all of that amounted to.

Jeremy and I recorded a podcast episode in which we tried to describe something we were both sensing in the early months of 2025 but couldn’t quite pin down. It wasn’t a policy shift, or even a clear change in opinion. It was more atmospheric than that — a change in tone, in default assumptions, in the emotional register through which China was being discussed in Western discourse. We eventually settled, somewhat sheepishly, on calling it a “vibe shift.” (Less sheepishly, we reconvened in November to gloat about how we’d gotten that right!)

The phrase was imprecise and was intended to convey imprecision. But it did seem to capture something real. Multiple polls have since borne it out, and the feeling has only grown stronger. What’s become clearer to me, looking back, is how that shift relates to a larger argument I’ve been making for some time now — what I called the “Great Reckoning” in a piece I published in The Ideas Letter.

The two are not the same thing. The vibe shift is not the reckoning I’m looking for. But it may be making one more possible.

The change I’m describing is not a sudden outbreak of admiration for China, nor a reversal of long-standing concerns about human rights, political repression, or democracy (though admittedly I’ve seen some of that in some quarters). Those issues remain very much part of the picture. What’s changing is something more basic: the set of assumptions that have long structured how China is interpreted in Western public life.

For years, a relatively stable narrative did a lot of work. China’s successes were provisional; its failures were fundamental. Growth would eventually give way to crisis. Political liberalization was assumed to be inevitable, even if perpetually deferred. Moral condemnation often stood in for empirical assessment. China could be criticized without being fully understood, because history, it was assumed, would take care of the rest.

That narrative hasn’t exactly been replaced. One only has to look at how eagerly some commentators declared Party rule “brittle” following the purges of Zhang Youxia and Liu Zhenli, or how quickly far-fetched rumors were embraced, to see that the old habits die hard.

But the narrative has lost much of its force, mainly because the U.S. — Gaza to Greenland — no longer commands the moral authority it once assumed. Increasingly, when I hear it, it sounds less like analysis and more like reassurance. I know I’m not alone in this.

You can see this erosion in small but telling ways: in the growing reluctance to predict imminent collapse; in the uneasy acknowledgment that China is capable of building complex systems at scale; in the fact that younger audiences, and people closer to technology, manufacturing, or logistics, are less willing to treat China as a purely derivative or temporary phenomenon.

None of this amounts to endorsement. But it does suggest a loosening of reflexes.

A year of small shocks

The past year offered no shortage of moments that helped crystallize this shift.

The emergence of DeepSeek was only one of them. The reaction it provoked wasn’t really about a single large language model. It was about the dawning realization that China was not merely following at the technological frontier, but participating in shaping it. That realization sat awkwardly with long-standing assumptions about where innovation could — and could not — come from.

Then there was the strange but revealing episode of Western “TikTok refugees” making their way onto Xiaohongshu. Tens of thousands of users encountered a Chinese social media environment directly, without mediation by think tanks, policy papers, or cable news. The result wasn’t mass admiration so much as something more disarming: familiarity. China appeared less opaque, less exotic, and therefore harder to keep at a safe analytical distance. (In a strange coda to that episode a year on — not something I’ve looked into too closely, but from what I’m hearing — people are once again abandoning TikTok for Chinese apps, TikTok being under new and apparently very censorship-happy American management).

Around the same time, a steady trickle of firsthand accounts — from executives, engineers, investors, and travelers — described a China that didn’t fit neatly into prevailing narratives. Infrastructure that worked. Manufacturing ecosystems that functioned smoothly. A sense of momentum that was hard to reconcile with predictions of stagnation or decay.

Some of this material was shallow. A fair amount of the so-called “China-pilled” content circulating online is overwrought, unserious, or plainly wrong. I don’t endorse it. But even that excess is revealing. It suggests that people are groping, sometimes awkwardly, for ways to make sense of realities that just don’t fit the narrative they’ve been sold.

One of the stranger — and more amusing — expressions of this moment was described in a recent Wired piece by Zeyi Yang, who is always worth reading. Yang wrote about the sudden popularity of memes in which Americans announce that they are in “a very Chinese time” of their lives: drinking hot water (which I do endorse), wearing slippers in the house, posting videos of themselves eating dim sum, sporting vaguely Chinese-coded streetwear, or joking about “Chinamaxxing.”

The joke, as Yang notes, is not really about China, and certainly not about Chinese people. It’s a projection — a way of gesturing at something Americans feel they’ve lost.

The meme works precisely because it’s unserious. No one is actually becoming Chinese. But the impulse behind it is telling. China, in this memified version, functions less as a real place than as a symbolic contrast: a stand-in for competence, momentum, coherence, or simply “things getting done,” set against a backdrop of crumbling infrastructure, normalized dysfunction, and institutional paralysis at home.

That selectivity is the point. The meme is disposable, ironic, and easily reversed. It allows people to flirt with an alternative without committing to understanding it. In that sense, it’s less a sign of admiration than of dissatisfaction — a sideways commentary on American malaise, filtered through a half-ironic orientalist lens.

I wouldn’t read too much into it. But I wouldn’t dismiss it either. Cultural detritus often reflects shifts in mood before more formal discourse catches up.

The reckoning beneath the surface

This is where the connection to the “Great Reckoning” comes in — and where it’s easy to sound more portentous than necessary.

The reckoning I have in mind isn’t really about China. It’s about us. More specifically, it’s about a long-standing Western habit of assuming that modern outcomes — wealth, tech sophistication, state capacity — are inseparable from Western political forms. When things don’t line up that way, the tendency has been to assume something must be temporary, distorted, or unsustainable.

China’s rise has been awkward for that story. Not because it offers the West some appealing alternative model — I don’t think it does — but because it keeps producing results that are hard to dismiss without contortions. Over time, this has encouraged a set of coping strategies: predictions of imminent collapse, confident talk of inevitable convergence, and a habit of substituting moral judgment for careful description.

For a while, that worked. Or at least it postponed the need for a harder conversation...

That’s what I mean by the vibe shift. Not that people have settled on a new story, but that the old one is starting to creak loudly enough to be noticed.

In that sense, the shift is preparatory. It doesn’t tell us what to think next. It just makes it harder to keep thinking the same way.

by Kaiser Y. Kuo, Sinica | Read more:
Image: via
[ed. I've got nothing against China, it's just doing what any superpower would do, looking out for its interests, expanding its sphere of influence for economic and security reasons, and attempting to preserve its history, culture and political system. See also: The Civilization Trap (Sinica). And, in case you missed it, Why Everyone Is Suddenly in a ‘Very Chinese Time’ in Their Lives (Wired). Oh, and this: China's power grid investments to surge to record $574 billion in 2026-2030. Maybe people are just envious that China is investing in its future, while the US self-destructs and spends trillions of dollars on military weapons and war mongering.]

Tuesday, March 3, 2026

The Irsay Collection/Auction

Kurt Cobain’s famed Fender is part of $1 billion collection going to auction

In the summer of 1991, Nirvana filmed the music video for “Smells Like Teen Spirit” on a Culver City sound stage. Kurt Cobain strummed the grunge anthem’s iconic four-chord opening riff on a 1969 Fender Mustang, Lake Placid Blue with a signature racing stripe.

Nearly 35 years later, the six-string relic hung on a gallery wall at Christie’s in Beverly Hills as part of a display of late billionaire businessman Jim Irsay’s world-renowned guitar collection, which heads to auction at Christie’s, New York, beginning Tuesday. Each piece in the Beverly Hills gallery, illuminated by an arched spotlight and flanked by a label chronicling its history, carried the aura of a Renaissance painting.

Irsay’s billion-dollar guitar arsenal, crowned “The Greatest Guitar Collection on Earth” by Guitar World magazine, is the focal point of the Christie’s auction, which has split approximately 400 objects — about half of which are guitars — into four segments: the “Hall of Fame” group of anchor items, the “Icons of Pop Culture” class of miscellaneous memorabilia, the “Icons of Music” mixed batch of electric and acoustic guitars and an online segment that compiles the remainder of Irsay’s collection. The online sale, comprising various autographed items, smaller instruments and historical documents, offers the items at the lowest price points.

A portion of auction proceeds will be donated to charities that Irsay supported during his lifetime.

Cobain’s Fender was only one of the music history treasures nestled in Christie’s gallery. A few paces away, Jerry Garcia’s “Budman” amplifier, once part of the Grateful Dead’s three-story high “Wall of Sound,” perched atop a podium. Just past it lay the Beatles logo drum head (estimated between $1 million and $2 million) used for the band’s debut appearance on “The Ed Sullivan Show,” which garnered a historic 73 million viewers and catalyzed the British Invasion. Pencil lines were still visible beneath the logo’s signature “drop T.” [ed. Also includes Eric Clapton's Martin acoustic guitar used on 'Unplugged'].

It is exceptionally rare for even one such artifact to go to market, let alone a billion-dollar group of them at once, Walker said. But a public sale enabling many to participate and demonstrate the “true market value” of these objects is what Irsay would have wanted, she added.

Dropping tens of millions of dollars on pop culture memorabilia may seem an odd hobby for an NFL general manager, yet Irsay viewed collecting much like he viewed leading the Indianapolis Colts.

Irsay, the youngest NFL general manager in history, said in a 2014 Colts Media interview that watching and emulating the legendary NFL owners who came before him “really taught me to be a steward.”

“Ownership is a great responsibility. You can’t buy respect,” he said. “Respect only comes from you being a steward.”

The first major acquisition in Irsay’s collection came in 2001, with his $2.4-million purchase of the original 120-foot scroll for Jack Kerouac’s 1957 novel, “On the Road.” He loved the book and wanted to preserve it, Walker said. But he also frequently lent it out, just like he regularly toured his guitar collection beginning 20 years later.

“He said publicly, ‘I’m not the owner of these things. I’m just that current custodian looking after them for future generations,’ ” Walker said. “And I think that’s what true collectors always say.”

At its L.A. highlight exhibition, Irsay’s collection held an air of synchronicity. Paul McCartney’s handwritten lyrics for “Hey Jude” hung just a few steps from a promotional poster — the only one in existence — for the 1959 concert Buddy Holly, Ritchie Valens and J.P. “The Big Bopper” Richardson were en route to perform when their plane crashed. The tragedy spurred Don McLean to write “American Pie,” about “the day the music died.”

Holly was McCartney’s “great inspiration,” Christie’s specialist Zita Gibson said. “So everything connects.”

Later, the Beatles’ 1966 song “Paperback Writer” played over the speakers near the guitars on which the song was written. [...]

Another fan-favorite is the “Wilson” volleyball from 2000’s “Cast Away,” starring Tom Hanks, estimated between $60,000 and $80,000, Gibson said.

Historically, such objects were often preserved by accident. But as the memorabilia market has ballooned over the last decade or so, Gibson said, “a lot of artists are much more careful about making sure that things don’t get into the wrong hands. After rehearsals, they tidy up after themselves.”

by Malia Mendez, Los Angeles Times/Seattle Times |  Read more:
Image: Cover Images/ZUMA Press/TNS
[ed. Mentioned this in a previous post but still can't believe what's here.]

Sunday, March 1, 2026

Jimi Hendrix Was a Systems Engineer

Jimi Hendrix Was a Systems Engineer. He precisely controlled modulation and feedback loops (IEEE Spectrum).
Image: James Provost
[ed. Everything was new and primitive back then. Jimi pushed these new tools to their limits.]

Monday, February 23, 2026

Chicago Gets a Lift

Walking down the magnificent streets of downtown Chicago, towering skyscrapers on all sides of you, you probably couldn’t guess the incredible scheme the city carried out in the area some 160 years before.

They lifted the whole city up in the air.

Between four and fourteen feet. Buildings, streets and all. Straight up, using hydraulic jacks and jackscrews.

It was a titanic feat of engineering, imagination and sheer moxie. And it might just say a lot about that early Chicago character.

... buildings were lifted up using jackscrews and the occasional hydraulic lift. And we’re not just talking houses. Entire masonry buildings were raised in the air. Eventually, they even figured out how to raise an entire block at once. They placed 6000 jackscrews under the one-acre block between Lake, Clark and LaSalle streets, estimated at 35,000 tons in weight, and raised the whole thing over four days—buildings, sidewalks and all. The process was gradual enough that business continued in the buildings throughout.
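The figures quoted above imply a surprisingly modest load on each screw, which is what made the scheme workable. A rough check, using only the numbers in the passage (the per-screw arithmetic is mine, not the article's):

```python
# Rough load check for the Lake/Clark/LaSalle block lift,
# using the figures quoted above: 6,000 jackscrews under an
# estimated 35,000-ton block.

TOTAL_WEIGHT_TONS = 35_000
NUM_JACKSCREWS = 6_000

load_per_screw = TOTAL_WEIGHT_TONS / NUM_JACKSCREWS
print(f"Average load per jackscrew: {load_per_screw:.1f} tons")
# Under 6 tons per screw -- a load a large 19th-century iron
# jackscrew could bear, which is why thousands of screws, each
# turned a fraction of a turn at a time, could raise a whole
# block gradually and evenly over four days.
```

The design insight is distribution: no single jack had to do anything heroic, so the lift could proceed slowly enough that business inside the buildings never stopped.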

Not every building went through the process. Not because it was too difficult, but because some of the buildings no longer fit with where the city was going. But waste not, want not. They put these old wooden buildings on rollers and drew them by horse to the edges of town. Of course, the enterprising owners of businesses operating in these buildings didn’t want to miss out on business, so many continued to serve customers even as the buildings were rolling down the street.

by Illinois Office of Tourism |  Read more:
Image: uncredited
[ed. Man, they really got things done back then. See also: American water is too clean (WIP).]

Sunday, February 22, 2026

Embryo Selection Company Herasight Goes All In On Eugenics

Multiple commercial companies are now offering polygenic embryo selection on a wide range of traits, including genetic predictors of behavior and IQ. I’ve previously written about the methodological unknowns around this technology but I haven’t commented on the ethics. I think having a child is a very personal decision and it’s not my place to tell people how to do it. But the new embryo selection company, Herasight, has started advocating for eugenic societal norms that I find disturbing and worth raising alarm over. Because this is a fraught topic, I’ll start with some basic definitions.

What is eugenics?

Eugenics is an ideology that advocates for conditioning reproductive rights on the perceived genetic quality of the parents. Francis Galton, the father of eugenics, declared that eugenics’ “first object is to check the birth-rate of the Unfit, instead of allowing them to come into being”. This goal was to be achieved through social stigma and, if necessary, by force. The Eugenics Education Society, for instance, advocated for education, segregation, and — “perhaps” — compulsory sterilization to prevent the “unfit and degenerate” from reproducing.

A core component of defining “the unfit” was heredity. Eugenicists are not just interested in improving people’s phenotypes — a goal that is widely shared by modern society — but the future genotypic distribution. The genetic stock. This is why eugenic policies historically focus on sterilization, including the sterilization of unaffected relatives who harbor the genotype but not the phenotype. If someone commits a crime, they face time in prison for their actions, but under eugenic reasoning their law-abiding sibling or child is also suspect and should be stigmatized (or forcibly prevented) from passing on deficient genetic material.

A simple two-part test for eugenics is then: (1) Is it concerned with the future genetic stock? (2) Is it advocating for restricted reproduction, either through stigma or force, for those deemed genetically inferior?

Is embryo selection eugenics?

I have publicly resisted applying the “eugenics” label to embryo selection writ large and I continue to do so. Embryo selection is a tool and its use is morally complex. A couple can choose to have embryo screening for a variety of reasons ranging from frivolous (“we want to have a blue eyed baby”) to widely supported (“we carry a recessive mutation that would be fatal in our baby”), none of which have eugenic intent. Embryo selection can even be an anti-eugenic tool, as in the case of high-risk couples who have already decided against having children. If embryo selection technology allows them to lower the risk to a comfortable level and have a child they would otherwise have avoided, then the outcome is literally the opposite of eugenic selection: “unfit” individuals (at least as they see themselves) now have an incentive to produce more offspring than they would have. In practice, IVF remains a physically and emotionally demanding procedure, and my guess is that individual eugenic intentions — the desire to select out unfit embryos with the specific motivation of improving the “genetic stock” of the population — are exceedingly rare.

Is Herasight advocating for eugenics?


While I do not think embryo selection is eugenic in itself, like any reproductive technology, it can be wielded for eugenic purposes. The new embryo selection company Herasight, in my opinion, is advocating for exactly that. To understand why, it is useful to first understand the theories put forth by Herasight’s director of scientific research and communication Jonathan Anomaly (in case you’re wondering, that is a chosen last name). Anomaly is a self-proclaimed eugenicist [Update: Anomaly has clarified that this description was not provided by him and he requested that it be removed].

Prior to joining Herasight, Anomaly wrote extensively on the ethics of embryo selection, notably in a 2018 article titled “Defending eugenics”. How does Anomaly defend eugenics? First, he reiterates the classic position that eugenics is a resistance to the uncontrolled reproduction of the “unfit” (emphasis mine, throughout):
Darwin argued that social welfare programs for the poor and sick are a natural expression of our sympathy, but also a danger to future populations if they encourage people with serious congenital diseases and heritable traits like low levels of impulse control, intelligence, or empathy to reproduce at higher rates than other people in the population. Darwin feared that in developed nations “the reckless, degraded, and often vicious members of society, tend to increase at a quicker rate than the provident and generally virtuous members”
Anomaly goes on to sympathize with Darwin’s position and that of the classic eugenicists, arguing that “While Darwin’s language is shocking to contemporary readers, we should take him seriously”, later that “there is increasingly good evidence that Darwin was right to worry about demographic trends in developed countries”, and that we should “stop allowing [the Holocaust] to silence any discussion of the merits of eugenic thinking”.

Anomaly then proposes several potential eugenic interventions, one of which is a “parental licensing” scheme that prevents unfit parents from having children:
The typical response is for the state to step in and pay for all of these things, and in extreme cases to remove children from their parents and put them in foster care. But it would be more cost-effective to prevent unwanted pregnancies than treating their consequences, especially if we could achieve this goal by subsidizing the voluntary use of contraception. It may also be more desirable from the standpoint of future people.
The phrase “future people” figures repeatedly in Anomaly’s writing as a euphemism for the more conventional eugenic concept of genetic stock. This connection is made explicit when he explains the most compelling reason for supporting parental licensing:
The most compelling reason (though certainly not a decisive reason) for supporting parental licensing is that traits like impulse control, health, intelligence, and empathy have significant genetic components. What matters is not just that some parents are unwilling or unable to take care of their children; but that in many cases they are passing along an undesirable genetic endowment.
What are we really talking about here? Anomaly has proposed a technocratic rebranding of eugenic sterilization: instead of taking away your reproductive rights clinically, the state will take away your reproductive license and, if you still have children, impose “fines or other costs” (though Anomaly does not make the “other costs” explicit, eugenic sterilization is mentioned as an example in the very next sentence). How would the state decide who should lose their license? Anomaly explains:
For a parental licensing scheme to be fair, we would need to devise criteria that are effective at screening out only parents who impose significant risks of harm on their children or (through their children) on other people.
A fundamental normative principle of our society is that all members are created equal and endowed with unalienable rights. What Anomaly envisions instead is a society where the state can seize one of the most intimate of human freedoms — the right to become a parent — based on innate factors. How does the state determine whether a future child imposes significant risk on future people? By inspecting the biological makeup of the parents and identifying “undesirable genetic endowments” that will harm others “through their children”. This is a policy built explicitly on genetic desirability and undesirability, where those deemed genetically unfit are stripped of their rights to have children and/or fined for doing so — aka bog-standard coercive eugenics.

Today, Anomaly is the spokesperson for a company that screens parents for “undesirable genetic endowments” and, for a price, promises to boost their genetic desirability and their value to future people. It is easy to see how Herasight fits directly into the eugenic parental licensing scheme Anomaly proposed. Having an open eugenicist as the spokesperson for an embryo selection company seems, to me, akin to hiring Hannibal Lecter to do PR for a hospital, but perhaps Anomaly has radically changed his views since billing himself as a eugenicist in 2023?

Herasight (with Anomaly as first author) recently published a perspective white paper on the ethics of polygenic selection, from which we can glean their corporate position. The perspective outlines the potential benefits and harms of embryo selection. The very first positive benefit listed? The “benefits to future people”. While this section starts with a focus on the welfare of individual children, it ends with the same societal motivations as classical eugenics: the social costs of the unfit on communities and the benefits of the fit to scientific innovation and the public good: [...]

When eugenics goes mainstream

Let’s review: eugenics has as a goal of limiting the birthrate of the “unfit” or “undesirable” for the benefit of the group. Anomaly describes himself as a eugenicist and explicitly echoes this goal through, among other policies, a parental licensing proposal. Anomaly now runs a genetic screening company. The company recently published a perspective paper advocating for the stigmatization of “unfit” parents who do not screen. Anomaly, as spokesperson, reiterates that their goal is indeed eugenics — “Yes, and it’s great!”. With any other person one could argue that they were clueless or trolling; but if anyone knows what eugenics means, it is a person who has spent the past decade defending it.

I have to say I am floored by how strange this all is. My personal take on embryo selection has been decidedly neutral. I think the expected gains are limited by the genetic architecture of the traits being scored and the companies are mostly fudging the numbers to look good. As noted above, I also think a common use of this technology will be to calm the nerves of parents who otherwise would have gone childless. So I have no actual concerns about changes to the genetic make-up of the population or genetic inequality or any of the other utopian/dystopian predictions. But I am concerned that the marketing around the technology revives and normalizes classic eugenic arguments: that society is divided into the genetically fit and the genetically unfit, and the latter need to be stigmatized away from parenthood for the benefit of the former. I am particularly disturbed by the giddiness with which Anomaly and Herasight have repeatedly courted eugenics-related controversy as part of their launch campaign.

Even stranger has been the response, or rather non-response, from the genetics community. Social science geneticists and organizations spent the past decade writing FAQs warning against the use of their methods and data for individual prediction and against genetic essentialism. Many conference presentations and seminars start with a section on the sordid history of eugenics and the sterilization programs in the US and Nazi Germany, vowing not to repeat the mistakes of the past. Now, a company is openly advocating for eugenics (in fact, a company with direct connections to these social science organizations) and these organizations are silent. It is hard not to conclude that the FAQs and warnings were just lip service. And if the experts aren’t raising alarms, why would the public be alarmed?

by Sasha Gusev, The Infinitesimal |  Read more:
Image: Anselm Kiefer, Die Ungeborenen (The Unborn), 2002
[ed. With neophyte Nazis seemingly everywhere these days, CRISPR advances, and technocrats who want to live forever, it's perhaps not surprising that eugenics would be making a comeback. Update: Jonathan Anomaly, director of scientific research and communication for Herasight and whose articles I criticize here, responds in a detailed comment. I recommend reading his response together with this post. Anomaly’s role in the company has also been clarified. See also: Have we leapt into commercial genetic testing without understanding it? (Ars Technica).]

Friday, February 20, 2026

February 18, 2026: J.B. Pritzker State of the State Address

Today Illinois governor J.B. Pritzker delivered the State of the State address. The underlying purpose of the address is to explain the state budget, but Pritzker, a Democrat, used the occasion to talk far more broadly about the state of Illinois and the nation.

Pritzker anchored his speech by reaching back to the days of John Peter Altgeld, a German-born American who helped to lead the Progressive movement and served as governor of Illinois from 1893 to 1897. Altgeld oversaw passage of some of the strongest laws in the country for workplace safety and protection of child workers, invested heavily in education, and appointed women to important positions in state government despite the fact that women could not yet vote.

Pritzker noted that in his State of the State speech in January 1895, Altgeld talked about “the need to ensure that science would govern the practice of medicine in Illinois; the high cost of insurance; the condition of Illinois prisons; the funding of state universities; a needed revision of election laws; the concentration of wealth in large businesses.” Altgeld expressed pride in appointing women to office, stating that “[j]ustice requires that the same rewards and honors that encourage and incite men should be equally in reach of women in every field and activity.”

Pritzker said he brought up Altgeld’s defense of equal rights “to highlight one enduring human truth—injustice can become a genetic condition we bequeath on future generations if we fail to face it forthrightly.”

Pritzker then turned to the year that has passed since President Donald J. Trump took office. “To be perfectly candid,” Pritzker said, “as Illinois is one of the states whose taxpayers send more dollars to the federal government than we receive back in services, I was hoping that his threats to gut programs that support working families [were] the kind of unrealistic hyperbole that fuels a presidential campaign but then is abandoned when cooler heads prevail.” But, he said, “Unfortunately, there are no cooler heads at 1600 Pennsylvania Avenue these days.”

The Trump administration has cost Illinois $8.4 billion, Pritzker said, “illegally confiscating money that has already been promised and appropriated by the Congress to the people of Illinois.” Pritzker was clear that this money is not handouts but “dollars that real Illinoisans paid in federal taxes and that have been constitutionally approved by our elected Democratic and Republican representatives in Washington.”

Unlike the federal government, states must balance their budgets every year. Trump’s billions in illegally withheld funds inflict a cost on the state’s residents, while Illinois has been “forced to spend enormous time and taxpayer money going to court and fighting to get what is rightfully ours.” Pritzker said: “It is impossible to tally the hours, days, and weeks our state government has spent chasing news of Presidential executive orders, letters, and edicts that read like proclamations from the Lollipop Guild.” [...]

He noted the growth of Illinois’s economy and economic stability over the past eight years even as the state had balanced its budget every year and made historic investments in education, child welfare, disability services, and job creation in the private sector. In the past year, Illinois’s gross domestic product was more than $1.2 trillion, up from $881 billion when Pritzker took office.

Looking forward, Pritzker outlined plans to address the top three economic issues on the mind of most Americans: the cost of housing, electricity, and healthcare. He promised to reduce the cost of housing by cutting local regulations and providing more options for financing. He promised to address the skyrocketing cost of electricity first by pausing the authorization of new data center tax credits and then by investing in renewable energy and nuclear power. Finally, he announced that, as of this week, the state had eliminated $1 billion in medical debt for more than 500,000 people in the state by purchasing and erasing it for pennies on the dollar...

“I’m committed to doing everything government can to rein in the worst of the price gouging and profiteering we are seeing,” Pritzker said. “But I implore the titans of industry who regularly ask government to make their lives easier—what are you doing to make your employees’ and your customers’ lives easier?”

Then Pritzker turned to the crisis federal agents created on the streets of Chicago. “A year ago, I stood before you and asked a provocative question: After we have discriminated against, disparaged, and deported all our immigrant neighbors—and the problems we started with still remained—what comes next?” Pritzker said. He recalled that when he asked that question, some people walked out.

“But a year later, we have an answer—don’t we?” he said. “Masked, unaccountable federal agents—with little training—occupied our streets, brutalized our people, tear-gassed kids and cops, kidnapped parents in front of their children, detained and arrested and at times attempted to deport U.S. citizens, and killed innocent Americans in the streets.”

Pritzker identified Trump and White House deputy chief of staff Stephen Miller as the architects of that plan to “drip authoritarianism…into our veins.”

But, he noted, people in Illinois did not accept that authoritarianism.

Pritzker reminded the audience that President Grover Cleveland had similarly tried to “subdue the Illinois population with hired thugs” during the 1894 Pullman strike after the Pullman Company, which made railroad cars, cut workers’ wages by about 25%. When workers struck, Cleveland deputized U.S. Marshals to end the strike. They fired into crowds of bystanders and, according to a Chicago paper, “seemed to be hunting trouble.” Twenty-five people died and more were wounded before the strike ended.

Altgeld had opposed the arrival of federal troops, and his fury at their intrusion still smoldered when he gave his State of the State speech almost six months later. “If the President can, at his pleasure, send troops into any city, town, or hamlet…whenever and wherever he pleases, under pretense of enforcing some law,” Altgeld wrote, “his judgment, which means his pleasure being the sole criterion—then there can be no difference whatever in this respect between the powers of the President and those of...the Czar of Russia.”

Pritzker joked that he wished he “could spend just one year of my governorship presiding over precedented times. I yearn for normal problems,” he said. But these are not normal times.

“I’ve been thinking a lot lately about love—about loving people and loving your country and the power involved in both,” the governor said. “I know, right now, there are a lot of people out there who love their country and feel like their country is not loving them back. I know that.” But he told those people that “your country is loving you back—just not in the way you are used to hearing.”

“It’s not speaking in anthems or flags or ostentatious displays of patriotism. It will never come from the people who say the only way to love America is to hate Americans. Love is found in every act of courage—large and small—taken to preserve the country we once knew. You will find it in homes and schools and churches and art. It is there; it has not been squashed.”

by Heather Cox Richardson, Letters From An American |  Read more:
Image: via
[ed. Sounds good to me. The entire text of Governor Pritzker's speech can be found here. Really worth a full read.]

Tuesday, February 17, 2026

The Crisis, No. 5: On the Hollowing of Apple

[ed. No. 5 of 17 Crisis Papers.]

I never met Steve Jobs. But I know him—or I know him as well as anyone can know a man through the historical record. I have read every book written about him. I have read everything the man said publicly. I have spoken to people who knew him, who worked with him, who loved him and were hurt by him.

And I think Steve would be disgusted by what has become of his company.

This is not hagiography. Jobs was not a saint. He was cruel to people who loved him. He denied paternity of his daughter for years. He drove employees to breakdowns. He was vain, tyrannical, and capable of extraordinary pettiness. I am not unaware of his failings, of the terrible way he treated people needlessly along the way.

But he had a conscience. He moved, later in life, to repair the damage he had done. The reconciliation with his daughter Lisa was part of a broader moral development—a man who had hurt people learning, slowly, how to stop. He examined himself. He made changes. He was not a perfect man. But he had heart. He had morals. And he was willing to admit when he was wrong.

That is more than can be said for today's crop of corporate leaders.

It is this Steve Jobs—the morally serious man underneath the mythology—who would be so angry at what Tim Cook has made of Apple.

Steve Jobs understood money as instrumental.

I know this sounds like a distinction without a difference. The man built the most valuable company in the world. He died a billionaire many times over. He negotiated hard, fought for his compensation, wanted Apple to be profitable. He was not indifferent to money.

But he never treated money as the goal. Money was what let him make the things he wanted to make. It was freedom—the freedom to say no to investors, to kill products that weren’t good enough, to spend years on details that no spreadsheet could justify. Money was the instrument. The thing it purchased was the ability to do what he believed was right.

This is how he acted.

Jobs got fired from his own company because he refused to compromise his vision for what the board considered financial prudence. He spent years in the wilderness, building NeXT—a company that made beautiful machines almost no one bought—because he believed in what he was making. He acquired Pixar when it was bleeding cash and kept it alive through sheer stubbornness until it revolutionized animation.

When he returned to Apple, he killed products that were profitable because they were mediocre. He could have milked the existing lines, played it safe, optimized for margin. Instead, he burned it down and rebuilt from scratch. The iMac. The iPod. The iPhone. Each one a bet that could have destroyed the company. Each one made because he believed it was right, not because a spreadsheet said it was safe...

This essay is not really about Steve Jobs or Tim Cook. It is about what happens when efficiency becomes a substitute for freedom. Jobs and Cook are case studies in a larger question: can a company—can an economy—optimize its way out of moral responsibility? The answer, I will argue, is yes. And we are living with the consequences.

Jobs understood something that most technology executives do not: culture matters more than politics.

He did not tweet. He did not issue press releases about social issues. He did not perform his values for an audience. He was not interested in shibboleths of the left or the right. [...]

This is how Jobs approached politics: through art, film, music, and design. Through the quiet curation of what got made. Through the understanding that the products we live with shape who we become.

If Jobs were alive today, I do not believe he would be posting on Twitter about fascism. That was never his mode. [...]

Tim Cook is a supply chain manager.

I do not say this as an insult. It is simply what he is. It is what he was hired to be. When Jobs brought Cook to Apple in 1998, he brought him to fix operations—to make the trains run on time, to optimize inventory, to build the manufacturing relationships that would let Apple scale.

Cook was extraordinary at this job. He is, by all accounts, one of the greatest operations executives in the history of American business. The margins, the logistics, the global supply chain that can produce millions of iPhones in weeks—that is Cook’s cathedral. He built it.

But operations is not vision. Optimization is not creation. And a supply chain manager who inherits a visionary’s company is not thereby transformed into a visionary.

Under Cook, Apple has become very good at making more of what Jobs created. The iPhone gets better cameras, faster chips, new colors. The ecosystem tightens. The services revenue grows. The stock price rises. By every metric that Wall Street cares about, Cook has been a success.

But what has Apple created under Cook that Jobs did not originate? What new thing has emerged from Cupertino that reflects a vision of the future, rather than an optimization of the past?

The Vision Pro is an expensive curiosity. The car project was canceled after a decade of drift. The television set never materialized. Apple under Cook has become a company that perfects what exists rather than inventing what doesn’t.

This is what happens when an optimizer inherits a creator’s legacy. The cathedral still stands. But no one is building new rooms.

There is a deeper problem than the absence of vision. Tim Cook has built an Apple that cannot act with moral freedom.

The supply chain that Cook constructed—his great achievement, his life’s work—runs through China. Not partially. Not incidentally. Fundamentally. The factories that build Apple’s products are in China. The engineers who refine the manufacturing processes are in China. The workers who assemble the devices, who test the components, who pack the boxes—they are in Shenzhen and Zhengzhou and a dozen other cities that most Americans cannot find on a map.

This was a choice. It was Cook’s choice. And once made, it ceased to be a choice at all. Supply chains, like empires, do not forgive hesitation. For twenty years, it looked like genius. Chinese manufacturing was cheap, fast, and scalable. Apple could design in California and build in China, and the margins were extraordinary.

But dependency is not partnership. And Cook built a dependency so complete that Apple cannot escape it.

When Hong Kong’s democracy movement rose, Apple was silent. When the Uyghur genocide became undeniable, Apple was silent. When Beijing pressured Apple to remove apps, to store Chinese user data on Chinese servers, to make the iPhone a tool of state surveillance for Chinese citizens—Apple complied. Silently. Efficiently. As Cook’s supply chain required.

This is not a company that can stand up to authoritarianism. This is a company that has made itself an instrument of authoritarianism, because the alternative is losing access to the factories that build its products.

There is something worse than the dependency. There is what Cook gave away.

Apple did not merely use Chinese manufacturing. Apple trained it. Cook’s operations team—the best in the world—went to China and taught Chinese companies how to do what Apple does. The manufacturing techniques. The materials science. The logistics systems. The quality control processes.

This was the price of access. This was what China demanded in exchange for letting Apple build its empire in Shenzhen. And Cook paid it.

Now look at the result.

BYD, the Chinese electric vehicle company, learned battery manufacturing and supply chain management from its work with Apple. It is now the largest EV manufacturer in the world, threatening Tesla and every Western automaker.

DJI dominates the global drone market with technology and manufacturing processes refined through the Apple relationship.

Dozens of other Chinese companies—in components, in assembly, in materials—were trained by Apple’s experts and now compete against Western firms with the skills Apple taught them.

Cook built a supply chain. And in building it, he handed the Chinese Communist Party the industrial capabilities it needed to challenge American technological supremacy. [...]

So when I see Tim Cook at Donald Trump’s inauguration, I understand what I am seeing.

When I see him at the White House on January 25th, 2026—attending a private screening of Melania, a vanity documentary about the First Lady, directed by Brett Ratner, a man credibly accused of sexual misconduct by multiple women—I understand what I am seeing.

I understand what I am seeing when I learn that this screening took place on the same night that federal agents shot Alex Pretti ten times in the back in Minneapolis. That while a nurse lay dying in the street for the crime of trying to help a woman being pepper-sprayed, Tim Cook was eating canapés and watching a film about the president’s wife.

Tim Cook’s Twitter bio contains a quote from Martin Luther King Jr.: “Life’s most persistent and urgent question is, ‘What are you doing for others?’”

What was Tim Cook doing for others on the night of January 25th?

He was doing what efficiency requires. He was maintaining relationships with power. He was protecting the supply chain, the margins, the tariff exemptions. He was being a good middleman.

I am seeing a man who cannot say no.

This is what efficiency looks like when it runs out of room to hide.

He cannot say no to Beijing, because his supply chain depends on Beijing’s favor. He cannot say no to Trump, because his company needs regulatory forbearance and tariff exemptions. He is trapped between two authoritarian powers, serving both, challenging neither.

This is not leadership. This is middleman management. This is a man whose great achievement—the supply chain, the operations excellence, the margins—has become the very thing that prevents him from acting with moral courage.

Cook has more money than Jobs ever had. Apple has more cash, more leverage, more market power than at any point in its history. If anyone in American business could afford to say no—to Trump, to Xi, to anyone—it is Tim Cook.

And he says yes. To everyone. To anything. Because he built a company that cannot afford to say no. [...]

I believe that Steve Jobs built Apple to be something more than a company. He built it to be a statement about what technology could be—beautiful, humane, built for people rather than against them. He believed that the things we make reflect who we are. He believed that how we make them matters.

Tim Cook has betrayed that vision—not through malice, but by excelling in a system that rewards efficiency over freedom and calls it leadership. Through the replacement of values with optimization. Through the construction of a machine so efficient that it cannot afford to be moral.

Apple is not unique in this. It is exemplary.

This is what happens to institutions that mistake scale for strength, efficiency for freedom, optimization for wisdom. They become powerful enough to dominate markets—and too constrained to resist power. Look at Google, training AI for Beijing while preaching openness. Look at Amazon, building surveillance infrastructure for any government that pays. Look at every Fortune 500 company that issued statements about democracy while writing checks to the politicians dismantling it.

Apple is simply the cleanest case, because it once knew the difference. Because Jobs built it to know the difference. And because we can see, with unusual clarity, the precise moment when knowing the difference stopped mattering.

by Mike Brock, Notes From the Circus |  Read more:
Image: Steve Jobs/uncredited
[ed. Part five of a seventeen-part series titled The Crisis Papers. Check them all out and jump in anywhere. A+ effort.]

Sunday, February 15, 2026

The Jim Irsay Collection: Auction


Eric Clapton: The Martin 000-42 Acoustic Guitar Used For His Acclaimed Appearance on MTV Unplugged, 1992.
C.F. Martin & Company, Nazareth, Pennsylvania, 1939
via: Christie's, The Jim Irsay Collection: Hall of Fame
[ed. Insane music memorabilia auction.]

What Does “Trust in the Media” Mean?

Abstract

Is public trust in the news media in decline? So polls seem to indicate. But the decline goes back to the early 1970s, and it may be that “trust” in the media at that point was too high for the good of a journalism trying to serve democracy. And “the media” is a very recent (1970s) notion popularized by some because it sounded more abstract and distant than a familiar term like “the press.” It may even be that people answering a pollster are not trying to report accurately their level of trust but are acting politically to align themselves with their favored party's perceived critique of the media. This essay tries to reach a deeper understanding of what gives rise to faith or skepticism in various cultural authorities, including journalism.

In F. Scott Fitzgerald's 1920 novel This Side of Paradise, the main character, Amory, harangues his friend and fellow Princeton graduate Tom, a writer for a public affairs weekly:
“People try so hard to believe in leaders now, pitifully hard. But we no sooner get a popular reformer or politician or soldier or writer or philosopher … than the cross-currents of criticism wash him away. … People get sick of hearing the same name over and over.”

“Then you blame it on the press?”

“Absolutely. Look at you, you're on The New Democracy, considered the most brilliant weekly in the country. … What's your business? Why, to be as clever, as interesting and as brilliantly cynical as possible about every man, doctrine, book or policy that is assigned you to deal with.”1
People have “blamed it on the press” for a long time. They felt grave doubts about the press long before social media, at times when politics was polarized and times when it was not, and even before the broad disillusionment with established institutional authority that blossomed in the 1960s and 1970s, when young people were urged not to trust anybody “over thirty.” This is worth keeping in mind as I, in a skeptical mood myself, try to think through contemporary anxiety about declining trust, particularly declining trust in what we have come to call—in recent decades—“the media.”

As measured trust in most American institutions has sharply declined over the last fifty years, leading news institutions have undergone a dramatic transformation, the reverberations of which have yet to be fully acknowledged, even by journalists themselves. Dissatisfaction with journalism grew in the 1960s. What journalists upheld as “objectivity” came to be criticized as what would later be called “he said, she said” journalism, “false balance” journalism, or “bothsidesism” in sharp, even derisive, and ultimately potent critiques. As multiple scholars have documented, news since the 1960s has become deeper, more analytical or contextual, less fully focused on what happened in the past twenty-four hours, more investigative, and more likely to take “holding government accountable” or “speaking truth to power” as an essential goal. In a sense, journalists not only continued to be fact-centered but also guided by a more explicit avowal of the public service function of upholding democracy itself.

One could go further to say that journalism in the past fifty years did not continue to seek evidence to back up assertions in news stories but began to seek evidence, and to show it, for the first time. Twenty-three years ago, when journalist and media critic Carl Sessions Stepp compared ten metropolitan daily newspapers from 1962 to 1963 with the same papers from 1998 to 1999, he found the 1963 papers “naively trusting of government, shamelessly boosterish, unembarrassedly hokey and obliging,” and was himself particularly surprised to find stories “often not attributed at all, simply passing along an unquestioned, quasi-official sense of things.” In the “bothsidesism” style of news that dominated newspapers in 1963, quoting one party to a dispute or an electoral contest and then quoting the other was the whole of the reporter's obligation. Going behind or beyond the statements of the quoted persons, invariably elite figures, was not required. It was particularly in the work of investigative reporters in the late 1960s and the 1970s that journalists became detectives seeking documentable evidence to paint a picture of the current events they were covering. Later, as digital tools for reporters emerged, the capacity to document and to investigate became greater than ever, and a reporter did not require the extravagant resources of a New York Times newsroom to be able to write authoritative stories.

I will elaborate on the importance of this 1960s/1970s transformation in what follows, not to deny the importance of the more recent digital transformation, but to put into perspective that latter change from a top-down “media-to-the-masses” communication model to a “networked public sphere” with more horizontal lines of communication, more individual and self-appointed sources of news, genuine or fake, and more unedited news content abounding from all corners. Journalism has changed substantially at least twice in fifty years, and the technological change of the early 2000s should not eclipse the political and cultural change of the 1970s in comprehending journalism today. (Arguably, there was a third, largely independent political change: the repeal of the “fairness doctrine” by the Federal Communications Commission in 1987, the action that opened the way to right-wing talk radio, notably Rush Limbaugh's syndicated show, and later, in cable television, to Fox News.) Facebook became publicly accessible in 2006; Twitter was born the same year; YouTube in 2005. Declining trust in major institutions, as measured by surveys, was already apparent three decades earlier—not only before Facebook was launched but before Mark Zuckerberg was born.

At stake here is what it means to ask people how much they “trust” or “have confidence in” “the media.” What do we learn from opinion polls about what respondents mean? In what follows, I raise some doubts about whether current anxiety concerning the apparently growing distrust of the media today is really merited.

Did people ever trust the media? People often recall—or think they recall—that longtime CBS News television anchor Walter Cronkite was in his day “the most trusted man in America.” If you Google that phrase (as I did on October 11, 2021, and again on January 16, 2022) you immediately come up with Walter Cronkite. Why? Because a public opinion poll in 1972 asked respondents which of the leading political figures of the day they trusted most. Cronkite's name was thrown in as a kind of standard of comparison: how do any and all of the politicians compare to some well-known and well-regarded nonpolitical figure? Seventy-three percent of those polled placed Cronkite as the person on the list they most trusted, ahead of a general construct—“average senator” (67 percent)—and well ahead of the then most trusted politician, Senator Edmund Muskie (61 percent). Chances are that any other leading news person or probably many a movie star or athlete would have come out as well or better than Cronkite. A 1974 poll found Cronkite less popular than rival TV news stars John Chancellor, Harry Reasoner, and Howard K. Smith. Cronkite was “most trusted” simply because he was not a politician, and we remember him as such simply because the pollsters chose him as their standard.

Somehow, people have wanted to believe that somewhere, just before all the ruckus began over civil rights and Vietnam and women's roles and status, at some time just before yesterday, the media had been a pillar of central, neutral, moderate, unquestioning Americanism, and Walter Cronkite was as good a symbol of that era as anyone.

But that is an illusion.

by Michael Schudson, MIT Press Direct | Read more:
Image: Walter Cronkite/NY Post

Monday, February 9, 2026

Sunday, February 8, 2026

World War AI

How's that whole golden age thing going for you so far? That golden age of human leisure and wealth awaiting us in a world optimized for the thinking machines.

Are you working a bit less today, enjoying the early fruits of all this 'AI productivity'? Or are you somehow working longer, more stressful hours than ever?

Is it your sense that life is getting a little bit easier for the poor or the middle class or anyone other than the very rich as the 'AI revolution' arrives? Is it your sense that young people are a bit more hopeful about the future now that it's an 'AI economy'? Is it your sense that 'AI friends' are beginning to enrich our social lives? Is it your sense that goods and services are becoming more plentiful and cheaper as 'AI deflation' kicks in? Is it your sense that news is more informative and shows are more entertaining as 'AI content' spreads? Is it your sense that job prospects are improving as we enter an 'AI employment boom'?

Yeah. Same.

Honestly, I don't see how the carrot was ever going to work. It's just too at odds with our actual lived experience, even here in Fiat World where our reality is declared and announced to us. They're going to need the stick. They're going to need to tell us that national survival is at stake, that our enemies will triumph if we don't make the 'necessary sacrifices' to win this 'AI arms race'.

They're going to need a war.

Oh, maybe not an actual war, but the functional equivalent thereof, full of threats real and imagined and adversaries foreign and domestic. They're going to need World War AI...

The United States spent $296 billion over a roughly four-year period to fight World War II, which would translate to about $4 trillion in today's dollars.

At its peak (1943), the war effort accounted for 37% of US GDP, and no aspect of American life was untouched or unconstrained by the US government's reallocation of the three basic building blocks of economic activity -- labor, capital and energy (energy being my shorthand for all physical resources as well as the core input to mining, farming, manufacturing and transportation) -- and the enormous expansion of government's role in American society to carry out this reallocation. In particular, every aspect of consumer behavior was subordinated to the political will required to execute the war effort, a political will which created extreme shortages in the labor, capital and physical resources available to the consumer economy.

I think it's hard for Americans today to grasp both the level of consumer sacrifice that was required during World War II and the level of government propaganda 'nudge' involved in enforcing that consumer sacrifice. (...)


I mean, I'm guessing that the mother and child in the poster above, dressed in their perfectly matching frocks and radiating Stepford Wives aura, maybe did not have enough food the winter before? And if you think that it's 'encouraging political violence' to call someone a Nazi today for supporting fascist policies ... in 1943 the government would call you a Nazi if you didn't carpool.

I find these posters and broadsides from World War II pretty funny, like they're from some cartoon world, and I bet you do, too. But when you read the memoirs and economic histories of the WWII homefront, there's nothing cartoonish about it. These were hard times! Shortages of food, energy and labor created extreme cost-push inflation, like our Covid-era supply chain inflation but on steroids, to which the government responded with draconian price controls on EVERYTHING. And when price controls didn't work, meaning that when even a suppressed market failed to distribute enough calories to enough people to prevent widespread hunger if not starvation, the government abandoned market mechanisms altogether and instituted outright rationing on food, energy and other necessities.

At the same time, every bit of available domestic investment capital and savings (which are the same thing) was absorbed by the federal government and unavailable for the consumer economy. That meant that in addition to the extreme inflationary pressures from widespread shortages, there was ZERO economic growth from small and medium businesses, which were an even larger portion of American GDP back then than they are today. The only thing that kept the American economy from collapsing into a stagflationary disaster was the $4 trillion that the US government spent on manufacturing war materiel and -- hold this thought! -- the enormous number of new jobs created from that.

The same amount of inflation-adjusted money we spent on World War II -- somewhere between $4 trillion and $5 trillion -- is scheduled to be spent on AI and datacenter buildouts in the United States over the next four years.

Yes, our economy is proportionally bigger today, so this is 'only' something like 15% of US GDP ($30 trillion in 2025), but an economic mobilization of this magnitude will require a similarly massive reallocation of our fundamental economic building blocks -- labor, capital and energy -- especially capital and energy.
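The proportions quoted above can be sanity-checked with some quick arithmetic. This is a back-of-the-envelope sketch using only the figures given in the text (the $4.5 trillion is my assumed midpoint of the $4–5 trillion range), and it also highlights a nuance in the comparison: the WWII 37% figure was an annual peak share of GDP, while the ~15% figure compares four years of total AI spend against a single year's GDP.

```python
# Back-of-the-envelope check of the WWII vs. AI-buildout comparison,
# using only figures quoted in the essay above.

WWII_NOMINAL = 296e9   # total WWII spending in then-current dollars
WWII_TODAY = 4e12      # the essay's inflation-adjusted figure
AI_SPEND = 4.5e12      # assumed midpoint of the projected $4-5T, four-year spend
US_GDP_2025 = 30e12    # 2025 US GDP per the essay

# Cumulative inflation factor implied by the $296B -> ~$4T conversion
inflation_factor = WWII_TODAY / WWII_NOMINAL        # roughly 13.5x

# Total AI spend measured against one year's GDP (the essay's ~15%)
total_share = AI_SPEND / US_GDP_2025                # 0.15

# Annualized over the four-year buildout, for an apples-to-apples
# comparison with WWII's 37%-of-GDP annual peak
annual_share = (AI_SPEND / 4) / US_GDP_2025         # about 3.75% per year

print(f"implied inflation factor:   {inflation_factor:.1f}x")
print(f"total spend vs 1-year GDP:  {total_share:.0%}")
print(f"annualized share of GDP:    {annual_share:.2%}")
```

Even annualized, a sustained ~3.75% of GDP is an enormous reallocation; the crowding-out argument that follows does not depend on the AI buildout literally matching WWII's peak mobilization.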

On the capital side, it's difficult to communicate how much money this is over such a short period of time. As JPMorgan puts it in their magisterial research note on AI Capex financing, "The question is not which market will finance the AI-boom. Rather, the question is how will financings be structured to access every capital market.” Here's their chart for where they think the money will come from (slightly apples to oranges as this is global spend, not just US, but I figure 70-80% of this datacenter build is going to happen in the US, so it's essentially the same), and I'd call your attention to the $1.4 trillion attributed to "Need for Alternative Capital / Governments", which combines both our favorite financial topic du jour -- private credit -- with direct government subsidy/investment.

AI Capex - Financing The Investment Cycle (J.P.Morgan North America Fundamental Research, Nov. 10, 2025)

This is the necessary context for understanding OpenAI CFO Sarah Friar's recent comments at a Wall Street Journal conference that the company would 'welcome' a federal government 'backstop' on private debt financings of this datacenter buildout, as well as Sam Altman's unintentionally hilarious 5,000 word tweet to 'clarify' Friar's very clear and very correct and very intentional words...

Sarah Friar didn't 'misspeak' when she called for a federal backstop -- by which everyone means and intends a US Treasury guarantee -- on AI datacenter debt issuance, and she didn't need to 'phrase things more clearly'. She used exactly the right word to describe exactly the policy that OpenAI and Wall Street and every other participant in this $10 trillion ouroboros ecosystem desperately wants and frankly requires for this massive reallocation of capital to have a chance of succeeding.

I mean, a federal debt backstop is just the start. Within a couple of years -- and this is the point of the $1.4 trillion "Alternative Capital / Governments" item on the JPMorgan chart! -- the US government will need to allocate hundreds of billions of dollars directly to the AI buildout, maybe through defense appropriations, maybe through equity stakes, maybe through whatever. Otherwise, we're a good trillion dollars short in the funding required to make this work here in the US. All from additional borrowing and deficit spending, of course, just like in World War II when the federal debt skyrocketed to an amount that was 100% of GDP. What's different today, of course, is that the federal debt is already at World War II debt-to-GDP levels before the additional borrowing to support the AI buildout. Bottom line: whatever you think the future path of US debt-to-GDP looks like, you're too low.

The economic term for the impact of capital reallocation at this enormous scale is 'crowding out'. The public and private capital that is invested in or lent to the AI hyperscalers and their counterparties over the next four years is that much less public and private capital available to be invested in or lent to the rest of the economy. And while I'm sure most large B2B enterprises will find a way to at least get a taste of what's being poured into the AI buildout, small and medium enterprises will be mostly shut out and consumer-facing enterprises are going to be completely shut out.

The inevitable impact of a massive reallocation of capital away from the consumer economy is that consumer credit becomes more expensive (if it's available at all), capital-intensive consumer services like health insurance and homeowners insurance become more expensive (if they're available at all), consumers stop spending (especially the bottom 50%), and consumer-facing businesses stop hiring (if they're not actively cutting back).

Sound familiar? That's because what I'm describing isn't some maybe-projection of some hypothetical future. This is all happening already. This is all happening NOW.

by Ben Hunt, Epsilon Theory |  Read more:
Image: JP Morgan; US Govt.
[ed. Very much enjoy Mr. Hunt's essays. Unfortunately, only for subscribers these days. See also: This is the Great Ravine (ET):]
***
This is all going to get much worse before it gets any better.

In The Dark Forest, volume 2 of the Three-Body Problem science fiction trilogy, Cixin Liu mentions almost in passing a 50-year period of immense social upheaval, destruction and (ultimately) recovery across the globe. He never goes into the details of this period that he calls the Great Ravine. He basically just waves his hands at it and writes “yep, that happened”.

Why? Because the Great Ravine does not advance the plot.

It’s there. It happens. But there’s nothing to be gained by examining its events. Like the Cultural Revolution of Cixin Liu’s real-world history, the Great Ravine is ultimately just a tragic waste. A waste of time. A waste of wealth. A waste of lives. There is nothing to be learned from our time in the Great Ravine; it must simply be crossed.

And cross it we will.

Friday, January 30, 2026

HawaiÊ»i Could See Nation’s Highest Drop In High School Graduates

HawaiÊ»i Could See Nation’s Highest Drop In High School Graduates (CB)

Hawaiʻi is expected to see the greatest decline in high school graduates in the nation over the next several years, raising concerns from lawmakers and Department of Education officials about the future of small schools in shrinking communities.

Between 2023 and 2041, HawaiÊ»i could see a 33% drop in the number of students graduating from high school, according to the Western Interstate Commission for Higher Education. The nation as a whole is projected to see a 10% drop in graduates, according to the commission’s most recent report, published at the end of 2024.

Image: Chart: Megan Tagami/Civil Beat; Source: Western Interstate Commission for Higher Education

The Last Flight of PAT 25

Two Army helicopter pilots went on an ill-conceived training mission. Within two hours, 67 people were dead.

One year ago, on January 29, 2025, two Army pilots strapped into a Black Hawk helicopter for a training mission out of Fort Belvoir in eastern Virginia and, two hours later, flew it into an airliner that was approaching Ronald Reagan Washington National Airport, killing all 67 aboard both aircraft. It was the deadliest air disaster in the United States in a quarter-century. Normally, in the aftermath of an air crash, government investigators take a year or more to issue a final report laying out the reasons the incident occurred. But in this case, the newly seated U.S. president, Donald Trump, held a press conference the next day and blamed the accident on DEI hiring at the FAA under the Biden and Obama administrations. “They actually came out with a directive, ‘too white,’” he claimed. “And we want the people that are competent.”

In the months that followed, major media outlets probed several real-world factors that contributed to the tragedy, including staffing shortages at FAA towers, an excess of traffic in the D.C. airspace, and the failure of the Black Hawk to broadcast its location over ADS-B — an automatic reporting system — before the collision. To address this final point, the Senate last month passed the bipartisan ROTOR Act, which would require all aircraft to use ADS-B — “a fitting way to honor the lives of those lost nearly one year ago over the Potomac River,” as bill co-sponsor Ted Cruz put it.

At a public meeting on Tuesday, the National Transportation Safety Board laid out a list of recommended changes in response to the crash, criticizing the FAA for allowing helicopters to operate dangerously close to passenger planes and for allowing professional standards to slip at the control tower.

What has gone unexamined in the public discussion of the crash, however, is why these particular pilots were on this mission in the first place, whether they were competent to do what they were trying to do, what adverse conditions they were facing, and who was in charge at the moment of impact. Ultimately, while systemic issues may have created conditions that were ripe for a fatal accident, it was human decision-making in the cockpit that was the immediate cause of this particular crash.

This account is based on documents from the National Transportation Safety Board (NTSB) accident inquiry and interviews with aviation experts. It shows that, when we focus on the specific details and facts of a case, the cause can seem quite different from what a big-picture overview might indicate. And this, in turn, suggests different logical steps that should be taken to prevent such a tragedy from happening again.

6:42 p.m.: Fort Belvoir, Virginia

The whine of the Black Hawk’s engine increased in pitch, and the whump-whump of its four rotor blades grew louder, as the matte-black aircraft lifted into the darkened sky above the single mile-long runway at Davison Army Airfield in Fairfax County, Virginia, about 25 miles southwest of Washington, D.C.

The UH-60, as it’s formally designated, is an 18,000-pound aircraft that entered service in 1979 as a tactical transport aircraft, used primarily for moving troops and equipment. This one belonged to Company B of the 12th Aviation Battalion, whose primary mission is to transport government VIPs, including Defense Department officials, members of Congress, and visiting dignitaries. Tonight’s flight would operate as PAT 25, for “Priority Air Transport.”

Black Hawks are typically flown by two pilots. The pilot in command, or PIC, sits in the right-hand seat. Tonight, that role was filled by 39-year-old chief warrant officer Andrew Eaves. Warrant officers rank between enlisted personnel and commissioned officers; it’s the warrant officers who carry out the lion’s share of a unit’s operational flying. When not flying VIPs, Eaves served as a flight instructor and a check pilot, providing periodic evaluation of the skills of other pilots. A native of Mississippi, he had 968 hours of flight experience and was considered a solid pilot by others in the unit.

Before he took off, Eaves’ commander had discussed the flight with him and admonished him to “not become too fixated on his evaluator role” and to remain “in control of the helicopter,” according to the NTSB investigation.

His mission was to give a check ride to Captain Rebecca Lobach, the pilot sitting in the left-hand seat. Lobach was a staff officer, meaning that her main role in the battalion was managerial. Nevertheless, she was expected to maintain her pilot qualifications and, to do so, had to undergo a number of annual proficiency checks. Tonight’s three-hour flight was intended to get Lobach her annual sign-off for basic flying skills and for the use of night-vision goggles, or NVGs. To accommodate that, the flight was taking off an hour and 20 minutes after sunset.

Both pilots wore AN/AVS-6(V)3 Night Vision Goggles, which look like opera glasses and clip onto the front of a pilot’s helmet. They gather ambient light, whether from the moon or stars or from man-made sources; intensify it; and display the brightened image through each eyepiece. The eyepiece doesn’t sit directly on the face but about an inch away, so the pilot can look down under it and see the instrument panel.

Night-vision goggles have a narrow field of view, just 40 degrees compared to the 200-degree range of normal vision, which makes it harder for pilots to maintain full situational awareness. They have to pay attention to obstacles and other aircraft outside the window, and they also have to keep track of what the gauges on the panel in front of them are saying: how fast they’re going, for instance, and how high. There’s a lot to process, and time is of the essence when you’re zooming along at 120 mph while lower than the tops of nearby buildings. To help with situational awareness, Eaves and Lobach were accompanied by a crew chief, Staff Sergeant Ryan O’Hara, sitting in a seat just behind the cockpit, where he would be able to help keep an eye out for trouble.

The helicopter turned to the south as it climbed, then flew along the eastern shore of the Potomac until the point where the river makes a big bend to the east. Eaves banked to the right and headed west toward the commuter suburb of Vicksburg, where the lights of house porches and street lamps seemed to twinkle as they fell in and out of the cover of the bare tree branches.

7:11 p.m.: Approaching Greenhouse Airport, Stevensburg, Virginia

PAT 25 followed the serpentine course of the Rapidan River through the hills and farm fields of the Piedmont. At this point, Eaves was not only the pilot in command, but also the pilot flying, meaning that he had his hands on the controls that guide the aircraft’s speed and direction and his feet on the rudder pedals that keep the helicopter “in trim” — that is, lined up with its direction of flight. Lobach played a supporting role, working the radio, keeping an eye out for obstacles and other traffic, and figuring out their location by referencing visible landmarks.

Lobach, 28, had been a pilot for four years. She’d been an ROTC cadet at the University of North Carolina at Chapel Hill, which she graduated from in 2019. Both her parents were doctors; she’d dreamed of a medical career but eventually realized that she couldn’t pursue one in the Army. According to her roommate, “She did not have a huge, massive passion” for aviation but chose it because it was the closest she could get to practicing medicine, under the circumstances. “She badly wanted to be a Black Hawk pilot because she wanted to be a medevac unit,” he told NTSB investigators. After she completed flight training at Fort Rucker, she was stationed at Fort Belvoir, where she joined the 12th Aviation Battalion and was put in charge of the oil-and-lubricants unit. One fellow pilot in the unit described her to the NTSB as “incredibly professional, very diligent and very thorough.”

In addition to her official duties, Lobach served as a volunteer social liaison at the White House, where she regularly represented the Army at Medal of Honor ceremonies and state dinners. She was both a fitness fanatic and a baker, known for providing fresh sourdough bread to her unit. She had started dabbling in real-estate investments and looked forward to moving in with her boyfriend of one year, another Army pilot with whom she talked about having “lots and lots of babies.” She was planning to leave the service in 2027 and had already applied for medical school at Mount Sinai. Helicopter flying was not something she intended to pursue.

Though talented as a manager, she wasn’t much of a pilot. Helicopter flying is an extremely demanding feat of coordination and balance, akin to juggling and riding a unicycle at the same time. For Lobach, the difficulty was compounded by the fact that she had trained on highly automated, relatively easy-to-fly helicopters at Fort Rucker and then been assigned to an older aircraft, the Black Hawk L or “Lima” model, at Fort Belvoir. Unlike newer models, which can maintain their altitude on autopilot, the Lima requires constant care and attention, and Lobach struggled to master it. One instructor described her skills as “well below average,” noting that she had “lots of difficulties in the aircraft.” Three years before, she’d failed the night-vision evaluation she was taking tonight.

Before the flight, Eaves had told his girlfriend that he was concerned about Lobach’s capability as a pilot and that, skill-wise, she was “not where she should be.”

It’s not uncommon for pilots to struggle during the early phase of their career. But Lobach’s development had been particularly slow. In her five years in the service, she had accumulated just 454 hours of flight time, and she wasn’t clocking more very quickly. The Army requires officers in her role to fly at least 60 hours a year, but in the past 12 months, she’d flown only 56.7. Her superiors had made an exception for her because in March she’d had knee surgery for a sports injury, preventing her from flying for three months. The waiver made her technically qualified to fly, but it didn’t change the fact that she was rustier than pilots were normally allowed to become.

If she’d been keen on flying, she could have used every moment of this flight to hone her skills by taking the controls herself. But she was content to let Eaves do the flying during the first part of the trip.

Drawing near to Greenhouse Airport, a small, private grass runway near a plant nursery, they navigated via old-fashioned techniques — pilotage and dead reckoning — using landmarks, headings, and elapsed time to find their way from point to point. Coming in for their first landing of the night, they were looking for the airstrip’s signature greenhouse complex.

Lobach: That large lit building may be part of it.

Eaves: It does look like a greenhouse, doesn’t it?

Lobach: Yeah, it does, doesn’t it? We can start slowing back.

Eaves: All right, slowing back.

As they circled around the runway, Eaves commented that the lighting of the greenhouse building was so intense that it was blinding in the NVGs, and Lobach agreed. Eaves positioned the helicopter a few hundred feet above the landing zone and asked Lobach to show him where it was. After she did so correctly, he told her to take the controls. This process followed a formalized set of acknowledgements to make sure that both parties understood who was in control of the aircraft.

Eaves: You’ve got the flight controls.

Lobach: I’ve got the controls.

As Lobach eased the helicopter toward the ground, Eaves and Crew Chief O’Hara called out items from the landing checklist.

O’Hara: Clear of obstacles on the left.

Lobach: Thank you. Coming forward.

Eaves: Clear down right.

Lobach: Nice and wide.

Eaves: 50 feet.

Lobach: 30 feet.

They touched down. One minute and 42 seconds after passing control to Lobach, Eaves took it back again. As they sat on the ground with their rotor whirring, they discussed the fuel remaining aboard the aircraft and the direction they would travel in during the next segment of their flight. Finally, after six minutes, Eaves signaled that they were ready to take off again.

Eaves: Whenever you’re ready, ma’am.

Lobach: Okay, let’s do it.

Eaves’ deference to Lobach was symptomatic of what is known among psychologists as an “inverted authority gradient.” Although he was the pilot in command, both responsible for the flight and in a position of authority over others on it, Eaves held a lesser rank than Lobach and so in a broader context was her subordinate. In moments of high stress, this ambiguity can muddy the waters as to who is supposed to be making crucial decisions.

Eaves, Lobach, and O’Hara ran through their checklists, and Eaves eased the Black Hawk up into the night sky.

by Jeff Wise, Intelligencer |  Read more:
Image: Intelligencer; Photo: Matt Hecht
[ed. See also: Responders recall a mission of recovery and grief a year after the midair collision near DC (AP).]