
Sunday, January 25, 2026

Reflections on the 'Manosphere'

Andrew Tate Is the Loneliest Bastard on Earth

Every five years or so, there’s a changing of the guard in digital media. Platform empires rise and fall, subcultures come and go, trends ebb and flow.

In my estimation, we’re entering year two of the latest shift.

The decline of punditry and traditional political commentary is continuing apace from its boom during Covid lockdowns. Commentators who might have once staked out clear, binary positions—conservative or liberal—are drifting away from political debate altogether, moving toward a more parasocial model: building audiences around personality and the feeling of relationship, rather than argument.

It’s increasingly clear that writing is niche. We’re moving away from the age of bloggers and Twitter, and into the age of streaming and clip farming—short video segments, often ripped from longer content, optimized for sharing. (I’ve made this point many times now, but this is why in the world of right-wing digital media, characters like Nick Fuentes are emerging as dominant, whereas no-video podcasters, bloggers, and Twitter personalities receive less attention.)

Labels like “right” and “left” are better thought of as “right-coded” and “left-coded”: ways of signaling who you are and who you’re with, rather than actual positions on what government should do. The people still doing, or more accurately “playing,” politics are themselves experiencing a realignment, scrambling to figure out new alliances as the old divisions stop making sense. I’ve written previously about New Old Leftists and the “post-right,” a motley group of former right-wing commentators who are not “progressives” in the traditional sense, but take up progressive points of view specifically in dialogue with their disgust with reactionary elements of the right.

Anyway, in this rise of coded communities—where affiliation is about vibe and identity more than ideology—we’re seeing the Manosphere go mainstream again. Second time? Third?

The Manosphere—if you’re a reader of this blog who somehow doesn’t know—refers to a loose network of communities organized around men, masculinity, dating advice, and self-improvement, sometimes tipping into outright hostility toward women. These communities have been around on the fringes of the internet for years, though depending on your vantage point, their underlying ideas are either hundreds of years old or at least sixty.

Either way, they keep surfacing into broader culture.
***
The Manosphere as we know it today has at least two distinct antecedents. The first is the mid-twentieth-century convergence of pick-up artistry and men’s rights discourse: one responding to the Sexual Revolution and changing dating norms, the other developing in explicit opposition to second-wave feminism. These strands framed gender relations as adversarial, strategic, and zero-sum.

The second antecedent is the part that I hear people talk about less often. The Manosphere in so many ways is a Black phenomenon. I do not mean this as a racial claim about ownership or blame, nor am I referring narrowly to what is sometimes called the “Black Manosphere.” I mean something more specific: many of the aesthetic forms, masculine philosophies, and anxieties that the Manosphere treats as “newly” discovered were articulated in Black American communities decades earlier. These were responses to economic exclusion, social displacement, and the erosion of traditional routes to masculine status.

Someone on X made the good point that the viral clips of Clavicular’s Big Night Out—Andrew Tate, Nick Fuentes, Sneako, and company—felt like a child’s idea of not only masculinity, but wealth. The cigars, the suits, the VIP table, the ham-fisted advice about how you don’t take women out to dinner.

If you’ve read Iceberg Slim, or watched 1970s blaxploitation films like The Mack or Super Fly, the visual language is immediately recognizable. You’ve seen this figure before: the fur coat, the Cadillac Eldorado, the exaggerated display of wealth and control. The question is why that aesthetic originally looked the way it did.

In mid-century America, Black men were systematically excluded from the institutions through which wealth and status quietly accumulate: country clubs, elite universities, corporate ladders, inherited property. The GI Bill’s housing provisions were administered in ways that shut out Black veterans. Union jobs in the building trades stayed segregated. The FHA explicitly refused to insure mortgages in Black neighborhoods. Under those conditions, conspicuous display wasn’t vulgarity (at least, not primarily or exclusively)—it was one of the few available ways to signal success in a society that denied access to the kinds of prestige that don’t need to announce themselves. When wealth can’t whisper—as TikTok’s “old money aesthetic” crowd loves to remind us it should—it has to shout.

The modern Manosphere inherits this aesthetic, adopting the symbols as though they were universal markers of arrival rather than compensatory performances forged under exclusion. What began as a response to being locked out of legitimate power gets recycled, abstracted, and repackaged, this time as timeless masculine truth. And so, to modern audiences, it reads as immature.

The aesthetic was codified in the late ’60s. (...)

By the 1970s, blaxploitation films had transformed the pimp into an outlaw folk hero, emphasizing style over the moral complexity of the source material. What survived was the cool, the walk, the talk, the clothes, the attitude. Hip-hop — which I admittedly know very little about, so please feel free to correct me here — picked up the thread: Ice-T named himself in tribute to Iceberg Slim; Snoop Dogg built an entire persona around pimp iconography; the rest is history. The pimp was no longer a figure of the Black underclass navigating impossible circumstances but was quickly becoming embraced as an inadvertent, unironic symbol of male success, available for adoption by anyone — race agnostic.

The “high-value man” who dominates contemporary Manosphere discourse is this same archetype, put through a respectability filter, or maybe just re-fit for modern tastes. The fur coat becomes a tailored suit. The Cadillac becomes a Bugatti. The stable of sex workers becomes a rotating roster of Instagram models (I guess, in Andrew Tate’s case, still sex [trafficked] workers). The underlying logic — and material conditions — are identical: women are resources to be managed, emotional detachment is strength, and a man’s worth is measured by his material display and his control over female attention. (...)

The Manosphere’s grievances are not manufactured—just as the pimp’s weren’t. The anxieties it addresses are real. The conditions that produced the pimp archetype in Black America, the sense that legitimate paths to respect and provision have been foreclosed, are now conditions we all experience.

The Manosphere exists because millions of young men — of every race — are asking the same question Black men were asking in 1965: what does masculinity mean when its economic foundations have been removed?

by Katherine Dee, Default Blog |  Read more:
Images: uncredited
[ed. Pathetic bunch of losers. Includes some truly cringe videos I've never seen before.]

Tuesday, January 20, 2026

It's Not Normal

Samantha: This town has a weird smell that you're all probably used to…but I'm not.
Mrs Krabappel: It'll take you about six weeks, dear. 
-The Simpsons, "Bart's Friend Falls in Love," S3E23, May 7, 1992
We are living through weird times, and they've persisted for so long that you probably don't even notice it. But these times are not normal.

Now, I realize that this covers a lot of ground, and without detracting from all the other ways in which the world is weird and bad, I want to focus on one specific and pervasive and awful way in which this world is not normal, in part because this abnormality has a defined cause, a precise start date, and an obvious, actionable remedy.

6 years, 5 months and 22 days after Fox aired "Bart's Friend Falls in Love," Bill Clinton signed a new bill into law: the Digital Millennium Copyright Act of 1998 (DMCA).

Under Section 1201 of the DMCA, it's a felony to modify your own property in ways that the manufacturer disapproves of, even if your modifications accomplish some totally innocuous, legal, and socially beneficial goal. Not a little felony, either: DMCA 1201 provides for a five-year sentence and a $500,000 fine for a first offense.

Back when the DMCA was being debated, its proponents insisted that their critics were overreacting. They pointed to the legal barriers to invoking DMCA 1201, and insisted that these new restrictions would only apply to a few marginal products in narrow ways that the average person would never even notice.

But that was obvious nonsense, obvious even in 1998, and far more obvious today, more than a quarter-century on. In order for a manufacturer to criminalize modifications to your own property, they have to satisfy two criteria: first, they must sell you a device with a computer in it; and second, they must design that computer with an "access control" that you have to work around in order to make a modification.

For example, say your toaster requires that you scan your bread before it will toast it, to make sure that you're only using a special, expensive kind of bread that kicks back a royalty to the manufacturer. If the embedded computer that does the scanning ships from the factory with a program that is supposed to prevent you from turning off the scanning step, then it is a felony to modify your toaster to work with "unauthorized bread":

If this sounds outlandish, then a) You definitely didn't walk the floor at CES last week, where there were a zillion "cooking robots" that required proprietary feedstock; and b) You haven't really thought hard about your iPhone (which will not allow you to install software of your choosing):

But back in 1998, computers – even the kind of low-powered computers that you'd embed in an appliance – were expensive and relatively rare. No longer! Today, manufacturers source powerful "System on a Chip" (SoC) processors at prices ranging from $0.25 to $8. These are full-fledged computers, easily capable of running an "access control" that satisfies DMCA 1201.

Likewise, in 1998, "access controls" (also called "DRM," "technical protection measures," etc) were a rarity in the field. That was because computer scientists broadly viewed these measures as useless. A determined adversary could always find a way around an access control, and they could package up that break as a software tool and costlessly, instantaneously distribute it over the internet to everyone in the world who wanted to do something that an access control impeded. Access controls were a stupid waste of engineering resources and a source of needless complexity and brittleness:

But – as critics pointed out in 1998 – chips were obviously going to get much cheaper, and if the US Congress made it a felony to bypass an access control, then every kind of manufacturer would be tempted to add some cheap SoCs to their products so they could add access controls and thereby felonize any uses of their products that cut into their profits. Basically, the DMCA offered manufacturers a bargain: add a dollar or two to the bill of materials for your product, and in return, the US government will imprison any competitors who offer your customers a "complementary good" that improves on it.

It's even worse than this: another thing that was obvious in 1998 was that once a manufacturer added a chip to a device, they would probably also figure out a way to connect it to the internet. Once that device is connected to the internet, the manufacturer can push software updates to it at will, which will be installed without user intervention. What's more, by using an access control in connection with that over-the-air update mechanism, the manufacturer can make it a felony to block its updates.

Which means that a manufacturer can sell you a device and then mandatorily update it at a later date to take away its functionality, and then sell that functionality back to you as a "subscription":

A thing that keeps happening:

And happening:

And happening:

In fact, it happens so often I've coined a term for it, "The Darth Vader MBA" (as in, "I'm altering the deal. Pray I don't alter it any further"):

Here's what this all means: any manufacturer who devotes a small amount of engineering work and incurs a small hardware expense can extinguish private property rights altogether.

What do I mean by private property? Well, we can look to Blackstone's 1753 treatise:
The right of property; or that sole and despotic dominion which one man claims and exercises over the external things of the world, in total exclusion of the right of any other individual in the universe.
You can't own your iPhone. If you take your iPhone to Apple and they tell you that it is beyond repair, you have to throw it away. If the repair your phone needs involves "parts pairing" (where a new part won't be recognized until an Apple technician "initializes" it through a DMCA-protected access control), then it's a felony to get that phone fixed somewhere else. If Apple tells you your phone is no longer supported because they've updated their OS, then it's a felony to wipe the phone and put a different OS on it (because installing a new OS involves bypassing an "access control" in the phone's bootloader). If Apple tells you that you can't have a piece of software – like ICE Block, an app that warns you if there are nearby ICE killers who might shoot you in the head through your windshield, which Apple has barred from its App Store on the grounds that ICE is a "protected class" – then you can't install it, because installing software that isn't delivered via the App Store involves bypassing an "access control" that checks software to ensure that it's authorized (just like the toaster with its unauthorized bread).

It's not just iPhones: versions of this play out in your medical implants (hearing aid, insulin pump, etc); appliances (stoves, fridges, washing machines); cars and ebikes; set-top boxes and game consoles; ebooks and streaming videos; small appliances (toothbrushes, TVs, speakers), and more.

Increasingly, things that you actually own are the exception, not the rule.

And this is not normal. The end of ownership represents an overturn of a foundation of modern civilization. The fact that the only "people" who can truly own something are the transhuman, immortal colony organisms we call "Limited Liability Corporations" is an absolutely surreal reversal of the normal order of things.

It's a reversal with deep implications: for one thing, it means that you can't protect yourself from raids on your private data or ready cash by adding privacy blockers to your device, which would make it impossible for airlines or ecommerce sites to guess about how rich/desperate you are before quoting you a "personalized price":

It also means you can't stop your device from leaking information about your movements, or even your conversations – Microsoft has announced that it will gather all of your private communications and ship them to its servers for use by "agentic AI": (...)

Microsoft has also confirmed that it provides US authorities with warrantless, secret access to your data:

This is deeply abnormal. Sure, greedy corporate control freaks weren't invented in the 21st century, but the laws that let those sociopaths put you in prison for failing to arrange your affairs to their benefit – and your own detriment – are.

But because computers got faster and cheaper over decades, the end of ownership has had an incremental rollout, and we've barely noticed that it's happened. Sure, we get irritated when our garage-door opener suddenly requires us to look at seven ads every time we use the app that makes it open or close:

But societally, we haven't connected that incident to this wider phenomenon. It stinks here, but we're all used to it.

It's not normal to buy a book and then not be able to lend it, sell it, or give it away. Lending, selling and giving away books is older than copyright. It's older than publishing. It's older than printing. It's older than paper. It is fucking weird (and also terrible) (obviously) that there's a new kind of very popular book that you can go to prison for lending, selling or giving away.

We're just a few cycles away from a pair of shoes that can figure out which shoelaces you're using, or a dishwasher that can block you from using third-party dishes:

It's not normal, and it has profound implications for our security, our privacy, and our society. It makes us easy pickings for corporate vampires who drain our wallets through the gadgets and tools we rely on. It makes us easy pickings for fascists and authoritarians who ally themselves with corporate vampires by promising them tax breaks in exchange for collusion in the destruction of a free society.

I know that these problems are more important than whether or not we think this is normal. But still. It. Is. Just. Not. Normal.

by Cory Doctorow, Pluralistic |  Read more:
Image: uncredited
[ed. Anything labeled 'smart' is usually suspect. What's particularly dangerous is if successive generations fall prey to what conservation biology calls shifting baseline syndrome (forgetting or never really missing something that's been lost, so we don't grieve or fight to restore it). For a deep dive into why everything keeps getting worse, see Mr. Doctorow's new book: "Enshittification: Why Everything Suddenly Got Worse and What to Do About It" (Farrar, Straus and Giroux, October 7, 2025).]

Sony Goes for Peanuts

It wasn’t so long ago that purchases of American institutions by Japanese companies sparked outrage in the United States. When Mitsubishi bought the Rockefeller Center in 1989, a local auto dealership ran a TV spot that invited Americans to “imagine a few years from now. It’s December, and the whole family’s going to see the big Christmas tree at Hirohito Center… Enough already.” Sony’s purchase of Columbia Pictures that same year caused such unease that chairman Akio Morita felt the need to declare “this is not a Japanese invasion.” A Newsweek poll of the era revealed that 54% of Americans saw Japan as a bigger threat to America than the Soviet Union. Many exploited this fear of Japan for their own ends. Politicians grandstanded by smashing Japanese products and demanding investigations into purchases. Predictably, Donald Trump’s first public foray into politics was a jeremiad against Japan in a 1988 appearance on the Oprah Winfrey Show.

Contrast this to yesterday, when Sony announced that it had paid nearly half a billion dollars for another American icon: Peanuts Holding LLC, the company that administers the rights to the Peanuts franchise. Talk about A Charlie Brown Christmas for shareholders! The reaction to this Japanese acquisition of a cultural institution? Crickets. This speaks to how dramatically the relationship between the US and Japan has changed. It also speaks to how dramatically Peanuts changed, how Peanuts changed Japan, and how that in turn changed all of us. But perhaps most of all, it illustrates (pun intended) how stories need products, and products need stories.

There are countless stories out there, and countless products. But crossing these streams — giving stories products in the form of merchandise, or products stories to make them more than just commodities, can supercharge both. It can create international empires. Peanuts is a perfect case in point.

When Charles Schulz’s Peanuts debuted in October of 1950, it was utterly unlike any cartoon Americans had seen in the funny pages. The very first strip’s punchline involved an adorable tyke declaring his hatred for Charlie Brown. Li’l Abner creator Al Capp described the cast as “good mean little bastards eager to hurt each other.” Matt Groening of The Simpsons fame recalled being “excited by the casual cruelty and offhand humiliations at the heart of the strip.” To Garry Trudeau of Doonesbury, it “vibrated with fifties alienation.”

A hint of darkness made Peanuts stick out in a crowded comics page. But it’s hard to square these comments with the Happiness Is a Warm Puppy-era Peanuts I remember from my childhood. By that time Schulz had sanded the rough edges off those “little bastards,” distilling them into cute and lovable archetypes. More to the point, he de-centered the kids to focus on Snoopy, who had morphed from his origins as a four-legged canine into a bipedal, anthropomorphic creature with a bulbous head and a penchant for tap-dancing and flying biplanes.

The vibe shift seems to date to 1966, when the animated It’s the Great Pumpkin, Charlie Brown devoted roughly a quarter of its screen time to Snoopy’s solo flights of fancy. Schulz was already lauded for his short-form social satire: his characters had graced the cover of Time the year before. But he seems to have grasped that the way to riches lay in looking at the brighter side of life.

This new Peanuts, less mean, less casually cruel, less alienated, was arguably also less interesting. But there was no question that it was way, way more marketable. You might have identified with one or another of the human characters, with their all too human foibles, but anthropomorphic Snoopy was someone anyone and everyone could inhabit. Kids in particular. You didn’t even have to be American to get him.

This later, kinder, gentler incarnation of Peanuts, and Snoopy in particular, would charm Japanese audiences, thanks to the efforts of a serial entrepreneur named Shintaro Tsuji. He was a would-be poet turned wartime chemist, then a postwar black-market bootlegger of moonshine, and an inveterate hatcher of business schemes ranging from silks to produce to kitchenware. You are undoubtedly familiar with the most successful of his ventures. It is called Sanrio — the home of Hello Kitty.

Tsuji, long interested in American trends, played a key role in importing many of them to Japan. He forged a relationship with Hallmark to translate their greeting cards, and negotiated with Mattel for the rights to Barbie. He acquired the license to Peanuts in 1968, when his company, then known as the Yamanashi Silk Center, was at a low. Snoopy-branded merchandise proved so popular that it put his struggling company back in the black within a year. Snoopy wasn’t the first cute animal to hit big in Japan; Tsuji himself had scored a big hit in the mid-sixties with merchandise featuring Mii-tan, a cute cat designed by the artist Ado Mizumori. But Snoopy’s runaway success seems to have sparked an epiphany in Tsuji.

As he later put it, Japan was “a world in which ‘making money’ meant ‘making things.’ I desperately wanted to leapfrog the ‘things’—the ‘hardware’—and make a business out of the intellectual property—the ‘software.’ I suspect everyone around me thought I was nuts.”

He was nuts. Merchandising characters from hit stories was common sense, then as now. Many Japanese companies did that sort of thing. Creating hit characters without stories was fiendishly difficult, bordering on impossible. Stories breathe life into characters, endowing them with an authenticity that standalone designs simply do not possess (or need to earn in other ways). Yet Tsuji would not be deterred. In 1971, he launched an in-house art department, staffing it with young women straight out of art school. In the wake of Peanuts’ continuing success, he gave the team a singular directive: “Draw cats and bears. If a dog hit this big, one of those two is sure to follow.”

Two years later, he renamed the Yamanashi Silk Center “Sanrio.” (There’s a whole story about how that came to be, which you can read in my book, if you’re so inclined.) The year after that, in 1974, one of Sanrio’s designers struck gold, in the form of an anthropomorphic cat with a bulbous head and a penchant for hugging: Hello Kitty. Soon, Kitty products were a full-blown fiiba (fever) in Japan. And this time, Tsuji didn’t have to split the proceeds with anyone, because Sanrio owned the character outright. Schulz needed decades of narrative to make stars of Peanuts’ menagerie of characters. Tsuji upended this process by making characters stars without any story at all.

Sanrio famously insists that Hello Kitty isn’t really a cat; she’s a little girl who happens to look like a cat. I take no particular stance on this globally divisive issue. But I think you can make the case that she wouldn’t exist at all, if it hadn’t been for the trail Schulz blazed with Peanuts, shifting away from social satire to make an anthropomorphic dog the star of the show. Tsuji’s genius was realizing that you could make a star without a show — provided you had the ability to print it on countless school supplies, kitchenware, and accessories. That was the trick up his sleeve. The medium is the message, as they say. In essence, Kitty products, ubiquitous to the point of absurdity, became her story.

by Matt Alt, Pure Invention |  Read more:
Image: uncredited
[ed. See also: Super Galapagos (PI):]
***
Once the West feared Japan’s supposed technological superiority. Then came the schadenfreude over Japan’s supposed fall. Now a new generation is projecting upon the country an almost desperate longing for comfort. And is it any wonder? The meme centers on companies producing products that make the lives of consumers easier. That must feel like a dreamy fantasy to young folks who’ve only known life in an attention economy, where corporations are the consumers and they’re the products.

To them, Japan isn’t in the past or the future. It’s a very real place — a place where things haven’t gone haywire. This is Japan as a kind of Galapagos, but not in a pejorative sense. Rather, it’s a superlative, asking, a little plaintively: Why can’t we have nice things like this in our country?...

I agree that Japan is a kind of Galapagos, in the sense that it can be oblivious to global trends. But I disagree that this is a weakness. The reason being that nearly everything the planet loves from Japan was made by Japanese, for Japanese in the first place.

Looking back, this has always been the case. Whether the woodblock prints that wowed the world in the 19th century, or the Walkmans and Nintendo Entertainment Systems that were must-haves in the Eighties, or the Pokémania that seized the planet at the turn of the Millennium, or the life-changing cleaning magic of the 2010s, or the anime blockbusters Japan keeps unleashing in the 2020s – they hit us in the feels, so we assumed that they were made just for us. But they weren’t.

Sunday, January 18, 2026

The Monkey’s Paw Curls

[ed. More than anyone probably wants to know (or can understand) about prediction markets.]

Isn’t “may you get exactly what you asked for” one of those ancient Chinese curses?

Since we last spoke, prediction markets have gone to the moon, rising from millions to billions in monthly volume.


For a few weeks in October, Polymarket founder Shayne Coplan was the world’s youngest self-made billionaire (now it’s some AI people). Kalshi is so accurate that it’s getting called a national security threat.

The catch is, of course, that it’s mostly degenerate gambling, especially sports betting. Kalshi is 81% sports by monthly volume. Polymarket does better - only 37% - but some of the remainder is things like this $686,000 market on how often Elon Musk will tweet this week - currently dominated by the “140 - 164 times” category.

(Ironically, this seems to be a regulatory difference - US regulators don’t mind sports betting, but look unfavorably on potentially “insensitive” markets like bets about wars. Polymarket has historically been offshore, and so able to concentrate on geopolitics; Kalshi has been in the US, and so stuck mostly to sports. But Polymarket is in the process of moving onshore; I don’t know if this will affect their ability to offer geopolitical markets.)

Degenerate gambling is bad. Insofar as prediction markets have acted as a Trojan Horse to enable it, this is bad. Insofar as my advocacy helped make this possible, I am bad. I can only plead that it didn’t really seem plausible, back in 2021, that a presidential administration would keep all normal restrictions on sports gambling but also let prediction markets do it as much as they wanted. If only there had been some kind of decentralized forecasting tool that could have given me a canonical probability on this outcome!

Still, it might seem that, whatever the degenerate gamblers are doing, we at least have some interesting data. There are now strong, minimally-regulated, high-volume prediction markets on important global events. In this column, I previously claimed this would revolutionize society. Has it?


I don’t feel revolutionized. Why not?

The problem isn’t that the prediction markets are bad. There’s been a lot of noise about insider trading and disputed resolutions. But insider trading should only increase accuracy - it’s bad for traders, but good for information-seekers - and my impression is that the disputed resolutions were handled as well as possible. When I say I don’t feel revolutionized, it’s not because I don’t believe it when it says there’s a 20% chance Khamenei will be out before the end of the month. The several thousand people who have invested $6 million in that question have probably converged upon the most accurate probability possible with existing knowledge, just the way prediction markets should.

I actually like this. Everyone is talking about the protests in Iran, and it’s hard to gauge their importance, and knowing that there’s a 20% chance Khamenei is removed by February really does help to place them in context. The missing link seems to be the step from “it’s now possible to place global events in probabilistic context” to “society revolutionized.”

Here are some possibilities:

Maybe people just haven’t caught on yet? Most news sources still don’t cite prediction markets, even when many people would care about their outcome. For example, the Khamenei market hasn’t been mentioned in articles about the Iran protests, even though “will these protests succeed in toppling the regime?” is the obvious first question any reader would ask.

Maybe the problem is that probabilities don’t matter? Maybe there’s some State Department official who would change plans slightly over a 20% vs. 40% chance of Khamenei departure, or an Iranian official for whom that would mean the difference between loyalty and defection, and these people are benefiting slightly, but not enough that society feels revolutionized.

Maybe society has been low-key revolutionized and we haven’t noticed? Very optimistically, maybe there aren’t as many “obviously the protests will work, only a defeatist doomer traitor would say they have any chance of failing!” “no, obviously the protests will fail, you’re a neoliberal shill if you think they could work” takes as there used to be. Maybe everyone has converged to a unified assessment of probabilistic knowledge, and we’re all better off as a result.

Maybe Polymarket and Kalshi don’t have the right questions. Ask yourself: what are the big future-prediction questions that important disagreements pivot around? When I try this exercise, I get things like:
  • Will the AI bubble pop? Will scaling get us all the way to AGI? Will AI be misaligned?
  • Will Trump turn America into a dictatorship? Make it great again? Somewhere in between?
  • Will YIMBY policies lower rents? How much?
  • Will selling US chips to China help them win the AI race?
  • Will kidnapping Venezuela’s president weaken international law in some meaningful way that will cause trouble in the future?
  • If America nation-builds Venezuela, for whatever definition of nation-build, will that work well, or backfire?
Some of these are long-horizon, some are conditional, and some are hard to resolve. There are potential solutions to all these problems. But why worry about them when you can go to the moon on sports bets?

Annals of The Rulescucks

The new era of prediction markets has provided charming additions to the language, including “rulescuck” - someone who loses an otherwise-prescient bet based on technicalities of the resolution criteria.

Resolution criteria are the small print explaining what counts as the prediction market topic “happening”. For example, in the Khamenei example above, Khamenei qualifies as being “out of power” if:
…he resigns, is detained, or otherwise loses his position or is prevented from fulfilling his duties as Supreme Leader of Iran within this market's timeframe. The primary resolution source for this market will be a consensus of credible reporting.
You can imagine ways this definition departs from an exact common-sensical concept of “out of power” - for example, if Khamenei gets stuck in an elevator for half an hour and misses a key meeting, does this count as him being “prevented from fulfilling his duties”? With thousands of markets getting resolved per month, chances are high that at least one will hinge upon one of these edge cases.

Kalshi resolves markets by having a staff member with good judgment decide whether or not the situation satisfies the resolution criteria.

Polymarket resolves markets by . . . oh man, how long do you have? There’s a cryptocurrency called UMA. UMA owners can stake it to vote on Polymarket resolutions in an associated contract called the UMA Oracle. Voters on the losing side get their cryptocurrency confiscated and given to the winners.

This creates a Keynesian beauty contest, i.e. a situation where everyone tries to vote for the winning side. The most natural Schelling point is the side which is actually correct. If someone tries to attack the oracle by buying lots of UMA and voting for the wrong side, this incentivizes bystanders to come in and defend the oracle by voting for the right side, since (conditional on there being common knowledge that everyone will do this) that means they get free money at the attackers’ expense.

But also, the UMA currency goes up in value if people trust the oracle and plan to use it more often, and it goes down if people think the oracle is useless and may soon get replaced by other systems. So regardless of their other incentives, everyone who owns the currency has an incentive to vote for the true answer so that people keep trusting the oracle. This system works most of the time, but tends towards so-called “oracle drama”, where seemingly prosaic resolutions might lie at the end of a thrilling story of attacks, counterattacks, and escalations.
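The slash-and-redistribute incentive can be sketched as a toy simulation. To be clear, this is a hypothetical illustration of the general mechanism, not UMA's actual contract logic; all names, stakes, and the pro-rata payout rule here are my own assumptions.

```python
def resolve(votes):
    """votes: list of (name, stake, choice) tuples.

    Returns (outcome, payouts): the outcome is whichever choice is
    backed by the larger total stake; the losing side's stakes are
    confiscated and split among winners pro rata by stake.
    """
    totals = {}
    for _, stake, choice in votes:
        totals[choice] = totals.get(choice, 0.0) + stake
    outcome = max(totals, key=totals.get)
    winning_pool = totals[outcome]
    losing_pool = sum(v for k, v in totals.items() if k != outcome)

    payouts = {}
    for name, stake, choice in votes:
        if choice == outcome:
            # Winner keeps their stake plus a pro-rata share of the
            # confiscated pool.
            payouts[name] = stake + losing_pool * (stake / winning_pool)
        else:
            # Loser's stake is slashed entirely.
            payouts[name] = 0.0
    return outcome, payouts

# An attacker stakes 100 on the wrong answer; honest voters, expecting
# other honest voters to pile in too, defend the true answer.
votes = [
    ("attacker", 100, "NO"),
    ("honest_1", 80, "YES"),
    ("honest_2", 40, "YES"),
]
outcome, payouts = resolve(votes)
# outcome == "YES"; the attacker's 100 tokens are split 2:1 between
# honest_1 and honest_2, in proportion to their stakes.
```

The point of the sketch is the incentive structure: once honest voters jointly outweigh the attacker, defending the correct answer is free money at the attacker's expense, which is why common knowledge that others will defend makes the attack unprofitable in the first place.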

Here are some of the most interesting alleged rulescuckings of 2026:

Mr Ozi: Will Zelensky wear a suit? Ivan Cryptoslav calls this “the most infamous example in Polymarket history”. Ukraine’s president dresses mostly in military fatigues, vowing never to wear a suit until the war is over. As his sartorial notoriety spread, Polymarket traders bet over $100 million on the question of whether he would crack in any given month. At the Pope’s funeral, Zelensky showed up in a respectful-looking jacket which might or might not count. Most media organizations refused to describe it as a “suit”, so the decentralized oracle ruled against. But over the next few months, Zelensky continued to straddle the border of suithood, and the media eventually started using the word “suit” in their articles. This presented a quandary for the oracle, which was supposed to respect both the precedent of its past rulings, and the consensus of media organizations. Voters switched sides several times until finally settling on NO; true suit believers were unsatisfied with this decision. For what it’s worth, the Twitter menswear guy told Wired that “It meets the technical definition, [but] I would also recognize that most people would not think of that as a suit.”

[more examples...]

With one exception, these aren’t outright oracle failures. They’re honest cases of ambiguous rules.

Most of the links end with pleas for Polymarket to get better at clarifying rules. My perspective is that the few times I’ve talked to Polymarket people, I’ve begged them to implement various cool features, and they’ve always said “Nope, sorry, too busy figuring out ways to make rules clearer”. Prediction market people obsess over maximally finicky resolution criteria, but somehow it’s never enough - you just can’t specify every possible state of the world beforehand.

The most interesting proposal I’ve seen in this space is to make LLMs do it; you can train them on good rulesets, and they’re tolerant enough of tedium to print out pages and pages of every possible edge case without going crazy. It’ll be fun the first time one of them hallucinates, though.

…And Miscellaneous Ne’er-Do-Wells

I include this section under protest.

The media likes engaging with prediction markets through dramatic stories about insider trading and market manipulation. This is as useful as engaging with Waymo through stories about cats being run over. It doesn’t matter whether you can find one lurid example of something going wrong. What matters is the base rates, the consequences, and the alternatives. Polymarket resolves about a thousand markets a month, and Kalshi closer to five thousand. It’s no surprise that a few go wrong; it’s even less surprising that there are false accusations of a few going wrong.

Still, I would be remiss to not mention this at all, so here are some of the more interesting stories:

by Scott Alexander, Astral Codex Ten |  Read more:
Images: uncredited

Saturday, January 17, 2026

The Dilbert Afterlife

Sixty-eight years of highly defective people

Thanks to everyone who sent in condolences on my recent death from prostate cancer at age 68, but that was Scott Adams. I (Scott Alexander) am still alive.

Still, the condolences are appreciated. Scott Adams was a surprisingly big part of my life. I may be the only person to have read every Dilbert book before graduating elementary school. For some reason, 10-year-old-Scott found Adams’ stories of time-wasting meetings and pointy-haired bosses hilarious. No doubt some of the attraction came from a more-than-passing resemblance between Dilbert’s nameless corporation and the California public school system. We’re all inmates in prisons with different names.

But it would be insufficiently ambitious to stop there. Adams’ comics were about the nerd experience. About being cleverer than everyone else, not just in the sense of being high IQ, but in the sense of being the only sane man in a crazy world where everyone else spends their days listening to overpaid consultants drone on about mission statements instead of doing anything useful. There’s an arc in Dilbert where the boss disappears for a few weeks and the engineers get to manage their own time. Productivity shoots up. Morale soars. They invent warp drives and time machines. Then the boss returns, and they’re back to being chronically behind schedule and over budget. This is the nerd outlook in a nutshell: if I ran the circus, there’d be some changes around here.

Yet the other half of the nerd experience is: for some reason this never works. Dilbert and his brilliant co-workers are stuck watching from their cubicles while their idiot boss rakes in bonuses and accolades. If humor, like religion, is an opiate of the masses, then Adams is masterfully unsubtle about what type of wound his art is trying to numb.

This is the basic engine of Dilbert: everyone is rewarded in exact inverse proportion to their virtue. Dilbert and Alice are brilliant and hard-working, so they get crumbs. Wally is brilliant but lazy, so he at least enjoys a fool’s paradise of endless coffee and donuts while his co-workers clean up his messes. The P.H.B. is neither smart nor industrious, so he is forever on top, reaping the rewards of everyone else’s toil. Dogbert, an inveterate scammer with a passing resemblance to various trickster deities, makes out best of all.

The repressed object at the bottom of the nerd subconscious, the thing too scary to view except through humor, is that you’re smarter than everyone else, but for some reason it isn’t working. Somehow all that stuff about small talk and sportsball and drinking makes them stronger than you. No equation can tell you why. Your best-laid plans turn to dust at a single glint of Chad’s perfectly-white teeth.

Lesser lights may distance themselves from their art, but Adams radiated contempt for such surrender. He lived his whole life as a series of Dilbert strips. Gather them into one of his signature compendia, and the title would be Dilbert Achieves Self Awareness And Realizes That If He’s So Smart Then He Ought To Be Able To Become The Pointy-Haired Boss, Devotes His Whole Life To This Effort, Achieves About 50% Success, Ends Up In An Uncanny Valley Where He Has Neither The Virtues Of The Honest Engineer Nor Truly Those Of The Slick Consultant, Then Dies Of Cancer Right When His Character Arc Starts To Get Interesting.

If your reaction is “I would absolutely buy that book”, then keep reading, but expect some detours.

Fugitive From The Cubicle Police

The niche that became Dilbert opened when Garfield first said “I hate Mondays”. The quote became a popular sensation, inspiring t-shirts, coffee mugs, and even a hit single. But (as I’m hardly the first to point out) why should Garfield hate Mondays? He’s a cat! He doesn’t have to work!

In the 80s and 90s, saying that you hated your job was considered the height of humor. Drew Carey: “Oh, you hate your job? There’s a support group for that. It’s called everybody, and they meet at the bar.”


This was merely the career subregion of the supercontinent of Boomer self-deprecating jokes, whose other prominences included “I overeat”, “My marriage is on the rocks”, “I have an alcohol problem”, and “My mental health is poor”.

Arguably this had something to do with the Bohemian turn, the reaction against the forced cheer of the 1950s middle-class establishment of company men who gave their all to faceless corporations and then dropped dead of heart attacks at 60. You could be that guy, proudly boasting to your date about how you traded your second-to-last patent artery to complete a spreadsheet that raised shareholder value 14%. Or you could be the guy who says “Oh yeah, I have a day job working for the Man, but fuck the rat race, my true passion is white water rafting”. When your father came home every day looking haggard and worn out but still praising his boss because “you’ve got to respect the company or they won’t take care of you”, being able to say “I hate Mondays” must have felt liberating, like the mantra of a free man.

This was the world of Dilbert’s rise. You’d put a Dilbert comic on your cubicle wall, and feel like you’d gotten away with something. If you were really clever, you’d put the Dilbert comic where Dilbert gets in trouble for putting a comic on his cubicle wall on your cubicle wall, and dare them to move against you.


(again, I was ten at the time. I only know about this because Scott Adams would start each of his book collections with an essay, and sometimes he would talk about letters he got from fans, and many of them would have stories like these.)

But t-shirts saying “Working Hard . . . Or Hardly Working?” no longer hit as hard as they once did. Contra the usual story, Millennials are too earnest to tolerate the pleasant contradiction of saying they hate their job and then going in every day with a smile. They either have to genuinely hate their job - become some kind of dirtbag communist labor activist - or at least pretend to love it. The worm turns, all that is cringe becomes based once more and vice versa. Imagine that guy boasting to his date again. One says: “Oh yeah, I grudgingly clock in every day to give my eight hours to the rat race, but trust me, I’m secretly hating myself the whole time.” The other: “I work for a boutique solar energy startup that’s ending climate change - saving the environment is my passion!” Zoomers are worse still: not even the fig leaf of social good, just pure hustle.

Dilbert is a relic of a simpler time, when the trope could be played straight. But it’s also an artifact of the transition, maybe even a driver of it. Scott Adams appreciated these considerations earlier and more acutely than anyone else. And they drove him nuts.

Stick To Drawing Comics, Monkey Brain

Adams knew, deep in his bones, that he was cleverer than other people. God always punishes this impulse, especially in nerds. His usual strategy is straightforward enough: let them reach the advanced physics classes, where there will always be someone smarter than them, then beat them on the head with their own intellectual inferiority so many times that they cry uncle and admit they’re nothing special.

For Adams, God took a more creative and – dare I say, crueler – route. He created him only-slightly-above-average at everything except for a world-historical, Mozart-tier, absolutely Leonardo-level skill at making silly comics about hating work.


Scott Adams never forgave this. Too self-aware to deny it, too narcissistic to accept it, he spent his life searching for a loophole. You can read his frustration in his book titles: How To Fail At Almost Everything And Still Win Big. Trapped In A Dilbert World. Stick To Drawing Comics, Monkey Brain. Still, he refused to stick to comics. For a moment in the late-90s, with books like The Dilbert Principle and The Dilbert Future, he seemed on his way to becoming a semi-serious business intellectual. He never quite made it, maybe because the Dilbert Principle wasn’t really what managers and consultants wanted to hear:
I wrote The Dilbert Principle around the concept that in many cases the least competent, least smart people are promoted, simply because they’re the ones you don't want doing actual work. You want them ordering the doughnuts and yelling at people for not doing their assignments—you know, the easy work. Your heart surgeons and your computer programmers—your smart people—aren't in management.
Okay, “I am cleverer than everyone else”, got it. His next venture (c. 1999) was the Dilberito, an attempt to revolutionize food via a Dilbert-themed burrito with the full Recommended Daily Allowance of twenty-three vitamins. I swear I am not making this up. A contemporaneous NYT review said it “could have been designed only by a food technologist or by someone who eats lunch without much thought to taste”. The Onion, in its twenty-year retrospective for the doomed comestible, called it a frustrated groping towards meal replacements like Soylent or Huel, long before the existence of a culture nerdy enough to support them. Adams himself, looking back from several years’ distance, was even more scathing: “the mineral fortification was hard to disguise, and because of the veggie and legume content, three bites of the Dilberito made you fart so hard your intestines formed a tail.”

His second foray into the culinary world was a local restaurant called Stacey’s.

by Scott Alexander, Astral Codex Ten |  Read more:
Images: Dilbert/ACX 
[ed. First picture: Adams actually had a custom-built tower on his home shaped like Dilbert’s head.]

Thursday, January 15, 2026

The Day NY Publishing Lost Its Soul; Fifty People Control the Culture

Everybody can see there’s a crisis in New York publishing. Even the hot new books feel lukewarm. Writers win the Pulitzer Prize and sell just a few hundred copies. The big publishers rely on 50 or 100 proven authors—everything else is just window dressing or the back catalog.

You can tell how stagnant things have become from the lookalike covers. I walk into a bookstore and every title I see is like this.


They must have fired the design team and replaced it with a lazy bot. You get big fonts, random shapes, and garish colors—again and again and again. Every cover looks like it was made with a circus clown’s makeup kit.

My wife is in a book club. If I didn’t know better, I’d think they read the same book every month. It’s those same goofy colors and shapes on every one.

Of course, you can’t judge a book by its cover. But if you read enough new releases, you get the same sense of familiarity from the stories. The publishers keep returning to proven formulas—which they keep flogging long after they’ve stopped working.

And that was a long time ago.

It’s not just publishing. A similar stagnancy has settled in at the big movie studios and record labels. Nobody wants to take a risk—but (as I’ve learned through painful personal experience) that’s often the riskiest move of them all. Live by the formula, and you die by the formula.

How did we end up here?

It’s hard to pick a day when the publishing industry made its deal with the devil. But an anecdote recently shared by Steve Wasserman is as good a place to begin as any.

by Ted Gioia, Honest Broker | Read more:
Image: uncredited
[ed. I'll never buy a book that looks like this, no matter what the reviews say. I'd be embarrassed to be seen in public with it, let alone display it on my bookshelf. See also: Fifty People Control the Culture (HB).]

Tuesday, January 13, 2026

The Inevitable Rise of the Art TV

The Samsung Frame TV, first announced in 2017, doesn’t look all that great as an actual television. But switch it off and it sure is pretty—certainly much better to look at than an empty black void.

This is thanks to its matte-finish, anti-glare screen and the picture-frame-like bezels that together transform whatever fine art you choose to display on the TV when it's in standby mode (Samsung offers a variety of high-resolution digital slides) into something that resembles a framed painting. In the years since its debut and through a few updates, the Frame TV has become one of the more considered options for people who live in smaller spaces without dedicated rooms for watching TV.

It has taken a while for other brands to catch up, but we're now seeing a huge wave of Frame-like TVs hit the market. The trend is largely driven by aesthetes in cities where smaller living rooms are the norm, but it's getting a boost from advances in screen design.

Late last year, Hisense announced its CanvasTV, a Frame competitor that also has a matte screen and displays art. (We have a review unit coming shortly.) TCL has the similar NXTvision model that uses a Vincent van Gogh self-portrait in the marketing, and LG has announced the Gallery TV (also repping van Gogh) for later this year. Even Amazon has decided to throw its hat in the ring with the Ember Artline TV. Announced this week at CES 2026, Amazon's $899 television can display one of 2,000 works of art (available for free to Ember Artline owners) and even has a tool that uses Alexa AI to help you decide which artworks are the best fit for your room.

So what's so great about Art TVs, and why do brands seem to be pivoting so hard into the category?

Part of it has to do with personal space. It's true that many younger buyers just don't have the same taste or sense of style as folks from previous generations. But also, young city-dwelling professionals are less likely to have the room to place a large screen in a dedicated area in their home, a pain point compounded by the fact that TV screen sizes have ballooned over the past decade.

The other reason TV makers are getting artsy has to do with the evolution of TV technology itself. Brands are choosing to step into this space now because they have finally developed the means to create matte screens that can accurately represent a painting or a fine art photograph. Though Samsung is a pioneer in the space, matte LED screens are enjoying something of a renaissance across all television brands.

A typical glossy TV display reflects light like a window, but a matte screen absorbs light like a canvas might. This effect enables any art pieces displayed on the screen to look extra realistic. Another advance in technology is backlighting. Where previous generations of these Art TVs needed to be lit from the edges of the display in order to maintain their painting-like thinness and allow them to be mounted flush against a wall, brands have recently been able to employ more advanced lighting systems while keeping the TVs slim. Local dimming, better backlighting processing, and the ability to adjust the screen brightness to match a room's ambient lighting when in “art mode” make these new displays look better than ever.

by Parker Hall, Wired |  Read more:
Image: Samsung/PCMag
[ed. See also: Ambient Intelligence in the Living Room (MDPI).]

Sunday, January 11, 2026

Fascism in America

Beginning in 1943, the War Department published a series of pamphlets for U.S. Army personnel in the European theater of World War II. Titled Army Talks, the series was designed “to help [the personnel] become better-informed men and women and therefore better soldiers.”

On March 24, 1945, the topic for the week was “FASCISM!”

“You are away from home, separated from your families, no longer at a civilian job or at school and many of you are risking your very lives,” the pamphlet explained, “because of a thing called fascism.” But, the publication asked, what is fascism? “Fascism is not the easiest thing to identify and analyze,” it said, “nor, once in power, is it easy to destroy. It is important for our future and that of the world that as many of us as possible understand the causes and practices of fascism, in order to combat it.”

Fascism, the U.S. government document explained, “is government by the few and for the few. The objective is seizure and control of the economic, political, social, and cultural life of the state.” “The people run democratic governments, but fascist governments run the people.”

“The basic principles of democracy stand in the way of their desires; hence—democracy must go! Anyone who is not a member of their inner gang has to do what he’s told. They permit no civil liberties, no equality before the law.” “Fascism treats women as mere breeders. ‘Children, kitchen, and the church,’ was the Nazi slogan for women,” the pamphlet said.

Fascists “make their own rules and change them when they choose…. They maintain themselves in power by use of force combined with propaganda based on primitive ideas of ‘blood’ and ‘race,’ by skillful manipulation of fear and hate, and by false promise of security. The propaganda glorifies war and insists it is smart and ‘realistic’ to be pitiless and violent.”

Fascists understood that “the fundamental principle of democracy—faith in the common sense of the common people—was the direct opposite of the fascist principle of rule by the elite few,” it explained, “[s]o they fought democracy…. They played political, religious, social, and economic groups against each other and seized power while these groups struggled.”

Americans should not be fooled into thinking that fascism could not come to America, the pamphlet warned; after all, “[w]e once laughed Hitler off as a harmless little clown with a funny mustache.” And indeed, the U.S. had experienced “sorry instances of mob sadism, lynchings, vigilantism, terror, and suppression of civil liberties. We have had our hooded gangs, Black Legions, Silver Shirts, and racial and religious bigots. All of them, in the name of Americanism, have used undemocratic methods and doctrines which…can be properly identified as ‘fascist.’”

The War Department thought it was important for Americans to understand the tactics fascists would use to take power in the United States. They would try to gain power “under the guise of ‘super-patriotism’ and ‘super-Americanism.’” And they would use three techniques:

First, they would pit religious, racial, and economic groups against one another to break down national unity. Part of that effort to divide and conquer would be a “well-planned ‘hate campaign’ against minority races, religions, and other groups.”

Second, they would deny any need for international cooperation, because that would fly in the face of their insistence that their supporters were better than everyone else. “In place of international cooperation, the fascists seek to substitute a perverted sort of ultra-nationalism which tells their people that they are the only people in the world who count. With this goes hatred and suspicion toward the people of all other nations.”

Third, fascists would insist that “the world has but two choices—either fascism or communism, and they label as ‘communists’ everyone who refuses to support them.”

It is “vitally important” to learn to spot native fascists, the government said, “even though they adopt names and slogans with popular appeal, drape themselves with the American flag, and attempt to carry out their program in the name of the democracy they are trying to destroy.”

The only way to stop the rise of fascism in the United States, the document said, “is by making our democracy work and by actively cooperating to preserve world peace and security.” In the midst of the insecurity of the modern world, the hatred at the root of fascism “fulfills a triple mission.” By dividing people, it weakens democracy. “By getting men to hate rather than to think,” it prevents them “from seeking the real cause and a democratic solution to the problem.” By falsely promising prosperity, it lures people to embrace its security.

“Fascism thrives on indifference and ignorance,” it warned. Freedom requires “being alert and on guard against the infringement not only of our own freedom but the freedom of every American. If we permit discrimination, prejudice, or hate to rob anyone of his democratic rights, our own freedom and all democracy is threatened.”

by US Army/War Department/Heather Cox Richardson, Letters from an American |  Read more:
Image: US Army
[ed. Dictators are gonna dictate, it's what they do. The real blame lies with supporters who give them their power, willingly. The people who think their personal fortunes or the country's will be enhanced by standing in the shadow of a strongman. And others: tuned out and oblivious, who "just aren't into politics" or rely on "talking points" to tell them what to think. It's all here. Now. See also: January 10, 2026:]
***
Yesterday, in an apparent attempt to regain control of the national narrative surrounding the deadly shooting of Renee Good in Minneapolis, Vice President J.D. Vance led the administration in pushing a video of the shooting captured by the shooter himself, Jonathan Ross, on his cell phone. (...)

What is truly astonishing is that the administration thought this video would exonerate Ross and support the administration’s insistence that he was under attack from a domestic terrorist trying to ram him with her car. The video was leaked to a right-wing news site, and Vance reposted it with the caption: “What the press has done in lying about this innocent law enforcement officer is disgusting. You should all be ashamed of yourselves.” The Department of Homeland Security reposted Vance’s post.

As senior editor of Lawfare Media Eric Columbus commented: “Do Vance and DHS think we can’t actually watch the video?” Multiple social media users noted that Good’s last words to Ross were “That’s fine. I’m not mad at you,” while his to her, after he shot her in the face, were “F*cking b*tch!”

In the case of the murder of Renee Good, the shooter and his protectors are clearly so isolated in their own authoritarian bubble they cannot see how regular Americans would react to the video of a woman smiling at a masked agent and saying: “That’s fine, dude. I’m not mad at you,” only to have him shoot her in the face and then spit out “F*cking b*tch” after he killed her. (...) [ed. Probably the same way they reacted to the storming of the Capitol Building...]

Although ICE currently employs more than 20,000 people, it is looking to hire over 10,000 more with the help of the money Republicans put in their One Big Beautiful Bill Act of July. That law tripled ICE’s budget for enforcement and deportation to about $30 billion.

On December 31, Drew Harwell and Joyce Sohyun Lee of the Washington Post reported that ICE was investing $100 million on what it called a “wartime recruitment” strategy to hire thousands of new officers. It planned to target gun rights supporters and military enthusiasts as well as those who listen to right-wing radio shows, directing ads to people who have gone to Ultimate Fighting Championship (UFC) fights or shopped for guns and tactical gear. It planned to send ads to the phone web browsers and social media feeds of people near military bases, NASCAR races, gun and trade shows, or college campuses, apparently not considering them the hotbeds of left-wing indoctrination right-wing politicians claim. (...)

When Kaitlan Collins of CNN asked Trump yesterday if he thought the FBI should be sharing information about the shooting of Renee Good with state officials, as is normally the case, Trump responded: “Well, normally, I would, but they’re crooked officials. I mean, Minneapolis and Minnesota, what a beautiful place, but it’s being destroyed. It’s got an incompetent governor fool. I mean, he’s a stupid person, and, uh, it looks like the number could be $19 billion stolen from a lot of people, but largely people from Somalia. They buy their vote, they vote in a group, they buy their vote. They sell more Mercedes-Benzes in that area than almost—can you imagine? You come over with no money and then shortly thereafter you’re driving a Mercedes-Benz. The whole thing is ridiculous. They’re very corrupt people. It’s a very corrupt state. I feel that I won Minnesota. I think I won it all three times. Nobody’s won it for since Richard Nixon won it many, many years ago. I won it all three times, in my opinion, and it’s a corrupt state, a corrupt voting state, and the Republicans ought to get smart and demand on voter ID. They ought to demand, maybe same-day voting and all of the other things that you have to have to safe election. But I won Minnesota three times that I didn’t get credit for. I did so well in that state, every time. The people were, they were crying. Every time after. That’s a crooked state. California’s a crooked state. Many crooked states. We have a very, very dishonest voting system.”

Trump lost Minnesota in 2016, 2020, and 2024.

Thursday, January 8, 2026

If You Give a Mouse a Cookie

Illustrations: Felicia Bond
[ed. For future reference. Wish I'd known about this book (and series) when my granddaughter was a bit younger, but maybe it's not too late (still seven, but she's growing up fast).]

Wikipedia Style Guide

Many people edit Wikipedia because they enjoy writing; however, that passion can result in overlong composition. This reflects a lack of time or commitment to refine an effort through successively more concise drafts. With some application, natural redundancies and digressions can often be eliminated. Recall the venerable paraphrase of Pascal: "I made this so long because I did not have time to make it shorter." [Wikipedia: tl;dr]

Inverted pyramid

Some articles follow the inverted pyramid structure of journalism, which can be seen in news articles that get directly to the point. The main feature of the inverted pyramid is placement of important information first, with a decreasing importance as the article advances. Originally developed so that the editors could cut from the bottom to fit an item into the available layout space, this style encourages brevity and prioritizes information, because many people expect to find important material early, and less important information later, where interest decreases. (...)

What Wikipedia is not

Wikipedia is not a manual, guidebook, textbook, or scientific journal. Articles and other encyclopedic content should be written in a formal tone. Standards for formal tone vary depending upon the subject matter but should usually match the style used in Featured- and Good-class articles in the same category. Encyclopedic writing has a fairly academic approach, while remaining clear and understandable. Formal tone means that the article should not be written using argot, slang, colloquialisms, doublespeak, legalese, or jargon that is unintelligible to an average reader; it means that the English language should be used in a businesslike manner (e.g. use "feel" or "atmosphere" instead of "vibes").

News style or persuasive writing

A Wikipedia article should not sound like a news article. Especially avoid bombastic wording, attempts at humor or cleverness, over-reliance on primary sources, editorializing, recentism, pull quotes, journalese, and headlinese.

Similarly, avoid persuasive writing, which has many of those faults and more of its own, most often various kinds of appeals to emotion and related fallacies. This style is used in press releases, advertising, editorial writing, activism, propaganda, proposals, formal debate, reviews, and much tabloid and sometimes investigative journalism. It is not Wikipedia's role to try to convince the reader of anything, only to provide the salient facts as best they can be determined, and the reliable sources for them.

Comparison of styles

via: Wikipedia: Writing better articles
Image: Benjamin Busch/Import Projects - Wikimedia commons 
[ed. In celebration of Wikipedia Day (roughly Jan. 15). It's easy to forget how awesome this product really is: a massive, free, indispensable resource tended by hundreds (thousands?) of volunteers for purely altruistic reasons. The best of the internet (and a reminder of what could have been). See also: Wikipedia:What Wikipedia is not]

Sunday, December 28, 2025

December 26, 2025: Christmas Greetings From the President

Axios reported on December 23 that the White House has taken over the X account of the Justice Department, and on the same day, that account tried to undercut the new information by claiming that accusations in it are “unfounded and false.” But Trump’s behavior on December 25, Christmas, suggested otherwise.

Trump’s social media account posted: “Merry Christmas to all, including the many Sleazebags who loved Jeffrey Epstein, gave him bundles of money, went to his Island, attended his parties, and thought he was the greatest guy on earth, only to ‘drop him like a dog’ when things got too HOT, falsely claimed they had nothing to do with him, didn’t know him, said he was a disgusting person, and then blame, of course, President Donald J. Trump, who was actually the only one who did drop Epstein, and long before it became fashionable to do so. When their names get brought out in the ongoing Radical Left Witch Hunt (plus one lowlife ‘Republican,’ Massie!), and it is revealed that they are Democrats all, there will be a lot of explaining to do, much like there was when it was made public that the Russia, Russia, Russia Hoax was a fictitious story—a total Scam—and had nothing to do with ‘TRUMP.’”

After misrepresenting the New York Times, he went on: “Now the same losers are at it again, only this time so many of their friends, mostly innocent, will be badly hurt and reputationally tarnished. But, sadly, that’s the way it is in the World of Corrupt Democrat Politics!!! Enjoy what may be your last Merry Christmas! President Donald J. Trump.” (...)

This evening, Trump posted: “Now 1,000,000 more pages on Epstein are found. DOJ is being forced to spend all of its time on this Democrat inspired Hoax. When do they say NO MORE, and work on Election Fraud etc. The Dem[ocrat]s are the ones who worked with Epstein, not the Republicans. Release all of their names, embarrass them, and get back to helping our Country! The Radical Left doesn’t want people talking about TRUMP & REPUBLICAN SUCCESS, only a long ago dead Jeffrey Epstein—Just another Witch Hunt!!!”

“I love the smell of panic in the evening,” former representative and Trump critic Adam Kinzinger (R-IL) posted over Trump’s screed. “Smells like… victory.”

Even before Trump’s evening post, in Meditations in an Emergency, Rebecca Solnit noted that it seems “clear that there is likely something in the files that further incriminates” Trump, an observation with which scholar of authoritarianism Timothy Snyder agreed. He added: “Horrible as the facts at hand are, there must be something else, something verging on the unimaginable.”

by Heather Cox Richardson, Letters From An American |  Read more:
Image: none, too disgusted
[ed. It's called Illeism (to talk about yourself in the third person):]
In the realm of clinical psychology, illeism takes on a whole new dimension. It’s been observed in certain personality disorders and mental health conditions, sometimes as a coping mechanism or a symptom of dissociation.
***
[ed. Also this: Americans are waking up. A grand reckoning awaits us (Guardian):]

The US had to come to this point. We couldn’t go on as we were, even under Democratic presidents. For 40 years, a narrow economic elite has been siphoning off ever more wealth and power.

I’m old enough to remember when the US had the largest and fastest-growing middle class in the world. We adhered to the basic bargain that if someone worked hard and played by the rules, they’d do better than their parents, and their children would do even better.

I remember when CEOs took home 20 times the pay of their workers, not 300 times. When members of Congress acted in the interests of their constituents rather than being bribed by campaign donations to do the bidding of big corporations and the super-wealthy.

I remember when our biggest domestic challenges were civil rights, women’s rights and gay rights – not the very survival of democracy and the rule of law.

But over the last 40 years, starting with Ronald Reagan, the US went off the rails: deregulation, privatization, free trade, wild gambling by Wall Street, union-busting, monopolization, record levels of inequality, stagnant wages for most, staggering wealth for a few, big money taking over our politics.

Corporate profits became more important than good jobs and good wages for all, stock buy-backs and the wellbeing of investors more important than the common good.

Democratic presidents were better than Republicans, to be sure, but the underlying rot worsened. It was undermining the foundations of the US.

Trump has precipitated a long-overdue reckoning.

That reckoning has revealed the rot.

It has also revealed the suck-up cowardice of so many CEOs, billionaires, Wall Street bankers, media moguls, tech titans, Republican politicians and other so-called “leaders” who have stayed silent or actively sought to curry Trump’s favor.

America’s so-called “leadership class” is a sham. Most of them do not care a whit for the rest of the US. They are out for themselves.

The “fucking nightmare” is not over by any stretch. It’s likely to get worse in 2026 as Trump and his sycophants, and many of America’s “leaders”, realize 2026 may be their last unrestrained year to inflict damage and siphon off the spoils.

Hollywood Has Left L.A.

The past decade has been tough on Hollywood — both the industry and the place. L.A. has endured a parade of black-swan catastrophes that have repeatedly upended its signature business, including fires, strikes, COVID, the decline of movie theaters and linear TV, and streaming’s boom and bust. Taken together, these disasters have triggered something like an identity crisis. If you call up a couple dozen executives, agents, directors, producers, writers, actors, and below-the-line artists and ask about the scene on the ground right now, they’ll describe a city detached from its old rhythms and sense of purpose. Today’s L.A., a few say, feels more like a Rust Belt crater than the glamorous capital of the world’s entertainment. “It’s so grim, like a sad company town where the mill is closing,” says one executive. “It’s morose, and everybody’s scared,” says the actor and director Mark Duplass. “It’s a bummer to live here now,” says a writer.

Pieces of the business still hum along in the city, albeit more quietly than they used to. Executives and agents are back in the office, at least the ones who weren’t laid off. Pitching and deal-making continue, though much of that now happens over Zoom. But production — the physical process of turning script pages into movies and TV shows — has largely left town. What began years ago as a trickle has suddenly become an exodus. Today, only about a fifth of American movies and shows are filmed in L.A. (...)

As the labor of making movies and shows splinters across far-flung cities and countries, Hollywood has become dislodged from its physical home. Some of these new hubs may suit the needs of individual projects, but none of them offers what L.A. did for most of the past century: a stable gravitational center where crews can make a living and the craft can be passed down. This isn’t just a logistical reorganization; it’s an existential shift, and there may be no going back. “The nucleus that Hollywood grew out of is dying,” says Jonathan Nolan, the writer-director whose work includes Person of Interest, Westworld, and Fallout. “I don’t think Hollywood the industry has much to do with Hollywood the place anymore,” says Lowe.

One reason L.A. even became a city at all is because it was a great place to make movies. (It helped that it was far enough from New Jersey to escape the enforcement of Thomas Edison’s patents on motion-picture cameras and projectors.) The weather allowed for year-round outdoor shooting, attracting the industry’s best filmmakers, actors, and crews. This created a self-reinforcing bubble in which the top talent was all concentrated in the same place; this, in turn, supported an informal apprenticeship system under which younger crew members learned on the job, providing a steady influx of skilled labor. For a long time, there was usually no good reason to shoot anywhere else.

Then, in the 1990s, British Columbia hatched a plan to bring some of that action north. The Canadian provincial government introduced one of the world’s first film tax credits — a financial incentive meant to lure foreign productions — offering a modest rebate on money spent employing local crews. It worked, and many U.S. states took notice. In the early aughts, Louisiana and New Mexico rolled out flashy credits of their own, transforming New Orleans and Albuquerque into viable production hubs.

A short while later, streaming boomed, and the demand for scripted entertainment exploded. With more production in play, regions around the world began ramping up their incentives, and many built soundstages and crew bases that could compete with those in L.A. What followed was a global bakeoff for Hollywood’s business. Canada, the U.K., and Australia enhanced their already aggressive tax credits, often made even more appealing by favorable exchange rates. Many U.S. states, including Georgia and New York, followed. Before long, most of the country, and dozens of countries beyond, were offering some version of a production subsidy.

These incentives can be shockingly generous. Today, producers can shoot in certain locations and receive back 30 to 40 percent of a project’s budget with local taxpayers footing the bill. Unlike traditional tax breaks, which merely reduce what a company owes, these credits often amount to direct cash payments, issued regardless of whether the production generates significant tax revenue in return. New York State, for example, offers a 30 percent base tax credit with a 10 percent bonus for projects made upstate. Last year, New York tax dollars helped subsidize TV shows including HBO’s The Gilded Age (which received $52 million from state coffers), Prime Video’s The Marvelous Mrs. Maisel ($46 million), and Apple TV+’s Severance ($39 million). An Albany-funded audit of New York’s film tax credit, published last year, determined that the incentive is probably a net loss for the state, returning as little as 15 cents in direct tax revenue for every dollar spent. Regardless, in May, New York added an additional $100 million for independent films, bringing the state’s total film subsidies to $800 million.

Meanwhile, California mostly sat on its hands, assuming its long-standing monopoly on talent and infrastructure would be enough to keep Hollywood anchored. Subsidizing an industry already based there was a tougher political sell, but the state introduced its own incentive program in 2009 and has sweetened it since. Unfortunately, while the credit may sound generous, in practice it’s miserly to the point of uselessness.

by Lane Brown, Vulture |  Read more:
Image: Alvaro Dominguez