Showing posts with label Movies. Show all posts

Tuesday, July 22, 2025

We Are Winning!

Something has changed in the last few days.

In recent months, we’ve been bombarded with millions of lousy AI songs, idiotic AI videos, and clumsy AI images. Error-filled AI texts are everywhere—from your workplace memos to the books sold on Amazon.com. (...)

All Fake

But something has changed in the last few days.

The garbage hasn’t disappeared. It’s still everywhere, stinking up the joint.

But people are disgusted, and finally pushing back. And they are doing so with such fervor that even the biggest AI companies are now getting nervous and pulling back.

Just consider this surprising headline:


This was stunning news. YouTube is part of the world’s largest AI slop promoter—namely the Google/Alphabet empire. How can they possibly abandon AI garbage? Their bosses are the biggest slopmasters of them all.

After this shocking news reverberated through the creative economy, YouTube started to backtrack. They said that they would not punish every AI video—some can still be monetized.

But even the revised guidelines are still a major blow to AI slop purveyors. YouTube made clear that “creators are required to disclose when their realistic content is altered or synthetic.” That’s a huge win—we finally have a requirement for disclosure, and it came straight from the dark planet Alphabet. [ed. whose motto used to be “don't be evil”]

YouTube also stressed that it opposes “content that is mass-produced or repetitive, which is content viewers often consider spam.” This is just a step away from blocking slop. 

What happened?

Maybe the folks at YouTube are just as disgusted by AI as the rest of us. Or maybe we have shamed them into taking action.

My view is that YouTube is (finally) reading the room. I’ve noted before that YouTube is the only part of the Google empire that actually understands creators and audiences. And (unlike their corporate overseers) they have figured out that AI slop is an embarrassment that will tarnish their brand.

The widespread mockery of the fake AI band Velvet Sundown might have been the turning point. This blew up in the last few days, and left AI promoters reeling.

Velvet Sundown is a non-existent AI band that got a million plays on Spotify. These deceptions have occurred in the past, but something different happened this time.

Music fans started mocking Spotify and its alleged promotion of a stupid slop band. The company was subjected to a level of ridicule and angry denunciation it has never endured before.

Journalists called this out as a hoax or fraud. And many speculated about Spotify’s role in the charade. After all, the company has been caught promoting AI slop in the past.

But this time Spotify got turned into a joke—or even worse. They were linked to a scam so clumsy that everyone was now making fun of them, as well as scrutinizing their policies and practices.

Rick Beato’s response to Velvet Sundown got two million views—so more people were watching takedowns of the band than listening to it. An industry group even demanded disclaimers and regulation.

And the jokes kept coming. People mocked the slop with more slop.


That must be painful to endure, even for the billionaire CEO of a streaming platform.

Whatever the reason, Spotify started to buckle. It actually began imposing restrictions on AI.

“Spotify has now pulled several uploads from the AI act and the associated Velvet Sundown,” reported Digital Music News on July 14.

It felt like the tide was now turning in the war against slop AI music.

Dylan Smith, one of the best sources on this subject, clearly thinks so. “Velvet Sundown’s Spotify pulldown,” he writes, “doesn’t exactly bode well for forthcoming AI releases.”

I’m focused here on AI’s destructive impact on culture, but there are other signs that growing AI resistance is now forcing companies to reconsider their bot mania.

“An IBM survey of 2,000 chief executives found three out of four AI projects failed to show a return on investment, a remarkably high failure rate,” reports Andrew Orlowski. “AI agents fail to complete the job successfully about 65 to 70 percent of the time, says a study by Carnegie Mellon University and Salesforce.”

He also shared the results of a devastating test that debunked AI’s status in its favorite field, namely writing code. This study reveals that software developers think they are operating 20% faster with AI, but they’re actually running 19% slower.

Some companies are bringing back human workers because AI can’t deliver positive results. Even AI researchers are now expressing skepticism. And only 30% of AI project leaders can say that their CEOs are happy with AI results.

This is called failure. There’s no other name for it.

And it will get worse. The Gartner Group is now predicting that 40% of AI agent programs will be cancelled before 2027—due to “rising costs, unclear business value and inadequate risk controls.”

by Ted Gioia, The Honest Broker |  Read more: 
Images: Bridge Chronicle/YouTube
[ed. I'd say temporary setback. The AI industry will eventually figure something out, they've got too much money and tech beavers involved not to. The product will get better, legislators will be lushly rewarded for IP protection and distribution, some hit movie/song will get made entirely by AI, some important (maybe unusual) event will occur and eventually be traced to it, etc. A million things could happen. So calling this winning seems a little premature. Likely we'll just get used to it over time (like advertising), with authenticity mostly a certification issue (if anyone cares. you have to wonder with taste these days). See also: I'm Sorry... This New Artist Completely Sucks ie. how to create a fake song of your own (with just two sentences) (Beato)]

Sunday, May 25, 2025

On Life in the Shadow of the Boomers

Ideology, which was once the road to action, has become a dead end.
—Daniel Bell (1960)

Yuval Levin’s 2016 book The Fractured Republic: Renewing America’s Social Contract in the Age of Individualism has several interesting passages inside it, but none so interesting as Levin’s meditation on the generational frame that clouds the modern mind. Levin maintains that 21st century Americans largely understand the last decades of the 20th century, and the first decades of the 21st, through the eyes of the Boomers. Many of the associations we have with various decades (say, the fifties with innocence and social conformity, or the sixties with explosive youthful energy), says Levin, had more to do with the life-stage in which Boomers experienced these decades than anything objective about the decades themselves:
Because they were born into a postwar economic expansion, they have been an exceptionally middle-class generation, targeted as consumers from birth. Producers and advertisers have flattered this generation for decades in an effort to shape their tastes and win their dollars. And the boomers’ economic power has only increased with time as they have grown older and wealthier. Today, baby boomers possess about half the consumer purchasing power of the American economy, and roughly three-quarters of all personal financial assets, although they are only about one-quarter of the population. All of this has also made the baby boomers an unusually self-aware generation. Bombarded from childhood with cultural messages about the promise and potential of their own cohort, they have conceived of themselves as a coherent group to a greater degree than any generation of Americans before them.

Since the middle of the twentieth century they have not only shaped the course of American life through their preferences and choices but also defined the nation’s self-understanding. Indeed, the baby boomers now utterly dominate our understanding of America’s postwar history, and in a very peculiar way. To see how, let us consider an average baby boomer: an American born in, say, 1950, who has spent his life comfortably in the broad middle class. This person experienced the 1950s as a child, and so remembers that era, through those innocent eyes, as a simple time of stability and wholesome values in which all things seemed possible.

By the mid-1960s, he was a teenager, and he recalls that time through a lens of youthful rebellion and growing cultural awareness—a period of idealism and promise. The music was great, the future was bright, but there were also great problems to tackle in the world, and he had the confidence of a teenager that his generation could do it right. In the 1970s, as a twenty-something entering the workforce and the adult world, he found that confidence shaken. Youthful idealism gave way to some cynicism about the potential for change, recreational drugs served more for distraction than inspiration, everything was unsettled, and the future seemed ominous and ambiguous. His recollection of that decade is drenched in cold sweat.

In the 1980s, in his thirties, he was settling down. His work likely fell into a manageable groove, he was building a family, and concerns about car loans, dentist bills, and the mortgage largely replaced an ambition to transform the world. This was the time when he first began to understand his parents, and he started to value stability, low taxes, and low crime. He looks back on that era as the onset of real adulthood. By the 1990s, in his forties, he was comfortable and confident, building wealth and stability. He worried that his kids were slackers and that the culture was corrupting them, and he began to be concerned about his own health and weight as fifty approached. But on the whole, our baby boomer enjoyed his forties—it was finally his generation’s chance to be in charge, and it looked to be working out.

As the twenty-first century dawned, our boomer turned fifty. He was still at the peak of his powers (and earnings), but he gradually began to peer over the hill toward old age. He started the decade with great confidence, but found it ultimately to be filled with unexpected dangers and unfamiliar forces. The world was becoming less and less his own, and it was hard to avoid the conclusion that he might be past his prime. He turned sixty-five in the middle of this decade, and in the midst of uncertainty and instability. Health and retirement now became prime concerns for him. The culture started to seem a little bewildering, and the economy seemed awfully insecure. He was not without hope. Indeed, in some respects, his outlook on the future has been improving a little as he contemplates retirement. He doesn’t exactly admire his children (that so-called “Generation X”), but they have exceeded his expectations, and his grandchildren (the youngest Millennials and those younger still) seem genuinely promising and special. As he contemplates their future, he does worry that they will be denied the extraordinary blend of circumstances that defined the world of his youth.

The economy, politics, and the culture just don’t work the way they used to, and frankly, it is difficult for him to imagine America two or three decades from now. He rebelled against the world he knew as a young man, but now it stands revealed to him as a paradise lost. How can it be regained? This portrait of changing attitudes is, of course, stylized for effect. But it offers the broad contours of how people tend to look at their world in different stages of life, and it shows how Americans (and, crucially, not just the boomers) tend to understand each of the past seven decades of our national life. This is no coincidence. We see our recent history through the boomers’ eyes. Were the 1950s really simple and wholesome? Were the 1960s really idealistic and rebellious? Were the 1970s aimless and anxious? Did we find our footing in the 1980s? Become comfortable and confident in the 1990s? Or more fearful and disoriented over the past decade and a half? As we shall see in the coming chapters, the answer in each case is not simply yes or no. But it is hard to deny that we all frequently view the postwar era in this way—through the lens of the boomer experience.

The boomers’ self-image casts a giant shadow over our politics, and it means we are inclined to look backward to find our prime. More liberal-leaning boomers miss the idealism of the flower of their youth, while more conservative ones, as might be expected, are more inclined to miss the stability and confidence of early middle age—so the Left yearns for the 1960s and the Right for the 1980s. But both are telling the same story: a boomer’s story of the America they have known. The trouble is that it is not only the boomers themselves who think this way about America, but all of us, especially in politics. We really have almost no self-understanding of our country in the years since World War II that is not in some fundamental way a baby-boomer narrative. [1]
When I first read this passage in 2018 I experienced it as a sort of revelation that suddenly unlocked many mysteries then turning in my mind.

To start with: The 1950s did not seem like an age of innocent idyll or bland conformity to the adults who lived through it. It was a decade when intellectual life was still attempting to come to terms with the horrors of World War II and the Holocaust. Consider a few famous book titles: Orwell’s 1984 (published 1949), Hersey’s The Wall (1950), Arendt’s The Origins of Totalitarianism (1951), Chambers’ Witness (1952), Miller’s The Crucible (1953), Bradbury’s Fahrenheit 451 (1953), Golding’s Lord of the Flies (1954), Pasternak’s Doctor Zhivago (1957), and Shirer’s Rise and Fall of the Third Reich (1960) were all intensely preoccupied with the weaknesses of liberalism and the allure of totalitarian solutions. For every optimistic summons to Tomorrowland, there was a Lionel Trilling, Reinhold Niebuhr, or Richard Hofstadter ready to declare Zion forever out of reach, hamstrung by the irony and tragedy of the American condition. Nor was it the wholesome era of memory. An age we associate with childlike obedience saw its children as anything but obedient—witness the anxiety of the age in films like The Wild One (1953), Rebel Without a Cause (1955), and Blackboard Jungle (1955). This age of innocence saw the inaugural issue of Playboy, the books Lolita (1955) and Peyton Place (1956) hitting the New York Times Fiction best seller list, the Kinsey reports topping the Non-fiction best seller list, and Little Richard inaugurating rock ‘n roll with the lyrics
Good Golly Miss Molly, sure like to ball
When you’re rocking and rolling
Can’t hear your mama call.
And that is all without considering a lost war in Korea, the tension of the larger Cold War, and the tumult of the Civil Rights revolution. We may think of the 1950s as an age of conformity, purity, and stability, but those who lived through it as adults experienced it as an age of fragmentation, permissiveness, and shattered innocence.[2]

Levin explains why our perception of the era differs so much from the perceptions of the adults who lived through it. We see it as an age of innocence because we see it through the eyes of the Boomers, who experienced this age as children. But his account also helps explain something else—that odd feeling I have whenever I watch YouTube clips of a show like What’s My Line. Though products of American pop culture, those shows seem like relics from an alien world, an antique past more different in manners and morals from the America of 2020 than many foreign lands today. However, this eerie feeling of an alien world does not descend upon me when I see a television show from the 1970s. The past may be a different country, but the border line is not crossed until we hit 1965.

This observation is not mine alone. In his new book, The Decadent Society: How We Became Victims of Our Own Success, Ross Douthat describes it as a more general feeling, a feeling expressed in many corners on the 30-year anniversary of the 1985 blockbuster Back to the Future. The plot of that film revolves around a contemporary teenager whisked back via time machine to the high school of his parents, 30 years earlier. When the film’s anniversary hit in 2015, many commented that the same plot could not work today. The 1980s simply seemed far too similar to the 2010s for the juxtaposition to entertain. Douthat explains why this might be so:
A small case study: in the original Back to the Future, Marty McFly invaded his father’s sleep dressed as “Darth Vader from the planet Vulcan.” The joke was that the pop culture of the 1960s and 1970s could be passed off as a genuine alien visitation because it would seem so strange to the ears of a 1950s teen. But thirty years after 1985, the year’s biggest blockbuster was a Star Wars movie about Darth Vader’s grandkid… which was directed by a filmmaker, J. J. Abrams, who was coming off rebooting Star Trek… which was part of a wider cinematic landscape dominated by “presold” comic-book properties developed when the baby boomers were young. A Martina McFly visiting the Reagan-era past from the late 2010s wouldn’t have a Vader/Vulcan prank to play, because her pop culture and her parents’ pop culture are strikingly the same….
by Tanner Greer, The Scholar's Stage |  Read more:
Image: via

Wednesday, May 14, 2025

Kazuo Ishiguro: A Pale View of Hills

Kazuo Ishiguro still remembers where he was when he wrote A Pale View of Hills: hunched over the dining room table in a bedsit in Cardiff. He was in his mid-20s then; he is 70 now. “I had no idea that the book would be published, let alone that I had a career ahead of me as a writer,” he says. “[But] the story remains an important part of me, not only because it was the start of my novel-writing life, but because it helped settle my relationship with Japan.”


First published in 1982, A Pale View of Hills is a charged family story that connects England with Japan and the present with the past. Now along comes a film version to provide a new frame for the mystery, a fresh view of the hills. Scripted and directed by Kei Ishikawa, it is a splendidly elegant and deliberate affair; a trail of carefully laid breadcrumbs that link a mothballed home in early 80s suburbia with wounded, resilient postwar Nagasaki. Middle-aged Etsuko is long settled in the UK and haunted by the fate of her displaced eldest child. Her younger daughter, Niki, is a budding writer, borderline skint and keen to make a name for herself. Niki has a chunky tape-recorder and plenty of time on her hands. She says, “Mum, will you tell me about your lives before, in Japan?”

In awarding Ishiguro the Nobel prize for literature in 2017, the Swedish Academy paid tribute to the emotional force of his prose and his focus on “memory, time and self-delusion”. These are the themes that colour all his fiction, whether he is writing about the below-stairs staff at a stately home (The Remains of the Day), sacrificial children at an elite boarding school (Never Let Me Go) or aged wanderers in Arthurian Britain (The Buried Giant), although they seem closest to home in A Pale View of Hills.

The story lightly excavates the author’s family history and his own hybrid identity as a child of Nagasaki, transplanted to the UK at the age of five. Fittingly, the movie version premieres at the Cannes film festival, where it risks getting lost amid the palm trees, yachts and bling. Cultural dislocation, in large part, is what the tale is about.

I’m tempted to view Niki – the bumptious young writer from whom no family secret is safe – as Ishiguro’s alter ego. Actually, he says, she was conceived as “more a ‘reader proxy’ than a writer one”. She’s our entry point to the story; possibly our red thread through the maze. It’s hard to believe today, he adds, but most contemporary British readers were resistant to Japanese stories and characters and needed a reassuring western presence to help ease them in.

Niki is played in the film by Camilla Aiko, a recent graduate of the Bristol Old Vic theatre school. She sees the character as the story’s truth-seeker, the eyes of the audience, and the picture itself as the tale of two women who struggle to connect. “It didn’t cross my mind – maybe it should have – that I was playing Ishiguro,” she says.

What she shares with the author is the same blended cultural heritage. Aiko is British mixed-race – her mother is Japanese. “And the thing about being mixed-race is that I find it difficult speaking for Japanese people or British people because I’m not sure which side I’m on. In Japan I’m a foreigner; here I’m Asian. As an actor I’m someone who tries to slip through the cracks.”

Niki isn’t Ishiguro. Nonetheless, the author admits that there are parallels. He says, “Where I see myself in Niki – and I was reminded of this watching Camilla Aiko’s fine performance – is in her sometimes uncomfortable, sometimes coy and cunning curiosity when coaxing memories from her mother of another, more troubled time.”

It is the mother, after all, who looms largest in the tale. Etsuko in a sense has led two lives and been two different people. In 80s England she is a respectable widowed music teacher. In Nagasaki seven years after the atomic bomb dropped, she’s a harried young bride, contaminated with radiation and a potential hazard to her unborn child. She needs a friend or an escape route, whichever comes first. But she is never an entirely reliable narrator – and the family story she tells Niki finally doesn’t add up.

What did Ishiguro’s own mother make of A Pale View of Hills? “I believe it remained special to her among my books,” he says. “A little before I started the book, with cold war tensions intensifying in the Reagan-Brezhnev era, she said to me she felt it was important she should relate to me some of her experiences in Nagasaki. Partly because I was of the next generation, but also because I was wanting to be a writer and had a chance to pass things on … A Pale View of Hills didn’t use any of her stories directly, but I think she thought the book was some sort of evolution of them, and closer to her than the books I wrote later.” Ishiguro’s mother died in 2019, aged 92. After watching Ishikawa’s adaptation, he thought: “What a pity she wasn’t here to see this film.”

Cinema is an enduring passion for Ishiguro and influences his writing as much as literature does. His favourite recent films include the Oscar-winning animation Flow, about a small soot-grey cat who survives a great flood, plus the French legal dramas Anatomy of a Fall and Saint Omer (“Is French justice really conducted like this? Or are these hallucinatory versions of French courts?”).

A few years back, between novels, he wrote the screenplay for Living – a quietly wrenching adaptation of Akira Kurosawa’s 1952 classic Ikiru, relocated to London and starring Bill Nighy and Aimee Lou Wood. The poster for Ikiru, incidentally, can be glimpsed on the street in A Pale View of Hills.


Loving film can be a double-edged sword. Is it a help or a hindrance when it comes to having his own work adapted? Hopefully the former, Ishiguro says, so long as he maintains a safe distance. “I have a strict rule not to attempt to adapt any of my novels myself,” adds the writer, who is speaking to me by email. “As long as I keep well in the background, I don’t think I’m necessarily a hindrance. I always emphasise to film-makers that they have to own the film – that it shouldn’t be approached reverentially.”

Merchant-Ivory managed a near perfect adaptation of The Remains of the Day. Mark Romanek and Alex Garland crafted an appropriately haunting, chilly version of Never Let Me Go. Both films preserve Ishiguro’s distinctive style and flavour. The restraint and simplicity; the sense of deep mystery. Both, though, remain films first and foremost. They have been allowed to migrate and adapt to a new habitat.

“This is personal to me,” he says, “but I lean toward the film version moving the story on – not being a faithful translation the way a foreign language edition of a book might be. I know many novelists who’d be annoyed to hear me say this … The thing is, I watch many, many films and when an adaptation of a well-known book doesn’t work, 95% of the time it’s because the film-makers have been too reverential to the source.” Books and films are very different, he thinks. “They’re sometimes almost antithetical.”

In A Pale View of Hills, Etsuko hands her story on to Niki. Niki, in turn, will write it up how she likes. So this is a family story about family stories, aware of how they warp and change in the telling. Every tale is subject to the same cultural static. They are adapted and extrapolated, lost and found in translation. One might even say that’s what keeps a story alive.

by Xan Brooks, The Guardian |  Read more:
Images: Chris Pizzello/Invision/AP; Pale View Partners; YouTube

Thursday, May 8, 2025

Harrison Ford and the Origin of Western Civilization

TED:

So what do you want to talk about today?

INTERVIEWER:

Today I want you to stop acting so elitist—that’s why we’re going to talk about action films. What’s your favorite?

TED:

I’m not as elitist as you think. I’ve written hundreds of essays about science fiction, horror stories, locked-room mysteries, TV westerns, and other types of popular entertainment.

By the way, I love action movies of all sorts—I even have a Jackie Chan poster on my bedroom wall.

INTERVIEWER:

Is that true?

TED:

No, I just made that up.

But I do enjoy Jackie Chan’s movies, especially the early ones. I would consider putting a Jackie Chan poster on the wall, but Tara would veto that.

She already made me take down my autographed photo of Jake LaMotta—she said it clashed with the decor.


INTERVIEWER:

She is probably right. But let’s go back to my original question. What’s your favorite action film?

TED:

That’s hard to answer. There was a very good movie about LaMotta…

INTERVIEWER:

That doesn’t count. It wasn’t a real action movie. Pick another one.

TED:

Huh? There were plenty of fight scenes in it. But I’ll take you at your word, and choose another movie.

[Ted stops and thinks.]

Okay, I’ve got an answer for you. The action movie I’ve seen most often is The Fugitive—starring Harrison Ford and Tommy Lee Jones. I’ve watched it so many times, I’ve lost count.

INTERVIEWER:

What do you like about it?

TED:

For a start, it’s the exact counterpart of Homer’s Odyssey….

INTERVIEWER:

Gimme a break, you’re doing it again. I said no elitist stuff today. So you’re not allowed to talk about Homer and ancient epic poetry.

TED:

Hey, hear me out. Homer’s Odyssey is also an adventure story—and not for elites. This story has entertained youngsters for thousands of years.

And it’s my favorite kind of adventure story.

INTERVIEWER:

Why is that?

TED:

The Odyssey was the first adventure story in Western culture about a hero who prevails through intelligence and reasoning, not fighting and bloodshed.

That’s a big deal. It signals the moment when the West emerged from savagery—assuming that we have emerged from savagery.

Odysseus is not a brave soldier—if you’ve read Pseudo-Apollodorus, you will know that he tried to avoid fighting in the Trojan War by pretending to be crazy.

INTERVIEWER:

Sudoku app adores us? What the devil are you talking about?

TED:

Don’t worry about Sudoku. I’m trying to explain that Odysseus was the first adventurer who hates adventure. There’s a postmodern concept for you. He doesn’t even like fighting—he prefers to use his wiles and cunning.

This is the greatest turning point in Western culture. We finally have an alternative to the reciprocal violence that dominates so much of human history. The worst mistakes we’ve made in the West have taken place when we have forgotten that alternative.

But, of course, it’s also a breakthrough in storytelling.

Homer’s previous epic, the Iliad, is all about bravery and violence on the battlefield. Some 240 battlefield deaths are described during the course of that brutal poem—frequently related in grisly detail.

But the Odyssey is totally different. The hero is actually portrayed as a coward.

Homer drops a hint when he says that Odysseus places his ship in the exact middle of all the Greek boats on the shore of Troy—that’s the safest place in the event of a surprise attack by the Trojans. Homer doesn’t say it explicitly, but he implies that Odysseus always had an escape plan, and needed to ensure that his ship was available for a hasty retreat.

INTERVIEWER:

What does this have to do with Harrison Ford and The Fugitive?

TED:

It has everything to do with it. In The Fugitive, Harrison Ford succeeds through cunning and intelligence. There’s that great scene when Ford’s colleague tells Tommy Lee Jones: “You will never find him. He is too smart.”

Just as the Odyssey represents a shift away from the obsessive violence of the Iliad, Harrison turns his back on the constant battling of his previous manifestations in Star Wars and Indiana Jones.

In the movie poster for The Fugitive, Ford is actually running away from the fight—much like Odysseus tried to do.

So this is a great moment in Hollywood action movies. There’s actually very little fighting in The Fugitive. Ford even risks capture at one point by saving a person’s life. And that makes perfect sense because he is playing Dr. Richard Kimble, who—like all doctors—has taken a Hippocratic oath to avoid harm and do good.

That’s why The Fugitive is so satisfying to watch. We finally have a hero who really does good deeds and avoids reciprocal violence. And when he must engage in conflict, he out-thinks his opponent—instead of fighting and killing.

In fact, the entire point of the film is that Dr. Kimble is an innocent man. He has been falsely accused (of murdering his wife), and his only goal in this movie is to prove his innocence and his commitment to doing good.

I won’t give away spoilers. But in the final minutes of the film, he applies that Hippocratic Oath to do good through medicine and healing in a very unexpected way. You might even say that he saves thousands of people—in addition to himself.

But there are many other similarities between The Fugitive and Homer’s great epic the Odyssey.

INTERVIEWER:

What other similarities?

TED:

Like all great epic poets, Homer starts the Odyssey in the middle of the story—literary critics call this in medias res. Homer may even have invented this storytelling technique.

The Fugitive follows the same pattern. The movie begins after our hero Dr. Richard Kimble has been falsely accused and convicted of his wife’s murder. So (as in the Odyssey) we must learn about these incidents through flashbacks.

In the case of the Odyssey, our hero must battle a one-eyed monster—the Cyclops!—in order to survive and prevail. The same thing happens in The Fugitive, except that Dr. Kimble needs to deal with a one-armed monster who murdered his wife.

INTERVIEWER:

This is just coincidence. Stop playing games with me…

TED:

You’re totally wrong about that.

Let me ask you a question now. What’s the name of the one-armed man in The Fugitive?

INTERVIEWER:

I have no idea.

TED:

The character’s name is Sykes. This reference to the Cyclops would be obvious to any classicist in the audience.

Can’t you see that the filmmaker wants to remind us of the Odyssey?

INTERVIEWER:

You’re blowing my mind. Is that for real?

TED:

Go ahead, check it out for yourself.

But let me go on. There’s a whole web of connections here.

I’m not even going to talk about the obvious ones—for example, Homer frequently refers to Odysseus as “great-hearted” while Dr. Kimble is an actual heart surgeon. And Odysseus’s troubles began with Helen of Troy, while Dr. Kimble’s problems begin with his wife Helen—both victims of fighting men who intrude into their peaceful lives.

Those are just tiny details. The plot is the main source of my interest here.

In the Odyssey, our hero must survive a shipwreck—and later must escape from captivity on an island, where Calypso wants to hold him for the rest of his life. In The Fugitive, Harrison Ford needs to survive a train wreck—which allows him to escape from captivity as a prisoner on death row, where he would otherwise spend the rest of his life.

In the Odyssey, our hero eventually returns unexpectedly to his native land—the island of Ithaca—where he faces his final and greatest challenges. In The Fugitive, the US Marshals are shocked when Dr. Kimble returns to—can you guess it?—his home town of Chicago.

That’s the last thing they expected from a runaway fugitive. “Sonofabitch,” declares Tommy Lee Jones, “our boy came home.”

But, of course, a homecoming is necessary in this type of adventure story. These heroes must return home to resolve all the dangers and obstacles they face. And in that familiar terrain, both heroes prevail against heavy odds.

By the way, both the Odyssey and The Fugitive culminate with an unexpected confrontation in a crowded banquet hall in that same home town. The parallelism is now complete.

And this brings me to my favorite part of the story.

by Ted Gioia, The Honest Broker |  Read more:
Images: Ted Gioia and The Fugitive
[ed. Interesting take, even though The Fugitive was initially produced for TV in 1963 (starring David Janssen), and as far as I know had none of these themes/connections.]

Wednesday, May 7, 2025

Is This the Worst-Ever Era of American Pop Culture?

Last year, I visited the music historian Ted Gioia to talk about the death of civilization.

He welcomed me into his suburban-Texas home and showed me to a sunlit library. At the center of the room, arranged neatly on a countertop, stood 41 books. These, he said, were the books I needed to read.

The display included all seven volumes of Edward Gibbon’s 18th-century opus, The Decline and Fall of the Roman Empire; both volumes of Oswald Spengler’s World War I–era tract, The Decline of the West; and a 2,500-year-old account of the Peloponnesian War by Thucydides, who “was the first historian to look at his own culture, Greece, and say, I’m going to tell you the story of how stupid we were,” Gioia explained.

Gioia’s contributions to this lineage of doomsaying have made him into something of an internet celebrity. For most of his career, he was best-known for writing about jazz. But with his Substack newsletter, The Honest Broker, he’s attracted a large and avid readership by taking on contemporary culture—and arguing that it’s terrible. America’s “creative energy” has been sapped, he told me, and the results can be seen in the diminished quality of arts and entertainment, with knock-on effects to the country’s happiness and even its political stability.

He’s not alone in fearing that we’ve entered a cultural dark age. According to a recent YouGov poll, Americans rate the 2020s as the worst decade in a century for music, movies, fashion, TV, and sports. A 2023 story in The New York Times Magazine declared that we’re in the “least innovative, least transformative, least pioneering century for culture since the invention of the printing press.” An art critic for The Guardian recently proclaimed that “the avant garde is dead.”

What’s so jarring about these declarations of malaise is that we should, logically, be in a renaissance. The internet has caused a Cambrian explosion of creative expression by allowing artists to execute and distribute their visions with unprecedented ease. More than 500 scripted TV shows get made every year; streaming services reportedly add about 100,000 songs every day. We have podcasts that cater to every niche passion and video games of novelistic sophistication. Technology companies like to say that they’ve democratized the arts, enabling exciting collisions of ideas from unlikely talents. Yet no one seems very happy about the results.

To a certain extent, such negativity may simply reflect an innate human tendency to fret about decline. Some of the most liberating developments in history have first triggered fears of social stultification. The advent of the printing press caused 15th-century thinkers to complain of mass distraction. In 1964, The Atlantic published an essay predicting, not unpersuasively, that rock and roll would only foster conformity and consumerism in young Americans.

For as long as I have been a critic at this magazine, I’ve tried to cut against the declinist impulse. The year I started the job, 2011, was a turning point of sorts: Spotify launched in America that July; Netflix debuted its first original series soon after. The brainy rock bands that I’d grown up loving—Radiohead, Wilco—were starting to fade in importance, but pop, hip-hop, and electronic music were cross-pollinating in fascinating ways. Understanding change, and appreciating how human creativity flourishes anew in each era, always seemed to be the point of the job.

Yet the 2020s have tested my optimism. The chaos of TikTok, the disruption of the pandemic, and the threat of AI have destabilized any coherent story of progress driving the arts forward. In its place, a narrative of decay has taken hold, evangelized by critics such as Gioia. They’re citing very real problems: Hollywood’s regurgitation of intellectual property; partisan culture wars hijacking actual culture; unsustainable economic conditions for artists; the addicting, distracting effects of modern technology.

I wanted to meet with some of the most articulate pessimists to test the validity of their ideas, and to see whether a story other than decline might yet be told. Previous periods of change have yielded great artistic breakthroughs: Industrialization begat Romanticism; World War I awakened the modernists. Either something similar is happening now and we’re not yet able to see it, or we really have, at last, slid into the wasteland. (...)

Stagnation

Cynicism

Acceleration

by Spencer Kornhaber, The Atlantic | Read more:
Image: Javier Jaén

Saturday, May 3, 2025


Peter Lorre as Raskolnikov in “Crime and Punishment” by Lusha Nelson, 1935.
via:

Wednesday, April 30, 2025

Wall Street’s Not-So-Golden Rule

We’re all familiar with the Golden Rule — Do unto others as you would have them do unto you — and I don’t think it’s a stretch to say that its message of reciprocity and empathy is the bedrock of human civilization, certainly of Judeo-Christian thought. As Hillel the Elder said, “What is hateful to you, do not do to your neighbor. That is the whole Torah. The rest is commentary.”

There’s a variation of the Golden Rule — I don’t think it’s a stretch to call it a perversion — that is the bedrock of the business of Money, a business that goes by the shorthand of ‘Wall Street’. This not-so-Golden Rule is the source of pretty much all of the unexpected Bad Things that happen from time to time in markets, where there’s a shock to the system that ‘no one could have foreseen’, like a sudden crash in the price of something or like a run on a bank or an investment firm. That perversion of the Golden Rule is this:

Do unto others as they would do unto you. But do it first.

It’s a perversion of the Golden Rule in two ways. First and most obviously, it’s got that extra sentence about doing the thing before the other guy. But second and less obviously, it’s normative-negative, which is a ten-dollar phrase to say that it’s not talking about doing good things (‘as you would have them do’), but is pretty obviously saying that you should do something that will actively hurt the other guy.

If you’re in the business of Money for more than a nanosecond, you will see this not-so-Golden Rule in action all around you. More to the point, if you want to stay in the business of Money and be successful in the business of Money, you must adopt and live by this not-so-Golden Rule yourself. Seems harsh, I know, but as Hyman Roth so aptly put it in The Godfather, Part II, “this is the business we have chosen.”

And it IS harsh. You can rationalize it by saying that he would have done the same thing to you if the situation had been reversed — and you are almost certainly correct in that assessment! — but the fact remains that YOU are doing the negative thing to the other guy. If you’re a thinking, feeling, non-sociopathic human being you will feel bad about doing that negative thing, but you will also get over it pretty quickly because it is absolutely, unequivocally, 100% the rational thing to do, and if you’ve been entrusted with managing Other People’s Money you have a moral if not legal obligation to do that rational thing despite the blecch feeling you have inside.

The first time I experienced that blecch feeling keenly was in December 2007 when I called our Bear Stearns rep and told him that we had decided to leave Bear Stearns as our hedge fund’s prime broker and we were pulling our money out. A prime broker is basically the ‘bank’ for a hedge fund. They provide lots of services, but the main ones are that they lend you money against the value of your portfolio so that you can buy more stock without using actual cash to go long (bet that the stock price will go up), and they locate and secure the shares of stock that you have to borrow in order to go short (bet that the stock price will go down). In exchange you pay them interest on the ‘leverage’ you used to buy more stock, just like you’d pay interest on a bank loan, and even more importantly from their perspective (and also just like a bank) you ‘deposit’ your stock holdings and some cash with them, which they can use to fund the loans and leverage they’re making available to other clients. It’s arguably the most important counterparty relationship that most hedge funds will have, certainly back then, and it’s a very profitable business for Wall Street investment banks, certainly back then.

What you need to understand is that I didn’t like working with Bear Stearns … I loved working with Bear Stearns. Loved the people, loved the attitude, loved the business terms. Bear Stearns was famously unafraid to take a chance on up-and-comers, both in its hiring of non-pedigreed entry-level employees (preferring, in legendary CEO Ace Greenberg’s words, to hire people who were ‘PSDs’: poor, smart, with a deep desire to be rich) and in its willingness to work with non-pedigreed hedge funds like mine. To be sure, it helped that the larger firm of which my fund was a part was filled with ex-Bear employees, all friends who would vouch for me and my partner. This was back in the day when vouching for someone meant something. It still does, I suppose, but a lot less than it used to. Bear stepped up to be our hedge fund’s prime broker from the very start, putting real time and real effort into a dinky little fund when nobody else would. Yes, they made good money off our business as we grew into a non-dinky fund, but I also owed a personal debt of gratitude to Bear Stearns for taking a chance on us.

And it didn’t matter.

Once I figured out in late fall of 2007 that if we had a nationwide decline in home prices, Bear Stearns faced enormous potential losses in the mortgage-backed securities that they owned, losses big enough to wipe out the entire bank because of their internal leverage on assets – or rather, once I suspected that I had figured this out, because you never know this stuff for sure unless you’re on the inside — then I knew for a certainty that it was only a matter of time before other prime broker clients of Bear Stearns would come to the same suspicion. And once that word got around — that there were doubts and suspicions about Bear Stearns as a counterparty — then I knew for a certainty that what would start as a trickle of clients taking their money out of the prime brokerage ‘bank’ would become a stream and then a river and then … well, then the dam breaks and the investment bank fails and if you’re still there as a prime brokerage client you get really, really hurt.

It didn’t matter if I was right about Bear Stearns and the risks to their balance sheet. I was, but I swear that didn’t matter. What mattered was the not-so-Golden Rule of Wall Street. What mattered is that you must act first when you have even a suspicion of counterparty risk, well before you know for sure whether or not you are ‘right’ about that risk, because everyone else on Wall Street will act first if you don’t. And if you don’t act first, or at least early … if you wait until you’re sure that there’s a counterparty risk … well, you’re screwed.


In December 2007, Bear Stearns still traded for over $100/share. In three months, it was below $5, before finally being taken out by JP Morgan for $10/share in a mercy killing. From suspicions to lights out in three months. Life comes at you fast when the not-so-Golden Rule of Wall Street comes into play. Getting out when we did saved our fund untold hassle and legal tie-ups, gave us the time to move to another prime broker out of strength and not desperation, and set us up for a career-making year in 2008.

Is this sort of run on the bank a self-fulfilling prophecy of doubt and ruin? Yep. If everyone had just kept their prime brokerage account in place would Bear Stearns have survived? Maybe. Do you have a choice but to get out before everyone else does, no matter how much it pains you personally and no matter how much your getting out might accelerate the sad and disappointing outcome? Nope. This is the business we have chosen. (...)

Why am I telling you this story?

I’m telling you this story because I think that Trump a) recognizes he made a mistake by overplaying the tariff card, b) is sidelining the ideologue pro-tariff crew like Navarro and Miran, and c) is actively looking for off-ramps and de-escalation in the China trade war. I think he may find an off-ramp and de-escalation in the China trade war, and that would be a wonderful thing for the United States and the world.

And it doesn’t matter.

by Ben Hunt, Epsilon Theory |  Read more:
Image: Margin Call (2011); Godfather Part II
[ed. Trust lost is almost impossible to regain. See also: ‘Trump wanted to break us’, says Carney as Liberals triumph in Canadian election (Guardian); and (the not to be missed) Crashing the Car of Pax Americana (Epsilon Theory).]

"Mirroring a theme of the campaign, Carney told election-night supporters that Trump wanted to “break us, so that America can own us”, adding: “That will never, ever happen,” to shouts from the crowd.

He also gave a stark assessment of a world order once defined by an integrated global trading system with the US at the centre, saying such a system was over, and he pledged to reshape Canada’s relationships with other nations.

“We are over the shock over American betrayal. But we will never forget the lessons,” he said."

[ed. And this: 2035: An Allocator Looks Back Over the Last 10 Years (AQR):]

"We really did not see this underperformance coming. After all, the prior 30 years saw much higher IRRs on private equity than total returns on public equity. What we didn’t count on, I mean who could see this coming, was this outperformance reversing. I mean, what better way is there to estimate what will happen in the future than looking at what happened in the past!?"

Tuesday, April 15, 2025

How We All Became Clint Eastwood

Is This the Dominant Personality Type of Our Time?

Filmmaker Sergio Leone once explained why Clint Eastwood was a perfect actor for his movies. Eastwood’s portrayal of a cowboy, he explained, “only had two expressions: with hat and no hat.”

That might sound like criticism, or even mockery. But Leone needed a hero who presented a mask to the audience. In Eastwood, he found someone who did that naturally—as part of his acting style.

But Leone got lucky.

At least eight different actors—from Henry Fonda to Steve Reeves—turned down the role of the nameless stranger who destroys an entire Wild West town in A Fistful of Dollars (1964). With no better options, he hired an unproven film actor who possessed an extremely narrow range of facial and vocal expression.

That turned out to be just what he needed. But some people think this is terrible acting.

Talk show guest Ray Liotta left everyone in stunned silence when he said that Clint Eastwood was the most overrated actor of his generation. But Liotta doubled down—turning to the audience and saying: “I don’t give a sh-t.”

Even so, it’s hard to criticize Eastwood—because this flat style of acting became so pervasive in subsequent years. His detached, emotionless on-screen persona has served as a role model for countless heroes and villains.

Just think of all those Arnold Schwarzenegger movies where the dialogue became famous because it was delivered so mechanically. Or consider Anton Chigurh in No Country for Old Men—a performance praised by experts for its authentic portrayal of a psychopath. Or even Bryan Cranston in Breaking Bad, who became more emotionless and detached with each passing season.

After Eastwood, this exact same playbook worked for both heroes and villains. An extreme example is Terminator 2—where both the good guy (Schwarzenegger) and bad guy (a T-1000 killer robot) battle to see who can achieve the most expressionless persona.

But the defining villain of this style remains Darth Vader. Eastwood had a face like a mask, according to Leone, but Darth Vader wears a literal mask. Not only can’t you see his face, but you aren’t even allowed to hear his natural voice—which has been processed to sound as inhuman as possible.

Clint Eastwood, for his part, continued to work variants on this character type—making millions of dollars in the process. In his career-defining Dirty Harry films, he showed that he required no cowboy hat to work this trope—although he repeats the gimmick of using up all six bullets that was so effective in the closing scene of A Fistful of Dollars.

No, Eastwood didn’t invent deadpan acting. But it had originally been done for laughs—most famously by Buster Keaton. In fact, the first use of the word “deadpan” in print (from 1915) refers specifically to Keaton.

Often the deadpan role went to the so-called “straight man” in comic duos—Martin (for Lewis), Abbott (for Costello), Rowan (for Martin), Smothers (for Smothers), etc. But these flat sidekicks were as necessary as the punchline in creating comic effects.

This deadpan demeanor was intrinsically funny, because any person with so little personality is weird, and makes us laugh.

Before Eastwood, we only see a few hints of this style in dramatic or action films, for example James Dean’s Rebel Without a Cause or Sean Connery’s James Bond. But they both seem positively giddy compared with Eastwood’s cold and wooden demeanor.

I call this the “Man without Personality”—and it’s almost always a man. When psychologists studied this character type, they identified 126 movie characters of this sort, and only 21 were female.

So let’s give credit to Glenn Close (in Fatal Attraction) and Sharon Stone (in Basic Instinct). But they are far outnumbered by male cinematic psychopaths with flattened personalities—such as Kevin Spacey (in The Usual Suspects) or Daniel Day-Lewis (in Gangs of New York) or Anthony Hopkins (in The Silence of the Lambs).

Sometimes these characters are actual machines (as in Terminator or RoboCop or 2001: A Space Odyssey). But even when they are made of flesh-and-blood, they retain obvious robotic elements.

It’s disturbing how much pop culture has fallen in love with these mechanical figures. But even worse, in the world of Zero Personality, all moral values become irrelevant.

That was true even for Eastwood’s debut as the unnamed stranger back in 1964. He does two good deeds during the course of the film—but at the cost of killing (directly or indirectly) most of the citizenry during the course of 90 macabre minutes.

What a bizarre story to tell. And it raises obvious questions:

Where did this personality type come from? And how did it become so popular? (...)

Americans needed decadent Europeans to blaze the trail. We were too optimistic. But they had seen evil, up close and personal. And had stories to tell.

Alfred Hitchcock—an émigré himself—was the only other influential source for this character type in Hollywood films. But Hitchcock turned to psychotics for horror and repulsion, not audience acclaim.

And even Hitchcock knew the European philosophical roots of this personality style. In his underrated masterpiece Rope (1948) he even introduces a Nietzschean professor (played by Jimmy Stewart, of all people!).

He returned to this character type in Psycho (1960)—but, once again, for horror not heroism. And audiences were shocked. Even though there is little graphic violence on screen, the public found this film deeply disturbing—to a degree that Hitchcock himself never matched, before or after.

Then, over the course of just a few years, this murderous psycho went from villain to hero.

By the time we get to Dirty Harry (1971) and Death Wish (1974)—both starring Leone alums—audiences are actually cheering and clapping when the sadistic and expressionless protagonist commits cold-blooded murder. [ed. John Wick]

And here’s the scariest part of the story.

We’ve all become Clint Eastwood today.

Okay, maybe not everybody. But the main forums of public discourse on social media are filled with flat emotionless people who flare up into anger at the slightest provocation.

None of us saw this coming with the rise of the Internet. At least, I didn’t—nor did I hear anyone else predict the eventual effects back in the mid-1990s.

But maybe we should have anticipated it.

by Ted Gioia, The Honest Broker |  Read more:
Images: Warner Bros./Paramount
[ed. See also: Subversively Human: A Conversation with Ted Gioia (Image Journal); and, Psychiatrists Declare No Country For Old Men Character As Most Realistic Portrayal Of A Psychopath (Unilad).]

Thursday, April 10, 2025

Studio Ghibli, My Neighbour Totoro (Chinese poster) 2018.

Sunday, March 23, 2025

On Being a Thai Girl

[ed. White Lotus. Haven't watched the series but this scene was recommended for its sheer depravity and dissociative perversity. There's actually a psychological term for this condition (of course): Autogynephilia. What acting:]

"In a recent episode, set in Thailand, actor Sam Rockwell delivers an intense monologue on how he took partying as far as it could go. And he backs up this claim with a lurid account of excesses and fetishes beyond anything I’ve heard on TV before.

But nobody mentioned the key fact.

Rockwell talks about how he walked away from this wild self, and found serenity in a Buddhist life of detachment and enlightenment. And this shows in how he delivers his monologue, which grabs our attention precisely because its serene tone is such a mismatch with the activities described.

But, of course, viewers didn’t latch on to that—because that kind of message never shows up in a TV series.

Or does it?

It’s worth repeating this: the white lotus is a symbol of purity and enlightenment—and emergence from darkness, especially in times of turmoil or political crisis.

“You’re getting what you asked for.”

That’s how people describe a punishment—the curse of getting what you want. And it’s been true since the Garden of Eden.

Consider this in the context of the algorithm—a feedback technology designed to give people exactly what they want." 

via: THB

Monday, March 17, 2025

Why Adolescence is Such Powerful TV That It Could Save Lives


The arrival of searing new series Adolescence could hardly be more timely...

On a street level, it’s about knife crime. Over the past decade, the number of UK teenagers killed with a blade or sharp object has risen by 240%. On a cultural level, it’s about cyberbullying, the malign influence of social media and the unfathomable pressures faced by boys in Britain today. Male rage, toxic masculinity, online misogyny. This isn’t just all-too-plausible fiction. It’s unavoidable fact.

As the boy’s father, Eddie, a self-employed plumber in an unspecified Yorkshire town, Graham spends the opening hour shell-shocked. He is inclined to believe his son’s protestations of innocence, as any parent would. That is, until he is poleaxed by chilling footage of the frenzied multiple stabbing.

It might be a masterclass from the best actor working today but Graham leaves room for his castmates to shine. Ashley Walters delivers a career-best turn as lead investigator DI Luke Bascombe. Walters was considering quitting acting and moving behind the camera but Adolescence changed his mind, not least because it resonated personally with a man who, in his own teens, was sentenced to 18 months for gun possession. He has admitted to “crying most nights” while learning the script.

Erin Doherty drops in for a blistering head-to-head as clinical psychologist Briony. Christine Tremarco is heartbreaking in the finale as Jamie’s mother, Manda. And then come the kids. Newcomer Owen Cooper – incredibly, it’s the 15-year-old’s acting debut – is flat-out phenomenal as Jamie. He goes from sympathetic to scary, lost little boy to angry young man, often within the same breath, announcing himself as a major talent in the process. Fatima Bojang is movingly raw as Katie’s bereaved best friend Jade. Amélie Pease excels as Jamie’s elder sister Lisa, whose low-key wisdom becomes the glue holding her fractured family together.

The story is brought to life by telling details. The way that Jamie still has space-themed wallpaper in his bedroom and wets himself when armed police burst in, reminding us of the “gormless little boy” behind the shocking violence. The way the secure training centre where he awaits trial is populated by youngsters with radiator burns who yell at Coronation Street. The way incidental characters – the creepy CCTV guy, the DIY store conspiracy theorist – warn us that adult males can be equally threatening. The way nonsensical graffiti and a nosy neighbour are what finally tip Eddie over the edge. (...)

Adolescence lays bare how an outwardly normal but inwardly self-loathing and susceptible youngster can be radicalised without anyone noticing. His parents recall Jamie coming home from school, heading straight upstairs, slamming his bedroom door and spending hours at his computer. They thought he was safe. They thought they were doing the right thing. It’s a scenario which will ring bells with many parents. Some will be alarm bells.

We take pains to teach them how to cross roads and not talk to strangers. We rarely teach them how to navigate the internet. There is often a glaring gap between parents’ blissfully ignorant image of their children’s lives and the truth of what they get up to online. We think they’re playing Roblox but they’re actually on Reddit. We think they’re doing homework or innocently texting mates. They are watching pornography or, as DS Frank pithily puts it, “that Andrew Tate shite”.

Jamie’s plight becomes a poignant study of the nightmarish influence of the so-called manosphere – that pernicious online world of “red pills”, “truth groups” and the 80-20 rule (which posits that 80% of women are attracted to 20% of men). It’s a shadowy sphere populated by alphas, “incels”, MRAs (men’s rights activists) and PUAs (pickup artists), whose fragile egos turn into entitled fury. From mocking emojis on Instagram to the dark web and deepfakes, it’s another country to anyone over 40. No wonder parents are, as Bascombe’s son points out, “blundering around, not getting it”. (...)

As unanimous five-star reviews attest, Adolescence is the best drama of 2025 so far. We’re less than a quarter of the way through, admittedly, but the rest of the year’s TV will have to go some to beat it. This is old-fashioned, issue-led, socially conscious television – and all the better for it. 

by Michael Hogan, The Guardian |  Read more:
Image: Netflix
[ed. Powerful throughout. See also: Is this the most terrifying TV show of our times? Adolescence, the drama that will horrify all parents (Guardian):]
***
“Steve’s starting point was not wanting to blame the parents,” says Thorne of his collaboration. “It was: ‘Let’s not make this about a kid who commits a crime because of an evil thing going on at home.’”

“I didn’t want his dad to be a violent man,” confirms Graham. “I didn’t want Mum to be a drinker. I didn’t want our young boy to be molested by his uncle Tony. I wanted to remove all of those possibilities for us to go: ‘Oh, that’s why he did it.’”

As a result, Adolescence takes us somewhere even more terrifying. Jamie, the show’s 13-year-old subject, is an outwardly normal, well-adjusted kid. But the conversations around him, at school and online, start to lean towards incels and the manosphere. Slowly, a picture builds about how this regular kid found himself radicalised without anyone even realising. (...)

Still, as heavy as Adolescence is, it also stretches the capacity of what can be achieved with a single take... the scale of Adolescence meant that the camera had to be continually passed from operator to operator, getting clipped in and out of different devices by various teams as necessary.

He takes me through the show’s opening sequence. “When the episode starts, my cinematographer Matt is holding the camera,” he explains. “As we’re filming the actors in the car, the camera’s being attached to a crane. The car drives off, and the crane follows. While this is happening, Matt has gone in another car, driven ahead and jumped out so he can take the camera into the house. When we come back out of the house, the other camera operator Lee is sat in the custody van. Matt would pass Lee the camera, so now Lee’s got the camera while Matt drives ahead to the police station, so he’s ready to take the camera when we go inside.”

Such visual flashiness might suggest that Adolescence is purely a technical experiment, but that couldn’t be further from the case. “I never want the one-take thing to be at the forefront,” says Barantini. “I wanted this to be seamless, but not a spectacle.”

Tuesday, March 11, 2025

Every Studio Ghibli Film, Ranked From Worst to Best

Every Studio Ghibli Film, Ranked From Worst to Best (Wired)
Images: My Neighbor Totoro and Spirited Away (Deviant Art)
[ed. Gotta say, Spirited Away is my favorite.]

Thursday, January 30, 2025

What I Saw at the Streaming Revolution

Back in January 2020, Disney’s and Apple’s subscription platforms were just a few weeks old, Peacock and the Streamer Formerly Known as HBO Max did not yet exist, and there was a ton of mystery surrounding a soon-to-debut streamer that sounded like a joke — and yet somehow wasn’t. Five years on, while Quibi is no more, those four other services are still very much around, as is one other thing: Buffering, which published its very first edition five years ago this month. (...)

Since Buffering is only turning five and not 50, my bosses at Vulture politely passed on my pitch for a primetime special and a series of documentary specials about the early years of this newsletter. That said, they are allowing me to mark this milestone with a special edition focused on five of the biggest developments that have shaped streaming since 2020, what lessons can be taken from them, and some thoughts on what to expect in the years to come.

1. Netflix: Dominant then, dominant now

One of the lead stories in our debut edition revolved around Netflix racking up more Oscar nominations than any other studio or distributor for the first time. This was a huge deal back then, since it signaled the streamer would be able to reshape the film business in much the same way it had already transformed television. Five years later, what’s most remarkable to me is how — despite a few bumpy moments and the emergence of several strong competitors — Netflix still sets the pace in Hollywood. It’s the benchmark against which every other streamer is judged, and its successes (and failures) have resonated through so much of what we’ve covered here in Buffering.

For instance, when now co-CEO Ted Sarandos decided to push out his longtime deputy Cindy Holland in 2020, it was first and foremost a story about Netflix moving away from the premium, critic-friendly fare that marked its early years and toward its current status as the 21st-century equivalent of CBS in its Tiffany era: a mass broadcaster able to churn out everything from Mister Ed and The Beverly Hillbillies to The Twilight Zone and Harvest of Shame. But in retrospect, Holland’s ouster — and Netflix’s pivot — also look like the beginning of the end of streaming’s mini Golden Age, when the industry spent billions not just on content, but on getting the most audacious, star-studded, and not-even-really-TV-anymore programming that money could buy. Netflix pioneered the strategy of luring customers by trying to out-HBO HBO; its pivot to the center pushed most of the rest of the industry to follow.

We saw this pattern play out multiple times over the last five years, even when Netflix technically wasn’t the first to do something. The streamer decided to begin selling commercials a couple months after Disney+ announced it would do so, but it was Netflix’s entry into the space that felt like a sea change for subscription streaming. Ditto the industrywide crackdown on password sharing, or the trend toward ending even successful series after just three or four seasons. And even though Amazon has been airing Thursday Night Football games for a few years now, and Peacock has done playoff games and the Olympics, Netflix’s recent Christmas Day doubleheader still felt like an event. Netflix doesn’t innovate like it once did, but almost anything it does still makes the biggest splash.

Last week’s earnings report from the streamer underscores this point. Netflix said it added another 40 million–plus subscribers in 2024 — 19 million in the last three months of the year alone — and now boasts just over 300 million paid global customers, giving it a reach of more than a half-billion potential viewers. And while its peers are still mostly swimming in red ink or barely eking out tiny profits, Netflix has turned into a veritable ATM: Instead of burning through a few billion dollars in cash every year, as was still happening five years ago, the company is forecasting revenue in excess of $40 billion in 2025. Adding subscribers while posting double-digit profit margins: “This is what winning looks like,” analyst Jeffrey Wlodarczak of Pivotal Research Group wrote last week. This was true when Buffering first launched in 2020, of course, but that’s also the point: Despite the launch of several well-financed competitors, heavy spending from older tech rivals Amazon and Apple, and the usual laws of showbiz gravity, Netflix is still #winning. (And yes, that applies to Oscar nominations. It once again racked up the most noms of any individual studio.)

➼ Over the Next Five Years: Now that Netflix has gone from being seen as the cool future of TV to a generic word for TV, will brand affinity eventually start to suffer — not just among consumers but with the creatives Netflix relies on for programming? Or, as it has in the past, will Netflix continue to prove the doubters wrong?

2. Streaming became more like linear TV rather than the other way around

As the 2020s got underway, there was still a sense that digital, on-demand television was going to be a completely new medium, one very distinct from what we’d seen with traditional TV since the 1950s. Not only were there no channels or time slots, but the biggest streamers didn’t even bother with commercials, and compared to what we’d grown used to paying for cable, it was substantially cheaper. Well, the arc of the small-screen universe apparently isn’t that long, and in the case of streaming, it reverts to the mean.

Disney+’s move to introduce an ad-supported tier (followed quickly by Netflix and Amazon Prime Video) was the most glaring example of this network-ification of the industry, but there were many others. For example, all of the upstart streamers launched over the last five or so years opted not to adopt Netflix’s binge release strategy for most of their new releases, thus preserving the linear tradition of doling out episodes of a show on a weekly basis. Instead of focusing almost entirely on expensive scripted programming, streamers started investing increasingly large portions of their budgets in live sports and events, less expensive reality shows, and true-crime docs. Rather than keeping prices low to attract (and keep) customers, platforms began implementing dramatic increases to their monthly subscription fees — while also cutting back on the number of new shows they green-lit and the size of their libraries of older TV shows and movies. Then, when those price hikes and content reductions started facing pushback from consumers, streamers took a page out of the old cable-TV playbook and began offering consumers discounted rates if they signed up for a bundle of services at the same time.

All of this was probably inevitable once legacy-media giants such as Comcast, Warner Bros. Discovery, and Paramount Global jumped into the streaming pond. These are the companies that shaped the linear-TV business for decades; of course they were going to bring their old habits with them. But that’s not entirely a bad thing, as evidenced by how quickly streamers run by tech companies adopted so many of these ideas. Apple might be the company that once urged us to Think Different, but its Hollywood wing knew that a series like Ted Lasso needed the sort of word-of-mouth buzz that can only be built by launching a show with weekly episodes. Advertising is annoying, especially when you’re already paying for a subscription, and yet cable thrived for decades with exactly that combination of commercials and monthly fees. At least with streaming, there’s still the option to pay more for an ad-free experience and the ease of canceling for a few months if a streamer’s programming slate isn’t meeting your needs.

I get that for many consumers, all of this seems like a case of dumb, greedy TV execs pulling a fast one in order to jack up profits for shareholders. And to be sure, there’s plenty of dumb and no shortage of greed in Hollywood. But the fact is streamers came into the market significantly underpriced relative to how much programming they offered and compared to what cable was (and is) charging. Netflix racked up billions in red ink getting you hooked on its version of streaming nirvana, and the legacy-media companies also went deep into debt trying to compete in the early 2020s — and most are still losing money, or just now starting to turn the tiniest of profits. Those heady days when you could pay under $20 for Netflix and Hulu and get just about every show and movie you’d ever want to see, plus binge-watch the latest season of Breaking Bad or Mad Men a few months after its finale? They were never gonna last, and it’s not because David Zaslav is a Trump-friendly wannabe mogul who seems to delight in annoying as many fandoms as possible. Streaming needed to become more like regular TV because it needed to become profitable, and if there’s one thing network and cable TV were good at, it was making money.

➼ Over the Next Five Years: Will audiences revolt if prices get too high or the volume of commercials on streaming reaches the same level as cable? Or will the seemingly inevitable consolidation of streaming platforms and bundling of services result in a sort of equilibrium where consumers feel like they’re not getting totally robbed?

by Josef Adalian, Buffering/Vulture | Read more:
Image: Vulture; Photos: Everett Collection (Freevee, Ali Goldstein/Netflix), Apple TV+, Netflix
[ed. Revolt. See also: "The Infrastructure of the Recording Industry Is About to Fail” (HB).]