
Wednesday, September 24, 2025

Notes on The Greatest Night In Pop

A Study In Leadership, Teamwork, and Love

The Netflix documentary, The Greatest Night In Pop, tells the story of the making of We Are The World, the 1985 charity single featuring (almost) everyone in American pop at the time: Michael Jackson, Lionel Richie, Stevie Wonder, Bruce Springsteen, Bob Dylan, Diana Ross, Cyndi Lauper, Tina Turner, Billy Joel, Dionne Warwick…the list goes on and on.

The documentary is based on hours of footage from the night they recorded the single, only a few minutes of which were used for the original music video. The Greatest Night In Pop (TGNIP) came out eighteen months ago, and while millions of people have viewed it, I’m constantly surprised to learn that many have not. Everyone should.


If I had to recommend a documentary or just ‘something to watch on TV’ for absolutely anyone - man or woman, old or young, liberal or conservative, highbrow or lowbrow - I’d recommend The Greatest Night In Pop. It may not be the deepest, most profound ninety minutes of TV, but it is irresistibly enjoyable. And actually, like the best pop, it is deep; it just doesn’t pretend to be.

To us Brits, We Are The World was a mere footnote to Do They Know It’s Christmas? That record was instigated by Bob Geldof and Midge Ure and recorded by a supergroup of British and Irish musicians under the name Band Aid (an underrated pun). The proceeds went to famine victims in Ethiopia. We Are The World was made for the same cause.

I knew that, but what I learnt from TGNIP is that there was an element of racial pride in the American response, which arose spontaneously from a conversation between Harry Belafonte and Lionel Richie’s manager. Belafonte said, “We have white folks saving Black folks—we don’t have Black folks saving Black folks”. Lionel agreed, and the wheels started to turn.

Ah, Lionel. The man who makes everything happen. It is perhaps not coincidental that he should emerge as the star of this documentary, given that he co-produced it. The same might be said of Paul McCartney, who emerged as the hero of Get Back. But in neither case do I sense corruption of historical truth. Richie is extraordinary, both as a talking head and in his 1985 incarnation. As chief interviewee - host might be a better word - he sparkles: mischievous, funny, a supreme storyteller. As the prime mover behind the recording of We Are The World, he is simply awesome.

After Belafonte’s prompt, Lionel calls Quincy Jones - the maestro, the master, the producer of the best-selling album of all time. Jones immediately says yes, and they call Michael Jackson and Stevie Wonder. Both agree, Stevie only belatedly because nobody can get hold of him (a theme of the doc is that Stevie Wonder is both delightful and utterly ungovernable). With these stars on board, they know that pretty much everyone else will want to be involved, and so it proves. They decide to recruit white stars as well as black, and get Springsteen and Joel and Kenny Rogers and Willie Nelson and others.

The first thing the principals need to do is come up with a song. Lionel goes to Michael’s house, and the pair spend several days hacking away at the piano (Stevie was invited but is AWOL). With a couple of days to go, they crack it (that is, Quincy likes it). The song is, well, fine: not a work of genius but a pleasant, gospel-inflected anthem, easy enough to sing without much preparation, catchy enough to be a hit. It does the job.

When he took up this baton, Richie was on a career high. He’d left The Commodores and broken out as a solo star. He was about to host the American Music Awards in Los Angeles, the biggest primetime music show, for which he himself was nominated for eight awards (he won six). It soon becomes apparent to all concerned that the best and perhaps only way to get all the talent in the same place to record a single would be to do it on the night of the awards, when so many of them are in town anyway. That would mean doing an all-night session, and for Lionel, it would mean first hosting a live awards show watched by millions - demanding, stressful, exhausting - then helping to run this second, private show right afterwards. No problem!

So it is that after the AMAs we see limousines dropping stars off at an LA recording studio. From a narrative point of view, a delicious premise emerges: a bunch of very famous, egotistical, impatient, nervous pop stars, most of whom don’t know each other (“It was like the first day of kindergarten”, recalls Richie) are brought together in a room to make a record of a song they barely know (they’ve heard a demo). It absolutely has to be a huge hit. They have about eight hours; there’s no coming back tomorrow.

It could have gone badly wrong. That it didn’t is testament to all involved but to Richie and Jones in particular. The two of them corral this unwieldy gaggle into making a sleek and successful product. The first time I watched TGNIP I enjoyed it unreflectively. When I watched it for a second time, I began to see it as a study in leadership, collaboration and teamwork.

I’ve written before about how diversity needs to be interpreted beyond demographic attributes like race and gender to temperament and personality. The British management researcher Meredith Belbin constructed a famous inventory of behavioural types which together make up a successful team: the Resource Investigator, the Coordinator, the Shaper, the Plant, and so on.

TGNIP prompted me to come up with an inventory of my own: the Decider, the Connector, the Conscience, the Old Buck, the Disrupter, the Weirdo, and the Lover.

THE DECIDER

Quincy Jones taped up a handwritten sign at the entrance to the studio: LEAVE YOUR EGO AT THE DOOR. He was possibly the only person in America who would have dared to write such a sign for such a crowd and certainly the only one who would have been listened to.

To lead a team of 40 superstars was a tough task, but it certainly helped to be Quincy Jones. Aged 51, he had been an arranger for Duke Ellington and Frank Sinatra; produced Donna Summer and Aretha Franklin; won multiple Grammys; turned Michael Jackson into the biggest artist in the world.

In TGNIP he is somewhat marginal to the action just because he is in the control room, while the camera roves the studio floor. We hear his voice over the intercom and see him when he comes onto the floor to coach someone through a difficult vocal part. (He wasn’t interviewed for the doc but we hear him speaking about the night from an earlier interview).

There’s no question he is in charge, though. His interventions are economical and precise; he doesn’t waste words. He is stern when he needs to be, jocular in a restrained way; cool. Everyone in the room looks up to him, literally and metaphorically. He is friendly but not your best friend. He is here to make sure the job gets done, and done well. He is The Decider.

THE CONNECTOR


By contrast, Lionel Richie is very much your best friend. He is everywhere, talking to everyone: greeting, thanking, hugging; answering a thousand queries; soothing egos; telling stories and making jokes; giving pep talks; smoothing over potential conflicts; solving musical problems; hyping and cheerleading; raising the energy level when it flags; consoling the weary. Somebody else says of him, “He’s making the water flow.” That’s it.

Richie has a special knack for wrangling very talented, slightly nuts individuals. Cyndi Lauper, who was a massive star at that time, bigger than Madonna, decided on the evening of the recording that she wasn’t going to do it after all. The reason she gave was that her boyfriend didn’t like the demo of the song that Richie and Jackson had made. He’d told her it would never be a hit.

Lionel has to take a minute backstage at the awards ceremony which he is presenting to find Lauper, put any hurt feelings he might have aside, and cajole her into returning to the team. Later on, he’s the one negotiating with Prince over his possible participation over the phone. He also has to hide wine bottles from Al Jarreau so that he doesn’t get too drunk before recording his solo part. Details.

by Ian Leslie, The Ruffian |  Read more:
Image: Netflix
[ed. Highly recommended.]

Tuesday, September 23, 2025

Is Mid-20th Century American Culture Getting Erased?

A few days ago, The Atlantic published an article on esteemed author John Cheever (1912-1982). But the magazine is almost apologetic, and feels compelled to admit the “final indignity” suffered by this troubled author—“less than 30 years after his death, even his best books were no longer selling.”

What a comedown for a writer who, during his lifetime, was a superstar contributor to The New Yorker, and got all the awards. Those included the Pulitzer Prize, the National Book Critics Circle Award, the National Book Award, and the National Medal for Literature.


But that’s not enough to keep any of his books in the top 25,000 sellers at Amazon. Try suggesting any of Cheever’s prize-winning works to your local reading group, and count the blank stares around the room.

And it’s not just Cheever. Not long ago, any short list of great American novelists would include obvious names such as John Updike, Saul Bellow, and Ralph Ellison. But nowadays I don’t hear anybody say they are reading their books.

And they are brilliant books. But reading Updike today would be an act of rebellion. Or perhaps indulging in nostalgia for a lost era.

The list goes on—Joseph Heller, Bernard Malamud, Carson McCullers, Robert Penn Warren, Katherine Anne Porter, James Agee, etc. Do they exist for readers under the age of forty?

Their era—mid-20th-century America—really is disappearing, at least in terms of culture and criticism. Anything from the 1950s is like an alien from another planet. It simply doesn’t speak to us, or maybe isn’t given a chance.

And what about music?

The New York Times recently noticed that mid-century American operas never get performed by the Met. It’s almost as if the 1940s and 1950s don’t exist at Lincoln Center. (...)

But I see the exact same thing in jazz. Most jazz fans want to listen to music recorded after the emergence of high fidelity sound in the late 1950s. So they are very familiar with Kind of Blue (1959) and what happened after, but know next to nothing about jazz of earlier periods.

If I were making a list of the greatest American contributions to music, my top ten would include Duke Ellington’s music from the early 1940s and Charlie Parker’s recordings from the mid-1940s. But even jazz radio stations refuse to play those works nowadays. So what hope is there that these musical milestones will retain a place in the public’s cultural memory?

Jazz musicians who died in the mid-1950s, such as Art Tatum, Charlie Parker, and Clifford Brown, should rank among the great musicians of the century, but somehow fall through the cracks. Maybe if they had lived a few more years, they would have gotten their deserved acclaim. But the same fans who love Monk, Miles, Ornette, and Trane often have zero knowledge of these earlier figures.

Now let’s consider cinema from the 1940s and 1950s. It doesn’t exist on Netflix.

You might say that Netflix has eliminated the entire history of cinema from its platform. But it especially hates Hollywood black-and-white films from those postwar glory years.


Citizen Kane is the greatest American film of all time, according to the American Film Institute. But when I try to find it on Netflix, the algorithm tells me to watch a movie about McDonald’s hamburgers instead.

The second best American film of all time is Casablanca, according to the AFI. When I tried to find it on Netflix, the algorithm offered me an animated film from 2020 as a substitute.

The sad reality is that the entire work of great filmmakers and movie stars has disappeared from the dominant platform. It wouldn’t cost Netflix much to offer a representative sample of historic films from the past, but they can’t be bothered. (...)

Not all of these works deserve lasting acclaim. Some of the tropes and attitudes are outdated. Avant-garde obsessions of the era often feel arbitrary or constraining when viewed from a later perspective. Censorship prevented artists from pursuing a more stringent realism in their works.

But those reasons don’t really justify the wholesale erasure of an extraordinary era of American creativity.

What’s happening? Why aren’t these works surviving?

The larger truth is that the Internet creates the illusion that all culture is taking place right now. Actual history disappears in the eternal present of the web.
  • Everything on YouTube is happening right now!
  • Everything on Netflix is happening right now!
  • Everything on Spotify is happening right now!
Of course, this is an illusion. Just compare these platforms with libraries and archives and other repositories of history. The contrast is extreme.

When you walk into a library, you understand immediately that it took centuries to create all these books. The same is true of the Louvre and other great art museums. A visit to an Ivy League campus conveys the same intense feeling, if only via the architecture.

You feel the weight of the past. We are building on a foundation created by previous generations—and with a responsibility to future ones.

The web has cultivated an impatience with that weight of the past. You might even say that it conveys a hatred of the past.

And the past is hated all the more because history is outside of our control. When we scream at history, it’s not listening. We can’t get it cancelled. We can’t get it de-platformed. The best we can do is attach warning labels or (the preferred response today) pretend it doesn’t exist at all.

That’s how Netflix erases Citizen Kane and Casablanca. It can’t deny the greatness of these films. It can’t remove their artistry, even by the smallest iota.

But it can act as if they never happened.

This is especially damaging to works from the 1940s and 1950s. These are still remembered—but only by a few people, who will soon die.

This is the moment when works from 80 years ago should pass from contemporary memory and get enshrined in history. But that won’t happen in an age that hates history and wants to live in the eternal present. (...)

But that eternal present is a lie, an illusion, a fabrication of the digital interfaces. And this not only destroys our sense of the past but also undermines our ability to think about the future.

In an environment without past or future, all we have is stasis.

So it’s no coincidence that culture has stagnated in this eternal digital now. The same brand franchises get reheated over and over. The same song styles get repeated ad nauseam. The same clichés get served up, again and again.

by Ted Gioia, Honest Broker |  Read more:
Image: Bettmann/Getty/reddit

Saturday, September 20, 2025

The Way They Were

In 1986, my most prized possession was a little pink phone message slip written by a hotel clerk.

“Miss Dowd,” it read, “Robert Redford called. He’s at the same number as last night.”

I’d never met Redford, but that piece of paper was a magic portal to all kinds of pink-cloud fantasies. I stuck it up on my cubicle in the Washington bureau of The Times and gazed at it whenever I needed a lift.

Then, one night, the bureau chief went on a crazed cleaning campaign and sent a crew in to throw out every stray piece of paper around our desks.

I came in the next morning and my beloved message was gone.

I had called Redford to interview him for a Times Magazine profile on Paul Newman. Often, movie stars won’t talk about other movie stars (it’s not about them!); Joanne Woodward wouldn’t even talk to me about her husband for that piece.

But Redford was happy to talk about his pal. When I heard that famous voice on the phone, I said: “Wait a minute, let me get a pen and pencil. I mean, a pen and pen. No, a pen and paper.”

He just laughed, accustomed to women getting flustered.

I heard from someone on his team about seven years later. Redford wanted to offer me a role in a movie he was directing called “Quiz Show.” It was just one line — “Excuse me, are you the son?” — uttered by a woman who’s at a book party trying to chat up Ralph Fiennes’s Charles Van Doren, the fraudulent quiz whiz and son of the renowned Shakespearean scholar Mark Van Doren.

I wrote Redford a note, explaining that I was too shy to act in a glossy movie. I couldn’t even muster the nerve to do TV as myself.

He sent a handwritten letter back, telling me that being shy was not a good excuse and that he was shy and you had to push past that and take risks. It was a charming letter — and I vowed to take his advice in the future.

Years later, I got to know Redford over friendly lunches and dinners and interviews for The Times and at Harvard’s Kennedy School. And that rarest of things happened: He was everything you hoped he would be. I had the same experience when I spent that week interviewing Newman.

Both men were elusive, private, funny, generous and self-deprecating. They both liked painting and writing poetry. (Newman’s poetry — and humor — was goofier.) And they both struggled with the sex symbol role.

“To work as hard as I’ve worked to accomplish anything and then have some yo-yo come up and say, ‘Take off those dark glasses and let’s have a look at those blue eyes’ is really discouraging,” Newman told me, adding: “Usually, I just say, ‘I would take off my sunglasses, madam, but my pants would fall down.’” What if his eyes turned brown, he wondered ruefully, and he died a failure?

Redford chafed at the chatter about his blond locks. At first, he told me, it felt great when he became a top Hollywood hunk with “Butch Cassidy” and “The Way We Were.” But then the constant references to his looks and some “out of whack” fan run-ins made it “exhausting.” He felt like he was being put in a cage and wanted to protest, “No, I’m an actor.”

When I talked to him for his solitary and horrific sailboat yarn, “All Is Lost,” in 2013, about aging onscreen and whether it became harder to do close-ups, he replied: “Well, let’s get something straight. I don’t see myself as beautiful. I was a kid who was freckle-faced, and they used to call me ‘hay head.’”

When Redford got kicked out of college in Colorado and lost his baseball scholarship for carousing too much, he went to be an underfed bohemian in Europe, trying his hand at painting. He wore a beret and stripy T-shirt but failed to impress French girls, who thought he was too ignorant about politics.

While being gorgeous can propel your career — can we agree that Newman and Redford were the most charismatic screen couple ever? — there is also a penalty. It’s as though you can’t have too much. Many in Hollywood were slow to realize what wonderful actors the two men were. Despite a string of indelible performances, Newman did not win a best actor Oscar until 1987, for “The Color of Money.” And Redford, an iconic American star of the sort that no longer exists, never won an Oscar for acting.

They both kept Hollywood at arm’s length, disdaining the superficiality, which didn’t endear them to Tinseltown. Newman lived on the East Coast and Redford conjured Sundance, creating a film lab and festival that transformed the movie industry and produced many great talents. (He was appalled when it got so popular that Paris Hilton showed up.)

The two friends with the raffish all-American smiles and sporting lives radiated cool and glamour, as though — to paraphrase “The Way We Were” — things came too easily to them.

But their self-images were different. Newman, the son of a Cleveland sporting goods store owner, said he thought of himself as a terrier with a bone, always working to make his acting more distilled. Redford, who grew up feeling economically insecure and suffered a bout of polio when he was 11, told me he thought of himself as climbing the hill, Sisyphus-style, never “standing at the top.” He quoted a favorite T.S. Eliot line: “There is only the trying. The rest is not our business.”

Both men could be uncomfortable in their skins, filled with self-doubt, haunted by family traumas. Newman lost a son and Redford lost two.

And yet, over several decades, they helped define American culture with their riveting portrayals of morally ambiguous characters.

“I was not interested in the red, white and blue part of America,” Redford told NPR’s Terry Gross. “I was interested in the gray part where complexity lies.”

by Maureen Dowd, NY Times | Read more:
Image: Robert Redford and Paul Newman in 1969’s “Butch Cassidy and the Sundance Kid.” Screen Archives/Getty Images

Thursday, September 18, 2025

The Uggo Police

The life of Marilyn Monroe yields a few lessons for those who would follow in her footsteps. One, don’t marry a playwright. Two, get paid. No current-day actress has taken this second lesson to heart like Sydney Sweeney, whose tousled good looks are practically designed to make people underestimate her. Sweeney understands that being an object of sexual fantasy involves a hefty dose of contempt—and says, If that’s the game, I’m going to make some money off of me, too. She’s under no illusions that if her career is left to others, she’ll be cast in parts she finds interesting. So if she sees a script she likes, she funds it herself. To get money, she sells stuff: bath soap that supposedly contains her bathwater, jeans, ice cream.

And if these products are advertised in ways that are a little tasteless, or a little offensive, that means that people will talk about the ads, and that talk means sales, and those sales mean, in the end, more checks for Sweeney. Asking whether or not Sweeney knew that a jeans ad campaign with the tagline “Sydney Sweeney has great jeans” would activate the very weird and very horny portion of the Internet that has made her into a symbol of anti-wokeness misses the point. She would have done it either way. That is, I imagine that Sweeney regards her crew of weird, horny right-wing fans the same way she probably regards any group of fans: as wallets.

As for me, personally? I like Sydney Sweeney, in a vague way that doesn’t mean I have any interest in her movies. I just have a lot of respect for actors who don’t ever say no to a check (see, Orson Welles). The other side of libidinal contempt is feel-good pity, but there’s nothing pitiable about Sweeney either. Some girls are born connected, some girls are born pretty, and some girls are born smart. Two out of three isn’t so bad. But her cultists are another story. Aside from the obvious—adopting Sydney Sweeney as a cause allows them to post pictures of her in underwear with plausible deniability—what’s going on there?

The “Ballad of Sydney Sweeney” goes like this: “They” wanted to exterminate beautiful busty blondes. “They” put ugly people in ads (sometimes). Now, however, here comes Sydney Sweeney, ending wokeness once and for all. The implication is that at some point in the past ten years, it’s been disadvantageous to be a curvaceous babe. The only sense in which that is true has not changed: Sweeney keeps showing up in ads in bras that don’t fit. But never mind that; thanks to Sweeney, it is now legal to be hot. The hot people have come out from the places where they’d been driven into hiding by the uggo police. Now they frolic freely in the sun. Very touching.

Meanwhile, the anti-Sweeney in this drama is Taylor Swift. Swift and Sweeney have been pitted against each other by spectators, including Donald Trump: Swift, who represents woke, is no longer hot; Sweeney, anti-woke, is hot. (Out with the old blonde, in with the new.) Like so many statements about both Taylor Swift and Sydney Sweeney, or, for that matter, by Trump, this one has no tether to reality, but it’s how a certain type of person wants things to be. There’s a level of personal betrayal at play here. Swift, who stays out of trouble, avoids politics, doesn’t do drugs, rarely seems out of control, and sings about love, was the crypto-conservative icon of an earlier era. Eventually, it turned out that she was not one of them. Their Brünnhilde was within another ring of fire. Now all their hopes are pinned on Sweeney.

Does something about this scenario feel a little off to you? Not to sound like I’ve woken up from a coma, in which I have languished since 1992 after hearing Dan Quayle rail against Murphy Brown, but when exactly did making cleavage great again become a conservative cause? Somebody with the combined memory powers of (let’s say) three goldfish can easily imagine an alternate present in which Sweeney and her cleavage were an object of outraged conservative disdain. In this other world, Sweeney is attracting rage-filled press over her horror movie in which (I’m told) she plays a nun who bashes a baby to death. But in this world, these people don’t even get to do that. All rage provides is free marketing.

The people who are slavering over Sweeney will cheerfully confess to motivations that are gross enough. They like her because she’s white, busty, blonde, thin, and blue-eyed, but it seems like the white part might be the most important trait [ed. don't think so.]. To them, Sweeney represents things being right with the world; she’s the hot cheerleader to their collective star quarterback. (Among her many crimes, Taylor Swift’s engagement to a woke-for-football fellow, whose name I can’t recall, surely ranks pretty high on the list.) She’s the human embodiment of A.I.-generated pictures of beautiful white families, on a farm, reading the Bible, captioned, This is what they took from you!

Intriguingly little of this fandom has anything to do with Sydney Sweeney, the actual person, her professional life, or her public statements. When Doreen St. Félix, a writer for the New Yorker, had the temerity to call the American Eagle ad (and Sweeney, by implication) “banal,” the immediate reaction was to try to get her fired by digging up tweets she had written more than ten years ago and accusing her of racism against white people. One wonders whether what really set them off was St. Félix’s pointing out that Sweeney dyes her hair blonde: “Her blondness, like a lot of adult blondness, is a chemical thing masquerading as natural only to those most gullible in the population, straight men, who don’t know, and don’t care to understand, how much of so-called natural female beauty is constructed.” As both St. Félix’s piece and the subsequent backlash illustrated, the idea that Sydney Sweeney might be marketing herself undoes the illusion of the naturally beautiful girl who attracts attention and fame for doing nothing. Her fans miss all the things Sweeney herself clearly is—a smart businesswoman and an ambitious artist—because in her advertisements they see only a sleepy-looking fantasy object. Do any of these people even know that Sweeney makes movies? It’s an open question. (...)

So these people are deprived not only of the chance to ogle but of control. Neither their approval nor their disapproval can move the needle. The only thing that can is conjuring up the idea of a phantom lib, outraged and disapproving, and hoping some real people will come along to play the part. This type of resentment politics is the only card they really have: Look at how they despise you; make them mad, drink their tears! There’s always a professor somewhere who has said something inflammatory and stupid to back up this assertion.

But who cares? Really. Who cares? At last, to own the libs, we can admit McDonald’s tastes good, have fun at the movies, and post pictures of beautiful women in advertisements. But we already could do all of those things. It’s just that McDonald’s is junk, the movies are junk, and those advertisements exist to sell us junk. (...)

It might sound paradoxical to say that Sweeney’s worst fans adore her because they hate women, but it’s true. (Also, they don’t adore her.) There is always a young blonde to attach yourself to, and an older blonde to throw away. As long as Sweeney does nothing to alienate them, they will continue to hype her up; if one day she endorses a politician they don’t like, then it will be time to start talking about how she’s washed (or whatever slang has replaced “washed” by then). What they really want, besides the Fourth Reich, is a world in which women are either objects or invisible, disposable or essentially private.

by B.D. McClay, The Lamp |  Read more:
Image: American Eagle
[ed. Still high on winning the 'War on Christmas'. Also, have nothing against breasts.]

Wednesday, September 17, 2025

Buddies

Redford and Newman: A Screen Partnership That Defined an Era (NYT)
Image: Twentieth Century Fox Film Corporation/Sunset Boulevard/Corbis, via Getty Images
[ed. Time marches on, and friendships... what you make of them. See also: Robert Redford and the Perils of Perfection (New Yorker).]

Thursday, September 11, 2025

Withnail and I


Vivian MacKerrell

Image: Sotheby's

[ed. Never seen the movie, but this arresting photo caught my attention. Who is this guy? See also: Disdain, decay and a half-dead eel: Withnail and I.]
***
"This is an age of rackety behaviour. Withnail is a story about rackety behaviour. More than that, it is about decay and disdain for the authorities that contrive to make us miserable. And who can say they haven’t felt the misery of life now? (...) Withnail taught me many things. I might not have understood the film when I first saw it. But the sense of freedom, even if ill-conceived, spat at me like water from a fatted pan. These were my people. I recognised the nihilism, the attraction of necking booze from the bottle at lunch, and the hard, unspoken words of love."

Tuesday, July 22, 2025

We Are Winning!

Something has changed in the last few days.

In recent months, we’ve been bombarded with millions of lousy AI songs, idiotic AI videos, and clumsy AI images. Error-filled AI texts are everywhere—from your workplace memos to the books sold on Amazon.com. (...)

All Fake

But something has changed in the last few days.

The garbage hasn’t disappeared. It’s still everywhere, stinking up the joint.

But people are disgusted, and finally pushing back. And they are doing so with such fervor that even the biggest AI companies are now getting nervous and pulling back.

Just consider this surprising headline:


This was stunning news. YouTube is part of the world’s largest AI slop promoter—namely the Google/Alphabet empire. How can they possibly abandon AI garbage? Their bosses are the biggest slopmasters of them all.

After this shocking news reverberated through the creative economy, YouTube started to backtrack. They said that they would not punish every AI video—some can still be monetized.

But even the revised guidelines are still a major blow to AI slop purveyors. YouTube made clear that “creators are required to disclose when their realistic content is altered or synthetic.” That’s a huge win—we finally have a requirement for disclosure, and it came straight from the dark planet Alphabet. [ed. whose motto used to be don't be evil]

YouTube also stressed that it opposes “content that is mass-produced or repetitive, which is content viewers often consider spam.” This is just a step away from blocking slop. 

What happened?

Maybe the folks at YouTube are just as disgusted by AI as the rest of us. Or maybe we have shamed them into taking action.

My view is that YouTube is (finally) reading the room. I’ve noted before that YouTube is the only part of the Google empire that actually understands creators and audiences. And (unlike their corporate overseers) they have figured out that AI slop is an embarrassment that will tarnish their brand.

The widespread mockery of the fake AI band Velvet Sundown might have been the turning point. This blew up in the last few days, and left AI promoters reeling.

Velvet Sundown is a non-existent AI band that got a million plays on Spotify. These deceptions have occurred in the past, but something different happened this time.

Music fans started mocking Spotify and its alleged promotion of a stupid slop band. The company was subjected to a level of ridicule and angry denunciation it has never endured before.

Journalists called this out as a hoax or fraud. And many speculated about Spotify’s role in the charade. After all, the company has been caught promoting AI slop in the past.

But this time Spotify got turned into a joke—or even worse. They were linked to a scam so clumsy that everyone was now making fun of them, as well as scrutinizing their policies and practices.

Rick Beato’s response to Velvet Sundown got two million views—so more people were watching takedowns of the band than listening to it. An industry group even demanded disclaimers and regulation.

And the jokes kept coming. People mocked the slop with more slop.


That must be painful to endure, even for the billionaire CEO of a streaming platform.

Whatever the reason, Spotify started to buckle. It actually began imposing restrictions on AI.

“Spotify has now pulled several uploads from the AI act and the associated Velvet Sundown,” reported Digital Music News on July 14.

It felt like the tide was now turning in the war against slop AI music.

Dylan Smith, one of the best sources on this subject, clearly thinks so. “Velvet Sundown’s Spotify pulldown,” he writes, “doesn’t exactly bode well for forthcoming AI releases.”

I’m focused here on AI’s destructive impact on culture, but there are other signs that growing AI resistance is now forcing companies to reconsider their bot mania.

“An IBM survey of 2,000 chief executives found three out of four AI projects failed to show a return on investment, a remarkably high failure rate,” reports Andrew Orlowski. “AI agents fail to complete the job successfully about 65 to 70 percent of the time, says a study by Carnegie Mellon University and Salesforce.”

He also shared the results of a devastating test that debunked AI’s status in its favorite field, namely writing code. This study reveals that software developers think they are operating 20% faster with AI, but they’re actually running 19% slower.

Some companies are bringing back human workers because AI can’t deliver positive results. Even AI researchers are now expressing skepticism. And only 30% of AI project leaders can say that their CEOs are happy with AI results.

This is called failure. There’s no other name for it.

And it will get worse. The Gartner Group is now predicting that 40% of AI agent programs will be cancelled before 2027—due to “rising costs, unclear business value and inadequate risk controls.”

by Ted Gioia, The Honest Broker |  Read more: 
Images: Bridge Chronicle/YouTube
[ed. I'd say temporary setback. The AI industry will eventually figure something out; they've got too much money and too many tech beavers involved not to. The product will get better, legislators will be lushly rewarded for IP protection and distribution, some hit movie/song will get made entirely by AI, some important (maybe unusual) event will occur and eventually be traced to it, etc. A million things could happen. So calling this winning seems a little premature. More likely we'll just get used to it over time (like advertising), with authenticity mostly a certification issue (if anyone cares; you have to wonder, given taste these days). See also: I'm Sorry... This New Artist Completely Sucks, i.e. how to create a fake song of your own (with just two sentences) (Beato)]

Sunday, May 25, 2025

On Life in the Shadow of the Boomers

Ideology, which was once the road to action, has become a dead end.
—Daniel Bell (1960)

Yuval Levin’s 2016 book The Fractured Republic: Renewing America’s Social Contract in the Age of Individualism has several interesting passages, but none so interesting as Levin’s meditation on the generational frame that clouds the modern mind. Levin maintains that 21st century Americans largely understand the last decades of the 20th century, and the first decades of the 21st, through the eyes of the Boomers. Many of the associations we have with various decades (say, the fifties with innocence and social conformity, or the sixties with explosive youthful energy), says Levin, had more to do with the life-stage in which Boomers experienced these decades than anything objective about the decades themselves:
Because they were born into a postwar economic expansion, they have been an exceptionally middle-class generation, targeted as consumers from birth. Producers and advertisers have flattered this generation for decades in an effort to shape their tastes and win their dollars. And the boomers’ economic power has only increased with time as they have grown older and wealthier. Today, baby boomers possess about half the consumer purchasing power of the American economy, and roughly three-quarters of all personal financial assets, although they are only about one-quarter of the population. All of this has also made the baby boomers an unusually self-aware generation. Bombarded from childhood with cultural messages about the promise and potential of their own cohort, they have conceived of themselves as a coherent group to a greater degree than any generation of Americans before them.

Since the middle of the twentieth century they have not only shaped the course of American life through their preferences and choices but also defined the nation’s self-understanding. Indeed, the baby boomers now utterly dominate our understanding of America’s postwar history, and in a very peculiar way. To see how, let us consider an average baby boomer: an American born in, say, 1950, who has spent his life comfortably in the broad middle class. This person experienced the 1950s as a child, and so remembers that era, through those innocent eyes, as a simple time of stability and wholesome values in which all things seemed possible.

By the mid-1960s, he was a teenager, and he recalls that time through a lens of youthful rebellion and growing cultural awareness—a period of idealism and promise. The music was great, the future was bright, but there were also great problems to tackle in the world, and he had the confidence of a teenager that his generation could do it right. In the 1970s, as a twenty-something entering the workforce and the adult world, he found that confidence shaken. Youthful idealism gave way to some cynicism about the potential for change, recreational drugs served more for distraction than inspiration, everything was unsettled, and the future seemed ominous and ambiguous. His recollection of that decade is drenched in cold sweat.

In the 1980s, in his thirties, he was settling down. His work likely fell into a manageable groove, he was building a family, and concerns about car loans, dentist bills, and the mortgage largely replaced an ambition to transform the world. This was the time when he first began to understand his parents, and he started to value stability, low taxes, and low crime. He looks back on that era as the onset of real adulthood. By the 1990s, in his forties, he was comfortable and confident, building wealth and stability. He worried that his kids were slackers and that the culture was corrupting them, and he began to be concerned about his own health and fitness as fifty approached. But on the whole, our baby boomer enjoyed his forties—it was finally his generation’s chance to be in charge, and it looked to be working out.

As the twenty-first century dawned, our boomer turned fifty. He was still at the peak of his powers (and earnings), but he gradually began to peer over the hill toward old age. He started the decade with great confidence, but found it ultimately to be filled with unexpected dangers and unfamiliar forces. The world was becoming less and less his own, and it was hard to avoid the conclusion that he might be past his prime. He turned sixty-five in the middle of this decade, and in the midst of uncertainty and instability. Health and retirement now became prime concerns for him. The culture started to seem a little bewildering, and the economy seemed awfully insecure. He was not without hope. Indeed, in some respects, his outlook on the future has been improving a little as he contemplates retirement. He doesn’t exactly admire his children (that so-called “Generation X”), but they have exceeded his expectations, and his grandchildren (the youngest Millennials and those younger still) seem genuinely promising and special. As he contemplates their future, he does worry that they will be denied the extraordinary blend of circumstances that defined the world of his youth.

The economy, politics, and the culture just don’t work the way they used to, and frankly, it is difficult for him to imagine America two or three decades from now. He rebelled against the world he knew as a young man, but now it stands revealed to him as a paradise lost. How can it be regained? This portrait of changing attitudes is, of course, stylized for effect. But it offers the broad contours of how people tend to look at their world in different stages of life, and it shows how Americans (and, crucially, not just the boomers) tend to understand each of the past seven decades of our national life. This is no coincidence. We see our recent history through the boomers’ eyes. Were the 1950s really simple and wholesome? Were the 1960s really idealistic and rebellious? Were the 1970s aimless and anxious? Did we find our footing in the 1980s? Become comfortable and confident in the 1990s? Or more fearful and disoriented over the past decade and a half? As we shall see in the coming chapters, the answer in each case is not simply yes or no. But it is hard to deny that we all frequently view the postwar era in this way—through the lens of the boomer experience.

The boomers’ self-image casts a giant shadow over our politics, and it means we are inclined to look backward to find our prime. More liberal-leaning boomers miss the idealism of the flower of their youth, while more conservative ones, as might be expected, are more inclined to miss the stability and confidence of early middle age—so the Left yearns for the 1960s and the Right for the 1980s. But both are telling the same story: a boomer’s story of the America they have known. The trouble is that it is not only the boomers themselves who think this way about America, but all of us, especially in politics. We really have almost no self-understanding of our country in the years since World War II that is not in some fundamental way a baby-boomer narrative. [1]
When I first read this passage in 2018 I experienced it as a sort of revelation that suddenly unlocked many mysteries then turning in my mind.

To start with: The 1950s did not seem like an age of innocent idyll or bland conformity to the adults who lived through it. It was a decade when intellectual life was still attempting to come to terms with the horrors of World War II and the Holocaust. Consider a few famous book titles: Orwell’s 1984 (published 1949), Hersey’s The Wall (1950), Arendt’s The Origins of Totalitarianism (1951), Chambers’ Witness (1952), Miller’s The Crucible (1953), Bradbury’s Fahrenheit 451 (1953), Golding’s Lord of the Flies (1954), Pasternak’s Doctor Zhivago (1957), and Shirer’s Rise and Fall of the Third Reich (1960) were all intensely preoccupied with the weaknesses of liberalism and the allure of totalitarian solutions. For every optimistic summons to Tomorrowland, there was a Lionel Trilling, Reinhold Niebuhr, or Richard Hofstadter ready to declare Zion forever out of reach, hamstrung by the irony and tragedy of the American condition. Nor was it the wholesome era of memory. An age we associate with childlike obedience saw its children as anything but obedient—witness the anxiety of the age in films like The Wild One (1953), Rebel Without a Cause (1955), and Blackboard Jungle (1955). This age of innocence saw the inaugural issue of Playboy, the books Lolita (1955) and Peyton Place (1956) hitting the New York Times Fiction best seller list, the Kinsey reports topping the Non-fiction best seller list, and Little Richard inaugurating rock ‘n’ roll with the lyrics:
Good Golly Miss Molly, sure like to ball
When you’re rocking and rolling
Can’t hear your mama call.
And that is all without considering a lost war in Korea, the tension of the larger Cold War, and the tumult of the Civil Rights revolution. We may think of the 1950s as an age of conformity, purity, and stability, but those who lived through it as adults experienced it as an age of fragmentation, permissiveness, and shattered innocence.[2]

Levin explains why our perception of the era differs so much from the perceptions of the adults who lived through it. We see it as an age of innocence because we see it through the eyes of the Boomers, who experienced this age as children. But his account also helps explain something else—that odd feeling I have whenever I watch YouTube clips of a show like What’s My Line. Though products of American pop culture, those shows seem like relics from an alien world, an antique past more different in manners and morals from the America of 2020 than many foreign lands today. However, this eerie feeling of an alien world does not descend upon me when I see a television show from the 1970s. The past may be a different country, but the border line is not crossed until we hit 1965.

This observation is not mine alone. In his new book, The Decadent Society: How We Became Victims of Our Own Success, Ross Douthat describes it as a more general feeling, a feeling expressed in many corners on the 30th anniversary of the 1985 blockbuster Back to the Future. The plot of that film revolves around a contemporary teenager whisked back via time machine to the high school of his parents, 30 years earlier. When the film’s anniversary hit in 2015, many commented that the same plot could not work today. The 1980s simply seemed far too similar to the 2010s for the juxtaposition to entertain. Douthat explains why this might be so:
A small case study: in the original Back to the Future, Marty McFly invaded his father’s sleep dressed as “Darth Vader from the planet Vulcan.” The joke was that the pop culture of the 1960s and 1970s could be passed off as a genuine alien visitation because it would seem so strange to the ears of a 1950s teen. But thirty years after 1985, the year’s biggest blockbuster was a Star Wars movie about Darth Vader’s grandkid… which was directed by a filmmaker, J. J. Abrams, who was coming off rebooting Star Trek… which was part of a wider cinematic landscape dominated by “presold” comic-book properties developed when the baby boomers were young. A Martina McFly visiting the Reagan-era past from the late 2010s wouldn’t have a Vader/ Vulcan prank to play, because her pop culture and her parents’ pop culture are strikingly the same….
by Tanner Greer, The Scholar's Stage |  Read more:
Image: via

Wednesday, May 14, 2025

Kazuo Ishiguro: A Pale View of Hills

Kazuo Ishiguro still remembers where he was when he wrote A Pale View of Hills: hunched over the dining room table in a bedsit in Cardiff. He was in his mid-20s then; he is 70 now. “I had no idea that the book would be published, let alone that I had a career ahead of me as a writer,” he says. “[But] the story remains an important part of me, not only because it was the start of my novel-writing life, but because it helped settle my relationship with Japan.”


First published in 1982, A Pale View of Hills is a charged family story that connects England with Japan and the present with the past. Now along comes a film version to provide a new frame for the mystery, a fresh view of the hills. Scripted and directed by Kei Ishikawa, it is a splendidly elegant and deliberate affair; a trail of carefully laid breadcrumbs that link a mothballed home in early 80s suburbia with wounded, resilient postwar Nagasaki. Middle-aged Etsuko is long settled in the UK and haunted by the fate of her displaced eldest child. Her younger daughter, Niki, is a budding writer, borderline skint and keen to make a name for herself. Niki has a chunky tape-recorder and plenty of time on her hands. She says, “Mum, will you tell me about your lives before, in Japan?”

In awarding Ishiguro the Nobel prize for literature in 2017, the Swedish Academy paid tribute to the emotional force of his prose and his focus on “memory, time and self-delusion”. These are the themes that colour all his fiction, whether he is writing about the below-stairs staff at a stately home (The Remains of the Day), sacrificial children at an elite boarding school (Never Let Me Go) or aged wanderers in Arthurian Britain (The Buried Giant), although they seem closest to home in A Pale View of Hills.

The story lightly excavates the author’s family history and his own hybrid identity as a child of Nagasaki, transplanted to the UK at the age of five. Fittingly, the movie version premieres at the Cannes film festival, where it risks getting lost amid the palm trees, yachts and bling. Cultural dislocation, in large part, is what the tale is about.

I’m tempted to view Niki – the bumptious young writer from whom no family secret is safe – as Ishiguro’s alter ego. Actually, he says, she was conceived as “more a ‘reader proxy’ than a writer one”. She’s our entry point to the story; possibly our red thread through the maze. It’s hard to believe today, he adds, but most contemporary British readers were resistant to Japanese stories and characters and needed a reassuring western presence to help ease them in.

Niki is played in the film by Camilla Aiko, a recent graduate of the Bristol Old Vic theatre school. She sees the character as the story’s truth-seeker, the eyes of the audience, and the picture itself as the tale of two women who struggle to connect. “It didn’t cross my mind – maybe it should have – that I was playing Ishiguro,” she says.

What she shares with the author is the same blended cultural heritage. Aiko is British mixed-race – her mother is Japanese. “And the thing about being mixed-race is that I find it difficult speaking for Japanese people or British people because I’m not sure which side I’m on. In Japan I’m a foreigner; here I’m Asian. As an actor I’m someone who tries to slip through the cracks.”

Niki isn’t Ishiguro. Nonetheless, the author admits that there are parallels. He says, “Where I see myself in Niki – and I was reminded of this watching Camilla Aiko’s fine performance – is in her sometimes uncomfortable, sometimes coy and cunning curiosity when coaxing memories from her mother of another, more troubled time.”

It is the mother, after all, who looms largest in the tale. Etsuko in a sense has led two lives and been two different people. In 80s England she is a respectable widowed music teacher. In Nagasaki seven years after the atomic bomb dropped, she’s a harried young bride, contaminated with radiation and a potential hazard to her unborn child. She needs a friend or an escape route, whichever comes first. But she is never an entirely reliable narrator – and the family story she tells Niki finally doesn’t add up.

What did Ishiguro’s own mother make of A Pale View of Hills? “I believe it remained special to her among my books,” he says. “A little before I started the book, with cold war tensions intensifying in the Reagan-Brezhnev era, she said to me she felt it was important she should relate to me some of her experiences in Nagasaki. Partly because I was of the next generation, but also because I was wanting to be a writer and had a chance to pass things on … A Pale View of Hills didn’t use any of her stories directly, but I think she thought the book was some sort of evolution of them, and closer to her than the books I wrote later.” Ishiguro’s mother died in 2019, aged 92. After watching Ishikawa’s adaptation, he thought: “What a pity she wasn’t here to see this film.”

Cinema is an enduring passion for Ishiguro and influences his writing as much as literature does. His favourite recent films include the Oscar-winning animation Flow, about a small soot-grey cat who survives a great flood, plus the French legal dramas Anatomy of a Fall and Saint Omer (“Is French justice really conducted like this? Or are these hallucinatory versions of French courts?”).

A few years back, between novels, he wrote the screenplay for Living – a quietly wrenching adaptation of Akira Kurosawa’s 1952 classic Ikiru, relocated to London and starring Bill Nighy and Aimee Lou Wood. The poster for Ikiru, incidentally, can be glimpsed on the street in A Pale View of Hills.


Loving film can be a double-edged sword. Is it a help or a hindrance when it comes to having his own work adapted? Hopefully the former, Ishiguro says, so long as he maintains a safe distance. “I have a strict rule not to attempt to adapt any of my novels myself,” adds the writer, who is speaking to me by email. “As long as I keep well in the background, I don’t think I’m necessarily a hindrance. I always emphasise to film-makers that they have to own the film – that it shouldn’t be approached reverentially.”

Merchant-Ivory managed a near perfect adaptation of The Remains of the Day. Mark Romanek and Alex Garland crafted an appropriately haunting, chilly version of Never Let Me Go. Both films preserve Ishiguro’s distinctive style and flavour. The restraint and simplicity; the sense of deep mystery. Both, though, remain films first and foremost. They have been allowed to migrate and adapt to a new habitat.

“This is personal to me,” he says, “but I lean toward the film version moving the story on – not being a faithful translation the way a foreign language edition of a book might be. I know many novelists who’d be annoyed to hear me say this … The thing is, I watch many, many films and when an adaptation of a well-known book doesn’t work, 95% of the time it’s because the film-makers have been too reverential to the source.” Books and films are very different, he thinks. “They’re sometimes almost antithetical.”

In A Pale View of Hills, Etsuko hands her story on to Niki. Niki, in turn, will write it up how she likes. So this is a family story about family stories, aware of how they warp and change in the telling. Every tale is subject to the same cultural static. They are adapted and extrapolated, lost and found in translation. One might even say that’s what keeps a story alive.

by Xan Brooks, The Guardian |  Read more:
Images: Chris Pizzello/Invision/AP; Pale View Partners; YouTube

Thursday, May 8, 2025

Harrison Ford and the Origin of Western Civilization

TED:

So what do you want to talk about today?

INTERVIEWER:

Today I want you to stop acting so elitist—that’s why we’re going to talk about action films. What’s your favorite?

TED:

I’m not as elitist as you think. I’ve written hundreds of essays about science fiction, horror stories, locked-room mysteries, TV westerns, and other types of popular entertainment.

By the way, I love action movies of all sorts—I even have a Jackie Chan poster on my bedroom wall.

INTERVIEWER:

Is that true?

TED:

No, I just made that up.

But I do enjoy Jackie Chan’s movies, especially the early ones. I would consider putting a Jackie Chan poster on the wall, but Tara would veto that.

She already made me take down my autographed photo of Jake LaMotta—she said it clashed with the decor.


INTERVIEWER:

She is probably right. But let’s go back to my original question. What’s your favorite action film?

TED:

That’s hard to answer. There was a very good movie about LaMotta…

INTERVIEWER:

That doesn’t count. It wasn’t a real action movie. Pick another one.

TED:

Huh? There were plenty of fight scenes in it. But I’ll take you at your word, and choose another movie.

[Ted stops and thinks.]

Okay, I’ve got an answer for you. The action movie I’ve seen most often is The Fugitive—starring Harrison Ford and Tommy Lee Jones. I’ve watched it so many times, I’ve lost count.

INTERVIEWER:

What do you like about it?

TED:

For a start, it’s the exact counterpart of Homer’s Odyssey….

INTERVIEWER:

Gimme a break, you’re doing it again. I said no elitist stuff today. So you’re not allowed to talk about Homer and ancient epic poetry.

TED:

Hey, hear me out. Homer’s Odyssey is also an adventure story—and not for elites. This story has entertained youngsters for thousands of years.

And it’s my favorite kind of adventure story.

INTERVIEWER:

Why is that?

TED:

The Odyssey was the first adventure story in Western culture about a hero who prevails through intelligence and reasoning, not fighting and bloodshed.

That’s a big deal. It signals the moment when the West emerged from savagery—assuming that we have emerged from savagery.

Odysseus is not a brave soldier—if you’ve read Pseudo-Apollodorus, you will know that he tried to avoid fighting in the Trojan War by pretending to be crazy.

INTERVIEWER:

Sudoku app adores us? What the devil are you talking about?

TED:

Don’t worry about Sudoku. I’m trying to explain that Odysseus was the first adventurer who hates adventure. There’s a postmodern concept for you. He doesn’t even like fighting—he prefers to use his wiles and cunning.

This is the greatest turning point in Western culture. We finally have an alternative to the reciprocal violence that dominates so much of human history. The worst mistakes we’ve made in the West have taken place when we have forgotten that alternative.

But, of course, it’s also a breakthrough in storytelling.

Homer’s previous epic, the Iliad, is all about bravery and violence on the battlefield. Some 240 battlefield deaths are described during the course of that brutal poem—frequently related in grisly detail.

But the Odyssey is totally different. The hero is actually portrayed as a coward.

Homer drops a hint when he says that Odysseus places his ship in the exact middle of all the Greek boats on the shore of Troy—that’s the safest place in the event of a surprise attack by the Trojans. Homer doesn’t say it explicitly, but he implies that Odysseus always had an escape plan, and needed to ensure that his ship was available for a hasty retreat.

INTERVIEWER:

What does this have to do with Harrison Ford and The Fugitive?

TED:

It has everything to do with it. In The Fugitive, Harrison Ford succeeds through cunning and intelligence. There’s that great scene when Ford’s colleague tells Tommy Lee Jones: “You will never find him. He is too smart.”

Just as the Odyssey represents a shift away from the obsessive violence of the Iliad, Harrison Ford turns his back on the constant battling of his previous incarnations in Star Wars and Indiana Jones.

In the movie poster for The Fugitive, Ford is actually running away from the fight—much like Odysseus tried to do.

So this is a great moment in Hollywood action movies. There’s actually very little fighting in The Fugitive. Ford even risks capture at one point by saving a person’s life. And that makes perfect sense because he is playing Dr. Richard Kimble, who—like all doctors—has taken a Hippocratic oath to avoid harm and do good.

That’s why The Fugitive is so satisfying to watch. We finally have a hero who really does good deeds and avoids reciprocal violence. And when he must engage in conflict, he out-thinks his opponent—instead of fighting and killing.

In fact, the entire point of the film is that Dr. Kimble is an innocent man. He has been falsely accused (of murdering his wife), and his only goal in this movie is to prove his innocence and his commitment to doing good.

I won’t give away spoilers. But in the final minutes of the film, he applies that Hippocratic Oath to do good through medicine and healing in a very unexpected way. You might even say that he saves thousands of people—in addition to himself.

But there are many other similarities between The Fugitive and Homer’s great epic the Odyssey.

INTERVIEWER:

What other similarities?

TED:

Like all great epic poets, Homer starts the Odyssey in the middle of the story—literary critics call this in medias res. Homer may even have invented this storytelling technique.

The Fugitive follows the same pattern. The movie begins after our hero Dr. Richard Kimble has been falsely accused and convicted of his wife’s murder. So (as in the Odyssey) we must learn about these incidents through flashbacks.

In the case of the Odyssey, our hero must battle a one-eyed monster—the Cyclops!—in order to survive and prevail. The same thing happens in The Fugitive, except that Dr. Kimble needs to deal with a one-armed monster who murdered his wife.

INTERVIEWER:

This is just coincidence. Stop playing games with me…

TED:

You’re totally wrong about that.

Let me ask you a question now. What’s the name of the one-armed man in The Fugitive?

INTERVIEWER:

I have no idea.

TED:

The character’s name is Sykes. This reference to the Cyclops would be obvious to any classicist in the audience.

Can’t you see that the filmmaker wants to remind us of the Odyssey?

INTERVIEWER:

You’re blowing my mind. Is that for real?

TED:

Go ahead, check it out for yourself.

But let me go on. There’s a whole web of connections here.

I’m not even going to talk about the obvious ones—for example, Homer frequently refers to Odysseus as “great-hearted” while Dr. Kimble is an actual heart surgeon. And Odysseus’s troubles began with Helen of Troy, while Dr. Kimble’s problems begin with his wife Helen—both victims of fighting men who intrude into their peaceful lives.

Those are just tiny details. The plot is the main source of my interest here.

In the Odyssey, our hero must survive a ship wreck—and later must escape from captivity on an island, where Calypso wants to hold him for the rest of his life. In The Fugitive, Harrison Ford needs to survive a train wreck—which allows him to escape from captivity as a prisoner on death row, where he would otherwise spend the rest of his life.

In the Odyssey, our hero eventually returns unexpectedly to his native land—the island of Ithaca—where he faces his final and greatest challenges. In The Fugitive, the US Marshals are shocked when Dr. Kimble returns to—can you guess it?—his home town of Chicago.

That’s the last thing they expected from a runaway fugitive. “Sonofabitch,” declares Tommy Lee Jones, “our boy came home.”

But, of course, a homecoming is necessary in this type of adventure story. These heroes must return home to resolve all the dangers and obstacles they face. And in that familiar terrain, both heroes prevail against heavy odds.

By the way, both the Odyssey and The Fugitive culminate with an unexpected confrontation in a crowded banquet hall in that same home town. The parallelism is now completed.

And this brings me to my favorite part of the story.

by Ted Gioia, The Honest Broker |  Read more:
Images: Ted Gioia and The Fugitive
[ed. Interesting take, even though The Fugitive was originally produced for TV in 1963 (starring David Janssen), and as far as I know had none of these themes/connections.]

Wednesday, May 7, 2025

Is This the Worst-Ever Era of American Pop Culture?

Last year, I visited the music historian Ted Gioia to talk about the death of civilization.

He welcomed me into his suburban-Texas home and showed me to a sunlit library. At the center of the room, arranged neatly on a countertop, stood 41 books. These, he said, were the books I needed to read.

The display included all seven volumes of Edward Gibbon’s 18th-century opus, The Decline and Fall of the Roman Empire; both volumes of Oswald Spengler’s World War I–era tract, The Decline of the West; and a 2,500-year-old account of the Peloponnesian War by Thucydides, who “was the first historian to look at his own culture, Greece, and say, I’m going to tell you the story of how stupid we were,” Gioia explained.

Gioia’s contributions to this lineage of doomsaying have made him into something of an internet celebrity. For most of his career, he was best-known for writing about jazz. But with his Substack newsletter, The Honest Broker, he’s attracted a large and avid readership by taking on contemporary culture—and arguing that it’s terrible. America’s “creative energy” has been sapped, he told me, and the results can be seen in the diminished quality of arts and entertainment, with knock-on effects to the country’s happiness and even its political stability.

He’s not alone in fearing that we’ve entered a cultural dark age. According to a recent YouGov poll, Americans rate the 2020s as the worst decade in a century for music, movies, fashion, TV, and sports. A 2023 story in The New York Times Magazine declared that we’re in the “least innovative, least transformative, least pioneering century for culture since the invention of the printing press.” An art critic for The Guardian recently proclaimed that “the avant garde is dead.”

What’s so jarring about these declarations of malaise is that we should, logically, be in a renaissance. The internet has caused a Cambrian explosion of creative expression by allowing artists to execute and distribute their visions with unprecedented ease. More than 500 scripted TV shows get made every year; streaming services reportedly add about 100,000 songs every day. We have podcasts that cater to every niche passion and video games of novelistic sophistication. Technology companies like to say that they’ve democratized the arts, enabling exciting collisions of ideas from unlikely talents. Yet no one seems very happy about the results.

To a certain extent, such negativity may simply reflect an innate human tendency to fret about decline. Some of the most liberating developments in history have first triggered fears of social stultification. The advent of the printing press caused 15th-century thinkers to complain of mass distraction. In 1964, The Atlantic published an essay predicting, not unpersuasively, that rock and roll would only foster conformity and consumerism in young Americans.

For as long as I have been a critic at this magazine, I’ve tried to cut against the declinist impulse. The year I started the job, 2011, was a turning point of sorts: Spotify launched in America that July; Netflix debuted its first original series soon after. The brainy rock bands that I’d grown up loving—Radiohead, Wilco—were starting to fade in importance, but pop, hip-hop, and electronic music were cross-pollinating in fascinating ways. Understanding change, and appreciating how human creativity flourishes anew in each era, always seemed to be the point of the job.

Yet the 2020s have tested my optimism. The chaos of TikTok, the disruption of the pandemic, and the threat of AI have destabilized any coherent story of progress driving the arts forward. In its place, a narrative of decay has taken hold, evangelized by critics such as Gioia. They’re citing very real problems: Hollywood’s regurgitation of intellectual property; partisan culture wars hijacking actual culture; unsustainable economic conditions for artists; the addicting, distracting effects of modern technology.

I wanted to meet with some of the most articulate pessimists to test the validity of their ideas, and to see whether a story other than decline might yet be told. Previous periods of change have yielded great artistic breakthroughs: Industrialization begat Romanticism; World War I awakened the modernists. Either something similar is happening now and we’re not yet able to see it, or we really have, at last, slid into the wasteland. (....)

Stagnation

Cynicism

Acceleration

by Spencer Kornhaber, The Atlantic | Read more:
Image: Javier Jaén

Saturday, May 3, 2025


Peter Lorre as Raskolnikov in “Crime and Punishment” by Lusha Nelson, 1935.
via:

Wednesday, April 30, 2025

Wall Street’s Not-So-Golden Rule

We’re all familiar with the Golden Rule — Do unto others as you would have them do unto you — and I don’t think it’s a stretch to say that its message of reciprocity and empathy is the bedrock of human civilization, certainly of Judeo-Christian thought. As Hillel the Elder said, “What is hateful to you, do not do to your neighbor. That is the whole Torah. The rest is commentary.”

There’s a variation of the Golden Rule — I don’t think it’s a stretch to call it a perversion — that is the bedrock of the business of Money, a business that goes by the shorthand of ‘Wall Street’. This not-so-Golden Rule is the source of pretty much all of the unexpected Bad Things that happen from time to time in markets, where there’s a shock to the system that ‘no one could have foreseen’, like a sudden crash in the price of something or like a run on a bank or an investment firm. That perversion of the Golden Rule is this:

Do unto others as they would do unto you. But do it first.

It’s a perversion of the Golden Rule in two ways. First and most obviously, it’s got that extra sentence about doing the thing before the other guy. But second and less obviously, it’s normative-negative, which is a ten-dollar phrase to say that it’s not talking about doing good things (‘as you would have them do’), but is pretty obviously saying that you should do something that will actively hurt the other guy.

If you’re in the business of Money for more than a nanosecond, you will see this not-so-Golden Rule in action all around you. More to the point, if you want to stay in the business of Money and be successful in the business of Money, you must adopt and live by this not-so-Golden Rule yourself. Seems harsh, I know, but as Hyman Roth so aptly put it in The Godfather, Part II, “this is the business we have chosen.”

And it IS harsh. You can rationalize it by saying that he would have done the same thing to you if the situation had been reversed — and you are almost certainly correct in that assessment! — but the fact remains that YOU are doing the negative thing to the other guy. If you’re a thinking, feeling, non-sociopathic human being you will feel bad about doing that negative thing, but you will also get over it pretty quickly because it is absolutely, unequivocally, 100% the rational thing to do, and if you’ve been entrusted with managing Other People’s Money you have a moral if not legal obligation to do that rational thing despite the blecch feeling you have inside.

The first time I experienced that blecch feeling keenly was in December 2007 when I called our Bear Stearns rep and told him that we had decided to leave Bear Stearns as our hedge fund’s prime broker and we were pulling our money out. A prime broker is basically the ‘bank’ for a hedge fund. They provide lots of services, but the main ones are that they lend you money against the value of your portfolio so that you can buy more stock without using actual cash to go long (bet that the stock price will go up), and they locate and secure the shares of stock that you have to borrow in order to go short (bet that the stock price will go down). In exchange you pay them interest on the ‘leverage’ you used to buy more stock, just like you’d pay interest on a bank loan, and even more importantly from their perspective (and also just like a bank) you ‘deposit’ your stock holdings and some cash with them, which they can use to fund the loans and leverage they’re making available to other clients. It’s arguably the most important counterparty relationship that most hedge funds will have, certainly back then, and it’s a very profitable business for Wall Street investment banks, certainly back then.
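The prime brokerage economics described above can be sketched with toy numbers. This is purely an illustration of the mechanics (interest on long leverage plus fees on borrowed shares); all rates and figures here are hypothetical, not actual Bear Stearns terms.

```python
# Toy model of the prime brokerage relationship described above.
# All figures are hypothetical illustrations of the mechanics.

def financing_cost(borrowed_cash: float, margin_rate: float) -> float:
    """Annual interest paid on cash the prime broker lends to buy extra stock."""
    return borrowed_cash * margin_rate

def short_borrow_cost(short_exposure: float, borrow_fee: float) -> float:
    """Annual fee paid to borrow the shares sold short."""
    return short_exposure * borrow_fee

fund_equity = 100_000_000       # $100M hedge fund
gross_longs = 150_000_000       # long positions (1.5x leverage)
gross_shorts = 80_000_000       # short positions

borrowed = gross_longs - fund_equity   # cash lent by the prime broker
annual_cost = (financing_cost(borrowed, 0.06)
               + short_borrow_cost(gross_shorts, 0.005))
print(f"Annual cost paid to the prime broker: ${annual_cost:,.0f}")
# → Annual cost paid to the prime broker: $3,400,000
```

The other side of the ledger, as the excerpt notes, is that the fund’s stock and cash sit on deposit with the broker, funding loans to other clients, which is exactly why counterparty risk cuts both ways.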

What you need to understand is that I didn’t like working with Bear Stearns … I loved working with Bear Stearns. Loved the people, loved the attitude, loved the business terms. Bear Stearns was famously unafraid to take a chance on up-and-comers, both in its hiring of non-pedigreed entry-level employees (preferring, in legendary CEO Ace Greenberg’s words, to hire people who were ‘PSDs’: poor, smart, with a deep desire to be rich) and in its willingness to work with non-pedigreed hedge funds like mine. To be sure, it helped that the larger firm of which my fund was a part was filled with ex-Bear employees, all friends who would vouch for me and my partner. This was back in the day when vouching for someone meant something. It still does, I suppose, but a lot less than it used to. Bear stepped up to be our hedge fund’s prime broker from the very start, putting real time and real effort into a dinky little fund when nobody else would. Yes, they made good money off our business as we grew into a non-dinky fund, but I also owed a personal debt of gratitude to Bear Stearns for taking a chance on us.

And it didn’t matter.

Once I figured out in late fall of 2007 that if we had a nationwide decline in home prices, Bear Stearns faced enormous potential losses in the mortgage-backed securities that they owned, losses big enough to wipe out the entire bank because of their internal leverage on assets – or rather, once I suspected that I had figured this out, because you never know this stuff for sure unless you’re on the inside — then I knew for a certainty that it was only a matter of time before other prime broker clients of Bear Stearns would come to the same suspicion. And once that word got around — that there were doubts and suspicions about Bear Stearns as a counterparty — then I knew for a certainty that what would start as a trickle of clients taking their money out of the prime brokerage ‘bank’ would become a stream and then a river and then … well, then the dam breaks and the investment bank fails and if you’re still there as a prime brokerage client you get really, really hurt.

It didn’t matter if I was right about Bear Stearns and the risks to their balance sheet. I was, but I swear that didn’t matter. What mattered was the not-so-Golden Rule of Wall Street. What mattered is that you must act first when you have even a suspicion of counterparty risk, well before you know for sure whether or not you are ‘right’ about that risk, because everyone else on Wall Street will act first if you don’t. And if you don’t act first, or at least early … if you wait until you’re sure that there’s a counterparty risk … well, you’re screwed.


In December 2007, Bear Stearns still traded for over $100/share. In three months, it was below $5, before finally being taken out by JP Morgan for $10/share in a mercy killing. From suspicions to lights out in three months. Life comes at you fast when the not-so-Golden Rule of Wall Street comes into play. Getting out when we did saved our fund untold hassle and legal tie-ups, gave us the time to move to another prime broker out of strength and not desperation, and set us up for a career-making year in 2008.

Is this sort of run on the bank a self-fulfilling prophecy of doubt and ruin? Yep. If everyone had just kept their prime brokerage account in place would Bear Stearns have survived? Maybe. Do you have a choice but to get out before everyone else does, no matter how much it pains you personally and no matter how much your getting out might accelerate the sad and disappointing outcome? Nope. This is the business we have chosen. (...)
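The trickle-to-dam-break dynamic Hunt describes is a classic threshold cascade: each client pulls its account once the fraction of clients already gone crosses that client’s own suspicion threshold. A minimal simulation, with made-up thresholds, shows how a couple of early movers can tip everyone else:

```python
# Toy simulation of the self-fulfilling run described above: each client
# withdraws once the fraction already withdrawn exceeds that client's
# suspicion threshold. Thresholds here are invented for illustration.

def run_cascade(thresholds: list[float]) -> float:
    """Return the fraction of clients that ultimately withdraw."""
    n = len(thresholds)
    withdrawn = sum(1 for t in thresholds if t <= 0)  # first movers
    while True:
        frac = withdrawn / n
        now_out = sum(1 for t in thresholds if t <= frac)
        if now_out == withdrawn:      # cascade has stabilized
            return frac
        withdrawn = now_out

# Two first movers (threshold 0) drag out most of the rest.
thresholds = [0.0, 0.0, 0.05, 0.10, 0.20, 0.35, 0.50, 0.70, 0.90, 0.95]
print(run_cascade(thresholds))  # → 0.8
```

Whether the run goes to completion depends entirely on how the thresholds are distributed, which is why “maybe” is the honest answer to whether Bear Stearns could have survived if everyone had stayed put.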

Why am I telling you this story?

I’m telling you this story because I think that Trump a) recognizes he made a mistake by overplaying the tariff card, b) is sidelining the ideologue pro-tariff crew like Navarro and Miran, and c) is actively looking for off-ramps and de-escalation in the China trade war. I think he may find an off-ramp and de-escalation in the China trade war, and that would be a wonderful thing for the United States and the world.

And it doesn’t matter.

by Ben Hunt, Epsilon Theory |  Read more:
Image: Margin Call (2011); Godfather Part II
[ed. Trust lost is almost impossible to regain. See also: ‘Trump wanted to break us’, says Carney as Liberals triumph in Canadian election (Guardian); and (the not to be missed) Crashing the Car of Pax Americana (Epsilon Theory).]

"Mirroring a theme of the campaign, Carney told election-night supporters that Trump wanted to “break us, so that America can own us”, adding: “That will never, ever happen,” to shouts from the crowd.

He also gave a stark assessment of a world order once defined by an integrated global trading system with the US at the centre, saying such a system was over, and he pledged to reshape Canada’s relationships with other nations.

“We are over the shock of the American betrayal. But we will never forget the lessons,” he said."

[ed. And this: 2035: An Allocator Looks Back Over the Last 10 Years (AQR):]

"We really did not see this underperformance coming. After all, the prior 30 years saw much higher IRRs on private equity than total returns on public equity. What we didn’t count on, I mean who could see this coming, was this outperformance reversing. I mean, what better way is there to estimate what will happen in the future than looking at what happened in the past!?"