Sunday, May 19, 2024

Aldous Huxley & George Orwell


Aldous Huxley Explains How Man Became “the Victim of His Own Technology” (1961) (Open Culture)

"Having written his acclaimed dystopian novel Brave New World thirty years earlier, Huxley was established as a seer of possible technology-driven totalitarian futures. He understood that “we are a little reluctant to embark upon technology, to allow technology to take over,” but that, “in the long run, we generally succumb,” allowing ourselves to be mastered by our own creations. In this, he resembles the Julia of Byron’s Don Juan, who, “whispering ‘I will ne’er consent’ – consented.” Huxley also knew that “it is possible to make people content with their servitude,” even more effectively in modernity than antiquity: “you can provide them with bread and circuses, and you can provide them with endless amounts of distraction and propaganda” — delivered, here in the twenty-first-century, straight to the device in our hand."


George Orwell’s Political Views, Explained in His Own Words (Open Culture)

"Among modern-day liberals and conservatives alike, George Orwell enjoys practically sainted status. And indeed, throughout his body of work, including but certainly not limited to his oft-assigned novels Animal Farm and Nineteen Eighty-Four, one can find numerous implicitly or explicitly expressed political views that please either side of that divide — or, by definition, views that anger each side. The readers who approve of Orwell’s open advocacy for socialism, for example, are probably not the same ones who approve of his indictment of language policing. To understand what he actually believed, we can’t trust current interpreters who employ his words for their own ends; we must return to the words themselves." (...)

“His concerns with the Soviet Union were part of a broader concern on the nature of truth and the way truth is manipulated in politics,” Chapman explains. An important part of his larger project as a writer was to shed light on the widespread “tendency to distort reality according to their political convictions,” especially among the intellectual classes.

“This kind of thing is frightening to me,” Orwell writes in “Looking Back on the Spanish War,” “because it often gives me the feeling that the very concept of objective truth is fading out of the world”: a condition for the rise of an ideology that “not only forbids you to express — even to think — certain thoughts, but it dictates what you shall think, it creates an ideology for you, it tries to govern your emotional life as well as setting up a code of conduct.”

[ed. Two paths with opposing visions, same results - the subjugation of human freedoms.]

The Tragedy of Eva Cassidy

Today marks the 25th anniversary of Eva Cassidy’s death at age 33, and the passing of time hardly softens the blow. True, other music stars also die young, but they almost always enjoy a taste of fame and fortune before they leave us—and Cassidy had none of that. Fans celebrate her posthumous renown and record sales, but her actual life brought her mostly rejection, financial struggles, and illness.

The biggest concert of her career took place in front of a tiny audience. Her breakout music video was made on a handheld camcorder. Her most important record was self-financed. All the accolades came after her death on November 2, 1996.

Eva Cassidy would eventually sell more than ten million records, and dominate the charts with three albums and a hit single. But during most of her life, Cassidy’s music didn’t even pay the rent, and she worked for fourteen years at Behnke Nurseries in Largo, Maryland—where she watered plants, transplanted seedlings, unloaded huge bales of peat moss or truckloads of trees, and undertook a range of other greenhouse responsibilities.

Cassidy was only five feet two inches tall, but she did physically arduous work day after day, sometimes the only woman on a crew of men. It was dirty, tiring labor, and she kept it up as long as she could. But then the medical problems started. (...)

It’s a miracle that her beloved album Live at Blues Alley was even recorded. She had to cash in a small pension to cover the costs—and with all the other medical expenses, putting that money into a recording must have struck many as foolish. And even after setting up equipment to record her two-day booking at Blues Alley in Washington, D.C., technical problems forced her to discard all the tracks from the first night.

So it all came down to one evening, January 3, 1996, when Cassidy showed up for a final chance at a live album—just three days before a huge blizzard shut down the entire city. Cassidy herself was suffering from a cold, and wondered if any of the music would be worth releasing. But at this point, there was no turning back, and she took the stage, ready to sing with all the heart and soul she possessed. (...)

Cassidy was exactly the kind of trend-breaking artist Hammond sought out, but in the mid-1990s the music industry was different, and Eva Cassidy was rejected for the very fact that she didn’t fit easily into any genre pigeonhole.

This was both her preference and a necessity, given her skills and opportunities. As a freelancer, Cassidy had been forced to prove that she could sing any style. And even now it’s tempting to focus on just one of her skills, maybe her ability with slow ballads or jazz tunes. But some of her finest moments came with the least likely material.

For example, I sit in rapt admiration when I hear Cassidy sing the old folk ballad “Wayfaring Stranger”—which she turned into a soulful groove number. If you want to know how strange that decision was, listen to the way this song was originally sung. It’s one of the starkest traditional songs in the whole Anglo-American canon, and even though it has been updated, usually by country or folk singers, none of those versions even begins to prepare us for what Eva Cassidy achieves.

I call particular attention to how she raises her ambitions and intensity with each passing chorus—and 3:40 into the performance you feel she can’t possibly lift the level of her singing any higher. But she reaches deep, deep inside and delivers something you have to hear to believe.


It gives me chills to listen to this. But nobody talks about Eva Cassidy as a soul singer—and simply because there’s so much else she does, you could miss a track like this. But don’t.

By the same token, I never hear anyone describe Cassidy as a blues singer. But listen to what she does with “Stormy Monday,” and you will realize she could have built a whole career on raw, gritty songs of this sort.


But her most unlikely success was achieved with a song that was nearly sixty years old, and performed so often that few would expect it had any new secrets to share. But at Blues Alley that night, Cassidy decided to sing “Over the Rainbow” from the 1939 film The Wizard of Oz. Once again, this is the last thing you would do if you were aiming for a hit pop record in the digital age, but Cassidy picked songs because she loved them, not because they matched the items on an A&R executive’s checklist.

Like me, Cassidy had heard this song every year as a child, when The Wizard of Oz was broadcast as an annual ritual on network television. She had performed it previously at a high-profile Washington DC music award show and left the audience stunned. “When she came out, I was just worried, you know, the audience was milling around and talking,” the show’s promoter Mike Schreibman later recalled. Eva’s father said that he heard someone remark: “Don’t tell me that little girl is going to try ‘Over the Rainbow’ on THIS crowd.” But they had never heard it sung like this before. “When she started to sing, they just… stopped,” Schreibman continues. “So many times I’ve heard since then, that was the first time they heard her, and how great she was. Ron Holloway said that he was on the way out the door but when he heard Eva he came back in.”

So at Blues Alley, with the recording equipment that her cashed-out pension had paid for capturing this one night of music, she decided to sing it again, accompanying herself on guitar. And this performance, also preserved on film, did more than anything to catapult her to fame.


by Ted Gioia, The Honest Broker |  Read more:
Videos: YouTube

Saturday, May 18, 2024

The Enigma of Rickie Lee Jones

There are many ways to approach the story of Rickie Lee Jones. But let’s start with an anecdote from studio drummer Jeff Porcaro, who was called in as a session player on RLJ’s second album Pirates—allegedly because Jones had admired his brush work at a previous encounter.

Porcaro, a legend in the world of studio musicians, later recounted the story:

“What a great thing. I go to the session, it's Chuck Rainey on bass, Dean Parks on guitar, Russell Ferrante on piano, Lenny Castro on percussion, and Rickie Lee Jones playing piano and singing. The drums are in an isolation booth with a big glass going across so I can see everybody in the main studio. I have my headphones on, and we start going over the first song. After the first pass of the tune, Rickie Lee in the phones goes, ‘Mr. Porcaro, I know you're known for keeping good time, but on these sessions, I can't have you do that. With my music, when I'm telling my story, I like things to speed up and slow down, and I like people to follow me.’”

Porcaro, a consummate professional, asks the engineer to give him more of Jones’s vocal in his headphones so he can follow her rhythmic shifts. The band starts the song again—but Jones stops it halfway through and tells the drummer: “The time is too straight. You gotta loosen up a bit.” Porcaro apologizes, and asks for still more of the vocal track in his headphone mix. On the next take, he focuses closely on Jones’s singing, speeding up and slowing down in response to every twist and turn. But RLJ is still displeased. She halts the take and starts complaining again about the beat. Now every musician on the date—a group of all-star studio players—is tense and anxious. The next take is so unsettled that the producer calls a break.

After the break, the band switches to another song—to try to get their mojo back. Porcaro describes what happened next:

“So we start laying the track down, and I come up to this simple fill: triplets over one bar. It's written out on my music, and I play the fill. She stops. She says, You have to play harder. . . . Everybody looks at me. I look at everybody. I go, ‘Okay, let's do it again.’ We start again. One bar before the fill, I hear, louder than hell in my phones, We're coming up to the fill. Remember to play hard, while we're grooving. I whack the mess out of my drums, as hard as I've ever hit anything in my life. While I'm hitting them, she's screaming, Harder! I stop. She stops. I'm looking at my drums. My heads have dents in them; if I hit the drum lightly, it will buzz, and I'm pissed. I'm steaming inside. I'm thinking, ‘Nobody talks to me that way.’ [Producer] Lenny Waronker says, ‘Let's do it again.’ We start again, and everybody is looking at me while they are playing. We're coming up to the fill, and she goes, Play hard! and I take my sticks like daggers and I do the fill, except I stab holes through my tom-tom heads. I land on my snare drum, both sticks are shaking, vibrating, bouncing on the snare drum.

“I get up and pick up my gig bag. There's complete silence. I slide open the sliding glass door, walk past her, down the hallway, get in my car, and I drive home.”

In the aftermath, Porcaro heard from another musician that Jones might sue him—but she let the incident pass. Pirates was eventually released, to critical acclaim, with Steve Gadd, another studio legend, handling much of the drum work.

But here’s the best part of the story. Three years later, Porcaro gets a call from a producer asking him to play on a Rickie Lee Jones session for the album The Magazine. This can’t be true, he thinks—has she forgotten their previous encounter? Maybe she was going through a bad spell back then, and doesn’t even remember the details? So Porcaro, taking pride in his professionalism and unwilling to hold grudges, decides to do the date, and see what happens.

When he shows up, Rickie Lee Jones greets him like an old friend: “Hi, Jeff, good to see you again. You seem to have lost weight.” The session takes place effortlessly and with excellent results. At the conclusion of the second song, Jones walks up to his drum kit and, in front of all the musicians—some of whom had been in attendance at the Pirates debacle—tells Porcaro: "Jeff, I really have to tell you this. No drummer has ever played so great for me, listened to my music so closely, understood what I'm saying with lyrics, and has followed me as well as you. I just want to thank you for the good tracks."

Porcaro’s reaction: “I almost broke up laughing because I had played no differently for her the year before.”

II.

But by the time of this second session with Porcaro in 1984, Rickie Lee Jones—the rising star whose creativity and artistry had, just a short while before, seemed to promise (or even demand) a long career at the top of the music business—had already seen her moment come and go, at least from the point of view of the industry. She wasn’t even thirty years old, but the critics were no longer charmed by her capriciousness. When The Magazine was released, the New York Times responded: “Miss Jones is still looking for direction.” But even casual fans could tell something was wrong. In the five years after the release of her million-selling debut album Rickie Lee Jones, this hot new songwriter only released one album and an EP —a total of 67 minutes and 11 seconds of music.

Do the math: that works out to 13 minutes of new music per year.

Clearly Jones was focused on something besides composing and recording. But, whatever her reasons, most of her audience had left, never to return. You can measure the impact on the Billboard chart: Rickie Lee Jones (1979) peaked at number 3 on the US album chart. Her follow-up Pirates reached as high as number 5. The Magazine topped out at number 44. The follow-up Flying Cowboys did somewhat better, reaching number 39. But with Pop Pop, released in 1991, Jones got no higher than number 121. The days of hit albums and large audiences were over, and they wouldn’t be coming back. In retrospect, her commercial high point as a pop star was her first album, which produced her only genuine radio hit, “Chuck E.’s in Love.”

This is a familiar story in the music business—a promising debut followed by disappointment. So why am I so troubled by the fall of Rickie Lee Jones? There’s a simple answer: her talent was extraordinary. She seemed poised not only to have hit songs—which, after all, aren’t a rarity in the entertainment world—but to do something even more remarkable, namely redefine the parameters of pop singing.

Her studio battle with Porcaro is all too revealing on this front, and it’s why I started my account by relating it. Rickie Lee Jones had a different concept of time than other singers. She could make it seem as if her voice was floating over the ground beat with the freshness and changeability of the shifting colors of a sunset. This is something you occasionally find in jazz, but even there it’s a rarity: few improvisers can force the beat into such total submission to their artistic vision. But Jones seemed to do it effortlessly—at least for a time.

The sad reality is that her declining sales were the result, to some extent, of her growing affinity for jazz. Her final exile from the Billboard top 100 albums came in response to the unabashed jazz sensibility of Pop Pop (1991). In this regard, Jones experienced the same backlash that ended Joni Mitchell’s run of hit albums after the release of her jazzy Mingus album. Before Mingus, every new Joni Mitchell album seemed destined to get into the top 20 on the chart—afterwards none of them would. But for Rickie Lee Jones, the decline was sharper and less forgiving. After all, Mitchell was embraced by the jazz community—Herbie Hancock even won a Grammy for Album of the Year with his River: The Joni Letters (2007). Rickie Lee Jones, in contrast, rarely received that kind of cherishing and celebration from jazz insiders—although she had perhaps the jazziest phrasing of any pop music star during the second half of the twentieth century.

Yet this rhythmic flexibility was only part of Rickie Lee Jones’s innovative approach to singing. She also had a way of moving from singing to spoken word and back again, while handling every gradation along the way. Listen again to her breakout hit, “Chuck E.’s in Love,” cueing the track at the 1:30 point, and hear what she does in the next thirty seconds. Was anyone else doing this in pop music? The short answer is: No, not even close.

by Ted Gioia, The Honest Broker |  Read more:
Image: Rolling Stone
[ed. One selection from Ted's Honest Broker archives. See more here: A Map to 'The Honest Broker', excerpted below.]

The 14 Sections of The Honest Broker

If this were a real broker’s store, I would break it down into 14 sections. Here’s the layout:

1. The Origin Story

My origins article is the single best guide to what I do, and how I ended up here. So I put this at the front of the store.
“How I Became the Honest Broker”

4. Futurists and Futurism

I often try to predict the future here. And I also look at great thinkers from the past who demonstrated an uncanny ability to anticipate social changes.

Here are some examples:

I Revisit My Doom-and-Gloom Forecasts
How Did a Censored Writer from the 1970s Predict the Future with Such Uncanny Accuracy?
The Lifestyle That Corporate America Killed
Every Prediction from My Teenage Years Turned Out Wrong
The Future of Big Cities—as Predicted in The Decline of the West (1922)

[more...]

Ted Gioia & Rick Beato: The Silent Takeover

[ed. More than just a simple discussion of AI's impact on music - actually, much, much more. An all-encompassing and granular dissection of the many elements affecting the entertainment (and more broadly, creative) industry today - technological, financial, artistic, etc. - and the implications all this will have for the future of our culture. For example, do readers know who Johan Rohr is? The 'artist' with more streams on Spotify (15 billion plus) than just about anyone, including Michael Jackson, Elton John, and others? No? Start at 8:10 to learn more about him.]

Friday, May 17, 2024

Three Bob Night

[ed. Seattle politics.]

We had some laughs with the spicy political tale of the “Three Bobs.” Some lawyers got paid. But after a few days of intrigue, the story is apparently over.

It shouldn’t be.

In case you’re not a political junkie, “Three Bob Night,” as The Seattle Times headline punsters dubbed it, broke out last Friday when two last-second candidates named “Bob Ferguson” filed to run as Democrats for governor. They were punking the other Bob Ferguson, the state attorney general who has been angling to run for governor for years.

It seemed a sham. And then a longtime Republican activist named Glen Morgan proudly admitted he did it to sabotage the Democrats.

“If I had started a little bit earlier, I would have been able to have six Bob Fergusons,” Morgan boasted to Times reporter Claire Withycombe. “I contacted about 12. I just ran out of time.”

Ha ha, it was going to be endless Bob summer. Except oops. Misleading voters in precisely this way has been against state law for 81 years. Turns out it’s a felony.

On Monday the other Bobs, rattled, withdrew their names. Conflict over, election back to normal. Right?

Not so fast. Here are some after-action thoughts on the Bobapalooza, and why I think what happened is no one-off. It’s worth more discussion, including by state legislators, who ought to consider changes to state election law.

Point No. 1. You know how Republicans have been out pretending to search for election fraud for years now? They’ve conducted phony audits, attended conspiratorial presentations, and traveled to “stop the steal” conferences hosted by that My Pillow guy.

Well damned if the fake quest didn’t finally turn up something real!

Sure, it’s more as if O.J. had finally caught the real killer. But this, it turns out, is what actual criminal election fraud looks like.

That it was carried out by the Washington State Republican Party’s 2023 “Volunteer of the Year” — who on his website says that election integrity and rooting out corruption are two of his passions — is the sort of irony that newspaper columnists are not permitted to pass up. I could lose my license.

Point No. 2. Joking aside, letting this pass with just a laugh is a ticket to more of the same.

The Three Bobs was kind of a low-rent version of the fake elector scheme of 2020, in which the Donald Trump campaign went around getting states to appoint bogus Electoral College representatives. Fake electors, fraudulent candidates — you think they wouldn’t repeat either in the future if they thought there’d be zero consequences?

To that end, state law doesn’t just say this was wrong for the two candidates. Another section of state law, RCW 29A.84.270 if you’re scoring at home, says that whoever comes up with the conspiracy to recruit the fake candidates also is guilty of a felony. That could be Morgan, the GOP activist. There also are possible civil fines up to the salary of the elected position being sought (governor, which pays $198,257).

It’s remarkable how this old law, passed in 1943, spells out exactly what happened here. It must have happened before.

Rather than confess he erred, Morgan has pledged “aggressive and extensive legal action” against anyone who challenges him. He also blamed the victim.

“The attorney general should be ashamed of himself,” Morgan told the Washington State Standard after his scheme had collapsed Monday. “He threatens the little people as always to promote himself.”

That’s some Trump-scale grievance and projection. Doesn’t sound like lessons were learned, does it? It means that something like this will probably just happen again.

Which brings me to Point No. 3. Why do we make it so easy to simply purchase spots on the public’s election ballot? 

A gobsmacking 30 candidates filed to run for governor. It could be worse: In 2020, the governor’s primary had 36 candidates. (...)

In the past we had a candidate named Mike the Mover who used the public ballot as an ad billboard for his company of the same name. He claimed he would get $150,000 of business in return for his roughly thousand-dollar candidate filing fee.

“This is vandalism to the ballot,” former King County Prosecuting Attorney Dan Satterberg said Monday. He was talking about the three-headed Bob, but it could be said about our ballot every year.

Sure this is what we get with democracy — the rough with the smooth. But maybe we could set our democracy bar just a bit higher?

The Secretary of State’s Office says anyone can file for governor if they pay $1,982, or if they submit 1,982 signatures of registered voters. All 30 candidates this year simply paid the fee. (The fraudulent two apparently had their fee money raised by Morgan.)

So why not require both? Any legit candidate for statewide office could get the contributions and the signatures from other politically engaged people.

by Danny Westneat, Seattle Times |  Read more:
Image: Claire Withycombe/The Seattle Times

Thursday, May 16, 2024

The Collapse Is Coming. Will Humanity Adapt?

The following conversation was recorded in March 2024. It has been edited for clarity and length.

Peter Watts: In this corner, the biosphere. We’ve spent a solid year higher than 1.5 degrees Celsius; we’re wiping out species at a rate of somewhere between 10,000 and 100,000 annually; insect populations are crashing; and we’re losing the West Antarctic Ice Sheet, no matter what we do at this point. Alaskapox has just claimed its first human victim, and there are over 15,000 zoonoses expected to pop up their heads and take a bite out of our asses by the end of the century. And we’re expecting the exhaustion of all arable land around 2050, which is actually kind of moot because studies from institutions as variable as MIT and the University of Melbourne suggest that global civilizational collapse is going to happen starting around 2040 or 2050.

In response to all of this, the last COP was held in a petrostate and was presided over by the CEO of an oil company; the next COP is pretty much the same thing. We’re headed for the cliff, and not only have we not hit the brakes yet, we still have our foot on the gas.

In that corner: Dan Brooks and Sal Agosta, with a Darwinian survival guide. So, take it away, Dan. Guide us to survival. What’s the strategy?

Daniel Brooks: Well, the primary thing that we have to understand or internalize is that what we’re dealing with is what is called a no-technological-solution problem. In other words, technology is not going to save us, real or imaginary. We have to change our behavior. If we change our behavior, we have sufficient technology to save ourselves. If we don’t change our behavior, we are unlikely to come up with a magical technological fix to compensate for our bad behavior. This is why Sal and I have adopted a position that we should not be talking about sustainability, but about survival, in terms of humanity’s future. Sustainability has come to mean, what kind of technological fixes can we come up with that will allow us to continue to do business as usual without paying a penalty for it? As evolutionary biologists, we understand that all actions carry biological consequences. We know that relying on indefinite growth or uncontrolled growth is unsustainable in the long term, but that’s the behavior we’re seeing now.

Stepping back a bit. Darwin told us in 1859 that what we had been doing for the last 10,000 or so years was not going to work. But people didn’t want to hear that message. So along came a sociologist who said, “It’s OK; I can fix Darwinism.” This guy’s name was Herbert Spencer, and he said, “I can fix Darwinism. We’ll just call it natural selection, but instead of survival of what’s-good-enough-to-survive-in-the-future, we’re going to call it survival of the fittest, and it’s whatever is best now.” Herbert Spencer was instrumental in convincing most biologists to change their perspective from “evolution is long-term survival” to “evolution is short-term adaptation.” And that was consistent with the notion of maximizing short term profits economically, maximizing your chances of being reelected, maximizing the collection plate every Sunday in the churches, and people were quite happy with this.

Well, fast-forward and how’s that working out? Not very well. And it turns out that Spencer’s ideas were not, in fact, consistent with Darwin’s ideas. They represented a major change in perspective. What Sal and I suggest is that if we go back to Darwin’s original message, we not only find an explanation for why we’re in this problem, but, interestingly enough, it also gives us some insights into the kinds of behavioral changes we might want to undertake if we want to survive.

To clarify, when we talk about survival in the book, we talk about two different things. One is the survival of our species, Homo sapiens. We actually don’t think that’s in jeopardy. Now, Homo sapiens of some form or another is going to survive no matter what we do, short of blowing up the planet with nuclear weapons. What’s really important is trying to decide what we would need to do if we wanted what we call “technological humanity,” or better said “technologically-dependent humanity,” to survive.

Put it this way: If you take a couple of typical undergraduates from the University of Toronto and you drop them in the middle of Beijing with their cell phones, they’re going to be fine. You take them up to Algonquin Park, a few hours’ drive north of Toronto, and you drop them in the park, and they’re dead within 48 hours. So we have to understand that we’ve produced a lot of human beings on this planet who can’t survive outside of this technologically dependent existence. And so, if there is the kind of nature collapse that the Melbourne Sustainable Studies Institute is talking about, how are those people going to survive? A completely dispassionate view would just say, “Well, you know, most of them won’t. Most of them are going to die.” But what if it turns out that we think that embedded within all of that technologically dependent society there are some good things? What if we think that there are elements of that existence that are worth trying to save, from high technology to high art to modern medicine? In my particular case, without modern medical knowledge, I would have died when I was just 21 years old of a burst appendix. If I had managed to survive that, I would have died in my late 50s from an enlarged prostate. These are things most would prefer not to happen. What can we begin doing now that will increase the chances that those elements of technologically-dependent humanity will survive a general collapse, if that happens as a result of our unwillingness to begin to do anything effective with respect to climate change and human existence?

Peter Watts: So to be clear, you’re not talking about forestalling the collapse —

Daniel Brooks: No.

Peter Watts: — you’re talking about passing through that bottleneck and coming out the other side with some semblance of what we value intact.

Daniel Brooks: Yeah, that’s right. It is conceivable that if all of humanity suddenly decided to change its behavior, right now, we would emerge after 2050 with most everything intact, and we would be “OK.” We don’t think that’s realistic. It is a possibility, but we don’t think that’s a realistic possibility. We think that, in fact, most of humanity is committed to business as usual, and that’s what we’re really talking about: What can we begin doing now to try to shorten the period of time after the collapse, before we “recover”? In other words — and this is in analogy with Asimov’s Foundation trilogy — if we do nothing, there’s going to be a collapse and it’ll take 30,000 years for the galaxy to recover. But if we start doing things now, then it maybe only takes 1,000 years to recover. So using that analogy, what can some human beings start to do now that would shorten the period of time necessary to recover? Could we, in fact, recover within a generation? Could we be without a global internet for 20 years, but within 20 years, could we have a global internet back again?

Peter Watts: Are you basically talking about the sociological equivalent of the Norwegian Seed Bank, for example?

Daniel Brooks: That’s actually a really good analogy to use, because of course, as you probably know, the temperatures around the Norwegian Seed Bank are so high now that the Seed Bank itself is in some jeopardy of survival. The place where it is was chosen because it was thought that it was going to be cold forever, and everything would be fine, and you could store all these seeds now. And now all the area around it is melting, and this whole thing is in jeopardy. This is a really good example of letting engineers and physicists be in charge of the construction process, rather than biologists. Biologists understand that conditions never stay the same; engineers engineer things for, this is the way things are, this is the way things are always going to be. Physicists are always looking for some sort of general law that holds in perpetuity, and biologists are never under any illusions about this. Biologists understand that things are always going to change.

Peter Watts: Well, that said, that’s kind of a repeated underlying foundation of the book, which is that evolutionary strategies are our best bet for dealing with stressors. And by definition, that implies that the system changes. Life will find a way, but it won’t necessarily include the right whales and the monarch butterflies.

Daniel Brooks: Right, right. Yeah. (...)

Peter Watts: Now, this is an argument that some might say is invasible by cheaters. I read this and I thought of the Simpsons episode where Montgomery Burns is railing to Lisa, and he says, “Nature started the struggle for survival, and now she wants to call it off because she’s losing? I say, hard cheese!” And less fictitiously, Rush Limbaugh has invoked essentially the same argument when he was advocating against the protection of the spotted owl. You know, life will find a way. This is evolution; this is natural selection. So, I can see cherry-picking oil executives being really happy with this book. How do you guard against that?

Daniel Brooks: Anybody can cherry-pick anything, and they will. Our attitude is just basically saying, look, here’s the fundamental response to any of this stuff. It’s, how’s it working out so far? OK? There’s a common adage by tennis coaches that says during a match, you never change your winning game, and you always change your losing game. That’s what we’re saying.

One of the things that’s really important for us to focus on is to understand why it is that human beings are so susceptible to adopting behaviors that seem like a good idea, and are not. Sal and I say, here are some things that seem to be common to human misbehavior, with respect to their survival. One is that human beings really like drama. Human beings really like magic. And human beings don’t like to hear bad news, especially if it means that they’re personally responsible for the bad news. And that’s a very gross, very superficial thing, but beneath that is a whole bunch of really sophisticated stuff about how human brains work, and the relationship between human beings’ ability to conceptualize the future, but living and experiencing the present.

There seems to be a mismatch within our brain — this is an ongoing sort of sloppy evolutionary phenomenon. So that’s why we spend so much time in the first half of the book talking about human evolution, and that’s why we adopt a nonjudgmental approach to understanding how human beings have gotten themselves into this situation. Because everything that human beings have done for 3 million years has seemed like a good idea at the time, but it’s only been in the last 100 or 150 years that human beings have begun to develop ways of thinking that allow us to try to project future consequences and to think about unanticipated consequences, long-term consequences of what we do now. So this is very new for humanity, and as a consequence, it’s ridiculous to place blame on our ancestors for the situation we’re in now.

Everything that people did at any point in time seemed like a good idea at the time; it seemed to solve a problem. If it worked for a while, that was fine, and when it no longer worked, they tried to do something else. But now we seem to be at a point where our ability to survive in the short term is compromised, and what we’re saying is that our way to survive better in the short term, ironically, is now based on a better understanding of how to survive in the long run. We’re hoping that people will begin seriously thinking that our short-term well-being is best served by thinking about our long-term survival.

Peter Watts: What you’ve just stated is essentially that short-term goals and long-term goals are not necessarily the same thing, that one trades off against the other. When you put it that way, it seems perfectly obvious — although I have to say, what you’re advocating for presumes a level of foresight and self-control that our species has, shall we say, not traditionally manifested. But yeah, a widely adhered-to view of evolution is a reactive one— the pool is drying up, and evolution looks at that and says, oh my goodness, the pool is drying up! We should probably get those fish to evolve lungs. Whereas what evolution actually does is say, oh look, the pool is drying up! Good thing that fish over in the corner that everybody picked on has a perforated swim bladder; it might be able to, like, breathe air long enough to make it over to the next pool. Too bad about all those other poor bastards who are going to die. And to hone that down to a specific example that you guys cite in the book, you’re saying “high fitness equals low fitness” — that you need variation to cope with future change.

Daniel Brooks: Right.

Peter Watts: So optimal adaptation to a specific environment implies a lack of variation. When you’re optimally adapted to one specific environment, you are screwed the moment the environment changes. And the idea that high fitness equals low fitness is what I call a counterintuitive obvious point: It is something that seems oxymoronic and even stupid when you first hear it, but when you think about it for more than two seconds, it’s like — who was it that responded to “The Origin of Species” by saying, Of course! How silly of me not to have thought of it myself. I’ve forgotten who said that.

by Peter Watts and Daniel Brooks, MIT Press |  Read more:
Image: MIT Press

Tom Gauld
via:



Bruce Springsteen

Wednesday, May 15, 2024

The Life and Death of Hollywood

In 2012, at the age of thirty-two, the writer Alena Smith went West to Hollywood, like many before her. She arrived to a small apartment in Silver Lake, one block from the Vista Theatre—a single-screen Spanish Colonial Revival building that had opened in 1923, four years before the advent of sound in film.

Smith was looking for a job in television. She had an MFA from the Yale School of Drama, and had lived and worked as a playwright in New York City for years—two of her productions garnered positive reviews in the Times. But playwriting had begun to feel like a vanity project: to pay rent, she’d worked as a nanny, a transcriptionist, an administrative assistant, and more. There seemed to be no viable financial future in theater, nor in academia, the other world where she supposed she could make inroads.

For several years, her friends and colleagues had been absconding for Los Angeles, and were finding success. This was the second decade of prestige television: the era of Mad Men, Breaking Bad, Homeland, Girls. TV had become a place for sharp wit, singular voices, people with vision—and they were getting paid. It took a year and a half, but Smith eventually landed a spot as a staff writer on HBO’s The Newsroom, and then as a story editor on Showtime’s The Affair in 2015.

I first spoke with Smith in August of last year, four months into the strike called by the Writers Guild of America against the members of the Alliance of Motion Picture and Television Producers (AMPTP), the biggest Hollywood studios. In 2013, she’d begun to develop the idea for what would become Dickinson, a gothic, at times surreal comedy based on the life of the poet Emily Dickinson. “I realized you could do one of those visceral, sexy, dangerous half hours but make it a period piece,” she said. “I was never trying to write some middle-of-the-road thing.” She sold the pilot and a plan for at least three seasons to Apple in 2017; she would be the showrunner, and the series had the potential to become one of the flagship offerings of the company’s streaming service, which had not yet launched.

Looking back, Smith sometimes marvels that Dickinson was made at all. “It centers an unapologetic, queer female lead,” she said. “It’s about a poet and features her poetry in every episode—hard-to-understand poetry. It has a high barrier of entry.” But that was the time. Apple, like other streamers, was looking to make a splash. “I mean, they made a show out of I Love Dick,” Smith said, referring to the small-press cult classic by Chris Kraus, adapted into a 2016 series for Amazon Prime Video. “That doesn’t happen because people are using profit as their bottom line.”

In fact, they weren’t. The streaming model was based on bringing in subscribers—grabbing as much of the market as possible—rather than on earning revenue from individual shows. And big swings brought in new viewers. “It’s like a whole world of intellectuals and artists got a multibillion-dollar grant from the tech world,” Smith said. “But we mistook that, and were frankly actively gaslit into thinking that that was because they cared about art.”

Making a show for Apple was not what she’d hoped it would be. What the company wanted from her and the series never felt clear—there was a “radical information asymmetry,” she said, regarding management’s priorities and metrics. After she and her colleagues completed the first season of Dickinson, they waited for the streamer to launch and the show to air. Their requests for a firm timeline and premiere date were ignored. Smith started to worry that Apple might scrap the idea for the streaming platform altogether, in which case the show might never be seen, or might even disappear—she didn’t have a copy of the finished product. It belonged to Apple and lived on the company’s servers.

“It was communicated to me,” Smith said, “that my only choice to keep the show alive was to begin all over again and write a whole new season without a green-light guarantee. So I was expected to take on that risk, when the entities that stood to profit the most from the success of my creative labor, the platform and studio, would not risk a dime.” “It was also on me,” she went on, “to kind of fluff everybody involved in the entire making of the show, from the stars to the line producer to the costume designer, etcetera, to make them believe that we’d be coming back again and prevent them, sometimes unsuccessfully, from taking other jobs.”

Finally, in late 2019, when Smith and her colleagues were two months into production on Season 2, the show premiered as one of the streamer’s four original series. It was an immediate critical success and a sensation on social media. “In Apple TV+’s initial smattering of shows,” wrote the Washington Post, “only ‘Dickinson’ is a delicious surprise.” It received a 2019 Peabody Award; in 2021 it made the New York Times’ list of best programs of the year and won a Rotten Tomatoes prize for Fan Favorite TV Series.

But Smith was losing steam. “I was only allowed to make the show to the extent that I was willing to take on unbelievable amounts of risk and labor on my own body perpetually, without ceasing, for years,” she said. “And I knew that if I ever stopped, the show would die.” It had seemed to her that Apple didn’t value the series, and she felt at a loss. Smith now knows that Dickinson was the company’s most-watched show in its second and third seasons. But at the time, she had no access to concrete information about its performance. As was the habit among streamers, Apple didn’t share viewership data with its writers. And without that data, Smith had no leverage. In 2020, after three seasons, she told Apple that she was done. “I said, I can’t do it anymore. And Apple said, Okay.”

“Passion can only get you so far,” she told me. But she’d stayed in Hollywood. “I’m an artist,” she said, “and I’m never going to stop creating.” The industry was still the only place one could make a real living as a writer. “When people say, Why stay in TV?” she said, “The answer is, There is nothing else. What do you mean?”
***
The truth was that the forces that had opened doors for Smith were the same ones that had made her individual work seem not to matter. They were the same forces that had been degrading writers’ working lives for some time, and they were cannibalizing the business of Hollywood itself.

Thanks to decades of deregulation and a gush of speculative cash that first hit the industry in the late Aughts, while prestige TV was climbing the rungs of the culture, massive entertainment and media corporations had been swallowing what few smaller companies remained, and financial firms had been infiltrating the business, moving to reduce risk and maximize efficiency at all costs, exhausting writers in evermore unstable conditions.

“The industry is in a deep and existential crisis,” the head of a midsize studio told me in early August. We were in the lounge of the Soho House in West Hollywood. “It is probably the deepest and most existential crisis it’s ever been in. The writers are losing out. The middle layer of craftsmen are losing out. The top end of the talent are making more money than they ever have, but the nuts-and-bolts people who make the industry go round are losing out dramatically.”

Hollywood had become a winner-takes-all economy. As of 2021, CEOs at the majority of the largest companies and conglomerates in the industry drew salaries between two hundred and three thousand times greater than those of median employees. And while writer-producer royalty such as Shonda Rhimes and Ryan Murphy had in recent years signed deals reportedly worth hundreds of millions of dollars, and a slightly larger group of A-list writers, such as Smith, had carved out comfortable or middle-class lives, many more were working in bare-bones, short-term writers’ rooms, often between stints in the service industry, without much hope for more steady work. As of early 2023, among those lucky enough to be employed, the median TV writer-producer was making 23 percent less a week, in real dollars, than their peers a decade before. Total earnings for feature-film writers had dropped nearly 20 percent between 2019 and 2021.

Writers had been squeezed by the studios many times in the past, but never this far. And when the WGA went on strike last spring, they were historically unified: more guild members than ever before turned out for the vote to authorize, and 97.9 percent voted in favor. After five months, the writers were said to have won: they gained a new residuals model for streaming, new minimum lengths of employment for TV, and more guaranteed paid work on feature-film screenplays, among other protections.

But the business of Hollywood had undergone a foundational change. The new effective bosses of the industry—colossal conglomerates, asset-management companies, and private-equity firms—had not been simply pushing workers too hard and grabbing more than their fair share of the profits. They had been stripping value from the production system like copper pipes from a house—threatening the sustainability of the studios themselves. Today’s business side does not have a necessary vested interest in “the business”—in the health of what we think of as Hollywood, a place and system in which creativity is exchanged for capital. The union wins did not begin to address this fundamental problem.

by Daniel Bessner, Harper's | Read more:
Image: Nicolás Ortega

Evolution of a Cartoon


via:

No One Knows What Universities Are For

Bureaucratic bloat has siphoned power away from instructors and researchers.

Last month, the Pomona College economist Gary N. Smith calculated that the number of tenured and tenure-track professors at his school declined from 1990 to 2022, while the number of administrators nearly sextupled in that period. “Happily, there is a simple solution,” Smith wrote in a droll Washington Post column. In the tradition of Jonathan Swift, his modest proposal called for getting rid of all faculty and students at Pomona so that the college could fulfill its destiny as an institution run by and for nonteaching bureaucrats. At the very least, he said, “the elimination of professors and students would greatly improve most colleges’ financial position.”

Administrative growth isn’t unique to Pomona. In 2011, the political scientist Benjamin Ginsberg published The Fall of the Faculty: The Rise of the All-Administrative University and Why It Matters, in which he bemoaned the multi-decade expansion of “administrative blight.” From the early 1990s to 2009, administrative positions at colleges and universities grew 10 times faster than tenured-faculty positions, according to Department of Education data. Although administrative positions grew especially quickly at private universities and colleges, public institutions are not immune to the phenomenon. In the University of California system, the number of managers and senior professionals swelled by 60 percent from 2004 to 2014.

How and why did this happen? Some of this growth reflects benign, and perhaps positive, changes to U.S. higher education. More students are applying to college today, and their needs are more diverse than those of previous classes. Today’s students have more documented mental-health challenges. They take out more student loans. Expanded college-sports participation requires more athletic staff. Increased federal regulations require new departments, such as disability offices and quasi-legal investigation teams for sexual-assault complaints. As the modern college has become more complex and multifarious, there are simply more jobs to do. And the need to raise money to pay for those jobs requires larger advancement and alumni-relations offices—meaning even more administration.

But many of these jobs have a reputation for producing little outside of meeting invites. “I often ask myself, What do these people actually do?,” Ginsberg told me last week. “I think they spend much of their day living in an alternate universe called Meeting World. I think if you took every third person with vice associate or assistant in their title, and they disappeared, nobody would notice.”

In an email to me, Smith, the Pomona economist, said the biggest factor driving the growth of college admin was a phenomenon he called empire building. Administrators are emotionally and financially rewarded if they can hire more people beneath them, and those administrators, in time, will want to increase their own status by hiring more people underneath them. Before long, a human pyramid of bureaucrats has formed to take on jobs of dubious utility. And this can lead to an explosion of new mandates that push the broader institution toward confusion and incoherence.

The world has more pressing issues than overstaffing at America’s colleges. But it’s nonetheless a real problem that could be a factor in rising college costs. After all, higher education is a labor-intensive industry in which worker compensation is driving inflation, and for much of the 21st century, compensation costs grew fastest among noninstructional professional positions. Some of these job cuts could result in lower graduation rates or reduced quality of life on campus. Many others might go unnoticed by students and faculty. In the 2018 book Bullshit Jobs: A Theory, David Graeber drew on his experience as a college professor to excoriate college admin jobs that were “so completely pointless, unnecessary, or pernicious that even the employee cannot justify its existence even though, as part of the conditions of employment, the employee feels obliged to pretend that this is not the case.”

Another reason to care about the growth of university bureaucracy is that it siphons power away from instructors and researchers at institutions that are—theoretically—dedicated to instruction and research. In the past few decades, many schools have hired more part-time faculty, including adjunct professors, to keep up with teaching demands, while their full-time-staff hires have disproportionately been for administration positions. As universities shift their resources toward admin, they don’t just create resentment among faculty; they may constrict the faculty’s academic freedom.

“Take something like diversity, equity, and inclusion,” Ginsberg said. “Many colleges who adopt DEI principles have left-liberal faculty who, of course, are in favor of the principles of DEI, in theory,” he said. But the logic of a bureaucracy is to take any mission and grow its power indefinitely, whether or not such growth serves the underlying institution. “Before long, many schools create provosts for diversity, and for equity, and for inclusion. These provosts hold lots of meetings. They create a set of principles. They tell faculty to update their syllabi to be consistent with new principles devised in those meetings. And so, before long, you’ve built an administrative body that is directly intruding on the core function of teaching.”

Bureaucratic growth has a shadow self: mandate inflation. More college bureaucrats lead to new mandates for the organization, such as developing new technology in tech-transfer offices, advancing diversity in humanities classes through DEI offices, and ensuring inclusive living standards through student-affairs offices. As these missions become more important to the organization, they require more hires. Over time, new hires may request more responsibility and create new subgroups, which create even more mandates. Before long, a once-focused organization becomes anything but.

In sociology, this sort of muddle has a name. It is goal ambiguity—a state of confusion, or conflicting expectations, for what an organization should do or be. The modern university now has so many different jobs to do that it can be hard to tell what its priorities are, Gabriel Rossman, a sociologist at UCLA, told me. “For example, what is UCLA’s mission?” he said. “Research? Undergraduate teaching? Graduate teaching? Health care? Patents? Development? For a slightly simpler question, what about individual faculty? When I get back to my office, what should I spend my time on: my next article, editing my lecture notes, doing a peer review, doing service, or advancing diversity? Who knows.”

Goal ambiguity might be a natural by-product of modern institutions trying to be everything to everyone. But eventually, they’ll pay the price. Any institution that finds itself promoting a thousand priorities at once may find it difficult to promote any one of them effectively. In a crisis, goal ambiguity may look like fecklessness or hypocrisy.

by Derek Thompson, The Atlantic |  Read more:
Image: The Atlantic. Sources: Shutterstock; Getty



The War On Recovery

The opioid overdose epidemic has burned through the U.S. for nearly 30 years. Yet for all that time, the country has had tools that are highly effective at preventing overdose deaths: methadone and buprenorphine.

These medicines are cheap and easy to distribute. People who take them use illicit drugs at far lower rates, and are at far lower risk of overdose or death. By beating back the cravings and agonizing withdrawal symptoms that result from trying to quit opioids “cold turkey,” methadone and buprenorphine can help people addicted to opioids escape an existence defined by drugs and achieve stable, healthy lives.

But a yearlong investigation by STAT shows that virtually every sector of American society is obstructing the use of medications that could prevent tens of thousands of deaths each year. Increasingly, public health experts and even government officials cast the country’s singular failure to prevent overdose deaths not as an unavoidable tragedy but as a conscious choice. (...)

Though overdose death rates have climbed steadily for the past two decades, researchers estimate that barely one-fifth of the approximately 2.5 million Americans with opioid use disorder receive medication — and tens of thousands have died for lack of it.

“More than 80,000 people are dying of opioid overdose every year, and yet we have a tool, medication-assisted treatment, that we know dramatically reduces overdose deaths,” said David Frank, a medical sociologist at New York University who takes methadone for opioid addiction. “But because it’s so difficult to access, people that could and should be alive continue to die.”

STAT’s examination of the overdose epidemic is based on hundreds of interviews with patients, doctors, policy experts, lawmakers, scientists, and other major figures in drug policy and addiction medicine. It relies on an exhaustive review of legal documents, tax filings, financial disclosures, patient records, lobbying reports, and peer-reviewed academic research. And it includes a first-of-its-kind analysis of the ownership and practices of America’s roughly 2,000 methadone clinics, detailing for the first time how private equity firms have acquired a major stake in the nation’s addiction-treatment infrastructure while opposing calls for reform. (...)

In an interview, Nora Volkow, the director of the National Institute on Drug Abuse, estimated that if methadone and buprenorphine were made universally available nationwide, opioid overdoses would fall by half, if not more.

“We have these very effective medications, and the question is why are they not being implemented,” she said. “I estimate that we would have at least 50% less people dying, and that’s conservative. I think it would probably be much more consequential.” (...)

Despite the medications’ remarkable effectiveness, the country’s view of buprenorphine and methadone is built largely on myths and stigma. In 2017, Tom Price, then health secretary to President Trump, referred to what is called medication-assisted treatment as “just substituting one opioid for another.” Law enforcement agencies like the Drug Enforcement Administration, while widely criticized for allowing the proliferation of OxyContin and other painkillers that fueled the opioid epidemic in the 1990s and 2000s, now forcefully regulate buprenorphine and methadone, even as illicit fentanyl floods the market.

“They are not changing one drug for another,” said Volkow, who has led the federal government’s $1.6 billion addiction research institute since 2003. “They’re not different from other medications you may need to take, like antihypertensive medications or antidiabetic medications. They allow for your physiology to be normalized, which is necessary to achieve recovery.” (...)

Yet instead of providing people with pharmaceuticals known to treat their condition, common approaches to treating opioid addiction in the United States still include painful and ineffective “detox”; 12-step approaches like Narcotics Anonymous; or even “equine therapy,” a form of treatment that centers on spending time with horses.

While such programs often rely heavily on hope, mindfulness, and religion, they often ignore the physiological realities of addiction — in particular, the debilitating withdrawal that occurs when regular opioid users attempt to suddenly stop. In any other medical field, favoring prayer over proven medication would be considered malpractice. Yet for addiction treatment in the U.S., it’s simply the way things work.

“There is a core belief, that’s different from other countries, that people with opioid addiction don’t deserve care in the way that somebody who has cancer or diabetes does,” said Ayana Jordan, a researcher and addiction psychiatrist at NYU Langone Health. “People genuinely have no idea how effective these medications are at preventing people from dying.”
 
‘That’s how nuts this is’

The U.S. laws and practices governing addiction medicine are not just out of step with the latest science — they are also out of step with laws in most of the Western world.

At Arud, a substance use clinic in Zurich, Switzerland, patients receiving addiction medications are free to come and go as they please. They pick up weeks’ worth of methadone, and other powerful addiction drugs, at a pharmacy, and are not forced to undergo drug testing or regular counseling sessions as a condition of receiving their medication. While American law enforcement officials and methadone industry representatives have warned that easier access could increase methadone misuse and even overdose, Switzerland’s results have been the opposite. There, and throughout Western and Central Europe, countries that have increased addiction medications’ availability have consistently seen overdose deaths and infectious disease transmission plummet to rates vastly lower than in the United States.

“We have a precedent in France,” said Volkow, the NIDA director. “What the French did was basically provide buprenorphine to every single person that needed it. And you see this dramatic reduction in overdoses — they basically stopped.”

For decades, American physicians needed to obtain a special license known as the “X-waiver” just to prescribe buprenorphine. As of 2021, just 75,000 of the nation’s roughly 1.1 million physicians had obtained the waiver. The Biden administration effectively eliminated that requirement in early 2021, but according to data from the Centers for Disease Control and Prevention, the overall buprenorphine prescribing rate nonetheless decreased from 2021 to 2022.

Methadone, which is widely accessible across Europe, is available in the U.S. only at specialized clinics known as opioid treatment programs, or OTPs. These clinics typically require patients to report in person each day to receive a single dose, forcing them to structure their lives around the clinic’s dosing schedule.

“This is practically the only medication in the entire country that is treated this way,” said Rep. Don Norcross (D-N.J.), who has co-authored legislation that would allow specialized addiction doctors to prescribe methadone directly to patients. “The medication for abortion — that is easier accessed than methadone. That’s how nuts this is. The idea that the only way to do this is to go to the methadone clinic is just insane.”

‘The system creates barriers to care’

Paradoxically, it is often those who claim to be most sympathetic to the cause of addiction treatment who are among the biggest opponents of expanded access to methadone and buprenorphine.

The recovery group Narcotics Anonymous — perhaps the country’s largest provider of addiction treatment — has taken a hard line against addiction medication. The organization’s own literature acknowledges that people taking methadone or buprenorphine are often banned from speaking at meetings, but offers a concession: “NA may be compatible for addicts on medically assisted protocols if they have a desire to become clean one day.”

In other words: In the view of Narcotics Anonymous, even people who have relied on methadone or buprenorphine to achieve stable recovery are not considered “clean.” Instead, their full participation in the program would require a pledge to stop taking medications they were prescribed by a doctor, and that first helped them quit illicit drugs.

Narcotics Anonymous did not respond to STAT’s requests for comment.

Methadone clinics have also opposed calls to expand access to medication treatment. The American Association for the Treatment of Opioid Dependence, a trade group representing methadone clinics, has lobbied not just against the deregulation of methadone treatment, but also against a bill, passed in 2022 with overwhelming bipartisan support, that made it easier for doctors to prescribe buprenorphine. And in recent decades, methadone treatment has become big business: A majority of methadone clinics now operate as for-profits, and nearly one-third are owned by private equity firms. As calls for reform have grown far louder in recent years, the methadone industry has guarded its monopoly fiercely, and remains staunchly opposed to allowing other doctors to prescribe the medication to patients in need.

Separately, according to federal survey data, at least 751 substance use treatment facilities offer treatment for opioid addiction but reject clients using methadone and buprenorphine. More than 2,000 addiction treatment facilities did not respond to the federal survey, meaning the true number of facilities banning medication is probably significantly higher.

Many medical schools still don’t require any training in addiction medicine or in prescribing addiction medications. Many hospitals still do not offer patients buprenorphine or methadone, even in the immediate aftermath of an overdose. Many pharmacies choose not to stock buprenorphine. And insurers, in an effort to pad profit margins, sometimes refuse to pay for newly developed injectable buprenorphine formulations, which last weeks or months and are shown to help patients remain in treatment but cost far more than the versions that must be taken daily.

The American criminal justice system also remains skeptical of medication as treatment. The Drug Enforcement Administration has long displayed hostility to buprenorphine and methadone, and many jails and prisons refuse altogether to provide incarcerated people with either medication. Many judges with no medical training — even in “drug court” systems supposedly meant to aid addiction recovery — have historically barred people arrested for low-grade drug offenses from taking any opioid, including addiction medications.

As workers, people taking addiction medications face immense discrimination. Many employers, labor unions, and professional societies ban their members from taking addiction medications in any circumstance.

“There are a lot of ways that the system creates barriers to care,” said Weinstein, the Boston addiction doctor. “We start to believe that if the system is created that way, it must be necessary, there must be a good reason. But that may not be true: The reason may be outdated, or never existed, or was based on stigma.”

by Lev Facher, STAT |  Read more:
Image: Joe Raedle/Getty Images
[ed. Simpler solution: just let people have drugs. They're going to use them anyway (as they have for centuries). Wall Street runs on drugs. Most overdoses past, present, and future are because people are ingesting uncontrolled products. If drugs are more available (say through dispensaries where amounts and purchases are recorded), some people will have problems, but most likely won't because of strong societal disincentives (family, employment, friends, education, etc.). Control the purity of the products, monitor the problems, restrict access where needed. Then let laws control for bad behavior - another disincentive - as we do with alcohol, marijuana, guns, etc. Probably the cartels' worst nightmare.]

Tuesday, May 14, 2024

Questions and Answers on the PGA Tour Board Drama

Go ahead and slap a zero on the “It’s been X days since golf had unnecessary drama” board. Rory McIlroy’s intention to rejoin the PGA Tour Policy Board has been thwarted, as McIlroy remarked Wednesday at the Wells Fargo Championship that a “subset” of players blocked his return. You have questions and we have … well, we don’t have answers, because the incontrovertible takeaway from the past two years in professional golf is that no one knows anything, at least for certain. But we do have some idea of the behind-the-scenes intrigue regarding power control and the trajectory of the tour’s future, so we’ll do our best to explain what the heck is going on.

OK, what is going on?

In late April, the Guardian reported McIlroy was returning to the PGA Tour Policy Board, taking the place of Webb Simpson, who wanted to surrender his position and give it to McIlroy. McIlroy was previously on the board but resigned his post last November. McIlroy confirmed the report the following day at the Zurich Classic. “I don’t think there’s been much progress made in the last eight months, and I was hopeful that there would be,” McIlroy said. “I think I could be helpful to the process. But only if people want me involved.”

Only if people want me involved proved to be the operative words. For the past two weeks, there have been rumblings that the planned move had an icy reception from other board members, which McIlroy confirmed Wednesday morning at Quail Hollow.

“There’s been a lot of conversations. Sort of reminded me partly why I didn’t [originally want to stay on the board],” McIlroy said, referencing his departure. “I think it got pretty complicated and pretty messy, and I think with the way it happened, I think it opened up some old wounds and scar tissue from things that have happened before … there was a subset of people on the board that were maybe uncomfortable with me coming back on for some reason. … I think Webb just stays on and sees out his term, and I think he’s gotten to a place where he’s comfortable with doing that and I just sort of keep doing what I’m doing.”

So, for the moment, McIlroy remains out and Simpson is still in.

Why did Rory McIlroy resign in the first place?

McIlroy served as the de facto face of the PGA Tour in its battle with LIV Golf, standing up for his tour in the absence of leadership and doing so because he believed it was the right thing to do. Along with Tiger Woods, he spearheaded a player-led initiative in the summer of 2022 that restructured the tour schedule and seemed to prevent further player exodus. McIlroy admitted that putting himself out there in the game’s civil war took an emotional and physical toll, one he said he was still reckoning with.

So when the tour announced a surprising partnership last June with LIV Golf’s backer, Saudi Arabia’s Public Investment Fund—a negotiation McIlroy was not a part of—he conceded he felt a sense of betrayal. “It's hard for me to not sit up here and feel somewhat like a sacrificial lamb and feeling like I've put myself out there and this is what happens,” McIlroy said at the RBC Canadian Open a day after the framework agreement announcement.

Nevertheless, McIlroy continued in his position on the tour’s policy board throughout the summer. When he ultimately resigned in late November 2023, he cited personal and professional commitments, but also nodded to the direction a potentially unified professional game could be going: a private-equity deal was on the horizon (it came to fruition two months later, with the tour choosing the Fenway-led Strategic Sports Group), and he felt his job was done. The remaining board directors elected Jordan Spieth to take McIlroy’s place.

What spurred McIlroy’s reversal?

According to NBC Sports analyst Brad Faxon (who also serves as McIlroy’s putting coach from time to time), McIlroy regretted his decision “almost immediately.” Also, as McIlroy alluded to in his Zurich comments, talks between the PGA Tour and PIF had stalled. While McIlroy has not moved from his anti-LIV Golf stance, the Ulsterman has conceded that golf’s schism is unsustainable and PIF’s participation in professional golf is inevitable. For McIlroy, the most palatable avenue forward is one where PIF’s investment is diverted to the tour, which would likely welcome the reunification of a fractured sport. McIlroy presumably believed his return to the board would help bridge the current gap.

Why does Webb Simpson want off?

Depends on whom you ask. Simpson, for his part, wants to spend more time focusing on golf and his family. It should also be noted that he is considered one of the more respected and well-liked players on tour, but has come under criticism for his use of four sponsor exemptions into limited-field signature events. There is also a belief among players, fair or not, that PGA Tour and SSG leadership wants McIlroy on the board to help advocate for unification. Given his credentials, McIlroy is seen as a stronger proponent than Simpson for this task. (...)

Who is the “subset” that doesn’t want McIlroy to return?

(Sigh) OK, let’s get into it.

McIlroy didn’t name names, but sources tell Golf Digest that Patrick Cantlay, Jordan Spieth and Tiger Woods were not particularly eager to welcome McIlroy back. The icy relationship between Cantlay and McIlroy is well-documented; McIlroy himself conceded as much last fall in an interview with the Irish Independent: “My relationship with Cantlay is average at best. We don’t have a ton in common and see the world quite differently.”

Later in the interview, when speaking to the infamous spat between McIlroy and Cantlay’s caddie, Joe LaCava, at the Ryder Cup, McIlroy said, “LaCava used to be a nice guy when he was caddieing for Tiger, and now he’s caddieing for that d--k he’s turned into a … I still wasn’t in a great headspace.”

The McIlroy-Woods relationship, sources tell Golf Digest, has also soured over the past six months. It remains cordial, yet their different views on the future of professional golf have led to a falling out of sorts. As for Spieth and McIlroy, McIlroy removed himself from a player text chain following Spieth’s comments at Pebble Beach (where Spieth said the tour doesn’t need PIF after the deal with SSG), leading to an hour-long chat between the two. “My thing was if I’m the original [potential] investor that thought that they were going to get this deal done back in July, and I’m hearing a board member say that we don’t really need them now, how are they going to think about that, what are they gonna feel about that?” McIlroy said. “They are still sitting out there with hundreds of billions of dollars, if not trillions, that they’re gonna pour it into sport. And I know what Jordan was saying. … But if I were PIF and I was hearing that coming from here, the day after doing this SSG deal, it wouldn’t have made me too happy, I guess?”

The tension remained at the Players Championship, where Spieth and McIlroy were paired for the first two rounds, and the opening round featured combative moments over two of McIlroy’s drops.

What are the Woods/Cantlay/Spieth arguments against McIlroy?

As a preface, let us again state that if anyone deserves grace, it’s McIlroy. What he did over two years for the tour and the game brought an invisible pain and weight that can't be measured, and he was sold out by the very thing he was trying to defend. However, there is the idea that McIlroy did resign an elected spot, and players—and not just those on the board—don’t think he gets to simply walk back on.

“He was very clear that it was too much for him. He had business dealings, he has a kid, he wants to focus on his game. Trust me, I get it. But once you quit, you’re not getting back,” Kevin Streelman, a former member of the policy board who ran against McIlroy for Player Advisory Council chairman, told Golfweek. “I wouldn’t quit on something that you were elected to by your peers. To want back in is peculiar.”

There is also the perception of a financial entanglement. McIlroy is part of the Fenway-backed TGL Boston team and Fenway is a major player in the Strategic Sports Group. That McIlroy’s team and tour leadership have attempted to sell the board on McIlroy’s return, sources tell Golf Digest, has further raised suspicions.

Additionally, while players are fine with McIlroy changing his stance on PIF involvement, sources tell Golf Digest, some of them don't like that McIlroy is frustrated that others haven't changed their opinions along with him.

And let's be honest, there's also some ego involved, to say nothing of a power play. It's like "Succession," only if everyone was Kendall Roy.

How ridiculous is all of this?

Well, as a history minor in college, we were warmed to see McIlroy invoke the Good Friday Agreement when asked his thoughts about how all of this gets resolved. For context: “I sort of liken it to like when Northern Ireland went through the peace process in the '90s and the Good Friday Agreement, neither side was happy,” McIlroy said Wednesday. “Catholics weren't happy, Protestants weren't happy, but it brought peace and then you just sort of learn to live with whatever has been negotiated, right? That was in 1998 or whatever it was and 20, 25, 30 years ahead, my generation doesn't know any different. It's just this is what it's always been like and we've never known anything but peace.” McIlroy has a valid point, but also … we’re invoking a peace treaty that ended an actual war lasting three decades in the context of, essentially, two battling golf businesses.

by Joel Beall, Golf Digest |  Read more:
Image: David Cannon
[ed. This is a multi-billion-dollar industry with professionals practiced in negotiating big deals. Golfers on the tour are not. I'd love to see Rory back because he understands this and just wants what's best for the game, whoever is in charge. See also: Jimmy Dunne resigns (GD). Also, good luck to Rory through a rough time personally.]

"I have heard one truly smart insight into this insanity. It came from Fred Perpall, the president of the USGA. He’s new to golf but not to reading people. I asked him, What should the Tour say to anyone who wants to jump ship and go LIV? “Let them go,” Fred said. You can’t hold on to people who don’t value what they already have.

There’s so much in that I need a long lunch break to unpack it.

Ben Crenshaw had a certain fondness for a lifer caddie named Adolphus Hull, aka Golf Ball. I got this from Hull, on his deathbed. Ball was remembering a world that’s gone but sharing an insight that lives.

Ben: “Ball, what do you do if you love a girl but she don’t love you?”

Ball: “If she don’t love you, you gotta let her go.”

Ben: “I believe you’re right.”

Ball: “I know I’m right.”

Ball knew what he knew and, were he alive today, would know that Fred Perpall has it right."

Why pro golf’s power struggle hints at broader societal shift (Golf)