Joakim Allgulander, Surveillance II
via:
Tuesday, July 25, 2023
The Trillion-Dollar Grift: Inside the Greatest Scam of All Time
The pandemic relief was the biggest bailout in history, and it opened the door to wide-scale fraud the likes of which no one had ever seen — more than three years later, we still don't know how much damage was done
In late March 2020, Haywood Talcove, a CEO at LexisNexis Risk Solutions, was packing up his office, having sent his employees home. He was worrying about laying off his staff, his family’s health, and how he was going to manage two young kids at home during the pandemic.
But when President Trump announced an initial $2.2 trillion relief package to bail out the millions of Americans desperate for cash during the national lockdown, his concern turned away from the coronavirus. An expert in cybersecurity, Talcove has worked in both the private and public sectors, and has been raising the alarm about the government’s exposure to scams for many years. And now, it was like all of his prior analysis and warnings about fraud had just become real.
“I said, ‘Oh, my God, they’re going to allow anyone to get unemployment-insurance benefits,’” he recalls. “The systems are vulnerable. All you needed was a name, a date of birth, an address, and a Social Security number.”
Talcove’s a proud Boston guy who moved to Washington, D.C., in 1990, and went on to help an anti-government-waste-style Republican become governor of New Hampshire. He knew the relief plan would be irresistible to scam artists and especially tempting to organized transnational criminal groups. “As soon as the CARES money was announced, we started seeing squawking on the dark web, criminal groups in China, Nigeria, Romania, and Russia — they see our systems are open,” Talcove says. He estimates that “the United States government is the single largest funder of cybersecurity fraud in the world.”
Talcove understood that he had to act. So he called the White House, trying to warn of the threat. No response. Finally, after weeks of trying to get through, one night while he was playing with his kids, he got a call from an unknown number. It was Larry Kudlow, Trump’s director of the National Economic Council. “I’m like, ‘Mr. Kudlow, I really need to warn you that you have to do something about identity verification,’” Talcove recalls, “‘or it’s going to be the biggest fraud in the history of our country.’” (Kudlow didn’t respond to requests for comment.)
He says he talked to Kudlow for about 15 minutes but couldn’t get him to budge. “Kudlow’s like, ‘The money has to get out quickly. You can’t have speed and security,’” Talcove says. “But I’m like, ‘That is bullshit. Sir, that’s just not true. Now you’re never going to get the money back.’”
Eventually, he says Kudlow told him to get in touch with the folks in charge of sending out small-business loans and the Pandemic Unemployment Assistance loans. But those guys told him they weren’t seeing any fraud. “I’m like, ‘Dude, you haven’t even given out any money yet! That’s why you’re not seeing it,’” Talcove says. “I’m sending them screenshots of the dark web. I’m explaining exactly how it’s going to go down. And I tell them you are going to have a $200 billion problem on your hands if you do nothing.”
It's history now: On March 27, 2020, during the height of the pandemic, Trump signed the CARES Act, pumping more than $2 trillion into the U.S. economy. The scale of the crisis was beyond anything we had ever seen, and so was the help. The money flowed like an open spigot and saved the livelihoods of millions of people. But Talcove was right. While many breathed sighs of relief, others saw the crisis as an opportunity — a chance to steal millions.
The list of various CARES Act schemes is endless and astounding: the couple who scammed some $20 million off unemployment insurance while living as high rollers in Los Angeles; the Chicago man under indictment for selling bunk Covid tests and allegedly raking in $83 million (he has declared his innocence); the Florida minister who the feds allege faked the signature of his aging accountant, suffering from dementia, to steal $8 million in PPP loans (in a twist, the pastor has been locked in a legal battle to determine whether he’s psychologically fit to stand trial). One particularly loathsome and effective plot: offering fake meals to underprivileged children in Minnesota to reel in a whopping sum of $250 million. Noted serial liar George Santos allegedly got in on the act: He was charged with receiving unemployment benefits while he had a six-figure job in Florida. (Santos has pleaded not guilty.) Other examples are admittedly funny: A guy named John Doe got unemployment money, as did someone named Mr. Poopy Pants, and so did a person going by the name of Diane Feinstein, presumably not the senator from California.
By the government’s own accounting, we potentially dished out some $16.2 billion to folks with “suspicious” emails; $267 million went to identities matching those of current federal prisoners, some on death row; another nearly $29 billion to people living in multiple states; we even sent out more than $139 million to dead people. California alone accounts for a whopping $20 billion in pandemic unemployment-insurance fraud.
Factoring in President Biden’s and Trump’s relief efforts, the U.S. released more than $5 trillion into the economy — the biggest bailout in history. Department of Justice Inspector General Michael Horowitz told Congress that more than $100 billion in Covid aid money may have been misappropriated, but many experts and members of law enforcement think the number is much higher. The AP estimates $280 billion went to fraudsters and another $123 billion was misappropriated, some 10 percent of the relief money. For his part, Talcove estimates the actual losses blow past the tallies being thrown around. “The real number is much higher. I think the government lost a trillion dollars due to fraud in the pandemic,” he says. “One trillion.”
Talcove’s number is shocking and a higher estimate than many others. There could be instances of suspected fraud that were innocent mistakes. But when I asked Biden administration officials what the number might be, they admit that they “don’t know — the actual determination comes much later.” The DOJ’s Horowitz told the AP much the same thing, saying the official accounting is “at least a couple of years away.” Whatever the actual number is, the losses will be staggering.
It must be said that this should not be a red-versus-blue blame game; given the unprecedented nature of the crisis, some feel like the Trump administration acted correctly in releasing the money as fast as possible. “My liberal friends get mad at me for saying this, but the Trump administration handled pandemic unemployment as well as can be expected,” says Michele Evermore, former deputy director for policy in the Office of Unemployment Insurance Modernization. “The problem was so dire and so vast, and people needed help immediately. There was really no choice.”
Nonetheless, debates will continue for years to come over how the money was doled out — speed versus security. The problem was so large that millions of people were at risk; the necessity couldn’t have been clearer. But the cost? One trillion dollars possibly lost to crooks, many of them our fellow citizens gouging the government during a crisis. Thousands of potential victims, not to mention all of the folks who desperately needed the money and couldn’t get it. Thousands of criminals rewarded, many totally unpunished. Now, partisan gridlock and finger-pointing in Congress over the pandemic-era bailouts may halt any legislation and leave the outcome of reform efforts less than certain. And after months of reporting on the problem, speaking with the Secret Service agents hunting the fraudsters and with victims saddled with debts they never incurred, and spending an afternoon with a scam artist, one can’t help but have serious concerns that the next time a crisis comes around, we sure as hell might get fooled again.
One law enforcement agency takes the lead when it comes to protecting the U.S. government from fraudsters: the Secret Service. Jason Kane, a Secret Service veteran of more than 20 years, was stationed in New York during those early-pandemic months and knew immediately that a wave of work was heading his agency’s way. “Any time there’s a crisis — Hurricane Katrina, the BP oil spill — people take advantage. It’s human nature,” Kane says. “Today we have initiated more than 3,000 criminal cases from the pandemic. We are still trying to seize back those assets but most of the money is likely gone.”
by Sean Woods, Rolling Stone | Read more:
Image: Haywood Talcove/uncredited
Pros and Cons of Fentanyl Addiction Treatment Methods
There was a time, not long ago, when doctors had national protocols for using a leading opioid treatment drug.
Heroin was the opioid king then, and it was fairly smooth to transition patients onto the medication buprenorphine, usually given as Suboxone, which reduces cravings for illicit opioids. Far trickier has been starting the medication with patients using fentanyl, a drug up to 50 times as powerful as heroin that’s taken over the opioid market in the last several years.
Fentanyl’s unique properties can send those starting buprenorphine into an agonizing state of sudden, or “precipitous,” withdrawal, leaving providers scrambling to devise new approaches. But no universally adopted treatment protocols have taken hold so far.
“It’s just kind of word-of-mouth and what’s happening in your local community,” said Dr. Nate Kittle, who oversees addiction care across HealthPoint, a nonprofit running primary and urgent care clinics throughout King County. “We’re still learning the best ways to do this.”
Doctors are trying a variety of methods:
Microdosing (...)
Macrodosing (...)
Long-acting medications
Over the last year, Duncan has been using Sublocade, an injected medication that continuously releases buprenorphine for a month.
“It has been by far the most effective treatment I’ve seen for people who are using fentanyl,” he said.
The big advantage, he and others say, is that Sublocade, which forms a depot of medication under one’s skin, can only be surgically removed. So people are more or less committed for a month.
Another bonus is people don’t have to fixate on taking their next dose of medication, which can feel eerily similar to feeding an addiction, said Gather care navigator Brooke Reder. Sublocade frees their mind of old habits, she said.
The manufacturer’s instructions call for people to be on the short-acting form of buprenorphine for seven days before starting Sublocade, in part to make sure they can tolerate the medication.
“Everyone’s kind of quickly realized that you don’t really need to be on it for seven days,” Duncan said. Maybe just one day or less.
Duncan, who’s tracking patient experiences to share with the medical community, might even skip giving an advance dose of buprenorphine if someone has previously taken it without experiencing an allergic reaction.
Likewise, doctors are experimenting with how long to ask patients to abstain from fentanyl before starting on Sublocade. It’s tricky because buprenorphine activates the brain’s opioid receptors at only a 60% level. If fentanyl remains in the system, activating receptors at a higher level, the medication could make them feel worse, not better.
Duncan may ask patients to abstain for about 24 hours, “but if that’s too much, I’ll say, ‘OK. What can you tolerate?’ If it’s six hours, ‘OK, then, start there.'”
In May, the U.S. Food and Drug Administration approved another long-acting, injected medication, called Brixadi, which has two versions, one releasing buprenorphine for a week, the other a month.
Adding ketamine to the mix
Some physicians are trying to ease the transition to buprenorphine by using tiny doses of ketamine, an anesthetic sometimes used recreationally because of its hallucinogenic effects.
Dr. Lucinda Grande, a physician at Lacey-based Pioneer Family Practice, had for years prescribed ketamine for chronic pain and depression when she saw research suggesting the drug holds promise for alleviating intense withdrawal symptoms.
Over the past year, she started using the method with addicted patients, refining her approach. “It definitely helps everybody to some degree,” she said, adding one patient smiled through the process, with no withdrawal symptoms at all. “I’m really ecstatic.”
Accessing and using ketamine adds a complicating factor, though. “It’s too early to know if it will have widespread utility,” Kittle said.
by Nina Shapiro, Seattle Times | Read more:
Image: Daniel Kim/The Seattle Times
[ed. See also: Is WA’s health ‘hub’ model the 'secret sauce’ in treating fentanyl addiction? (Seattle Times).]
Monday, July 24, 2023
The NFL’s Running Back Market Has Bottomed Out
It’s not often that a star athlete says he’s so frustrated with his sport that maybe his entire job shouldn’t exist. But that’s where the NFL found itself when Tennessee Titans running back Derrick Henry, seemingly dismayed by the news that the league’s three franchise-tagged running backs—Saquon Barkley, Josh Jacobs, and Tony Pollard—all failed to get long-term contracts before Monday’s deadline, tweeted his frustrations about running back value.
“At this point, just take the RB position out the game then,” Henry wrote, responding to a tweet from Matt Miller, an ESPN draft analyst who detailed a draft-and-replace process that seems to be underway across the NFL. “The ones that want to be great & work as hard as they can to give their all to an organization, just seems like it don’t even matter. I’m with every RB that’s fighting to get what they deserve.”
Colts running back Jonathan Taylor, who led the NFL in rushing in 2021 and is now entering the last year of his rookie contract, also chimed in. (...)
And so did the Chargers’ Austin Ekeler, who was so unhappy with his contract that he reportedly requested a trade earlier this year. “They act,” Ekeler tweeted, “like we are discardable widgets.” (...)
The running back market has been crashing for years. When it bottomed out on Monday, you could sense players finally beginning to grieve. While it’s noble for Henry to say he will fight for what running backs “deserve,” it’s also sad to think about how unwinnable that fight will be. Whether these individual players are deserving has nothing to do with it.
This has been a brutal year for the position. This offseason, Dallas cut star running back Ezekiel Elliott (age: 27). The Vikings did the same with Dalvin Cook (27), Tampa Bay released Leonard Fournette (28), and the Browns declined to re-sign Kareem Hunt (27). Cincinnati’s Joe Mixon (26) took a significant pay cut to avoid joining them in the free agent pool. Green Bay’s Aaron Jones (28) also took a $5 million pay reduction. The Titans reportedly made Henry (an elderly 29) available for trade in the spring, but had no takers for a player who had more than 1,900 yards from scrimmage last year. And there was no trade market for Ekeler (28), even though he has scored a dozen more touchdowns than any other skill position player in the NFL over the last two seasons. Ekeler stayed in L.A. and settled for some extra contract incentives. (...)
Assuming they sign the franchise tag and show up to training camp next week, Barkley and Jacobs will be paid the same fixed, non-negotiable salary of $10.1 million each for the season. That’s a lot of money to most people. But consider that a wide receiver making $10 million wouldn’t even rank among the top 25 players at the position in 2023. In the NBA, a player making $10 million per year wouldn’t even be among the top 150 in the entire league. (...)
This positional devaluation has been brewing since 2011, when the NFL and the NFL Players Association agreed to essentially delay massive contracts for first-round picks until their fourth, fifth or even sixth year in the league. But by the time running backs have been in the league that long, they are often already declining. When wide receivers, quarterbacks, and defensive ends are turning 26, they’re hitting life-altering paydays as they approach their primes. NFL running backs turning 26 get treated like Leonardo DiCaprio’s girlfriends.
As running backs fell through the cracks of a new financial model, a running back’s job, in and of itself, also became less important than ever. Passing supplanted running as the dominant football strategy because coaches have realized the average pass goes 7 yards and the average run goes a little over 4. Not only has the job of the running back been devalued, but it’s also been split among players. Most teams figure they can put together a functional running game by committee, giving 20 percent of the money to a few guys who can replace 90 percent of the production.
by Danny Heifetz, The Ringer | Read more:
Image: Getty Images/Ringer
How to Spot an AI Cheater
"Labyrinthian mazes". I don't know what exactly struck me about these two words, but they caused me to pause for a moment. As I read on, however, my alarm bells started to ring. I was judging a science-writing competition for 14-16 year-olds, but in this particular essay, there was a sophistication in the language that seemed unlikely from a teenager.
I ran the essay through AI detection software. Within seconds, Copyleaks displayed the result on my screen and it was deeply disappointing: 95.9% of the text was likely AI-generated. I needed to be sure, so I ran it through another tool: Sapling, which identified 96.1% non-human text. ZeroGPT confirmed the first two, but was slightly lower in its scoring: 89% AI. So then I ran it through yet another tool, Winston AI. It left no doubt: 1% human. Four separate AI detection tools all had one clear message: this is an AI cheater. (...)
So, how might we spot the AI cheaters? Could there be cues and tells? Fortunately, new tools are emerging. However, as I would soon discover, the problem of AI fakery spans beyond the world of education – and technology alone won't be enough to respond to this change.
In the case of student cheating, the reassuring news is that teachers and educators already have existing tools and strategies that could help them check essays. For example, Turnitin, a plagiarism prevention software company that is used by educational institutions, released AI writing detection in April. Its CEO Chris Caren told me that the software's false positive rate (when it wrongly identifies human-written text as AI) stands at 1%.
There are also web tools like the ones I used to check my student essay – Copyleaks, Sapling, ZeroGPT and Winston AI. Most are free to use: you simply paste in text on their websites for a result. OpenAI, the creator of ChatGPT, released its own "AI classifier" in January.
How can AI detect another AI? The short answer is pattern recognition. The longer answer is that checkers use unique identifiers that differentiate human writing from computer-generated text. "Perplexity" and "Burstiness" are perhaps the two key metrics in AI text-sleuthing.
by Alex O'Brien, BBC | Read more:
Image: Getty
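[ed. For readers curious what "perplexity" and "burstiness" look like in practice, below is a minimal, hypothetical Python sketch. It is only an illustration, not how Copyleaks, Sapling, ZeroGPT or Winston AI actually work: it assumes the open-source Hugging Face transformers and PyTorch libraries are installed, uses the small GPT-2 model as a stand-in scorer, and treats burstiness simply as the spread of per-sentence perplexity, which is just one of several definitions in circulation.]

import math
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# Small, freely available language model used here purely as a stand-in scorer.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    # How "surprised" the model is by the text; lower means more predictable,
    # and machine-generated prose tends to score low.
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        loss = model(ids, labels=ids).loss  # mean negative log-likelihood per token
    return math.exp(loss.item())

def burstiness(sentences: list[str]) -> float:
    # One simple (assumed) definition: the standard deviation of per-sentence
    # perplexity. Human writing tends to swing between plain and surprising
    # sentences; uniformly "safe" text is one tell of machine generation.
    scores = [perplexity(s) for s in sentences]
    mean = sum(scores) / len(scores)
    return (sum((s - mean) ** 2 for s in scores) / len(scores)) ** 0.5

# Hypothetical example essay, split into sentences by hand for clarity.
essay = [
    "The cell is the smallest unit of life.",
    "Its labyrinthian mazes of membranes shuttle proteins with uncanny precision.",
]
print("perplexity:", round(perplexity(" ".join(essay)), 1))
print("burstiness:", round(burstiness(essay), 1))

[ed. In this toy setup, low perplexity combined with low burstiness would nudge a verdict toward "AI-generated"; the commercial detectors named above presumably layer proprietary models and many more signals on top.]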
Sunday, July 23, 2023
Ordinary People By The Millions
A conversation on current US politics with Thomas Frank, author of What’s the Matter with Kansas?; Listen Liberal; and The People, No: A Brief History of Anti-Populism.
SEYMOUR HERSH: How did we get to the political fault line that gave us a Donald Trump? When did it all start?
THOMAS FRANK: I sometimes feel like it’s the story of my life, because it all began shortly after I was born in 1965, during the Vietnam era. Within a few years came the beginning of the culture wars and the eclipsing of the old liberal consensus. It’s important to remember two facts about it all: First, that every single battle in the culture wars has been presented to us over the years as a kind of substitute class war, as an uprising of ordinary people with their humble values, against the highbrow elite.
The other fact is that, at the same time the Republicans were perfecting the culture-war formula, the Democrats were announcing that they no longer wanted to be the party of blue-collar workers. They said this more or less openly in the early 1970s. They envisioned a more idealistic, more noble constituency out there in the form of the young people then coming off the college campuses plus the enlightened white-collar elite. In other words, the Democrats were abandoning the old working-class agenda at the same moment that the Nixon Republicans were figuring out how to reach out to those voters.
Put both of those strategies in effect for fifty years with slight evolutionary changes (The New Democrats! The War on Christmas!), drag the nation through various disasters for working people and endless triumphs for the white-collar elite, and you get the politics we have today. (...)
SH: And why is it continuing, despite the constitutional horror of January 6, 2021?
TF: If the question is, why doesn’t the public care more about that dreadful event, I don’t really know the answer. I am amazed that Donald Trump is still standing as a politician after all the injuries he has inflicted on himself and the world. My suspicion is that the public doesn’t care more because they have learned to mistrust the news media and because the media’s constant beating of the January 6 drum sounds a lot like their constant beating of the Russiagate drum before that. It’s the problem of crying wolf, and then what do you do when the actual wolf shows up?
But the larger question—why do the upside-down politics of the last 50 years keep going?—is fairly easy to answer. It keeps going because it works for both sides. The Democrats now inhabit a world where they are moral superstars, people of incredibly exalted goodness. The media is aligned with them like we’ve never seen before, so are the most powerful knowledge industries, so is academia, so is the national security establishment. And so are, increasingly, the affluent and highly educated neighborhoods of this country. The Democrats are now frequently competitive with the Republicans in terms of fundraising, sometimes outraising and outspending the GOP, which is new and intoxicating for them. (...)
SH: Which candidate or president in recent history was most responsible for this turn?
TF: I think Bill Clinton was the pivotal figure of our times. Before he came along, the market-based reforms of Reaganism were controversial; after Clinton, they were accepted consensus wisdom. Clinton was the leader of the group that promised to end the Democrats’ old-style Rooseveltian politics, that hoped to make the Democrats into a party of white-collar winners, and he actually pulled that revolution off. He completed the Reagan agenda in a way the Republicans could not have dreamed of doing—signing trade agreements, deregulating Wall Street, getting the balanced budget, the ’94 crime bill, welfare reform. He almost got Social Security partially privatized, too. A near miss on that one.
He remade our party of the left (such as it is) so that it was no longer really identified with the economic fortunes of working people. Instead it was about highly educated professional-class winners, people whose good fortunes the Clintonized Democratic Party now regarded as a reflection of their merit. Now it was possible for the Democratic Party to reach out to Wall Street, to Silicon Valley, and so on.
Although there were hints of this shift before Clinton, he actually got it done, and his perceived success as president then made it permanent. This was something relatively new for a left party in the industrialized world, and it was quickly adopted by other left parties in other countries, most notably “New Labour” in the UK.
Unfortunately, this strategy has little to offer the people who used to be the Democratic Party’s main constituents except scolding. It merely assumes that they have, as the ’90s saying went, nowhere else to go.
by Seymour Hersh and Thomas Frank, Substack | Read more:
Image: Joe Raedle/Getty Images
Saturday, July 22, 2023
Stephen Curry: The Full Circle
There were too many bears roaming the woods behind the house and, with four daughters, far too many Barbies inside. Just before the school year ended in the early 1970s in Grottoes, Virginia, Wardell "Jack" Curry needed a solution, and fast. All he wanted was a way to keep his only son, Dell, occupied by something other than deadly animals or dolls during the long summer days ahead. As it turned out, though, with nothing more than an old utility pole, a fiberglass backboard and some fabricated steel brackets, Jack Curry ended up changing the sport of basketball and producing the ultimate point guard, his grandson Stephen Curry.
Jack's hoop was never much to look at. Its finest feature, by far, was the old reliable street lamp that hovered overhead and dutifully blinked on at dusk, bathing the key in warm yellow light. But this was Jack's plan all along: Only people who truly loved the game and understood the commitment it required would stick past dark on his country court.
The soft wings of the backboard had more give than a fence gate. The thick steel rim offered no absolution; only shots placed perfectly in the middle of the cylinder passed through. The institutional green metal breaker box just behind the hoop gave off a constant static hum that lured a shooter's focus away from the target. And the splintery wooden utility pole wasn't squared to a single landmark -- not the white ranch-style house, not the driveway, not the Blue Ridge mountains to the south or the creek to the north. So every shot required instant, expert recalibration.
Years of toil in the sun and mud honed Dell's fluid, deadly jumper -- a shot that produced a state title, a scholarship to Virginia Tech and a 16-year NBA career, mostly in Charlotte, that ended in 2002. And when Dell and his wife, Sonya, started their own family, their first child, Wardell Stephen Curry II, got more than just his name from Grandpa Jack. Stephen inherited the hoop and the same deep abiding love for the game it evokes. During frequent childhood trips to Grottoes, a sleepy mix of horse farms and trailer parks an hour northwest of Charlottesville, Stephen and his younger brother Seth (who played at Duke) would barely wait for the car to stop rolling before darting around back to start shooting. Their grandma, Juanita, 79, whom everyone calls Duckie, knew that if she wanted a kiss hello she had to position herself between the car and the hoop. (Jack died when Stephen was 2.) This is where Curry's love of the long ball was born, his trying to be the first one in the family to swish it from 60 feet, blind, peeking around the corner from the top kitchen step. "I always felt like the love and the lessons of that hoop got passed down to me," Stephen says. "It's crazy to think about how everything kinda started right there at this house with this one old hoop."
This season in Golden State, the legend grows larger by the minute. Nearly every night since the All-Star Game -- for which Curry was the top vote-getter and where he sank 13 straight shots to win the 3-point contest -- he's been expanding the lore of Jack's hoop as well as the parameters by which we define point guard greatness. Yes, his stats are MVP-worthy: Through March 24, he ranked seventh in points (23.4 per game), sixth in assists (7.9) and third in steals (2.1). Yes, he has the fourth-highest 3-point percentage, 43.6 percent, in NBA history and has led the league in total 3s since 2012, if you're counting. And yes, in six years, he has catapulted Golden State from perennial nonfactor to title favorite. But Curry's evolution this season is about something more profound than shooting, stats or hardware. The point guard groomed by that historic hoop in Grottoes has become the game's future.
Curry is standing at the forefront of a new era of playmaker. For the first time since Magic Johnson took an evolutionary leap for the position, we're witnessing the ultimate embodiment of the point guard. Not a shooter like Steve Nash, a passer like John Stockton, a defender like Gary Payton or a floor general like Isiah Thomas. Someone with the ability to do it all, excelling in each category while elevating everyone around him and then topping it the very next night: basketball's new 6-foot-3, 190-pound unstoppable force. "He's lethal," says Curry's coach, Steve Kerr. "He's mesmerizing," says his teammate Klay Thompson. He's the "best shooter I've ever seen," says his president, Barack Obama.
Oftentimes he's all three at once. During a 106-98 win over the Clippers on March 8, Curry needed all of seven seconds to transform LA's defense from a group of elite athletes to a gaggle of bewildered senior citizens stammering around at the wrong connecting gate. Up by 10 with just under nine minutes left in the third, Curry dribbled past half court near the high left wing and used a pick to split defenders Matt Barnes and Chris Paul. When he re-emerged, 7-1 power forward Spencer Hawes and center DeAndre Jordan had walled off his escape to the basket. Curry had a split second left before the Clippers converged on him like a junkyard car crusher. He stopped on a dime, dribbled backward through his legs to his left hand, then returned the ball behind his back to his right. The move caused Paul and Jordan to lunge awkwardly into the vortex Curry no longer occupied. Curry then spun away from the basket (and what looked like an impending bear hug from an exasperated Hawes) before dribble-lunging, back, 3 feet behind the arc, as if leaping a mud puddle in Jack Curry's gravel driveway.
In the blink of an eye -- well, less, actually -- Curry planted, coiled, elevated and snapped his wrist. Splash. "That could be the greatest move I've ever seen live," blurted stunned ESPN analyst Jeff Van Gundy, who coached against Michael Jordan many times. When his colleagues giggled at the suggestion, though, Van Gundy growled back without hesitation, "No, I'm being serious."
The sequence had everything: court presence, ballhandling, flawless shooting fundamentals, creativity and, above all, major, major cojones. It left Kerr looking like a young Macaulay Culkin on the bench. And across the country, it had Grandma Duckie cheering from her favorite burgundy chair in front of the TV. "Each time Stephen does his thing, we all picture big Jack up in heaven, nudging all the angels, gathering 'em up," says Steph's aunt and Dell's sister, Jackie Curry. "And he's yelling and pointing, 'Look, look down there at what I did! Y'all know I started this, right? Started all this with just that one little hoop, right there.'"
How AI is Bringing Film Stars Back From the Dead
Most actors dream of building a career that will outlive them. Not many manage it – show business can be a tough place to find success. Those that do, though, can achieve a kind of immortality on the silver screen that allows their names to live on in lights.
One such icon is the American film actor James Dean, who died in 1955 in a car accident after starring in just three films, all of which were highly acclaimed. Yet now, nearly seven decades after he died, Dean has been cast as the star in a new, upcoming movie called Back to Eden.
A digital clone of the actor – created using artificial intelligence technology similar to that used to generate deepfakes – will walk, talk and interact on screen with other actors in the film.
The technology is at the cutting edge of Hollywood computer-generated imagery (CGI). But it also lies at the root of some of the concerns being raised by actors and screenwriters who have walked out on strike in Hollywood for the first time in 43 years. They fear being replaced by AI algorithms – something they argue will sacrifice creativity for the sake of profit. Actor Susan Sarandon is among those who have spoken about her concerns, warning that AI could make her "say and do things I have no choice about". (Read about how the 2013 film The Congress predicted Hollywood's current AI crisis.) (...)
This is the second time Dean’s digital clone has been lined up for a film. In 2019, it was announced he would be resurrected in CGI for a film called Finding Jack, but it was later cancelled. Cloyd confirmed to BBC, however, that Dean will instead star in Back to Eden, a science fiction film in which "an out of this world visit to find truth leads to a journey across America with the legend James Dean".
The digital cloning of Dean also represents a significant shift in what is possible. Not only will his AI avatar be able to play a flat-screen role in Back to Eden and a series of subsequent films, but also to engage with audiences in interactive platforms including augmented reality, virtual reality and gaming. The technology goes far beyond passive digital reconstruction or deepfake technology that overlays one person's face over someone else's body. It raises the prospect of actors – or anyone else for that matter – achieving a kind of immortality that would have been otherwise impossible, with careers that go on long after their lives have ended.
But it also raises some uncomfortable questions. Who owns the rights to someone's face, voice and persona after they die? What control can they have over the direction of their career after death – could an actor who made their name starring in gritty dramas suddenly be made to appear in a goofball comedy or even pornography? What if they could be used for gratuitous brand promotions in adverts? (...)
Digital clones
Dean's image is one of hundreds represented by WRX and its sister licensing company CMG Worldwide – including Amelia Earhart, Bettie Page, Malcolm X and Rosa Parks.
When Dean died 68 years ago, he left behind a robust collection of his likeness in film, photographs and audio – what WRX's Cloyd calls "source material". Cloyd says that to achieve a photorealistic representation of Dean, countless images are scanned, tuned to high resolution and processed by a team of digital experts using advanced technologies. Add in audio, video and AI, and suddenly these materials become the building blocks of a digital clone that looks, sounds, moves and even responds to prompts like Dean. (...)
There are now even companies that allow users to upload a deceased loved one's digital data to create "deadbots" that chat with the living from beyond the grave. The more source material, the more accurate and intelligent the deadbot, meaning the executor of a modern-day celebrity's estate could potentially allow for a convincingly realistic clone of the deceased star to continue working in the film industry – and interacting somewhat autonomously – in perpetuity.
by S.J. Velasquez, BBC | Read more:
Image: Getty Images
Friday, July 21, 2023
$$Kudzu$$: The Kingdom of Private Equity
These Are the Plunderers: How Private Equity Runs—and Wrecks—America by Gretchen Morgenson and Joshua Rosner. Simon & Schuster, 381 pages. 2023.
Our Lives in Their Portfolios: Why Asset Managers Own the World by Brett Christophers. Verso, 310 pages. 2023.
A specter is haunting capitalism: the specter of financialization. Industrial capitalism—the capitalism of “dark Satanic mills”—was bad enough, but it had certain redeeming features: in a word (well, two words), people and place. Factory work may have been grueling and dangerous, but workers sometimes acquired genuine skills, and being under one roof made it easier for them to organize and strike. Factories were often tied, by custom and tradition as well as logistics, to one place, making it harder to simply pack up and move in the face of worker dissatisfaction or government regulation.
To put the contrast at its simplest and starkest: industrial capitalism made money by making things; financial capitalism makes money by fiddling with figures. Sometimes, at least, old-fashioned capitalism produced—along with pollution, workplace injuries, and grinding exploitation—useful things: food, clothing, housing, transportation, books, and other necessities of life. Financial capitalism merely siphons money upward, from the suckers to the sharps.
Marxism predicted that because of competition and technological development, it would eventually prove more and more difficult to make a profit through the relatively straightforward activity of industrial capitalism. It looked for a while—from the mid-1940s to the mid-1970s—as though capitalism had proven Marxism wrong. Under the benign guidance of the Bretton Woods Agreement, which used capital controls and fixed exchange rates to promote international economic stability and discourage rapid capital movements and currency speculation, the United States and Europe enjoyed an almost idyllic prosperity in those three decades. But then American companies began to feel the effects of European and Japanese competition. They didn’t like it, so they pressured the Nixon administration to scrap the accords. Wall Street, which the Bretton Woods rules had kept on a leash, sensed its opportunity and also lobbied hard—and successfully.
The result was a tsunami of speculation over the next few decades, enabled by wave after wave of financial deregulation. The latter was a joint product of fierce lobbying by financial institutions and the ascendancy of laissez faire ideology—also called “neoliberalism”—embraced by Ronald Reagan and Margaret Thatcher and subsequently by Bill Clinton and Tony Blair. The idiocy was bipartisan: Clinton and Obama were as clueless as their Republican counterparts.
Among these “reforms”—each of them a dagger aimed at the heart of a sane and fair economy—were: allowing commercial banks, which handle the public’s money, to take many of the same risks as investment banks, which handle investors’ money; lowering banks’ minimum reserve requirements, freeing them to use more of their funds for speculative purposes; allowing pension funds, insurance companies, and savings-and-loan associations (S&Ls) to make high-risk investments; facilitating corporate takeovers; approving new and risky financial instruments like credit default swaps, collateralized debt obligations, derivatives, and mortgage-based securities; and most important, removing all restrictions on the movement of speculative capital, while using the International Monetary Fund (IMF) to force unwilling countries to comply. Together these changes, as the noted economics journalist Robert Kuttner observed, forced governments “to run their economies less in service of steady growth, decent distribution, and full employment—and more to keep the trust of financial speculators, who tended to prize high interest rates, limited social outlays, low taxes on capital, and balanced budgets.”
Keynes, a principal architect of the Bretton Woods Agreement, warned: “Speculators may do no harm as bubbles on a steady stream of enterprise. But the position is serious when enterprise becomes the bubble on a whirlpool of speculation.” That was indeed the position roughly fifty years after Keynes’s death, and the predictable consequences followed. S&Ls were invited to make more adventurous investments in the 1980s. They did, and within a decade a third of them failed. The cost of the bailout was $160 billion. In the 1990s, a hedge fund named Long-Term Capital Management claimed to have discovered an algorithm that would reduce investment risk to nearly zero. For four years it was wildly successful, attracting $125 billion from investors. In 1998 its luck ran out. Judging that its failure would crash the stock market and bring down dozens of banks, the government organized an emergency rescue. The 2007–2008 crisis was an epic clusterfuck, involving nearly everyone in both the financial and political systems, though special blame should be attached to supreme con man Alan Greenspan, who persuaded everyone in government to repose unlimited confidence in the wisdom of the financial markets. Through it all, the Justice Department was asleep at the wheel. During the wild and woolly ten years before the 2008 crash, bank fraud referrals for criminal prosecution decreased by 95 percent.
The Washington Consensus, embodying the neoliberal dogma of market sovereignty, was forced on the rest of the world through the mechanism of “structural adjustments,” a set of conditions tacked onto all loans by the IMF. Latin American countries were encouraged to borrow heavily from U.S. banks after the 1973 oil shock. When interest rates increased later in the decade, those countries were badly squeezed; Washington and the IMF urged still more deregulation. The continent’s economies were devastated; the 1980s are known in Latin America as the “Lost Decade.” In 1997, in Thailand, Indonesia, the Philippines, and South Korea, the same causes—large and risky debts to U.S. banks and subsequent interest-rate fluctuations—produced similar results: economic contraction, redoubled exhortations to accommodate foreign investors, and warnings not to try to regulate capital flows. By the 2000s, Europe had caught the neoliberal contagion: in the wake of the 2008 crisis, the weaker, more heavily indebted economies—Greece, Italy, Portugal, and Spain—were forced to endure crushing austerity rather than default. Financialization was a global plague.
Slash, Burn, and Churn
In the 1960s, a brave new idea was born, which ushered in a brave new world. Traders figured out how to buy things without money. More precisely, they realized that you could borrow the money to buy the thing while using the thing itself as collateral. They could buy a company with borrowed money, using the company’s assets as collateral for the loan. They then transferred the debt to the company, which in effect had to pay for its own hijacking, and eventually sold it for a tidy profit. In the 1960s, when Jerome Kohlberg, a finance executive at Bear Stearns & Co., started to see the possibilities, it was called “bootstrap financing.” By the mid-1970s, when Kohlberg set up a company with Henry Kravis and George Roberts, it was known as the leveraged buyout (LBO).
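[ed. To make the leveraged-buyout arithmetic concrete, here is a minimal illustrative sketch in Python. The figures are invented purely for illustration; they don't come from the book or the review.]
# Hypothetical LBO economics: all numbers are made up for illustration.
purchase_price = 100.0                          # buy a $100M company
sponsor_equity = 10.0                           # the buyer puts up $10M of its own money
debt_raised = purchase_price - sponsor_equity   # borrows $90M against the company's own assets

# The debt is pushed onto the acquired company, which must service it out of operations.
interest_rate = 0.08
annual_debt_service = debt_raised * interest_rate   # $7.2M a year squeezed from the business

# Suppose the company pays down some debt and is sold five years later for a modest gain.
sale_price = 120.0        # enterprise value rises just 20 percent
debt_remaining = 60.0     # the company has repaid $30M of principal from its own cash flow
sponsor_proceeds = sale_price - debt_remaining       # $60M goes to the buyer

multiple_on_equity = sponsor_proceeds / sponsor_equity   # 6.0x on the original $10M
print(f"Sponsor return: {multiple_on_equity:.1f}x; "
      f"annual interest borne by the company: ${annual_debt_service:.1f}M")
[ed. The point of the sketch: a 20 percent rise in the company's value becomes a 6x return for the buyer, because the company itself, not the buyer, carries and pays down the debt.]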
The leveraged buyout was the key to the magic kingdom of private equity. But LBOs leave casualties. To service its new debt, the acquired company often must cut costs drastically. This usually means firing workers and managers and overworking those who remain, selling off divisions, renegotiating contracts with suppliers, halting environmental mitigation, and eliminating philanthropy and community service. And even then, many companies failed—a significant proportion of companies acquired in LBOs went bankrupt.
Fortunately, it was discovered around this time that workers, suppliers, and communities don’t matter. In the wake of Milton Friedman’s famous and influential 1970 pronouncement that corporations have no other obligations than to maximize profits, several business school professors further honed neoliberalism into an operational formula: the fiduciary duty of every employee is always and only to increase the firm’s share price. This “shareholder value theory,” which exalted the interests of investors over all others—indeed recognized no other interests at all—afforded the intellectual and moral scaffolding of the private equity revolution. (...)
An academic study found that around 20 percent of large, private-equity-acquired companies were bankrupt within ten years, compared with 2 percent of all other companies. Another study looked at ten thousand companies acquired by private equity firms over a thirty-year period and found that employment declined between 13 and 16 percent. A 2019 study found that “over the previous decade almost 600,000 people lost their jobs as retailers collapsed after being bought by private equity.”
by George Scialabba, The Baffler | Read more:
Image: © Brindha Kumar
[ed. Nice capsule history, and still going strong if not accelerating. Wait until AI gets weaponized to help.]
Thursday, July 20, 2023
Rat Kings of New York
Some say New York City lost the War on Rats with the invention of the plastic bag. Others point to global warming and the fact that a few more warm breeding days is enough for untold thousands of extra litters. In reality, we never stood a chance: we were doomed the moment the first pair of Rattus norvegicus came ashore with the Hessians. For generations, city officials have tried to fight the furry tide. Mayor O’Dwyer convened an anti-rodent committee in 1948. Mayor Giuliani formed an Extermination Task Force—part of his controversial purge of feculence of every kind. All have failed. This time, vows Eric Adams, things will be different.
To get a sense of New York’s reinvigorated campaign, declared in November of 2022 with a tranche of anti-rodent legislation, the Rat Academy is a good enough place to start. There we were—young and old, landlords and tenants, rodenticide enthusiasts and professional rubberneckers—huddled in the basement of a Department of Health and Mental Hygiene building in East Harlem on the last day of May, getting to know our enemy. Their impressive birthrates, their stupendous gnawing strength, the trigger hairs on their heads that give them a feeling of safety under bricks and in cramped burrows. Kathleen Corradi, the city’s Rat Czar, was there to bless the proceedings. A good-looking blonde guy seated in front of me with a notepad turned out to be a reporter for a Danish newspaper. The presence of the Scandinavian media at this humble seminar is what’s known in the pest control business as a “sign”—that New York is at least winning the public relations side of this latest War on Rats.
The tactical session had the zeal of a new crusade, but the Rat Academy dates back to 2005, during billionaire Michael Bloomberg’s tenure as mayor. He called it the Rodent Academy, a three-day crash course for property owners and pest control pros. At some point in the last decade, the city added two-hour sessions for the uncredentialed and the curious. They give good practical advice, like how rats view a paver over their hole as an amenity and the importance of mixing wire with concrete when plugging cracks. And they’re good PR, a chance for a company like Infinity Shields to advertise its dubious miracle spray, and for city councilmembers to show they’re dedicated to taking action—the evening’s PowerPoint bore the logos of Manhattan Community Boards 9, 10, and 11, as well as councilmembers Shaun Abreu, Diana Ayala, and Kristin Richardson Jordan.
What you quickly learn is that the rat problem is really a trash problem. Alone among great cities, New York City residents and businesses drag some forty-four million pounds of garbage to the sidewalk every evening, providing the rats a situation that evokes Templeton’s binge at the fairgrounds in Charlotte’s Web.
One of the first salvos in Mayor Adams’s renewed campaign to take back the city was the announcement that set-out times for this black-bagged waste would be rolled back four hours, to 8 p.m. Of course, rats don’t mind dining on a European schedule. The mailers and bus ads promoting the new rules featured a morose grey rodent dragging a big, gaudy suitcase. “Send Rats Packing!” it announced. A T-shirt ($48) offered by the Department of Sanitation proclaims: “Rats don’t run this city. We Do.”
This and other rhetoric of the War on Rats comes uncomfortably close to anti-immigrant sloganeering and racist cartoons of the not-so-distant past, whipping up public opinion against “enemy populations” to justify corrective regimes—from the rodent abatement industry’s latest traps and poisons to advanced garbage receptacles. Nobody but the rats wants trash-strewn streets. But the patently absurd and endless war on these maligned creatures obscures the fact that any real gains will require systemic changes in urban infrastructure. The sanitation department’s first detailed study of the viability of containerization concluded last year that a complete overhaul of residential garbage collection—made possible by as-yet-undeveloped bins and trucks—could keep 89 percent of the city’s household refuse out of rodents’ reach. Promises and press releases abound, but the chance of such an overhaul actually coming to pass is slim. (...)
The theory of broken windows comes down to aesthetics—a taste for the appearance of order. In this way, the government acts like a neighborhood association with a monopoly on lethal force. And indeed, Giuliani’s tough-guy talk encouraged a level of racist brutality in the enforcement of a program that, on paper, is less a crime-busting blueprint than a way to strengthen the self-regulation of subjective community norms. Like Giuliani’s vendetta against Squeegee Men, the fight against the Murine Menace demands quick, visible busts, producing the feeling of safety, security, and cleanliness while conveniently downplaying the roots of the problem.
Adams has stationed cops on the subway platforms to make people feel “safe”—that is, if you’re the kind of person comforted by cops. He has gathered the unhoused into shelters, partly for their sake, partly for appearances. “The mayor has depicted the city’s rat situation much as he has portrayed its crime and homelessness issues,” writes the New York Times. “He says all illustrate a sense of disorder that Mr. Adams hopes to tamp down.” Indeed, “distasteful, worrisome encounters” certainly describes the experiences of New Yorkers who complain of rodents scampering over mountains of trash and between disused dining sheds. One spokesperson at the Rat Academy compared chasing rats from their nests to illegal evictions—presumably something both landlords and tenants could relate to. If you came home and your locks had been changed, would you give up? No. But if it happened every day, for two weeks . . . (...)
Who benefits from this forever war? The political sugar rush is already hitting. It’s an aesthetic contest, after all—the trick is visible change, a feeling that there is less trash and fewer rats. You may not necessarily notice a lack. You do notice a shiny new container with a tight-fitting lid where once there were mountains of seeping, rustling bags. But the problem with this style of perpetual, piecemeal warfare is that containerization must be consistent, covering residential and commercial, from house to house and block to block—or else the rats will simply adjust their habits. And here, the problem is not just our overwhelming failure to sensibly dispose of our garbage, it’s that we produce too much of it.
Can Barbie Have It All?
There was a good chance Barbie would topple under the weight of its expectations. In the 60 years since she debuted, Barbie has been embraced and disparaged as a paragon of idealized femininity, as a prompt for imaginative play, and as a tool of the patriarchy, upholding oppressive beauty standards and stereotypes. The Barbie movie winks at the doll’s cultural centrality, as it opens with a shot-for-shot remake of the first scene in 2001: A Space Odyssey, which depicts the dawn of human enlightenment. As humans’ precursors discovered the power of tools, little girls encountered a doll that represented an adult woman.
In the months leading up to its release, Barbie was shrouded with a certain mystique—the feminine kind—with trailers obscuring more than they revealed. It became more than a summer blockbuster based on a famous toy, one-half of a cinematic meme, and the vehicle for an increasingly popular “Barbiecore” aesthetic. Director and co-writer Greta Gerwig has leaned into the idea of Barbie as a unifying cultural phenomenon, a girlhood experience so broadly shared that it could bring all kinds of women together. She told The New York Times that she wanted viewers to find in Barbie a sort of benediction, hoping to replicate the feeling she had attending Shabbat dinner with friends as a child. “I want them to get blessed,” Gerwig said, aware of her subject’s cultural baggage. Could Barbie capture the magic of childhood play, while also contending with the doll’s complicated role in American culture?
More or less. (This article contains mild spoilers, so stop reading here if you want to avoid learning more about the plot.) While hardly a sophisticated vehicle for a feminist treatise, Barbie is a bright, creative endeavor that neatly wraps up the struggles of womanhood in a clever package. The film is a pleasing mishmash of genres, with elements fantastical, political, mystical, and musical, but at its core it is a coming of age story, a bildungsroman in shades of pink.
Barbie would be worth seeing in theaters for the visuals alone. Barbie Land, home to the Barbies, Kens, and their various discontinued doll sidekicks, is a colorful pastiche of life in plastic, and it’s fantastic. Watching the opening sequence of Barbie Land, depicting Barbie’s morning routine, makes the viewer feel like a little girl playing in her own plastic dream house. But it’s the main character who gives the world texture: Margot Robbie is incandescent as Barbie, and not only because with her megawatt smile and flowing blonde hair, it is easy to believe that she is a doll come to life.
Robbie is “Stereotypical Barbie”—the Barbie you picture when you think of “Barbie.” Her charmed existence is upended one day when her tiptoed feet start flattening, she experiences a surge of irrepressible thoughts of death, and notices the development of cellulite (gasp!). At a dance party with bespoke choreography, Barbie interrupts the festivities by asking if her fellow dolls ever think about dying.
To combat her existential woes, Barbie must venture into our reality—the Real World—to find and reconnect with the actual person playing with her, whose anxiety is manifesting in Barbie. Accompanied by Ken (Ryan Gosling), and buoyed by the Indigo Girls’ classic “Closer to Fine,” Barbie must discover who she is in the Real World.
by Grace Segers, TNR | Read more:
Image: Warner Bros. Pictures
Tuesday, July 18, 2023
Book Review: The Educated Mind
“The promise of a new educational theory”, writes Kieran Egan, “has the magnetism of a newspaper headline like ‘Small Earthquake in Chile: Few Hurt’”.
But — could a new kind of school make the world rational?
I discovered the work of Kieran Egan in a dreary academic library. The book I happened to find — Getting it Wrong from the Beginning — was an evisceration of progressive schools. As I worked at one at the time, I got a kick out of this.
To be sure, broadsides against progressivist education aren’t exactly hard to come by. But Egan’s account went to the root, deeper than any critique I had found. Better yet, as I read more, I discovered he was against traditionalist education, too — and that he had constructed a new paradigm that incorporated the best of both.
This was important to me because I was a teacher, and had at that point in my life begun to despair that all the flashy exciting educational theories I was studying were just superficial, all show and no go. I was stuck in a cycle: I’d discover some new educational theory, devour a few books about it, and fall head over heels for it — only to eventually get around to spending some time at a school and talk to some teachers and realize holy crap this does exactly one thing well and everything else horribly.
If my life were a movie, these years would be the rom-com montage where the heroine goes on twenty terrible first dates.
I got to look at some approaches in even more detail by teaching or tutoring in schools. Each approach promised to elevate their students’ ability to reason and live well in the world, but the adults I saw coming out of their programs seemed not terribly different from people who didn’t.
They seemed just about as likely to become climate deniers or climate doomers as the average normie, just as likely to become staunch anti-vaxxers or covid isolationists. They seemed just as likely to be sucked up by the latest moral panics. The strength of their convictions seemed untethered to the strength of the evidence, and they seemed blind to the potential disasters that their convictions, if enacted, might cause.
They seemed just about as rational as the average person of their community — which was to say, quite irrational!
Egan’s approach seemed different.
I began to systematically experiment with it — using it to teach science, math, history, world religions, philosophy, to students from elementary school to college. I was astounded by how easy it made it for me to communicate the most important ideas to kids of different ability levels. This, I realized, was what I had gotten into teaching for.
The man
Kieran Egan was born in Ireland, raised in England, and got his PhD in America (at Stanford and Cornell). He lived for the next five decades in British Columbia, where he taught at Simon Fraser University.
As a young man, he became a novice at a Franciscan monastery. By the time he died, he was an atheist, but — he would make clear — a Catholic atheist. His output was prodigious — fifteen books on education, one book on building a Zen garden, and, near the end of his life, two books of poetry, and a mystery novel!
He was whimsical and energetic, a Tigger of an educational philosopher. He was devoted to the dream that (as his obituary put it) “schooling could enrich the lives of children, enabling them to reach their full potential”.
He traveled the world, sharing his approach to education. He gained a devoted following of teachers and educational thinkers, and (from an outsider’s vantage point, at least) seemed perpetually on the edge of breaking through to a larger audience, and getting his approach into general practice: he won the Grawemeyer Award — perhaps educational theory’s highest prize. His books were blurbed by some of education’s biggest names (Howard Gardner, Nel Noddings); Michael Pollan even blurbed his Zen gardening book.
He died last year. I think it’s a particularly good moment to take a clear look at his theory.
The book
This is a review of his 1997 book, The Educated Mind: How Cognitive Tools Shape Our Understanding. It’s his opus, the one book in which he most systematically laid out his paradigm. It’s not an especially easy read — Egan’s theory knits together evolutionary history, anthropology, cultural history, and cognitive psychology, and tells a new big history of humanity to make sense of how education has worked in the past, and how we might make it work now.
But at the root of his paradigm is a novel theory about why schools, as they are now, don’t work.
Part 1: Why don’t schools work?
A school is a hole we fill with money
I got a master’s degree in something like educational theory from a program whose name looked good on paper, and when I was there, one of the things that I could never quite make sense of was my professors’ and fellow students’ rock-solid assumption that schools are basically doing a good job.
Egan disagrees. He opens his book by laying that out:
“Education is one of the greatest consumers of public money in the Western world, and it employs a larger workforce than almost any other social agency.
“The goals of the education system – to enhance the competitiveness of nations and the self-fulfillment of citizens – are supposed to justify the immense investment of money and energy.
“School – that business of sitting at a desk among thirty or so others, being talked at, mostly boringly, and doing exercises, tests, and worksheets, mostly boring, for years and years and years – is the instrument designed to deliver these expensive benefits.
“Despite, or because of, the vast expenditures of money and energy, finding anyone inside or outside the education system who is content with its performance is difficult.” (...)
America isn’t so much of an outlier; numbers across the rest of the world are comparable. The 4.7 trillion-dollar question is why.
The usual suspects
Ask around, and you’ll find people’s mouths overflowing with answers. “Lazy teachers!” cry some; “unaccountable administrators” grumble others. Others blame the idiot bureaucrats who write standards. Some teachers will tell you parents are the problem; others point to the students themselves.
Egan’s not having any of it. He thinks all these players are caught in a bigger, stickier web. Egan’s villain is an idea — but to understand it, we’ll have to zoom out and ask a simple question — what is it, exactly, that we’ve been asking schools to do? What’s the job we’ve been giving them? If we rifle through history, Egan suggests we’ll find three potential answers.
Job 1: Shape kids for society
Before there were schools, there was culture — and culture got individuals to further the goals of the society.
Egan dubs this job “socialization”. A school built on the socialization model will mold students to fit into the roles of society. It will shape their sense of what’s “normal” to fit their locale — and what’s normal in say, a capitalist society will be different from what’s normal in a communist society. It’ll supply students with useful knowledge and life skills. A teacher in a school built on socialization will, first and foremost, be a role model — someone who can exemplify the virtues of their society.
Job 2: Fill kids’ minds with truth
In 387 BC, Plato looked out at his fellow well-socialized, worldly wise citizens of Athens, and yelled “Sheeple!”
Fresh off the death of his mentor Socrates, Plato argued that, however wonderful the benefits of socialization, the adults that it produced were the slaves of convention. So long as people were shaped by socialization, they were doomed to repeat the follies of the past. There was no foundation on which to stand to change society. Plato opened his Academy (the Academy, with a capital ‘A’ — the one that all subsequent academies are named after) to fix that. In his school, people studied subjects like math and astronomy so as to open their minds to the truth.
Egan dubs this job “academics”. A school built on the academic model will help students reflect on reality. It will lift up a child’s sense of what’s good to match the Good, even when this separates them from their fellow citizens. And a teacher in an academic school will, first and foremost, be an expert — someone who can authoritatively say what the Truth is.
Job 3: Cultivate each kid’s uniqueness
In 1762, Jean-Jacques Rousseau looked out at his fellow academically-trained European intellectuals, and called them asses loaded with books.
The problem with the academies, Rousseau argued, wasn’t that they hadn’t educated their students, but that they had — and this education had ruined them. They were “crammed with knowledge, but empty of sense” because their schooling had made them strangers to themselves. Rousseau’s solution was to focus on each child individually, to not force our knowledge on them but to help them follow what they’re naturally interested in. The word “natural” is telling here — just as Newton had opened up the science of matter, so we should uncover the science of childhood. We should work hard to understand what a child’s nature is, and plan accordingly.
Egan dubs this job “development”. A school built on the developmental model will invite students into learning. And a teacher in this sort of school will be, first and foremost, a facilitator — someone who can create a supportive learning environment for the child to learn at their own pace.
Q: Can you recap those?
We might sum these up by asking what’s at the very center of schooling. For a socializer, the answer is “society”. For an academicist, the answer is “content”. And for a developmentalist, the answer is “the child”. (...)
One of the things I love about Egan is that he looks at educational ideas historically. (Most histories of education start around the turn of the 20th century; I remember being excited when I found one that began in the 1600s. Egan begins in prehistory.) And what we’re reminded of, when we see these historically, is that these jobs were meant to supplant each other. Put together, they sabotage each other.
What are we asking of schools?
Of the three possible jobs, which are we asking mainstream schools to perform? Egan answers: all three.
by Anonymous, Astral Codex Ten | Read more:
Image: ACT/uncredited
Monday, July 17, 2023
When Crack Was King
“I was not able to find a direct conspiracy of white guys in a back room saying, let’s destroy the Black community. It was actually more insidious, which is that conspiracy happened hundreds of years ago, that Black people were positioned in American society from very early on to be the Americans closest to harm. When any disaster happens, whether it’s Hurricane Katrina or Covid or crack, we are hit first and we are hit worst.”
When Crack Was King: looking back on an epidemic that destroyed lives (The Guardian)
Image: Andrew Lichtenstein/Corbis/Getty Images
[ed. As concise a summation of black struggles as any.]
Sunday, July 16, 2023
The Greens' Dilemma: Building Tomorrow's Climate Infrastructure Today
Abstract
“We need to make it easier to build electricity transmission lines.” This plea came recently not from an electric utility executive but from Senator Sheldon Whitehouse, one of the Senate’s champions of progressive climate change policy. His concern is that the massive scale of new climate infrastructure urgently needed to meet our nation’s greenhouse gas emissions reduction policy goals will face a substantial obstacle in the form of existing federal, state, and local environmental laws. A small but growing chorus of politicians and commentators with impeccable green credentials agrees that reform of that system will be needed. But how? How can environmental law be reformed to facilitate building climate infrastructure faster without unduly sacrificing its core progressive goals of environmental conservation, distributional equity, and public participation?
That hard question defines what this Article describes as the Greens’ Dilemma, and there are no easy answers. We take the position in this Article that the unprecedented scale and urgency of required climate infrastructure requires reconsidering the trade-off set in the 1970s between environmental protection and infrastructure development. Green interests, however, largely remain resistant even to opening that discussion. As a result, with few exceptions reform proposals thus far have amounted to modest streamlining “tweaks” compared to what we argue will be needed to accelerate climate infrastructure sufficiently to achieve national climate policy goals. To move “beyond tweaking,” we explore how to assess the trade-off between speed to develop and build climate infrastructure, on the one hand, and ensuring adequate conservation, distributional equity, and public participation on the other. We outline how a new regime would leverage streamlining methods more comprehensively and, ultimately, more aggressively than has been proposed thus far, including through federal preemption, centralizing federal authority, establishing strict timelines, and providing more comprehensive and transparent information sources and access.
The Greens’ Dilemma is real. The trade-offs inherent between building climate infrastructure quickly enough to achieve national climate policy goals versus ensuring strong conservation, equity, and participation goals are difficult. The time for serious debate is now. This article lays the foundation for that emerging national conversation.
by J. B. Ruhl and James E. Salzman, SSRN | Read more:
[ed. Download the paper at the link above, or view the pdf here. See also: Two Theories of What I’m Getting Wrong (NYT).]
Labels:
Business,
Critical Thought,
Environment,
Government,
Law,
Politics,
Science,
Technology
We Are All Background Actors
In Hollywood, the cool kids have joined the picket line.
I mean no offense, as a writer, to the screenwriters who have been on strike against film and TV studios for over two months. But writers know the score. We’re the words, not the faces. The cleverest picket sign joke is no match for the attention-focusing power of Margot Robbie or Matt Damon.
SAG-AFTRA, the union representing TV and film actors, joined the writers in a walkout over how Hollywood divvies up the cash in the streaming era and how humans can thrive in the artificial-intelligence era. With that star power comes an easy cheap shot: Why should anybody care about a bunch of privileged elites whining about a dream job?
But for all the focus that a few boldface names will get in this strike, I invite you to consider a term that has come up a lot in the current negotiations: “Background actors.”
You probably don’t think much about background actors. You’re not meant to, hence the name. They’re the nonspeaking figures who populate the screen’s margins, making Gotham City or King’s Landing or the beaches of Normandy feel real, full and lived-in.
And you might have more in common with them than you think.
The lower-paid actors who make up the vast bulk of the profession are facing simple dollars-and-cents threats to their livelihoods. They’re trying to maintain their income amid the vanishing of residual payments, as streaming has shortened TV seasons and decimated the syndication model. They’re seeking guardrails against A.I. encroaching on their jobs.
There’s also a particular, chilling question on the table: Who owns a performer’s face? Background actors are seeking protections and better compensation in the practice of scanning their images for digital reuse.
In a news conference about the strike, a union negotiator said that the studios were seeking the rights to scan and use an actor’s image “for the rest of eternity” in exchange for one day’s pay. The studios argue that they are offering “groundbreaking” protections against the misuse of actors’ images, and counter that their proposal would only allow a company to use the “digital replica” on the specific project a background actor was hired for. (...)
You could, I guess, make the argument that if someone is insignificant enough to be replaced by software, then they’re in the wrong business. But background work and small roles are precisely the routes to someday promoting your blockbuster on the red carpet. And many talented artists build entire careers around a series of small jobs. (Pamela Adlon’s series “Better Things” is a great portrait of the life of ordinary working actors.) (...)
Maybe it’s unfair that exploitation gets more attention when it involves a union that Meryl Streep belongs to. (If the looming UPS strike materializes, it might grab the spotlight for blue-collar labor.) And there’s certainly a legitimate critique of white-collar workers who were blasé about automation until A.I. threatened their own jobs.
But work is work, and some dynamics are universal. As the entertainment reporter and critic Maureen Ryan writes in “Burn It Down,” her investigation of workplace abuses throughout Hollywood, “It is not the inclination nor the habit of the most important entities in the commercial entertainment industry to value the people who make their products.”
If you don’t believe Ryan, listen to the anonymous studio executive, speaking of the writers’ strike, who told the trade publication Deadline, “The endgame is to allow things to drag out until union members start losing their apartments and losing their houses.”
by James Poniewozik, NY Times | Read more:
Image: Jenna Schoenefeld for The New York Times
[ed. See also: On ‘Better Things,’ a Small Story Goes Out With a Big Bang (NYT).]
Labels:
Business,
Celebrities,
Copyright,
Media,
Movies,
Technology
Saturday, July 15, 2023
Lana Del Rey
[ed. Hadn't heard this one before (not much of a fan) but it's pretty good, and it's the only song lauded in an otherwise pretty brutal essay on the emptiness of music today ("audio furniture"). The principal personification of this is the musician and producer Jack Antonoff: Dream of Antonoffication | Pop Music’s Blandest Prophet (Drift):]
"Then there is “Venice Bitch.” It is the one piece of music Antonoff has had a hand in that is downright numinous, with that hypnotic guitar figure pulsing away as the song dissolves into a six-minute vibe collage. The production is suffused with that signature, unshakeable Jack emptiness. But “Venice Bitch” works in large part because Lana embraces the emptiness and uses it to deliberate effect, rather than trying to fill it up with overheated emoting."
"Then there is “Venice Bitch.” It is the one piece of music Antonoff has had a hand in that is downright numinous, with that hypnotic guitar figure pulsing away as the song dissolves into a six-minute vibe collage. The production is suffused with that signature, unshakeable Jack emptiness. But “Venice Bitch” works in large part because Lana embraces the emptiness and uses it to deliberate effect, rather than trying to fill it up with overheated emoting."