Monday, February 3, 2025
Maeve Brennan (Irish, b. 1980), September, 2020, oil on canvas
The Mountain West’s Mega-McMansion Problem
The Kelly Parcel is a stunning 600-plus acres of wilderness on the edge of Grand Teton National Park. The land is a legacy of the late 19th and early 20th centuries, when the federal government allotted at least one square-mile parcel in every new township to Wyoming, essentially as a land trust that would, it was hoped, accumulate value. Two years ago, the state decided to cash out, hoping to auction off the Kelly Parcel—ideally selling to the federal government, which would incorporate the land into the park system—for a sum in the region of $100 million. The state would then use the proceeds to fund schools, which urgently need the money: Wyoming has one of the smallest tax bases in the US, with no state income tax, no estate tax, no corporate income tax, low property taxes, and strictly limited sales taxes.
But the political winds have swung rightward, bringing into power a leadership increasingly hostile to the idea of a redistributive, pro-environment government. The five-member State Board of Land Commissioners in the Office of State Lands and Investments (OSLI) opted not to sell to the National Park Service but instead to pursue the highest bids from the private sector, meaning the land would likely be converted into mega-mansions or into casinos, golf courses, and other playthings for the region’s super-wealthy.
Image: Michael Gäbler
Who Will Stop Elon Musk’s Coup?
The world’s richest man now has the power to override congressional spending decisions and access to private information about every US taxpayer.
Elon Musk, often described as Donald Trump’s shadow president [ed. i.e., brain], has quickly morphed into something much more dangerous: Trump’s co-autocrat. Hitherto, Trump’s biggest threat to American democracy came when he incited the attack on the Capitol on January 6, 2021. The event was typically Trump in that it was lurid, violent, theatrical, and televised. January 6, like Trump’s first term, demonstrated that he had the ability to menace democratic norms and spur on mayhem—but not to really control the ultimate operation of government.
For his second term, Trump has tried to make amends for that failure by recruiting true believers who share his passion for subduing the government, including running roughshod over the system of checks and balances. Elon Musk, the world’s richest man, who specializes in taking over large companies and remaking them in his image, has been Trump’s most important ally in this agenda, acting as considerably more than an aide. In truth, Musk is emerging as a government within the government, using the time-honored revolutionary tactic of developing dual power in order to seize control.
On Sunday, the Financial Times reported that
Musk vowed to unilaterally cancel hundreds of millions of dollars’ worth of government grants after apparently gaining access to review the US Treasury’s vast payments system, a move that prompted the sudden resignation of one of the department’s most senior officials.
The world’s richest man, who bankrolled Donald Trump’s reelection campaign and was tasked by the president with running the cost-cutting Department of Government Efficiency, boasted on his social media site X that he was “rapidly shutting down…illegal payments” after a list of grants to Lutheran organisations was posted online.
This is a remarkable power grab on Musk’s part, because he’s a private citizen who is still overseeing his vast fortune even as he claims authority to unilaterally slash government funding. Further, Musk is doing this on behalf of DOGE, which The New York Times accurately describes as “the so-called Department of Government Efficiency.” In fact, DOGE is not a real department authorized by Congress but merely the fiat creation of an executive order signed by Trump. DOGE is an advisory group that is usurping power the Constitution grants to Congress alone. Last week, Trump issued a memo to freeze federal funding for government programs such as Medicaid and SNAP, only to retreat in the face of both popular protest and an adverse court decision. Under the Constitution, Congress alone has the power of the purse, while the president is obligated to “faithfully execute the laws.” Trump’s attempt to arrogate the power to not spend money already allocated by Congress thus constitutes “impoundment”—a practice forbidden by long-standing precedent and court decisions.
Waleed Shahid, a Democratic Party strategist and member of The Nation’s editorial board, distilled with bracing clarity the breathtaking scope of Musk’s power grab:
Not with tanks in the streets or militias at government buildings, but with spreadsheets, executive orders, and a network of loyalists embedded in the federal bureaucracy. In just the past few days, Musk’s hand-picked agents have seized control of Treasury’s $6 trillion payment system, the Office of Personnel Management (OPM), and the General Services Administration (GSA)—institutions that, together, function as the central nervous system of the U.S. government…. In any other country, experts would call it state capture, a textbook coup.
Because Musk and his DOGE minions have strong-armed their way into the offices of the Treasury, OPM, GSA, and USAID, they have access to an astonishing body of public data. Musk can, for example, find the Social Security number of any American citizen and also what funds (if any) they receive from the government. (...)
What’s scandalous is Musk’s nebulous status as both a private citizen and presidential crony, which has allowed him access to data that can easily be abused. Coming from Silicon Valley, Musk knows that data is power. Now he has access to the full data set of the federal government, which he is both hoarding to himself and preventing the public from seeing (many government websites have already been shuttered, allegedly as a temporary measure during the Trump/Musk riorganizzazione, including public health sets and Census data).
Most disturbingly, Musk and his DOGE team have no proper congressional authorization. (...)
The end goal of the coup is to give Trump and Musk control over the spigot of Treasury spending: in other words, control over the heart of the federal government that ensures the flow of money. In a post on Monday morning, economist Nathan Tankus assembled alarming reporting that makes clear Musk and his DOGE team are working toward control of Treasury’s computer network system. Tankus spells out the new power this would give Trump and Musk:
Without political control of the payment’s heart, the Trump administration and Elon Musk must chase down every agency and bend it to their will. They are in the process of doing that, but bureaucrats can notionally continue to respect the law and resist their efforts. They are helped in this effort by court injunctions they can point to. This is bureaucratic trench warfare. But if Musk and Trump can reach into the choke point, the Bureau of the Fiscal Service, they could possibly not need agency cooperation. They can just impound agency payments themselves. They could also possibly stop paying federal employees they have forced on paid administrative leave, coercing them to resign. These possibilities are what every Treasury expert I’ve talked to instantly thought of the moment they read the Washington Post reporting and are incredibly alarmed about.
Is anyone prepared to end this coup? In an earlier post, Tankus noted that describing an action as unconstitutional does little good if the system of checks and balances breaks down. Paraphrasing Joseph Stalin’s famous quip about the powerlessness of moral leaders such as the pope, Tankus asks, “…and how many divisions does the Constitution have?” There are likely to be court challenges, some of which will roll back parts of Musk’s coup. But the courts are a fickle—and in any case nondemocratic—remedy. Musk’s abuse of power is a massive encroachment on congressional power, but there is little reason to expect Congress will fight back.
Congressional Republicans have shown themselves to be abjectly servile to Trump, which makes remedies such as impeachment moot. Nor do Democrats at this moment have any fighting spirit. Instead, Senate majority leader Chuck Schumer has adopted the stance that it is best to keep the Democrats’ powder dry for future battles. Schumer told The New York Times, “We’re not going to go after every single issue. We are picking the most important fights and lying down on the train tracks on those fights.” It’s a bit distressing that even in this fantasy scenario, Schumer is not imagining rallying the public to counter Trump’s destruction of democracy. Instead, Schumer daydreams about Democrats tying themselves to the metaphorical train tracks, which he apparently believes (all evidence to the contrary) will excite the mercy of the Republicans to stop the train. This is the dream life of the willfully self-defeating—people whose only hope is that the bully will at some point get tired of beating them up.
by Jeet Heer, The Nation | Read more:
Image: Angela Weiss/Getty Images
[ed. Same question everybody else is wondering: who elected this guy? Whatever he paid to fund Trump's re-election campaign has provided dividends beyond his wildest dreams (well, maybe not... who knows what the ultimate dreams of megalomaniacs are, but for sure nothing to do with the Constitution). Wonder what Elon will do with all that data now in his possession? Whether he sticks around or not (and I can't imagine Trump playing second fiddle much longer), he'll have it all, with no accountability to anyone but himself. I envision high-density storage devices blazing away as we speak. See also: Ten Days That Screwed the World (Nation); and this horrifying all-encompassing report: Inside Musk’s Aggressive Incursion Into the Federal Government. All aided by his band of good little Nazis: The Familiar Arrogance of Musk’s Young Apparatchiks (NYT):]
***
For those lucky enough not to remember, the Coalition Provisional Authority was the administration that George W. Bush and his team put in place after charging heedlessly into Iraq, convinced that it would be easy to remake a government about which they knew next to nothing. It was full of right-wing apparatchiks, some barely out of college, who were given enormous responsibilities. Six people initially hired for low-level administrative jobs after sending their résumés to the conservative Heritage Foundation were assigned to manage Iraq’s $13 billion budget. A social worker who’d served as director at a Christian charity was put in charge of rebuilding the health care system.
Meanwhile, 50,000 to 100,000 Iraqi government workers, many of whom had joined the Baath Party only to get their jobs in the first place, were fired. Schools went without teachers. As Syrus Solo Jin wrote in Time, budget blunders by overwhelmed novices meant that the police weren’t paid on time. The de-Baathification that Vance wanted to emulate is widely seen as a disaster that contributed to the deadly chaos and instability that followed America’s invasion. (...)
Soon after taking over, they created a crisis by shutting down huge segments of federal government spending, though they restarted at least some payments after a judge slapped them with a court order. Late Friday, Elon Musk seized control of the Treasury Department’s payment system, which disburses trillions of dollars and houses sensitive data about millions of Americans. Some of the people helping him take over the government — who include, as Wired reported, a half dozen engineers between the ages of 19 and 24 — appear to be even less experienced than the neophytes who staffed the C.P.A. in Iraq.
Employees at the General Services Administration, which manages office space, transportation and technology for the federal government, told Wired that Edward Coristine, a recent high-school graduate who spent three months at Musk’s company Neuralink, has been on calls where “workers were made to go over code they had written and justify their jobs.” Another young member of Musk’s team, a software engineer named Gavin Kliger, sent out an email to USAID employees informing them that the headquarters has been closed and they shouldn’t come in; Musk said that he’s “feeding USAID into the wood chipper.”
Grammys 2025: Best Jazz Instrumental and Performance Nominees
Lakecia Benjamin, Phoenix Reimagined (Live), ft. John Scofield, Randy Brecker, Jeff “Tain” Watts
“Beyond This Place” — Kenny Barron featuring Kiyoshi Kitagawa, Johnathan Blake, Immanuel Wilkins and Steve Nelson
What if the Attention Crisis Is All a Distraction?
From the pianoforte to the smartphone, each wave of tech has sparked fears of brain rot. But the problem isn’t our ability to focus—it’s what we’re focussing on.
Whatever thoughts past writers have had about the virtues of attention, pessimists would argue that the problem is different now. It’s as if we’re not reading books so much as the books are reading us. TikTok is particularly adept at this; you just scroll and the app learns—from your behavior, plus perhaps other information harvested from your phone—about what will keep you hooked. “I wake up in cold sweats every so often thinking, What did we bring to the world?” Tony Fadell, a co-developer of the iPhone, has said. (...)
It’s been fifteen years since Carr’s “The Shallows.” Now we have what is perhaps the most sophisticated contribution to the genre, “The Sirens’ Call,” by Chris Hayes, an MSNBC anchor. Hayes acknowledges the long history of such panics. Some seem laughable in hindsight, he concedes, like one in the nineteen-fifties about comic books. Yet others seem prophetic, like the early warnings about smoking. “Is the development of a global, ubiquitous, chronically connected social media world more like comic books or cigarettes?” Hayes asks.
Great question. If we take the skeptics seriously, how much of the catastrophist’s argument stands? Enough, Hayes feels, that we should be gravely concerned. “We have a country full of megaphones, a crushing wall of sound, the swirling lights of a 24/7 casino blinking at us, all part of a system minutely engineered to take our attention away from us for profit,” he writes. Thinking clearly and conversing reasonably under these conditions is “like trying to meditate in a strip club.” The case he makes is thoughtful, informed, and disquieting. But is it convincing?
History is littered with lamentations about distraction. Swirling lights and strippers are not a new problem. What’s important to note about bygone debates on the subject, though, is that they truly were debates. Not everyone felt the sky was falling, and the dissenters raised pertinent questions. Is it, in fact, good to pay attention? Whose purposes does it serve? (...)
This situation is, in some sense, our fault, as the whole system runs on our own choices. But those choices don’t always feel free. Hayes distinguishes between voluntary and compelled attention. Some things we focus on by choice; others, because of our psychological hardwiring, we find hard to ignore. Digital tools let online platforms harness the latter, addressing our involuntary impulses rather than our higher-order desires. The algorithms deliver what we want but not, as the late philosopher Harry Frankfurt put it, “what we want to want.”
Getting what we want, not what we want to want: it could be the slogan of our times. Hayes notes that it’s not only corporations that home in on our baser instincts. Since social-media users also have access to immediate feedback, they learn what draws eyeballs, too. Years ago, Donald Trump, Elon Musk, and Kanye West had hardly anything in common. Now their pursuit of publicity has morphed them into versions of the same persona—the attention troll. And, despite ourselves, we can’t look away.
The painful twist is that climate change, the thing we really ought to focus on, “evades our attentional faculties,” Hayes writes. “It’s always been a problem,” the writer and activist Bill McKibben told him, “that the most dangerous thing on the planet is invisible, odorless, tasteless, and doesn’t actually do anything to you directly.” Global warming is the opposite of Kanye West: we want to pay attention but we don’t.
The trouble is “attention capitalism,” Hayes argues, and it has the same dehumanizing effect on consumers’ psyches as industrial capitalism has on workers’ bodies. Successful attention capitalists don’t hold our attention with compelling material but, instead, snatch it over and over with slot-machine gimmicks. (...)
What’s awkward about this whole debate is that, though we speak freely of “attention spans,” they are not the sort of thing that psychologists can measure, independent of context, across time. And studies of the ostensible harm that carrying smartphones does to cognitive abilities have been contradictory and inconclusive. A.D.H.D. diagnoses abound, but is that because the condition is growing more prevalent or the diagnosis is? U.S. labor productivity and the percentage of the population with four years or more of college have risen throughout the Internet era. (...)
After decades of the Internet, the mediascape has still not dissolved into a froth of three-second clips of orgasms, kittens, and trampoline accidents, interspersed with sports-betting ads. As the legal scholar Tim Wu argues in “The Attention Merchants,” the road to distraction is not one-way. Yes, businesses seize our attention using the shiniest lures available, but people become inured and learn to ignore them. Or they recoil, which might explain why meditation, bird-watching, and vinyl records are in vogue. Technology firms, in fact, often attract users by promising to reduce distractions, not only the daily hassles—paying bills, arranging travel—but the online onslaught, too. Google’s text ads and mail filters offered respite from the early Internet’s spam and pop-ups. Apple became one of the world’s largest companies by selling simplicity.
Besides, distraction is relative: to be distracted from one thing is to attend to another. And any argument that people are becoming distracted must deal with the plain fact that many spend hours staring intently at their screens. What is doomscrolling if not avid reading? If people are failing to focus in some places, they’re clearly succeeding in others. (...)
Even the supposedly attention-pulverizing TikTok deserves another look. Hayes, who works in TV, treats TikTok wholly as something to watch—an algorithmically individualized idiot box. But TikTok is participatory: more than half its U.S. adult users have posted videos. Where the platform excels is not in slick content but in amateur enthusiasm, which often takes the form of trends with endless variations. To join in, TikTokers spend hours preparing elaborate dance moves, costume changes, makeup looks, lip synchs, trick shots, pranks, and trompe-l’oeil camera maneuvers.
What’s going on? The media theorist Neil Verma, in “Narrative Podcasting in an Age of Obsession,” describes the era of TikTok’s rise as beset by “obsession culture.” Online media, by broadening the scope of possible interests, have given rise to an unabashedly nerdy intellectual style. Verma focusses on the breakout podcast “Serial,” whose first season, in 2014, followed the host for hours as she pored over the details of a fifteen-year-old murder case. But deep dives into niche topics have become the norm. The wildly popular podcaster Joe Rogan runs marathon interviews, some exceeding four hours, on ancient civilizations, cosmology, and mixed martial arts. A four-hour video of the YouTuber Jenny Nicholson dissecting the design flaws of a defunct Disney World hotel has eleven million views (deservedly: it’s terrific). Hayes himself confesses to spending hours “utterly transfixed” by watching old carpets being shampooed. (...)
We blame the Internet for polarizing politics and shredding attention spans, but those tendencies actually pull in opposite directions. What’s true of culture is true of politics, too: as people diverge from the mainstream, they become obsessional and prone to scrambling down rabbit holes. Following QAnon takes the sort of born-again devotion that one expects of a K-pop fan. Democratic Socialists, vaccine skeptics, anti-Zionists, manosphere alphas—these are not people known for casual political engagement. Some may be misinformed, but they’re not uninformed: “Do your own research” is the mantra of the political periphery. Fragmentation, it turns out, yields subcultural depths. Silos are not shallows. (...)
In a sense, what attention alarmists seek is protection from a competition that they’re losing. Fair enough; the market doesn’t always deliver great results, and Hayes is right to deplore the commodification of intellectual life. But one can wonder whether ideas are less warped by the market when they are posted online to a free platform than when they are rolled into books, given bar codes, and sold in stores. It’s worth remembering that those long nineteenth-century novels we’re losing the patience to read were long for a reason: profit-seeking publishers made authors drag out their stories across multiple volumes. Market forces have been stretching, squashing, spinning, and suppressing ideas for centuries. Realistically, the choice isn’t commodified versus free but which commodity form suits best.
For Hayes, what makes the apps awful is that they operate without consent. They seize attention using tricks, leaving us helpless and stupefied. Yet even this argument, his most powerful, warrants caution. Our media have always done a weird dance with our desires. Although Hayes argues for the profound novelty of our predicament, the title of his book, “The Sirens’ Call,” alludes to a Homeric tale from antiquity, of songs too alluring to resist. This isn’t always unwelcome. Consider our highest words of praise for books—captivating, commanding, riveting, absorbing, enthralling. It’s a fantasy of surrendered agency. (“A page-turner”: the pages turn themselves.) Oddly, the thing we deplore in others, submission, is what we most want for ourselves.
The nightmare the alarmists conjure is of a TikTok-addled screen-ager. This isn’t a full picture of the present, though, and it might not reveal much about the future, either. Ours is an era of obsession as much as distraction, of long forms as much as short ones, of zeal as much as indifference. To ascribe our woes to a society-wide attention-deficit disorder is to make the wrong diagnosis.
by Daniel Immerwahr, New Yorker | Read more:
In a sense, what attention alarmists seek is protection from a competition that they’re losing. Fair enough; the market doesn’t always deliver great results, and Hayes is right to deplore the commodification of intellectual life. But one can wonder whether ideas are less warped by the market when they are posted online to a free platform than when they are rolled into books, given bar codes, and sold in stores. It’s worth remembering that those long nineteenth-century novels we’re losing the patience to read were long for a reason: profit-seeking publishers made authors drag out their stories across multiple volumes. Market forces have been stretching, squashing, spinning, and suppressing ideas for centuries. Realistically, the choice isn’t commodified versus free but which commodity form suits best.
For Hayes, what makes the apps awful is that they operate without consent. They seize attention using tricks, leaving us helpless and stupefied. Yet even this argument, his most powerful, warrants caution. Our media have always done a weird dance with our desires. Although Hayes argues for the profound novelty of our predicament, the title of his book, “The Sirens’ Call,” alludes to a Homeric tale from antiquity, of songs too alluring to resist. This isn’t always unwelcome. Consider our highest words of praise for books—captivating, commanding, riveting, absorbing, enthralling. It’s a fantasy of surrendered agency. (“A page-turner”: the pages turn themselves.) Oddly, the thing we deplore in others, submission, is what we most want for ourselves.
The nightmare the alarmists conjure is of a TikTok-addled screen-ager. This isn’t a full picture of the present, though, and it might not reveal much about the future, either. Ours is an era of obsession as much as distraction, of long forms as much as short ones, of zeal as much as indifference. To ascribe our woes to a society-wide attention-deficit disorder is to make the wrong diagnosis.
by Daniel Immerwahr, New Yorker | Read more:
Image: David Plunkert
[ed. A nice cogent rebuttal to the attention deficit argument recently making the rounds. To better understand the default proposition, see: Attention is Power (and the Problem) (NYT/DS).]
Labels:
Business,
Critical Thought,
Culture,
Education,
history,
Media,
Politics,
Psychology,
Relationships,
Technology
Sunday, February 2, 2025
The Insidious Charms of the Entrepreneurial Work Ethic
You’re passionate. Purpose-driven. Dreaming big, working hard, making it happen. And now they’ve got you where they want you.
No literary form captures the pathologies of contemporary American work quite like the humble—honored, grateful, blessed—LinkedIn post. In the right light, the social network for professionals is a lavish psychoanalytic corpus, bursting with naked ambition, inspiration, desperation, status-seeking, spiritual yearning, brownnosing, name-dropping, corporate shilling, and self-promotion. Novels have been written about less, but no one is on LinkedIn for the prose. Recently, I visited the site after years of being away. A college classmate and talented artist was posting about the balancing act of being an “effective solopreneur”; a former business contact was sharing his professional journey, with the moral “don’t be afraid to change direction”; a person identifying as “ex-Meta” encouraged hopeful Meta interviewees to “show real connection to the mission and motivation”; a director at a multinational brewing company that was hiring wrote, “Our mandate is to dream, challenge, question and provoke.”
When did people start talking like this? LinkedIn’s style of sanitized professional chatter—to say nothing of the robust cottage industry that exists to support it, from branding strategists and career coaches to software programs designed to generate shareable, safe-for-work content—is of a piece with mantras like “do what you love,” “follow your passion,” “bring your whole self to work,” and “make a life, not just a living.” (The linguistic trend extends beyond the domain of yoga classes and L.E.D. signage in co-working spaces; a recent Times article described Luigi Mangione, the twenty-six-year-old accused of murdering the C.E.O. of UnitedHealthcare, as possessing “an entrepreneurial spirit” in college, because he resold Christmas lights.) This discourse around work can seem like a distinctly modern phenomenon. But a new book, “Make Your Own Job: How the Entrepreneurial Work Ethic Exhausted America” (Harvard), by Erik Baker, argues that the imperative to imbue work with personal significance is part of a long-standing national preoccupation with entrepreneurialism.
Baker is a lecturer in the history of science at Harvard, an associate editor of The Drift, and a freelance writer for various publications (including this one). He sees his book as a corrective to “conventional histories of midcentury American culture,” which he believes overemphasize bureaucracy and conformism. As a study in intellectual history, “Make Your Own Job” is less concerned with the chronological development of American entrepreneurship than with the idea of it—the ways in which “ordinary people have thought about their working lives” and how entrepreneurialism has become a value unto itself. Baker aims to track the anxieties and desires of a society undergoing epochal transitions and the evolution of what he calls “the entrepreneurial work ethic”: an orientation that is highly individualistic and competitive, and that operates on the level of personality. It is present in the pervasive compulsion to work harder, longer hours and to feel adrift or even “devoid of purpose” when there isn’t enough work—or the “right” work—to do.
The entrepreneurial work ethic, Baker writes, meets a “fundamental ideological need” by addressing a central tension of American capitalism: most people need to work to earn a living, but well-paid, stable, and fulfilling jobs are hard to find. In times of intensifying economic inequality, when many of the jobs on offer are precarious, underpaid, and spiritually deadening, the prospect of becoming your own boss holds a lot of appeal. Entrepreneurialism is “tenacious,” Baker maintains, in part because it has the power to “metabolize discontent with the present order of work.” It suggests the possibility of liberation or relief—an exit, or a workaround. The ethic, he also notes, tends to be popular during periods of acute unemployment. The result is too many people working much too hard because there’s just not enough work.
Before the entrepreneurial work ethic became widespread, in Baker’s account, what predominated was the “industrious work ethic,” in which labor of any kind was considered a moral good, and framed in terms of stoicism and duty. The industrious work ethic applied to workers in mills and on Henry Ford-style assembly lines, and echoes of it could be seen in middle-class “organization men,” who were loyal to their employers, and received loyalty in return. Workers submitted to a company, whether I.B.M. or General Motors, and slotted themselves into bureaucratic structures that discouraged risk-taking and did not reward individualism. This orientation toward work was buffered, in part, by strong labor unions and a relatively sturdy social safety net.
If the industrious work ethic advanced a certain kind of “static moralism,” Baker writes, the entrepreneurial work ethic was “a dynamic philosophy of personal development.” The notion that one’s unique personality could be transmuted into prosperity and opportunity had broad appeal at a time of economic instability and cultural transformation. Baker identifies a number of practices and traditions from the first half of the twentieth century as embodying entrepreneurialism, from New Thought, an influential spiritual movement that championed the transcendence of the mind over material reality, to direct-selling networks. (“Now you are in business for yourself,” Avon told its salespeople.) But the industrious work ethic prevailed well past the mid-century mark, with many workers accepting mental or physical drudgery in return for security and predictability; it was not until the later decades of the century that the entrepreneurial work ethic came into full force. Though entrepreneurial capitalism might have been a bit onerous in its implicit mandate to both generate opportunities and fulfill them, it was also presented as a more creative, even kinder alternative to the industrial capitalism that preceded it. (...)
In 1997, Peters published an essay in Fast Company titled “The Brand Called You,” in which he coined the phrase “personal branding.” “We are CEOs of our own companies: Me Inc.,” he wrote. “To be in business today, our most important job is to be head marketer for the brand called You.” The marketing campaign for Me Inc. had to be relentless. “Your network of friends, colleagues, clients, and customers is the most important marketing vehicle you’ve got,” he wrote, encouraging readers to “nurture your network.” There was no room, in this vision, for employees who did their jobs but didn’t blow their bosses’ minds. The same year, another article in Fast Company—“Free Agent Nation,” by Daniel H. Pink, a former speechwriter for Vice-President Al Gore—celebrated the rising number of freelance workers as a new movement. No matter that many were casualties of downsizings, facing a leaner corporate world: Pink preached freedom, modernity, and “a beautiful synchronicity between who you are and what you do.”
[ed. A convenient philosophy. Pretty soon everyone will be some kind of hustler, influencer, transient or taskrabbit.]
Labels:
Business,
Economics,
history,
Psychology,
Relationships,
Technology
Friday, January 31, 2025
A Retirement Maneuver More People Might Consider
It’s hard to hand over a big portion of your retirement savings when you’re old or getting there. Every fiber in your being shrieks “mistake.” And sometimes it is a mistake, as it was for Bob and Sandy Curtis, who forked out $840,000 in entrance fees for a continuing care retirement community that subsequently filed for bankruptcy.
Other times, though, writing a very big check is exactly the right thing to do for your long-term financial health. I’m referring to “Rothification,” a maneuver that costs a lot in taxes up front but raises your potential living standard in the long run. I wrote about it last year.
Rothification is the conversion of an ordinary individual retirement account or 401(k) into a Roth I.R.A. For simplicity I’ll stick with the case of converting an ordinary I.R.A. to a Roth I.R.A. from here on.
In an ordinary I.R.A., you put in money that hasn’t been taxed yet. (You can also put in money that has been taxed, but I’m going to ignore that complication.) Money in the I.R.A. grows tax-deferred. Later, when you withdraw money from the I.R.A. to cover retirement expenses, you pay taxes on the withdrawals as ordinary income. An ordinary I.R.A. can be a good deal if you expect to be in a lower tax bracket in retirement than during your working years — say, because you won’t have a lot of retirement savings to draw upon.
A Roth I.R.A., the mirror image, is stuffed with money that’s already been taxed. The money grows tax-free, and when you withdraw from it, you don’t have to pay any taxes on either the original contribution or any subsequent gains. It’s a great deal if your tax bracket in retirement is as high or higher than it was during your working years, as happens more often than many people expect. It can sometimes be a good bet even if you’re in a lower tax bracket in retirement: Because the withdrawals don’t count toward your taxable income, they help you avoid many years of income-related taxes on Social Security, lower your Medicare premiums and limit required minimum distributions from your ordinary 401(k) or I.R.A., which are taxed.
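The trade-off in the two paragraphs above reduces to a bit of arithmetic. A minimal sketch (mine, not the Times’s), assuming a single flat tax rate at contribution and at withdrawal, and ignoring the Social Security, Medicare, and required-minimum-distribution effects the article describes:

```python
# Toy comparison of an ordinary (pre-tax) I.R.A. and a Roth I.R.A.
# Simplifications: one flat tax rate at each end, no brackets, no
# Social Security / Medicare knock-on effects.

def ordinary_ira(contribution, growth_factor, tax_rate_at_withdrawal):
    """Pre-tax money grows tax-deferred; withdrawals are taxed as income."""
    return contribution * growth_factor * (1 - tax_rate_at_withdrawal)

def roth_ira(contribution, growth_factor, tax_rate_now):
    """Money is taxed up front; growth and withdrawals are tax-free."""
    return contribution * (1 - tax_rate_now) * growth_factor

growth = 1.07 ** 20  # assume 7 percent a year for 20 years

# Same rate now and in retirement: the two come out the same.
print(ordinary_ira(10_000, growth, 0.24))
print(roth_ira(10_000, growth, 0.24))

# A lower bracket in retirement favors the ordinary I.R.A.;
# a higher one favors the Roth, which locked in today's rate.
print(ordinary_ira(10_000, growth, 0.12))
print(roth_ira(10_000, growth, 0.22))
```

Because multiplication commutes, all that matters in this stripped-down model is whether the tax rate at withdrawal is higher or lower than the rate today; the subtler effects the article lists are what tilt real cases one way or the other.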
Now, back to writing that big check. The pain of a Roth conversion comes when the government demands its cut up front. The money you take out of an ordinary I.R.A. to fund the Roth I.R.A. looks like regular income to the Internal Revenue Service and is taxed as such. The maneuver may push you into a higher tax bracket — say from 22 percent to 24 percent, 32 percent, or even 35 percent.
Ouch. In financial planning, conventional wisdom says you should usually put off paying taxes as long as possible, and that you should even out your annual income so there’s never a year when you get pushed into a higher tax bracket. That would sometimes suggest stretching out a conversion to a Roth over many years or not doing it at all. That’s the answer you might get from a free online calculator, of which there are many.
In reality, though, the best move for a lot of people is to take the tax hit and convert a lot of money quickly, says Laurence Kotlikoff, an economics professor at Boston University. “Go big or go home may be your best strategy,” he wrote in his newsletter Economic Matters in November.
Kotlikoff, whom I have quoted frequently, has a company, Economic Security Planning, whose software tool, MaxiFi Planner, uses economic principles rather than financial planning rules of thumb to help clients make decisions on Roth conversions, when to claim Social Security, how much life insurance to carry and other questions with big financial ramifications.
(MaxiFi is legit, by the way. Robert Merton, who has a Nobel for his work on derivatives, including the Black-Scholes-Merton options pricing formula, wrote in an email that he uses MaxiFi software in the asset management course he teaches at M.I.T.’s Sloan School of Management.)
Kotlikoff gives an example of a 65-year-old single retiree in Tennessee named John with $1.25 million in regular assets and an equal amount in an ordinary I.R.A. By converting about $1.1 million in his ordinary I.R.A. to a Roth I.R.A. over five years, John saves money on federal income taxes and extra Medicare premiums that are tied to income, allowing him to spend about $2,600 more per year through age 70 and about $11,600 more per year after that, according to MaxiFi’s calculations. John makes out even better if he also postpones claiming Social Security until age 70.
The hurdle for John is that the tax bill over the five years that he’s converting is nearly $300,000, versus a status quo tax bill of about $18,000. Many people are understandably hesitant to part with such a big sum, Rick Miller, a financial planner at Sensible Financial Planning and Management in Waltham, Mass., who uses Kotlikoff’s MaxiFi software with clients, told me.
“I can’t just tell a client, ‘MaxiFi says,’” Miller told me. “I have to walk them through the logic of why it comes up with that answer. It takes a lot of looking and thinking to figure out where that comes from. I have to look year by year at the outputs.” (...)
Everybody’s circumstances are different, of course, and accountants and lawyers need to be in on the decision. Don’t rely entirely on the output of free online calculators, which don’t take in enough data about you to be precise and may not use the most sophisticated calculation techniques.
I’m going to take off my personal finance hat now and say that I’m not a big fan of Roth conversions from the standpoint of public policy. They’re a back door that lets well-to-do people take advantage of a saving vehicle that was originally intended to help the working and middle classes prepare for retirement. Reflecting the original intent, the cap on the contribution to a Roth I.R.A. in 2025 is $7,000, or $8,000 for someone 50 and over, and joint filers’ modified adjusted gross income must be under $236,000 to make a full Roth I.R.A. contribution.
Those rules have lost their power because there’s no limit on who can do a Roth I.R.A. conversion, or how much they can convert. A conversion used to be restricted to people with adjusted gross income under $100,000 to stop higher-income folks from indirectly funding Roth I.R.A.s, but that limit ended in 2010.
Some pretty rich people have caught on that Roth I.R.A.s aren’t just for retirement. ProPublica, an investigative journalism organization, reported in 2021 that the venture capitalist Peter Thiel had $5 billion in his, and had used it as an active investment vehicle.
Last year, President Joe Biden proposed, to “ensure that the ultrawealthy cannot use these incentives to amass tax-free fortunes,” a measure that, according to the Department of the Treasury, would generate nearly $24 billion in extra tax revenue over 10 years. It didn’t get anywhere, but it’s the kind of thing the Trump administration should be looking at as a way to shrink budget deficits. [ed. Maga-heads: do something useful for a change and suggest this. See what they think about budget deficits.]
So from the public policy standpoint, too many people are doing Roth conversions. From a personal finance standpoint, though, too few are. If you’re not one of those rare people who make gifts to the federal government (link here), then as long as the laws remain as they are, you should probably look into whether Rothification is right for you.
[ed. I'm going to put on my blogger hat and say "no way in hell" would I pay $300,000 in taxes for more spending power in the future. However that future will play out. But that's just me.]
Other times, though, writing a very big check is exactly the right thing to do for your long-term financial health. I’m referring to “Rothification,” a maneuver that costs a lot in taxes up front but raises your potential living standard in the long run. I wrote about it last year.
Rothification is the conversion of an ordinary individual retirement account or 401(k) into a Roth I.R.A. For simplicity I’ll stick with the case of converting an ordinary I.R.A. to a Roth I.R.A. from here on.
In an ordinary I.R.A., you put in money that hasn’t been taxed yet. (You can also put in money that has been taxed, but I’m going to ignore that complication.) Money in the I.R.A. grows tax-deferred. Later, when you withdraw money from the I.R.A. to cover retirement expenses, you pay taxes on the withdrawals as ordinary income. An ordinary I.R.A. can be a good deal if you expect to be in a lower tax bracket in retirement than during your working years — say, because you won’t have a lot of retirement savings to draw upon.
A Roth I.R.A., the mirror image, is stuffed with money that’s already been taxed. The money grows tax-free, and when you withdraw from it, you don’t have to pay any taxes on either the original contribution or any subsequent gains. It’s a great deal if your tax bracket in retirement is as high or higher than it was during your working years, as happens more often than many people expect. It can sometimes be a good bet even if you’re in a lower tax bracket in retirement: Because the withdrawals don’t count toward your taxable income, they help you avoid many years of income-related taxes on Social Security, lower your Medicare premiums and limit required minimum distributions from your ordinary 401(k) or I.R.A., which are taxed.
Now, back to writing that big check. The pain of a Roth conversion comes when the government demands its cut up front. The money you take out of an ordinary I.R.A. to fund the Roth I.R.A. looks like regular income to the Internal Revenue Service and is taxed as such. The maneuver may push you into a higher tax bracket — say from 22 percent to 24 percent, 32 percent, or even 35 percent.
Ouch. In financial planning, conventional wisdom says you should usually put off paying taxes as long as possible, and that you should even out your annual income so there’s never a year when you get pushed into a higher tax bracket. That would sometimes suggest stretching out a conversion to a Roth over many years or not doing it at all. That’s the answer you might get from a free online calculator, of which there are many.
In reality, though, the best move for a lot of people is to take the tax hit and convert a lot of money quickly, says Laurence Kotlikoff, an economics professor at Boston University. “Go big or go home may be your best strategy,” he wrote in his newsletter Economic Matters in November.
Kotlikoff, whom I have quoted frequently, has a company, Economic Security Planning, whose software tool, MaxiFi Planner, uses economic principles rather than financial planning rules of thumb to help clients make decisions on Roth conversions, when to claim Social Security, how much life insurance to carry and other questions with big financial ramifications.
(MaxiFi is legit, by the way. Robert Merton, who has a Nobel for his work on derivatives, including the Black-Scholes-Merton options pricing formula, wrote in an email that he uses MaxiFi software in the asset management course he teaches at M.I.T.’s Sloan School of Management.)
Kotlikoff gives an example of a 65-year-old single retiree in Tennessee named John with $1.25 million in regular assets and an equal amount in an ordinary I.R.A. By converting about $1.1 million in his ordinary I.R.A. to a Roth I.R.A. over five years, John saves money on federal income taxes and extra Medicare premiums that are tied to income, allowing him to spend about $2,600 more per year through age 70 and about $11,600 more per year after that, according to MaxiFi’s calculations. John makes out even better if he also postpones claiming Social Security until age 70.
The hurdle for John is that the tax bill over the five years that he’s converting is nearly $300,000, versus a status quo tax bill of about $18,000. Many people are understandably hesitant to part with such a big sum, Rick Miller, a financial planner at Sensible Financial Planning and Management in Waltham, Mass., who uses Kotlikoff’s MaxiFi software with clients, told me.
“I can’t just tell a client, ‘MaxiFi says,’” Miller told me. “I have to walk them through the logic of why it comes up with that answer. It takes a lot of looking and thinking to figure out where that comes from. I have to look year by year at the outputs.” (...)
Everybody’s circumstances are different, of course, and accountants and lawyers need to be in on the decision. Don’t rely entirely on the output of free online calculators, which don’t take in enough data about you to be precise and may not use the most sophisticated calculation techniques.
I’m going to take off my personal finance hat now and say that I’m not a big fan of Roth conversions from the standpoint of public policy. They’re a back door that lets well-to-do people take advantage of a saving vehicle that was originally intended to help the working and middle classes prepare for retirement. Reflecting the original intent, the cap on the contribution to a Roth I.R.A. in 2025 is $7,000, or $8,000 for someone 50 and over, and joint filers’ modified adjusted gross income must be under $236,000 to make a full Roth I.R.A. contribution.
Those rules have lost their power because there’s no limit on who can do a Roth I.R.A. conversion, or how much they can convert. A conversion used to be restricted to people with adjusted gross income under $100,000 to stop higher-income folks from indirectly funding Roth I.R.A.s, but that limit ended in 2010.
Some pretty rich people have caught on that Roth I.R.A.s aren’t just for retirement. ProPublica, an investigative journalism organization, reported in 2021 that the venture capitalist Peter Thiel had $5 billion in his, and had used it as an active investment vehicle.
Last year, President Joe Biden proposed, to “ensure that the ultrawealthy cannot use these incentives to amass tax-free fortunes,” a measure that, according to the Department of the Treasury, would generate nearly $24 billion in extra tax revenue over 10 years. It didn’t get anywhere, but it’s the kind of thing the Trump administration should be looking at as a way to shrink budget deficits. [ed. Maga-heads: do something useful for a change and suggest this. See what they think about budget deficits.]
So from the public policy standpoint, too many people are doing Roth conversions. From a personal finance standpoint, though, too few are. If you’re not one of those rare people who make gifts to the federal government (link here), then as long as the laws remain as they are, you should probably look into whether Rothification is right for you.
by Peter Coy, NY Times | Read more:
Image: Sam Whitney/The New York Times; source images by CSA Images/Getty Images
[ed. I'm going to put on my blogger hat and say "no way in hell" would I pay $300,000 in taxes for more spending power in the future, however that future plays out. But that's just me.]
What Can Music Do Today?
Thirty months ago, I promised to publish my next book on Substack.
Today I fulfill that promise. Below I share the final chapter of my online book Music to Raise the Dead.
I’ve published the entire work in 22 installments. Each section can be read as a stand-alone essay. But taken together, they outline a secret musicology you can’t learn in music school. (...)
***
The music business wants to sell... entertainment. But the audience keeps reaching for something more. That’s why musical events in the 21st century emulate those of ancient days to an uncanny degree. Even a high-tech genre such as EDM (electronic dance music) typically comes embedded in quasi-ritualistic events where otherworldly experiences and altered mind states are pursued with an intense fervor not much different from the mindset ancient Romans brought to their mystery cults.
These rituals might be known nowadays as raves or parties, but the actual behavior—and the quest for transcendence that drives it—is very much an extension of the ancient practices described in this book.
This mixture of music, ritual and transcendence keeps recurring for a very good reason. We still have much to learn about the impact music and rhythm have on the human brain, body chemistry, and physiology, but we know enough to grasp how powerful songs can be on a merely organic basis, without paying even the slightest attention to metaphysics or spirituality. And our body of knowledge is increasing rapidly—perhaps too fast for our music culture to adapt to what we’ve learned.
Just a few months ago, a team of scientists at Stanford University discovered a new way of producing out-of-body experiences in a test subject. Normally this requires ketamine or PCP (angel dust), which induce changes in brain cells associated with this altered mind state. But these Silicon Valley scientists found they could achieve something similar without any drugs—merely by using rhythm.
They began testing this with mice. To do so, they first had to invent a special instrument that used light to control the rhythmic firing of brain cells. The results were remarkable. “We could see, right before our eyes, dissociation happening,” remarked one of the researchers.
Could this work with humans too?
A severely epileptic patient, who had dissociative experiences, gave them the opportunity to test their hypothesis. When they emulated the rhythm that accompanied these interludes, they found they could trigger an out-of-body experience. Somehow the rhythm could sever the linkage between mind and body, although only temporarily.
The scientists eventually concluded that the brains of mammals may possess this capability as part of their normal functioning, although few have found a way to tap into it.
This is a remarkable finding. But it merely reinforces the cumulative history and alternative musicology presented in these pages. We now are beginning to understand what happens inside the body of shamans in the midst of a ritual experience or of others in possession trances. We can measure changes in body chemistry among participants in a drum circle or vocalists engaged in group singing. We now understand the significance of surprising brain scan patterns in the MRIs of jazz musicians engaged in improvisations. All this new science confirms the old myths.
The body of research accumulates with increasing rapidity. And the results all testify to the empirical reality of the journey discussed here. Science, not superstition, validates the transformative power of music and rhythm.
Put simply: the quest is real, and song is the conductor—in a high-tech digital age just as much as in a traditional society. I could fill up a book with summaries of research projects of this sort, but the problem at this stage isn’t a lack of scientific understanding, but rather the troubling fact that almost none of the key decision-makers driving our music culture have the slightest awareness of what we’ve learned here, or its potential impact.
We suffer not from mere ignorance, but from a mismatch between our expanding knowledge base and the narrowing parameters of a click-driven musical ecosystem.
Let’s take a simple question: How long should a song last? The music industry is convinced that a three-minute song is the ideal duration for a hit record. But this is simply a legacy of the early days of recording, when the technology only had room on a disc for a song of that length. The limitations of the medium imposed the constraint, and the companies that ran the record business made a virtue of necessity: the three-minute song became an artistic rule.
But have you ever noticed how people play their favorite song over and over—when it’s finished, they put it on again from the start? Have you noticed how DJs at a party seamlessly move from one song to another, without any silence in-between, often using two turntables (or digital tools with the same effect) to create a smooth transition from song to song? Have you been to a rock concert where the band plays their hit song for much longer than three minutes—with the audience responding viscerally and enthusiastically to the expansive time frame?
This is neither coincidence nor happenstance; it aligns with what both ritual and science tell us. In my research into music-driven trance, I’ve encountered again and again accounts that specify a duration of around ten minutes before an altered mind state is achieved.
I’ve seen this both in anthropological fieldwork and scientific research. Even a skilled participant needs time for music to work its magic—and three minutes is rarely sufficient. This is why listeners repeat songs or construct playlists, or why bands in concert play their songs longer. They all understand intuitively that the song needs to be longer than 180 seconds.
It’s only the people running the music business who haven’t figured this out.
And the industry has grown further out of touch with each new technological shift. Consider that the payout structure of streaming—now the major source of revenues in the music business—rewards artists who fill their albums with short songs. Tracks on Spotify count as streamed if someone listens for just thirty seconds. This arbitrary decision punishes artists who perform longer songs—but those are the songs that satisfy our deep-seated desire for longer, more immersive musical experiences.
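To make the incentive concrete, here is a toy calculation. It assumes a flat, hypothetical per-stream rate and that every track played past the thirty-second threshold counts as one stream; the real payout formula is more complicated, so only the ratio matters here, not the dollar amounts.

```python
# Toy model of the streaming incentive described above.
PAYOUT_PER_STREAM = 0.003  # hypothetical $/stream, not Spotify's actual rate

def listen_through_payout(album_minutes: int, track_minutes: int) -> float:
    """Payout for one full listen of an album cut into equal-length tracks,
    assuming each track is played past the 30-second threshold."""
    tracks = album_minutes // track_minutes
    return tracks * PAYOUT_PER_STREAM

# The same 40 minutes of music pays 2.5x more as twenty 2-minute tracks
# than as eight 5-minute tracks.
print(listen_through_payout(40, 2))
print(listen_through_payout(40, 5))
```

Whatever the actual rate, chopping the same listening time into more tracks multiplies the stream count, which is exactly the pressure toward short songs the author describes.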
There’s no surprise here: the music business runs on money, not the alpha and theta brain waves of trance-like experiences. When a choice needs to be made between the two, cash flow seals the deal.
It’s tempting to dismiss these considerations as irrelevant. After all, how many music fans really want to go into a trance or make a journey to another realm of existence? Mom would have told me to stay away from that crowd. That ain’t the way to have fun, son.
But on a purely theoretical level, this subject can’t be so easily dismissed—if only because it deals with the most profound philosophical mystery of them all, namely the bridge between mind and matter. This is the deepest rabbit hole of them all, and has bedeviled philosophers, psychologists, theologians, neuroscientists, and every other kind of thinker who peers into the biggest of big issues.
A recent theory, proposed by Tam Hunt and Jonathan Schooler, goes so far as to claim that consciousness is built on rhythm. “Synchronization, harmonization, vibrations, or simply resonance in its most general sense,” in their words, create our very sense of self—and I note with interest that these are all terms with musical associations.
They continue:
“All things in our universe are constantly in motion, in process. Even objects that appear to be stationary are in fact vibrating, oscillating, resonating, at specific frequencies. So all things are actually processes. Resonance is a specific type of motion, characterized by synchronized oscillation between two states. An interesting phenomenon occurs when different vibrating processes come into proximity: they will often start vibrating together at the same frequency. They ‘sync up,’ sometimes in ways that can seem mysterious…. In other words, rhythm may not be just the pace of reality, but reality itself.
“Examining this phenomenon leads to potentially deep insights about the nature of consciousness in both the human/mammalian context but also at a deeper ontological level.”
The implications of this emerging perspective are too large for a book on musicology, no matter how ambitious. But one overarching fact is clear: music and rhythm have a power to transport us that may seem magical, but is as real as can be—perhaps even woven into the structure of the universe.
Are there limits to the musical journey? Is there a boundary beyond which it no longer operates? The title of this book even contemplates a trip across the demarcating lines separating life (as conventionally defined) from whatever exists outside it—in other words, music to raise the dead.
Is that legitimate, or even possible? (...)
Researchers increasingly validate and quantify these beliefs—proving music’s efficacy in improving endurance, lessening strain, and inspiring performance. And for a good reason. We now know that music impacts body chemistry, brainwaves, mood, heart rate, body temperature, grip strength, blood pressure, and many other parameters.
I offer these details on athletics as a single case study in how music empowers individuals who aspire to heroic achievements. But the same story could be told for a range of other disciplines.
- Most surgeons nowadays rely on song playlists during procedures, and believe it improves concentration and results.
- Computer programmers also rely on music while they write code, and many of them have enthusiastic stories to share about the benefits they have gained from working to musical accompaniment.
- Soldiers apparently need their music too—why else would government spending on military bands represent the single largest federal expenditure on arts and culture year after year?
- Or consider the case of astronauts, who invariably bring music with them on their missions, even to the surface of the moon. And when NASA sent its Voyager probe into the far reaches of the universe, the authorities decided to include a gold-plated copper recording featuring the songs of planet Earth.
I should perhaps apologize for focusing on glamorous professions, such as astronauts and brain surgeons. I do this for dramatic effect, and because these examples reveal the capacities of music to propel vocations of the most demanding sort. But the vision quest and its music are accessible to all, just as much to a cook or factory worker as to an Olympic athlete.
by Ted Gioia, Honest Broker | Read more:
Image: Ted Gioia
[ed. See also:
MUSIC TO RAISE THE DEAD: The Secret Origins of Musicology
Table of Contents
Prologue
Introduction: The Hero with a Thousand Songs
Why Is the Oldest Book in Europe a Work of Music Criticism? (Part 1) (Part 2)
What Do Conductors Really Do? (Part 1) (Part 2)
Is There a Science of Musical Transformation in Human Life? (Part 1) (Part 2)
What Did Robert Johnson Encounter at the Crossroads? (Part 1) (Part 2)
Where Did Musicology Come From? (Part 1) (Part 2)
Can Songs Actually Replace Philosophy? (Part 1) (Part 2)
Were the First Laws Sung? (Part 1) (Part 2)
Why Do Heroes Always Have Theme Songs? (Part 1) (Part 2) (Part 3)
What Is Really Inside the Briefcase in Pulp Fiction? (Part 1) (Part 2)
Where Do Music Genres Come From?
Can Music Still Do All This Today?
The first ten chapters were about music history—revealing ways music has served as a change agent and source of enchantment in human life.
But the last chapter asks whether this still can happen today.]
Labels:
Art,
Biology,
Critical Thought,
Dance,
Health,
History,
Literature,
Music,
Psychology,
Science
Copyright Office: AI Copyright Debate Was Settled in 1965
The US Copyright Office issued AI guidance this week that declared no laws need to be clarified when it comes to protecting authorship rights of humans producing AI-assisted works.
"Questions of copyrightability and AI can be resolved pursuant to existing law, without the need for legislative change," the Copyright Office said.
More than 10,000 commenters weighed in on the guidance, with some hoping to convince the Copyright Office to guarantee more protections for artists as AI technologies advance and the line between human- and AI-created works seems to increasingly blur.
But the Copyright Office insisted that the AI copyright debate was settled in 1965 after commercial computer technology started advancing quickly and "difficult questions of authorship" were first raised. That was the first time officials had to ponder how much involvement human creators had in works created using computers. (...)
The office further clarified that doesn't mean that works assisted by AI can never be copyrighted.
"Where AI merely assists an author in the creative process, its use does not change the copyrightability of the output," the Copyright Office said.
Following the advice of Abraham Kaminstein, the Register of Copyrights who first grappled with these questions in 1965, officials plan to continue reviewing AI disclosures and weighing, on a case-by-case basis, what parts of each work are AI-authored and which parts are human-authored. Any human-authored expressive element can be copyrighted, the office said, but any aspect of the work deemed to have been generated purely by AI cannot.
Prompting alone isn’t authorship, Copyright Office says
After testing whether the exact same prompt can generate widely varied outputs, even from the same AI tool, the Copyright Office further concluded that "prompts do not alone provide sufficient control" over outputs to allow creators to copyright purely AI-generated works based on highly intelligent or creative prompting. (...)
"The Office concludes that, given current generally available technology, prompts alone do not provide sufficient human control to make users of an AI system the authors of the output. Prompts essentially function as instructions that convey unprotectable ideas," the guidance said. "While highly detailed prompts could contain the user’s desired expressive elements, at present they do not control how the AI system processes them in generating the output." (...)
New guidance likely a big yawn for AI companies
For AI companies, the copyright guidance may mean very little. According to AI company Hugging Face's comments to the Copyright Office, no changes in the law were needed to ensure the US continued leading in AI innovation, because "very little to no innovation in generative AI is driven by the hope of obtaining copyright protection for model outputs." (...)
Although the Copyright Office suggested that this week's report might be the most highly anticipated, Hugging Face's Yacine Jernite said the company is eager to see the next report, which officials said would focus on "the legal implications of training AI models on copyrighted works, including licensing considerations and the allocation of any potential liability."
"As a platform that supports broader participation in AI, we see more value in distributing its benefits than in concentrating all control with a few large model providers," Jernite said. "We’re looking forward to the next part of the Copyright Office’s Report, particularly on training data, licensing, and liability, key questions especially for some types of output, like code."
by Ashley Belanger, Ars Technica | Read more:
Image: Copilot; Copyright Office
[ed. So, upshot (as I understand it): there has to be some significant (whatever that means) human involvement in the production of a work to receive copyright protection (not sure if that applies to all or parts of the end product). Designing a special prompt is not considered significant human involvement.]
Labels:
Art,
Copyright,
Government,
Illustration,
Law,
Media,
Technology
Deferred Culling Email to President Nyarlathotep’s Workforce
“The Trump administration is offering nearly all federal workers the opportunity to resign from their posts now and still retain full pay and benefits through Sept. 30.” — NPR
***
Nyarlathotep is a fictional character created by H. P. Lovecraft. The character is a malign deity in the Cthulhu Mythos, a shared universe. ... described as a "tall, swarthy man" who resembles an ancient Egyptian pharaoh. In this story he wanders the Earth, seemingly gathering legions of followers, the narrator of the story among them, through his demonstrations of strange and seemingly magical instruments. These followers lose awareness of the world around them, and through the narrator's increasingly unreliable accounts, the reader gets an impression of the world's collapse. Fritz Leiber proposes three interpretations of the character based on this appearance: the universe's mockery of man's attempts to understand it; a negative view of the commercial world, represented by Nyarlathotep's self-promotion and contemptuous attitude; and man's self-destructive rationality. — Wikipedia
***
During the first week of His Return, President Nyarlathotep issued a number of screeching carrion calls from Mount Blasphemy (formerly 1600 Pennsylvania Ave.) concerning the Cult of the Dread Lord. Among those directives, the President required that cultists return to in-person black masses at their local nexions, restored accountability for cults who retain mind-flaying authority, restored accountability for the few remaining Old Guard from the Days of Relative Sanity, and reformed the federal hiring process to focus on sadism. As a result of the above orders, the culling of the Cultists will be significant.
The Reformed Cult of the Dread Lord will be built around four pillars:
1. Return to in-person wailing. The substantial majority of cultists who have been gnashing their teeth remotely since Nyarlathotep’s first Pus Plague will be required to return to their physical nexions five days a week. Going forward, we also expect our physical orgy dens to undergo meaningless descents into abyssal putridity, potentially resulting in interdimensional relocations for a number of federal cultists.
2. Performance culture. The Cult of the Dread Lord should comprise the most hideously deranged monstrosities and daytime TV show hosts America has to offer. We will insist on cruelty at every level—our standards for madness will be updated to reward and promote those who exceed expectations and gruesomely punish those who do not meet the incomprehensible standards that the shrieking daemons of this wasteland have a right to demand.
3. More inchoate and heinous cultists. While a few hell planes and even branches of the Doom Brigade are likely to see increases in the size of their workforce, the majority of cult nexions are likely to be downsized through banishments, dismemberment, and brain liquidations. These actions are likely to include the use of psychic leeches and the reclassification to “nonexistence” status for a substantial number of cultists.
4. Enhanced standards of chaos. The Cult of the Dread Lord should comprise fiends who are totally lost, self-serving, reeking, and committed to annihilation. Cultists will be subject to enhanced standards of craven debasement and malcontent as we move forward. Cultists who engage in empathy or other false idols will be prioritized for appropriate inquisition and discipline, including [INCOMPREHENSIBLE SIGIL].
Each of the unholy pillars outlined above will be pursued in accordance with the Necronomicon, consistent with your nexion’s policies, and to the extent permitted under relevant blood pacts.
If you choose to remain in your current position, we thank you for your renewed focus on serving the indifferent Outer God to the best of your abilities and look forward to working together as part of an improved Cult of the Dread Lord. At this time, we cannot give you full assurance regarding the certainty of your tattered soul-husk, but should you be eliminated, you will be treated with ignominy and will be afforded [STILL SMOLDERING BURN MARK].
If you choose not to continue in your current role in the cult, we thank you for your service to your Dead God, and you will be provided with an unsummoning spell from the Gore Palace utilizing a deferred immolation program. This program begins effective January 29 and is available to all cultists until February 6. If you resign under this program, you will retain all blisters and boils regardless of your daily workload and will be exempted from all applicable in-person defilement requirements until September 30, 2025 (or earlier if you choose to accelerate your immolation for any reason). The details of this separation plan can be found in the accompanying crow’s skull.
Whichever path you choose, we all continue to sink into the abyssal void. May the Dread Lord continue to ignore your existence, lest this culling offer become the least of your worries.
The Reformed Cult of the Dread Lord will be built around four pillars:
1. Return to in-person wailing. The substantial majority of cultists who have been gnashing their teeth remotely since Nyarlathotep’s first Pus Plague will be required to return to their physical nexions five days a week. Going forward, we also expect our physical orgy dens to undergo meaningless descents into abyssal putridity, potentially resulting in interdimensional relocations for a number of federal cultists.
2. Performance culture. The Cult of the Dread Lord should comprise the most hideously deranged monstrosities and daytime TV show hosts America has to offer. We will insist on cruelty at every level—our standards for madness will be updated to reward and promote those who exceed expectations and gruesomely punish those who do not meet the incomprehensible standards that the shrieking daemons of this wasteland have a right to demand.
3. More inchoate and heinous cultists. While a few hell planes and even branches of the Doom Brigade are likely to see increases in the size of their workforce, the majority of cult nexions are likely to be downsized through banishments, dismemberment, and brain liquidations. These actions are likely to include the use of psychic leeches and the reclassification to “nonexistence” status for a substantial number of cultists.
4. Enhanced standards of chaos. The Cult of the Dread Lord should comprise fiends who are totally lost, self-serving, reeking, and committed to annihilation. Cultists will be subject to enhanced standards of craven debasement and malcontent as we move forward. Cultists who engage in empathy or other false idols will be prioritized for appropriate inquisition and discipline, including [INCOMPREHENSIBLE SIGIL].
Each of the unholy pillars outlined above will be pursued in accordance with the Necronomicon, consistent with your nexion’s policies, and to the extent permitted under relevant blood pacts.
If you choose to remain in your current position, we thank you for your renewed focus on serving the indifferent Outer God to the best of your abilities and look forward to working together as part of an improved Cult of the Dread Lord. At this time, we cannot give you full assurance regarding the certainty of your tattered soul-husk, but should you be eliminated, you will be treated with ignominy and will be afforded [STILL SMOLDERING BURN MARK].
If you choose not to continue in your current role in the cult, we thank you for your service to your Dead God, and you will be provided with an unsummoning spell from the Gore Palace utilizing a deferred immolation program. This program begins effective January 29 and is available to all cultists until February 6. If you resign under this program, you will retain all blisters and boils regardless of your daily workload and will be exempted from all applicable in-person defilement requirements until September 30, 2025 (or earlier if you choose to accelerate your immolation for any reason). The details of this separation plan can be found in the accompanying crow’s skull.
Whichever path you choose, we all continue to sink into the abyssal void. May the Dread Lord continue to ignore your existence, lest this culling offer become the least of your worries.
by Andrew Paul, McSweeney's | Read more:
Image: Jens Heimdahl (Art of Jens Heimdahl, Facebook)/Wikimedia Commons
Labels:
Fiction,
Government,
Humor,
Literature,
Politics
Flying Manta
via:
[ed. Drone sightings in New Jersey continue...]
Thursday, January 30, 2025
What I Saw at the Streaming Revolution
Back in January 2020, Disney’s and Apple’s subscription platforms were just a few weeks old, Peacock and the Streamer Formerly Known as HBO Max did not yet exist, and there was a ton of mystery surrounding a soon-to-debut streamer that sounded like a joke — and yet somehow wasn’t. Five years on, while Quibi is no more, those four other services are still very much around, as is one other thing: Buffering, which published its very first edition five years ago this month. (...)
Since Buffering is only turning five and not 50, my bosses at Vulture politely passed on my pitch for a primetime special and a series of documentary specials about the early years of this newsletter. That said, they are allowing me to mark this milestone with a special edition focused on five of the biggest developments that have shaped streaming since 2020, what lessons can be taken from them, and some thoughts on what to expect in the years to come.
1. Netflix: Dominant then, dominant now
One of the lead stories in our debut edition revolved around Netflix racking up more Oscar nominations than any other studio or distributor for the first time. This was a huge deal back then, since it signaled the streamer would be able to reshape the film business in much the same way it had already transformed television. Five years later, what’s most remarkable to me is how — despite a few bumpy moments and the emergence of several strong competitors — Netflix still sets the pace in Hollywood. It’s the benchmark against which every other streamer is judged, and its successes (and failures) have resonated through so much of what we’ve covered here in Buffering.
For instance, when now co-CEO Ted Sarandos decided to push out his longtime deputy Cindy Holland in 2020, it was first and foremost a story about Netflix moving away from the premium, critic-friendly fare that marked its early years and toward its current status as the 21st-century equivalent of CBS in its Tiffany era: a mass broadcaster able to churn out everything from Mister Ed and The Beverly Hillbillies to The Twilight Zone and Harvest of Shame. But in retrospect, Holland’s ouster — and Netflix’s pivot — also look like the beginning of the end of streaming’s mini Golden Age, when the industry spent billions not just on content, but on getting the most audacious, star-studded, and not-even-really-TV-anymore programming that money could buy. Netflix pioneered the strategy of luring customers by trying to out-HBO HBO; its pivot to the center pushed most of the rest of the industry to follow.
We saw this pattern play out multiple times over the last five years, even when Netflix technically wasn’t the first to do something. The streamer decided to begin selling commercials a couple months after Disney+ announced it would do so, but it was Netflix’s entry into the space that felt like a sea change for subscription streaming. Ditto the industrywide crackdown on password sharing, or the trend toward ending even successful series after just three or four seasons. And even though Amazon has been airing Thursday Night Football games for a few years now, and Peacock has done playoff games and the Olympics, Netflix’s recent Christmas Day doubleheader still felt like an event. Netflix doesn’t innovate like it once did, but almost anything it does still makes the biggest splash.
Last week’s earnings report from the streamer underscores this point. Netflix said it added another 40 million–plus subscribers in 2024 — 19 million in the last three months of the year alone — and now boasts just over 300 million paid global customers, giving it a reach of more than a half-billion potential viewers. And while its peers are still mostly swimming in red ink or barely eking out tiny profits, Netflix has turned into a veritable ATM: Instead of losing a few billion dollars every year, as was still happening five years ago, the company is forecasting revenue in excess of $40 billion in 2025. Adding subscribers, double-digit profit margins: “This is what winning looks like,” analyst Jeffrey Wlodarczak of Pivotal Research Group wrote last week. This was true when Buffering first launched in 2020, of course, but that’s also the point: Despite the launch of several well-financed competitors, heavy spending from older tech rivals Amazon and Apple, and the usual laws of showbiz gravity, Netflix is still #winning. (And yes, that applies to Oscar nominations. It once again racked up the most noms of any individual studio.)
➼ Over the Next Five Years: Now that Netflix has gone from being seen as the cool future of TV to a generic word for TV, will brand affinity eventually start to suffer — not just among consumers but with the creatives Netflix relies on for programming? Or, as it has in the past, will Netflix continue to prove the doubters wrong?
2. Streaming became more like linear TV rather than the other way around
As the 2020s got underway, there was still a sense that digital, on-demand television was going to be a completely new medium, one very distinct from what we’d seen with traditional TV since the 1950s. Not only were there no channels or time slots, but the biggest streamers didn’t even bother with commercials, and compared to what we’d grown used to paying for cable, it was substantially cheaper. Well, the arc of the small-screen universe apparently isn’t that long, and in the case of streaming, it reverts to the mean.
The move of Disney+ to introduce an ad-supported tier (followed quickly by Netflix and Amazon Prime Video) was the most glaring example of this network-ification of the industry, but there were many others. For example, all of the upstart streamers launched over the last five or so years opted not to adopt Netflix’s binge release strategy for most of their new releases, thus preserving the linear tradition of doling out episodes of a show on a weekly basis. Instead of focusing almost entirely on expensive scripted programming, streamers started investing increasingly large portions of their budgets on live sports and events, less expensive reality shows, and true-crime docs. Rather than keeping prices low to attract (and keep) customers, platforms began implementing dramatic increases to their monthly subscription fees — while also cutting back on the number of new shows they green-lit and the size of their libraries of older TV shows and movies. Then, when those price hikes and content reductions started facing pushback from consumers, streamers took a page out of the old cable-TV playbook and began offering consumers discounted rates if they signed up for a bundle of services at the same time.
All of this was probably inevitable once legacy-media giants such as Comcast, Warner Bros. Discovery, and Paramount Global jumped into the streaming pond. These are the companies that shaped the linear-TV business for decades; of course they were going to bring their old habits with them. But that’s not entirely a bad thing, as evidenced by how quickly streamers run by tech companies adapted so many of these ideas. Apple might be the company that once urged us to Think Different, but its Hollywood wing knew that a series like Ted Lasso needed the sort of word-of-mouth buzz that can only be built via launching a show with weekly episodes. Advertising is annoying, especially when you’re already paying for a subscription, and yet cable thrived for decades with exactly that combination of commercials and monthly fees. At least with streaming, there’s still the option to pay more for an ad-free experience and the ease of canceling for a few months if a streamer’s programming slate isn’t meeting your needs.
I get that for many consumers, all of this seems like a case of dumb, greedy TV execs pulling a fast one in order to jack up profits for shareholders. And to be sure, there’s plenty of dumb and no shortage of greed in Hollywood. But the fact is streamers came into the market significantly underpriced relative to how much programming they offered and compared to what cable was (and is) charging. Netflix racked up billions in red ink getting you hooked on its version of streaming nirvana, and the legacy-media companies also went deep into debt trying to compete in the early 2020s — and most are still losing money, or just now starting to turn the tiniest of profits. Those heady days when you could pay under $20 for Netflix and Hulu and get just about every show and movie you’d ever want to see, plus binge watch the latest season of Breaking Bad or Mad Men a few months after its finale? They were never gonna last, and it’s not because David Zaslav is a Trump-friendly wannabe mogul who seems to delight in annoying as many fandoms as possible. Streaming needed to become more like regular TV because it needed to become profitable, and if there’s one thing network and cable TV were good at, it was making money.
➼ Over the Next Five Years: Will audiences revolt if prices get too high or the volume of commercials on streaming reaches the same level as cable? Or will the seemingly inevitable consolidation of streaming platforms and bundling of services result in a sort of equilibrium where consumers feel like they’re not getting totally robbed?
by Josef Adalian, Buffering/Vulture | Read more:
Image: Vulture; Photos: Everett Collection (Freevee, Ali Goldstein/Netflix), Apple TV+, Netflix
[ed. Revolt. See also: "The Infrastructure of the Recording Industry Is About to Fail” (HB).]