Thursday, July 18, 2024

Against Slop

Beyond the failure market in video games

It’s usually understood that time wasted is art wasted. Editing down to the lean core is often considered, in most mediums, the mark of quality (or, perhaps more accurately, and sometimes to a fault, of professionalism). Historically, that’s been part of the cultural stigma against video games: not only is wasted time a given, it’s an integral part of the experience. Interactivity shifts the responsibility for moving the plot forward from storyteller to audience. A linear, self-propulsive story squanders the medium’s artistic potential. In games, the tension comes not only from uncertainty about how the plot will resolve but also from whether it can resolve at all. When the player fails, the story ends without ever having reached a conclusion. The work necessarily has to reset in some manner. Which creates a minor paradox: How can time discarded not also be time wasted? Isn’t this all noise and nonsense for the purpose of keeping a couch potato on the couch so long they sprout roots?

Repetition is usually dramatic poison, and it’s no wonder such failing without finality is erased from representations of gaming in other media. Whether in 1982’s Tron, or the “First Person Shooter” episode of The X-Files, or Gerard Butler’s turn in 2009’s Gamer, the “if you die in the game, you die for real” trope is understandable. Games can be complex, multifaceted cultural objects and are more frequently being covered that way, yet the accusation that games are action for the sake of action with little consequence or meaning is uncomfortably accurate much of the time. The stigma stems from the early arcade days, when games primarily leveraged failure: every loss shook down the player for another quarter to send rattling into the machine. To beat the game and see it in its entirety took a mountain of coins, dedication, and skill—rendering play play, which is to say, largely divorced from narrative or the kinds of emotional experiences other art forms explored.

The pastime became less sport and more medium when home consoles and personal computers allowed games to experiment on a mass scale. Developers had no profit incentive to induce defeat once a cartridge or CD-ROM had been sold. Failure became instead the primary driver of tension within an emerging narrative, allowing story to flourish alongside gameplay from text adventures to action shooters. These stories were, save for those played with perfect skill, littered with loops. With every trap that fatally mangles a player in Tomb Raider, every police chase in Grand Theft Auto that ends in a brick wall instead of an escape, the narrative goes backward, the protagonist character’s story caught in a cyclical purgatory until the player-protagonist achieves a better result.

The sensation of breaking through those barriers is one of the most cathartic experiences that games offer, built on the interactivity that is so unique to gaming as a medium. Failure builds tension, which is then released with dramatic victory. But the accusation that these discarded loops are irrecoverable wastes of time still rings true, as modern game audiences have become comfortable consuming slop. In the past few years, games have trended toward becoming enormous blobs of content for the sake of content: an open world, a checklist of activities, a crafting system that turns environmental scrap into barely noticeable quality-of-life improvements. Ubisoft’s long-running Far Cry franchise has often exemplified this format, as have survival crafting games like Funcom’s Conan Exiles or Bethesda’s overloaded wasteland sim Fallout 76. Every activity in a Far Cry or its ilk is a template activity that only comes to a finite end after many interminable engagements: a base is conquered, just to have three more highlighted. Failure here is a momentary punishment that can feel indistinguishable from success, as neither produces a sense of meaning or progress. These failure states are little moments of inattention and clumsy gameplay that lead only to repeating the same task better this time. Then when you do play better mechanically, you are rewarded with the privilege of repeating the same task, a tiny bit more interesting this time because the enemies are a little tougher in the next valley over. Within games that play for dozens of hours but are largely defined by mechanical combat loops that can last just seconds, everything can boil down to the present-tense experience so detrimentally that it’s hard, at the end of those dozens of hours, to remember what you actually did.

There is no narrative weight to liberating the same base in Far Cry across multiple attempts, no sense of cumulative progression to repeatedly coming at the same open-world content from different angles. There is only a grim resignation to the sunk-cost fallacy that, if you’ve already invested so much time into the damn thing, you might as well bring it to some kind of resolution. Cranking up difficulty can make those present-tense moments more dramatic or stressful, but in the end it’s just adding more hours to the total playtime by adjusting the threshold for completing a given segment to a stricter standard. The game does not care if you succeed or fail, only that you spend time with it over its competitors.

As the industry creates limbos of success, the failure market itself has also mutated. See mobile gaming, a distorted echo of the coin-operated era, where players are behaviorally channeled to buy things like extra Poké Balls in Pokémon Go or “continue” permissions in Candy Crush and keep playing just a little longer. In 1980, failure cost players a cumulative $2.8 billion in quarters; in 2022, the mobile games market made $124 billion by creating artificial barriers and choke points within game mechanics, either making things more actively difficult or just slowing them down to prompt impulse spending.

In video games like the ubiquitous Fortnite or Blizzard’s recent Diablo 4, major releases often have “seasons” that heavily encourage cyclical spending. Every three months the game adds new content and asks the player to repeat the experience. The player exchanges between seven and twenty-five dollars to gild the stories they’ve already completed with extra objects, materials, and costumes—real money spent only for the privilege of sinking in the requisite time to acquire these virtual items, creating yet another loop of increasingly meaningless time usage. Fortnite came out in 2017. In 2023 the game generated, all by itself, a total of $4.4 billion in income. A sum larger than the GDP of some countries, generated in one year, six years after release, off the impulse not to look like a scrub with no money in front of your friends even if those friends are dressed as Peter Griffin and the Xenomorph from Alien.
---
“Live service” is used to describe these games that attempt to stay evergreen with seasonal systems and intermittent small content drops. These seasonal titles and mobile cash shops have created feedback loops and cyclical repetitions that, by design, do not resolve. In recent years, however, there has been a counterreaction that tries to integrate these consumerist tendencies in the pursuit of something greater. (...)

Besides its beautiful portrayal of a declining, melancholy world of perpetual autumn, what sets Elden Ring apart is its complexly layered difficulty. Elden Ring is quite eager to kill you, with a million ways to put a player down. But it is not meanly difficult, or insurmountably difficult. Most importantly, it is not difficult as part of a profit-seeking monetization loop. Instead, the failure states that are so often leveraged to extend playtime and coerce spending in most other games are here used as friction to build atmosphere. The constant starting again is exhausting, often stressful, sometimes infuriating. It is never meaningless, however: it confidently contradicts the worries of other mediums and the too-often-true accusations of slop with its deep understanding of how to create drama within any individual moment. To participate in its loops of death and rebirth as a player is to be fully within the Lands Between. Elden Ring presents a once-flourishing kingdom literally consumed by creeping nihilism and reflexive despair, which gives sympathetic resonance to the player’s determined and confident attempts to surmount these challenges. The most powerful or villainous enemies withdraw into themselves and let the world rot, while the weakest literally cower from the player, so exhausted by the idea of another painful death. Not the player, though: they exist in deliberate dramatic contrast to these characters by virtue of their own interactive participation with the world, making them the hero both as part of the text and as a meta-textual frame for the whole story.

When you persist in a world that trends downward, your own failures take on a defiant quality. The failure loop of the game incentivizes the player to loop again. This is where Elden Ring’s difficulty is particularly clever: because a player suffers no consequence beyond dropping experience points on the ground where they died, there is a hard limit on what the game can take away from them. There is an interactive choice and freedom even within these fail states, as you can abandon them or return again, fighting back through everything you faced before; this in turn creates an incredible carrot-and-stick effect that, should you gamble on reclaiming your hard-won gains, doubles the stakes. While on the surface you are repeating the same content, there is a tangible and meaningful sense of cumulative progress and tactical variation on every death.

Once you’ve spent those points on an upgrade, that’s yours for the rest of the game—a permanent token of your dedication. A player is only ever risking the immediate next step, which adds weight to the fantasy of the gameplay, but not so much actual consequence that failure would crush a player’s spirit to continue. Holding onto your advancements even after dying and coming back makes your arc of progression stand in exciting contrast to the world around you. From a stagnant place, you are rising as something new, something vibrant. By incorporating these meta-textual elements into the mechanical play, there is a sense of individuality and ownership of the experience that more typical open-world check-listing games do not have. When I fail in Far Cry, it feels dramatically evaporative and impersonal. When I fail in Elden Ring, I feel like it’s because I made an active choice to risk something and I come back more engaged and determined than ever. (...)

The expansion’s price tag is less about monetizing the players than it is a reflection of the developmental effort involved. Elden Ring was certainly in a position to cash in at any time. The initial release was as successful as any game using more manipulative methods of extracting value. It was so popular that it sold twelve million copies in two weeks, moving on to over twenty million sold within a year of release. By any metric, but particularly by the metric where you multiply twenty million by the sixty-dollar retail price, the game was a massive success for art of any sort in any medium, and it managed that without relying on in-app purchases, artificial game resource scarcity, or making certain weapons and armor premium-purchase only.

For the health of video games as an artistic medium, this needs to be enough. That’s plenty of money. That’s such an enormous goddamn pile of money it even justifies the multimillion-dollar cost of developing modern flagship titles. Perhaps the problem with Elden Ring as an example is that it’s a masterpiece. It captured the imagination of millions. Games as an industry, instead of an artistic medium, don’t want that kind of success for only the games that are worthy of it. The industry needs to make money like that on the games built without subtlety, or craft, or heart. The industry needs to pull a profit off the slop too, and there is nothing they won’t gut or sell out to do it. If the old way was to tax failure, the new way is to dilute success, to treadmill the experience such that it never reaches a destination. (...)

This is just the era we live in, our own stagnant age in the Lands Between. With Disney and its subsidiaries sucking all of the air out of the room to repackage the same concept over and over, Hollywood has reached the stale conclusion that the same story can be told repetitively. The embrace of AI across multiple mediums just intensifies this dilution of what feels meaningful. 

by Noah Caldwell-Gervais, The Baffler | Read more:
Image: From Elden Ring | Bandai Namco

Tuesday, July 16, 2024

After 12 Years of Reviewing Restaurants, Pete Wells Is Leaving the Table

Early this year, I went for my first physical in longer than I’d care to admit. At the time, I was about halfway through a list of 140 or so restaurants I planned to visit before I wrote the 2024 edition of “The 100 Best Restaurants in New York City.” It was a fair bet that I wasn’t in the best shape of my life.

My scores were bad across the board; my cholesterol, blood sugar and hypertension were worse than I’d expected even in my doomiest moments. The terms pre-diabetes, fatty liver disease and metabolic syndrome were thrown around. I was technically obese.

OK, not just technically.

I knew I needed to change my life. I promised I’d start just as soon as I’d eaten in the other 70 restaurants on my spreadsheet.

But a funny thing happened when I got to the end of all that eating: I realized I wasn’t hungry. And I’m still not, at least not the way I used to be. And so, after 12 years as restaurant critic for The New York Times, I’ve decided to bow out as gracefully as my state of technical obesity will allow.

Not that I’m leaving the newsroom. I have a couple more restaurant reviews in my back pocket that will appear over the next few weeks, and I plan to stick around at The Times long after that. But I can’t hack the week-to-week reviewing life anymore.

The first thing you learn as a restaurant critic is that nobody wants to hear you complain. The work of going out to eat every night with hand-chosen groups of friends and family sounds suspiciously like what other people do on vacation. If you happen to work in New York or another major city, your beat is almost unimaginably rich and endlessly novel.

People open restaurants for all kinds of reasons. Some want to conjure up the flavors of a place they left behind, and consider their business a success if they win the approval of other people from the same place. Others want to dream up food that nobody has ever tasted or even imagined before, and won’t be satisfied until their name is known in Paris and Beijing and Sydney.

And there are a hundred gradations in between. The city is a feast. Exploring, appreciating, understanding, interpreting and often even enjoying that feast has been the greatest honor of my career. And while the number of restaurant critics is getting smaller every year, everybody I know who works in this endangered profession would probably say the same thing.

So we tend to save our gripes until two or three of us are gathered around the tar pits. Then we’ll talk about the things nobody will pity us for, like the unflattering mug shots of us that restaurants hang on kitchen walls and the unlikable food in unreviewable restaurants.

One thing we almost never bring up, though, is our health. We avoid mentioning weight the way actors avoid saying “Macbeth.” Partly, we do this out of politeness. Mostly, though, we all know that we’re standing on the rim of an endlessly deep hole and that if we look down we might fall in.

“It’s the least healthy job in America, probably,” Adam Platt said recently when I called him to discuss the unmentionable topic. Mr. Platt was New York magazine’s restaurant critic for 24 years before stepping away from the trough in 2022.

“I’m still feeling the effects,” he said. He has a flotilla of doctors treating him for gout, hypertension, high cholesterol and Type 2 diabetes.

“I never ate desserts but when I took the job I started eating desserts,” he said. “I became addicted to sugar. You drink too much. You’re ingesting vastly rich meals maybe four times a week. It’s not good for anybody, even if you’re like me and you’re built like a giant Brahman bull.”

We talked about the alarming frequency with which men in our line of work seem to die suddenly, before retirement age. A.A. Gill, restaurant critic of the Sunday Times of London, was killed by cancer at 62. Jonathan Gold, critic for the Los Angeles Times and LA Weekly, died at 58, right after he was diagnosed with pancreatic cancer. Back in 1963, A.J. Liebling of The New Yorker died after checking into a hospital for bronchial pneumonia. He was 59.

These are isolated stories to be sure, but I’d see the headlines projected on my bedroom ceiling when I woke up in the night with my insides burning like a fire at a chemical refinery.

The women I looked up to lasted longer. Gael Greene, who invented Mr. Platt’s job at New York, lived to 88. Mimi Sheraton, critic for Cue, The Village Voice and The New York Times, made it to 97, despite a professed aversion to exercise.

Christiane Lauterbach, a restaurant critic for Atlanta magazine for more than 40 years, told me she is in good health. She attributes that to “not going to the doctor,” although she was recently talked into having her cholesterol and blood sugar tested. (Both were normal.) “I just take little bites of this and that. I never finish a plate in a restaurant,” she said. “If I finished my plate, I would just be 300 pounds.”

S. Irene Virbila, who ate out six nights a week for 20 years as restaurant critic for the Los Angeles Times, used to bring along a man to finish her plates. She called him Hoover.

“Restaurant food is rich,” she said. “To make those flavor bombs it has to have a lot of rich elements. It’s more of everything than you would eat if you could eat exactly what you wanted.”

After she left the post, she lost 20 pounds in two months, “without thinking about it.” Today, aside from taking medication for an inherited vulnerability to cholesterol, she is in good health.

Virtually all of my 500 or so reviews were the result of eating three meals in the place I was writing about. Typically, I’d bring three people with me and ask each to order an appetizer, main course and dessert. That’s 36 dishes I’d try before writing a word.

This is the simple math of restaurant reviewing, but there is a higher math. Critics eat in a lot of restaurants that Gael Greene once described as “neither good enough nor bad enough” for a review.

Then there are the reference meals, the ones we eat to stay informed, to not be a fraud. Often, this is where I got into real trouble. How many smash burgers did I need to taste, or taste again, before I could write about the ones at Hamburger America, a restaurant I reviewed in the same months I was eating my way toward my “100 Best Restaurants” list, for which I needed to make sure that the Uyghur hand-pulled noodles and Puerto Rican lechon asado and Azerbaijani organ-meat hash that I loved were, at least arguably, the best in the city?

This is probably the place to mention that naming 100 restaurants was totally my idea. My editors had asked for 50, and I’ll bet they would have settled for 25. When I did do 100, and the time came a year later to do it again, they didn’t ask me to go back to all of them. That was my idea, too.

Omnivorousness, in the metaphorical sense, is a prerequisite for a good critic. My favorite movie critic is still Pauline Kael, who wrote as if she had seen every film ever made. But movies won’t, as a rule, give you gout.

Food writing’s most impressive omnivore was Jonathan Gold. There didn’t seem to be a dish served anywhere in Los Angeles that he hadn’t eaten at least once, and usually several times, until he was sure he understood it. His knowledge inspired me. It also tormented me — there was no way to catch up to him.

Years ago, he used to tell people he had eaten every taco on Pico Boulevard. This was merely an appetizer. His larger goal was to eat in every restaurant on the street “at least once.”

Pico Boulevard is more than 15 miles long.

I have not eaten in every restaurant on Roosevelt Avenue in Queens, far and away the most significant taco artery in my own city. There have been nights, though, as I walked for miles under the elevated No. 7 train, watching women press discs of fresh masa and men shave cherry-tinted strips of al pastor pork from slowly revolving trompos, when it seemed like an excellent idea.

At a certain point, this kind of research starts to look like a pathology. (...)

When I first came to The Times in 2006, a reporter warned me not to identify myself too heavily with my work. “Any job at The Times is a rented tux,” she said.

I nodded, but didn’t get the point until this year.

by Pete Wells, NY Times |  Read more:
Image: Liz Clayman for The New York Times
[ed. So many great reviews. Here are a couple: Senor Frog's and Guy Fieri.]

Randoseru: The Book Bag That Binds Japanese Society


In Japan, cultural expectations are repeatedly drilled into children at school and at home, with peer pressure playing as powerful a role as any particular authority or law. On the surface, at least, that can help Japanese society run smoothly.

During the coronavirus pandemic, for example, the government never mandated masks or lockdowns, yet the majority of residents wore face coverings in public and refrained from going out to crowded venues. Japanese tend to stand quietly in lines, obey traffic signals and clean up after themselves during sports and other events because they have been trained from kindergarten to do so.

Carrying the bulky randoseru to school is “not even a rule imposed by anyone but a rule that everyone is upholding together,” said Shoko Fukushima, associate professor of education administration at the Chiba Institute of Technology.

On the first day of school this spring — the Japanese school year starts in April — flocks of eager first graders and their parents arrived for an entrance ceremony at Kitasuna Elementary School in the Koto neighborhood of eastern Tokyo.

Seeking to capture an iconic moment mirrored across generations of Japanese family photo albums, the children, almost all of them carrying randoseru, lined up with their parents to pose for pictures in front of the school gate.

“An overwhelming majority of the children choose randoseru, and our generation used randoseru,” said Sarii Akimoto, whose son, Kotaro, 6, had selected a camel-colored backpack. “So we thought it would be nice.”

Traditionally, the uniformity was even more pronounced, with boys carrying black randoseru and girls carrying red ones. In recent years, growing discussion of diversity and individuality has prompted retailers to offer the backpacks in a rainbow of colors and with some distinctive details like embroidered cartoon characters, animals or flowers, or inside liners made from different fabrics.

Still, a majority of boys today carry black randoseru, although lavender has overtaken red in popularity among girls, according to the Randoseru Association. And aside from the color variations and an increased capacity to accommodate more textbooks and digital tablets, the shape and structure of the bags have remained remarkably consistent over decades.


The near totemic status of the randoseru dates back to the 19th century, during the Meiji era, when Japan transitioned from an isolated feudal kingdom to a modern nation navigating a new relationship with the outside world. The educational system helped unify a network of independent fiefs — with their own customs — into a single nation with a shared culture.

Schools inculcated the idea that “everyone is the same, everyone is family,” said Ittoku Tomano, an associate professor of philosophy and education at Kumamoto University.

In 1885, Gakushuin, a school that educates Japan’s imperial family, designated as its official school bag a hands-free model that resembled a military backpack from the Netherlands known as the ransel. From there, historians say, the randoseru quickly became Japan’s ubiquitous marker of childhood identity. (...)

Grandparents often buy the randoseru as a commemorative gift. The leather versions can be quite expensive, with an average price of around 60,000 yen, or $380.

Shopping for the randoseru is a ritual that starts as early as a year before a child enters first grade.

At Tsuchiya Kaban, a nearly 60-year-old randoseru manufacturer in eastern Tokyo, families make appointments for their children to try on different-colored models in a showroom before placing orders to be fulfilled at the attached factory. Each bag is assembled from six main parts and takes about a month to put together. (...)


Each Tsuchiya Kaban bag comes with a six-year guarantee on the assumption that most students will use their randoseru throughout elementary school. As a memento, some children choose to turn their used bags into wallets or cases for train passes once they graduate.

In recent years, some parents and children’s advocates have complained that the bags are too burdensome for the youngest children. Randoseru can cover half of the body of a typical first grader. Even unloaded, the average bag weighs about three pounds.

Most schools do not have personal lockers for students or much desk storage space, so students frequently carry textbooks and school supplies back and forth from home. And in a culture that puts a high value on hard work, patience, perseverance and endurance, the movement to relieve children of the randoseru burden hasn’t gotten very far.

“Those who have no heart say that ‘recent children are weak; back in our day we carried around those heavy bags,’” said Ms. Fukushima, the education professor.

A few manufacturers have developed alternatives that retain the randoseru shape while using lighter materials like nylon. But these have been slow to gain traction. (...)

At the end of the day, Kaho Minami, 11, a sixth grader with a deep-red randoseru stitched with embroidered flowers that she had carried throughout elementary school, said she never yearned for any other kind of bag. “Because everyone wears a randoseru,” she said, “I think it is a good thing.”

by Motoko Rich, Hisako Ueno, and Kiuko Notoya, NY Times | Read more:
Images: Noriko Hayashi
[ed. Back in the day in Hawaii when I was in grade school, everybody had plastic Pan Am or Hawaiian Airlines bags - square, two handles, side logo (where did we get them from?). Either that, or boys would just carry their books sidearm and girls would clutch them to their chests (always - you never wanted to be caught doing the opposite!).]

Monday, July 15, 2024

Permanent Crisis

Myopic responses perpetuate the “opioid epidemic”

To express the ambient feeling that “things are getting worse,” there exists, of course, a meme. It plots iterations of a chart, and on its x-axis floats the disembodied, smiling face of President Ronald Reagan. After his inauguration, watch the data veer up and off into oblivion: from health care spending, executive pay, and the size of the federal government, to the privatization of public services, social isolation, and economic inequality. The bottom line: only half of babies born in 1980—today’s forty-four-year-olds—will make as much money as their parents did.

I was surprised, then, to learn that publicists for the Sackler family—the owners of Purdue, which manufactures OxyContin, and, as the purported architects of the “opioid epidemic,” the epitome of contemporary capitalist villainy—presented a Reaganesque chart in a 2021 PR offensive called “Judge For Yourselves.” The project aimed to “correct falsehoods” and push back against a tidal wave of press that presented OxyContin as the epidemic’s singular culprit. Purdue, to be sure, did not literally present a chart with a smiling Reagan, but they might as well have.

This chart was designed by two infectious disease modelers, Hawre Jalal and Donald S. Burke, who made a grim discovery while examining the leading causes of death in America. They plotted drug-overdose deaths from 1979 to 2016, and what they found was utterly baffling: deaths consistently rose 7 percent each year, doubling every eight to ten years, for more than four decades. Nothing else—not gun deaths, not suicide, not AIDS, not car crashes—adheres to an exponential curve for this long. Since 1999, more than one million people have died from overdoses.

But in the United States, we don’t tend to think of this decades-long emergency as a continually accelerating death toll; it gets framed as a series of discrete, though sometimes overlapping, epidemics, implying a predictable arc that spikes, plateaus, and eventually falls. First, as the New York Times warned on the front page in 1971, there was a “G.I. heroin addiction epidemic” in Vietnam. The drug’s use was also on the rise in places like New York, where, in the following year, at least 95 percent of those admitted to drug addiction treatment reported using it. The crack cocaine epidemic arrived in the next decade, followed by a rise in the use of methamphetamines, which the late senator Dianne Feinstein would call the “drug epidemic of the nineties.” But these were soon displaced in the popular imagination by OxyContin, which hit the market in 1996 and set off successive waves of what came to be known as the opioid epidemic, something we’re still struggling through. The past forty-five years of drug use in America does not match this relatively tidy narrative—in reality, there’s a beginning and middle, with no end on the horizon.

But in a strange way, this exponential curve told a story the Sackler family could get behind, one that made them look less culpable: How could Purdue be responsible for the opioid epidemic if overdose deaths were rising for more than a decade before OxyContin was even brought to market? “We were contacted by [Purdue] lawyers,” Burke told me. “It was my sense that they would like us to testify that it wasn’t their fault.” They declined the offer.

Still, Purdue was right about something. Drug mortality in America neither begins nor ends with the company’s actions. What pharmaceutical manufacturers, drug distributors, insurance companies, doctors, and pharmacies—the entire profit-mad medical system—collectively accomplished was to accelerate a train that was already speeding off the rails. But it’s hardly an absolution to argue that you did not start the fire, only poured gasoline on it for personal gain. With corporate power unchallenged and regulators asleep at the wheel, drug markets, like so many other consumer markets, have become more deadly, more dangerous, and, despite decades of aggressive and costly drug enforcement, more ubiquitous.

Jalal and Burke’s finding also presented a paradox. How could four decades of seemingly distinct epidemics—from heroin and cocaine to meth and fentanyl—aggregate into one giant wave of death? How is this wave still gaining power, and when will it crash? When we zoom out, we have what looks less like a collection of epidemics involving a series of novel, addictive drugs, and something more like a chronic social crisis exacerbated by market conditions. Underlying sociological and economic drivers must be at work.

“We can come up with explanations that are specific to some era,” Peter Reuter, a veteran drug policy researcher, told me. For instance, consider how in the 1970s, cocaine manufacturing and trafficking networks in Latin America advanced alongside growing demand for the drug in America. “But then, it’s very hard to find something that goes on for forty-five years now.” David Herzberg, a historian of the pharmaceutical industry and author of White Market Drugs: Big Pharma and the Hidden History of Addiction in America, has an idea. He proposes that drug markets are behaving the way other consumer markets have since the neoliberal turn, when “free enterprise” was unleashed to work its unholy magic. “The rise in overdoses tracks a time period in which corporations that organize human labor and human activity were increasingly given carte blanche,” Herzberg told me. “While OxyContin is an example of a corporation taking advantage of this,” he said, “Purdue didn’t create the conditions that enabled it to do what it did.” Hence the irony of the Sackler family’s lawyers holding up a chart where time begins in 1979.

Across this period, illicit market innovations have mirrored many of the same ones seen in legal markets: sophisticated supply chains, efficiencies in manufacturing, technological advances in communications and transportation, and mass production leading to lower prices. Meanwhile, the social dislocation and alienation of consumer society has left millions of Americans unmoored, adrift, or otherwise floundering.

Contrary to popular rhetoric, drug addiction is not the cause of poverty but one of its chief consequences. Studying the dynamics of crack houses in New York and open-air drug markets in Kensington, Philadelphia, the ethnographer Philippe Bourgois found a pattern of lives scarred by a combination of state neglect and violence: abusive childhoods, crumbling schools, abandoned neighborhoods, all aided by government-incentivized white flight. The historian Nancy Campbell, author of OD: Naloxone and the Politics of Overdose, uses the phrase “unlivable lives” when talking about the increasing immiseration of Americans. “Drugs are powerful ways people use to mitigate their circumstances,” Campbell told me. Opioids work as a salve for pain both physical and psychic. (...)

The public is led to believe that the usual responses to epidemics will somehow work for drug addiction: isolate, quarantine, and treat the sick. This almost always means criminalization, incarceration, and compulsory treatment—or else bizarre interventions like the Department of Defense’s quixotic search for a fentanyl “vaccine.” The endless declaration of one drug epidemic after another also perpetuates a blinkered state of emergency, necessitating the spectacle of a disaster response to yet another drug “outbreak.” This not only forecloses the possibility of a response that’s actually effective, it precludes a deeper understanding of the role of drugs in American life. What Jalal and Burke’s exponential curve lays bare is the accumulation of our long, slow, and violent history. (...)

The idea that we’re living through exceptional times isn’t exactly wrong. The mathematics and physics of fentanyl are unprecedented. The total amount of the synthetic opioids consumed in the United States each year is estimated to be in the single-digit metric tons. By comparison, Americans annually consume an estimated 145 tons of cocaine and 47 tons of heroin. That means all the fentanyl consumed by Americans in just one year can fit inside a single twenty-foot cargo container. Some fifty million shipping containers arrive in America by land, air, and sea every year. Because fentanyl is so potent—with doses measured in micrograms—very small amounts can supply vast numbers of customers. Counterfeit fentanyl pills contain about two milligrams of fentanyl. There are 28,350 milligrams in an ounce, which means one dose amounts to one ten-thousandth of a single ounce. Authorities could barely keep up with cocaine and heroin. To say fentanyl detection is like finding a needle in a haystack is to vastly underestimate the scale of the problem before us.

To add another layer to this already impossible scenario, fentanyl is unlike cocaine and heroin in that it is synthetic, odorless, and tasteless, making shipments even more difficult to detect. And the supply has no real upper limit: production is only tied to the amount of precursor chemicals available, which seem pretty much limitless. Any nation with a pharmaceutical or chemical manufacturing industry can theoretically produce the necessary precursors and ship them to suppliers around the world. If one country cracks down on precursor chemicals, another can fill the void. At this time, India and China manufacture much of America’s generic drug supply.

The global market’s rapid acceleration underscores the folly and futility of relying on the same enforcement tactics on the supply side, and the same medical and health interventions on the demand side. The U.S. policy response has never been this nakedly outmatched and unsuited for the task at hand. Still, authorities boast of massive investments to curb the fentanyl crisis. They champion handshake deals with foreign leaders to staunch the flow of the drug into the country. They publicize record-breaking fentanyl seizures, only to turn around and report record-breaking overdose figures. For example, the state of California’s 2023 “Master Plan” for tackling drugs includes more than $1 billion, from overdose prevention efforts to interdiction and enforcement. The California National Guard seized 62,224 pounds of fentanyl that year, a 1,066 percent increase from 2021. And yet overdose deaths continue to climb across the state, increasing by 121 percent between 2019 and 2021. Conventional enforcement and seizure methods have done little to contain the spread.

The Need for a New Direction

In 2022, the disease modelers Jalal and Burke projected that half a million Americans would die of drug overdoses between 2021 and 2025. So far, the data supports this estimate. “Dismayingly predictable,” as they put it. Unless something drastically changes, the curve will keep rising. Drug mortality alarmed officials in 2010 when thirty-eight thousand people died in a single year. Drug deaths were declared a “national health emergency” in 2017, when the annual death toll topped seventy thousand. In 2022, overdose deaths nearly reached 110,000. My fear is that we’ll learn to live with these figures as just another grim and inevitable feature of American life. File drug overdoses away under “intractable problem,” somewhere between gun violence and the climate crisis.

Something obviously needs to change, but American drug policy feels stuck, mired in disproven and outdated modes of thinking. Briefly, it seemed there was real movement toward treating addiction as a public health issue, but the sheer lethality of fentanyl, in part, snapped policy back to the mode of coercive criminalization, derailing newer, progressive reform efforts to roll back racist drug enforcement through decriminalization and an emphasis on expanding public health, harm reduction, and treatment. The tide of reaction against these nascent efforts has been swift and effective. San Francisco voters passed a measure to drug test welfare recipients. Oregon has ended its decriminalization experiment. With social approaches in retreat, the idea of full-on legalization feels increasingly out of touch with today’s reality.

But is complete legalization even desirable? Every time the left brings up the idea, two substances come to mind: alcohol and tobacco. These two perfectly legal, regulated products are immensely hazardous to individual health and society at large. Tobacco kills nearly five hundred thousand people every year; that’s more than alcohol and every other drug combined. Drinking, meanwhile, kills nearly five hundred Americans a day: more than every illicit substance, including fentanyl, combined. During the pandemic lockdowns, people drank more, and they drank more alone. The trend did not reverse once we returned to “normal.” Contrary to all the buzz around nonalcoholic bars, millennials and Gen X are binge drinking at historic levels. The same set of social, psychological, and economic factors at work in illicit drug use, magnified by the market’s invisible hand, also applies to alcohol: people are more alone and more stressed, with access to a cheap, heavily marketed product that, thanks to on-demand home delivery, is easier than ever to obtain. Advertisers spent nearly $1.7 billion marketing alcohol in 2022 alone.

How, then, is the legalization and regulation of drugs going to help us? Benjamin Fong, in Quick Fixes, summarizes the debacle:
A more rational society would undoubtedly minimize the impacts of black markets by regulating all psychoactive drugs (and, perhaps, controlling their sale through state monopolies or public trust systems), but legalization in this society likely means bringing highly potent substances into the purview of profit extraction.
It is clear we live in the worst of all worlds. Black markets flood the country with mass-produced and highly lethal substances, but legal, “regulated” markets do the same. Both are turning record profits. Consumers are at the wrong end either way. It’s hard not to feel deep pessimism about where things go from here. Cringey, commercialized marijuana; the glut of ketamine infusion clinics; venture capital closing in on psychedelics; Adderall and Xanax prescriptions being handed out by telemedicine companies over Zoom. It’s precisely more of what got us here: a bewildering array of addictive products unleashed onto anxious, isolated consumers who are groping in the dark for relief from physical and psychic pain, coping with unlivable lives. Fortunately, it’s almost impossible to fatally overdose on many of these substances, but death shouldn’t be the only way to measure the consequences of the great American drug binge.

The current rhetorical, legal, and medical framework is simply no match for the deep malaise driving the problem. Root causes are downplayed, millions are left untreated, and thousands of preventable deaths are unprevented. We need a stronger, more expansive paradigm for understanding the exponentially increasing number of overdose deaths. A new language of substance use and drug policy that encompasses, and is responsive to, market dynamics and the social dysfunction to which they give rise. A consumer-protection model that does not criminalize the suffering, but also addresses the anxiety and dread that leads to compulsive, chaotic, and risky substance use. There must be something beyond, on the one hand, prohibition by brute force, and on the other, free-for-all drug markets ruled by profit. How can we create a world where people don’t need to use drugs to cope, or when they do use them, whether for relief, enhancement, or plain old fun, the penalty isn’t addiction, prison, or death?

by Zachary Siegel, The Baffler | Read more:
Image: © Ishar Hawkins
See also: Pain and Suffering (Baffler):
***

"The stigma is not hard to understand: magazine features, books, and movies for two decades now have chronicled America’s drug problems, including the rapacious role of drug manufacturers like Purdue Pharma, which made OxyContin a household name and enriched the Sackler family in the process. The publicity of their misdeeds led lawmakers on a campaign against opioid prescribing. Yet the crackdown had an unintended consequence, one little examined today: it has increased the suffering of patients who experience chronic pain, as medications that were once heavily promoted have since been restricted. And it has added to the needless agony of those like Marshall at the end of life. I told the story of Marshall and others like him in my 2016 book, The Good Death. Since that time, the double-sided problem has only seemed to worsen. Even morphine, which has long been used to ease the final days and hours of patients in hospice care, is only available to the fortunate ones, as supply chain problems have combined with fears of overuse, leading to vast inequities as to who dies in terrible pain. (...)

Those dependent on opioids sought out their own prescriptions, while others began to sell their unused pills for extra income. Instead of addressing drug use with treatment—methadone, buprenorphine, abstinence programs—states and the federal government began to respond by limiting the quantity of opioids that doctors could prescribe, hurting legitimate pain patients, who were now unable to get the medication that allowed them to function, and leaving those dependent on or addicted to illicit prescription medication in deep withdrawal.

“Do you really think that’s not going to generate a local street market?” Szalavitz asked. So, in “towns where there was deindustrialization, a lot of despair, long family histories of addiction to things like alcohol,” she said, people were forced to find a new drug source. Heroin and street fentanyl filled the void. Those addicted to or dependent on prescription opioids were now using drugs that were not commercially made, their dosages variable, unpredictable, and often deadly. (...)

When I asked Szalavitz how she made sense of this misleading popular narrative about addiction and overdose, she told me, “You couldn’t say that the people who got addicted to prescription opioids were starting by recreational use because then white people wouldn’t be innocent—and journalists like innocent victims. We had to get it wrong in order to convict the drug companies.” From this vantage point, every story of, say, a high school athlete getting hooked on Oxy after knee surgery is misleading as an average portrait, defying both the data and what experts know about addiction. Most people with addiction begin drug use in their teens or twenties, which means it’s likely that those proverbial student athletes getting hooked on Oxy were already experimenting with drugs. “If you don’t start any addiction during that time in your life, your odds of becoming addicted are really low,” Szalavitz told me. “So, what are we doing? We’re cutting off middle-aged women with no history of addiction, who are not likely to ever develop it, and have severe chronic pain, to prevent eighteen-year-old boys from doing drugs that they’re going to get on the street instead.”

Understanding—and addressing—addiction is what’s missing from current drug policy. Instead, some types of drug dependence are demonized, dependence is conflated with addiction, and the best, most cost-effective treatment for pain to exist at this time is stigmatized and kept from those who rely on it to function. As Szalavitz explains it, dependence is needing an increasing dose of a drug to function normally. Many on antidepressants or other stabilizing drugs are not shamed for their dependency. Addiction, Szalavitz says, is using a drug for emotional not physical pain; it is “compulsive drug use despite negative consequences, so increasing negative consequences does not help, by definition.”

Truly facing and addressing addiction requires a new vocabulary—and accepting that “say no to drugs” is an inadequate response. It also requires an examination of far-reaching economic and social challenges in our culture: lives of despair, racial prejudice, economic insecurity, isolation, inaccessible health care, expanding police forces and prisons, and, of course, politics. For politicians, “drugs are a great way to get elected,” Szalavitz said. They can campaign on tough drug laws, claiming that their policies will decrease overdose deaths. “It’s really infuriating,” she told me, “because our prejudice against pain and our stereotypes about addiction push us toward solutions to the problem of opioids that simply do not work.”



OHTSU Kazuyuki (大津 一幸, Japanese, b. 1935), Summer at Oze
via:

Teshekpuk Lake

America's Arctic - Teshekpuk Wetlands
[ed. NPR-A, Northwestern Alaska. Beautiful photography by Cornell Lab of Ornithology.]

Sunday, July 14, 2024

Red Power

Indigenous Continent: The Epic Contest for North America
by Pekka Hämäläinen. Norton, 571 pp., £17.99, October 2023, 978 1 324 09406 7

The Rediscovery of America: Native Peoples and the Unmaking of US History
by Ned Blackhawk. Yale, 596 pp., £28, April 2023, 978 0 300 24405 2

Our History Is the Future: Standing Rock Versus the Dakota Access Pipeline and the Long Tradition of Indigenous Resistance
by Nick Estes. Haymarket, 320 pp., £14.99, July, 979 8 88890 082 6

The conquest of most of the North American continent by Anglophone settlers took roughly three hundred years, from the first stake at Jamestown to the last bullet at Wounded Knee. The Spanish had subdued a much vaster population of Indigenous peoples in Mexico and Peru in just under half a century and expected to repeat the formula, mobilising the Indigenous tributaries against the Indigenous core as they moved up from their outposts in Florida, only to find there was no power centre to replace. The last great city-state in pre-colonial North America, Cahokia, had dissolved two centuries before. Instead, the Spanish encountered a patchwork of peoples stretched thinly across the land, which would have to be won over town by town.

The fate of Hernando de Soto was paradigmatic. He sailed to the New World in 1514 and made his fortune in the Spanish campaigns against the Inca. By 1534 he was lieutenant governor of Cuzco, where he took an Incan noblewoman for his mistress and lived in the spectacular palace of the emperor Huayna Cápac. But his expedition of 1540 from present-day Louisiana to the Carolinas amounted to a series of disastrous confrontations with Native groups. He ended his days trying to pass as a god before a local chief, only to be exposed when he failed to dry up the Mississippi, into which his corpse was unceremoniously tossed by his men after he died of a fever. They scrambled back to Mexico City with the horses they had not slaughtered for food.

No prior record of success burdened the early English colonists. They could not afford the more languid colonialism of the Russian and French empires, whose fur traders established tributaries and commerce over the course of centuries, as well as making occasional attempts at the religious indoctrination of peoples in the tundra and wilderness that no settler planned to inhabit. The strength and entrenchment of Natives in North America, along with the Anglo determination to settle and not merely extract goods and labour, meant that there was a longer period of mutual testing before full-scale elimination could become an aspiration. (...)

The most foreboding development for Native peoples in North America was the cohesion of a unified settler state in the wake of the American Revolution. Far more than Black slavery, the Native question was central to the reordering of political loyalties on the eastern seaboard. From the vantage of the American colonials, the Indians were, as the historian Colin Calloway has put it, paraphrasing Thomas Jefferson, ‘the vicious pawns of a tyrannical king’. From the perspective of Westminster, the colonials were ungrateful rogue subjects who provoked needless border clashes that strained the Treasury, which had already been exhausted on their behalf in the French and Indian War. In his 1763 proclamation, George III made major concessions to Indian tribes and declared the Appalachian mountain range to be the outer limit of colonial expansion. For trigger-happy real estate speculators like George Washington, who had ignited the French and Indian War with an ill-planned attack on French forces in Jumonville Glen and who aimed to make his fortune selling land to settlers moving west, this entente was intolerable. Washington himself was at least willing to enforce a settlement line in order to prevent improvident squatters from occupying alienated land, but his more republican peers in the ‘Founding’ generation believed that the point of being an American was having access to cheap land. Any attempt to shut off the supply was met with strategic violence. When the crown sent the Pennsylvania trader and land speculator (and Washington rival) George Croghan into Ohio Country with a pack train of goods, including enough white linen shirts to clothe half the male Indian population, in an attempt to start realising its vision of imperial-Native co-prosperity, it was attacked in 1765 by a gang of American settlers (‘the Black Boys’) dressed up as Indians with charcoaled faces, who destroyed all 30,000 pounds of goods – three times the amount the Tea Partiers, also dressed as Natives, dumped into Boston harbour eight years later.

After the revolution broke out, most tribes treated the conflict as a British civil war. But the results were often dire for them: the Shawnee and the Delaware were pushed west of the Mississippi; the Haudenosaunee Confederacy, split between British and American-aligned factions, moved up to Canada and as far away as present-day Wisconsin, while the Seneca and Mohawks stayed in the east; the Creeks lost great tracts of territory in Georgia. The annexation and confiscation of Indian lands – and the control that the nascent US state would have over areas not already claimed by settlers – was expected to be one of the great boons of the revolution, allowing the state to build up its treasury by selling the land to its citizens. Yet the new federal government took a position similar to that of the empire it had overthrown: wary of the instability that resulted from a population fixed on moving west, it searched for a modus vivendi with the Indigenous peoples. The authors of the constitution considered the inclusion of an Indigenous local government led by the Delawares and with its own representatives in Congress as the 14th state of the union. In 1807, the United States forbade its citizens from surveying lands beyond the federal boundary, or even marking trees to signal future claims. Twenty years later, John Quincy Adams did not hesitate to send troops to burn down squatters’ homes and crops in Alabama. But these legal enforcements would be swept away in the coming demographic storm. The settler-sceptical northeastern Federalists had many political victories, and the state later used much of its ‘land bank’ for developments such as railroads and universities, while most yeomen farmers ended up as renters rather than owners. Despite this, the republican fantasy of numerous smallholders continued to power the trajectory of the young United States, which teemed with schemes for what Jefferson called ‘our final consolidation’.

Evaluations of Native resistance to European occupation have always been bound up with contemporary political reckonings. Dee Brown, an amateur historian from Arkansas, published his bestselling book, Bury My Heart at Wounded Knee (1970), during the Vietnam War. Brown depicted the Indigenous peoples of the continent as heroically resisting an imperial onslaught beyond their control and fixed in the public imagination the notion of Indians as the noble victims of a slow-motion extinction. Though professional historians pointed out the book’s many factual errors and criticised its flattening of all violence in the West into ‘Indian Wars’, its unwitting embrace of the myth of the ‘vanishing Indian’ and its emotional manipulation of readers, Bury My Heart set the tone for nearly half a century of historiography. From Francis Jennings’s The Invasion of America (1975) to Benjamin Madley’s An American Genocide (2016), the subject of this scholarly outpouring has been the destruction of Native peoples at the hands of the British and US empires and their proxies. More recently, in works such as Jeffrey Ostler’s Surviving Genocide (2019), there is increasingly bald acknowledgment that, more than the military or vigilantes or even disease, the organising force behind the destruction was the capitalist economy itself.

But recent road maps of the historiography either sidestep material questions or mistake a colonised mindset for a progressivist one. The symptoms manifest in different, competing ways. Some work overcompensates for Native agency in the face of the European onslaught to the point that it neglects wider historical forces. There are studies by legal historians – Indigenous originalists in all but name – who, however correctly they emphasise the disciplinary power of the law over Native peoples, have so thoroughly internalised constitutional ideology that they seem not to notice how their cause has been instrumentalised by the most fanatically libertarian segment of American society. There is also a nominally left-wing Native scholarship that recognises the unique force of certain Native groups in environmental and anti-capital movements in North America, but resists historicising Native experience itself. Instead, it holds to romantic notions about peoples who are still privy to uncontaminated, non-Western consciousness, immune to the profit motive, and if left to their own devices would build societies, administer land and protect water in ways that modern states fail to emulate at their peril. These three versions of Native history are all the more regrettable because the 20th century offered examples of Indigenous co-operation with the left, cases contemporary political theorists have examined with more care than their historian peers.

Pekka Hämäläinen’s Indigenous Continent, the third book in his celebrated trilogy about Native American ‘empires’ – following Comanche Empire (2008) and Lakota America (2019) – attempts to flip Brown’s script. Hämäläinen gives no quarter to the claim that Native populations in North America were easy prey for Europeans. In his account, the continent was still up for grabs and the Native peoples were capable of inflicting severe, potentially irrevocable losses on the young United States. His evidence includes Native archaeological and material sources such as the Lakota ‘Winter Counts’ – buffalo hides on which they depicted the decisive event of a given year. (...)

The balance of forces in the early decades of the new nation was far from clear. In 1791, General St Clair’s US army was defeated on the banks of the Wabash River by the Northwestern Confederacy; a thousand American troops were killed or wounded. In the periodisation laid out in Richard White’s The Middle Ground (1991), the irreversible decline of Indigenous peoples only set in at the end of the War of 1812, when ‘they could no longer pose a major threat or be a major asset to an empire or a republic, and even their economic consequence declined with the fur trade.’ This is where Hämäläinen makes his provocative claim: ‘Indigenous power in North America,’ he argues, ‘reached its apogee in the mid to late 19th century.’ (...)

None of this would have been possible without horses. The horse originated in North America four million years ago, but had been extinct there for 10,000 years. Hernán Cortés and the Spanish brought the horse back to the Americas in the 1500s, and over the next two centuries horses spread across their ancient homeland. Hämäläinen relates the account given to the English explorer David Thompson by one of the Blackfeet Indians, Saahkómaapi. In around 1730, the Blackfeet heard that there were horses in Snake Indian country and that not far away was the body of a horse that had been killed by an arrow. They found the dead horse and gathered around it. ‘We all admired him,’ Saahkómaapi told Thompson. ‘He put us in mind of a stag that had lost his horns; and we did not know what name to give him. But as he was a slave to man, like the dog, which carried our things; he was named the Big Dog.’

The people who most successfully mastered the power of the big dog were the Comanche of the Southern Plains. Like the Lakota, they were relative newcomers in their region, which incorporated parts of what are today Texas, New Mexico, Arizona, Colorado and Kansas. In the early 1700s, the Comanches started buying Spanish horses from the more sedentary Pueblo people, whom they quickly displaced as the major power in the southwest. When they forged an alliance with another horse people, the Utes, the result was a mounted army that raided Spanish settlements. The Comanche also operated a booming slave trade in subject Native peoples and other captives, as well as profiting from an enormous hunting range for buffalo. Hämäläinen writes that
 for the Comanches the sun was ‘the primary cause of all living things’, and horses brought them closer to it, redefining what was possible: the biomass of the continental grasslands may have been a thousand times greater than that of the region’s animals. The Comanches plugged themselves into a seemingly inexhaustible energy stream of grass, flesh, and sunlight.
The Lakota, too, secured a vast hunting range, annexing swathes of the Northern Plains. Their relationship with the empire to their east – the United States – was initially a trading one, in which the Lakota were by no means the inferior party. Hämäläinen gives the example of the fur tycoon John Jacob Astor building his supply chain right up to the Lakota’s doorstep so that they did not have to inconvenience themselves delivering furs and hides. By the 1860s, the Lakota, in a loose alliance with the Comanches, held sway over a territory larger than Western Europe.

Indigenous Continent is determined to downplay the usual culprits of Native decline: disease brought by Europeans certainly devastated Native populations, but some, especially horse peoples who lived in less dense clusters, were not greatly affected. Every technological innovation the Europeans brought with them – the mounted horse, the gun, the kettle – was acquired and adopted by Natives. In his headlong rush to overturn the Dee Brown story, Hämäläinen ends up reproducing some of its most dubious elements. The focus on military confrontations between the ‘fledgling United States’ and Native ‘armies’ is one of the chief misprisions. The destruction of Native peoples was a result of commercial imperatives as much as political ones. Between 1820 and 1889, for example, the number of buffalo – a major source of Lakota power – declined by 99.99 per cent, from 28 million to 1091. Anglo-European demand for buffalo leather to use in factory machine belts set off a killing spree in the 1870s. The 1848 Gold Rush lured hundreds of thousands of settlers to California through Indian territory, upsetting agricultural patterns and diminishing food supplies. The market went ahead of the cavalry. When Crazy Horse and George Armstrong Custer confronted each other at the Battle of the Little Bighorn, it was reported to be 44°C in the shade, and Evan Connell noted in Son of the Morning Star that ‘a shrewd Yankee merchant on the Yellowstone turned a neat profit selling straw hats for 25 cents.’ Hämäläinen continually emphasises the amount of land area still under Native control, but as the historian Daniel Immerwahr pointed out (his critical review of Indigenous Continent has been cobbled together as praise on the back cover), this is like the Republican Party claiming mass popular support because much of the map is coloured red, no matter how sparsely populated the area in question. The usefulness of calling the Comanche an empire becomes less clear when one considers that at the height of their power they numbered forty thousand people – roughly the population of Cincinnati at the time. (...)

By the mid-19th century, many Native nations found themselves in the position of powerless rentiers, living under what Emilie Connolly calls ‘fiduciary colonialism’. Washington had devised a system of annual annuities instead of one-off buyouts of land, but much of the money was invested in state and federal bonds, effectively making Natives passive investors in their own dispossession. In 1887, the Dawes Act was passed, allowing the US government to subdivide Indian land – previously held in common – into private allotments of 160 acres apiece: the idea was to break Native patterns of land tenure and force Indians into the capitalist order. The new ‘owners’ would either have to make their portions profitable or sell up to settlers. Though some tribes were initially exempt, the extension of the act in 1898 and the abolition of tribal governments led to the loss of around two-thirds of Native American land over the next thirty years.

by Thomas Meaney, London Review of Books |  Read more:
Image: Getty via

Saturday, July 13, 2024

End Game: From Taylor Swift to Stephen Tennant

You don’t have to have attended Taylor Swift’s Eras tour yourself to be aware of it. After 18 months, it has become an inescapable international juggernaut, with documented effects on economies, infrastructure and policy. Perhaps the closest historical parallel is the Great Exhibition of 1851 – except, while that promised “the works of industry of all nations”, this spectacle showcases only those of Taylor Alison Swift.

That this phenomenon boils down to just one woman is staggering, a reflection of both Swift’s once-in-a-generation talent and the direct relationship she has forged with her fans. I started listening to her in 2011, sucked in by the girlish fantasy of Love Story, and never looked back. Many of my closest friendships were built on a shared appreciation: proof of the virtuous cycle started by Swift’s honest expression and vulnerability.

At the same time, I’ve never felt so alienated by my favourite artist. This year I have felt not so much a Swiftie as a conscript, roped into some broader project of streaming, spending and posting so as to cement and grow her cultural dominance – though it’s hard to imagine who, now, could possibly dislodge her.

Barclays estimated that the average Eras tour attender spent nearly £850 on tickets, travel, accommodation and expenses, including £79 on official merchandise. More than one city playing host to the tour has been renamed in her honour. The Beatles joked about being bigger than Jesus, but Swift really is bigger than music. She is spoken about in terms more commonly used for land masses, like GDP or earthquake magnitude.

The cultural tide behind Swift is so sweeping and powerful that I’ve struggled to hang on to my fandom, and the personal relationship to her music that’s always underpinned it. This may sound like I’m holding Swift’s success against her, that I liked her better before she got big (though, it bears repeating, no pop star has ever been this big). But I’ve been perturbed by signs that Swift is not just being overexposed, but actively tightening her grip on the spotlight.

Eras is already the highest grossing tour in history, generating $1bn last year and a further $261m from the concert film in cinemas. Yet Swift hasn’t stopped hustling, even after more than 100 sold-out shows. (...)

Swift is the biggest celebrity in the world and a billionaire, on track to make $2bn by Eras’ end. The suggestion that she is somehow dissatisfied or threatened is offputting, and raises very human questions about her motivation. Even five-star reviews of the tour have wondered about Swift’s endgame, where she possibly goes from here. (...)

But without any more insight into what is driving her, you’re left to assume it’s just money, or maybe revenge. Neither makes me feel more connected to her as an artist. Her songwriting may be personal, but seeing Swift perform I felt as though I was being engaged in a brand activation by a global behemoth like Nike or Apple, delivering focus-group-tested excellence. Even the friendship bracelets being hawked seemed less like a groundswell of fan camaraderie than brisk, industrial trade.

My uneasy feelings were later articulated by the culture writer Jonah Weiner, describing the insidious “co-opting of ‘community’ into a sales strategy”. Weiner was talking about luxury fashion brands, and the exploitation we are willing to overlook to feel part of a club. But his point about how our human desire for connection and belonging is hijacked and reduced by corporate interests seemed to me an apt description of the Eras tour, the economy that’s sprung up around it and our enthusiasm to participate in it.

The show’s supposed community is built on a basis of economic productivity; like a queue for a new Apple product or a sneaker, it “contains the possibility for meaningful interpersonal connection only in spite of itself,” Weiner writes. Not only that, it is actively at odds with building relationships and communities that might nourish us for the long term. 

by Elle Hunt, The Guardian |  Read more:
Image: Ennio Leanza/EPA
[ed. From the marketing/branding/money-making juggernaut of the Eras Tour, to an overwrought/tortured argument in favor of keeping up with new fashion styles, to a short essay describing one of the gayest sissies in modern history (Stephen Tennant - The Man Who Stayed in Bed). All in three easy jumps! What a world. By the way, the 1950 feature film referenced in the second fashion link, Munekata Sisters by Yasujirō Ozu, is in fact a knock-off of Jun'ichirō Tanizaki's Makioka Sisters (excellent):]

***
SERIOUS PLEASURES The Life of Stephen Tennant. By Philip Hoare. Illustrated. 463 pp. New York: Hamish Hamilton/Viking. $29.95.

In 1910, when Stephen Tennant was 4 years old, he ran through the gardens of his family's Wiltshire estate, Wilsford Manor, and was literally stopped in his tracks when he came face to face with the beauty of the "blossom of a pansy." Thirty years later, so precious and high-strung that he sometimes took to his bed for months at a time, he was coaxed outside by a friend for a ride in the car on the condition that his eyes be bandaged, since passing scenery might make him too "giddy." Aubrey Beardsley, Ronald Firbank, Denton Welch -- believe me, Stephen Tennant made them all seem butch.

According to Philip Hoare, the author of "Serious Pleasures," the witty and amazing life story of this great sissy, Cecil Beaton was one of the first to encourage Tennant's eccentric vocation of doing nothing in life -- but doing it with great originality and flamboyance. Completely protected by class, Stephen Tennant couldn't care less what people thought of his finger waves, his Charles James leopard pajamas, his makeup ("I want to have bee-stung lips like Mae Murray") or his dyed hair dusted with gold. Who would dare criticize this "aristocratic privilege," this self-described "fatal gift of beauty"? As The London Daily Express, in 1928, so succinctly summed up Tennant's attitude toward life, "you . . . feel that condescension, indeed, can go no further."

Although many who knew Tennant later in life maintained that they "could hardly believe the physical act possible for him," the one real love affair of his adult life was with Siegfried Sassoon, the masculine, renowned pacifist poet old enough to be his father. Sassoon brought to their relationship "his fame, his talent, his position," while Tennant's only daily activities were "dressing-up" and reading about himself in the gossip columns. Looking at the photos of the two lovers in Mr. Hoare's book, Tennant posing languidly (vogueing, really), way-too-thin and way-too-rich, as Sassoon looks on proudly, even the most radical Act-Up militant might mutter a private "Oh, brother!" But the author makes us see that Tennant's extreme elegance was close to sexual terrorism, as it flabbergasted society on both sides of the Atlantic for half a century. (...)

To confuse things further, Tennant's idol and great friend was Willa Cather(!). It is hard to imagine the notoriously no-fun author of "O Pioneers!" hanging out with a man whose beauty tips included "an absolute ban on facial grimacing or harsh, wrinkle-forming laughter," but Cather, Tennant's "surrogate mother/nanny figure," always encouraged him to write, even though "Lascar," the novel that obsessed him for the last 50 years of his life, remained unfinished at his death.

After World War II, Tennant became, in the words of Osbert Sitwell, "the last professional beauty." From then on, it was time to hit the sack big time. (...)

"Reeking of perfume," "covered with foundation," with ribbons hanging from his dyed comb-over hairdo, he rested "non-stop" for the next 17 years in "decorative reclusion." Unconcerned about his grossly overweight figure (" 'But I'm beautiful,' he would reason, 'and the more of me there is the better I like it!' "), he lay in bed surrounded by his jewelry, drawings and Elvis Presley postcards while, as Mr. Hoare puts it, his "decorative fantasies were running amok" (the pink and gold statues in the overgrown garden, the fishnets and seashells everywhere, the tiny uncaged pet lizards, the bursting pipes and rotting carpets, the mice still in the traps). Happily re-creating the "perfervid environment" of his youth, Tennant calmly painted the tops of his legs with pancake makeup and proudly showed his "suntan" to astonished visitors like Princess Elizabeth of Yugoslavia. David Bailey, Christopher Isherwood, David Hockney, even Kenneth Anger all made pilgrimages and, though they may have laughed good-naturedly afterward, none laughed as hard as Tennant himself, who, after all, was in on the joke from the beginning. "To call Stephen affected," the artist Michael Wishart recalled, "would be like calling an acrobat a show-off, or a golden pheasant vulgar."

In his later years, as the antiques dealers circled outside his estate like vultures, waiting for the end, Tennant would sometimes stop traffic in nearby country towns by going shopping wearing tight pink shorts or a tablecloth as a skirt. His family had given up on him long before, exhibiting only "bemused resignation," a wonderful English trait sorely missing in America today. V. S. Naipaul may have described Tennant best when he noticed "the shyness that wasn't so much a wish not to be seen as a wish to be applauded on sight."

Friday, July 12, 2024

Are you a Yayoi or a Jomon? - Umehara Lectures on Japanese Studies, Takeshi Umehara and Kenji Nakagami, Design by Nobuyoshi Kikuchi
via:

Details You Should Include In Your Article On How We Should Do Something About Mentally Ill Homeless People

Suppose that you, an ordinary person, open your door and start choking on yellow smoke. You call up your representative and say “there should be less pollution”.

A technical expert might hear “there should be less pollution” and have dozens of questions. Do you just want to do common-sense things, like lower the detection threshold for hexamethyldecawhatever? Or do you want to ban tetraethylpentawhatever, which is vital for the baby formula food chain and would cause millions of babies to die if you banned it?

Any pollution legislation must be made of specific policies. In some sense, it’s impossible to be “for” or “against” the broad concept of “reducing pollution”. Everyone would be against a bill that devastated the baby formula supply chain for no benefit. And everyone would support a magical bill that cleaned the skies with no extra hardship on industry. In between, there are just a million different tradeoffs; some are good, others bad. So (the technocrat concludes), it’s incoherent to support “reducing pollution”. You can only support (or oppose) particular plans.

And yet ordinary people should be able to say “I want to stop choking on yellow smoke every time I go outside” without having to learn the difference between hexamethyldecawhatever and tetraethylpentawhatever.

I think you’re supposed to imagine the environmentalists’ experts and the industries’ experts meeting policy-makers and hammering out a compromise, then moving one direction or another along the Pareto frontier based on how loudly normal people protest pollution.

But if you’ve been demanding an end to pollution for years, and nothing has happened, then it might be time to hit the books, learn about hexamethyldecawhatever, and make sure that what you’re demanding is possible, coherent, and doesn’t have so many tradeoffs that experts inevitably recoil as soon as they have to think about the specifics.

II.

I’m not a pollution expert, but I’m a psychiatrist, and I’ve been involved in the involuntary commitment process. So when people say “we should do something about mentally ill homeless people”, I naturally tend towards thinking this is meaningless unless you specify what you want to do - something most of these people never get to.

Let’s start with a summary of the current process for dealing with disruptive mentally ill homeless people:
  1. A police officer sees a mentally ill homeless person and assesses them as disruptive. Technically the officer should assess whether the person is “a danger to themselves or others”, but in practice it’s all vibes. They bring this person to the ER of a hospital with a psychiatric ward.
  2. In the ER, psychiatrists evaluate the person. If some number of doctors, psychiatrists, and others (it varies on a state-by-state basis, and most people defer to the first psychiatrist anyway) agree the person is a “danger to themselves or others”, they can involuntarily commit them. Psychiatrists know lots of tricks for getting the evaluation result they want. For example, wasn’t the person brought in by the cops? Aren’t cops infamous for shooting mentally ill people? Sounds like whatever they did to attract the cops’ attention put them at risk of getting shot, which makes them a “danger to themselves or others”. Again, in reality this is all vibes.
  3. The patient gets committed to the hospital. The hospital makes an appointment with a judge to legally evaluate the commitment order. But realistically the appointment is 4-14 days out (depending on the state), and by then the patient may well be gone anyway, in which case the hearing can be cancelled. If it does go to trial, the judge will always defer to the psychiatrists, because they’re experts trying to do a tough and socially important job, and the defendant is represented by an overworked public defender who has devoted 0.01 minutes of thought to this case. This is part of why everyone feels comfortable making commitment decisions on vibes.
  4. If the patient seems psychotic, the doctors start them on antipsychotic drugs. These take about 2-4 weeks to make people less psychotic. But one of their side effects is sedation, that side effect kicks in right away, and heavily-sedated people seem less psychotic. So realistically the person will stop seeming psychotic right away.
  5. After a few days, the hospital declares victory and discharges the patient with a prescription for antipsychotics and an appointment with an outpatient psychiatrist who can continue their treatment.
  6. The patient stops taking the antipsychotics almost immediately. Sometimes this is because they’re having side effects. Other times it’s because they’re still psychotic and making irrational decisions. But most of the time, it’s because some trivial hiccup comes up in getting the prescription refilled, or in getting to the doctor’s appointment. Nobody likes dealing with healthcare bureaucracy, but semi-psychotic homeless people are even worse at this than most people. Social services can sometimes help here, but other times they’re just another bureaucracy that’s hard to deal with, and it usually doesn’t take long for something to slip through the cracks.
  7. Repeat steps 1-6 forever.
This isn’t going to win any of the people involved Doctor Of The Year awards. I’m sympathetic to attempts to change the system. But it’s hard to find the right point of leverage. (...)

Okay, then can you just make it a crime to be mentally ill, and throw everyone in prison? According to NIMH, 22.7% of Americans have a mental illness, so that’s a lot of prisoners. “You know what I mean, psychotic homeless people in tents!” Okay, fine, can you make homelessness a crime? As of last month, yes you can! But before doing this, consider:
  • In San Francisco, the average wait time for a homeless shelter bed is 826 days. So people mostly don’t have the option to go to a homeless shelter. If you criminalize unsheltered homelessness, you’re criminalizing homelessness full stop; if someone can’t afford an apartment or hotel, they go to jail.
  • Most (?) homeless people are only homeless for a few weeks, and 80% of homeless people are homeless for less than a year. If someone was going to be homeless for a week, and instead you imprison them for a year, you’re not doing them or society any favors.
  • How long should prison sentences for homelessness be? Theft is a year, so if homelessness is more than that, it becomes rational for people to steal in order to make rent. And realistically it will take police years to arrest all of the tens of thousands of homeless people, so if a sentence is less than a year, then most homeless people will be on the street (and not in prison) most of the time, and you won’t get much homelessness reduction.
  • What’s your plan for when homeless people finish their prison sentence? Release them back onto the street, then immediately arrest them again (since there’s no way they can suddenly generate a house while in prison)? Connect them to social services in some magical way such that the social service will give them a house within 24 hours of them getting out of prison? If such magical social services exist, wouldn’t it be cheaper and more humane to invoke them before putting someone in prison?
I admit that if you’re willing to be arbitrarily cruel and draconian (life sentence for someone and their entire family the moment the bank forecloses on their home!) you can make this one “work”. But anything less than that and it becomes just another confusing bad option.

In practice, the government tries some combination of these things. Sometimes they fiddle around the edges of inpatient commitment law. Sometimes they give people free houses. Sometimes they threaten them with Involuntary Outpatient Commitment orders. Sometimes they throw them in prison. Most of these things work a little. Some of them could work better with more funding.

Nobody thinks the current system is perfect. I respect people who want to change it. But you’ve got to propose a specific change! Don’t just write yet another article saying “the damn liberals are soft on the mentally ill”.

The damn liberals are soft because some of them are the people who have to develop an alternative plan, and they can’t think of a good one. If you’re going to write yet another article like this, and you want to change minds, you should skip the one hundred paragraphs about the damn liberals, and go straight to the part where you explain how you plan to do better.

by Scott Alexander, Astral Codex Ten |  Read more:
Image: Luis Sinco/Los Angeles Times via Getty Images via

Al Green

The ’70s were different. Al Green on Soul Train
via:
[ed. Along with Let's Stay Together, here's another one covered by a few bands over the years, written by Al Green and Teenie Hodges:]

Magnum P.I.


Magnum, P.I. (CBS, 1980-1988)
via:
[ed. Pretty much the entire series in one minute... used to see them filming around Oahu and would stop and watch every now and then. Great show.]


Onigiri and Ochazuke by Kimiko Ono,
Taste of Japan Series 8,
Kosei Publishing, Design by Tetsuo Nakamura, Cover Illustration by Masayuki Miyata

Wild Vegetable Dishes by Sadako Wakabayashi,
Taste of Japan Series 12,
Kosei Publishing, Design by Tetsuo Nakamura, Cover Illustration by Masayuki Miyata
via: here/here

Postmen in the Mountains

An old postman has spent his whole life delivering mail to the mountains of Hunan and is about to retire. His only son is due to take over his duties. As father and son journey through the mountains, the son begins to appreciate the toil and burden his father has to bear as postman for the villagers, and the old postman is also deeply moved as his son relates his mother's anxiety as she waits for him to return home from every trip. (IMDb)

***
The father prepares the postbag the night before, arranging the mail in the order it will be needed, and wrapping everything carefully against the possibility of bad weather. This will be the last time he packs the bag, and the first time the route will be carried by his son -- who is inheriting his job.

The next morning unfolds awkwardly. The boy's mother is worried: Will he find the way? Will he be safe? The father (Teng Rujun) is unhappy to end the job that has defined his life. But his son (Liu Ye) will be accompanied by the family dog, who has always walked along with the father and knows the path. It is a long route, 112 kilometers through a mountainous rural region of China, and the trip will take three days. The son shoulders the bag and sets off, and then there is a problem: The dog will not come along. It looks uncertainly at the father. It runs between them. It is not right that the son and the bag are leaving, and the father is staying behind.

This is the excuse the father is looking for, to walk the route one last time and show his son the way. The two men and the dog set off together in Huo Jianqi's "Postmen in the Mountains," a film so simple and straightforward that its buried emotions catch us a little by surprise. (...)

The trek represents the longest time father and son have ever spent together; the boy was raised by his mother while his father was away, first for long periods, then for three days at a time. They've never even had much of a talk. Now the son observes that his father, who seemed so distant, has many friendships and relationships along the way -- that he plays an important role, as a conduit to the outside world, a bringer of good news and bad, a traveler in gossip, a counselor, adviser and friend.

The villages and isolated dwellings are located in a region that must have been chosen for its astonishing beauty. There are no factories, freeways or fast food to mar the view, and the architecture has the beauty that often results when poverty and necessity dictate the function, and centuries evolve the form. The dog seems proud to show these things to his new young master. (...)  ~  Lives of 'Postmen' all about the journey (Roger Ebert review).