Monday, July 22, 2024

Jack White


"Put bluntly, No Name is a rock record – an incredibly satisfying one. It sounds more like the White Stripes than anything White has cut since that band’s demise – its 13 songs are driven by the blues, his playing sounding like the bastard son of Elmore James and Jimmy Page, swinging between bare-knuckled riffs and sweet slide-guitar with a switchblade edge. The instrumentation is pared back to only what matters, what’s necessary. The drumming often channels the magical primordial stomp of the sorely missed Meg White’s poetic, bone-simple playing.

The album is dark, heavy, thrilling, beautiful." ~  Jack White: No Name review (Guardian)

[ed. Yow. A great one. Reminds me a bit of Jon Spencer Blues Explosion (Orange). Listen to Side A and Side B in their entirety.]

Can Glen Powell be a Movie Star in a Post-Movie-Star Era?

The Twisters actor’s career explains a lot about the state of the industry.

Actor Glen Powell's parents hold up signs behind him as they attend the special screening of Hit Man at the Paramount Theatre in Austin, Texas, on May 15, 2024.

A few weeks ago, a Reddit poster decided to ask about which actors audiences were being “force fed to accept” as movie stars. They had what they felt was a prime example at their fingertips: Glen Powell.

“I feel like this guys [sic] is everywhere doing anything,” the poster mused. Yet they found Powell’s work to be “all just Meh.”

This is Glen Powell’s summer. After spending decades in the Hollywood trenches, Powell is now the star of Twisters, out this week, and of Hit Man, now streaming on Netflix, which he also co-wrote and produced. He’s got big glossy profiles in GQ, the Hollywood Reporter, and Vanity Fair. He’s been anointed, crowned, and feted as the next big thing. (...)

Part of why Powell’s sudden rise feels so notable is its strangely retro vibe. Today’s ambitious young actors, like Timothée Chalamet and Florence Pugh, usually flit back and forth between Marvel or some other big action series — to build their names and paychecks — and quirky off-beat films made by auteurs that will get them critical recognition. Powell, in contrast, has stuck to the genres that conventional wisdom has long held were dead: Romantic comedies. Middlebrow adult dramas not based on an existing franchise. You know, ’90s kind of stuff.

“I’m working to try to be you,” Powell told Tom Cruise when he was cast in a supporting role in Top Gun: Maverick, according to an interview in the Hollywood Reporter earlier this year. But Powell also seems to know that his dream is unlikely because the industry doesn’t really make Tom Cruises anymore.

“First of all, there will never be another Tom Cruise,” he continued in the profile. “That is a singular career in a singular moment, but also movie stars of the ’80s, ’90s, early 2000s, those will never be re-created.”

All the same, Powell looks an awful lot like he’s going to make a play for it — by sheer force of will, if necessary. After all, he’s had a lot of practice. (...)

Tom Cruise became a movie star in the raunchy coming-of-age sex comedy Risky Business, his signature commitment powering him through the iconic scene where he dances around in his underwear. Julia Roberts became a movie star when she flashed her megawatt smile at the camera in the cheesy-but-satisfying Mystic Pizza. These were movies that weren’t stupid but weren’t particularly challenging either, simple and goofy mid-budget fare that almost anyone would want to see.

In the late 2000s going into the 2010s, Hollywood pretty much stopped making that kind of movie. DVDs and then streaming, along with the rise of prestigious cable shows, eroded the audience. As the domestic box office collapsed, the international market became more important, driving a push toward spectacle-laden action franchises. The only thing reliably making money anymore was the ascendant Marvel Cinematic Universe, which in the early 2010s was just entering the so-called Phase 2.

The new financial path for studios became: Focus most of your money on a big flashy action franchise, ideally one based on familiar IP with a built-in fanbase. Allow some money on the side for movies that have a solid chance at the Oscars. Let a more intimate movie get made here and there, but give it a budget that looks like a rounding error, which means it won’t have any stars. Mid-budget movies? Those are for streaming. (...)

Powell, meanwhile, had his sights set on the biggest ’90s throwback of all: Tom Cruise’s new Top Gun sequel. Powell auditioned for the crucial role of Goose’s son and, once again, got close, he told GQ. Not close enough: The part went to Miles Teller. Still, Cruise, who liked Powell’s screen test, offered him the part of Slayer, the equivalent of the Val Kilmer role from the original movie.

Powell said no. He didn’t think Slayer worked in the script. The kid in the tux in him who had put in a lot of time analyzing the way movies worked foresaw himself ending up all over the cutting room floor.

Cruise felt strongly enough about Powell’s potential that he personally called him to give him career guidance. If Powell really wanted to be the next Tom Cruise, he told him, the key wasn’t to pick a great role. It was to pick a great project and then make the role great. He got Powell to sign on as Slayer, and then he got Slayer rewritten into a new character, now called Hangman, who would fit Powell’s smarmy golden boy skill set.

Top Gun: Maverick was the first blockbuster of the post-pandemic era. It was also definitively Tom Cruise’s hit. Powell’s turn as Hangman wasn’t on the cutting room floor, but it wasn’t central enough to the film to be part of the narrative of its success. (...)

If Hollywood stops making movie stars, can you DIY one?

If this story makes it sound like Glen Powell is an underdog, that’s inaccurate, in the same way it was inaccurate to push that narrative about Armie Hammer a few years back. Powell is a tall and handsome white dude who could afford to stick it out through a decade or so of under-employment because he was getting mentored by Denzel Washington and Tom Cruise. He’s not an underdog. He’s doing a different thing.

The thing about Glen Powell that comes through most strongly in profiles is this: You have never read a more earnest celebrity interview than the ones he gives. This man keeps a bingo board where he tracks all the character types he wants to play. He’s currently finishing his final college credits because he thinks it would mean a lot to his mom. He’s got a book he calls an icon wisdom journal he fills with advice from his mentors, most notably Cruise. He wore that tux. He’s a hard worker who is very earnest about the value of hard work.

Powell mostly masks this earnestness by playing insufferable assholes, less a Chris Pratt than a Matt Czuchry. It may be that the closest fit onscreen to Powell’s real personality is the before character in Hit Man, mild-mannered philosophy professor Gary, before he transforms himself into a cold-blooded killer.

Yet ironically, Gary pre-transformation is one of Powell’s least convincing performances. Powell doesn’t seem to know how to fold his broad shoulders in or soften his big Hollywood grin so that he looks less than confident, even when the character he’s playing is lecturing a bored class of college students or letting his co-workers mock him to his face. Part of the reason Powell pops is that whenever he shows up on camera, he gives every evidence of believing he belongs there.

by Constance Grady, Vox | Read more:
Image: Sergio Flores/AFP via Getty Images
[ed. See also: Netflix’s totally delightful Set It Up proves just how durable the romcom formula is (Vox). And, the original Hit Man article here (Texas Monthly):]
***
"On a nice, quiet street in a nice, quiet neighborhood just north of Houston lives a nice, quiet man. He is 54 years old, tall but not too tall, thin but not too thin, with short brown hair that has turned gray around the sideburns. He has soft brown eyes. He sometimes wears wire-rimmed glasses that give him a scholarly appearance.

The man lives alone with his two cats. Every morning, he pads barefoot into the kitchen to feed his cats, then he steps out the back door to feed the goldfish that live in a small pond. He takes a few minutes to tend to his garden, which is filled with caladiums and lilies, gardenias and wisteria, a Japanese plum tree, and rare green roses. Sometimes the man sits silently on a little bench by the goldfish pond, next to a small sculpture of a Balinese dancer. He breathes in and out, calming his mind. Or he goes back inside his house, where he sits in his recliner in the living room and reads. He reads Shakespeare, psychiatrist Carl Jung, and Gandhi. He even keeps a book of Gandhi’s quotations on his coffee table. One of his favorites is “Non-violence is the greatest force at the disposal of mankind. It is mightier than the mightiest weapon of destruction devised by the ingenuity of man.”

He is always polite, his neighbors say. He smiles when they see him, and he says hello in a light, gentle voice. But he reveals little about himself, they say. When he is asked what he does for a living, he says only that he works in “human resources” at a company downtown. Then he smiles one more time, and he heads back inside his house.

What the neighbors don’t know is that in his bedroom, next to his four-poster bed, the man has a black telephone, on which he receives very unusual calls.

“We’ve got something for you,” a voice says when he answers. “A new client.”

“Okay,” the man says.

The voice on the other end of the line tells him that a husband is interested in ending his marriage or that a wife would like to be single again or that an entrepreneur is ready to dissolve a relationship with a partner.

The man hangs up and returns to his recliner. He thinks about what service he should offer his new client. A car bombing, perhaps. Or maybe a drive-by shooting. Or he can always bring up the old standby, the faked residential burglary.

As he sits in his recliner, his cats jump onto his lap. They purr as he strokes them behind their ears. The man sighs, then he returns to his reading. “Always aim at complete harmony of thought and word and deed,” wrote Gandhi. “Always aim at purifying your thoughts and everything will be well.”

The man’s name is Gary Johnson, but his clients know him by such names as Mike Caine, Jody Eagle, and Chris Buck. He is, they believe, the greatest professional hit man in Houston, the city’s leading expert in conflict resolution. For the past decade, more than sixty Houston-area residents have hired him to shoot, stab, chop, poison, or suffocate their enemies, their romantic rivals, or their former loved ones." (...)

“Except for one or two instances, the people I meet are not ex-cons,” says Johnson. “If ex-cons want somebody dead, they know what to do. My people have spent their lives living within the law. A lot of them have never even gotten a traffic ticket. Yet they have developed such a frustration with their place in the world that they think they have no other option but to eliminate whoever is causing their frustration. They are all looking for the quick fix, which has become the American way. Today people can pay to get their televisions fixed and their garbage picked up, so why can’t they pay me, a hit man, to fix their lives?”

Thursday, July 18, 2024


Dipsacus fullonum (Common teasel)
photo: markk
[ed. Also known as Golfcoursus roughus. To be avoided.]

Getting Along With Kids

“Wow, you have a thing with kids, they really like you.”

Is there such a thing as being gifted with ‘getting-along-with-kids’?

I don’t think so. My experience witnessing how people - family, friends and complete strangers - interact with my kids (2 and 4 y/o) has made me pretty clear-sighted about how most people - and I was probably that person - get it wrong.

As a parent, as a human, I can’t truly appreciate a person who doesn’t give a damn about my kids when they’re around. Pretty simple. The same way I’d find it rude if they were ignoring my partner at a dinner, ignoring children - a very common behaviour - is something that profoundly irritates and saddens me.

I also know that most people are just clueless about how to engage with them. And the same way we tend to run away from things that make us uncomfortable (other people’s grief probably a top one), we assume that we’d be better off avoiding any kind of interaction.

What if engaging with kids was a source of deep joy and plenitude? Children come with a pure, ingenuous will for playfulness that has so much to teach us, if we’re willing to learn.

On your marks, get seeeet, go!

intro • is a kid a human being yet?

OK, what if we started considering them for who they are: human beings with their own sense of self, carefully, unconsciously, watching us adults and learning how to behave from our crazy codes of conduct.

This little creature you so easily look down on is a future you. Let that sink in.

So here are common behaviours I noticed from people who “get-along-with-kids”:

1 • use the oldest icebreaker of all time: funny faces

How boring are we, adults, with our serious faces and looks. How about we stop taking ourselves so seriously? The stupid, funny face is telling a kid “Yo! We aren’t all boring, little one!”.

So smiling’s the trick? Yes! And - bam! - it’s one of those things in life that are free and extremely rewarding. Even when the person you smile at doesn’t smile back, you’ll have shared positive vibes. Good karma. Works with adults, works with kids.

And hey, remember, you ain’t the authority - someone else is - so relax!

2 • let kids come to you

The people who are best with children all tend to keep their distance at first. They aren’t trying to force contact or anything. They give kids space and time to make friends, especially when they’ve just woken up (we adults aren’t so different, are we?). Be patient. It’s a dance.

3 • enquire about them (and speak like… normally)

People who “get-along-with-kids” also enquire as though they actually care about the child: how their day has been so far, the kind of food they love most, the names of their best friends, their favorite animal… It doesn’t matter how old the child is, or whether they can yet verbally express all the things they want to say. Eye contact. Consideration. Empathy. You can’t imagine how much kids appreciate people truly engaging with them.

Oh, and kids are not stupid - generally. So why use that silly voice? Try it with an adult you meet for the first time, not sure how well they’ll engage!

4 • make room for astonishment

People who “get-along-with-kids” are convinced they can learn something from every single interaction with a kid, be it an activity, a song, a story, a game… Genuine questions. Humility. Astonishment.

We all know too well how great it feels to be asked questions, to be listened to, and to even trigger a reaction: “noooo way?!”, “seriously?!”

5 • enter their game, follow their lead

More often than not, kids won’t need us adults to come up with ideas for play. Which doesn’t necessarily mean they want to play solo. A sincere “can I play with you?” will sometimes surprise them - I love their faces when this happens! - for the best. Try it, you’ll see.

And just as improv teaches us - or so I am told - go all in for continuity: “Yes! And…” It is counterintuitive yet beautiful to let children take the lead. They have an intact creativity that our control-freak minds should be learning from. Not the other way round.

Only when things become dangerous (e.g. getting too close to road traffic) or inappropriate (e.g. saying something mean) should we break the playing flow.

6 • go all in

You’re at a café, catching up with a friend. You’re really into the conversation. Suddenly, your friend takes out her phone and… disengages. How do you feel? Pissed. Miserable. Disappointed. Not considered. Not worth their time. Sad.

How could that be any different from a child’s perspective? You’re there, talking, playing, and suddenly a thought, a social interaction, a chore, a notification gets in the way, telling the kid: “hold on, there is something more important than you right now.”

Kids won’t like us because we bring along expensive or fancy gifts. They’ll love being with us because we engage, because we are in, 100%.

A survey run by LEGO in 2018 revealed that 81% of kids wished their parents would play with them more. And what’s true of parents is true of adults in general.

“Given the positive effects it has on our wellbeing and happiness levels, family play should be the most important ‘homework’ of all,” says family expert and author Jessica Joelle Alexander.

7 • let go

Of our self-control, of our inhibition. How refreshing it feels to play with kids, allowing ourselves to be someone else for a moment, someone ten times younger.

So yes, I chase imaginary platypuses at the park. I sail a bed-boat. I order and savour the only available meal - pasta-pesto - from the best 3-year-old chef in town.

I miss playing, and I see kids as a beautiful opportunity to reconnect with play. How about we drop the fear of having to find fancy stuff to do and instead let go, give in to unstructured, random, imaginary play?

by Mathilde Baillet, A Wander Woman | Read more:
Image: markk
[ed. And wrestling (which is another form of hugging), they grow out of that so quickly. But especially don't talk down to them (#3). When I hear people "baby talking" to 5-6 year olds (or any age, really) it sets my teeth on edge. Plus, there's a reciprocal thing: they should respect your personhood as well, so feel free to assert yourself when they go beyond your boundaries. Just tell them when, and why. They'll respect that.]


via:

Vintage Italian lighting by designer Gino Vistosi

Orville Peck & Willie Nelson

[ed. Ha!]

Against Slop

Beyond the failure market in video games

It's usually understood that time wasted is art wasted. To edit down to the lean core, that’s often considered in most mediums the mark of quality (or, perhaps more accurately, and sometimes to a fault, professionalism). Historically, that’s been part of the cultural stigma against video games: not only is wasted time a given, it’s an integral part of the experience. Interactivity inverts the responsibility for moving the plot forward from storyteller to audience. A linear, self-propulsive story squanders the medium’s artistic potential. In games, the tension comes from not only the uncertainty about how the plot will resolve but also whether it even can. When the player fails, the story ends without ever having reached a conclusion. The work necessarily has to reset in some manner. Which creates a minor paradox: How can time discarded not also be time wasted? Isn’t this all noise and nonsense for the purpose of keeping a couch potato on the couch so long they sprout roots?

Repetition is usually dramatic poison, and it’s no wonder such failing without finality is erased from representations of gaming in other media. Whether in 1982’s Tron, or the “First Person Shooter” episode of The X-Files, or Gerard Butler’s turn in 2009’s Gamer, the “if you die in the game, you die for real” trope is understandable. Games can be complex, multifaceted cultural objects and are more frequently being covered that way, yet the accusation that games are action for the sake of action with little consequence or meaning is uncomfortably accurate much of the time. The source of the stigma stems from the early arcade days, when games primarily leveraged failure: every loss shook down the player for another quarter to send rattling into the machine. To beat the game and see it in its entirety took a mountain of coins, dedication, and skill—rendering play play, which is to say, largely divorced from narrative or the kinds of emotional experiences other art forms explored.

The pastime became less sport and more medium when home consoles and personal computers allowed games to experiment on a mass scale. Developers had no profit incentive to induce defeat once a cartridge or CD-ROM had been sold. Failure became instead the primary driver of tension within an emerging narrative, allowing story to flourish alongside gameplay from text adventures to action shooters. These stories were, save for those played with perfect skill, littered with loops. With every trap that fatally mangles a player in Tomb Raider, every police chase in Grand Theft Auto that ends in a brick wall instead of an escape, the narrative goes backward, the protagonist character’s story caught in a cyclical purgatory until the player-protagonist achieves a better result.

The sensation of breaking through those barriers is one of the most cathartic experiences that games offer, built on the interactivity that is so unique to gaming as a medium. Failure builds tension, which is then released with dramatic victory. But the accusation that these discarded loops are irrecuperable wastes of time still rings true, as modern game audiences have become comfortable consuming slop. In the past few years, games have trended toward becoming enormous blobs of content for the sake of content: an open world, a checklist of activities, a crafting system that turns environmental scrap into barely noticeable quality-of-life improvements. Ubisoft’s long-running Far Cry franchise has often been an example of this kind of format, as are survival crafting games like Funcom’s Conan Exiles or Bethesda’s overloaded wasteland sim Fallout 76. Every activity in a Far Cry or its ilk is a template activity that only comes to a finite end after many interminable engagements: a base is conquered, just to have three more highlighted. Failure here is a momentary punishment that can feel indistinguishable from success, as neither produces a sense of meaning or progress. These failure states are little moments of inattention and clumsy gameplay that lead only to repeating the same task better this time. Then when you do play better mechanically, you are rewarded with the privilege of repeating the same task, a tiny bit more interesting this time because the enemies are a little tougher in the next valley over. Within games that play for dozens of hours but are largely defined by mechanical combat loops that can last just seconds, everything can boil down to the present-tense experience so detrimentally that it’s hard to remember what you actually did at the end of those dozens of hours.

There is no narrative weight to liberating the same base in Far Cry across multiple attempts, no sense of cumulative progression to repeatedly coming at the same open-world content from different angles. There is only a grim resignation to the sunk-cost fallacy that, if you’ve already invested so much time into the damn thing already, you might as well bring it to some kind of resolution. Cranking up difficulty can make those present-tense moments more dramatic or stressful, but in the end it’s just adding more hours to the total playtime by adjusting the threshold for completing a given segment to a stricter standard. The game does not care if you succeed or fail, only that you spend time with it over its competitors.

As the industry creates limbos of success, the failure market itself has also mutated. See mobile gaming, a distorted echo of the coin-operated era, where players are behaviorally channeled to buy things like extra Poké Balls in Pokemon Go or “continue” permissions in Candy Crush and keep playing just a little longer. In 1980, failure cost a cumulative $2.8 billion in quarters; in 2022, the mobile games market made $124 billion by creating artificial barriers and choke points within their game mechanics, either making things more actively difficult or just slowing them down to prompt impulse spending.

In video games like the ubiquitous Fortnite or Blizzard’s recent Diablo 4, major releases often have “seasons” that heavily encourage cyclical spending. Every three months the game adds new content and asks the player to repeat the experience. The player exchanges between seven and twenty-five dollars to gild the stories they’ve already completed with extra objects, materials, and costumes—real money spent only for the privilege of sinking in the requisite time to acquire these virtual items, creating yet another loop of increasingly meaningless time usage. Fortnite came out in 2017. In 2023 the game generated, all by itself, a total of $4.4 billion in income. A sum larger than the GDP of some countries, generated in one year, six years after release, off the impulse not to look like a scrub with no money in front of your friends even if those friends are dressed as Peter Griffin and the Xenomorph from Alien.
---
“Live service” is used to describe these games that attempt to stay evergreen with seasonal systems and intermittent small content drops. These seasonal titles and mobile cash shops have created feedback loops and cyclical repetitions that, by design, do not resolve. In recent years, however, there has been a counterreaction that tries to integrate these consumerist tendencies in the pursuit of something greater. (...)

Besides its beautiful portrayal of a declining, melancholy world of perpetual autumn, what sets Elden Ring apart is its complexly layered difficulty. Elden Ring is quite eager to kill you, with a million ways to put a player down. But it is not meanly difficult, or insurmountably difficult. Most importantly, it is not difficult as part of a profit-seeking monetization loop. Instead, the failure states that are so often leveraged to extend playtime and coerce spending in most other games are here used as friction to build atmosphere. The constant starting again is exhausting, often stressful, sometimes infuriating. It is never meaningless, however: it confidently contradicts the worries of other mediums and the too-often-true accusations of slop with its deep understanding of how to create drama within any individual moment. Participating in its loops of death and rebirth as a player is to be fully within the Lands Between. Elden Ring presents a once-flourishing kingdom literally consumed by creeping nihilism and reflexive despair, which gives sympathetic resonance to the player’s determined and confident attempts to surmount these challenges. The most powerful or villainous enemies withdraw into themselves and let the world rot, while the weakest literally cower from the player, so exhausted by the idea of another painful death. Not the player, though: they exist in deliberate dramatic contrast to these characters by virtue of their own interactive participation with the world, making them the hero as both part of the text and as a meta-textual frame for the whole story.

By persisting in a world that trends downward, your own failures take on a defiant quality. The failure loop of the game incentivizes the player to loop again. This is where Elden Ring’s difficulty is particularly clever: because a player pays no consequence besides dropping experience points on the ground where they died, there is a hard limit on what the game can take away from them. There is an interactive choice and freedom even within these fail states, as you can abandon them or return again, fighting through all you had before; this in turn creates an incredible carrot-and-stick effect that, should you gamble on reclaiming your hard-won gains, doubles the stakes. While it is repeating the same content on the surface, there is a tangible and meaningful sense of cumulative progress and tactical variation on every death.

Once you’ve spent those points on an upgrade, that’s yours for the rest of the game—a permanent token of your dedication. A player is only ever risking the immediate next step, which adds weight to the fantasy of the gameplay, but not so much actual consequence that failure would crush a player’s spirit to continue. Holding onto your advancements even after dying and coming back makes your arc of progression stand in exciting contrast to the world around you. From a stagnant place, you are rising as something new, something vibrant. By incorporating these meta-textual elements into the mechanical play, there is a sense of individuality and ownership of the experience that more typical open-world check-listing games do not have. When I fail in Far Cry, it feels dramatically evaporative and impersonal. When I fail in Elden Ring, I feel like it’s because I made an active choice to risk something and I come back more engaged and determined than ever. (...)

The expansion’s price tag is less about monetizing the players than it is a reflection of the developmental effort involved. Elden Ring was certainly in a position to cash in at any time. The initial release was as successful as any game using more manipulative methods of extracting value. It was so popular that it sold twelve million copies in two weeks, moving on to over twenty million sold within a year of release. By any metric, but particularly by the metric where you multiply twenty million by the sixty-dollar retail price, the game was a massive success for art of any sort in any medium, doing so without relying on in-app purchases, artificial game resource scarcity, or making certain weapons and armor premium-purchase only.

For the health of video games as an artistic medium, this needs to be enough. That’s plenty of money. That’s such an enormous goddamn pile of money it even justifies the multimillion-dollar cost of developing modern flagship titles. Perhaps the problem with Elden Ring as an example is that it’s a masterpiece. It captured the imagination of millions. Games as an industry, instead of an artistic medium, don’t want that kind of success for only the games that are worthy of it. The industry needs to make money like that on the games built without subtlety, or craft, or heart. The industry needs to pull a profit off the slop too, and there is nothing they won’t gut or sell out to do it. If the old way was to tax failure, the new way is to dilute success, to treadmill the experience such that it never reaches a destination. (...)

This is just the era we live in, our own stagnant age in the Lands Between. With Disney and its subsidiaries sucking all of the air out of the room to repackage the same concept over and over, Hollywood has reached the stale conclusion that the same story can be told repetitively. The embrace of AI across multiple mediums just intensifies this dilution of what feels meaningful. 

by Noah Caldwell-Gervais, The Baffler | Read more:
Image: From Elden Ring | Bandai Namco

Tuesday, July 16, 2024

After 12 Years of Reviewing Restaurants, Pete Wells is Leaving the Table

Early this year, I went for my first physical in longer than I’d care to admit. At the time, I was about halfway through a list of 140 or so restaurants I planned to visit before I wrote the 2024 edition of “The 100 Best Restaurants in New York City.” It was a fair bet that I wasn’t in the best shape of my life.

My scores were bad across the board; my cholesterol, blood sugar and hypertension were worse than I’d expected even in my doomiest moments. The terms pre-diabetes, fatty liver disease and metabolic syndrome were thrown around. I was technically obese.

OK, not just technically.

I knew I needed to change my life. I promised I’d start just as soon as I’d eaten in the other 70 restaurants on my spreadsheet.

But a funny thing happened when I got to the end of all that eating: I realized I wasn’t hungry. And I’m still not, at least not the way I used to be. And so, after 12 years as restaurant critic for The New York Times, I’ve decided to bow out as gracefully as my state of technical obesity will allow.

Not that I’m leaving the newsroom. I have a couple more restaurant reviews in my back pocket that will appear over the next few weeks, and I plan to stick around at The Times long after that. But I can’t hack the week-to-week reviewing life anymore.

The first thing you learn as a restaurant critic is that nobody wants to hear you complain. The work of going out to eat every night with hand-chosen groups of friends and family sounds suspiciously like what other people do on vacation. If you happen to work in New York or another major city, your beat is almost unimaginably rich and endlessly novel.

People open restaurants for all kinds of reasons. Some want to conjure up the flavors of a place they left behind, and consider their business a success if they win the approval of other people from the same place. Others want to dream up food that nobody has ever tasted or even imagined before, and won’t be satisfied until their name is known in Paris and Beijing and Sydney.

And there are a hundred gradations in between. The city is a feast. Exploring, appreciating, understanding, interpreting and often even enjoying that feast has been the greatest honor of my career. And while the number of restaurant critics is getting smaller every year, everybody I know who works in this endangered profession would probably say the same thing.

So we tend to save our gripes until two or three of us are gathered around the tar pits. Then we’ll talk about the things nobody will pity us for, like the unflattering mug shots of us that restaurants hang on kitchen walls and the unlikable food in unreviewable restaurants.

One thing we almost never bring up, though, is our health. We avoid mentioning weight the way actors avoid saying “Macbeth.” Partly, we do this out of politeness. Mostly, though, we all know that we’re standing on the rim of an endlessly deep hole and that if we look down we might fall in.

“It’s the least healthy job in America, probably,” Adam Platt said recently when I called him to discuss the unmentionable topic. Mr. Platt was New York magazine’s restaurant critic for 24 years before stepping away from the trough in 2022.

“I’m still feeling the effects,” he said. He has a flotilla of doctors treating him for gout, hypertension, high cholesterol and Type 2 diabetes.

“I never ate desserts but when I took the job I started eating desserts,” he said. “I became addicted to sugar. You drink too much. You’re ingesting vastly rich meals maybe four times a week. It’s not good for anybody, even if you’re like me and you’re built like a giant Brahman bull.”

We talked about the alarming frequency with which men in our line of work seem to die suddenly, before retirement age. A.A. Gill, restaurant critic of the Sunday Times of London, was killed by cancer at 62. Jonathan Gold, critic for the Los Angeles Times and LA Weekly, died at 58, right after he was diagnosed with pancreatic cancer. Back in 1963, A.J. Liebling of The New Yorker died after checking into a hospital for bronchial pneumonia. He was 59.

These are isolated stories to be sure, but I’d see the headlines projected on my bedroom ceiling when I woke up in the night with my insides burning like a fire at a chemical refinery.

The women I looked up to lasted longer. Gael Greene, who invented Mr. Platt’s job at New York, lived to 88. Mimi Sheraton, critic for Cue, The Village Voice and The New York Times, made it to 97, despite a professed aversion to exercise.

Christiane Lauterbach, a restaurant critic for Atlanta magazine for more than 40 years, told me she is in good health. She attributes that to “not going to the doctor,” although she was recently talked into having her cholesterol and blood sugar tested. (Both were normal.) “I just take little bites of this and that. I never finish a plate in a restaurant,” she said. “If I finished my plate, I would just be 300 pounds.”

S. Irene Virbila, who ate out six nights a week for 20 years as restaurant critic for the Los Angeles Times, used to bring along a man to finish her plates. She called him Hoover.

“Restaurant food is rich,” she said. “To make those flavor bombs it has to have a lot of rich elements. It’s more of everything than you would eat if you could eat exactly what you wanted.”

After she left the post, she lost 20 pounds in two months, “without thinking about it.” Today, aside from taking medication for an inherited vulnerability to cholesterol, she is in good health.

Virtually all of my 500 or so reviews were the result of eating three meals in the place I was writing about. Typically, I’d bring three people with me and ask each to order an appetizer, main course and dessert. That’s 36 dishes I’d try before writing a word.

This is the simple math of restaurant reviewing, but there is a higher math. Critics eat in a lot of restaurants that Gael Greene once described as “neither good enough nor bad enough” for a review.

Then there are the reference meals, the ones we eat to stay informed, to not be a fraud. Often, this is where I got into real trouble. How many smash burgers did I need to taste, or taste again, before I could write about the ones at Hamburger America, a restaurant I reviewed in the same months I was eating my way toward my “100 Best Restaurants” list, for which I needed to make sure that the Uyghur hand-pulled noodles and Puerto Rican lechon asado and Azerbaijani organ-meat hash that I loved were, at least arguably, the best in the city?

This is probably the place to mention that naming 100 restaurants was totally my idea. My editors had asked for 50, and I’ll bet they would have settled for 25. When I did do 100, and the time came a year later to do it again, they didn’t ask me to go back to all of them. That was my idea, too.

Omnivorousness, in the metaphorical sense, is a prerequisite for a good critic. My favorite movie critic is still Pauline Kael, who wrote as if she had seen every film ever made. But movies won’t, as a rule, give you gout.

Food writing’s most impressive omnivore was Jonathan Gold. There didn’t seem to be a dish served anywhere in Los Angeles that he hadn’t eaten at least once, and usually several times, until he was sure he understood it. His knowledge inspired me. It also tormented me — there was no way to catch up to him.

Years ago, he used to tell people he had eaten every taco on Pico Boulevard. This was merely an appetizer. His larger goal was to eat in every restaurant on the street “at least once.”

Pico Boulevard is more than 15 miles long.

I have not eaten in every restaurant on Roosevelt Avenue in Queens, far and away the most significant taco artery in my own city. There have been nights, though, as I walked for miles under the elevated No. 7 train, watching women press discs of fresh masa and men shave cherry-tinted strips of al pastor pork from slowly revolving trompos, when it seemed like an excellent idea.

At a certain point, this kind of research starts to look like a pathology. (...)

When I first came to The Times in 2006, a reporter warned me not to identify myself too heavily with my work. “Any job at The Times is a rented tux,” she said.

I nodded, but didn’t get the point until this year.

by Pete Wells, NY Times |  Read more:
Image: Liz Clayman for The New York Times
[ed. So many great reviews. Here are a couple: Senor Frog's and Guy Fieri.]

Randoseru: The Book Bag That Binds Japanese Society


In Japan, cultural expectations are repeatedly drilled into children at school and at home, with peer pressure playing as powerful a role as any particular authority or law. On the surface, at least, that can help Japanese society run smoothly.

During the coronavirus pandemic, for example, the government never mandated masks or lockdowns, yet the majority of residents wore face coverings in public and refrained from going out to crowded venues. Japanese tend to stand quietly in lines, obey traffic signals and clean up after themselves during sports and other events because they have been trained from kindergarten to do so.

Carrying the bulky randoseru to school is “not even a rule imposed by anyone but a rule that everyone is upholding together,” said Shoko Fukushima, associate professor of education administration at the Chiba Institute of Technology.

On the first day of school this spring — the Japanese school year starts in April — flocks of eager first graders and their parents arrived for an entrance ceremony at Kitasuna Elementary School in the Koto neighborhood of eastern Tokyo.

Seeking to capture an iconic moment mirrored across generations of Japanese family photo albums, the children, almost all of them carrying randoseru, lined up with their parents to pose for pictures in front of the school gate.

“An overwhelming majority of the children choose randoseru, and our generation used randoseru,” said Sarii Akimoto, whose son, Kotaro, 6, had selected a camel-colored backpack. “So we thought it would be nice.”

Traditionally, the uniformity was even more pronounced, with boys carrying black randoseru and girls carrying red ones. In recent years, growing discussion of diversity and individuality has prompted retailers to offer the backpacks in a rainbow of colors and with some distinctive details like embroidered cartoon characters, animals or flowers, or inside liners made from different fabrics.

Still, a majority of boys today carry black randoseru, although lavender has overtaken red in popularity among girls, according to the Randoseru Association. And aside from the color variations and an increased capacity to accommodate more textbooks and digital tablets, the shape and structure of the bags have remained remarkably consistent over decades.


The near totemic status of the randoseru dates back to the 19th century, during the Meiji era, when Japan transitioned from an isolated feudal kingdom to a modern nation navigating a new relationship with the outside world. The educational system helped unify a network of independent fiefs — with their own customs — into a single nation with a shared culture.

Schools inculcated the idea that “everyone is the same, everyone is family,” said Ittoku Tomano, an associate professor of philosophy and education at Kumamoto University.

In 1885, Gakushuin, a school that educates Japan’s imperial family, designated as its official school bag a hands-free model that resembled a military backpack from the Netherlands known as the ransel. From there, historians say, the randoseru quickly became Japan’s ubiquitous marker of childhood identity. (...)

Grandparents often buy the randoseru as a commemorative gift. The leather versions can be quite expensive, with an average price of around 60,000 yen, or $380.

Shopping for the randoseru is a ritual that starts as early as a year before a child enters first grade.

At Tsuchiya Kaban, a nearly 60-year-old randoseru manufacturer in eastern Tokyo, families make appointments for their children to try on different-colored models in a showroom before placing orders to be fulfilled at the attached factory. Each bag is assembled from six main parts and takes about a month to put together. (...)


Each Tsuchiya Kaban bag comes with a six-year guarantee on the assumption that most students will use their randoseru throughout elementary school. As a memento, some children choose to turn their used bags into wallets or cases for train passes once they graduate.

In recent years, some parents and children’s advocates have complained that the bags are too burdensome for the youngest children. Randoseru can cover half of the body of a typical first grader. Even unloaded, the average bag weighs about three pounds.

Most schools do not have personal lockers for students or much desk storage space, so students frequently carry textbooks and school supplies back and forth from home. And in a culture that puts a high value on hard work, patience, perseverance and endurance, the movement to relieve children of the randoseru burden hasn’t gotten very far.

“Those who have no heart say that ‘recent children are weak; back in our day we carried around those heavy bags,’” said Ms. Fukushima, the education professor.

A few manufacturers have developed alternatives that retain the randoseru shape while using lighter materials like nylon. But these have been slow to gain traction. (...)

At the end of the day, Kaho Minami, 11, a sixth grader with a deep-red randoseru stitched with embroidered flowers that she had carried throughout elementary school, said she never yearned for any other kind of bag. “Because everyone wears a randoseru,” she said, “I think it is a good thing.”

by Motoko Rich, Hisako Ueno, and Kiuko Notoya, NY Times | Read more:
Images: Noriko Hayashi
[ed. Back in the day in Hawaii when I was in grade school, everybody had plastic Pan Am or Hawaiian Airlines bags - square, two handles, side logo (where did we get them from?). Either that, or boys would just carry their books sidearm and girls would clutch them to their chests (always - you never wanted to be caught doing the opposite!).]

Monday, July 15, 2024

Permanent Crisis

Myopic responses perpetuate the “opioid epidemic”

To express the ambient feeling that “things are getting worse,” there exists, of course, a meme. It plots iterations of a chart, and on its x-axis floats the disembodied, smiling face of President Ronald Reagan. After his inauguration, watch the data veer up and off into oblivion: from health care spending, executive pay, and the size of the federal government, to the privatization of public services, social isolation, and economic inequality. The bottom line: only half of babies born in 1980—today’s forty-four-year-olds—will make as much money as their parents did.

I was surprised, then, to learn that publicists for the Sackler family—the owners of Purdue, which manufactures OxyContin, and, as the purported architects of the “opioid epidemic,” the epitome of contemporary capitalist villainy—presented a Reaganesque chart in a 2021 PR offensive called “Judge For Yourselves.” The project aimed to “correct falsehoods” and push back against a tidal wave of press that presented OxyContin as the epidemic’s singular culprit. Purdue, to be sure, did not literally present a chart with a smiling Reagan, but they might as well have.

This chart was designed by two infectious disease modelers, Hawre Jalal and Donald S. Burke, who made a grim discovery while examining the leading causes of death in America. They plotted drug-overdose deaths from 1979 to 2016, and what they found was utterly baffling: deaths consistently rose 7 percent each year, doubling every eight to ten years, for more than four decades. Nothing else—not gun deaths, not suicide, not AIDS, not car crashes—adheres to an exponential curve for this long. Since 1999, more than one million people have died from overdoses.
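
[ed. A quick sanity check of the exponential claim above; the arithmetic here is mine, not the article's. A quantity growing at a steady annual rate r doubles in

$$T_{\text{double}} = \frac{\ln 2}{\ln(1+r)}, \qquad \text{so at } r = 7\%:\ T \approx \frac{0.693}{0.0677} \approx 10.2 \text{ years},$$

while a rate closer to 9 percent doubles in roughly 8 years, which squares with the "doubling every eight to ten years" figure Jalal and Burke report.]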

But in the United States, we don’t tend to think of this decades-long emergency as a continually accelerating death toll; it gets framed as a series of discrete, though sometimes overlapping, epidemics, implying a predictable arc that spikes, plateaus, and eventually falls. First, as the New York Times warned on the front page in 1971, there was a “G.I. heroin addiction epidemic” in Vietnam. The drug’s use was also on the rise in places like New York, where, in the following year, at least 95 percent of those admitted to drug addiction treatment reported using it. The crack cocaine epidemic arrived in the next decade, followed by a rise in the use of methamphetamines, which the late senator Dianne Feinstein would call the “drug epidemic of the nineties.” But these were soon displaced in the popular imagination by OxyContin, which hit the market in 1996 and set off successive waves of what came to be known as the opioid epidemic, something we’re still struggling through. The past forty-five years of drug use in America does not match this relatively tidy narrative—in reality, there’s a beginning and middle, with no end on the horizon.

But in a strange way, this exponential curve told a story the Sackler family could get behind, one that made them look less culpable: How could Purdue be responsible for the opioid epidemic if overdose deaths were rising for more than a decade before OxyContin was even brought to market? “We were contacted by [Purdue] lawyers,” Burke told me. “It was my sense that they would like us to testify that it wasn’t their fault.” They declined the offer.

Still, Purdue was right about something. Drug mortality in America neither begins nor ends with the company’s actions. What pharmaceutical manufacturers, drug distributors, insurance companies, doctors, and pharmacies—the entire profit-mad medical system—collectively accomplished was to accelerate a train that was already speeding off the rails. But it’s hardly an absolution to argue that you did not start the fire, only poured gasoline on it for personal gain. With corporate power unchallenged and regulators asleep at the wheel, drug markets, like so many other consumer markets, have become more deadly, more dangerous, and, despite decades of aggressive and costly drug enforcement, more ubiquitous.

Jalal and Burke’s finding also presented a paradox. How could four decades of seemingly distinct epidemics—from heroin and cocaine to meth and fentanyl—aggregate into one giant wave of death? How is this wave still gaining power, and when will it crash? When we zoom out, we have what looks less like a collection of epidemics involving a series of novel, addictive drugs, and something more like a chronic social crisis exacerbated by market conditions. Underlying sociological and economic drivers must be at work.

“We can come up with explanations that are specific to some era,” Peter Reuter, a veteran drug policy researcher, told me. For instance, consider how in the 1970s, cocaine manufacturing and trafficking networks in Latin America advanced alongside growing demand for the drug in America. “But then, it’s very hard to find something that goes on for forty-five years now.” David Herzberg, a historian of the pharmaceutical industry and author of White Market Drugs: Big Pharma and the Hidden History of Addiction in America, has an idea. He proposes that drug markets are behaving the way other consumer markets have since the neoliberal turn, when “free enterprise” was unleashed to work its unholy magic. “The rise in overdoses tracks a time period in which corporations that organize human labor and human activity were increasingly given carte blanche,” Herzberg told me. “While OxyContin is an example of a corporation taking advantage of this,” he said, “Purdue didn’t create the conditions that enabled it to do what it did.” Hence the irony of the Sackler family’s lawyers holding up a chart where time begins in 1979.

Across this period, illicit market innovations have mirrored many of the same ones seen in legal markets: sophisticated supply chains, efficiencies in manufacturing, technological advances in communications and transportation, and mass production leading to lower prices. Meanwhile, the social dislocation and alienation of consumer society has left millions of Americans unmoored, adrift, or otherwise floundering.

Contrary to popular rhetoric, drug addiction is not the cause of poverty but one of its chief consequences. Studying the dynamics of crack houses in New York and open-air drug markets in Kensington, Philadelphia, the ethnographer Philippe Bourgois found a pattern of lives scarred by a combination of state neglect and violence: abusive childhoods, crumbling schools, abandoned neighborhoods, all aided by government-incentivized white flight. The historian Nancy Campbell, author of OD: Naloxone and the Politics of Overdose, uses the phrase “unlivable lives” when talking about the increasing immiseration of Americans. “Drugs are powerful ways people use to mitigate their circumstances,” Campbell told me. Opioids work as a salve for pain both physical and psychic. (...)

The public is led to believe that the usual responses to epidemics will somehow work for drug addiction: isolate, quarantine, and treat the sick. This almost always means criminalization, incarceration, and compulsory treatment—or else bizarre interventions like the Department of Defense’s quixotic search for a fentanyl “vaccine.” The endless declaration of one drug epidemic after another also perpetuates a blinkered state of emergency, necessitating the spectacle of a disaster response to yet another drug “outbreak.” This not only forecloses the possibility of a response that’s actually effective, it precludes a deeper understanding of the role of drugs in American life. What Jalal and Burke’s exponential curve lays bare is the accumulation of our long, slow, and violent history. (...)

The idea that we’re living through exceptional times isn’t exactly wrong. The mathematics and physics of fentanyl are unprecedented. The total amount of the synthetic opioids consumed in the United States each year is estimated to be in the single-digit metric tons. By comparison, Americans annually consume an estimated 145 tons of cocaine and 47 tons of heroin. That means all the fentanyl consumed by Americans in just one year can fit inside a single twenty-foot cargo container. Some fifty million shipping containers arrive in America by land, air, and sea every year. Because fentanyl is so potent—with doses measured in micrograms—very small amounts can supply vast numbers of customers. Counterfeit fentanyl pills contain about two milligrams of fentanyl. There are 28,350 milligrams in an ounce, which means one dose amounts to one ten-thousandth of a single ounce. Authorities could barely keep up with cocaine and heroin. To say fentanyl detection is like finding a needle in a haystack is to vastly underestimate the scale of the problem before us.
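
[ed. Spelling out that scale arithmetic (my numbers, computed directly from the figures quoted above):

$$\frac{28{,}350\ \text{mg per ounce}}{2\ \text{mg per dose}} \approx 14{,}000\ \text{doses per ounce}, \qquad \text{so one dose} \approx 7 \times 10^{-5}\ \text{oz},$$

that is, on the order of one ten-thousandth of an ounce, as stated.]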

To add another layer to this already impossible scenario, fentanyl is unlike cocaine and heroin in that it is synthetic, odorless, and tasteless, making shipments even more difficult to detect. And the supply has no real upper limit: production is only tied to the amount of precursor chemicals available, which seem pretty much limitless. Any nation with a pharmaceutical or chemical manufacturing industry can theoretically produce the necessary precursors and ship them to suppliers around the world. If one country cracks down on precursor chemicals, another can fill the void. At this time, India and China manufacture much of America’s generic drug supply.

The global market’s rapid acceleration underscores the folly and futility of relying on the same enforcement tactics on the supply side, and the same medical and health interventions on the demand side. The U.S. policy response has never been this nakedly outmatched and unsuited for the task at hand. Still, authorities boast of massive investments to curb the fentanyl crisis. They champion handshake deals with foreign leaders to staunch the flow of the drug into the country. They publicize record-breaking fentanyl seizures, only to turn around and report record-breaking overdose figures. For example, the state of California’s 2023 “Master Plan” for tackling drugs includes more than $1 billion, from overdose prevention efforts to interdiction and enforcement. The California National Guard seized 62,224 pounds of fentanyl that year, a 1,066 percent increase from 2021. And yet overdose deaths continue to climb across the state, increasing by 121 percent between 2019 and 2021. Conventional enforcement and seizure methods have done little to contain the spread.

The Need for New Direction

In 2022, the disease modelers Jalal and Burke projected that half a million Americans would die of drug overdoses between 2021 and 2025. So far, the data supports this estimate. “Dismayingly predictable,” as they put it. Unless something drastically changes, the curve will keep rising. Drug mortality alarmed officials in 2010 when thirty-eight thousand people died in a single year. Drug deaths were declared a “national health emergency” in 2017, when the annual death toll topped seventy thousand. In 2022, overdose deaths nearly reached 110,000. My fear is that we’ll learn to live with these figures as just another grim and inevitable feature of American life. File drug overdoses away under “intractable problem,” somewhere between gun violence and the climate crisis.

Something obviously needs to change, but American drug policy feels stuck, mired in disproven and outdated modes of thinking. Briefly, it seemed there was real movement toward treating addiction as a public health issue, but the sheer lethality of fentanyl, in part, snapped policy back to the mode of coercive criminalization, derailing newer, progressive reform efforts to roll back racist drug enforcement through decriminalization, with an emphasis on expanding public health, harm reduction, and treatment. The tide of reaction against these nascent efforts has been swift and effective. San Francisco voters passed a measure to drug test welfare recipients. Oregon has ended their decriminalization experiment. With social approaches in retreat, the idea of full-on legalization feels increasingly out of touch with today’s reality.

But is complete legalization even desirable? Every time the left brings up the idea, two substances come to mind: alcohol and tobacco. These two perfectly legal, regulated products are immensely hazardous to individual health and society at large. Tobacco kills nearly five hundred thousand people every year; that’s more than alcohol and every other drug combined. Drinking, meanwhile, kills nearly five hundred Americans a day: more than every illicit substance, including fentanyl, combined. During the pandemic lockdowns, people drank more, and they drank more alone. The trend did not reverse once we returned to “normal.” Contrary to all the buzz around nonalcoholic bars, millennials and Gen X are binge drinking at historic levels. The same set of social, psychological, and economic factors at work in illicit drug use, magnified by the market’s invisible hand, also applies to alcohol: people are more alone and more stressed, with access to a cheap, heavily marketed product that, thanks to on-demand home delivery, is easier than ever to obtain. Advertisers spent nearly $1.7 billion marketing alcohol in 2022 alone.

How, then, is the legalization and regulation of drugs going to help us? Benjamin Fong, in Quick Fixes, summarizes the debacle:
A more rational society would undoubtedly minimize the impacts of black markets by regulating all psychoactive drugs (and, perhaps, controlling their sale through state monopolies or public trust systems), but legalization in this society likely means bringing highly potent substances into the purview of profit extraction.
It is clear we live in the worst of all worlds. Black markets flood the country with mass-produced and highly lethal substances, but legal, “regulated” markets do the same. Both are turning record profits. Consumers are at the wrong end either way. It’s hard not to feel deep pessimism about where things go from here. Cringey, commercialized marijuana; the glut of ketamine infusion clinics; venture capital closing in on psychedelics; Adderall and Xanax prescriptions handed out by telemedicine companies over Zoom. It’s precisely more of what got us here: a bewildering array of addictive products unleashed onto anxious, isolated consumers who are groping in the dark for relief from physical and psychic pain, coping with unlivable lives. Fortunately, it’s almost impossible to fatally overdose on many of these substances, but death shouldn’t be the only way to measure the consequences of the great American drug binge.

The current rhetorical, legal, and medical framework is simply no match for the deep malaise driving the problem. Root causes are downplayed, millions are left untreated, and thousands of preventable deaths go unprevented. We need a stronger, more expansive paradigm for understanding the exponentially increasing number of overdose deaths. A new language of substance use and drug policy that encompasses, and is responsive to, market dynamics and the social dysfunction to which they give rise. A consumer-protection model that does not criminalize the suffering, but also addresses the anxiety and dread that lead to compulsive, chaotic, and risky substance use. There must be something beyond, on the one hand, prohibition by brute force, and on the other, free-for-all drug markets ruled by profit. How can we create a world where people don’t need to use drugs to cope, or, when they do use them, whether for relief, enhancement, or plain old fun, the penalty isn’t addiction, prison, or death?

by Zachary Siegel, The Baffler | Read more:
Image: © Ishar Hawkins
[ed. See also: Pain and Suffering (Baffler):]
***

"The stigma is not hard to understand: magazine features, books, and movies for two decades now have chronicled America’s drug problems, including the rapacious role of drug manufacturers like Purdue Pharma, which made OxyContin a household name and enriched the Sackler family in the process. The publicity of their misdeeds led lawmakers on a campaign against opioid prescribing. Yet the crackdown had an unintended consequence, one little examined today: it has increased the suffering of patients who experience chronic pain, as medications that were once heavily promoted have since been restricted. And it has added to the needless agony of those like Marshall at the end of life. I told the story of Marshall and others like him in my 2016 book, The Good Death. Since that time, the double-sided problem has only seemed to worsen. Even morphine, which has long been used to ease the final days and hours of patients in hospice care, is only available to the fortunate ones, as supply chain problems have combined with fears of overuse, leading to vast inequities as to who dies in terrible pain. (...)

Those dependent on opioids sought out their own prescriptions, while others began to sell their unused pills for extra income. Instead of addressing drug use with treatment—methadone, buprenorphine, abstinence programs—states and the federal government began to respond by limiting the quantity of opioids that doctors could prescribe, hurting legitimate pain patients, who were now unable to get the medication that allowed them to function, and leaving those dependent on or addicted to illicit prescription medication in deep withdrawal.

“Do you really think that’s not going to generate a local street market?” Szalavitz asked. So, in “towns where there was deindustrialization, a lot of despair, long family histories of addiction to things like alcohol,” she said, people were forced to find a new drug source. Heroin and street fentanyl filled the void. Those addicted to or dependent on prescription opioids were now using drugs that were not commercially made, their dosages variable, unpredictable, and often deadly. (...)

When I asked Szalavitz how she made sense of this misleading popular narrative about addiction and overdose, she told me, “You couldn’t say that the people who got addicted to prescription opioids were starting by recreational use because then white people wouldn’t be innocent—and journalists like innocent victims. We had to get it wrong in order to convict the drug companies.” From this vantage point, every story of, say, a high school athlete getting hooked on Oxy after knee surgery is misleading as an average portrait, defying both the data and what experts know about addiction. Most people with addiction begin drug use in their teens or twenties, which means it’s likely that those proverbial student athletes getting hooked on Oxy were already experimenting with drugs. “If you don’t start any addiction during that time in your life, your odds of becoming addicted are really low,” Szalavitz told me. “So, what are we doing? We’re cutting off middle-aged women with no history of addiction, who are not likely to ever develop it, and have severe chronic pain, to prevent eighteen-year-old boys from doing drugs that they’re going to get on the street instead.”

Understanding—and addressing—addiction is what’s missing from current drug policy. Instead, some types of drug dependence are demonized, dependence is conflated with addiction, and the best, most cost-effective treatment for pain available today is stigmatized and kept from those who rely on it to function. As Szalavitz explains it, dependence is needing an increasing dose of a drug to function normally. Many on antidepressants or other stabilizing drugs are not shamed for their dependency. Addiction, Szalavitz says, is using a drug for emotional, not physical, pain; it is “compulsive drug use despite negative consequences, so increasing negative consequences does not help, by definition.”

Truly facing and addressing addiction requires a new vocabulary—and accepting that “say no to drugs” is an inadequate response. It also requires an examination of far-reaching economic and social challenges in our culture: lives of despair, racial prejudice, economic insecurity, isolation, inaccessible health care, expanding police forces and prisons, and, of course, politics. For politicians, “drugs are a great way to get elected,” Szalavitz said. They can campaign on tough drug laws, claiming that their policies will decrease overdose deaths. “It’s really infuriating,” she told me, “because our prejudice against pain and our stereotypes about addiction push us toward solutions to the problem of opioids that simply do not work.”



OHTSU Kazuyuki (大津 一幸, Japanese, b. 1935), Summer at Oze
via:

Teshekpuk Lake

America's Arctic - Teshekpuk Wetlands
[ed. NPR-A, Northwestern Alaska. Beautiful photography by Cornell Lab of Ornithology.]