Tuesday, September 17, 2024

A Time We Never Knew

There is a beautiful and melancholic word I love: anemoia. It means nostalgia for a time or a place one has never known.

This is a sentiment I often sense from my generation, Gen Z—especially in recent years. I see it in the YouTube videos of old concerts that get millions of views. I see it in our fascination with Polaroids, vinyl records, vintage cameras, and VHS tapes. (...)

But perhaps the best example of anemoia is the popularity of ‘90s high school videos, like this one trending on TikTok. Or this one on YouTube, with millions of views, captioned “Phones? No. We had each other.”


This video has nearly 30,000 comments, some from Gen Xers nostalgic for their ‘90s youth—but many from Gen Z, aching for a world they never experienced. Older generations might dismiss this as teens wanting to be different and reject modern culture, as they often do. But the comments reveal something deeper:
“This feels nostalgic despite me not even existing during this time.”

“The whole concept of a real ‘childhood’ is completely out the window at this point in time and that’s extremely sad to me. Btw I’m 15, born in 2003.”

“I’m 20 years old so I wasn’t even conceived at the time of this video but it leaves me feeling empty. My highschool experience was nothing like this. I remember short bursts of people living in the moment but EVERYTHING revolved around our phones, Snapchat/Instagram status. It almost makes me angry because I’ve never had simple straight forward interaction as shown in this video. Even looking at people in the background, they are completely present and not buried in their phones. Everyone seemed a lot more social. I’m jealous of millennials/gen x, yall experienced a golden time to be young and free.”

“As someone who graduated in 2015 this looks like such a nice time. Not a phone in sight. People actually talking face-to-face. I wish I could have grew up in an era like this.”

“The Matrix came out that year. Now we live in it.”

There were hard times, of course—the ‘90s weren’t all bliss; no era is. But the world we inhabit now is so markedly different. New technologies cheapen and undermine every basic human value. Friendship, family, love, self-worth—all have been recast and commodified by the new digital world: by constant connectivity, by apps and algorithms, by increasingly solitary platforms and video games. I watch these ‘90s videos, and I have the overwhelming sense that something has been lost. Something communal, something joyous, something simple.

Of course, we have so many material comforts and conveniences now. I can follow the news all over the world, Google any statistic I want, write on Substack, and WhatsApp someone on the other side of the world. All for free. In a world like this, it’s easy to see why older generations might not understand why Gen Z is struggling to cope.

But God, that loss—that feeling. I am grieving something I never knew. I am grieving that giddy excitement over waiting for and playing a new vinyl for the first time, when now we instantly stream songs on YouTube, use Spotify with no waiting, and skip impatiently through new albums. I am grieving the anticipation of going to the movies, when all I’ve ever known is Netflix on demand and spoilers, and struggling to sit through an entire film. I am grieving simple joys—reading a magazine; playing a board game; hitting a swing-ball for hours—where now even split-screen TikToks, where two videos play at the same time, don’t satisfy our insatiable, miserable need to be entertained. I even have a sense of loss for experiencing tragic news—a moment in world history—without being drenched in endless opinions online. I am homesick for a time when something horrific happened in the world, and instead of immediately opening Twitter, people held each other. A time of more shared feeling, and less frantic analyzing. A time of being both disconnected and supremely connected.
 
But I never knew it. Maybe briefly, as a child. But most of us in Gen Z were given phones and tablets so early that we barely remember life before them. Most of us never knew falling in love without swiping and subscription models. We never knew having a first kiss without having watched PornHub first. We never knew flirting and romance before it became sending DMs or reacting to Snapchat stories with flame emojis. We never knew friendship before it became keeping up a Snapstreak or using each other like props to look popular on Instagram. And the freedom—we never felt the freedom to grow up clumsily; to be young and dumb and make stupid mistakes without fear of it being posted online. Or the freedom to be unavailable, to disconnect for a while without the pressure of Read Receipts and Last Active statuses. We never knew a childhood spent chasing experiences and risks and independence instead of chasing stupid likes on a screen. Never knew life without documenting and marketing and obsessively analyzing it as we went.

Now, the next generation? Gen Alpha? I can only imagine the loss they will feel. They are on track to never know friendship without AI chatbots, or learning without VR classrooms, or life without looking through a Vision Pro. They are being born into a world already anxious and atomised. My guess is that one day they will find family photo albums and hear stories about how their Millennial parents met and be hit with anemoia.

Maybe I’m naive. Maybe all generations look back with nostalgia. But my sense is they don’t do it for a time they never knew. They feel a longing for their youth, their childhood. My parents might flick through black-and-white photos and hear stories from my grandparents and feel intrigued, but not so much grief. I think there is something distinctly different and deserving of our attention about online forums filled with Zoomers wishing that they lived before social media. Wishing it didn’t exist. These are children grieving their youth while they are still children. These are teens mourning childhoods they wasted on the internet, writing laments such as “I know I’m still young (14F), and I have so many years to make up for that, but I can’t help but hate myself for those years I wasted doing nothing all day but go on my stupid phone.”

by Freya India, After Babel | Read more:
Image: YouTube
[ed. How sad. Original article here.]

Debate Team Is Cool Again (If It Ever Was)

It’s still early in the school year. Get your kid to join debate team.

The old-school display of oratorical excellence has suddenly become an outsized measure of leadership ability and prowess in this country. It’s like the cool kids are figuring out what the nerds have known all along: Being good at debate will take you far.

Tuesday’s presidential debate was all anyone could talk about this week. Never mind multimillion-dollar ad campaigns, legions of campaign volunteers and endorsements from famous people. Never mind rally size, the ear injury or the graveside photo shoot. The make-or-break event in the race for the highest elected office in our country has become the live, in-person, unscripted debate.

In a world that has become warped by altered images, out-of-context quotes, fake videos and outright lies, watching a person argue their position in real time offers a glimpse of how they think, raw and real. It doesn’t tell us everything about a candidate’s abilities and fitness for office, but it tells us plenty about their ability to maintain composure, think on their feet and connect with the people they intend to serve.

President Joe Biden may have wrought stability out of chaos, brought decency back to D.C., and put professionals rather than cronies in cabinet positions, but none of his accomplishments mattered when the debate in July revealed his frailties. It effectively ended his decades-long career in politics.

To hear former President Donald Trump tell it, Vice President Kamala Harris wasn’t smart and couldn’t string three words together to make a sentence. Her performance in Tuesday’s debate spoke truth to those lies. The structure, discipline and mental agility one learns when studying debate are high-level skills that Harris clearly had down cold.

It could be that America’s future was decided by those two debates.

So sign up your kid and find them a suit. That buttoned-up nerdiness blended with a precocious love of domestic policy is a powerful combination. Public speaking is one of the scariest things known to humanity. Once a person conquers that, other things are easy.

I have volunteered to judge Hawaii high school speech and debate events for years. I try to stick to speech events rather than judge debate because I am in awe of what the students can do and don’t feel I have the proper background to evaluate them. It’s one thing to watch training videos; it’s quite another to actually have first-person experience arguing a case against a smarty-pants team from a rival school.

The thing that always impresses me about high school debate is that the students love how difficult it is. They stand there, scrubbed and coiffed and ready to spit fire. They construct their rebuttals in minutes while the other side is speaking. When it’s over, they sincerely thank their opponents for the opportunity to go against them. There is no name-calling, no personal attacks, no going over the allotted time to speak. It is as elegant as it is fierce. The presidential debates would do well to emulate the formality, decorum and rigor of high school debates. (...)

The Hawaii Speech League reports that about 300 students statewide participated in speech and debate last year. Federal Judge Jill Otake, state Sen. Stanley Chang and former state Rep. Aaron Johanson were debate champions in their day. Legions of Hawaii doctors, lawyers and political leaders are alumni of the statewide debate program. Nationally, high school debate alums include Supreme Court Justices Samuel Alito, Ketanji Brown Jackson, Stephen Breyer, Sonia Sotomayor and former Attorney General Janet Reno.

Luly Unemori, who went twice to nationals as a member of the Baldwin speech and debate team in the 1980s, said, “Perhaps more relevant today than ever, those high school experiences helped me think more critically about the information out there, to separate credible information from misinformation and sift through thick fogs of diverse opinions.”

‘Iolani English teacher Theresa Falk recalled, “Writing evidence cards and spending hours practicing saying what I want to say helped me manage my anxiety. To this day, I’ll write down speaking points when I need to have a difficult conversation, even if that conversation is with myself.”

There is such an emphasis on teaching STEM (Science, Technology, Engineering and Math) in schools, as though only the “hard skills” will save the world. But being able to speak extemporaneously in an impactful, organized fashion is extremely useful in any career path. It’s too bad so many public schools that used to field powerhouse speech and debate teams no longer offer that opportunity to their students.

by Lee Cataluna, Hawaii Civil Beat |  Read more:
Image: Lee Cataluna
[ed. So true. The power of public speaking and debate is almost a guarantee of success.]

Ghosts, Seen Darkly

1.

In the winter of 2012, against my better judgement and for reasons that were not entirely to do with writing—much as I said they were—and which even now are not clear to me, I visited the site of Ohama Camp, Japan, where my father had once been interned. It was very cold, a bitter day, and an iron sky threw a foreboding cast on the Inland Sea beneath which my father had once worked in a coal mine as a slave laborer.

Nothing remained.


Though I had no wish to be bothered with it, I was taken to a local museum where a very helpful woman found numerous photographs documenting a detailed history of the coal mine from the early twentieth century—its growth, its processes, its Japanese workers.

There was no photograph of slave laborers.

The woman was kind and, as they say, a fount of knowledge about local history. She had never heard of slave laborers working at the Ohama coal mine. It was as if it had never happened, as if no one had ever been beaten or killed or made to stand naked in the snow until they died. I remember the woman’s tolerant smile: a smile of pity for me thinking there had ever been slave laborers at the Ohama coal mine.

2.

Sometimes I wonder why we keep returning to beginnings—why we seek the single thread we might pull to unravel the tapestry we call our life in the hope that behind it we will find the truth of why.

But there is no truth. There is only why. And when we look closer we see that behind that why is just another tapestry.

And behind it another, and another, until we arrive at oblivion.

3.

At 8:15 a.m. on 6 August 1945, bombardier Major Thomas Ferebee released a lever 31,000 feet over Hiroshima, said “Bomb away!,” and forty-three seconds later 60,000 people died while eighty miles to the south my father, a near-naked slave laborer in his fourth year of captivity as a prisoner of war, continued with his grueling work pushing carriages of rock up long dark tunnels that ran under the Inland Sea.

Broken, ill, body and will near the end, knowing only that when in a few months the winter cold returned he could no longer endure and would die, he was unaware that he was now going to live. As my father made his way along the bleak mine tunnel, only very occasionally punctuated with dim electric light bulbs, a fellow Tasmanian POW remarked that it looked like his hometown of Penguin on a Friday night.

4.

At the mine-head entrance, where my father and his fellow slave laborers once ran the gauntlet of guards who beat them as they passed, there now stood a love hotel. There was no memorial, no sign, no evidence, in other words, that whatever had once happened had ever happened. There was some neon signage. There was a business that catered for quick opportunistic sex in tiny rooms that allowed for sexual release and deliberately little else.

What remained, or rather what existed, was only the oblivion of pleasure in another’s arms—the same oblivion that simultaneously prefigures and denies death. As if the need to forget is as strong as the need to remember. Perhaps stronger.

And after oblivion? We return to the stories we call our memories, perplexed, strangers to the ongoing invention that is our life.

5.

Next to me that bitter day there stood an elderly Japanese man, Mr. Sato. He was tiny and frail, neatly dressed in a sports blazer and dress pants too long in the cuff from where, I assumed, he had shrunk with the passing of years. His hands were covered in thin white cotton gloves, and when he pointed out some long-vanished feature of the camp and mine below, all I saw was a loose thread dangling from the glove’s cuff. I don’t remember his shoes. Mr. Sato’s head came to my chest.

He lived with and cared for a daughter who, I was told by the translator, was very disabled. Mr. Sato had once been a guard at Ohama Camp. He showed me where the barracks had been, the farm up the hill, the mine head downhill, closer to the sea.

In front of us, to my embarrassment and unspoken anger, were a TV crew and several photographers from local newspapers. I had gone through a series of contacts to be at the mine head, and somehow the local council had become involved. Without my knowledge they had asked the media along. The TV crew and newspaper journalists wanted one thing: Mr. Sato and I embracing, an image of forgiveness, of understanding, of time healing.

That would be, I knew, a lie. It wasn’t for me to forgive.

Does time heal? Time does not always heal. Time scars. Mr. Sato’s gloved hand was raised, pointing, the cold world below bisected by an unravelling thread.

6.

Earlier that day I had met local, elderly villagers who had been children during the war. I had not wanted to meet them. I felt—how can I put this?—ashamed. My shame was perhaps that my return might be misunderstood as vengeance or anger.

But I didn’t know what my return was. They had been children and I had not then existed. I felt, in short, unequal to them and their lives. Maybe I was ashamed, somehow, of being my father’s son presuming that his and their history might also be mine. I worried I might be seen as an unwelcome ghost, a specter looking over the scene of an unsolved crime in which I was implicated. But the ghost of whom?—the murdered, the murderer or the witness, or all three?

Because it was an arrangement made by others, I didn’t know how to cancel it without causing offense. The elderly villagers were friendly, warm people.

When telling their tales of the wartime privations they had endured as the children of the rural poor, they recalled the dissonance between what the adult world said and what as children they saw, and a childish irreverence took hold of those old voices and weathered faces. They remembered when the POWs had disembarked in late 1944, how the devils they had been taught to fear for so long were no more than pitiful, near-naked skeletons.

As well as the cruelty of the Japanese guards, my father had spoken of the kindness of the Japanese miners, some of whom may have been these elderly villagers’ fathers, who would share their meagre food with the starving POWs.

7.

When I left school I worked as a chainman, the name given to a surveyor’s laborer, a job centuries old set to vanish only a few years later with the advent of digital technology. It was the chainman’s job to drag the twenty-two-yard chain with its hundred links with which the world was measured.

By my time, the measuring chain was a thin steel band fifty meters long, but little else had changed from a century before. The chainman still carried a slash hook and axe to clear the survey lines of scrub and trees, and a whetstone and file to keep both sharp.

I learnt to look out for evidence of old surveys from many decades before—collapsing stone cairns, rotting pegs, or the vulva form of bark on old eucalyptus trees. With the axe I would carefully scarf away the bark until what was revealed was a deep prism-shaped cavity skillfully hewed into the tree trunk long ago, sometimes over a century before. The apex of the inverted prism was the survey point.

I would stare at the marvel of that unaltered wound, the exact same as the day it was hewed by another axe. Time hadn’t healed the tree, only scarred it, hiding something that was still happening. For beneath the scar the wound remained, a portal to the past bleeding fresh sap in the present, into which, if I stared for too long, I would feel myself falling.

Standing there that day in front of a Japanese TV crew with its young woman reporter, the handful of local photographers and bored journalists determined to get the only story that made sense and get out, was a similar moment of accelerating velocity. I had no desire to embrace Mr. Sato. Perhaps he felt the same. But not wishing to embarrass him or anyone else I put my arm around him and he around me. Everyone seemed happy with this.

When the photos and filming were done I let my arm drop. Mr. Sato remained curling into me. When I went to move away, his head seemed to slump into my chest. And so there we stood.

Perhaps he too was falling through some infinite void, or perhaps it was just the cold. I had no idea. I can’t say I liked him holding me that way, this man who may once have beaten my father, but nor did I know how to be rid of him, to withdraw the warm comfort of my body.

I thought of Mr. Sato suffering with his disabled child and how at the end of his life he was being rightly punished. Then I felt ashamed thinking such a thing, aware that beneath this thought lay another I didn’t wish to acknowledge.

I looked away to the trees, bare, sad scrags; the earth unloved and cold around them, the rank grasses and weeds wretched-looking and lost. All of nature there seemed exhausted and disordered and me somehow a part of it.

Below, the sea could be heard rolling in and out along its stony coast, as it always had and always would, concertinaing through the crimes and sins and loves and passions of those who passed through this sad country. My whole trip felt as incomprehensible as that sound of rocks rolling into oblivion.

Would I have done the same as Mr. Sato? And when I tried to push that question away another arose. If Mr. Sato, who seemed a decent man, was capable of being a guard, doing evil or just standing by when evil was done, would I be any different? Would I too join in beating the prisoners, even though I didn’t wish to, would I too order a naked man to freeze to death kneeling in the falling snow, because it was what was expected, because it was too hard to say no? Or would I look away and choose not to help him?

Then I lost the thread of my thoughts. The day was going. I thought of the prisoner made to kneel in the snow overnight without clothes, a story that had always left my father indescribably sad in its pointlessness. Would I do the same? I suddenly hated Mr. Sato’s embrace and I hated Mr. Sato with all my being as he continued to lean on me.

A cold wind blew and petered away. Afterwards it only felt that much colder, the sea continued rolling in and out, we no longer noticed each other, it occurred to me that neither of us had any idea what the other was thinking or feeling, how standing there we could have been mistaken for brothers or lovers.

We stood for a very long time as the journalists and TV crew departed, as dark clouds hastily gathered above us, seeming to shudder in unknown judgement and quickly disperse in their disappointment, as the Inland Sea reflected back to me only its own mystery. What had happened? To this day I have no idea. There is no why.

by Richard Flanagan, Paris Review |  Read more:
Image: uncredited
[ed. A favorite author. Definitely plan to read his new book Question 7.]

Exceptional Minds Tend To Lead Exceptional Lives

Once upon a time, a man sat on a park bench to eat his sandwich. The man was thinking about divorcing his wife—that's the subtext here (except now I suppose it's text). He'd installed a dating app on his phone and set it to stealth mode, so his wife and her friends couldn't see, and for the last few days he’d been favoriting women, trying to see if it was true that his sexual market value had increased substantially since the last time he'd been single.

The man hunched over, angling his body so nobody could see the phone. He swiped and swiped. The last time he'd done this, twelve years ago, it'd been a negative experience—very few matches, very little interest. The man was from a minority race that the majority race in his city considered quite unattractive. Trying to date had beaten him down, and he'd just been happy someone had wanted him.

The man had his reasons for divorcing, but fundamentally, he was bored. He just wanted something different. He kept being told about all these men who used up their wives' youth and then left them. The men lived these swinging, glorious, glamorous second lives, just seeing their kids every other weekend, dating younger women, going on adventures, et cetera. That seemed good to the man. Terrible for women, obviously, but…so what? That’s why if you did it, you just had to dress it up with a lot of rhetoric about how you really weren’t happy in your specific marriage, etc. There was a whole routine for this kind of thing, where you pretended you weren't engaging in the bad practice, but actually you were reifying it.

While sitting on the park bench, he had a number of reflections on the nature of masculinity and of his place in the world. He could feel himself slowly inching towards the kind of worldview (monogamy is outdated, marriage is a trap for men, men are emasculated in contemporary society, etc) that would allow him to abandon his family.

You know the really annoying thing was that his wife was a very incisive woman whom he quite enjoyed talking to, when they weren't fighting about housework and shit like that. She'd probably be very interested in his thoughts about what it was like to constantly hear (perhaps accurately!) that you were a much more valuable commodity than your own wife. She was one of the most interesting and thoughtful people he knew—that’s exactly why he’d married her! But obviously now they couldn’t talk openly, because if they did she might wonder, “Is he about to leave me?” and that subtext would lead them to fight.

Shit. Some red sauce had dripped from his sandwich onto his slacks. See, his wife would've told him not to order a meatball sub. He would've ordered it, and she'd have said, "Are you really gonna do that?" And he'd have said, "Let me order what I want! Why’re you always trying to control my order!" And then he'd have been like, "Hmm, I do have that meeting later," and he'd probably have changed his order, and then later been like...wow you were definitely right about the meatball sub, I do not know what I was thinking.

But she wasn't here, and now he had marinara on his slacks.

It wasn't the end of the world though was it? His wife never seemed to understand that he could live with marinara on his slacks—this stain really would not ruin his day in the slightest. He pretended to her, because he loved her, that he was glad she’d saved him from the chance of having a stain on his pants, but….really he didn’t care. She probably understood that though! That’s exactly why she appreciated the gesture of him saying, “Oh you were right about that order.”

Anyway, he just wanted to rebel, he supposed—do something he wasn't supposed to do. As a teenager and twentysomething he'd felt so utterly worthless that rebellion almost didn't seem to matter. Like, why rebel, when basically the world wanted him to not exist anymore? The real rebellion had been actually getting married, getting a job, doing the things he was supposed to do, but which people like him (lonely, overweight, nerdy, etc) often didn't actually manage. (...)

Now his meeting was starting, however, so he carried his thoughts into the meeting, and into the rest of his life, where they formed the substrate of...well, everything he did. He constantly had thoughts, all the time, about all kinds of things. They weren't necessarily that special, but they were his. Some he told to other people. Some he didn't. Many would've horrified his wife or his friends, which seemed honestly a bit weird to him (why get offended by a thought?), but maybe when you articulate a thought, it's no longer just a thought—it becomes a position, something you think it's worthwhile to tell other people. In the act of speaking it, the thought becomes an action, in other words. Which seems kind of funny—because then actual thoughts can never really be communicated!

But that wasn't true, of course. To communicate thoughts, you'd simply need an audience who understood that thoughts weren't willful—that they arose spontaneously. Which most people did seem to understand! Or claim to! But there were still thoughts you shouldn't say. His thoughts weren't really that bad, in his opinion. But it's not like he wanted to go around teaching people to just say all their thoughts! Because then there'd be a kind of monstrous unleashing of the id. Probably no good for society. Whatever thoughts needed to be expressed likely would be, despite the fear of being poorly received. Because there's a thrill in being honest! That thrill would likely motivate some adventurers or whatever.

The man pondered the idea of being honest with his wife and telling her he was bored and wanted adventure. Seemed like a terrible way to pay her back for loving him! And did he want her to be more honest with him? No way, absolutely not, Jesus Christ, no.

He benefited from repression, even though he hated it, of course.

But the thoughts just continued, in a stream, in a torrent, totally unanswerable. They were the basic building blocks of his life—the stuff he had to work with. It was up to him to decide which, if any, were worth devoting more time to.

by Naomi Kanakia, Woman of Letters | Read more:
Image: Sad Keanu meme, via Twitter

Sunday, September 15, 2024

Under the Influence

Question:

After graduating from college at the beginning of the pandemic, I tried different jobs but never found one I loved. I’ve worked for an advertising agency, been a project coordinator for a nonprofit with a mission I cared about, sold cars for my uncle’s dealership, and freelanced as a photographer. None of these jobs kept me interested.

I’m in between jobs right now, and a couple of my friends suggested I cash in on what’s happening with artificial intelligence. Although I’ve seen AI positions advertised, nothing about them appeals to me. On Friday, I came across an Udemy course on how to become an influencer. Working freelance as an influencer seems ideal. I’m addicted to social media, and that’s a perfect background for an influencer. I’d never considered influencing as a career choice or how people got into the field, but that’s what this Udemy course promises.

The only problem is I’m told that it’s not possible to make money as a newbie influencer, so I guess I’m looking for career advice.

Answer:

If you want a career, find something you love and go for it. In 1978, I set an old door on two concrete blocks in my living room and opened a consulting company. When I sold my business 39 years later, I had a staff of seven and 4,400 clients spread across the country. More importantly, I’d worked in a career I loved.

In recent years, the influencer phenomenon has exploded. People turn to YouTube, X, Nextdoor, Reddit and other platforms for advice and recommendations. Recent research suggests that 75% of people use social media for advice, and 69% of consumers trust influencer recommendations. Fifty million people earn money from regularly posting videos and photos.

That said, newbie influencers have to work hard if they want a living income. Forty-eight percent of influencers earn less than $15,000 a year.

If you decide to become an influencer, it may take you months to develop and gain visibility for your brand before you make money from sponsorship arrangements, brand partnerships, ad revenue and affiliate links. You’ll need to constantly produce engaging posts. You’ll spend your days filming, scripting and editing Instagram reels and TikTok and YouTube videos. You’ll need to hone your skills at performing in front of a camera. You’ll work hard to define your value by providing specialized knowledge and “edu-tainment.”

You’ll need to search out and negotiate sponsorships. Although you won’t report to one supervisor, you’ll have many bosses, as every advertiser and public relations agency that invests in you will expect you to produce deliverables on deadline. When working for yourself, you won’t have a regular salary, health care benefits, paid time off or other benefits. You won’t be able to afford “off days,” because once you lose followers, they don’t return. You’ll discover responding to direct messages and comments to be a full-time job. Even when you’re exhausted at the end of a long day, you’ll have to send out invoices.

That said, you have much going for you if you decide to become an influencer. You love social media and have photography, advertising and sales skills. Your project coordination skills may come in handy as you’ll need to build your brand by planning and executing content across multiple social channels.

You’ll need to build a professional website that advertisers and followers can visit to learn about you. You’ll also want to get started on one, two or three social media channels, each of which requires different skills. YouTube influencers promote products with video tutorials. TikTok influencers cater to Gen Z. Instagram influencers leverage imagery. You’ll want to set up your posts so your followers can like, comment on and reshare them. You’ll want to create an influencer profile on Influence.co and Intellifluence.com for visibility with influencers.

Can you make money doing this? A recent LinkedIn survey of 5,920 influencers reported these influencers averaged $323.19 monthly. The more successful influencers in the group, who had at least a million followers, earned an average of $6,109.83 per month.

by Lynne Curry, Anchorage Daily News | Read more:
Image:(iStock/Getty Images Plus)

Friday, September 13, 2024

How the Clintons Revolutionized U.S. Politics... Twice

From the mid-1930s through the mid-1960s, the Democratic Party was defined by a ‘New Deal Coalition’ that united white rural and blue-collar workers, religious minorities (Jews, Catholics) and, increasingly, African Americans. However, following Republican Barry Goldwater’s 1964 capture of the South, and Richard Nixon’s 1968 victory over Democrat Hubert Humphrey, Democratic Party insiders decided to aggressively rebrand the party — to form a new coalition that centered women, college students, young professionals, and racial and ethnic minorities. They more aggressively embraced cultural liberalism and adopted a more dovish posture on foreign policy (to appeal to former anti-war activists, despite the fact that the Vietnam War was started and perpetuated by Democrats JFK and LBJ). They de-emphasized ties to organized labor. Indeed, white rural and blue-collar workers increasingly came to be viewed as a liability rather than an asset — depicted by many party insiders as ignorant, bigoted, misogynistic and reactionary — an impediment to the party’s more ‘enlightened’ future.
 
Among these policymakers, the biggest political prize of them all was to win symbolic capitalists – elites who work in fields like law, consulting, media, entertainment, finance, education, administration, science and technology. Professionals who traffic in data, ideas, rhetoric and images instead of physical goods or services. As Clinton’s Secretary of Labor Robert Reich argued in his 1991 bestselling book, The Work of Nations, the future belonged to these professionals. However, securing this voting bloc would ultimately require Democrats to “kill their populist soul,” as political analyst Matt Stoller aptly put it. And as they tried to transition to a new voting base, the party faced a long period of crushing political defeats.

Despite Democratic attempts to woo symbolic capitalists on cultural issues and foreign policy, most continued to support Republicans because of pocket-book priorities. Meanwhile, the GOP managed to successfully capture those disaffected rural and blue-collar voters Democrats sought to leave behind by emphasizing cultural conservativism. As a consequence, the Democratic Party spent decades in the political wilderness. In the quarter-century between 1968 and 1992, Democrats only managed to hold the White House four years — narrowly squeaking out a 1976 win in the immediate aftermath of Watergate. Republicans, meanwhile, won landslide victories in 1972, 1980, 1984 and 1988. And then Bill Clinton changed the game.

Bill Clinton was an embodiment of how symbolic capitalists liked to view themselves. He was relatively young (especially as compared to his Republican rivals in 1992 and 1996). He was smart and charismatic. He was a person from a humble background who managed to ascend into the upper echelons of power as a result of his elite education and savvy. Clinton consistently emphasized the importance of education as a means of competing in the globalized symbolic economy. He surrounded himself with demographically diverse experts from elite institutions. He painted himself as a post-ideological technocrat — as someone who followed ‘the facts’ without regard to what party insiders or his base wanted. Indeed, he regularly went out of his way to alienate remaining vestiges of the traditional Democratic base, or to align with his political rivals, in order to demonstrate his independence.

In his 1996 State of the Union address, Clinton formally announced the death of the Democrats’ earlier New Deal coalition, declaring, “The era of big government is over.” And over the course of his administration, the Democratic Party radically shifted to reflect not just the values, but also the economic priorities, of symbolic capitalists.

Four planks were central to Clinton’s vision of reorienting America around the knowledge economy: social investment in skills, infrastructure and research, enhancing market dynamism (through tax cuts, deregulation, privatization), international openness (through trade deals and immigration reform), and macroeconomic stability (including by using U.S. forces to uphold the global international order) – a platform now referred to as “neoliberalism.” Although versions of these ideas date back to the 1940s, and were first piloted under Democrat Jimmy Carter (accelerated under Reagan), Clinton brought the vision full circle by aggressively reorienting the Democratic Party around this vision – giving rise to what is now derisively referred to as the “neoliberal consensus” in Washington, and generating many of the faultlines that continue to define U.S. politics to the present.

For instance, the urban-rural divide first took off in the early 90s, corresponding to the Democratic Party’s reorientation around the knowledge economy (and contemporaneous moves by many left parties in Europe).

With respect to the “urban” side of that divide, under Clinton’s tenure, the Democratic Party dedicated itself to bringing cities ‘under control’ through tough-on-crime policies — despite significant concerns from the NAACP and the Congressional Black Caucus about the disproportionate and adverse effects these policies would likely have (and indeed, did have) on African Americans and other minorities. Simultaneously, his party committed itself to globalization and free trade, culminating in a series of international agreements that radically expanded China’s economic and geopolitical clout, despite the Clinton Administration’s own forecast that these moves would come at the expense of key U.S. industries and manufacturing workers.

Fulfilling Clinton’s campaign commitment to ‘end welfare as we know it,’ Democrats restructured aid programs, forcing millions of Americans, mostly women, into dead-end and unstable jobs with low pay or benefits in order to continue qualifying for government assistance. Pushing low-income mothers out of the home and into the workforce led to significant increases in child mistreatment incidents and children being dumped into ‘the system.’ However, it also helped expand the pool of workers in the service economy and kept their wages low as a result of the increased labor supply. Simultaneously, the levels, quality and accessibility of government benefits were significantly reduced, as Clinton pushed to ‘downsize’ the federal government (and privatize its functions) in order to balance the budget. As a result of these reforms many low-income Americans ended up with smaller household incomes despite working more, and the share of Americans in deep poverty increased substantially. But in the new and enlightened Democratic Party, it was much better to balance the budget by squeezing the poor than taxing the relatively affluent.

Rather than worrying about the prospects of the working class, the party aligned itself firmly with the tech and finance sectors. The Clinton Administration cut many regulations on these industries, and reduced enforcement of those rules that remained. These moves contributed significantly to the dot-com bubble that burst in 2000, and the housing and financial crisis that came to a head in 2008 (the latter of which had a particularly pernicious and enduring impact on the wealth of black families). Indeed, virtually all of the policies described above advanced the interests and priorities of those affiliated with the symbolic economy at the expense of most others, especially those who were already desperate or vulnerable. The effects of these reforms fell especially hard on women and ethnic / racial minorities. Clinton and his party made these moves nonetheless, confident that they would be able to retain female and minority voters because the Republicans were perceived to be even worse. And, for a while anyway, the bet paid off:

Under Clinton’s tenure, Democrats continued to enjoy roughly the same margins with lower-income and minority voters, but they were able to make significant gains with symbolic economy professionals as well. Looking at the white vote, for instance, we can see that starting in the 1992 election, degree holders shifted hard towards the Democratic Party, and that alignment has only grown over time. Whites without a college degree started moving away from the Democratic Party by the time Clinton ran for reelection, and moved aggressively away from the party when Al Gore tried to succeed him. (...)

As symbolic capitalists have shifted towards the Democrats, they have also become more “culturally” liberal. According to Pew Research estimates, only about 7 percent of postgraduates held down-the-line liberal views in 1994 (at the beginning of the Clinton realignment). By 2015, that number had more than quadrupled to 31 percent. The share of BA holders with uniformly liberal views increased nearly fivefold, rising from 5 percent in 1994 to 24 percent in 2015— and is significantly higher today.

Indeed, although the Democratic Party platform shifted hard “left” during Obama’s reelection campaign, at the outset of what is today known as the “Great Awokening,” Obama himself was largely focused on painting Mitt Romney as an out-of-touch vulture capitalist who cared too much about corporate profits, and not enough about the struggles of ordinary Americans. It was enough to get him a “win” – albeit by a much smaller margin than in 2008. It was Hillary Clinton who mainstreamed “wokeness” in the Democratic establishment during her 2016 presidential run.

Hillary Clinton had the bad sense to run as the consummate establishment candidate and as a wonky technocrat in a race when growing numbers of Americans across the political spectrum were looking to burn things down. But then she ran one of the most substance-free campaigns of any candidate in either party in contemporary history. Rather than focusing on the substance of Sanders and Trump’s populist platforms, she tried to change the conversation away from criticisms of neoliberal economics via cultural issues.

by Musa Al-Gharbi, Symbolic Capital(ism) |  Read more:
Image: uncredited
[ed. Only fair after the Reagan essay below to profile the worst Democratic president we've had in my lifetime (in my humble opinion).]

Thursday, September 12, 2024

Spinning the Night Self

The creative benefits of insomnia

I wake up, faintly groggy with sleep, and try to guess the time. Midnight is surprisingly noisy, with a steady stream of traffic bringing people home from the West End in London, while 3am carries a curiously muffled sound, and 4:10am is when the first aeroplane skims my house with its familiar whine of descent. As my ears strain into the darkness, I sense the soft silence of 3am. Once I would have groaned, cursed and plugged the (largely ineffectual) sound of gently lapping waves into my ears. But, tonight, I listen to the emptiness for a few pleasurable moments, then I reach for my notebook and a candle.

I’ve had insomnia for 25 years. Three years ago, after a series of bereavements, I stopped battling my sleeplessness. Instead, I decided to investigate my night brain, to explore the curious effects of darkness on my mind. I’d long felt slightly altered at night, but now I wondered whether darkness and sleeplessness might have gifts to give: instead of berating myself, perhaps I could make use of my subtly changed brain.

I’m not the first person to notice a shift in thoughts and emotions after dark. ‘Why does one feel so different at night?’ asks Katherine Mansfield in her short story ‘At the Bay’ (1921). Mansfield herself became more and more fearful after dark, often barricading herself into her apartment by pushing all the furniture against the front door. And yet, later in life, insomniac nights became one of her most creative times, as she confided to her journal:
It often happens to me now that when I lie down to sleep at night, instead of getting drowsy, I feel more wakeful and I … begin to live over either scenes from real life or imaginary scenes … they are marvellously vivid.
Mansfield referred to her nocturnal imagination as the ‘consolation prize’ for her insomnia.

Around the same time, Virginia Woolf was pondering her own feelings of ‘irresponsibility’ that struck when the lights went down. She too recognised that night rendered us ‘no longer quite ourselves’. After completing each of her books, Woolf was plagued by insomnia – which she made use of to plot out her next novel. ‘I make it up in bed at night,’ she explained of her most inventive novel, Orlando (1928). Night was also a time of epiphany: after protracted struggles with her novel The Years (1937), Woolf’s dramatic breakthrough came ‘owing to the sudden rush of two wakeful nights’ when she was finally able to ‘see the end’. A few years later, the writer Dorothy Richardson noted that, around midnight, ‘she grew steady and cool … it was herself, the nearest most intimate self she had known.’ In her fictionalised autobiography, Pilgrimage (1915-38), Richardson’s alter-ego Miriam finds her most authentic, radical and original self in the solitude of her wakeful nights. For Richardson, reading and writing when she should have been sleeping were acts of resistance, acts that revealed herself to herself, undistracted by the detritus of daylight.

My night-awakenings began during my first pregnancy. Ten years later – with four children and several years of working across time zones under my belt – a full night of sleep in a single stretch had become a rarity. Most nights, I woke between 2am and 4am, tossed and turned for an hour, then read until I drifted back for a (short) sleep before the alarm went off. I invested in sleep aids: melatonin, weighted blankets, eye masks, sleep-inducing supplements, oils, mattresses, pillows, sheets, pills, apps, bed socks. I experimented with various sleep hygiene routines proposed by ‘experts’. To no avail.

The latest statistics suggest that one in six of us cannot get to sleep, or stay asleep, a figure that is higher for women. At the last count, 8 per cent were taking sleep medication and 11 per cent were regularly splashing out on sleep aids. In 2019, the global market for sleep products was valued at $74.3 billion. Experts predict it will be worth $125 billion by 2031. Frightening, and sometimes misleading, stories appear regularly in the media linking poor sleep to obesity, heart disease, dementia and premature death. (...)

Published and unpublished letters and journals show that, for centuries, many women embraced nocturne, finding within it a time of solitude and creativity. The literary critic Greg Johnson in 1990 noted that female writers seemed to have a peculiar talent for making ‘creative profit’ from their insomniac nights. He is right, and not just about writers. Over eight months of wakeful nights, the artist Louise Bourgeois produced her Insomnia Drawings (1994-95), a series of 220 sketches. The Insomnia Drawings were immediately snapped up by the Daros Collection in Switzerland, making instant ‘creative profit’ for Bourgeois, who also credited their production with easing 50 frustrating years of nocturnal tossing and turning. Lee Krasner’s ‘night journey’ paintings, made between 1959 and 1962 in the wake of two bereavements, are now among her most valuable and coveted. Meanwhile, Sylvia Plath wrote Ariel (1965), her most brilliant and acclaimed poetry collection, ‘in the blue dawns, all to myself, secret and quiet.’ Ruth Bader Ginsburg and Margaret Thatcher used the sleeping hours to increase the volume of their (arguably bold) output. Enheduanna watched the stars and produced the poetry that made her literature’s earliest known author. And Vera Rubin discovered dark matter, later saying of these wide-awake and alone nights at the telescope: ‘There was just nothing as interesting in my life as watching the stars every night.’

I call these women my Night Spinners.

Several years ago, when I lost loved ones, my flimsy sleep disintegrated and I lost all appetite for battle. Inspired by aeons of Night Spinners, I put away my sleep aids and let my grieving brain lean into the dark nights. When I woke (which could be any time between midnight and 4am), I got up and wrote, drew, watched the stars. I slept outside (night after night), went for long walks, swam in lunar-light, and taught myself the constellations and the phases of the Moon. I tracked and surveyed glow worms and moths. I watched badgers, and followed the call of owls and nightingales. I discovered a mesmerising nocturnal world.

My nocturnal mind was different. Why did I feel both more fearful and more tranquil? Why was I more inclined to fret and fume? To behave with greater recklessness? Why did images, ideas, memories so often collide in a curious collage of colour and novelty? Writing problems I encountered during the day found solutions as I ambled round the darkened house, peering at the night sky from every passing window. In the middle of sleepless nights, my mind felt less logical, less methodical. My grip on assessing and prioritising less assured. But in return, my inner critic fell silent. Ideas and thoughts meandered, melded and merged. I refused to pass judgment, but in the morning, when I looked afresh at whatever I’d written in the night, I often liked it. (...)

The prefrontal cortex (sometimes called our command and control centre, and thought to be the most highly evolved brain region) is very sensitive to sleep and sleep deprivation. Researchers speculate that it takes a restorative break at night – leaving us fractionally less rational, less organised and a little more at the whim of our emotions.

A resting prefrontal cortex might also explain why studies indicate that we are more likely to feel enraged and fearful at night. Or why reformed gamblers, drinkers and smokers are more likely to succumb to old temptations. Or why the celebrated writer Jean Rhys – who frequently wrote at, and about, night – was described by her biographer as ‘a lap-dog’ by day and ‘a wolf’ by night. Rhys liked to rise at a ‘wolfish’ 3am and ‘smoke one cigarette after another’, describing this dark hour as ‘the best part of the day’, when her thoughts were subtly altered. At night, it seems, the filter between us and the outside world is fractionally thinner and frailer. It’s not that our emotions change, but that our ability to control them changes. We experience the world more viscerally: the highs are higher and the lows are lower.

by Annabel Abbs, Aeon |  Read more:
Image: Bright Light at Russell’s Corners (1946) by George Ault

Chappell Roan’s makeup artist breaks down her VMAs look (CNN)
Image: Kevin Mazur/Getty Images
[ed. See also: The best red carpet looks from MTV’s Video Music Awards 2024 (CNN). Yikes, who are these people?]

What Does a Busy President Want to Eat? This White House Chef Has the Answer

You know that old line, "Tell me what you eat and I'll tell you who you are"? If that's true, then Cristeta Comerford knows the last five presidents of the United States better than almost anyone.

Comerford just retired after nearly 30 years as White House chef. She cooked for presidents from Clinton to Biden, making everything from family snacks to state dinners.

Just days before she left D.C. and moved to Florida, she came to the NPR studios to look back on her career, and said she didn't think about the barriers that she broke when she became the first woman and the first person of color to hold the top job in the White House kitchen.

“I didn't even realize that, because I was just doing what I wanted to do. I love to cook. It just so happens that I'm a minority woman,” she said. “But when I broke the glass ceiling, I didn't realize that it was, like, news all over!”

It was in 2005, during the George W. Bush administration, that she took the executive chef position. (...)

Interview highlights

Ari Shapiro: You were born in the Philippines. You grew up one of 11 children in Manila and you came to the U.S. at the age of 23. Did any of the presidents you worked for ask you to cook the food of your childhood, the food you grew up with?

Cristeta Comerford: President Obama, he lived in Hawaii for a while, so there's a lot of Filipino communities there, so he's very familiar with the Filipino food. So every now and then I’m, like, on the grill, and he's like, “Hey, is that smelling good right there.”

Shapiro: Give us an example.

Comerford: The skewered pork, you know, that's like a street food, but that's something that I love very much. And then whenever I did that — I do beef as well, and chicken — he loves it.

Shapiro: That must have been so nice to share the food of your roots, of your childhood, in your job at the White House with the president.

Comerford: Exactly, yes.

Shapiro: I think the last time the White House hosted a state dinner for the Philippines, if I'm not mistaken, was 2003 during the George W. Bush administration. What was that day like for you?

Comerford: It was amazing. Because actually, chef Walter Scheib — the executive chef then — asked me to write the menu. I actually did the press preview for [Philippine President Gloria] Macapagal-Arroyo at the time. So I was so excited. They chose lamb. I clearly remember, because it was, like, kind of unusual, like, “Lamb? For Filipinos?” But I'm like, “OK, if that's what the guests want, we're gonna do lamb.”

Shapiro: What did cooking for presidents show you about those leaders that even their chiefs of staff or their closest advisors might not have understood?

Comerford: I think at the end of the day, those presidents, they have the weight of the world on their shoulders. So the only thing that they want when they come home after working the Oval Office, dealing with whatever world or domestic events, is just to come home to a nice, home cooked meal.

So on a daily basis, we just really take care of them: “Hey, what do you like to eat?” And a part of being a chef is just reading the room, but reading a big room, because you have to watch the news. You have to keep up with what's happening, because you almost kind of know what mood is your principal going to be in.

Shapiro: Oh interesting. You're watching the news to see if it was a stressful day for the person you’re cooking for. So it’s like, “Oh, he's gonna need grilled cheese and tomato soup” at the end of this day?

Comerford: Yeah exactly. And people don't teach us that. We just kind of know. I learned it from, actually, one of our butlers, because he was the one who explained to me, “Cris, he's gonna be feeling tired today and just worn out. So give him what you got.”

Shapiro: If I were to ask all five presidents what dish Cris is best known for, do you think more than one of them would give me the same answer?

Comerford: I think two of them would give you the same answer. Because President Clinton's favorite is enchiladas. And of course, so is President Bush's. So they'll give the same answer. I make a mean enchilada — homemade tortillas. It has to be homemade.

Shapiro: Did a president ever say to you, “Cris, you're an extraordinary cook. But you know what? I don't want the handmade tortilla. I want the American cheese wrapped in plastic that I grew up eating”?

Comerford: Actually, it was President Obama. I was making this fancy cheeseburger for him. I made my own brioche dough, and he looked at it and he said, like, “I'm OK with just the grocery bun that you get.”

Shapiro: One of your former colleagues, the pastry chef Bill Yosses, told me that your philosophy of American cuisine is that it's like jazz. What does that mean?

Comerford: It was a New York Times reporter who asked me the question of like, “Do you think French food is the best?” And we were in France. But what I said was true. I'm like, “Hey, look, all of the chefs, we're all classically trained. Like, you know, a pianist is classically trained in music. But in America, we play jazz.”

Shapiro: And what does that mean in terms of food?

Comerford: In terms of food, it's like, every community, every minority group — we're a land of immigrants, so we share everything that we have. So by the time a food is made, it's a totally different one than it was intended to be. It's because it's a beautiful melting pot.

Shapiro: It's less about authenticity and more about improvisation, is that it?

Comerford: Exactly, yes.

by Ari Shapiro, Elena Burnett, and Katia Riddle, NPR |  Read more:
Image: Susan Walsh/AP

Wednesday, September 11, 2024

Refik Anadol, Works
via:
[ed. I wish I could show a video of this artist's amazing work but he seems as accomplished at preventing that as he is at creating it. Just visit this site. See also: whether AI art should really be considered art: Imagination Mode (Perspective Agents)]

What if Ronald Reagan’s Presidency Never Really Ended?

For many people, the 2016 election was a catastrophe. For Max Boot, it was a betrayal. He’d been a movement conservative: a loud voice for the Iraq War, an editor of The Weekly Standard, and an adviser to the campaigns of John McCain, Mitt Romney, and Marco Rubio. Boot took heart when Republicans initially closed ranks against Donald Trump’s candidacy. Trump is “a madman who must be stopped,” Bobby Jindal said. “The man is utterly amoral,” Ted Cruz agreed. Rubio called him “the most vulgar person to ever aspire to the Presidency.” For Rick Perry, he was “a cancer on conservatism.” Then, one by one, they all endorsed him, and he won.

Trump’s election shook Boot’s world view. Was this what Republicanism was about? Had Boot been deluded the whole time? He wrote a book, “The Corrosion of Conservatism” (2018), about his breakup with the G.O.P. The #MeToo and Black Lives Matter movements, he could now admit, made good points. His advocacy of the war in Iraq had been a “big mistake,” and he felt guilt over “all the lives lost.” Boot was like a confused driver who had arrived at an unintended destination and wondered where he’d missed the off-ramp. When was the right moment to have left the Republican Party?

For many anti-Trump conservatives, the lodestar remains Ronald Reagan. In his sunny spirit and soothing affect, he was Trump’s opposite. Their slogans differed dramatically: Reagan’s “Tear down this wall” versus Trump’s “Build the wall”; Reagan’s “It’s morning again in America” versus Trump’s “American carnage.” Both men survived an assassination attempt, and their instinctive responses were telling. Reagan, though gravely wounded, reassured those around him with genial humor. (To his wife: “Honey, I forgot to duck.” To his surgical team: “I hope you’re all Republicans.”) Trump, in contrast, wriggled free of his bodyguards, raised his fist, and commanded the crowd to “Fight! Fight! Fight!” Three days later, he released a sneaker line featuring an image of him doing so, the FIGHT FIGHT FIGHT high-tops, priced at two hundred and ninety-nine dollars.

Boot grew up idolizing Reagan. “How I loved that man,” he recalled. In 2013, he started writing a book about the fortieth President. His “Reagan: His Life and Legend” (Norton) aims to be the definitive biography, and it succeeds. It’s a thoughtful, absorbing account. It’s also a surprising one. One might expect, given Boot’s trajectory, that this would be a full-throated defense of Reagan, the Last Good Republican. But it is not.

Although Boot once felt “incredulous that anyone could possibly compare Reagan to Trump,” he now sees “startling similarities.” Reagan’s easygoing manner, Boot acknowledges, concealed hard-to-stomach beliefs. Reagan viewed the New Deal, which he’d once supported, as “fascism.” He raised preposterous fears about the Soviet capture of Hollywood, and fed his fellow-actors’ names to the F.B.I. When Republican legislators largely voted for the landmark civil-rights laws of the nineteen-sixties, Reagan stood against them. (He’s on tape calling Black people “monkeys.”) He also campaigned against Medicare, insisting that it would lead the government to “invade every area of freedom as we have known it in this country.” For unconscionably long into his Presidency, he refused to address a pandemic, AIDS, that was killing tens of thousands of his constituents, and he privately speculated that it might be God’s punishment for homosexuality. Then there is his campaign motto, ominous in hindsight: “Let’s make America great again.”

Recent events have forced Boot to ask if Reagan was part of the rot that has eaten away at Republicanism. Boot now sees him as complicit in the “hard-right turn” the Party took after Dwight D. Eisenhower, which “helped set the G.O.P.—and the country—on the path” to Trump.

And yet Boot sees a redeeming quality as well: Reagan could relax his ideology. He was an anti-tax crusader who oversaw large tax hikes, an opponent of the Equal Rights Amendment who appointed the first female Supreme Court Justice, and a diehard anti-Communist who made peace with Moscow. “I’ve always felt the nine most terrifying words in the English language are: I’m from the government, and I’m here to help,” Reagan famously quipped. But he delivered that line while announcing “record amounts” of federal aid. He viewed the world in black-and-white, yet he governed in gray.

Reagan tolerated a gap between rhetoric and reality because, for him, rhetoric was what mattered. “The greatest leaders in history are remembered more for what they said than for what they did,” he insisted. (The example he offered was Abraham Lincoln, apparently rating the Gettysburg Address a more memorable achievement than the defeat of the Confederacy.) When it came to policy, Reagan was happy to hand things off to “the fellas”—his generic term for his aides, whose names he could not reliably recall.

This, too, sounds familiar. Like Trump, Reagan held facts lightly but grasped larger emotional truths. When he uttered falsehoods, as he frequently did, it was hard to say that he was lying. “He makes things up and believes them,” one of his children explained. Reagan’s lies, like Trump’s, were largely treated as routine, as if he were a child who couldn’t be expected to know better. Fittingly, both came from the spin-heavy world of sales and entertainment. Boot points out that Reagan and Trump are the only Presidents who had television shows.

“Did Reaganism contain the seeds of Trumpism?” Boot asks. Usually, that’s a question about each man’s beliefs. Looking at Reagan’s life through Boot’s eyes, though, one wonders about their styles, too. Was there something about Reagan’s way of operating that got us here? (...)

Reagan hovered above the material plane, and others indulged him. “You wanted to help Reagan to float through life,” his longtime adviser Michael Deaver explained. “You’d be willing to do whatever it took to take the load off of him of all the shitty little things that normal people have to do.”

Those “shitty little things” included running the country. Deaver was sometimes called the “deputy President,” but others bore that title, too—the whole Administration ran on delegation. The President offered little guidance even when it came to taxes, his signature issue. “In the four years that I served as Secretary of the Treasury, I never saw President Reagan alone and never discussed economic philosophy or fiscal and monetary policy with him one-on-one,” Don Regan recalled. “The President never told me what he believed or what he wanted to accomplish.” Without direction, Reagan’s aides—the fellas—held extraordinary power. He accepted their views (though he sometimes fell asleep while they presented them), and he rarely sought outside counsel.

Auteur theory interprets films as fundamentally the creations of directors. A similar notion prevails in politics: the idea that Presidents are fully in charge. But when has that ever been true? Reagan knew, from his years on film and television sets, that the face of a production is just a part of it. There was something refreshingly honest in his ceding policymaking to those who knew more than he did. There was also something ironic: Reagan, the foe of bureaucracy, surrendering to the state.

by Daniel Immerwahr, New Yorker | Read more:
Image: Sunset Boulevard/Corbis/Getty
[ed. Having lived and worked through the Reagan presidency (on the receiving end of some of his policies) there's no doubt in my mind that he (and especially his wife, Nancy) was more interested in cultivating his image than in running the country. Which is not to say he didn't install some of the worst ideologues one could find at the time in key positions - Anne Gorsuch at EPA (Supreme Court justice Neil's mother); James Watt at Interior (Mountain States Legal Foundation); Cap Weinberger, Secretary of Defense (Bechtel); and many, many others. He encouraged Grover Norquist (Mr. "drown government in a bathtub") to form Americans for Tax Reform (ATR), which advocated for big corporate tax breaks and opposed any effort to regulate health care, and he worked closely with Newt Gingrich, eventual House Minority Whip, who is credited with creating the extreme party polarization we see today. So, after that, it was game over for moderate, responsible Republicans. Talk about revisionist history: it's always been baffling to me how Republicans nowadays almost confer sainthood on Reagan, who checked out early with dementia, destabilized Latin America, and blew a chance at lasting world peace and nuclear non-proliferation by undermining Russia's recovery after the Soviet Union's collapse (see previous: How the Neocons Subverted Russia’s Financial Stabilization in the Early 1990s). In my mind, the last great GOP president was Dwight D. Eisenhower, who'd be considered a flaming liberal these days. Sad.]

Monday, September 9, 2024

Your Book Review: The Pale King

For the longest time, I avoided reading The Pale King. It wasn’t the style—in places thick with the author’s characteristic footnotes, sentences that run for pages, and spasms of dense technical language. Nor was it the subject matter—the book is set at an IRS Center and tussles with postmodernism. Nor the themes, one of which concerns the existential importance of boredom, which the book, at times, takes pains to exemplify.

No—I couldn’t read The Pale King because it was the book that killed him.
***
Prelude: First Encounter

David Foster Wallace died in 2008, a year before I encountered his work; but I didn’t know it at the time. I was nineteen, with a broken wrist that forced me to drop all of my courses and left me homebound and bored. I decided to revenge myself on these irritating circumstances by spending four months lying in bed, stoned, reading fiction and eating snacks. And I happened to have a copy of Infinite Jest.

What to say about Infinite Jest? It remains Wallace’s masterpiece, widely considered the greatest novel of Generation X. It takes place in a near future where the US, Canada and Mexico have been merged into a single state. Each year is corporately branded, with most of the action taking place in “The Year of the Depend Adult Undergarment.” It’s set in three locales: a drug rehabilitation center, an elite tennis academy, and a Quebecois terrorist cell. The novel clocks in at over a thousand pages, two hundred of which are footnotes. It includes sentences of absurd length, with some descending into multi-page molecular descriptions of various drugs. The book pulls the kind of stunts that shouldn’t work, but in Infinite Jest they do, because the book is that good, the characters that deep, the subject matter that prescient. Infinite Jest is often considered the “first internet novel,” predicting in particular its addictive allure.

The Project of David Foster Wallace

Infinite Jest made Wallace a star. The book was both a literary sensation and a cultural phenomenon, described by one commentator as “the central American novel of the past thirty years, a dense star for lesser work to orbit.” Nonetheless, Wallace wasn’t totally satisfied. “I don’t think it’s very good,” he wrote. “Some clipping called a published excerpt feverish and not entirely satisfying, which goes a long way toward describing the experience of writing the thing.” He grew determined to surpass Infinite Jest with something new.

Wallace aimed to write fiction that was “morally passionate, passionately moral.” He believed that “Fiction's about what it is to be a fucking human being.” His active period spanned the late 80s to the 00s, cresting during the cynical 90s, the age of the neoliberal shrug, when, on one hand, “Postmodern irony and cynicism's become an end in itself, a measure of hip sophistication and literary savvy,” and, on the other, the average American parked himself in front of the television for six hours a day.

His major concerns were:

1) How to transcend postmodernism

2) The deforming effects of entertainment culture

Postmodernism can be understood as the idea that we’re so trapped within language that reality remains remote. At its most extreme, postmodernism seems to suggest that language is all that exists. In politics, this manifests as movements that focus on how people speak, much more than movements of the past; and in literature, as writing that aims not to immerse the reader in a plausible world, but to keep the reader hyper-focused on the fact that they’re reading a work of fiction. Wallace began his literary career as a postmodernist, before swerving away mid-career, most dramatically with Infinite Jest.

He wasn’t some simple reactionary. His work wove in postmodern self-awareness, metacommentary, and irony, all while arguing that we had to transcend postmodernism. And to do so, we would need the very principles it had spent the past half-century deconstructing: decency, sincerity, responsibility, neighborliness, sacrifice. (...)

The Pale King: Central Concerns

After Wallace’s death, his editor Michael Pietsch assembled the manuscript, winnowing it down to a set of consistent characters and a generally forward-moving narrative. Infinite Jest famously ends before the climax, major plot threads dangling, and so does The Pale King—but while the former's abruptness is cruelly deliberate, The Pale King remains unfinished through tragic happenstance, major themes underdeveloped, story nascent.

The plot: a group of IRS hires converge on an examination center in Peoria, Illinois, circa 1985. There’s the sense that once they’re there, things will start happening, but nothing really does. The chapters alternate among the 1985 story, character backgrounds, debate/discussion of the deeper philosophical meaning of the IRS, metanarrative written in the voice of 2005 David Foster Wallace, and scraps of trivia/world-building/slices-of-life. (...)

Pale King: Themes

The plot builds towards a war over the future of the IRS: one side wants the IRS to remain committed to civic virtue, its tax examinations carried out by humans; the other wants the IRS focused on maximizing revenue, its examiners replaced by computers. The IRS here stands in for all institutions where people operate both as individuals and as part of a larger collective: the conflict between the IRS as civic organization and the IRS as corporation reflects a broader conflict that played out in the 80s, and arguably still does today.

Wallace is, of course, on team human. His criticism of the profit motive parallels his rejection of minimalism, the aesthetic of postmodernism: when we reduce reality to a thin, abstract variable, whether that be profit or discourse, we mutilate it. And once we’re there, all that’s left is our role as solipsistic consumers. (...)

Chris Fogle’s story sits close in the book to a philosophical dialogue concerning the nature of the IRS and the moral crisis in society. As one character expounds:

‘It’ll all be played out in the world of images. There’ll be this incredible political consensus that we need to escape the confinement and rigidity of conforming, of the dead fluorescent world of the office and the balance sheet, of having to wear a tie and listen to Muzak, but the corporations will be able to represent consumption-patterns as the way to break out—use this type of calculator, listen to this type of music, wear this type of shoe because everyone else is wearing conformist shoes. It’ll be this era of incredible prosperity and conformity and mass-demographics in which all the symbols and rhetoric will involve revolution and crisis and bold forward-looking individuals who dare to march to their own drummer by allying themselves with brands that invest heavily in the image of rebellion. This mass PR campaign extolling the individual will solidify enormous markets of people whose innate conviction that they are solitary, peerless, non-communal, will be massaged at every turn.’

This speech is set in the 80s, but was written in the 00s, when the internet was nascent and social media hadn’t yet taken off. Wallace’s diagnosis is prescient: between Quiet Quitting and Live to Work, young people are rejecting the tedium of office life and embracing the life of the influencer, which does indeed involve both the trappings of rebellion and conspicuous consumption.

It hasn’t gone down exactly as Wallace predicted. He was concerned about the withering effects of hedonism (which, true to his predictions, have persisted), but he underestimated the resurgence of doctrinaire political ideology.

The Pale King is in many ways revanchist, arguing for the reclamation of territory lost to hedonism in the name of old-fashioned ideals like civic responsibility, neighborliness, and going to work every day. And revanchism has certainly made a comeback: today we face a proliferation of conservative/Trad movements, but very few seem interested in rehabilitating old-fashioned civic virtue. Cynicism about societal institutions is endemic on both the right and the left, perhaps with good reason: while a bureaucrat in the 80s could expect to own a home and support a family, these days an ‘ordinary’ job doesn’t cut it. The IRSs of the world have taken the path that Wallace warned against, embracing automation and the bottom line, and neglecting the real, human needs of the people they’re meant to serve.

The Millennial/Gen Z complaint is real: economic conditions are harder than they were in the 50s/70s/90s; the world of our parents no longer exists; starting a family is exorbitantly expensive. So why should we subject ourselves to bureaucratic tedium and keep society running, when society doesn’t seem to care much about us? (...)

The Path Forward

Wallace suggests that boredom, far from being something to avoid, might point the way to deeper self-knowledge. “Maybe dullness is associated with psychic pain because something that’s dull or opaque fails to provide enough stimulation to distract people from some other, deeper type of pain that is always there, if only in an ambient low-level way, and which most of us spend nearly all our time and energy trying to distract ourselves from feeling, or at least from feeling directly or with our full attention.” Boredom might even gesture towards enlightenment: “It turns out that bliss—a second-by-second joy + gratitude at the gift of being alive, conscious—lies on the other side of crushing, crushing boredom. Pay close attention to the most tedious thing you can find (tax returns, televised golf), and, in waves, a boredom like you’ve never known will wash over you and just about kill you. Ride these out, and it’s like stepping from black and white into color. Like water after days in the desert. Constant bliss in every atom.”

In Wallace’s conception, boredom isn’t only personally enlightening—it can also be a heroic sacrifice for the collective good. At one point Chris Fogle wanders into the wrong classroom and ends up in the exam review for Advanced Tax, taught by a capable and dignified Jesuit (possibly the eponymous “pale king”). The Jesuit delivers a speech that sparks an epiphany in Chris, declaring the profession of accounting a heroic one: “True heroism is you, alone, in a designated work space. True heroism is minutes, hours, weeks, year upon year of the quiet, precise, judicious exercise of probity and care—with no one there to see or cheer.”

There it is: the vision, the cure, the path forward. We accept the burden of adult responsibility, go to work every day and engage in the important but unglamorous work that keeps society running. We orient our institutions not towards money but principle. We refuse to treat people like numbers or cogs or some great undifferentiated mass—we treat them as fully human, always, even and especially when they’ve chosen to subsume some part of their individuality to a soul-killing institution, because we recognize this as a heroic sacrifice they’re making for the good of the collective. And we withstand our negative emotions, embrace them fully, travel through their every texture until we transform and open to a deeper and richer experience.

The problem with all this, of course, is that in the middle of writing the book, Wallace killed himself.

by Anonymous, Astral Codex Ten |  Read more:
Image: The Pale King/Amazon
[ed. I've met three people in my life who've read Infinite Jest in its entirety - which is exactly three more than have read The Pale King. I guess a 600-page novel about boredom and the IRS doesn't exactly scream best-seller. But I enjoyed it (like I enjoyed Infinite Jest, more for certain component parts than as a complete/coherent whole). For example, the happy hour/after-work get-together, with the beautiful Meredith Rand's sudden appearance and its effect on group dynamics, still makes me laugh. See also: Maximized Revenue, Minimized Existence (NYT); and, Men Recommend David Foster Wallace to Me (Electric Lit).]