Tuesday, July 30, 2024

Scale of Life: Everyday Life in Real Time

Scale of Life (link)

What is The Scale of Life?

Our site is a "real-time" visualization of the relative scale of different life events and natural phenomena (details on what "real-time" means below). Using the dropdowns, you can select from various categories, time periods, and some unique units of measure that we created to modify the counter lists.

Each counter counts up in sync with the event listed unless the pause button is selected. You can also click each counter to get sources as well as thought-provoking articles and videos that explore the topic in depth. (...)

We make a point of generally choosing statistics that are researched by professional agencies or institutions, or are provided directly by a company, person, etc. involved (e.g., the production volume of a product). Many statistics of this kind, like births, commercial flights, red blood cells created, Legos manufactured, the speed of light, etc., are very well tracked or researched.

While this site is intended primarily for a lay audience, we make our best efforts to obtain the most accurate data available; if we can't find data for a statistic that meets our (patent pending) "Scale of Life Data Trustworthiness" standards, we don't use it.

When creating a final calculation, we will generally weight and average multiple data sources depending on their perceived quality. In some cases we select a single, most authoritative data point. This is a highly subjective process, which is why we make the "per day" calculation and sources available so you can evaluate it for yourself.
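To make the arithmetic concrete, here is a minimal sketch in Python of how a weighted "per day" figure could drive a live counter. This is an illustration only, not the site's actual code, and the sources, figures, and weights below are hypothetical examples:

```python
# Minimal sketch of how a counter like ours could be driven (illustration
# only, not the site's actual code; all figures below are hypothetical).
import time

def per_day_estimate(sources):
    """Weighted average of several per-day figures, weighted by perceived quality."""
    total_weight = sum(weight for _, weight in sources)
    return sum(value * weight for value, weight in sources) / total_weight

# Hypothetical example: three estimates of worldwide daily births.
sources = [(385_000, 0.5), (370_000, 0.3), (360_000, 0.2)]
rate_per_second = per_day_estimate(sources) / 86_400  # 86,400 seconds per day

start = time.time()
for _ in range(10):  # tick once per second for ten seconds
    elapsed = time.time() - start
    print(f"\rBirths since you started watching: {elapsed * rate_per_second:,.0f}", end="")
    time.sleep(1)
```

The counter value is simply the elapsed time multiplied by the per-second rate, which is why pausing or reloading changes nothing about the underlying estimate, only about where the count starts.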

Image: Kalen Emsley, Wet Mountain Valley, via Wikimedia Commons

The Seagulls Descend

So I guess it's time to write about elections.

In the United States, I've noticed that elections seem to be a time for politically activated people to come together to bicker and fight and make points that everyone has already heard and considered, and prognosticate and proclaim and act like pundits even though we aren't pundits, and then, after a year or two of this, low-information voters who haven't been paying the slightest bit of attention at all swoop in during the final few days and decide the outcome based on something that they think they read somewhere. If a seagull has ever stolen some slightly unappetizing carnival food out of your hands after you stood in line for an hour to get it, you'll have a general idea how this feels.

Democracy is the way we choose our political representation, and while the electoral process currently presents choices that an increasing number of people find too unpalatable to consider, it's also a tradition that a gang of brutal fascists led by TV star Donald Trump are trying rather openly to bring to an end, so I'd observe that elections appear to be something that this gang of brutal fascists believe will not help their agenda, and ending them appears to be something that will help their agenda, and they're going to an awful lot of trouble to kill them off. There's a danger to focusing overmuch on elections, as if elections are the sole component of a politically activated life, but it seems to me that there's also a danger to ignoring elections, as if elections don't matter.

So today I'm thinking about our national flock of low-information seagulls, and those who feed them.

I understand why somebody would choose to be low-information. There's a cost to awareness. Awareness of a wrong creates a moral imperative to care, and that sort of thing can lead to real work and real expense and other inconveniences. And, in a time of rising fascism, the cost of awareness is made unnaturally high, because few things are more deadly to fascism than awareness—while ignorance serves fascism's ends.

And yes, it's tempting to stay in unawareness—to be a seagull, if you're at a level of privilege where that’s still allowed, floating along undecided on whatever breeze is most convenient, disengaged from reality, freed from natural responsibilities that attend civic life in the natural human system that is society. When somebody is low-information, the most important thing is staying low-information. Yet nobody likes to be thought of as unaware. What other people think of them seems to be a real issue for seagulls.

I think this is why, especially at election time, there are two sandwiches (binaries again!) that seagulls crave most: ease and permission.

"Ease" means whatever choice allows the most comfort. "Permission" means the most comfortable choice that can be made which still allows the chooser to be thought of as decent by those whose regard matters to them—which includes themselves. What's required are simple narratives, because human beings are creatures of narrative, and so a low-information voter needs a narrative to give themselves. Something simple to tell the mirror. Something simple to tell their friends. A bunch of people waved MASS DEPORTATION NOW signs at the Republican Convention last week, which reflects a desire to enact inevitable terror and tragedy and abuse and death for millions of our friends and neighbors. And a bunch of elected representatives in Congress hosted a man who has been bombing thousands of civilians to death using our bombs and our tax dollars, and starving them using our political support, among other atrocities, and they cheered and cheered for him. These sorts of things can only happen when permission has been manufactured.

Which suggests a booming market for simple narratives.

Let's keep the binaries going, and say that simple narratives typically fall into two buckets: equivalences and differentiators.

Equivalences assist fascism most, in case you didn't know, and false ones are best. As the quote goes, the opposite of love isn’t hate, but indifference.

Differentiation seems like it would probably increase awareness more, so it strikes me that it might be good if our election narratives, especially the simplest narratives, provided the most important points of differentiation rather than the most mendacious equivalencies.

I can't help but notice, however, that our legacy media, which so often treats being low-information as normal and even admirable, seems to be mostly in the business of manufacturing false equivalencies rather than identifying true differentiators.

My feeling is that journalistic neutrality is important; a journalist shouldn't allow their political beliefs to interfere with uncovering and publishing the truth and the evidence for that truth, whatever that truth might be. This would be a principle of neutrality grounded in evidentiary and investigated truth. And there are still a lot of journalists out there who actually hold to this type of neutrality, who work hard and risk their reputations and their well-being to deliver the truth no matter what, and journalists of this kind are being made to pay an increasingly high price for this sort of neutrality, because this sort of neutrality raises awareness, and awareness doesn’t serve the interests of power.

However, this is not what appears to be the primary mission of large parts of the mainstream American journalism project these days.

Journalistic neutrality is—within American political media anyway—the idea that news organizations shouldn't put their finger on the scale for either political party. This sounds fair on the surface, but this means its goal is not investigating the truth, but rather establishing balance between two sides—which flattens everything into just two sides, and then flattens both sides until they are equivalent.

A moment’s thought reveals this as a posture that requires putting the finger on the scale; ignoring and submerging huge swaths of truth in order to make the better side seem worse and the worse side seem better—and the worse the worst side gets, the more its crimes against decency must be artificially ignored, buried, pasteurized, neutralized, all in order to create the narrative that both choices are essentially the same. The worse the other side gets, the more this form of neutrality requires our journalistic institutions to stop reporting what is true, because reporting the truth as plain truth would mean reporting not only that the worse party is worse, but how much worse they have been allowed to get, growing in a field fertilized by this cynical ersatz journalistic neutrality. And it's not just the worse side that is the beneficiary. When the project is crafting equivalencies, normalizing badness takes on its own momentum, and those points of badness that both sides truly do share equally are normalized for both as good, or at least natural, or at least a given, the unchangeable reality, "how things are."

It's not that truths aren't reported. It's that they are under-reported, printed below the fold, hidden on page 13, festooned with weasel words and equivocating headlines. Lies are insinuated as possibilities by people who are "just asking questions." People trying to do some good and necessary thing are advised in open letters that goodness is a mistake, by people who (though we are not told this) are paid operatives for preventing that good thing. Truths are buried entirely, omitted from questions asked in debates, in press conferences, in talk shows. They're negated by equivocating wish-casting punditry that posits nonsense hypotheticals, alternate realities where the clear problems might not be real dangers to worry about, because at any moment, reality itself might change. Credulity is extended to individuals and institutions who have proved themselves inveterate liars, grotesque hypocrisies are politely ignored. Incredulity is manufactured based on the most specious pretexts and set against clear evidence. And more and more, what is reported is not so much investigating what is as speculating how might it be received? It's reporting not on the bloody present but some hypothetical shadow future, so that, when the seagulls finally grab at the proffered falsehoods, the fact of their acceptance is breathlessly reported as evidence that the chosen lies are true—reported by institutions that neglect to mention it was they who made the sandwich, it was they who waved it in the air.

Let's have an example, shall we?

The president is old, and his capacities appear to be failing. Maybe you've heard. It's a real story. It appears to be a real problem. It's a story that needs to be covered, in my opinion, and a journalistic neutrality committed to truth would report it in the context of everything else, and would give it the weight it deserves beside all other issues.

In my opinion that's not what we've been getting.

It had been pretty clear ever since his disastrous debate performance last month that Biden's age—which, again, is a relevant story—was the story our legacy media was going to obsess over. And we all obsessed over it too, most of us, and as we all flew with the flock, it became the One Topic.

I can’t help but notice how this topic made such an easy counterbalance—the latest one—to all the narratives about Trump from "fascist" to "rapist" to "insurrectionist" to "threat to democracy" to "unfit" to "pathological liar" to "flagrantly corrupt" to "34-time felon," all of which are facts that have been disguised as opinions beneath a mask of weasel phrases and punditry and predictions that become self-fulfilling—none of which ever get down to the business of actually framing the entire differentiated picture for voters, all of which seem to be intended to create simple narratives of false equivalency for seagulls.

But then ... an event! An assassination attempt! The media seemed poised to switch the whole narrative over to the dangers of violent rhetoric—Biden had used the word "bullseye" and had responded to Trump's open threat to end democracy by pointing out that Trump is a threat to end democracy, and somebody had taken a shot at Trump—but unfortunately for the media, the shooter was a Republican and his motive remains unclear and the facts muddy. And perhaps you noticed that, just as suddenly as normalized political violence—a very real problem—became a concern, it stopped being one to our manufacturers of simple narratives.

I'd say this reveals that the real mission has always been crafting a false equivalence, one that would benefit the party most engaged in the normalization of political violence, in order to maintain the illusory neutrality that such a false equivalence projects. After all, if the concern really is normalization of political violence, there's a much easier and simpler narrative to craft about it. Trump's entire life as a public figure is marked not only by normalization of political violence but a proud celebration of it. If the media's concern was political violence, we'd get a picture of political violence that makes it clear that the Republican Party and their leader are the root cause of our atmosphere of normalized violence, even if they momentarily experienced some of that atmosphere’s malign effects. But if the media's true concern is building an equivalency that manufactures a sense of unbiased neutrality, then you'd find them trying out political violence as a message, then abandoning it when the ingredients proved unpalatable. Which, I'd argue, is exactly what we saw.

If our "neutral" media had been able to make Joe Biden's candidacy about normalizing political violence, it would have been able to create an equivalence with one of Trump's most frightening and alienating qualities—in service of the most violent party. Since the equivalence didn't take hold, the entire story—an almost successful assassination attempt!—was mostly discarded, and we returned to endless obsession over the fact that Biden is old and shows signs of cognitive decline, in such a way that obscures the observable truth that Trump is also old and has been utterly deranged since he was young. Equivalencies can work when they are false, for sure, but our equivalency-manufacturing media really salivates when they find one that feels true that they can keep making all the way to November. And Biden is old.

"Rapist" versus "Not Rapist" doesn't offer a seagull as much permission and ease as "Violent" versus "Also Violent." If the former is the simplest narrative available when a seagull swoops in, many will choose "Not Rapist," because—even if they don't really give a shit—choosing "Rapist" creates a narrative about themselves that disturbs their comfort, and comfort was the whole reason for staying proudly ignorant about most things in the first place. A focus on "Old" versus "Popular" meanwhile, deliberately downplays and ignores the whole "Rapist" angle, which signals that ignoring the "Rapist" angle is something reasonable that reasonable people can do, freeing seagulls to choose whatever sandwich they want, and one thing I’ve learned about people who don’t give a shit is that they usually choose whatever meat is bloodiest.

And so, in Milwaukee last week, after three days of ritual hate, the Republicans trotted Donald Trump out on stage, where he delivered 90 minutes of nonsense and threats and lies, and the simple narrative about Trump's speech rolled out on the front pages, proclaiming not that he was a RAPIST, or a FASCIST, or AUTHORITARIAN, or a BIGOT, or an aspirational DICTATOR, or an INSURRECTIONIST, or clearly UNFIT for office, or a mass promoter of political VIOLENCE, or a THREAT to end democracy, or any of the other things that he manifestly is.

In the same way that they have decided that the narrative about Biden will be OLD, they decided that the simple narrative about Trump—almost as elderly, far more unfit, utterly deranged, the most divisive politician in a century, speaking in front of a crowd waving signs calling for rounding up the undesirables—was UNIFIER, on front page after front page after front page, based not on anything he is or anything he said, but only on the incorrect a priori speculation that a near-death experience might perhaps have changed him.

It's one of the most chilling things I've ever seen: the legacy media abandoning even their false neutrality to create a totally alternate reality, for the direct benefit of the worst person imaginable. It was an open capitulation to the fascist demand that media enter their misinformation stream and report on whatever it is they want said as if it were real. And it created permission for low-information people, who don't give a shit for anything beyond their own ease, to ignore reality; false equivalence where a more principled neutrality would delineate the differences.

And so the seagulls descended.

But then Joe Biden dropped out of the race. And suddenly there was a simple differentiator being offered, one that has proved rather popular so far: FORWARD vs BACKWARD.

So now the seagulls have scattered again, as confused as ants in a freshly kicked hill, while our narrative manufacturers scramble desperately to decide which of the new journalistically unbiased pro-fascist narratives will be most palatable, and the creepy fascist weirdos in the Republican Party, bereft of their talking points, resort to things like "she's a childless trollop!" and "I hate her laugh!"

Meanwhile we all try to predict what the seagulls will do.

I suppose I should make some predictions.

I should confess now that I have no idea what is going to happen. Some things seem more likely than other things, but as to what will come from all this, I simply don't know. This is something I've learned over the years by having been very confident about such things and being proved wrong. I used to know a lot more, and I knew it very confidently. In 2016 I was sure Trump couldn't win. In 2020 I was sure Biden couldn't win. Wrong and wrong. What the hell do I know about the future behavior of seagulls? Not much.

This is a bit of a disqualifying statement in our modern age. These days we are mostly in the business of punditry, especially around elections. We are encouraged to behave exactly like our media: to obsess over what is going to happen if person X does or says Y. We're invited to make savvy predictions and craft our support of candidate and policy and slogan strategically, not based on what is right or wrong or desperately needed, but to match our support to whatever a centrist voter in Pennsylvania who lives only in our imagination will or will not support. We're meant to know exactly which sandwiches the seagulls will find most appetizing before they descend and then position ourselves, not where we ought to be to effect necessary change, but wherever we think the flock will be, not so we can effect that change, but only so we can have been right ahead of time.

So often it feels as if we're accepting the current media framework of speculation and prediction and punditry, not so much dealing with what is or contending for what should be but living in a bleak and unbroken shadow future, where everything is already decided, which frees us from the moral imperative to have to do anything. It’s just as freeing in a way to believe everything is doomed as it is to believe that everything will be fine; either way you don’t have to do much thinking or work, or even take the next step that will allow us to take the next step, however easy or hard or palatable or unpalatable that next step might be.

So we behave as if we are political operatives, predictive wizards, demonstrating not our commitment to a better vision of the future by contending with reality here in the present and working for best outcomes, but rather our ability to know what will happen before it happens, so that when it happens we can say see? and if we are wrong, we just run on to the next topic, chasing seagulls. Trump survived an assassination attempt and my feed was full of instant surrender: my God that image, it's so iconic, the Democrats are through, we're all cooked. This weekend Biden resigned and everyone is proclaiming that the Republicans are cooked. We argue over whether the nominee should be Harris or a brokered convention, proclaiming which will succeed, and deciding what we will support based on that. We argue over the VP selection as if our preferences will have any bearing on the decision.

This is how our crafters of narratives operate. It doesn't have to be our way.

As fascism is most assisted by false equivalence, I think we'd do better to seek differentiators from both fascists and those who manufacture false equivalences.

How might we differentiate?

In answer, let me make some speculations and predictions of my own as promised.

by A.R. Moxon, The Reframe |  Read more:
Image: uncredited
[ed. Nice to see someone do a no-BS job of framing what everybody feels but can't quite articulate (well, mostly everybody... I hope?). We're being played by professionals all the time who have years and years of experience in crafting public opinion, especially around politics and media news/images/stories. There are college courses on this stuff.]



Monday, July 29, 2024


Gretchen Bender, YOU & ME ACCESSIBLE. 1984
via: Speaking in Tongues (N+1)

Andrei Lankov: The Real North Korea

The Real North Korea: Life and Politics in the Failed Stalinist Utopia, Andrei Lankov (Oxford University Press, 2014).

In the 1980s, Japan experienced a crisis of disinformation. For years, there had been mysterious disappearances of Japanese people with no known history of mental illness, drug addiction, or gambling debts. All kinds of people — men and women, young and old, just suddenly vanishing without a trace. Many theories were put forward to explain the puzzle (for instance, some believed it was alien abductions), but the most widespread, pernicious, and dangerous view was that North Korea was responsible. There were people who claimed to have actually seen teams of North Korean commandos lurking on beaches, nabbing random passers-by, and bundling them into waiting submersibles just off the coast. This was obviously crazy. Products, no doubt, of atavistic xenophobia and reactionary sentiments. The Japanese media, government, and academic authorities put a lot of effort into refuting this dangerous disinformation throughout the 1980s and 1990s…which made them look real silly when in 2002 Kim Jong-Il issued a formal apology for the abductions and ordered the surviving captives returned to Japan.

This has always felt like the ur-North Korea story to me, because it has a little bit of everything. First of all, it’s delightfully madcap — they KIDNAPPED RANDOM PEOPLE on BEACHES using SUBMARINES and they did it for DECADES. Second, it’s full of bizarre irony. The North Koreans got away with this scot-free until, in a gesture of goodwill and altruism designed to improve relations with Japan, they fessed up and tried to make things right…at which point everything blew up in their faces and had the exact opposite effect. But third and perhaps most important, beneath the cartoonish antics and the bumbling diplomacy, there is thoughtful rationality at work. And that is perhaps the most important message of this book — everything the North Koreans do, including the stuff that seems crazy, perhaps especially the stuff that seems crazy, is actually deeply considered, strategic, and rational. These aren’t crazy people, these aren’t aliens, these are people with a very strange value system and a very strange situation, and if you put yourself in their shoes, all of their actions make a ton of sense.

Does that sound like good news? It’s actually very bad news.

Before we get to all that, though, let me tell you about this book. The author, Andrei Lankov, knows a lot about North Korea. Lankov grew up in the Soviet Union and attended university during glasnost. But while all his buddies were wearing jeans and doing student exchange programs in Western Europe, Lankov got chosen to study at… *sad trombone* Kim Il-Sung University in Pyongyang. He’s been back many times since then, speaks fluent Korean, chats frequently with North Korean defectors, and now has a teaching position at a university in Seoul. But Lankov has one other very important quality: that peculiar Slavic combination of grim fatalism and bleak humor. An American might get huffy or moralistic writing a book about decades of mass slaughter. It takes a Russian to treat the same topic with a deadpan sense of irony, thereby resulting in a very depressing book that is also very funny.

Lankov starts with a whirlwind tour of the early years of the North Korean state. The official histories all informed me that the Ever-Victorious Generalissimo was born to poor Korean farmers, so I was surprised to learn that actually he was born to a moderately affluent family of schoolteachers, doctors, and Christian activists in Northeastern China. Similarly, my DPRK handlers had always taught me that the Eternal President spent the 1940s leading daring guerrilla raids into Korea (with his pregnant wife) from the holy slopes of Mount Paektu, so I was shocked by Lankov’s claim that he was actually chilling in Russia at a military base near Khabarovsk. Finally, I had always assumed that the Great Leader drove out the Japanese himself via the indomitable Juche spirit of the Korean people, but Lankov has the temerity to suggest that he was installed by the Soviet army instead.

Okay, so maybe Kim Il-Sung had a little bit of foreign assistance… Okay, maybe more than a little, maybe the Sun of the Nation was hand-picked by Soviet officers who also wrote the North Korean constitution, wrote all of the initial laws, and selected the precise geographic and social composition of the nascent North Korean parliament. Who cares, the important thing is that they picked poorly. Kim Il-Sung was dangerously independent, not interested in being a mere puppet, and immediately began triangulating between the Soviets and the Chinese. After all, there were things to admire about both systems — he appreciated the ideological austerity and florid personality cults of Maoism, but loved the Stalinist emphasis on centralization and heavy industry. No matter, he could pick the best aspects of both systems and combine them here, in Korea. He could build paradise.

And he could begin from a wonderful starting point. The North had a fraction of the population of the South, but it was blessed with abundant mineral resources and was quite rich and urbanized by the standards of early 20th-century Asia. Remember that the architects of Japan’s postwar industrial policy had originally been colonial administrators in Manchuria and Korea. Their legacy was a vast capital stock of heavy industry, mines, and factories — later expanded upon by the Soviets. There was more than enough here to deliver on Kim Il-Sung’s promise to the peasantry that they would “eat boiled rice and meat soup, dress in silk, and live in houses with tile roofs.” (...)

This was a pretty nice starting point, and then the Korean War and some subsequent purges enabled Kim to complete what may be the most perfect totalitarianism ever constructed. The model was Stalinism, but without all the dangerous liberality and the other compromises that crept into Soviet government over time. So for instance, in the Soviet Union under Stalin it was legal to buy food at a market. Not so in Korea under Kim, where all food would be distributed via a state rationing system. And while it was expected of Soviet graduates that they would go on to find a job, in the DPRK all jobs would be allocated by the authorities.

Another way in which Stalin was dangerously liberal was his approach to punishing dissent. A curious fact about North Korea is that it has an almost premodern caste system called songbyūn. Songbyūn is inherited via the paternal line, and if you have “good songbyūn” then your life is pretty much made from the day you’re born — you will receive a good education, good work assignments, and extra rations. If you have “bad songbyūn,” on the other hand, you are a despised slave, your life will consist of grinding labor and constant humiliations, and you will live and die with the knowledge that your descendants will suffer this too. What determines if you have good or bad songbyūn? Pretty much just what your male line ancestors were doing in 1945. There are vanishingly few ways to change songbyūn in either direction, but one of those few ways is via acts of disloyalty to the state. So a North Korean tempted to dissent has to reckon with the fact that not only will he likely be condemned to torture and execution, but his descendants will be degraded to the status of human cattle. Not just his children and his grandchildren. All of his descendants. Forever.

Similarly, it’s a well-known fact within North Korea that if you commit a crime that gets you sent to a prison camp, your entire family will be sent there as well, which is a good bit stricter than Stalin ever got. (...)

Anyway, while he was building utopia inside of his country, Kim had to reckon with an increasingly hostile world outside. Relations with the Soviet Union got very strained after the death of Stalin. Kim admired Stalin as a great man, even if one marred by sentimentality, but his successors were a dangerous pack of bleeding-hearts. So Kim ordered all student exchanges to be cancelled, all Soviet advisors sent home, and all North Korean husbands to divorce their foreign wives (who were promptly expelled from the country). But just when the “pivot to China” was nearly complete, Mao kicked off the Great Proletarian Cultural Revolution, and Kim watched in horror as his neighbor descended into antinomian chaos. Before long, young Chinese Red Guards were openly criticizing Kim as a “neo-feudal ruler” (imagine), and so North Korea expelled the Chinese ambassador in turn.

As a result of all this, North Korea embraced an “equidistance” policy of diplomatic balancing, wherein it used a combination of ambiguity, guile, and blackmail to squeeze escalating economic concessions out of its now mutually-hostile sponsors. The Sino-Soviet split made this strategy extremely effective. Both sides wanted North Korea friendly to them, but above all they wanted it not to be friendly to their opponent. So North Korea was able to maintain a dollar auction where they continuously demanded that each side increase their bribes or else they’d go over to the other side. Over time, these bribes…I’m sorry, this “aid” came to constitute a majority of the North Korean economy. (...)

Everybody looking on expected the North Korean state to collapse, like many communist regimes across Eastern Europe just had. But they did not reckon with the perfect society that Kim Il-Sung, now on his deathbed, had built. Despite all advanced technological inputs vanishing, despite social organization regressing to iron age levels, and despite mass starvation of the kind most of the world hasn’t seen since Norman Borlaug, the state clung on. The sublime machinery of oppression was able to maintain its icy hold even as something close to an apocalypse unfolded. But that isn’t to say that nothing changed. The songbyūn and the prison camps and the Kim family cult all remained the same, but under the surface some big things were happening.

Imagine that you are a North Korean prison guard (that’s some good songbyūn!). For years you’ve been supervising the slave laborers at one of the massive 1930s-style collective farms that grow all the food for the country. Your orders are clear. You need to maintain order, maintain proper veneration of the Dear Leader, maintain physical and spiritual hygiene. But lately the supply of old Soviet fertilizers has run out, and the farm isn’t producing anything. Your orders are clear. It doesn’t matter that scratching at the unyielding ground is now fruitless, the slaves must continue to do it. But…if a few of them, after spending all day pointlessly failing to grow food in your collective farm, then sneak away at night and do a little extracurricular farming up in the mountains, and if they happen to give you some of the food that they produce… Your orders are clear. Maintain order, maintain proper veneration of the Dear Leader, maintain physical and spiritual hygiene. It doesn’t say “stop people from growing unauthorized food in the middle of the night” anywhere. Your orders are clear.

Lankov argues that through the crucible of famine and economic collapse, the world’s most totalitarian society seamlessly transformed into a surreal hybrid of totalitarianism and anarcho-capitalism. The farms are a good example. Approximately all of the food is now grown off-the-books, unauthorized, in the middle of the night and then sold by private dealers. But the truly demented part is that the vast, centralized state-run farms are still there, occupying all the good land, producing nothing, fully-staffed by slave-farmers who go through the motions all day, and then sneak off at night to grow food for sale in their private plots. But those private plots that produce everything are necessarily located in the worst and least productive soil, soil that the state has officially written off.

Or consider the factories. The black market economy that comprises the vast majority of North Korean GDP is dominated by women. This is a curious setup for such a fearsomely patriarchal society. But actually…it’s because the society is fearsomely patriarchal. The men are allocated jobs by the state. Many of those jobs are jobs in the factories. The problem is that the factories don’t actually exist anymore, and the machinery inside them has all been sold off for scrap metal. But the men have been assigned jobs, and the jobs are in the factories. So they sit in the rotting, empty shells of factories that haven’t functioned for decades, and that takes up a lot of their time. But the women, ah, the good North Korean woman is a housewife and a mother and a homemaker, which means she does not have a fake job in a fake factory, which means she can work a real job in a secret workshop in her house producing unauthorized goods, or buying and selling them in an unauthorized market. (...)

This “secret privatization” of the entire North Korean economy has been incredibly thorough. It’s estimated that around 80 percent of all goods and services in North Korea are provided in secret and in shadow. It’s capitalism as an extremophile species of lichen, colonizing the cracks and crevices of the official society, and keeping the whole system afloat. They are actually speedrunning the entire history of primitive accumulation leading to investment leading to the joint stock corporation. Large (secret) transportation companies now exist in North Korea and maintain unofficial roads forming an unofficial transit network. The trucks and buses are smuggled in from abroad, then “donated” to various government agencies, which then lease them back in exchange for kickbacks. In this way, they’ve reinvented the idea of funding government operations through corporate taxation in a hilariously roundabout way. There is a booming private restaurant scene.

The North Korean government occasionally tries to crack down on all of this, and it’s very important to understand why. It’s not, as you might assume, because they’re true-believing hyper-Stalinists who are ideologically offended by the existence of capitalism. No, the reason they don’t like it is because it’s making their society richer and more functional.

by John Psmith, Mr and Mrs Psmith's Bookshelf |  Read more:
Images: uncredited

Andy Warhol and Jack Nicklaus
via: misplaced


Cy Twombly, Untitled, 1970

Sunday, July 28, 2024

Lessons from the East Asian Economic Miracle

I’ve always had highly libertarian instincts, for both pragmatic and ideological reasons. You say civilians should be able to own rocket launchers, I demand that these rocket launchers not face a sales tax. But for me and people like me, the East Asian economic miracle poses a serious challenge: the greatest anti-poverty program in history involved not just a lot of capitalism, but a ton of state intervention as well. The history of East Asian economic growth in the last half of the twentieth century is a history of academics and the World Bank insisting that their policies couldn’t possibly work, followed by decades and decades of torrid growth.

So I decided to look into it. I read a few books (skip to the end for sources and recommendations), and learned a ton. As it turns out, though the Miracle is largely a triumph of capitalism, it also illuminates that economic growth depends on judiciously insulating certain parts of the economy from market forces.

In a way, you can look at success stories like Japan and South Korea as a different instantiation of two very American institutions: venture capital and private equity. The difference is that it was VC and PE as practiced by the state, rather than by individual companies; the timelines were longer, the plans were bolder, and the results were stunning.

No American company or investor has improved as many lives by as wide a margin as MITI, Park Chung-Hee, and Deng Xiaoping. But the methods are, at heart, startlingly similar: identify a critical inflection point, make a bold bet on an unproven market, “blitzscale” as quickly as possible, deftly react to crises, and carefully expose budding monopolists to hormetic doses of competition until they’re strong enough to monopolize on their own merits.

This is not just a theoretical exercise. An active trade policy has landed suddenly back on the policy menu in the US and other places. In the past, we’ve done it sort of shame-facedly, giving nice subsidies to Iowa in exchange for their tactically priceless supply of early caucus dates, periodically bailing out American companies when international exporters teach them the meaning of the word “competition,” and occasionally gesturing towards energy independence without putting any meaningful goals to paper.

But now we’re talking about trade in a more adversarial way, which is appropriate inasmuch as, in international trade, America has many adversaries — not places that hate us, just places that, as a side effect of their own policy goals, end up harming American economic interests. Of the many possible trade policies, most are bad and free trade is — on net, and in the aggregate — the best. However, the simple Ricardian model of trade makes a few untenable assumptions, and other countries have learned to exploit them. In the spirit of free inquiry and open debate, I am happy to explain why anyone who advocates either contemporary protectionism or contemporary free trade is wrong.

But we’ll have to start at the beginning: how did East Asian countries get so rich, so fast?

by Byrne Hobart, Medium |  Read more: 
Image: uncredited



Saturday, July 27, 2024

I Suppose I Must Have

ON GASLIGHTING
by Kate Abramson.

Princeton, 217 pp., £20, May, 978 0 691 24938 4

In the TV drama Bad Sisters, set in Dublin, four sisters conspire to murder their brother-in-law John Paul, an abusive monster who is married to their beloved sister, Grace. The dynamics of the marriage are clear from the pilot. It’s Christmas Day and tradition has it that the siblings meet at Forty Foot – a swimming spot just south of Dublin – for a dip. But John Paul doesn’t want his wife to go. To keep her at home, he brings her a glass of champagne when she is getting ready. Grace is surprised, and touched, at the gesture. ‘You deserve it,’ John Paul says. ‘Drink up.’ Minutes later, as Grace is about to leave, he appears, feigning shock. ‘What are you doing? You just had a glass of champagne, you can’t drive, are you kidding me? Come on now, think.’ A glazed, cowering look comes over Grace as she protests that she’s fine, completely fine. Or, if he insists, he can drive her to Forty Foot. ‘Of course I can’t. I had a glass myself.’ She begins to look frantic, while John Paul holds firm: ‘Sorry, sweetheart. I just worry too much.’ Grace defiantly opens the door. He slams it shut, grazing her hand. ‘Why would you go and make a scene,’ he says, ‘on Christmas Day?’ Grace gives in. She texts her sisters: ‘My fault. Too much to drink.’

The scene is a helpful introduction to the concept of gaslighting, in which the abuser manipulates the victim and then convinces them that they are at fault (‘I’m not making a scene, you’re making a scene’). The usual techniques are ridiculing the victim and making deliberately confusing and misleading statements. Anger is quickly succeeded by excessive affection or concern – a technique known as ‘love-bombing’ – which further undermines the victim. George Cukor’s 1944 film noir Gaslight, based on the 1938 play by Patrick Hamilton, inspired the term, though it took some time to gain ground. Psychoanalytical scholarship first mentioned ‘the gaslight phenomenon’ in the late 1960s. In 1981, two doctors, Victor Calef and Edward Weinshel, gave an account of gaslighting in Psychoanalytic Quarterly: the ‘victimiser’, they wrote, tries ‘to make the victim feel he or she is going crazy, and the victim more or less complies’. As Kate Abramson explains in her new book, On Gaslighting, the gaslighter
doesn’t just want other people to think his target is wrong. He wants her protests framed as ‘oversensitive’, ‘paranoid’, ‘acting out’ and ‘rants’. The more he succeeds, the less she will be able to engage in the relevant acts of telling, of protesting and so on ... But the silencing involved in gaslighting is actually much worse than this ... The gaslighter wants the target to see herself in the terms he paints her.
In Gaslight Ingrid Bergman and Charles Boyer play Paula and Gregory, a newly married couple in 1880s London. Gregory, we soon learn, wants to see Paula carted off to ‘the madhouse’. To accomplish this, he plants his pocket watch in her handbag, then comments on its absence before supposedly discovering it in her bag. He moves a painting, then convinces Paula that she moved it. She reads aloud to him a letter addressed to her aunt; he later maintains that the letter never existed. He gives her a brooch, which mysteriously disappears (he ‘forgives’ her for losing it). Cukor keeps the camera tightly focused on Bergman’s anguished face as she cries out, again and again, ‘I didn’t! I swear I didn’t,’ before her protests give way to self-doubt and depression. ‘I suppose I must have.’ Each time she gives in, Gregory’s face flashes with something like arousal. ‘Yes. YES,’ he says. ‘That’s right: you’re imagining things.’

Gregory’s chief motivation isn’t sadism but jewels, in particular a stash hidden in the attic of their house, which Paula inherited from her murdered opera-singer aunt. We discover that jewel-lust isn’t new to Gregory. Indeed, he strangled Paula’s aunt after failing to retrieve the stash and then hatched a plan to marry her heir. She had hidden the jewels well, however, and it takes months – long enough to break a just-married girl’s mind – for Gregory to ferret out his treasure. He only has a moment to savour his success before Scotland Yard arrives. ‘I don’t ask you to understand me,’ he says to Paula: ‘Between us all the time were those jewels, like a fire in my brain, a fire that separated us.’ That’s all right, then.

The ‘gaslight’ of the title refers to the lamps in the house which grow dim every evening when Gregory leaves for work (in reality, to search for the diamonds in the attic), but this isn’t part of his plan: he doesn’t realise that he’s diverting the house’s gas supply to the attic, something a canny police detective eventually works out. The gaslight is what gives him away. It is the only unintentional part of the infernal treatment to which Bergman’s character is subjected, and the one she knows for sure she isn’t imagining. Gaslight is what undermines Gregory’s gaslighting.

Gaslight has been useful for thinking with, despite the pretext of the jewels, because it holds the potential for more sinister and less explicit readings. The diamonds represent something even more covetable: Paula’s mind. Will he finally have her in the palm of his hand? The answer appears to be yes; her final swoon is almost orgasmic. But she recovers at once, and turns the tables on Gregory, who has been tied to a chair by the cops. ‘How can a mad woman help her husband to escape?’ she asks, in mock simplicity. When he asks for a knife, she gets one, but withholds it: ‘Are you suggesting that this is a knife I hold in my hand? Have you gone mad, my husband?’ Nor is she done with her revenge: ‘If I were not mad,’ she tells him, ‘I could have helped you ... But because I am mad ... I’m rejoicing in my heart, without a shred of pity, without a shred of regret, watching you go with glory in my heart!’

The distance between Gaslight and Bad Sisters is almost eighty years and no hidden treasure is now required to explain the desire to undermine another person’s self-possession. People gaslight those close to them for the same reason that, say, husbands rape their wives – because they can. At the same time, the concept of gaslighting has become a popular heuristic for forms of psychic domination, real or imagined, beyond personal relations: Obama’s ‘gaslight presidency’ (Wall Street Journal), Donald Trump’s ‘gaslighting of the world’ (Washington Post) or pro-transgender activists’ supposed ‘gaslighting of Americans’ (Daily Signal). This seems to me no bad thing, but according to Abramson we shouldn’t use the term in this way. ‘I want to urge that we not broaden our conception of gaslighting to include such social-structural issues under a new subcategory of “structural gaslighting”,’ she writes. Rather, we should use the term only to denote interpersonal interactions, because while ‘oppressive social structures can play an extremely significant role’ in gaslighting, it is ultimately ‘not something the social structures do but something people do with those social structures’. She grants that the ‘self-disguising features ... of subjugating systems’ are pervasive and pernicious – for example, the bind of apparently benevolent racism or sexism, or the hermeneutical injustice of lacking the language to describe your abuse. (...) As she sees it, ‘people gaslight, social structures don’t,’ and pretending otherwise only ‘conceals the fact that individual people are here the proper loci of moral responsibility’.

The first mistake here is to see the expansion of the term as letting individuals off the hook, when in fact it allows us to identify dissembling and manipulation in a wider range of contexts. Gaslighting is a helpful way of explaining what is happening when Donald Trump gives fake-news briefings and refuses to be held accountable for his actions while claiming – or allowing others to claim on his behalf – that it is his critics who are lying, whose actions have consequences. In emphasising private dynamics and interpersonal relationships, Abramson lets all of us off the hook. One could argue that we’re all complicit in gaslighting, that we all feel its allure, whether the gaslighter is Trump or Hannibal Lecter.

by Sophie Lewis, London Review of Books | Read more:
Image: Gaslight, IMDB

Friday, July 26, 2024

The Next New Thing

I served for a decade on the jury of the Richard H. Driehaus Prize, awarded each year to the architect who best represents the values of traditional and classical design. As Martin C. Pedersen observed recently on his website Common Edge: “The Driehaus is architecture’s traditional-classical design version of the Pritzker Prize. Although it comes with a hefty $200,000 check—twice the size of the Pritzker’s honorarium—and previous winners include such luminaries as Robert A. M. Stern, Michael Graves, Léon Krier, and Andrés Duany and Elizabeth Plater-Zyberk, the award still exists in a sort of media vacuum.”

Pedersen is right. The design press pays scant attention to the Driehaus Prize, probably because its readers—the architectural mainstream—have little interest in traditional/classical architecture. Never mind that this approach accounts for countless private residences nationwide, as well as academic buildings, public libraries, concert halls, a federal courthouse, and a presidential library. One building that should have penetrated the media vacuum is 15 Central Park West, a luxury apartment building in Manhattan whose record-breaking commercial success gained it renown among real-estate mavens; the stately limestone façades consciously recall such prewar classics as the Apthorp and the Beresford.

But popular acclaim counts for little in the closeted architectural world. As the New York architect Peter Pennoyer, this year’s Driehaus winner, told Pedersen, “There is a deep-seated interest—if not delusion—in the idea that the avant-garde, the cutting edge, the next new thing is what we should all be concerned about, at the exclusion of history, tradition, community, and context.”

The next new thing. Précisément! It was a French book published in 1923 that sparked the attitude that Pennoyer describes. The author was a Swiss-born architect, Charles-Édouard Jeanneret, who had recently adopted the pen name Le Corbusier, and the book was his spirited manifesto, Vers une architecture (literally Towards an Architecture). The first English translation was titled Towards a New Architecture, adding a word that was not inaccurate, for despite including illustrations of ancient temples and Hadrian’s Villa and referring to Donato Bramante and Raphael, Le Corbusier’s book was determinedly forward looking in its message. “We do not appreciate sufficiently the deep chasm between our own epoch and earlier periods,” he proclaimed. “If we set ourselves against the past, we are forced to the conclusion that the old architectural code, with its mass of rules and regulations evolved during four thousand years, is no longer of any interest; it no longer concerns us; all the values have been revised; there has been revolution in the conception of what Architecture is.” Stirring stuff.

All the values have been revised. Le Corbusier made it sound as if the modern era, propelled by the Great War, represented an epochal moment, which it did in many ways, but he didn’t count on one thing: how rapidly things would change in the modern age. The ponderous Farman Goliath biplane that he featured in Vers une architecture was out of service in less than a decade, and the majestic Cunard ocean liner Aquitania, whose white superstructure served him as an architectural model, was decommissioned in 1950. Le Corbusier often used his own Voisin C7 automobile as a prop in photographs of the villas he had designed in the Paris suburbs—the boxy car and his boxy architecture were all of a piece. But less than a decade later, Gabriel Voisin introduced the C25 Aérodyne, a streamlined beauty that wasn’t boxy at all. Where did that leave the “new” architecture? Having cut itself off from the old rules and regulations, as Le Corbusier put it, it had no choice but to keep changing with the changing times.

Regularly reinventing architecture is exciting, but it faces a number of challenges. Architecture is an art, hence creativity is important, but it is an applied art, which makes it fundamentally empirical—that is, ruled not by theory but by experience. What works is worth repeating; what doesn’t, isn’t. As a result, building design has traditionally depended on rules of thumb: the dimensions of tread and riser that make for a comfortable stair, the pleasing proportion between the width of a room and its ceiling height, the shapes and details that ensure a roof doesn’t leak. The skill of the architect lies in knowing when to innovate and when to stick to the tried and true.

Architects rushing to discover the next new thing tend to undervalue the tried and true. Willfully ignoring experience and implementing untested new solutions can be risky, as I. M. Pei, a talented architect, found after he designed the East Building of the National Gallery in Washington, D.C. The exterior of the new wing, completed in 1978, is covered in pink Tennessee marble to match the old gallery. There is nothing particularly novel about using marble as a cladding. The ancient Romans covered the brick Pantheon with marble, and more than a thousand years later, John Russell Pope designed basically the same system—thick, self-supporting walls of marble—for the main building of the National Gallery. Pope concealed the necessary expansion joints behind moldings and pilasters, but Pei wanted the surface of his walls to be smooth and unbroken, so he used thin marble panels and supported them individually on stainless steel hangers that were embedded in the concrete structure. Only 33 years after the building opened, the panels started to show signs of buckling, and the entire marble skin—some 16,000 panels—had to be removed and reinstalled on new hangers, at a cost of more than $80 million. The debacle was in sharp contrast to Pope’s enduring building next door.

Ignoring the past often means ignoring the good ideas of one’s immediate predecessors. In the past, copying masters was a valuable part of architectural design—Andrea Palladio copied Bramante, Inigo Jones copied Palladio, Christopher Wren copied Jones. Now copying is taboo. For example, the work of early Scandinavian modernists such as Sigurd Lewerentz and Alvar Aalto, who humanized their stripped-down modern designs with interesting handcrafted details, was ignored by later generations. Similarly, when Louis Kahn produced the sublime skylit vaults of the Kimbell Art Museum in Fort Worth, his ingenious solution was highly praised, but it was never repeated. As a result, instead of a considered evolution, modern architecture has been marked by a succession of fresh starts, some real and many false.

Reinventing architecture faces another, less obvious challenge. When Le Corbusier presented his Plan Voisin—a fictive proposal to rebuild the center of Paris with high-rise office towers—he took it for granted that the new would entirely replace the old. But of course, real cities consist of both new and old buildings. The old buildings are not historic relics but functioning places where people live, work, study, and in the case of old concert halls, listen to music. For most people, old buildings are as much a part of modern life as flat-screen televisions and smartphones. Le Corbusier maintained that the old architectural values need no longer concern us. But the contrary is true: the old buildings are often cherished, not primarily—or at least not only—because they are old, but because they are, well, beautiful.

It was perhaps inevitable that a reaction to ahistorical modern architecture would emerge at some point. This happened in the 1980s, and the reaction was largely facilitated by the architectural movement known as postmodernism. Although short-lived, this facile flirtation with history opened the door to a more serious reconsideration of the past. This included not only the American Renaissance of the late 19th and early 20th centuries, which is a touchstone for many classicists, but also the work of inventive masters such as Bertram Goodhue, who designed the Nebraska State Capitol; Ralph Adams Cram, who was responsible for Rice University; and Raymond Hood, the lead architect of Rockefeller Center.

It turns out that there are advantages to reconnecting with history. Without the imperative to constantly innovate, which can lead to risky experimentation and construction failures, architects can rely on time-tested methods of construction, and traditional materials and details. The modern steel frames of the Nebraska State Capitol and the buildings of Rockefeller Center, for example, are clad in traditional limestone. Architects who are free to find inspiration in their predecessors and contemporaries produce buildings that not only work but also gain the affection of the general public: libraries and courthouses that don’t look like flashy casinos, academic buildings that cannot be mistaken for workaday office buildings, and places of worship that don’t resemble utilitarian industrial plants.

It would be inaccurate to say that people don’t like modern architecture. After more than a century, it’s an accepted feature of contemporary life, almost a tradition. Office workers expect their workplaces to be sleek; shoppers expect high-fashion boutiques and automobile showrooms to be minimalist exercises in bare concrete and industrial details; and museumgoers expect galleries to resemble artists’ lofts, and museum cafés to have chic furniture and Zen-like décor.

But there are limits. It’s okay to have a minimalist kitchen or bathroom, but a living room shouldn’t look like an Apple store, and a house shouldn’t look like an upscale health spa. Nor should a college campus be mistaken for a suburban office park. 

by Witold Rybczynski, The American Scholar | Read more:
Image: 15 Central Park West, NY. Thomas Craven, Wikipedia
[ed. See also: Huge in Palm Springs (Slate).]

Thursday, July 25, 2024

[ed. Pretty good day. See also: I’ve been thinking about life after death – and I want to come back as Keanu Reeves (Guardian).]

Deep Reading Will Save Your Soul

Higher ed is at an impasse. So much about it sucks, and nothing about it is likely to change. Colleges and universities do not seem inclined to reform themselves, and if they were, they wouldn’t know how, and if they did, they couldn’t. Between bureaucratic inertia, faculty resistance, and the conflicting agendas of a heterogeneous array of stakeholders, concerted change appears to be impossible. Besides, business is good, at least at selective schools. The notion, floated now in certain quarters, that students and parents will turn from the Harvards and Yales in disgust is a fantasy. As long as elite institutions remain the principal pipeline to elite employers (and they will), the havers and strivers will crowd toward their gates. Everything else—the classes, the politics, the arts and sciences—is incidental.

Which is not to say that interesting things aren’t happening in post-secondary (and post-tertiary) education. They just aren’t happening, for the most part, on campus. People write to me about this: initiatives they’ve started or are starting or have taken part in. These come, as far as I can tell, in two broad types, corresponding to the two fundamental complaints that people voice about their undergraduate experience. The first complaint is that college did not prepare them for the real world: that the whole exercise—papers, busywork, pointless requirements; siloed disciplines and abstract theory—seemed remote from anything that they actually might want to do with their lives.

Programs that address this discontent exhibit a remarkably consistent set of characteristics. They are interdisciplinary, integrating methods and perspectives—from, say, engineering and the social sciences—that are normally kept apart. They are informal, eschewing frontal instruction and traditional modes of evaluation. They are experiential, more about doing—creating, collaborating—than reading and writing. They are extramural, bringing students into the community for service projects, internships, artistic installations or performances. They are directed to specific purposes, usually to do with social amelioration or environmental rescue. Above all, they are student-centered. Participants are enabled (and expected) to direct their education by constructing bespoke curricula out of the resources the program gives them access to. In a word, these endeavors emphasize “engagement.”

All this is fine, as far as it goes. It has analogues and precedents in higher ed (Evergreen, Bennington, Antioch, Hampshire) as well as in the practice of progressive education, especially at the secondary level. High schools will focus on “project-based learning,” with assessment conducted through portfolios and public exhibitions. A student will identify a problem (a human need, an injustice, an instance of underrepresentation), then devise and implement a response (a physical system, a community-facing program, an art project).

Again, I see the logic; it is just what many students want. But what bothers me about this educational approach—the “problem” approach, the “STEAM” (STEM + arts) approach—is what it leaves out. It leaves out the humanities. It leaves out books. It leaves out literature and philosophy, history and art history and the history of religion. It leaves out any mode of inquiry—reflection, speculation, conversation with the past—that cannot be turned to immediate practical ends. Not everything in the world is a problem, and to see the world as a series of problems is to limit the potential of both world and self. What problem does a song address? What problem will reading Voltaire help you solve, in any predictable way? The “problem” approach—the “engagement” approach, the save-the-world approach—leaves out, finally, what I’d call learning.

And that is the second complaint that graduates tend to express: that they finished college without the feeling that they had learned anything, in this essential sense. That they hadn’t been touched. That they hadn’t been changed. That there is a treasure out there—call it the Great Books or just great books, the wisdom of the ages or the best that has been thought and said—that its purpose is to activate the treasure inside them, that they had come to one of these splendid institutions (whose architecture speaks of culture, whose age gives earnest of depth) to be initiated into it, but that they had been denied, deprived. For unclear reasons, cheated.

I had students like this at Columbia and Yale. There were never a lot of them, and to judge from what’s been happening to humanities enrollments, there are fewer and fewer. (From 2013 to 2022, the number of people graduating with bachelor’s degrees in English fell by 36%. As a share of all degrees, it fell by 42%, to less than 1 in 60.) They would tell me—these pilgrims, these intellectuals in embryo, these kindled souls—how hard they were finding it to get the kind of education they had come to college for. Professors were often preoccupied, with little patience for mentorship, the open-ended office-hours exploration. Classes, even in fields like philosophy, felt lifeless, impersonal, like engineering but with words instead of numbers. Worst of all were their fellow undergraduates, those climbers and careerists. “It’s hard to build your soul,” as one of my students once put it to me, “when everyone around you is trying to sell theirs.”

That student’s name was Matthew Strother. It was through Matthew—he was in his early thirties by this point, and still seeking—that I learned about perhaps the two most prominent initiatives to have sprung up off-campus of late in response to the hunger for serious study.

by William Deresiewicz, Persuasion | Read more:
Image: Matthew Strother (courtesy of Berta Willisch).

The Flattening Machine

A wonder of the internet is that, from the right perch, you can watch information wash over people in real time. I happened to check X on Saturday only minutes after the attempted assassination of Donald Trump, and I experienced immediate disbelief. Surely the stills and live-feed screenshots were fake—AI-generated or Photoshopped.

But the sheer volume of information in a high-stakes news event such as this one has a counterintuitive effect: Distinguishing real from fake is actually quite easy when the entire world focuses its attention on the same thing. Amid a flurry of confusion and speculation, the basic facts of this horrifying event emerged quickly. The former president was shot at. He was injured but is recovering. For a brief moment, the online information apparatus worked to deliver important information—a terrifying shared reality of political violence.

Our information ecosystem is actually pretty good while the dust is up. But the second it begins to settle, that same system creates chaos. As my own shock wore off, leaving me to contemplate the enormity of the moment, I could sense a familiar shift on Reddit, X, and other platforms.

The basic facts held attention for only so long before being supplanted by wild speculation—people were eager to post about the identity of the shooter, his possible motives, the political ramifications of the event, the specter of more violence. It may be human nature to react this way in traumatic moments—to desperately attempt to fill an information void—but the online platforms so many of us frequent have monetized and gamified this instinct, rewarding those who create the most compelling stories. Within the first four hours, right-wing politicians, perhaps looking to curry favor with Trump, hammered out reckless posts blaming Joe Biden’s campaign for the shooting; Elon Musk suggested that the Secret Service may have let the shooting happen on purpose; as soon as the shooter’s name was released, self-styled online investigators dug up his voter registration, eager for information they could retrofit to their worldview. Yesterday, conspiracy theorists pointed to a two-year-old promotional video from BlackRock that was filmed at the shooter’s school and features the shooter for a moment—proof, they said, of some inexplicable globalist conspiracy. As my colleague Ali Breland noted in an article on Sunday, conspiracy theorizing has become the “default logic for many Americans in understanding all major moments.”

An attempted assassination became a mass attentional event like any other. Right-wing hucksters, BlueAnon posters, politicians, news outlets, conspiracy shock jocks, ironic trolls, and Instagram dropshippers all knew how to mobilize and hit their marks. Musk let only about 30 minutes pass before he brought attention back to himself by endorsing Trump for president. It took just 86 minutes for Barstool Sports’ Dave Portnoy to post a link to a black T-shirt with the immediately iconic image of a bloodied Trump raising a fist. Trolls made fake online accounts to dupe people into thinking the shooter was part of the anti-fascist movement.

Some may wish to see the conspiracy peddling, cynical politicking, and information warfare as a kind of gross aberration or the unintended consequences and outputs of a system that’s gone awry. This is wrong. What we are witnessing is an information system working as designed. It is a machine that rewards speed, bravado, and provocation. It is a machine that goads people into participating as the worst version of themselves. It is a machine that is hyperefficient, ravenous, even insatiable—a machine that can devour any news cycle, no matter how large, and pick it apart until it is an old, tired carcass.

All of these people are following old playbooks honed by years of toxic online politics and decades of gun violence in schools, grocery stores, nightclubs, and movie theaters. But what feels meaningful in the days after this assassination attempt is the full embrace of the system as somehow virtuous by the bad actors who exploit it; unabashed, reckless posting is now something like a political stance in and of itself, encouraged by the owners, funders, and champions of the tech platforms that have created these incentives. (...)

The overall effect of this transformation is a kind of flattening. Online, the harrowing events of Saturday weren’t all that distinguishable from other mass shootings or political scandals. On X, I saw a post in my feed suggesting, ironically or not, “I know this sounds insane now but everyone will totally forget about this in ten days.” The line has stuck in my head for the past few days, not because I think it’s true, but because it feels like it could be. The flattening—of time, of consequence, of perspective—more than the rage or polarization or mistrust, is the main output of our modern information ecosystem. 

by Charlie Warzel, The Atlantic | Read more:
Image: Illustration by The Atlantic. Source: Jabin Botsford/The Washington Post/Getty.

Wednesday, July 24, 2024

Image: Angel Nenov

Project 2025: J.D. Vance Writes Foreword

As Trump desperately tries to separate his campaign from Project 2025, users on X have noted one big problem: J.D. Vance wrote the foreword to a forthcoming book by the plan’s lead author, Heritage Foundation President Kevin Roberts.

On the Amazon product page, the promotional material for the book, titled Dawn’s Early Light, highlights Roberts’s role in composing Project 2025, the Heritage Foundation proposal for a conservative overhaul of the federal government.

The product page also includes a favorable review from Vance. “Never before has a figure with Roberts’s depth and stature within the American Right tried to articulate a genuinely new future for conservatism,” the review says. “We are now all realizing that it’s time to circle the wagons and load the muskets. In the fights that lay ahead, these ideas are an essential weapon.”

When the book first became available for pre-order on June 19, Vance promoted it on X, writing, “I was thrilled to write the foreword for this incredible book, which contains a bold new vision for the future of conservatism in America.”

On the Amazon page for Dawn’s Early Light, the subtitle reads, “Taking Back Washington to Save America,” but an archived version of the page from June 19 indicates it was initially “Burning Down Washington to Save America.”

Inflammatory language in the blurb has also apparently been tamped down.

A sentence on the archived page that says the book “blazes a warpath for the American people to take back their country” now says it “blazes a promising path.” Another fiery sentence on the archived page read, “Just as a controlled burn preserves the longevity of a forest, conservatives need to burn down these institutions [the FBI, The New York Times, the Department of Education, etc.] if we’re to preserve the American Way of life.” It now says that those institutions “need to be dissolved if the American way of life is to be passed down to future generations.”

by Robert McCoy, Yahoo News | Read more:
Image: uncredited
[ed. When your own candidates disavow and try to hide your Party's agenda - not a good sign. See also: What is Project 2025? (Yahoo News):]
***
What is Project 2025? Conversations, both online and off, surrounding the conservative agenda have exploded recently — more than a year after the policy proposal was published.

Project 2025 is a 922-page proposed blueprint for the next Republican administration produced by conservative think tank The Heritage Foundation.

Critics have labeled it “an authoritarian takeover of the United States,” while supporters call it a plan to return “our federal government to one ‘of the people, by the people, and for the people.’” (...)

What is Project 2025, and what is it calling for?

Project 2025 bills itself as “a policy agenda, personnel, training and a 180-day playbook” to be implemented “on day one” by the next Republican president, outlining various agenda items, including which bills to propose, laws to revoke and government agencies to restructure. (...)

Some of its directives include:
  • Overhaul the Department of Justice and FBI, the former of which it labels “a bloated bureaucracy” with employees “who are infatuated with the perpetuation of a radical liberal agenda.”
  • Implement Schedule F, a Trump-era executive order, since repealed by the Biden administration, that would allow the reclassification — and potential replacement — of thousands of government workers.
  • Eliminate the Department of Education.
  • Impose wide restrictions on abortion access, including reversing federal approval of the abortion pill mifepristone.
  • Allocate funding for “construction of additional border wall systems.”
  • Ban pornography and imprison anyone who produces or distributes it.
  • Promote “Sabbath Rest” by encouraging Congress to amend the Fair Labor Standards Act to require that people who work on the Sabbath be paid time and a half.
  • Have the federal government promote “biblically based, social science reinforced” heterosexual marriages.
  • Call on the new Health and Human Services secretary to “reverse the Biden Administration’s focus on 'LGBTQ+ equity'" and “subsidizing single-motherhood.”
  • Remove sexual orientation, gender identity, diversity, equity, inclusion and gender equality from any federal rule, regulation or legislation.
  • Revive Trump’s plan to open most of the National Petroleum Reserve of Alaska to leasing and development.

The Knotty Death of the Necktie

Not long ago, on a Times podcast, Paul Krugman breezily announced (and if we can’t trust Paul Krugman in a breezy mood, whom can we trust?) that, though it’s hard to summarize the economic consequences of the pandemic with certainty, one sure thing is that it killed off ties. He meant not the strong social ties beloved of psychologists, nor the weak ties beloved of sociologists, nor even the railroad ties that once unified a nation. No, he meant, simply, neckties—the long, colored bands of fabric that men once tied around their collars before going to work or out to dinner or, really, to any kind of semi-formal occasion. Zoom meetings and remote work had sealed their fate, and Krugman gave no assurance that they would ever come back.

Actual facts—and that near-relation of actual facts, widely distributed images—seem to confirm this view. Between 1995 and 2008, necktie sales plummeted from more than a billion dollars to less than seven hundred million, and, if a fashion historian on NPR is to be believed (and if you can’t believe NPR . . . ), ties are now “reserved for the most formal events—for weddings, for graduations, job interviews.” Post-pandemic, there is no sign of a necktie recovery: a now famous photograph from the 2022 G-7 summit shows the group’s leaders, seven men, all in open collars, making them look weirdly ready for a slightly senescent remake of “The Hangover.” As surely as the famous, supposedly hatless Inauguration of John F. Kennedy was said to have been the end of the hat, and Clark Gable’s bare chest in “It Happened One Night” was said to have been the end of the undershirt, the pandemic has been the end of the necktie.

Such truths are always at best half-truths. Sudden appearances and disappearances tend to reflect deeper trends, and, when something ends abruptly, it often means it was already ending, slowly. (Even the dinosaurs, a current line of thinking now runs, were extinguished by that asteroid only after having been diminished for millennia by volcanoes.) In “Hatless Jack,” a fine and entertaining book published several years ago, the Chicago newspaperman Neil Steinberg demonstrated that the tale of Kennedy’s killing off the hat was wildly overstated. The hat had been on its way out for a while, and Jack’s hatless Inauguration wasn’t, in any case, actually hatless: he wore a top hat on his way to the ceremony but removed it before making his remarks. Doubtless the same was true of the undershirt that Gable didn’t have on. They were already starting to feel like encumbrances, which might explain why Gable didn’t wear one. And so with the necktie. Already diminishing in ubiquity by the Obama years, it needed only a single strong push to fall into the abyss. (...)

What we now think of as the necktie—cut on the bias, made of three or four pieces of fabric, and faced with a lining—was actually a fairly recent, and local, invention, that of a New York schmatte tradesman named Jesse Langsdorf. What we call “ties” generically are, specifically, Langsdorf ties.

The Langsdorf necktie that emerged early in the twentieth century was, to be sure, hideously uncomfortable. (It is no accident that “necktie party” was a grotesque nickname for a hanging.) Its constriction made it perhaps the masculine counterpart of the yet more uncomfortable fashion regime—high heels—forced upon women. (...)

Examine any now-unused collection of ties, and you will find that they are full of tightly compressed meanings—once instantly significant to the spectator of the time and still occultly visible now. Not only the specific meanings of club membership but also the broader semiotics of style. In any vintage closet, there are likely to be knitted neckties that still reside within the eighties style of “American Gigolo”—which, believe it or not, helped bring Armani to America. The knit tie meant Italy, sports cars, daring, and a slight edge of the criminal. There are probably ties from Liberty of London—beautiful, flowered-print ties whose aesthetic ultimately derives from the Arts and Crafts movement, with its insistence on making the surfaces of modern life as intricate and complexly ornamental as a medieval tapestry or Pre-Raphaelite painting. If the closet is old enough, its ties will show a whole social history of the pallid fifties turning into the ambivalent sixties turning into the florid seventies. The New Yorker cartoonist Charles Saxon captured these transitions as they occurred, in a career that can be seen as a dazzling study of ties and their meanings. The neatly knotted ties of Cheeveresque commuters give way in the early seventies to the ever-broadening ties of advertising men, flags they waved to show off their desire to simultaneously woo the counterculture and keep out of it.

The tie could sometimes get so compressed in its significance as to lose its witty, stealthy character and become overly and unambiguously “loaded.” There is no better story of suicide-by-semiotics than that of the rise and death of the bow tie, which, beginning in the nineteen-eighties, became so single-mindedly knotted up with neoconservatism, in the estimable hands of George Will, that to wear one was to declare oneself a youngish fogy, a reader of the National Review, and a skeptic of big government. The wider shores of bow-tie-dom—the dashing, jaunty, self-mocking P. G. Wodehouse side of them—receded, and were lost. It became impossible to wear a bow tie and vote Democratic. (...)

Of course, the human appetite for display will never end, and so, as the concentrated symbolism of the necktie evaporates, the rest of our clothes must carry its messages. The purposes of Warburgian pattern have now spread everywhere: to the cut of your jogging pants and the choice of your sneakers and, well, the cock of your snook. Where once the necktie blazoned out a specific identity from the general background of tailored gray, now everything counts. The most obvious successor garment to the necktie is the baseball cap, which declares its owner’s identity and affiliation not with some tantalizing occult pattern but with the painful unsubtlety of actual text—the club named on the cap.

by Adam Gopnik, New Yorker | Read more:
Image: Jaedoo Lee
[ed. Cock of your snook? Look it up yourself, I'm not doing it for you.]