[ed. Funny.]
Tuesday, November 19, 2024
Self Defense For Dummies
Things you need to know: how to defend yourself against an assailant armed with a spatula
Monday, November 18, 2024
New Fusion
Mike Bono
Kinga Glyk
Matteo Mancuso
[ed. From my Fusion playlist.]
The Seeds of Social Revolution: Extreme Wealth Inequality
The seeds of social revolution have been sown and sprouted. What we harvest is up to us.
If there is any potential catalyst for social upheaval that attracts less attention than extreme wealth inequality, it's mighty obscure. As I noted yesterday, the present extreme of wealth inequality draws an occasional bit of lip service or handwringing, but very little serious focus, despite ample historical foundations for its role in sowing the seeds of social revolutions.
As I tried to explain in yesterday's post, extreme wealth inequality might not be the spark that ignites a revolution, but it is a tectonic shift that destabilizes the social order. For extreme wealth inequality isn't a consequence of fate or sorcery; it is the consequence of policies that favor the few at the expense of the many, a reality that is exceedingly uncomfortable for those benefiting from the asymmetry.
For a rundown of the policies that have exacerbated wealth inequality, consider the following excerpts from Time magazine, September 2020: The Top 1% of Americans Have Taken $50 Trillion From the Bottom 90% -- And That's Made the U.S. Less Secure.
"There are some who blame the current plight of working Americans on structural changes in the underlying economy--on automation, and especially on globalization. According to this popular narrative, the lower wages of the past 40 years were the unfortunate but necessary price of keeping American businesses competitive in an increasingly cutthroat global market. But in fact, the $50 trillion transfer of wealth the RAND report documents has occurred entirely within the American economy, not between it and its trading partners. No, this upward redistribution of income, wealth, and power wasn't inevitable; it was a choice--a direct result of the trickle-down policies we chose to implement since 1975.In other words, extreme wealth inequality is not the result of economic forces outside our control; it's the result of our policy responses to changing social, political and economic conditions. While those benefiting from the policies attribute the asymmetric distribution of the economy's gains to "forces outside our control" such as globalization and automation, those losing ground sense that this is an excuse for taking advantage of the situation, to the detriment of the national interest.
We chose to cut taxes on billionaires and to deregulate the financial industry. We chose to allow CEOs to manipulate share prices through stock buybacks, and to lavishly reward themselves with the proceeds. We chose to permit giant corporations, through mergers and acquisitions, to accumulate the vast monopoly power necessary to dictate both prices charged and wages paid. We chose to erode the minimum wage and the overtime threshold and the bargaining power of labor. For four decades, we chose to elect political leaders who put the material interests of the rich and powerful above those of the American people."
We can best understand extreme wealth inequality as the destabilizing result of one set of competing economic interests gaining dominance over other economic interests: broadly speaking, the balance between labor and capital has collapsed in favor of capital. To take one example, consider the minimum wage, which was allowed to lag behind inflation for decades as a matter of policy.
The different interests within each sector can also destabilize into asymmetric distributions. For example, within the broad category of capital, there are many competing interests: industrial capital, financial capital, land-based capital, domestic and global interests, and so on. Within labor, there are blue-collar and white-collar interests, and gradations of skills, regional interests, and so on.
Broadly speaking, globalization and financialization greatly increased the share of some interests at the expense of others.
The social boundaries of what's acceptable and unacceptable change, enabling or restricting financial policies. For example, in the postwar boom of the 1950s, corporate CEOs earned multiples of their average employee's pay that by today's standards were ludicrously low, whereas present-day CEOs routinely take home compensation (including stock options) in the tens of millions of dollars annually.
In the broad sweep of history, extreme asymmetries in the distribution of the economy's output are rebalanced one way or the other, if not with policy changes then by the overthrow of the status quo. The book The Great Leveler: Violence and the History of Inequality from the Stone Age to the Twenty-First Century breaks down the various pieces of this complex puzzle.
The history and data are too varied to be easily summarized, but we can start with humanity's innate sense of fairness in social organizations: we sense when our contributions are getting short shrift while others are grabbing shares that are not commensurate with their contributions--despite their claims to "earning" their outsized shares.
Some write this off as envy, and to be sure envy is an innate human response, but fairness and envy are two different things. If someone strips us of power that we once held to benefit their own accumulation of wealth, our sense that this is unfair is not envy.
We seem to be approaching the point where a rebalancing of extreme asymmetries is at hand, and so we have to choose between policy changes and social upheaval. Those benefiting from the current asymmetrical distribution naturally feel that all is right with the world, while those whose purchasing power and political power have been strip-mined feel that regaining what was taken from them is only fair.
by Charles Hugh Smith, Of Two Minds | Read more:
Image: Lon Tweeten/Time
[ed. This has been obvious for a long time, yet continues to persist and actually worsen. Why? The post following this one, A New Phase of Cultural Conflict, explains a lot, especially how deceptive allegiances (a lying billionaire con man, hero to the underclass?) can generate popular support (and electoral victories) while hiding true intentions.]
Labels:
Business,
Critical Thought,
Culture,
Economics,
Politics
15 Observations on the New Phase in Cultural Conflict
Back in 2014, I sketched out a widely read outline of an alternative interpretation of cultural conflict. Curiously enough, the conceptual tools I used came from a 1929 book by the philosopher José Ortega y Gasset entitled The Revolt of the Masses—a work that offers surprisingly timely insights into our current situation.
That article stirred up a lot of debate at the time, but the whole situation has intensified further since 2014. Everything I’ve seen in those eight years has made painfully clear how insightful Ortega had been. The time has come to revisit that framework, summarizing its key insights and offering predictions for what might happen in the future.
Here’s part of what I wrote back in 2014:
First, let me tell you what you won’t find in this book. Despite a title that promises political analysis, The Revolt of the Masses has almost nothing to say about conventional party ideologies and alignments. Ortega shows little interest in fascism or capitalism or Marxism, and this troubled me when I first read the book. (Although, in retrospect, the philosopher’s passing comments on these matters proved remarkably prescient—for example his smug dismissal of Russian communism as destined to failure in the West, and his prediction of the rise of a European union.) Above all, he hardly acknowledges the existence of ‘left’ and ‘right’ in political debates.
Ortega’s brilliant insight came in understanding that the battle between ‘up’ and ‘down’ could be as important in spurring social and cultural change as the conflict between ‘left’ and ‘right’. This is not an economic distinction in Ortega’s mind. The new conflict, he insists, is not between “hierarchically superior and inferior classes…. upper classes or lower classes.” A millionaire could be a member of the masses, according to Ortega’s surprising schema. And a pauper might represent the elite.
The key driver of change, as Ortega sees it, comes from a shocking attitude characteristic of the modern age—or, at least, Ortega was shocked. Put simply, the masses hate experts. If forced to choose between the advice of the learned and the vague impressions of other people just like themselves, the masses invariably turn to the latter. The upper elites still try to pronounce judgments and lead, but fewer and fewer of those down below pay attention.
This dynamic is now far more significant than it was eight years ago. So I want to share 15 observations on the emerging vertical dimension of cultural conflict—these both define the rupture and try to predict how it will play out.
(1) Analysis of cultural conflict is still obsessed with left-versus-right strategizing, but the actual battle lines are increasingly down-versus-up. A lot of work goes into hiding this, because both left and right want to present an image of unity, but both spheres are splintering into intensely hostile up-and-down factions.
(2) The frequency with which you hear the “lesser of two evils” argument is an indicator of how powerful this up-and-down rupture has become. This is the argument used by Ups to retain the loyalty of the Downs. You have to stick with us, even if we are tainted elites, or else we both lose.
(3) When commentators give any attention to down-versus-up, they usually reduce the conflict to income disparities, but that is misleading. Down-versus-up is more attitudinal than economic. Sometimes the tension manifests itself along traditional class and wealth lines, with disputes focused primarily on money, but that’s only a small part of the conflict. Down-versus-up is multidimensional and adapts rapidly to current events. Adding to the complexity, rich people frequently act like Down members, while people with tiny incomes can be fiercely loyal to the Up worldview.
(4) The essence of down-versus-up is that a numerically large group of dissenters focus their anger on a small number of elites who they view as antagonists, perhaps even evil villains. These Down movements cut across left-versus-right political ideologies, and thus encompass seemingly incompatible groups such as Occupy Wall Street, the truck convoys, Black Lives Matter, the Tea Party, ANTIFA, cryptocurrency fanatics, and a host of other cohort groups in the news. In every instance, these groups have proven capable of mobilizing intense energy among members—much greater energy than the Ups can ever hope to match. Participants seem to appear out of nowhere, leaping almost instantaneously into action.
(5) There will be more groups like this next year—and every year from now on. As strange as it sounds, an organization that doesn’t even exist today is likely to transform the entire sociocultural landscape in the near future. I’m not sure what it will look like, but one thing is certain—it won’t arise from any legacy institution.
(6) The targets are people at the top of the heap, but that can include a dizzying array of individuals—including wealthy CEOs, DC politicians, celebrity TV newscasters, law enforcement authorities, experts of all stripes, Ivy League academics, hedge fund managers, tech titans at huge Silicon Valley companies, movie stars, etc. A key element of the narrative is not simply that these people have different agendas than those at the bottom, but even more to the point, these elites are depicted as inherently untrustworthy—they don’t play fairly, they have sold their souls to the Dark Side. Hence the Down opposition feels the need to take extreme measures. The critiques brandished by the Downs are often reduced to the banal, mind-numbing explanation that people on the Dark Side do bad things and must be stopped. The very banality of the message makes it all the more viral.
(7) The members of the Up group want to rebrand themselves as Down adherents. They work tirelessly to do this. Hence you see billionaires proclaiming their alignment with all of the leading Down agendas. Politicians see that Down constituencies are the most energized voters and curry their favor—proclaiming at every opportunity that I’m just like you. Even the most established DC insiders with the most elite backgrounds must act as if they aren’t really members of the Up cohort. Media personalities, in particular, take every opportunity to act as Down as possible, realizing that this is the only genuine street cred worth having in the current moment.
(8) When well-known political figures move from right to left, or vice versa, many onlookers are surprised. But in almost every instance, the Up maintain their Up allegiance, and the Down retain their Down status. It's much easier to make the psychological shift from one party to another than to abandon your emotional attachment to the Down or Up worldview.
(9) All of the cultural energy right now is on the bottom. And that energy has been intensifying. The attempts to distort this conflict into conventional left-versus-right battle lines have prevented opinion leaders from grasping the actual dynamic at play. Any ambitious agenda that doesn’t take into account down-versus-up is doomed to failure.
(10) This is not just a political shift but also impacts arts and entertainment. Reality TV, for example, is a manifestation of legacy institutions trying to capture the vitality of the Down lifestyles in faux narratives that emulate non-elites in everyday situations. Music genres each have their own up-versus-down positioning—just consider your mental images of the audience for rap, classical, country, jazz, etc. (But genres can move: jazz was once Down, but it has become Up.) Art forms that seem to be in crisis—sculpture, the novel, the symphony—are always aligned with the Up cohort. Nobody ever claims that Down genres are in crisis.
by Ted Gioia, Honest Broker | Read more:
Image: via the author
[ed. See also: I told you so (Numb at the Lodge).]
Labels:
Critical Thought,
Culture,
Education,
History,
Media,
Philosophy,
Politics,
Psychology,
Relationships
Sunday, November 17, 2024
Valerius De Saedeleer (Belgian, 1867-1941, b. De Kat, Aalst, Belgium, d. Oudenaarde, Belgium) - Paysage d'Hiver (Winter Landscape), 1931, Paintings: Oil on Canvas
Benoît Maubrey (American, b. 1952, Washington, D.C., USA) - Torii Gate made of speakers in Kamiyama, Tokushima, Japan.
via:
via:
Guitarist Tim DiJulio, ‘Seattle’s Best-Kept Secret’
A beloved Seattle guitarist takes the spotlight on his first major tour (Seattle Times)
Images: Jennifer Buchanan
[ed. Nice story, nice guy. Tough business. See also: Guitarist Tim DiJulio, ‘Seattle’s best-kept secret,’ has stars for fans (ST).]
Saturday, November 16, 2024
Deeply Thoughts
A while back I put up a tweet saying “‘Deeply’ is the new ‘very,’” but to my shock and dismay the world paid no attention. My little jape failed to quell the rising tide in the usage of the d-word in all sorts of public-facing discourse. Now “deeply” has become the universal adverb. Since I am otherwise in a bit of a midsummer lull, I thought I would take up my cudgel once more and square up to this menace.
“Very,” the feebler predecessor of “deeply,” was one of those words that editors and English teachers automatically red-pencilled into oblivion whenever they saw it on the page. Of course, people used it anyway, because they believed it made their sentences stronger. Why wouldn’t it? Something that’s very big must be bigger than something that’s merely big, right? So, commonsensically, putting “very” in front of an adjective should intensify it.
It didn’t actually work out that way, though. Readers came to understand, at some probably subliminal level, that “very” was just a marker for weak or tendentious writing. Serious people just didn’t use the word. The New York Times might write of “severe flooding” in a disaster area, for example, but you’d never see them use the phrase “very severe flooding.” A politician trying to plead for disaster relief funds might say “very severe flooding” and be quoted as such. But in actual coverage, the newspaper of record would never use “very.” Instead it would present facts and statistics, leaving the readers to judge for themselves the level of severity.
Sophisticated readers thus came to understand that “very” was a marker for lazy writing. Users of “very” were trying to bring you around to a certain point of view without earning it. That’s why editors and English teachers hated it.
“Deeply” has inherited all of the badness of “very” but piled on some additional noxious qualities.
A couple of years ago I set a personal policy that when reading anything at all—a tweet, a press release, a newspaper article—as soon as I encountered the word “deeply” I would simply stop reading and turn my attention elsewhere. I don’t think I’ve missed anything as a result. On the contrary, I’m pretty sure that the rigorous enforcement of this rule has improved my quality of life and upgraded the flow of information coming into my brain.
What “deeply” has that “very” didn’t is the overlay of pious moralism. You can easily get the idea by comparing these three statements:
- I was offended by this tweet
- I was very offended by this tweet
- I was deeply offended by this tweet
(2) is weaker despite—in fact, because of—the attempt to strengthen it by addition of “very.” That’s okay. It’s just a poorly written sentence. The world’s full of those. I might keep reading on the off chance that this is just an inept writer honestly struggling to make a good point.
(3) has all the weakness of (2) but attempts to make up for that by implicitly suggesting that there is some underlying moral cause for taking offense that is impossible to gainsay. Only a monster would refuse to take with the greatest seriousness the concerns of a person who was deeply offended! I stop reading (3) as a matter of principle.
It’s a little bit aligned with how the word “sacred” gets used. Both “deeply” and “sacred” are shorthand for “under no circumstances is it acceptable for anyone to fail to take seriously, let alone disagree with, what I am about to say. All within the sound of my voice must now put on their Serious Faces and hastily knuckle under.”
“Deeply” is, in other words, a marker for cant: a wonderful old word that has been used in various related senses since the 1500s.
Cant’s definition #6 in the OED is so spot on that I can make this essay a lot shorter merely by quoting it here. It is
“To affect religious or pietistic phraseology, esp. as a matter of fashion or profession; to talk unreally or hypocritically with an affectation of goodness or piety.”
Dr. Johnson’s definition is the one shown in the image at the top of this post. Just as a side note, it is fascinating that 270 years ago this sort of talk was a common enough feature of the rhetorical landscape that the likes of Dr. Johnson were absolutely nailing it with one four-letter word.
by Neal Stephenson, Graphomane | Read more:
Image: Samuel Johnson
[ed. I am deeply nonplussed (ha!) that one of my favorite authors spends time worrying about this stuff.]
A Single Green Feather
Brody Atwell’s fascination with Carolina Parakeets began in the ninth grade. Mrs. Jenkins had shown them a painting of a bird he’d thought existed only in jungles or on pirate ships. They were here, in these mountains, she’d told the class. Scientists say they are extinct, but I hope they are wrong. Brody hoped so too.
Whenever outdoors, he was watchful. A flash of bright feather brought a moment of possibility, only to reveal a bunting or goldfinch. His interest in all birds grew. At NC State Brody majored in biology before returning to Enka to teach high school. His interest in the parakeet remained, evidenced by the Audubon print hung on his classroom’s wall.
Although there had been reported sightings as late as the 1920s, the last confirmed Carolina Parakeet died in 1916 at the St. Louis Zoo. Fifty-three years. Yet there was so much wilderness left in these mountains, miles and miles of national parkland and large individual holdings. Several students had relatives who swore they’d recently seen panthers, though biologists claimed the big cats had also been absent for decades. Brody wanted to believe, even as astronauts gazed down on earth, recently left their footprints on the moon, that the world yet concealed some secrets. However, science demands evidence, his professor, Dr. Willard, had said, declaring that the Ivorybill Woodpecker and the Carolina Parakeet were extinct until proven otherwise. Unlike Saint Paul, the professor had continued, we cannot believe in things unseen. Now, on a Thursday morning before homeroom, Brody remembered these words as he stared at the green feather laid on his desk.
“It looks like it could of come off one of them parakeets,” Lester said, nodding at the Audubon print.
With many students, Brody would have thought it no more than some high-school prank, a dyed feather pulled off a souvenir from a Cherokee or Boone tourist trap, but though Lester was more interested in hunting and fishing than schoolwork, he was a quiet, respectful boy. Brody picked up the feather. Holding it by the quill, he moistened his free hand’s thumb and forefinger, rubbed the inner vane. No dye smudged his skin.
“What do you think, Mr. Atwell?” Lester asked.
“It’s not a bunting,” Brody said, turning the feather slowly, inspecting it with a jeweler’s attentiveness. The tinge of yellow on the outer vane was significant. Lester’s family had lived in the county for generations, so it could be an heirloom passed down from an older relative, or perhaps detached from a grandmother’s once-fashionable hat. However, as Brody brushed a finger across the feather, he found it not brittle with age but soft and pliable.
by Ron Rash, Salvation South | Read more:
Image: uncredited
Hey, Celebrities - Shut Up!
I wish celebrities would learn the art of the French exit. But they can’t, which is why Eva Longoria has announced she no longer lives in America. “I get to escape and go somewhere,” she explained. “Most Americans aren’t so lucky – they’re going to be stuck in this dystopian country.” What’s brought this on, apart from the obvious? “Whether it’s the homelessness or the taxes … it just feels like this chapter in my life is done now.” Great to learn that Eva dislikes both homelessness and taxes. America’s loss of this major political thinker is some other country’s gain – and this highly called-for intervention reminds us why celebrities should speak their brains even more often. If only into a pillow, or an abyss.
As always in these moments of the silly voters making a silly mistake, many stars have pledged to follow her. We’ll see. Either way, celebrities seem totally unaware that these high-handed statements of first-class migration are not the admonishment to the lesser orders that they are meant to be, and may even encourage them.
But then, stars have always been totally unaware of how very little they bring to this particular party. The last few days of the Harris campaign were an increasingly excruciating riot of celebrity bandwagonning. Did the Kamala campaign ask man-born-in-Pennsylvania Richard Gere to make his video for her – or did the actor freelance one out of fear of not having “used his platform”? It was certainly Richard’s most critically misunderstood electoral outing since his address to the Palestinians before their 2005 elections. “Hi, I’m Richard Gere,” that one began, “and I’m speaking for the entire world …”
If anything good were to come out of the wreckage of the Harris campaign, let it be the final death of the idea that showbiz endorsements can help swing elections. They can’t. Not one bit. (...)
Meanwhile, it is easier to leave Twitter than America, as I think Marcus Aurelius once remarked. In the week the Guardian exited X – though not in the French style – you couldn’t move for people informing you they were herding with almost impossible dignity over to Bluesky.
And it does feel slightly hilarious that huge numbers of people who have spent the past decade-plus shrieking about the evils of social media – usually on social media – have been “liberated” from one platform, only to promptly rush and enslave themselves to another. Really? You can see it all stretching ahead of you – fun period, emergence of Blueskyocracy, the first Bluesky cancellation of someone, the exponentially intensifying purity spiral, followed by legacy titles or legacy humans announcing an exit from that one too. It’s all such a predictable timesuck. Bluesky might be the new email.
by Marina Hyde, The Guardian | Read more:
Image: Evelyn Hockstein/Reuters
[ed. Then there's this, about a Scientific American editor resigning because she'd voiced her true feelings when she wasn't clocked in at work (and then denied those feelings). Which proved nothing and only provided gleeful fodder for the stupid hordes who rejoice in claiming another scalp.]
Friday, November 15, 2024
A Letter To Elon Musk
[ed. Advice from Francis Fukuyama (End of History) for Elon Musk, who was recently selected to lead a newly created Department of Government Efficiency (D.O.G.E.). Yes, DOGE. Because ruining people's lives and careers is funny.]
Congratulations on the resounding victory of your candidate, Donald Trump, a result to which you contributed significantly. I understand that you have been tapped to become an efficiency czar in the new administration, a critical post, since the federal bureaucracy does indeed need fixing. However, I have some suggestions for things to keep in mind as you take up the post.
As I’m sure you know, you will find working in government very different from working in the private sector. The chief difference is that people in government are hugely constrained by rules. For example, you cannot begin firing people on day one as you did at Twitter. Federal employees are covered by a host of job protections created by Congress. Trump has a plan to eliminate those protections by restoring an executive order from his first administration to create a “Schedule F” category that would permit the president to fire any worker at will. But such a move will be heavily contested, and it will likely be months before the legal barriers to action are eliminated.
In any event, firing government bureaucrats is not necessarily a path to greater efficiency. It is a widely believed myth that the federal bureaucracy is bloated and overstaffed. This is not the case: there are basically the same number of full-time federal employees today as there were back in 1969, about 2.3 million. This is despite the fact that the government now disburses more than five times as many dollars as it did back then. In fact, you can argue that the government is understaffed, due to relentless pressure over the decades to keep headcounts down. The Center for Medicare and Medicaid Services, for example, oversees the spending of $1.4 trillion, or one fifth of the entire federal budget, with a staff of only 6,400 full-time employees. These workers have to check for Medicare fraud, evaluate and certify tens of thousands of health providers, and make sure that payments to tens of millions of Americans are made in a timely manner. If you cut this staff, the amount of fraud and waste in the Medicare system is likely to go up, not down. The Office of Refugee Resettlement, which looks after the millions of refugees entering the country, has a staff of 150. By increasing the staff at the Internal Revenue Service, the government is expected to take in an additional $561 billion over the next decade.
The government has compensated for this understaffing by hiring legions of contractors (among which is your company, SpaceX). It is easier to fire a contractor than a regular federal employee, but then who is going to perform the services the contractor provides? You may actually save money by taking these functions back into the government because federal workers are paid less, but then you will need to hire more people and will likely get lower quality.
Deregulation has to be part of any plan to make government more efficient. There are clear targets for deregulation, particularly in the construction industry—something you already know given your experience building plants in the United States. We have way too many permitting rules that slow down or altogether prevent infrastructure projects, like the National Environmental Policy Act (NEPA), which requires environmental impact statements that run to thousands of pages and take years to write. Moreover, federal and state laws invite private litigation to enforce environmental laws, which is both expensive and time-consuming. This is why it takes nearly a decade to get approvals for offshore wind farms, and years to construct transmission lines to send electricity from Texas to California. So anything you can do to streamline this process will be welcome. This will be one of the easiest wins for a new administration, one that will have positive effects in areas from affordable housing to climate adaptation. (You should, however, recognize that a lot of over-regulation occurs at a state level, over which you will have no control. That is, of course, why you moved Tesla from California to Texas.)
There is another type of deregulation that needs to occur, however, if the government is to be made more efficient. People blame the bureaucracy for over-regulating the private sector, but the bureaucracy itself is over-regulated. Americans have never trusted the government, and over the decades have piled up a mountain of rules that bureaucrats must follow. An example of this is the Federal Acquisition Regulation (FAR), which contains hundreds of pages of rules that government procurement officers must follow before they can acquire anything from an F-35 fighter to office furniture. Hiring new employees is also extremely difficult; my students often have to wait months before getting a job interview for an open position in the federal government. There are, moreover, a lot of DEI requirements that don’t necessarily reward merit, rules that I’m sure a Trump administration would be happy to torch.
Many conservatives believe that government bureaucrats have too much discretionary authority and use it to enact a liberal agenda, thereby eluding democratic control. This does occur in some instances. But the real truth is rather the opposite: bureaucrats spend way too much of their time complying with hundreds of rules mandated by Congress, rather than using their independent judgment to make decisions that lead to good results for citizens. They need to be liberated from these constraints, and have their performance judged by the outcomes they achieve rather than how risk-averse they are. This is, of course, how Silicon Valley and the private sector operate. (...)
So here’s the deal. You will never be able to run the government the way you run your companies. But you can do a lot to make it more efficient. The trick is to avoid simplistic moves like mass layoffs and the closing of entire agencies. Remember that Donald Trump’s appointee Rick Perry wanted to close the Department of Energy, not realizing that one of its most important functions was to run the system of national laboratories that were responsible for, among other things, research on nuclear weapons and energy. You will also run into the problem that Congress has a say in how the government operates. Even if that branch is controlled by Republicans, they will have equities in different parts of the American state, and may not allow you to violate statutes that they had earlier endorsed.
We need to cut back government regulation of many parts of the private sector. But we also need to deregulate the government itself, and allow those who work for it to actually do their jobs. If Donald Trump wants to help the American people, he needs to see the government not as an enemy to be dismantled, but as an effective and indeed necessary means of doing so.
Yours Sincerely,
Francis Fukuyama
via: Persuasion | Read more:
[ed. Many people who rage against the government overlook the fact that it is simply implementing the various laws and regulations that politicians pass and legal rulings mandate. True, there's some discretion in interpreting those laws and rulings (mainly based on expertise) but bureaucrats aren't just making stuff up on their own (unless they're political appointees who come in with an agenda). Want better government? Elect better representatives. But don't blame folks for doing the best they can with what they're given, or because you don't like some program or other. Here's a good example of this kind of blame shifting.]
Epic Games' Unreal Engine
Epic Games is the maker of Unreal Engine, which is one of the biggest and most capable game engines. What is a game engine? Well, all games, no matter how diverse their play style or visual presentation, have certain functionality in common. Developers used to code all of that from scratch (and still can, if they so choose), but it makes sense to bundle all of those capabilities into one piece of software so that devs can work on the things that make their projects unique instead of continually re-inventing a thousand different kinds of wheels. This is what game engines do.
There’s a feedback loop that has now been running for decades between companies like Epic that make the engines, and developers who come up with new ways to use them. Engines become more capable as companies like Epic say “hey, a lot of devs are implementing Feature X, we should just add that to the next release.”
Meanwhile the hardware that people use to run games becomes much more powerful. Developers think up interesting ways to take advantage of all of these improvements.
Game engines are really a general infrastructure for immersive experiences—by which I mean, audiovisual productions that you can move around in and interact with. Many of the applications built on this foundation fit cleanly into established genres such as first-person shooter games, but increasingly people use these things to make art projects, movies, and commercial/industrial applications.
Nick Whiting, a former studio head and engineering director at Epic, has co-founded a company called Kumikai that, among other things, helps developers who are using Unreal Engine to create applications that are not games. Part of his inspiration came from this brain aneurysm surgery simulator. Nick generously provided me with a list of links to other non-game projects that I can’t fully do justice to here, so I’ll just drop them in:
- The “Guided Ureterovesical Anastomosis” section of this page on the Surgical Science site. Maybe a better and quicker view here though
- Tesla’s use of Unreal Engine to generate synthetic data for training AI. If you’re teaching an AI to deal with conditions that arise in three-dimensional space, you can get data much more easily and cheaply by simulating it photorealistically than by going out into the real world and shooting video.
- The inevitable NASA link. Label says it has been taken down but it seems to play.
- A mining construction simulator. “By changing things like lighting and tunnel sizes to give a bit more "breathing room"…before they blast holes in the ground, they were able to save large amounts of money and have better safety for folks that are already in incredibly hazardous environments.”
So if there’s going to be a Metaverse, game engines are going to run it. And game developers—the people who are proficient at using those engines and the toolchains that feed assets into them—are going to build it.
In 2017 Epic released Fortnite Battle Royale, which most people just refer to as Fortnite. It is an immensely successful game. In any given month, 70 - 80 million people play it. At peak it generated $5 - 6 billion a year in revenue.
This is relevant to the Metaverse because Fortnite is an online, multi-player game. 100 avatars parachute onto an island at about the same time and fight each other until only one team remains. The players can be anywhere on Earth. So in order for Epic’s engineers to make this game work, they had to solve a host of technical problems around synchronizing those 100 players’ perceptions and experiences of the same virtual space.
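The article doesn't describe Epic's actual netcode, but the general shape of the solution is well established: an authoritative server simulates one shared world, clients send only their inputs, and the server broadcasts periodic snapshots that clients render and interpolate between. Here is a minimal Python sketch of that pattern; all class and method names are hypothetical illustrations, not Epic's API:

```python
import json
from dataclasses import dataclass, field

@dataclass
class PlayerState:
    x: float = 0.0
    y: float = 0.0
    health: int = 100

@dataclass
class GameServer:
    """Authoritative server: clients send inputs, never state."""
    players: dict = field(default_factory=dict)
    tick: int = 0

    def apply_input(self, player_id: int, dx: float, dy: float) -> None:
        # Inputs are applied on the server, so every client sees one truth.
        p = self.players.setdefault(player_id, PlayerState())
        p.x += dx
        p.y += dy

    def snapshot(self) -> str:
        # Broadcast each tick; clients interpolate between snapshots.
        self.tick += 1
        state = {pid: vars(s) for pid, s in self.players.items()}
        return json.dumps({"tick": self.tick, "players": state})

server = GameServer()
server.apply_input(1, dx=0.5, dy=0.0)
server.apply_input(2, dx=0.0, dy=-1.0)
print(server.snapshot())
```

Real engines layer client-side prediction, lag compensation, and interest management on top of this, so each client only receives the slice of the world it can actually see.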
They’re not the first or the only engineers to have tackled such challenges. MMORPGs (Massively Multiplayer Online Role Playing Games) are a genre unto themselves and have been around since long before Fortnite. More recently Minecraft and Roblox have achieved phenomenal success enabling users to craft experiences that can be experienced by multiple players at once.
Tim Sweeney, however, has been openly stating for a long time that the goal is to develop all of this into something like the Metaverse. He’s been personally working on a new programming language called Verse that is tailored to the needs of Metaverse builders. After years of development Verse has recently broken the surface in UEFN, Unreal Editor for Fortnite, which is a system that Epic has released in order to make it possible for developers to extend the base Fortnite experience into games of their own design, and to make money doing so.
by Neal Stephenson, Graphomane | Read more:
Image: YouTube/Unreal Engine
[ed. If you find this topic interesting, spend some time on the Unreal Engine website (which provides the introductory video at the top of this post). Coupled with AI, it's easy to see how the long imagined (and much hyped) 'metaverse' might evolve.]
Labels:
Business,
Critical Thought,
Design,
Games,
Media,
Technology
Thursday, November 14, 2024
Tianjin
FlyOverChina | A Fantastic aerial tour of Tianjin
via: Tianjin Municipal Bureau of Culture and Tourism
[ed. The Chinese menace.]
A performance featuring traditional Dunhuang music and dance is staged during the 7th Silk Road (Dunhuang) International Cultural Expo in Dunhuang, northwest China's Gansu Province, Sept. 20, 2024. The three-day expo concluded here on Sunday. (Xinhua/Lang Bingbing)
Wednesday, November 13, 2024
Book Review: The Rise Of Christianity
The rise of Christianity is a great puzzle. In 40 AD, there were maybe a thousand Christians. Their Messiah had just been executed, and they were on the wrong side of an intercontinental empire that had crushed all previous foes. By 400, there were forty million, and they were set to dominate the next millennium of Western history.
Imagine taking a time machine to the year 2300 AD, and everyone is Scientologist. The United States is >99% Scientologist. So is Latin America and most of Europe. The Middle East follows some heretical pseudo-Scientology that thinks L Ron Hubbard was a great prophet, but maybe not the greatest prophet.
This can only begin to capture how surprised the early Imperial Romans would be to learn of the triumph of Christianity. At least Scientology has a lot of money and a cut-throat recruitment arm! At least they fight back when you persecute them! At least they seem to be in the game!
Rodney Stark was a sociologist of religion. He started off studying cults, and got his big break when the first missionaries of the Unification Church (“Moonies”) in the US let him tag along and observe their activities. After a long and successful career in academia, he turned his attention to the greatest cult of all and wrote The Rise Of Christianity. He spends much of it apologizing for not being a classical historian, but it’s fine - he’s obviously done his homework, and he hopes to bring a new, modern-religion-informed perspective to the ancient question.
So: how did early Christianity win?
Slowly But Steadily
Previous authorities assumed Christianity spread through giant mass conversions, maybe fueled by miracles. Partly they thought this because the Biblical Book of Acts describes some of these. But partly they thought it because - how else do you go from a thousand people to forty million people in less than 400 years?
Stark answers: steady exponential growth.
Suppose you start with 1,000 Christians in 40 AD. It’s hard to number the first few centuries’ worth of early Christians - they’re too small to leave much evidence - but by 300 AD (before Constantine!) they’re a sizeable enough fraction of the empire that some historians have tentatively suggested a 10% population share. That would be about 6 million people.
From 1,000 to 6,000,000 in 260 years implies a 40% growth rate per decade. Stark finds this plausible, because it’s the same growth rate as the Mormons, 1880 - 1980 (if you look at the Mormons’ entire history since 1830, they actually grew a little faster than the early Christians!)
Instead of being forced to attribute the Christians’ growth to miracles, we can pin down a specific growth rate and find that it falls within the range of the most successful modern cults. Indeed, if we think of this as each existing Christian having to convert 0.4 new people, on average, per decade, it starts to sound downright do-able.
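The arithmetic behind that figure is easy to verify: a per-decade growth factor g sustained over 26 decades must satisfy 1,000 × g^26 = 6,000,000, so g = 6,000^(1/26) ≈ 1.40. A quick check in Python:

```python
start, end, years = 1_000, 6_000_000, 260
decades = years / 10
factor = (end / start) ** (1 / decades)   # per-decade multiplier
print(f"{factor - 1:.1%} growth per decade")  # -> 39.7% growth per decade
```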
Still, how did the early Christians maintain this conversion rate over so many generations?
Through The Social Graph
This is another of Stark’s findings from his work with the Moonies.
The first Moonie in America was a Korean missionary named Young Oon Kim, who arrived in 1959. Her first convert was her landlady. The next two were the landlady’s friends. Then came the landlady’s friends’ husbands and the landlady’s friends’ husbands’ co-workers. That was when Stark showed up. “At the time . . . I arrived to study them, the group had never succeeded in attracting a stranger.”
Stark theorized that “the only [people] who joined were those whose interpersonal attachments to members overbalanced their attachments to nonmembers.” I don’t think this can be literally correct - taken seriously, it implies that the second convert could have no other friends except the first, which would prevent her from spreading the religion further. But something like “your odds of converting are your number of Moonie friends, divided by your number of non-Moonie friends” seems to fit his evidence.
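To see how a rule like that produces steady, tie-driven growth, here is a toy simulation (my own sketch, with a made-up graph size and parameters, not Stark's data): each non-member's chance of joining per "decade" is their member/non-member friend odds converted to a probability.

```python
import random

random.seed(0)

def p_convert(inside: int, outside: int) -> float:
    # Stark's heuristic: odds of joining ~ friends inside / friends outside.
    odds = inside / max(outside, 1)
    return odds / (1 + odds)  # convert odds to a probability

n = 200
friends = {i: set() for i in range(n)}
for i in range(n):                        # toy random social graph
    for j in random.sample(range(n), 4):
        if i != j:
            friends[i].add(j)
            friends[j].add(i)

members = {0}                             # a single initial convert
for decade in range(1, 9):
    for person in list(set(range(n)) - members):
        inside = len(friends[person] & members)
        outside = len(friends[person] - members)
        if inside and random.random() < p_convert(inside, outside):
            members.add(person)
    print(f"decade {decade}: {len(members)} members")
```

Membership tends to grow slowly at first and then compound, since each convert's own friends acquire member ties.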
History confirms this story. Mohammed’s first convert was his wife, followed by his cousin, servant, and friend. Joseph Smith’s first converts were his brothers, friends, and lodgers. Indeed, in spite of the Mormons’ celebrated door-knocking campaign, their internal data shows that only one in a thousand door-knocks results in a conversion, but “when missionaries make their first contact with a person in the home of a Mormon friend or relative of that person, this results in conversion 50% of the time”.
This theory of social-graph-based conversion was controversial when Stark proposed it, because if you ask cultists retrospectively, they’ll usually say they were awed by the beauty of the sacred teachings. But Stark says:
I knew better, because we had met them well before they had learned to appreciate the doctrines, before they had learned how to testify to their faith, back when they were not seeking faith at all. Indeed, we could remember when most of them regarded the religious beliefs of their new set of friends as quite odd. I recall one who told me that he was puzzled that such nice people could get so worked up about “some guy in Korea” . . . Then, one day, he got worked up about this guy too.
by Scott Alexander, Astral Codex Ten | Read more:
Image: uncredited
[ed. There's much more, but I'd also suggest that Christianity benefited from an abundance of scribblers, editors and transcriptionists who, over successive generations, defined and redefined God in the Bible (and Christianity in general) to fit an evolving religion and events of the time. Jack Miles' God: A Biography covers this ground quite thoroughly. See also: this review of the book (Commentary); God: A Biography: Q&A (Jack Miles); and, Christian theology (Wikipedia).]
Labels:
Critical Thought,
Culture,
History,
Philosophy,
Religion