Friday, November 22, 2024

Daniel Inouye Documentary - Renegades

Daniel Inouye wanted to serve the United States from a young age. Growing up in Hawaii, he was rattled by the attack on Pearl Harbor; in 1944, at the age of 19, Inouye deployed to Italy, then France, to fight the Nazis. War changes most soldiers’ lives, but Inouye, fighting in an all–Japanese American combat unit, also had to have his right arm amputated: a Nazi soldier hit him with a rifle grenade, shattering the arm and forcing him to pry his own undetonated grenade out of his clenched right hand with his left. He threw it back at the Nazi—this time, it detonated.

After being rehabilitated, Inouye continued to serve the United States, first as one of Hawaii’s first representatives in the House, then, from 1963, in the Senate, where he remained for nearly 50 years. Inouye supported civil rights, but he was not at the forefront of the disability rights movement; in fact, Inouye did not see himself as a disabled person, likely due to stigma at the time. By 2010, Inouye was president pro tempore of the Senate, making him the highest-ranking person of color with a disability ever to serve in the presidential line of succession.

Inouye’s story is the subject of a new documentary, out October 8, in PBS’ Renegades series of five short films telling the stories of underrecognized disabled figures in US history, like Inouye and Black Panther Party member Brad Lomax.

Mother Jones spoke with Renegades series creator Day Al-Mohamed, who has worked on disability policy in the Biden-Harris administration, and Tammy Botkin, who directed the short on the late senator, on Inouye’s relationship to his disability and more. (...)

In your work in disability policy, even decades later, do you see similarities in how many veterans may not view themselves as part of the disability community—like Daniel Inouye didn’t?

Al-Mohamed: I still remember, as one veteran explained it to me, “I don’t have a disability. I’m just busted out.” It’s very much a way of thinking about that. Veterans are a community in and of themselves and [had] a job, in many ways, that is based on your body, abilities and capacity.

We all have different perceptions of what it means to be disabled, and we can even see that within the non-veteran community as well. There’s this general mainstream perception that disability is a wheelchair user, or it’s somebody who is blinded. I think that that has done a disservice to many folks who don’t see the opportunity to take advantage of the policies and politics that protect them, which is also, in some ways, at the heart of the episode.

It does seem there’s a generational shift, where younger people are embracing that identity more than in the days when more people were being institutionalized.

Botkin: It’s definitely related to generational views of disability. It is also related to the Senator’s identity as a war veteran, who has seen many other friends who died and were maimed far worse than he. It also has to do with his identity as a Japanese American. Then, his need as a politician to show himself as strong—and when he started in politics, to have a disability would have been a weakness.

Why was it important to explore multiple aspects of Inouye’s identity—including how anti-Japanese sentiment made it difficult for Inouye to enlist, and led to his being called a communist?

Botkin: First off, the senator being smushed into 12 minutes feels like an aberration. How do you do that? He [had] such a massive, massive life, and he himself was such a prolific storyteller and framer of his experience and our collective experience.

There were so many facets to him that to really even begin to understand him as an individual, to leave any of those out is to not be able to really grasp who he is—that he belonged to many communities. He’s Japanese American, yes, but also Hawaiian. Yes, he’s military. He’s a politician. He’s a man from a certain generation of Americanism. He would fight for people with disabilities, but for him to take the lead on it would be self-serving. He wouldn’t do that, and that leans a lot into his Japanese American heritage. We worked with Japanese American consultants to nail this in.

When you’re telling somebody’s story, it’s terrifying because I personally feel like I have to get it right. Luckily, in this case, the Senator’s best friend, who’s in the film, Jeff Watanabe, was incredibly pleased with the representation, so I can breathe.

Al-Mohamed: If you watch the film, you can see [Tammy’s] pulling strands of different labels. As you even highlighted, the discussion around communism, discussion about being Japanese American, discussion about disability, discussion about veteran, those are all labels. At the heart, it’s about the ones you choose to embrace, the ones you don’t, the ones society puts on you, and the ones that you choose for yourself.

by Julia Métraux, Mother Jones |  Read more:
Image: via uncredited
[ed. Will have to look for this. The international airport in Honolulu is named after Sen. Inouye, and the one in Anchorage, Alaska, is named after Ted Stevens. Both were great friends and co-conspirators during their time in Congress (one a Democrat, the other a Republican), and were together extremely effective in getting the most they could for each of their small states. In the process both became legends, greatly shaping each state's destiny. See also: Grit, Courage: 2nd Lt. and U.S. Senator Daniel K. Inouye, U.S. Army (Hawaii Reporter):]

"Kame and Hyotaro would not have dared to dream from their home in the Japanese-American community of Mo’ili’ili, a suburb of Honolulu, that the son born to them on September 7th, 1924 would one day be third in line for the Presidency of the United States. They named him Daniel K. Inouye.

Dan had a normal, happy childhood; Hawaii, as it is today, was then a neat place for a child to grow up. Dan was a senior at McKinley High School on December 7th, 1941, when Pearl Harbor was bombed. (...)

The following September, Dan – who had plans to become a doctor – enrolled in the University of Hawaii. Like many Nisei, Dan had tried to enlist but the War Department at first refused to accept Japanese-American volunteers after Pearl Harbor. Then they changed their minds. Dan put his plans to become a doctor on hold, quit school and enlisted. He was assigned to the 442nd Regimental Combat Team. During training in Mississippi, the unit found its motto: “Go for Broke!”

By the time the 442nd shipped out for Naples in May 1944, Dan was a sergeant and squad leader. As a matter of interest, the casualty rate was so high that it eventually took 12,000 men to fill the original 4,500 places in the regiment. (...)

On April 21, 1945, Dan’s company was ordered to attack a heavily defended ridge guarding an important road in the vicinity of San Terenzo. His platoon wiped out an enemy patrol and mortar observation post and reached the main line of resistance before the rest of the American force. As the troops continued up the hill, three German machine guns focused their fire on them, pinning them down. Dan worked his way toward the first bunker. Pulling out a grenade, he felt something hit him in his side but paid no attention and threw the grenade into the machine-gun nest. After it exploded, he advanced and killed the crew.

Dan continued up the hill, throwing two more grenades into the second gun emplacement and destroying it before he collapsed from loss of blood from his wounds. His men, trying to take the third bunker, were forced back. He dragged himself toward it, then stood up and was about to pull the pin on his last grenade when a German appeared in the bunker and fired a rifle grenade. It hit Dan in the right elbow and literally tore off his arm. He pried the grenade out of his dead right fist with his other hand and threw it at the third bunker, then lurched toward it, firing his tommy gun left-handed. A German bullet hit him in the leg. A medic reached him and gave him a shot of morphine. In his typical stoic manner he didn’t allow himself to be evacuated until the position was secured. In the hospital, the remnants of his right arm were amputated.

Dan left the Army and, after a long period of recuperation, finished college. Forced to give up his dream of practicing medicine, he decided to study law. He was elected to the U.S. House of Representatives from Hawaii in 1959—Congress’s first Japanese-American—and to the Senate in 1962."

Thursday, November 21, 2024

An Immense World

An Immense World: How Animal Senses Reveal the Hidden Realms Around Us, Ed Yong (Random House, 2022).

You live in a world of physical phenomena: surrounded by objects of all sorts, suffused by electromagnetic radiation, buffeted by waves of pressure propagating through the air and awash with tiny organic molecules that waft on it. Your senses are exquisitely attuned to perceive some of these things — certain frequencies of radiation become a beloved face, a mess of floating chemicals resolves into the scent of baking bread — but many others fall outside the narrow band of your perception. Without specialized equipment, you are quite literally blind to the ultraviolet or infrared and deaf to the ultrasonic cries of rodents or the infrasounds of elephants and whales. And forget about electric and magnetic fields; they’re so far outside our actual experience that we don’t even have a word for our inability to sense them.

Animals are attuned to a different spread of phenomena. Everyone knows that dogs are good at smelling (though just how good — good enough to detect a single fingerprint on a glass slide that has been left out on a rooftop for a week — is still a surprise). Fewer know about elephants, who can identify supposedly-odorless TNT and have been known to survive droughts by scenting out buried water and digging wells. Almost no one knows that the family of seabirds called tubenoses are able to navigate the trackless ocean by following diffuse plumes of the gas released when plankton are eaten by krill. And as for smell, so for sight: who knew that nearly all animals, including most non-primate mammals, can see well into the ultraviolet spectrum? (In fact, many flowers that look solid-colored to us actually have clear UV runways to guide their pollinators in for a landing.) And on and on, for more than three hundred gloriously diverse pages full of senses I’d never even thought of.

This book is a guide to the physical world and how animal senses perceive it, with plenty of fascinating descriptions of biomechanics and organic chemistry. More than that, though, it’s an invitation to imagine what it might be like if our senses worked differently. Borrowing a term from early 20th century Baltic German zoologist Jakob von Uexküll, Yong describes the “sensory bubbles” of our Umwelt: like the blind men and the elephant, we have access to only a fraction of the available data, but it seems like the whole world.

We have pretty good noses and exceptionally sensitive fingertips, but the human Umwelt is dominated by sight. Not so for many other creatures, for whom touch or scent is more important — and it’s hard to overemphasize how differently other senses work. Light travels rapidly over great distances, but it can be easily blocked and it vanishes quickly and with little trace. Smells, on the other hand, seep and spread. A barrier impenetrable to sight poses no difficulty to scent; odorant molecules are so small they’re virtually impossible to entirely block, and they move around corners and through darkness as easily as they do in straight lines. But even more importantly, they linger. An Umwelt where scent reigns is one of layers, of history, of trails that slowly waft and dissolve over the course of hours or days. What would your relationship to time and space be if you came to the world nose-first?

And then make it even weirder: what if we could interpret the pressure waves of water moving between sand grains to find clams buried deeper than our probing fingers can reach, the way the red knot can with its bill? What if we could feel the tiny air currents of an insect in flight, or track the passage of a fish through water by the turbulence it leaves behind?

It’s difficult to imagine, and so we often don’t. After all, we have remarkable trouble wrapping our heads around other humans whose culture differs from our own; how much harder with something thoroughly alien? Maybe it’s no surprise that while the monsters we come up with may look different, they often act basically like “humans but” — human but larger, human but with big teeth and wings, human but with face tentacles and mind control. But they needn’t! Think of the (seemingly) simplest of additions, the ability to see in the dark. It adds tactical complications, sure, but it would do more than that: depending on how the dark-vision works, it can change nearly everything. Pit vipers use (you guessed it) specialized pits to “see” in infrared, but only at very low resolution and very close range. Cats and many other mammals have a reflective layer behind the retina that sends back light to be gathered a second time; in a reindeer, it grows and changes during the cold dark winter. Bats and dolphins “see” by echolocation. The golden mole finds mounds of dune grass amidst the sands of its desert home by listening for the ground-borne vibrations caused by the wind rustling the grass. Each one of these senses enables a creature to navigate a lightless environment much better than you can, but each also makes its world strange in ways you’d never think of — and which are therefore much more fun.

Just to whet your appetite for this book, I’ll leave you with a few animals.

Here’s the emerald jewel wasp:
The wasp — a beautiful inch-long creature with a metallic green body and orange thighs — is a parasite that raises its young on cockroaches. When a female finds a roach, she stings it twice — once in its midsection to temporarily paralyze its legs, and a second time in its brain. The second sting targets two specific clusters of neurons and delivers venom that nullifies the roach’s desire to move, turning it into a submissive zombie. In this state, the wasp can lead the roach to her lair by its antennae, like a human walking a dog. Once there, she lays an egg on it, providing her future larva with a docile source of fresh meat. This act of mind control depends on that second sting, which the wasp must deliver to exactly the right location. Just as a red knot has to find a clam hidden somewhere in the sand, an emerald jewel wasp has to find the roach’s brain hidden somewhere within a tangle of muscles and internal organs.

Fortunately for the wasp, her stinger is not only a drill, a venom injector, and an egg-laying tube but also a sense organ. Ram Gal and Frederic Libersat showed that its tip is covered in small bumps and pits that are sensitive to both smell and touch. With them, she can detect the distinctive feel of a roach’s brain. When Gal and Libersat removed the brain from a cockroach before offering the roach to some wasps, they repeatedly stung it, trying in vain to find the organ that was no longer there. If the missing brain was replaced with a pellet of the same consistency, the wasps stung it with the usual precision. If the replacement pellet was squishier than a typical brain, the wasps seemed confused and kept rooting around with their stingers. They knew what a brain should feel like.
And a whale:
The scale of a whale’s hearing is hard to grapple with. There’s the spatial vastness, of course, but also an expanse of time. Underwater, sound waves take just under a minute to cover 50 miles. If a whale hears the song of another whale from a distance of 1,500 miles, it’s really listening back in time by about half an hour, like an astronomer gazing upon the ancient light of a distant star. If a whale is trying to sense a mountain 500 miles away, it has to somehow connect its own call with an echo that arrives 10 minutes later. That might seem preposterous, but consider that a blue whale’s heart beats around 30 times a minute at the surface, and can slow to just 2 beats a minute on a dive. They surely operate on very different timescales than we do. If a zebra finch hears beauty in the milliseconds within a single note, perhaps a blue whale does the same over seconds and minutes. To imagine their lives, “you have to stretch your thinking to completely different levels of dimension,” Clark tells me. He compares the experience to looking at the night sky through a toy telescope and then witnessing its full majesty through NASA’s spaceborne Hubble telescope. When he thinks about whales, the world feels bigger, stretching out in space and time.

Whales weren’t always big. They evolved from small, hoofed, deer-like animals that took to the water around 50 million years ago. Those ancestral creatures probably had vanilla mammalian hearing. But as they adapted for an aquatic life, one group of them—the filter-feeding mysticetes, which include blues, fins, and humpbacks—shifted their hearing to low infrasonic frequencies. At the same time, their bodies ballooned into some of the largest Earth has ever seen. These changes are probably connected. The mysticetes achieved their huge size by evolving a unique style of feeding, which allows them to subsist upon tiny crustaceans called krill. Accelerating into a krill swarm, a blue whale expands its mouth to engulf a volume of water as large as its own body, swallowing half a million calories in one gulp. But this strategy comes at a cost. Krill aren’t evenly distributed across the oceans, so to sustain their large bodies, blue whales must migrate over long distances. The same giant proportions that force them to undergo these long journeys also equip them with the means to do so—the ability to make and hear sounds that are lower, louder, and more far-reaching than those of other animals.

Back in 1971, Roger Payne speculated that foraging whales could use these sounds to stay in touch over long distances. If they simply called when fed and stayed silent when hungry, they could collectively comb an ocean basin for food and home in on bountiful areas that lucky individuals have found. A whale pod, Payne suggested, might be a massively dispersed network of acoustically connected individuals, which seem to be swimming alone but are actually together.
by Jane Psmith, Mr. and Mrs. Psmith's Bookshelf |  Read more:
Image: uncredited

Japanese Wood Joints


MITI and the Japanese Miracle

MITI and the Japanese Miracle: The Growth of Industrial Policy, 1925-1975, Chalmers Johnson (Stanford University Press, 1982).

I've been interested in East Asian economic planning bureaucracies ever since reading Joe Studwell's How Asia Works (briefly glossed in my review of Flying Blind). But even among those elite organizations, Japan's Ministry of International Trade and Industry (MITI) stands out. For starters, Japanese people watch soap operas about the lives of the bureaucrats, and they're apparently really popular! Not just TV dramas; huge numbers of popular paperback novels are churned out about the men (almost entirely men) who decide what the optimal level of steel production for next year will be. As I understand it, these books are mostly not about economics, and not even about savage interoffice warfare and intraoffice politics, but rather focus on the bureaucrats themselves and their dashing conduct, quick wit, and passionate romances... How did this happen?

It all becomes clearer when you learn that when the Meiji period got rolling, Japan's rulers had a problem: namely, a vast, unruly army of now-unemployed warrior aristocrats. Samurai demobilization was the hot political problem of the 1870s, and the solution was, well…in many cases it was to give the ex-samurai a sinecure as an economic planning bureaucrat. Since positions in the bureaucracy were often quasi-hereditary, what this means is that in some sense the samurai never really went away, they just hung up their swords — frequently literally hung them up on the walls of their offices — and started attacking the problem of optimal industrial allocation with all the focus and fury that they'd once unleashed on each other. According to Johnson, to this day the internal jargon of many Japanese government agencies is clearly and directly descended from the dialects and battle-codes of the samurai clans that seeded them.

This book is about one such organization, MITI, whose responsibilities originally were limited to wartime rationing and grew to encompass, depending on who you ask, the entire functioning of the Japanese government. Because this is the buried lede and the true subject of this book: you thought you were here to read about development economics and a successful implementation of the ideas of Friedrich List, but you’re actually here to read about how the entire modern Japanese political system is a sham. This suggestion is less outrageous than it may sound at first blush. By this point most are familiar with the concept of “managed democracy,” wherein there are notionally competitive popular elections, culminating in the selection of a prime minister or president who’s notionally in charge, but in reality some other locus of power secretly runs things behind the scenes.

There are many flavors of managed democracy. The classic one is the “single-party democracy,” which arises when for whatever reason an electoral constituency becomes uncompetitive and returns the same party to power again and again. Traditional democratic theory holds that in this situation the party will split, or a new party will form which triangulates the electorate in just such a way that the elections are competitive again. But sometimes the dominant party is disciplined enough to prevent schisms and to crush potential rivals before they get started. The key insight is that there’s a natural tipping-point where anybody seeking political change will get a better return from working inside the party than from challenging it. This leads to an interesting situation where political competition remains, but moves up a level in abstraction. Now the only contests that matter are the ones between rival factions of party insiders, or powerful interest groups within the party. The system is still competitive, but it is no longer democratic. This story ought to be familiar to inhabitants of Russia, South Africa, or California.

The trouble with single-party democracies is that it’s pretty clear to everybody what’s going on. Yes, there are still elections happening, there may even be fair elections happening, and inevitably there are journalists who will point to those elections as evidence of the totally-democratic nature of the regime, but nobody is really fooled. The single-party state has a PR problem, and one solution to it is a more postmodern form of managed democracy, the “surface democracy.”

Surface democracies are wildly, raucously competitive. Two or more parties wage an all-out cinematic slugfest over hot-button issues with big, beautiful ratings. There may be a kaleidoscopic cast of quixotic minor parties with unusual obsessions filling the role of comic relief, usually only lasting for a season or two of the hit show Democracy. The spectacle is gripping, everybody is awed by how high the stakes are and agonizes over how to cast their precious vote. Meanwhile, in a bland gray building far away from the action, all of the real decisions are being made by some entirely separate organ of government that rolls onwards largely unaffected by the show.

Losers and haters are perpetually accusing the United States of being a surface democracy. Enemies of the state ranging from Ralph Nader to Vladimir Putin are constantly banging on about it, but this is a Patriotic Substack and we would obviously never countenance such insinuations about our noble republic. So there’s absolutely no chance it’s even the slightest bit true of the US, but…what about Japan?

Well, awkwardly enough, it turns out that the central drama of preindustrial Japanese history was the growing power of unofficial rulers (the shoguns) who ran the country in reality while the official rulers (the emperors) gradually devolved into puppets and figureheads. A “surface monarchy,” if you will. Of course that all ended with the Meiji Restoration of 1868 (c’mon, it says “restoration” right there in the name) which returned the emperor to being fully in charge…which is why when the Japanese declared war on America in 1941, neither the Emperor Hirohito nor the parliament was even consulted. Hang on a minute!

In fact, yes, prewar Japan may have been reigned over by a monarch, but it was ruled by the deep state — especially the career military general staff and the economic planning bureaucracies. I know it’s hard to believe that drab agencies regulating coal and steel production were able to go toe-to-toe with General Tojo, but just imagine that they were all being staffed by fanatical clans of demobilized samurai or something crazy like that. When MacArthur rolled in with the occupation forces, he had a goal of creating total discontinuity with Japan’s past and utterly bulldozing the government. But a guy needs to pick his battles, and so he obviously focused on getting rid of all those nasty generals and admirals he’d just spent years fighting. The harmless paper-pushers, on the other hand, how much trouble could they be? Maybe they could even help organize the place.

The chapter about the post-war occupation is one of the deadpan funniest in Johnson’s book. The American occupiers are genuinely trying to create a liberal democracy out of the ashes, but have no idea that the friendly, helpful bureaucrats they’ve enlisted in this quest were the secret rulers of the regime they'd just conquered. The stats bear this out — of all the officials who controlled Japan’s wartime industry, only a few dozen were ever purged by the Americans. The most striking example of continuity has to be Nobusuke Kishi, but there were countless others like him. These were the men charged with translating the occupiers' desires into policy, reconstructing Japanese society, and finally drafting a new constitution. Then eventually the Americans sailed off, and the bureaucrats smiled and waved, and went back to ruling as they'd done for hundreds of years, behind the scenes.

Okay, but how well does that version of history line up with the reality of Japanese government in the second half of the 20th century? Johnson brings a lot of evidence to back up his claim that Japan is still secretly ruled by the bureaucracies, chief among them MITI. He points out, for example, that hardly any bills proposed by individual legislators and representatives go anywhere, while bills proposed by MITI itself are almost always instantly approved by the parliament. But MITI’s authority isn’t limited to the government; it's pretty clear that they control the entire private sector too. That might seem tautological — if MITI’s will always becomes law, then they can unilaterally impose new regulations or mandates that can destroy any company, with zero recourse, so everybody will naturally do what MITI says. But it’s subtler than that — the real mechanism is tangled up in MITI’s dynastic and succession customs.

Remember, this may look like an economic planning bureaucracy, but it’s actually a secret samurai clan. So they’re constantly doing the kinds of stuff that any good feudal nobility does. For instance, the economic planning bureaucrats frequently cement their treaties by marrying off their sister/daughter/niece to a mentor or to a protégé. They also sometimes legally adopt each other, ancient Roman-style. Naturally they also have an extremely complicated set of rules governing their internal hierarchy, rights of deference, etc. But remember, this isn’t just a secret samurai clan, it’s also a government agency! Agencies have rules too — explicit rules written down in binders, rules governing promotion and succession and all the rest. Sometimes, the official rules and the secret rules conflict, butt against each other, and out of that friction something beautiful emerges.

The highest rank in MITI is “Vice-Minister” (the “Minister” is one of those elected political guys who don't actually matter). But it's also the case that somebody who's been at MITI longer or who's older than you (these are actually the same thing, because everybody joins at the same age) is strictly superior to you in seniority. But that can create a paradox! What happens if a young guy becomes Vice-Minister? He would then be more senior than his older colleagues by virtue of office, but they would be more senior by virtue of tenure, and that would mean either an official rule or a secret rule being broken. To resolve this impossible conflict, the instant a new Vice-Minister is selected, everybody who's been in the bureaucracy longer than him resigns immediately, so that his absolute seniority is unambiguous and unquestionable. And then...the first act of the new Vice-Minister is to give everybody who fell on their swords powerful jobs as executives and board members of the biggest Japanese corporations. The entire process is called amakudari, which means “descent from heaven.”

Amakudari is really a win-win-win-win: the new Vice-Minister has unchallenged power within the agency and a whole host of new friends in the private sector, the guys who resigned all have cushy new jobs that come with better pay and perks, the companies that are descended upon now have an employee with great connections to the agency that controls their fates, and MITI as a gestalt entity can spread its tentacles throughout the economy, aided by cadres of alumni who think its way and help translate policy into reality.

I joked before about refusing to tolerate speculation about the US being a surface democracy like Japan, but joking aside I think even the staunchest defender of the reality of popular rule would concede that things have moved in that direction on the margin. Compare the power of agency rulemaking, federal law enforcement, spy agencies, or ostensibly independent NGOs now to where they were even 10 years ago. It would be a stretch to say that the electorate didn’t have influence over the American state, but can they really be said to rule it? Regardless of exactly where you come down on that question, it’s probably safe to say that you'd give a different answer today than you would have twenty, fifty, or a hundred years ago. Moreover, the movement has been fairly monotonic in the direction of less direct popular control over the government. And in fact this phenomenon is not unique to the United States, but reappears in country after country.

by John Psmith, Mr. and Mrs. Psmith’s Bookshelf |  Read more:
Image: uncredited
[ed. When I was coordinating the cleanup of the Exxon Valdez oil spill, every once in a while I had to brief visiting dignitaries/representatives on topics ranging from the length of commercial and sport fisheries closures, to seafood tainting and testing, to the biological implications of various cleanup methods/products and other issues. These briefings included everyone from U.S. Senators/Representatives, to the shipping insurer Lloyds of London, to national journalists (NY Times, Rolling Stone, Scientific American), and large environmental groups (Sierra Club, Audubon, Wilderness Society, etc.) and anyone else who had enough standing at the time to disrupt my schedule (not trying to sound self-important here, it was just an extremely busy time). Of all such briefings, the most intense involved a trade delegation from Japan (Lloyds, maybe second). I believe they were with, or mostly affiliated with, the Japanese Commerce Dept. or trade industry somehow; it wasn't clear. Of course, Alaskan seafood products at the time represented quite a significant segment of the Japanese market - in the hundreds of millions, if not billions. I don't know if the dozen or so people who grilled me for a couple of hours were MITI, but who else? Secret samurai businessmen! Ha! Definitely some serious people.]

Wednesday, November 20, 2024

Pat Metheny & Lyle Mays



[ed. Lyle. Gone but not forgotten.]

Fame

[ed. It's a bitch.]

AI Turing Test

via:
[ed. Not part of the test, but still kinda cool. See: How did you do on the AI Turing Test (ACX):]

"Last month, I challenged 11,000 people to classify fifty pictures as either human art or AI-generated images.

I originally planned five human and five AI pictures in each of four styles: Renaissance, 19th Century, Abstract/Modern, and Digital, for a total of forty. After receiving many exceptionally good submissions from local AI artists, I fudged a little and made it fifty. The final set included paintings by Domenichino, Gauguin, Basquiat, and others, plus a host of digital artists and AI hobbyists. (...)

What Did We Learn About Art?

Alan Turing recommended that if 30% of humans couldn’t tell an AI from a human, the AI could be considered to have “passed” the Turing Test. By these standards, AI artists pass the test with room to spare; on average, 40% of humans mistook each AI picture for human.

What does this tell us about AI? Seems like they’re good at art. I’m more interested in what it tells us about humans.

Humans keep insisting that AI art is hideous slop. But also, when you peel off the labels, many of them can’t tell AI art from some of the greatest artists in history. I’ve tried to be as fair as possible to these people, proposing that maybe they’re just expressing frustration with the proliferation of the DALL-E house style. And maybe some really do have an amazing eye for tiny incongruous details.

But it also seems very human to venerate sophisticated prestigious people, and to pooh-pooh anything that feels too new or low-status or too easy for ordinary people to access - without either impulse connecting with the actual content of the painting in front of you."

Tuesday, November 19, 2024

Haruki Murakami - Writing Mode

via: Ilan Lampl
[ed. Probably why people like Twitter/X. Only 280 characters.]

How To Win a Nobel Prize


How to win a Nobel prize (Nature)

The Nobel prize has been awarded in three scientific fields — chemistry, physics and physiology or medicine — almost every year since 1901, barring some disruptions mostly due to wars.

Nature crunched the data on the 346 prizes and their 646 winners (Nobel prizes can be shared by up to three people) to work out which characteristics can be reliably linked to medals.

Each circle here represents a Nobel laureate, a person who has received a Nobel prize. (...)
  • You can greatly improve your chances of winning a Nobel by working in the laboratory of a scientist who already has one or will in the future, or by working with someone whose mentors won. Prizewinners often beget or emerge from the labs of other laureates. They frequently share mentors or mentees — those who supervised them or their students, or their students’ students. (...)
  • You might expect lots of separate clusters to emerge as distinct academic families. But it turns out that almost all Nobel laureates share some connection, however distant, as represented by this sprawling network.
An incredible 702 out of 736 researchers who have won science and economics prizes up to 2023 are part of the same academic family — connected by an academic link in common somewhere in their history.
by Kerri Smith & Chris Ryan, Nature | Read more:
Image: Chu-Chieh Lee
[ed. See also: Yes, scientific progress depends on like a thousand people (partial paywall, Intrinsic Perspective).]

Self Defense For Dummies

Things you need to know: how to defend yourself against an assailant armed with a spatula
[ed. Funny.]

via:

Monday, November 18, 2024

via:

New Fusion

 

Mike Bono

Kinga Glyk

Matteo Mancuso

[ed. From my Fusion playlist.]

The Seeds of Social Revolution: Extreme Wealth Inequality

The seeds of social revolution have been sown and sprouted. What we harvest is up to us.

If there is any potential catalyst for social upheaval that attracts less attention than extreme wealth inequality, it's mighty obscure. As I noted yesterday, the present extreme of wealth inequality draws an occasional bit of lip service or handwringing, but very little serious focus, despite ample historical foundations for its role in sowing the seeds of social revolutions.

As I tried to explain in yesterday's post, extreme wealth inequality might not be the spark that ignites a revolution, but it is a tectonic shift that destabilizes the social order. For extreme wealth inequality isn't a consequence of fate or sorcery; it is the consequence of policies that favor the few at the expense of the many, a reality that is exceedingly uncomfortable for those benefiting from the asymmetry.

For a rundown of the policies that have exacerbated wealth inequality, consider the following excerpts from Time magazine, September 2020: The Top 1% of Americans Have Taken $50 Trillion From the Bottom 90% -- And That's Made the U.S. Less Secure.
"There are some who blame the current plight of working Americans on structural changes in the underlying economy--on automation, and especially on globalization. According to this popular narrative, the lower wages of the past 40 years were the unfortunate but necessary price of keeping American businesses competitive in an increasingly cutthroat global market. But in fact, the $50 trillion transfer of wealth the RAND report documents has occurred entirely within the American economy, not between it and its trading partners. No, this upward redistribution of income, wealth, and power wasn't inevitable; it was a choice--a direct result of the trickle-down policies we chose to implement since 1975.

We chose to cut taxes on billionaires and to deregulate the financial industry. We chose to allow CEOs to manipulate share prices through stock buybacks, and to lavishly reward themselves with the proceeds. We chose to permit giant corporations, through mergers and acquisitions, to accumulate the vast monopoly power necessary to dictate both prices charged and wages paid. We chose to erode the minimum wage and the overtime threshold and the bargaining power of labor. For four decades, we chose to elect political leaders who put the material interests of the rich and powerful above those of the American people."
In other words, extreme wealth inequality is not the result of economic forces outside our control; it's the result of our policy responses to changing social, political and economic conditions. While those benefiting from the policies attribute the asymmetric distribution of the economy's gains to "forces outside our control" such as globalization and automation, those losing ground sense that this is an excuse for taking advantage of the situation, to the detriment of the national interest.

We can best understand extreme wealth inequality as the destabilizing result of one set of competing economic interests gaining dominance over other economic interests: broadly speaking, the balance between labor and capital has collapsed in favor of capital. To take one example, consider the minimum wage, which was allowed, as a policy decision, to fall behind inflation for decades.

The different interests within each sector can also destabilize into asymmetric distributions. For example, within the broad category of capital, there are many competing interests: industrial capital, financial capital, land-based capital, domestic and global interests, and so on. Within labor, there are blue-collar and white-collar interests, and gradations of skills, regional interests, and so on.

Broadly speaking, globalization and financialization greatly increased the share of some interests at the expense of others.

The social boundaries of what's acceptable and unacceptable change, enabling or restricting financial policies. For example, in the postwar boom of the 1950s, corporate CEOs earned multiples of their average employee's pay that by today's standards were ludicrously low, whereas present-day CEOs routinely take home compensation (including stock options) in the tens of millions of dollars annually.

In the broad sweep of history, extreme asymmetries in the distribution of the economy's output are rebalanced one way or the other, if not with policy changes then by the overthrow of the status quo. The book The Great Leveler: Violence and the History of Inequality from the Stone Age to the Twenty-First Century breaks down the various pieces of this complex puzzle.

The history and data are too varied to be easily summarized, but we can start with humanity's innate sense of fairness in social organizations: we sense when our contributions are getting short shrift while others are grabbing shares that are not commensurate with their contributions--despite their claims to "earning" their outsized shares.

Some write this off as envy, and to be sure envy is an innate human response, but fairness and envy are two different things. If someone strips us of power that we once held to benefit their own accumulation of wealth, our sense that this is unfair is not envy.

We seem to be approaching the point where a rebalancing of extreme asymmetries is at hand, and so we have to choose between policy changes and social upheaval. Those benefiting from the current asymmetrical distribution naturally feel that all is right with the world, while those whose purchasing power and political power have been stripmined feel that regaining what was taken from them is only fair.

by Charles Hugh Smith, Of Two Minds |  Read more:
Image: Lon Tweeten/Time
[ed. This has been obvious for a long time, yet continues to persist and actually worsen. Why? The post following this one - New Phase of Cultural Conflict - explains a lot, especially how deceptive allegiances (a lying billionaire con man, hero to the underclass?) can generate popular support (and electoral victories) while hiding true intentions.]

via:

15 Observations on the New Phase in Cultural Conflict

Back in 2014, I sketched out a widely-read outline of an alternative interpretation of cultural conflict. Curiously enough, the conceptual tools I used came from a 1929 book from philosopher José Ortega y Gasset entitled The Revolt of the Masses—a work that offers surprisingly timely insights into our current situation.

That article stirred up a lot of debate at the time, but the whole situation has intensified further since 2014. Everything I’ve seen in those eight years has made it painfully clear how insightful Ortega had been. The time has come to revisit that framework, summarizing its key insights and offering predictions for what might happen in the future.

Here’s part of what I wrote back in 2014:
First, let me tell you what you won’t find in this book. Despite a title that promises political analysis, The Revolt of the Masses has almost nothing to say about conventional party ideologies and alignments. Ortega shows little interest in fascism or capitalism or Marxism, and this troubled me when I first read the book. (Although, in retrospect, the philosopher’s passing comments on these matters proved remarkably prescient—for example his smug dismissal of Russian communism as destined to failure in the West, and his prediction of the rise of a European union.) Above all, he hardly acknowledges the existence of ‘left’ and ‘right’ in political debates.
Ortega’s brilliant insight came in understanding that the battle between ‘up’ and ‘down’ could be as important in spurring social and cultural change as the conflict between ‘left’ and ‘right’. This is not an economic distinction in Ortega’s mind. The new conflict, he insists, is not between “hierarchically superior and inferior classes…. upper classes or lower classes.” A millionaire could be a member of the masses, according to Ortega’s surprising schema. And a pauper might represent the elite.
The key driver of change, as Ortega sees it, comes from a shocking attitude characteristic of the modern age—or, at least, Ortega was shocked. Put simply, the masses hate experts. If forced to choose between the advice of the learned and the vague impressions of other people just like themselves, the masses invariably turn to the latter. The upper elites still try to pronounce judgments and lead, but fewer and fewer of those down below pay attention.
This dynamic is now far more significant than it was eight years ago. So I want to share 15 observations on the emerging vertical dimension of cultural conflict—these both define the rupture and try to predict how it will play out.

(1) Analysis of cultural conflict is still obsessed with left-versus-right strategizing, but the actual battle lines are increasingly down-versus-up. A lot of work goes into hiding this, because both left and right want to present an image of unity, but both spheres are splintering into intensely hostile up-and-down factions.

(2) The frequency with which you hear the “lesser of two evils” argument is an indicator of how powerful this up-and-down rupture has become. This is the argument used by Ups to retain the loyalty of the Downs. You have to stick with us, even if we are tainted elites, or else we both lose.

(3) When commentators give any attention to down-versus-up, they usually reduce the conflict to income disparities, but that is misleading. Down-versus-up is more attitudinal than economic. Sometimes the tension manifests itself along traditional class and wealth lines, with disputes focused primarily on money, but that’s only a small part of the conflict. Down-versus-up is multidimensional and adapts rapidly to current events. Adding to the complexity, rich people frequently act like Down members, while people with tiny incomes can be fiercely loyal to the Up worldview.

(4) The essence of down-versus-up is that a numerically large group of dissenters focus their anger on a small number of elites who they view as antagonists, perhaps even evil villains. These Down movements cut across left-versus-right political ideologies, and thus encompass seemingly incompatible groups such as Occupy Wall Street, the truck convoys, Black Lives Matter, the Tea Party, ANTIFA, cryptocurrency fanatics, and a host of other cohort groups in the news. In every instance, these groups have proven capable of mobilizing intense energy among members—much greater energy than the Ups can ever hope to match. Participants seem to appear out of nowhere, leaping almost instantaneously into action.

(5) There will be more groups like this next year—and every year from now on. As strange as it sounds, an organization that doesn’t even exist today is likely to transform the entire sociocultural landscape in the near future. I’m not sure what it will look like, but one thing is certain—it won’t arise from any legacy institution.

(6) The targets are people at the top of the heap, but that can include a dizzying array of individuals—including wealthy CEOs, DC politicians, celebrity TV newscasters, law enforcement authorities, experts of all stripes, Ivy League academics, hedge fund managers, tech titans at huge Silicon Valley companies, movie stars, etc. A key element of the narrative is not simply that these people have different agendas than those at the bottom, but even more to the point, these elites are depicted as inherently untrustworthy—they don’t play fairly, they have sold their souls to the Dark Side. Hence the Down opposition feels the need to take extreme measures. The critiques brandished by the Downs are often reduced to the banal, mind-numbing explanation that people on the Dark Side do bad things and must be stopped. The very banality of the message makes it all the more viral.

(7) The members of the Up group want to rebrand themselves as Down adherents. They work tirelessly to do this. Hence you see billionaires proclaiming their alignment with all of the leading Down agendas. Politicians see that Down constituencies are the most energized voters and curry their favor—proclaiming at every opportunity that I’m just like you. Even the most established DC insiders with the most elite backgrounds must act as if they aren’t really members of the Up cohort. Media personalities, in particular, take every opportunity to act as Down as possible, realizing that this is the only genuine street cred worth having in the current moment.

(8) When well known political figures move from right to left, or vice versa, many onlookers are surprised. But in almost every instance, the Up maintain their Up allegiance, and the Down retain their Down status. It's much easier to make the psychological shift from one party to another than to abandon your emotional attachment to the Down or Up worldview.

(9) All of the cultural energy right now is on the bottom. And that energy has been intensifying. The attempts to distort this conflict into conventional left-versus-right battle lines have prevented opinion leaders from grasping the actual dynamic at play. Any ambitious agenda that doesn’t take into account down-versus-up is doomed to failure.

(10) This is not just a political shift but also impacts arts and entertainment. Reality TV, for example, is a manifestation of legacy institutions trying to capture the vitality of the Down lifestyles in faux narratives that emulate non-elites in everyday situations. Music genres each have their own up-versus-down positioning—just consider your mental images of the audience for rap, classical, country, jazz, etc. (But genres can move: jazz was once Down, but it has become Up.) Art forms that seem to be in crisis—sculpture, the novel, the symphony—are always aligned with the Up cohort. Nobody ever claims that Down genres are in crisis.

by Ted Gioia, Honest Broker |  Read more:
Image: via the author
[ed. See also: I told you so (Numb at the Lodge).]

Sunday, November 17, 2024

Valerius De Saedeleer (Belgian, 1867-1941, b. De Kat, Aalst, Belgium, d. Oudenaarde, Belgium) - Paysage d'Hiver (Winter Landscape), 1931, oil on canvas


Benoît Maubrey (American, b. 1952, Washington D.C, USA) - Torii Gate made of speakers in Kamiyama, Tokushima, Japan.
via:

Guitarist Tim DiJulio, ‘Seattle’s Best-Kept Secret’

A beloved Seattle guitarist takes the spotlight on his first major tour (Seattle Times)
Images: Jennifer Buchanan
[ed. Nice story, nice guy. Tough business. See also: Guitarist Tim DiJulio, ‘Seattle’s best-kept secret,’ has stars for fans (ST).]

Saturday, November 16, 2024

Deeply Thoughts

(from Samuel Johnson’s definition of the word cant)

A while back I put up a tweet saying “‘Deeply’ is the new ‘very,’” but to my shock and dismay the world paid no attention. My little jape failed to quell the rising tide in the usage of the d-word in all sorts of public-facing discourse. Now “deeply” has become the universal adverb. Since I am otherwise in a bit of a midsummer lull, I thought I would take up my cudgel once more and square up to this menace.

“Very,” the feebler predecessor of “deeply,” was one of those words that editors and English teachers automatically red-pencilled into oblivion whenever they saw it on the page. Of course, people used it anyway, because they believed it made their sentences stronger. Why wouldn’t it? Something that’s very big must be bigger than something that’s merely big, right? So, commonsensically, putting “very” in front of an adjective should intensify it.

It didn’t actually work out that way, though. Readers came to understand, at some probably subliminal level, that “very” was just a marker for weak or tendentious writing. Serious people just didn’t use the word. The New York Times might write of “severe flooding” in a disaster area, for example, but you’d never see them use the phrase “very severe flooding.” A politician trying to plead for disaster relief funds might say “very severe flooding” and be quoted as such. But in actual coverage, the newspaper of record would never use “very.” Instead it would present facts and statistics, leaving the readers to judge for themselves the level of severity.

Sophisticated readers thus came to understand that “very” was a marker for lazy writing. Users of “very” were trying to bring you around to a certain point of view without earning it. That’s why editors and English teachers hated it.

“Deeply” has inherited all of the badness of “very” but piled on some additional noxious qualities.

A couple of years ago I set a personal policy that when reading anything at all—a tweet, a press release, a newspaper article—as soon as I encountered the word “deeply” I would simply stop reading and turn my attention elsewhere. I don’t think I’ve missed anything as a result. On the contrary, I’m pretty sure that the rigorous enforcement of this rule has improved my quality of life and upgraded the flow of information coming into my brain.

What “deeply” has that “very” didn’t is the overlay of pious moralism. You can easily get the idea by comparing these three statements:
  1. I was offended by this tweet
  2. I was very offended by this tweet
  3. I was deeply offended by this tweet
Of course, (1) is by far the strongest version. Seeing this, I might roll my eyes at yet another person claiming to be offended by something, but I might keep reading just to see what they’re on about.

(2) is weaker despite—in fact, because of—the attempt to strengthen it by addition of “very.” That’s okay. It’s just a poorly written sentence. The world’s full of those. I might keep reading on the off chance that this is just an inept writer honestly struggling to make a good point.

(3) has all the weakness of (2) but attempts to make up for that by implicitly suggesting that there is some underlying moral cause for taking offense that is impossible to gainsay. Only a monster would refuse to take with the greatest seriousness the concerns of a person who was deeply offended! I stop reading (3) as a matter of principle.

It’s a little bit aligned with how the word “sacred” gets used. Both “deeply” and “sacred” are shorthand for “under no circumstances is it acceptable for anyone to fail to take seriously, let alone disagree with, what I am about to say. All within the sound of my voice must now put on their Serious Faces and hastily knuckle under.”

“Deeply” is, in other words, a marker for cant: a wonderful old word that has been used in various related senses since the 1500s.

Cant’s definition #6 in the OED is so spot on that I can make this essay a lot shorter merely by quoting it here. It is

“To affect religious or pietistic phraseology, esp. as a matter of fashion or profession; to talk unreally or hypocritically with an affectation of goodness or piety.”

Dr. Johnson’s definition is the one shown in the image at the top of this post. Just as a side note, it is fascinating that 270 years ago this sort of talk was a common enough feature of the rhetorical landscape that the likes of Dr. Johnson were absolutely nailing it with one four-letter word.

by Neal Stephenson, Graphomane |  Read more:
Image: Samuel Johnson
[ed. I am deeply nonplussed (ha!) that one of my favorite authors spends time worrying about this stuff.]