Saturday, May 20, 2023

The Tyranny of ‘the Best’

Let me tell you about my friend Dan Symons. There is a kind of person who finds the idea of seeking out “the best” incredibly enticing, on an almost spiritual level. The kind of person who genuinely enjoys perusing articles like “the nine best hair dryers of 2023,” who is overcome with clammy dread at the idea of drinking in a bar with only a four-star rating on Google, who, in order to plan a weekend getaway, requires a prolonged and extensive operation that involves several spreadsheets. You know the type. Maybe you even are the type.

Dan is that kind of person. He is also from California (perhaps not a coincidence that this is the home of both the eternal quest for self-optimization and the internet ecosystem that underpins the explosion in ratings culture), works in tech and has a keen appreciation for the finer things in life. He is known throughout our friend group for his fastidiously curated lists of restaurants, bars and even specific menu items; his near refusal to venture into establishments that are anything less than excellent; and his hours spent trawling reviews for everything from mini-fridges to trail shoes. When I questioned him on a rumor that I heard recently, that he had been freezing packets of “the best butter” to bring back from France, he confirmed it was true.

Dan may be an extreme case, but a dampened version of his instinct permeates our culture. If it didn’t, the raft of articles solemnly decreeing this year’s superlative vacuum cleaners simply would not exist.

I had been wondering about where this culture of ratings and rankings came from and how it came to take over our lives; how even the least exciting consumer choices are framed in terms of elusive state-of-the-art options; and, conversely, how necessarily subjective things (novels, colleges, where to live) are increasingly presented as consumer choices for which there is an objective “best.”

I thought Dan could shed some light on what the pursuit of “the best” really means. But when I asked him, I learned that the motivation behind his “quest for the best” wasn’t what I expected. It wasn’t about snobbery, or even, really, self-optimization.

“I think the really important thing to me — which is probably not a healthy thing — is I want to make sure the people I’m with have the best time possible,” Dan told me. “And this comes down to not just going to the restaurant, but even ordering as well. Like, ‘Are you sure you want to get that dish? Based on the other things we’re getting, is that the right thing to get?’ It extends through the whole meal.” To me it actually sounds, instead, as if Dan is trying to guarantee something closer to happiness. But can happiness really be found in a packet of butter? (...)

There is a way of talking about the psychedelic, hysterical effect of the information glut produced by the internet that tends to exaggerate the nefariousness of certain elements — like repeatedly being shown advertisements for things you’ve already bought — while minimizing how chaotic and messy it all feels. You could, for example, say that these lists are a product of “ratings derangement syndrome” and then say something like “In a world where our tech overlords manipulate their distraction vortex to shovel us into the slobbering maw of capitalism, ratings offer the illusion of taking back a semblance of control.”

Maybe. But to me, anyway, the experience of shopping for a hair dryer online feels less like being a pawn in a Matrix-esque mind-control operation and more like being trapped inside a box of plastic toys that have all been wound up so they constantly chatter and clatter against one another. The reality is I just want to spend as little time as possible in that box, while also hopefully buying something that won’t break the second time I use it. Best-of lists and rankings can seem like a simple solution to this problem.

There are a lot of areas where people don’t feel as if they have expertise, and ratings “get rid of that feeling of not being competent,” Rick Larrick, a professor of management and organization at Duke University, told me. (...)

I’d venture that not many people (possibly not any people) are able to tell the difference between the top toilet cleaner and the second best — or even the fifth or sixth. This suggests that the appeal of the best is not really about a simple difference in the quality of the product, but more about a feeling: of reassurance, maybe; of having won, having got the right thing.

Dan’s proclivities would place him squarely as “a maximizer,” a category of consumer invented by Barry Schwartz, an emeritus professor of psychology at Swarthmore College and the author of “The Paradox of Choice,” which examines the detrimental side of endless consumer options. Dr. Schwartz defines people who are happy to settle for something that will probably be pretty good (a restaurant with above average, but not excellent, ratings; the third song they come to on a playlist; the midpriced toaster on the first page of Amazon) as “satisficers” and those who search exhaustively for the best version as “maximizers.” (Many people who are generally satisficers will have certain things that bring out their inner maximizer. In other words, we all have an inner Dan Symons.) (...)

The people I know who broadly seem most content with their lives have adopted the satisficer’s mind-set. When my hairdresser (a man of infinite wisdom) suggested I get a heat protection spray and I asked which one, he imparted some advice in this vein: “Oh, it doesn’t really matter,” he said. “Just don’t spend £3, but don’t spend £25 either.” (The latter is the price of the brand the salon sells.) “Nobody really needs that. Spend around £7. That should be all right. Yes, always stick to the midrange. I should be a salesman!” He laughed. “Although not for the one we actually sell.”

Dr. Schwartz goes further: He has found that those who are on the higher end of the maximizing scale not only have a harder time making decisions but also are less satisfied with the decisions they do make. They’re also more likely to be borderline clinically depressed, he told me. “So it’s really not doing anyone a favor,” he said.

But the sensible attitude — as obvious as it may sound — can be hard to put into practice. Or maybe it’s easier to put into practice with hair spray than it is with, say, a “perfect vacation” or the “optimal time to retire.”

by Rachel Connolly, NY Times |  Read more:
Image: Tracy Ma

Thursday, May 18, 2023

Bad Manors

The street I grew up on in Moore County, North Carolina, is unrecognizable now. What was once a mix of modest, low-slung ranch-style houses interspersed with pockets of turkey oak scrub has been invaded by gargantuan homes with equally oversized trucks parked in the driveway. They tower over their older neighbors at a tragicomical scale difficult to convey, each identically crafted for maximum cheapness and interchangeability. Behold the McMansion in all its readymade, disposable grandeur.

Unlike the McMansions that predominated prior to the financial crisis—over-inflated, fake-stuccoed colonials festooned with some tacky approximation of European finery—the new iterations are whitewashed and modern, their windows undifferentiated voids. The compound hip roofs of the aughts have been replaced with peaky clusters of clumsy gables, a nod to the faux-folksy “modern farmhouse” trend ushered in ten years ago by HGTV. Moore County, meanwhile, is a prototypical American sprawl scenario: boundless, monotonous growth laying waste to what was once a network of stolid retirement communities orbiting the quiet resort town of Pinehurst. Who the hell are all these new interlopers? When I ask, my mother simply says, “They’re military.” Indeed, Moore County has become a de facto upscale exurb for high-level military personnel and civil servants working in nearby Fort Bragg. But it isn’t only happening in North Carolina: like something out of Invasion of the Body Snatchers, the McMansion has replicated, reduplicated, and overtaken the country overnight.

But what is a McMansion, exactly? Well, you know it when you see it. Take a look at any remotely desirable area on Zillow, sort by “new construction,” and you’ll see an endless array of them: bloated, dreary, amenity-choked domiciles. (...)

Seven years ago, I started the blog McMansion Hell to document—and deride—the endless cosmetic variations of this uniquely American form of architectural blight. I’ve mostly tackled prerecession McMansions, just for the novelty of houses both dated and perched on the ugly/interesting Möbius strip. But I worry that I’ve actually reinforced the idea that McMansions are a relic of the recent past. In fact, there remains a certain allure to these seemingly soulless suburban developments, and, more specifically, their construction and inhabitation. Increasing interest rates, inflation, and supply chain disruptions notwithstanding, the McMansion is alive and well. Far from being a boomtime fad, it has become a durable emblem of our American way of life.

McShitshow

McMansions began proliferating even before the term first appeared in the 1980s. (Meant to lampoon the graceless strivers of the nouveau riche, the portmanteau has no known direct origin.) But, it wasn’t until 2008 that the McMansion firmly imprinted itself on the national consciousness. Recall the endless newsreels of oversized, foreclosed houses that implied that the subprime mortgage crisis was caused not by the predatory lending institutions who foisted junk mortgages on inexperienced homebuyers but by the greedy poors who wanted more house than they could afford, all in order to imitate their idols on MTV Cribs. The McMansion did not cause the financial crisis; its role was negligible at best. But it became indelibly associated with debauched, prerecession excess—and, in the wake of the collapse, seemed as though it might become an anachronism, a memorial to a bygone housing bubble.

Nevertheless, once the economy began to recover, the McMansion quietly returned, albeit in more respectable costumes: a Disneyfied version of the Craftsman style, the Tudor, and today’s neutered modernism or its farmhouse equivalent. (...)

Reviewing the many case studies I’ve undertaken at McMansion Hell over the last seven years, it becomes clear that the McMansion, for all its garish variation, follows a consistent floorplan. A central foyer opens up on either side to a formal dining and sitting room, both rarely used outside of tense Thanksgiving dinners. Two-story McMansions feature a large, often curved, staircase that leads up to a mezzanine off of which the private rooms (bedrooms, offices) are located. On the first floor, the foyer empties into a large space for entertaining—a cavernous great room and an open kitchen, invariably with an outsized island, often with a breakfast nook. Off a secondary hallway is a master suite, purposefully distanced from all the other bedrooms; it is usually flanked by a sitting room and a decadent little bathroom. As square footage expands, so, too, do the amenities: a wet bar, a bonus or rumpus room in the basement, a gym, a den exclusively for watching television, a (decorative) library. These are merely tacked onto the existing core plan as the house metastasizes outward, upward, or both. The social structure of the nuclear heterosexual family permeates the plan. Rooms are excessively gendered, both for children and adults. Man caves and she sheds abound. (...)

McPocalypse Now

The real question now is, Who is still building, buying, and living in these houses? It is stubbornly difficult to nail down. According to Realtor.com, millennials are moving to the suburbs, where mortgages are often cheaper than urban rents. Boomers are downsizing for accessibility reasons, often competing with millennials for the same entry-level houses. Gen X—making up 22 percent of homebuyers—are now the ones “looking for larger, trade-up homes.” An American Home Shield survey indicates that the largest homes are being built in the West, in Utah and Colorado, with other concentrations forming in emerging tech hubs like Raleigh, North Carolina, and Austin, Texas. In essence, the only certainty is that when Americans get richer—through generational wealth transfer or through industry—they tend to seek out McMansions. When boomers die and bequeath their wealth to their children, those children will probably also build a bunch of McMansions.

Why? Some of the correlating factors are cultural, others architectural or material. For starters, you get more house for your money in the suburbs than in the city, where the price of land is astronomical. Buyers with children, but without the means to send them to private school, want to live in good school districts, which necessitates moving to wealthier neighborhoods on account of the American public school system’s entrenched racism and inequality. Architecturally speaking, the reason for the McMansion’s persistence is that it is the path of least resistance for building a house of a certain size. It’s hard to be efficient when forcing four thousand-plus square feet under one roof. Tailor-made architectural creations remain out of reach (or undesirable) for many people. The McMansion is a structurally stable, if visually clunky, formula. Contrary to almost four decades of urbanistic thought highlighting the need for walkability, density, and transit-oriented development, companies like Pulte Homes continue to construct McMansion neighborhoods near highway off-ramps and high-traffic arterial roads. They do this because people buy these houses and drive to work, and because building single-family homes doesn’t require suffering through rezoning battles or complying with extensive building code requirements, to name just two pesky bureaucratic hurdles of the plethora associated with multifamily residential development. Perplexingly, despite the ascent of interest rates that might otherwise deter buyers from procuring a mortgage, building McMansions remains immensely profitable. PulteGroup—which constructs housing under several subsidiaries, including Pulte Homes—made over $13 billion in 2021, and while that revenue encompasses a range of property types, McMansions are certainly among them. These are simple, crude realities.

The McMansion has also endured because, in the wake of the recession, the United States declined the opportunity to meaningfully transform the financial system on which our way of life is based. The breach was patched with taxpayer money, the system was restored, and we resumed our previous trajectory. The McMansion survived what could have been an existential crisis; it remains an unimpeachable symbol of having “made it” in a world where advancement is still measured in ostentation. It is a one-stop shop of wealth signifiers: modernist décor (rich people like modernism now), marble countertops (banks have marble), towering foyers (banks also have foyers), massive scale (everything I see is splendor). Owing to its distance from all forms of communal space, the McMansion must also become the site of sociality. It can’t just be a house; it has to be a ballroom, a movie theater, a bar.

It is a testament, too, to a Reagan-era promise of endless growth, endless consumption, and endless easy living that we’ve been loath to disavow. The McMansion owner is unbothered by the cost of heating and cooling a four-thousand-square-foot mausoleum with fifteen-foot ceilings. They see no problem being dependent—from the cheap material choice of the house to the driving requirements of suburban life—on oil in all its forms, be it in extruded polystyrene columns or gas at the pump. The McMansion is American bourgeois life in all its improvidence.

by Kate Wagner, The Baffler |  Read more:
Image: © José Quintanar

Pat Benatar


Source: YouTube

Heaven Has a Bathrobe-Clad Receptionist Named Denise

If you ask anyone on TikTok what happens when you die, there's a decent chance they'll put it this way: You appear in a waiting room. You're wearing a bathrobe. And you're greeted not by St. Peter or Mother Mary, but by a gum-snapping, keyboard-clacking New Yorker named Denise.

As heaven's receptionist, Denise will hand you a welcome packet and ask what you want your ghost outfit to be. She'll fill you in on heaven's amenities (there's a free margarita bar), and she'll likely leave you with a little bit of gossip, lowering her voice to gripe about Paul Revere's latest email (all caps, subject line: URGENT) or that time in the nail salon when Jackie Kennedy met Marilyn Monroe ("like two cats on a hot tin roof").

But for all her office-gal kvetching, Denise is a people person. When someone shows up in the waiting room with fear or confusion — having died too young or too soon — it's Denise who's there to scoop them up in a hug and show them all of heaven's silver linings.

And for the TikTokers watching along, she has become a tool for thinking through the afterlife — and for grieving those who've already made their way there.

The real Denise is a 26-year-old pageant queen

Though arguably just as poignant as The Good Place or Field of Dreams, the world of heaven's reception is a lo-fi, short-form experience. And like most TikTok series, it's the imaginings of one person alone: Taryn Delanie Smith.

The 26-year-old, better known as @taryntino21, considers herself first and foremost a content creator — she has gained 1.2 million TikTok followers in two years of posting. But she's an offline celebrity in her own right as well, having been crowned 2022's Miss New York and runner-up in the Miss America competition.

But before Smith had any sort of platform, she herself was a receptionist, working long hours to pay her way through a master's degree in international communication. It's that experience that she pulls from to inform Denise's character.

"I got promoted to the call center eventually, which was definitely not the promotion I thought it'd be," Smith said in an interview with NPR.

Even heaven's receptionist has to go through the same mundane daily dramas as any earthly office worker.

There's the slew of entitled folks who think they deserve the Angel Premium Plus package but are short on the cost: 7,899 good deeds. But then there's the creepy resident with red eyes who keeps abusing a downstairs pass to terrorize a suburban family.

"Why can't we just let women do it all?"

It's these types of creative, world-building details that keep Smith's audience so hooked. But like all great ideas, Denise's character was born in the least grandiose of ways — as a stray thought in the shower.

"I was standing there thinking, 'If I die in a chicken suit, then I have to wear the chicken suit forever.' Can you imagine a ghost coming to you in a chicken suit?" Smith said. "And I just couldn't stop giggling."

She hopped out of the shower and into a robe and towel, found the first stock image of heaven that came up on Google and made what she thought would be the stupidest video on the internet.

Today, the heaven's receptionist videos have been viewed over 37 million times on Smith's TikTok page, and at least 22 million times on other platforms. Smith gets recognized on the street as Denise more often than she does as Miss New York.

by Emily Olson, NPR |  Read more:
Image: Screenshot by NPR/TikTok @taryntino21
[ed. I wish I could embed her TikTok videos here, but alas...nope. Anyway, click on the NPR link for some examples, or her TikTok page (The Heaven Receptionist). Funny. For something completely different (but still TikTok-y), see also: Lemon8 Is for the (Hot, Rich) Girlies (PW):]

"Our timelines are populated with raw posts from unextraordinary people with a few diamonds mixed into the rough.

Lemon8 is different. It’s all diamond, no rough. Even the upper crust of TikTok finds it intimidating:"

Skin-Colonizing Bacteria Create Topical Cancer Therapy in Mice

While studying a type of bacteria that lives on the healthy skin of every human being, researchers from Stanford Medicine and a colleague may have stumbled on a powerful new way to fight cancer.

After genetically engineering the bacteria, called Staphylococcus epidermidis, to produce a tumor antigen (a protein unique to the tumor that’s capable of stimulating the immune system), they applied the live bacteria onto the fur of mice with cancer. The resulting immune response was strong enough to kill even an aggressive type of metastatic skin cancer, without causing inflammation.

“It seemed almost like magic,” said Michael Fischbach, PhD, an associate professor of bioengineering. “These mice had very aggressive tumors growing on their flank, and we gave them a gentle treatment where we simply took a swab of bacteria and rubbed it on the fur of their heads.”

Their research was published online April 13 in Science. Fischbach is the senior author, and Yiyin Erin Chen, MD, PhD, a former postdoctoral scholar at Stanford Medicine, now an assistant professor of biology at the Massachusetts Institute of Technology, is the lead author.

Skin colonizers

Millions of bacteria, fungi and viruses live on the surface of healthy skin. These friendly colonists play a crucial role in maintaining the skin barrier and preventing infection, but there are many unknowns about how the skin microbiota interacts with the host immune system. For instance, unique among colonizing bacteria, staph epidermidis triggers the production of potent immune cells called CD8 T cells — the “killer” cells responsible for battling severe infections or cancer.

The researchers showed that by inserting a tumor antigen into staph epidermidis, they could trick the mouse’s immune system into producing CD8 T cells targeting the chosen antigen. These cells traveled throughout the mice and rapidly proliferated when they encountered a matching tumor, drastically slowing tumor growth or extinguishing the tumors altogether.

“Watching those tumors disappear — especially at a site distant from where we applied the bacteria — was shocking,” Fischbach said. “It took us a while to believe it was happening.”

The mystery of the T cells that do nothing

Fischbach and his team didn’t start out trying to fight cancer. They wanted to answer a much more basic question: Why would a host organism waste energy making T cells designed to attack helpful colonizing bacteria? Especially as these T cells are “antigen-specific,” meaning each T cell has a homing receptor that matches a single fragment of the bacterium that activated it.

Even stranger, the CD8 T cells induced by naturally occurring staph epidermidis don’t cause inflammation; in fact, they appear to do nothing at all. Most scientists thought colonist-induced T cells must be fundamentally different from regular T cells, Fischbach said, because instead of traveling throughout the body to hunt for their target, they seemed to stay right below the skin surface, somehow programmed to keep the peace between bacteria and host.

by Hadley Leggett, Stanford Medicine |  Read more:
Image: Arif Biswas/Shutterstock.com

Wednesday, May 17, 2023

Jump

Back in elementary school a ‘scientific theory’ hit the playground that blew my mind: if every person in China jumped at the same time, their impact would knock our planet off its axis and the world would end.

I was always a sort of gullible person. I think I just liked to believe things, and in them. But this idea really captured me. It was the frightening image of it first, every single person in a country doing the same thing, at the same time. I didn’t even feel comfortable in Catholic Mass when the monotonous, somber group prayer started. But at the scale of a billion people? I used to watch a lot of Star Trek with my dad, and this was Borg shit. It was also just confusing on a practical level because the billion jumpers weren’t drones. They were people, just like me, and I didn’t want to die. Why would they? Naturally, I assumed, they’d have to be fooled into doing it by a megalomaniacal supervillain. But how could he pull it off?

Information traveled differently in the nineties, and more slowly. To succeed at a scam so spectacular as the Jump, the time and place of the apocalyptic act would have to be announced by broadcast days in advance, and it would have to be framed as something not only beneficial, but essential. This would be the only way for the instructions to make it to the billion people required, and for them to go through with it. But by the time the information reached them, there would be an enormous media reaction. There would be counter information. There would be experts on planet stuff, probably, and they would tell people this was dangerous. If the megalomaniacal Jump enthusiast pirated a television signal (supervillains loved to do this), he could trick as many people as were watching a single, live broadcast. But hundreds of millions of people? Billions? Instantaneous, global mass hysteria was just not possible, let alone the direction of that hysteria to some particular end. I could rest easy, I decided, and it was back to my dreams of the Starship Enterprise.

But a lot has changed since 1993.

Today, almost half the global population is connected to the internet by the supercomputing smartphones that live in our pocket. That’s 3.5 billion people. More significantly, the way we access “news,” or live information about the world, has paradigmatically changed...

Ubiquitous mobile internet dramatically increased our immersion in media, but ubiquitous social media dramatically increased the speed at which ideas travel and, perhaps more significantly, deeply socialized the dynamic. We no longer learn about the world from institutions, or even the illusion of them. We learn about the world from people we care about. This binds our sense of truth to tribal identity, and that is a powerful, fundamentally emotional connection. It’s also now operating at the scale of a planet. Today, a single piece of information — a tweet from your president, an update from the World Health Organization, video footage of police brutality — is polarized and shared across our social network. From there, it can reach hundreds of millions of people, often furious, in less than an hour.

Jump...

Not every revolution is a net disaster, just most of them. Political violence around the world has far more often led to destruction and widespread human misery than it has to peace and prosperity. France, Russia, China, Cuba, Venezuela, countless nations of the Middle East, and Africa — for most people in most nations on this planet, throughout most of recorded history, revolution has preceded authoritarianism, poverty, and death. Americans have a unique blindness to the subject, as our own violent insurrection preceded directly the founding of our nation, the most stable liberal government in history, and that story is a central part of our mythology. We are a prosperous, heroic country, and we credit our existence to a righteous founding war for freedom. But history is more complicated than legend. The U.S. Founding Fathers did not just change their government. On victory they set immediately to separating powers and guaranteeing that future change, while possible and expected, would come slowly in increments. Today the word “democracy” is sacrosanct among Americans, but we don’t and never have had a democracy. This is an absence by design. An inherently unstable form of government, our Founding Fathers believed, without exception, democracy would lead to chaos, and that chaos would lead to tyranny. The architects of our nation therefore designed a democratic republic, with a representative democracy, and at founding that looked a lot like a system of firewalls between masses of people and power. Local leaders elected state leaders, and state leaders elected national leaders. With our rules for political change themselves drafted in such a way as redrafting them would be slow and difficult, it was checks and balances all the way down. The United States does not owe its prosperity to dramatic change, but to an historically rare stability.

Even absent social media, the speed at which rapid political change is possible in America has been accelerating for two centuries. Checks have eroded. Balances have become less balanced. At the same time, the federal government has grown more powerful, and the executive branch commands more of that power than ever. For years, support from the political establishment, itself a kind of moderating function, has not been entirely necessary to succeed in presidential politics. It was only a matter of time before the weakness was exploited. In 2016, America elected a reality television star to the most powerful edifice of political power, at the head of the largest economy, and in command of the most powerful military, in human history. Today, beyond all doubt, anyone can be the president. But even with so unpredictable an office as our presidency the United States is a more stable nation than most. In addition to the genius framework of our government and a couple hundred years of binding, national identity, we are supported by a strong economy, abundant arable land, and friendly neighbors. A far more significant concern is we are now living in a world of smaller nuclear powers with fewer resources that are many of them one trending hashtag away from violent insurrection, and there is no telling what governments, or gangs, will take power in their place. The threat of a fallen nuclear state would of course affect us all. In this way, a meme-induced international mass hysteria would not even be necessary for global cataclysm. A national hysteria, in almost any corner of the world, would do just fine. But there will be international crises. Twitter may have started as a fun place to share jokes, but it has long since morphed into a virtual battleground for ideological war. While most of the conflicts are civil, at least a few have pit governments against each other, and such conflicts will undoubtedly proliferate. We have already watched national leaders threaten each other on the platform, in real time, egged on by crowds of millions. The question is not if a real war, in the physical world, can be started in this environment. We all know it can. Without some dramatic course correction, the question is only when.

Many people correctly intuit something is wrong with social media, and they wonder if it can be fixed with government regulation. It cannot. A federal law prohibiting all politicians at every level from sharing to the popular platforms would be a compelling, partial solution to the specific threat of state-backed, mob-initiated conflict. Legislation of this kind would also be positioned to survive a consumer shift to disintermediated, decentralized social media. But it would not address the central problem with social sharing at scale, and is anyway not the sort of regulation being prescribed. Our loudest regulatory enthusiasts are almost entirely censorship oriented, and they suspiciously tend to map their censorship prescriptions to their personal politics. This alone should be enough of a warning that we shut the notion down. Alas, the conversation rages on, and no one is focused on the principal issue. Content moderation is irrelevant. The greatest possible danger of social media is the catalyzation of mass, relatively instant global action on incomplete or incorrect information. It is true our next information disaster could conceivably take color from whatever sort of speech is at the moment socially unacceptable. But if an idea is already perceived as socially unacceptable to so dramatic a degree as top-down censorship of its discussion is politically feasible, it almost certainly lacks the cultural support for any kind of rapid global movement. The hysteria we’re most at risk of will likely relate in some misguided way to an idea most people already generally value, but it will also be, in some aesthetic sense, new. To be so swept up emotionally as one is moved to immediate physical action, in the physical world, a person must be either very scared or very angry, and the mundane inspires neither of these emotions. We’re not in danger of painful speech, we’re in danger of temporary madness, and the only madness we are existentially vulnerable to is almost impossible to predict with any kind of specificity. It is from this unpredictable madness we need protection. But how can we protect ourselves from an idea that doesn’t yet exist?

Anger is the binding agent of every mob, from the scale of a few to the scale of a few billion. It feels good to be angry, and when we’re in it we don’t want to let it go. Our greatest defense against madness, then, would be calming down while on some powerful, primal level wanting the opposite. This is something small groups of men have struggled with for as long as we’ve existed, but it has not been until the last few years that a single fit of rage could almost instantly infect the planet. Social media has been an integral part of culture for a period of time that represents seconds of human existence, and we have already seen the emergence of globally-destabilizing conflict because of it. Conflicts of this kind will continue to emerge, and there is no reason to believe we’ve seen the most destructive of them. For the first time in history, we actually have to find a way to manage our impulse toward meme-induced hysteria. At its simplest, a little mental hygiene might be helpful. The notion we all suffer from confirmation bias needs to be normalized, and discussed. When relaying some emotionally-charged story, it is worth relaying first how this kind of story makes you feel in general, and the sort of things you might be missing. Admittedly, in the fever of rage, this will be incredibly difficult. But what about the other end? When receiving a piece of information that evokes anger, could one reflect on the bias of a source, be it a journalist or a friend? Who is the bearer of this bad news, and what are their values? If you had to guess, how would you think they wanted this piece of information to make you feel? Angry? To what end? Getting comfortable with being wrong would also help, as would expecting people around us to be wrong. This, by the way, is something that happens more than it doesn’t. People are constantly wrong. Stories are constantly corrected. That we are not yet skeptical of every new piece of information we receive, with so much evidence all around us now that misinformation is not the exception but the rule, is indication that skepticism of this kind is simply not something we are meaningfully capable of on our own.

by Mike Solana, Pirate Wires |  Read more:
Image: uncredited
[ed. Many good examples in this piece, not including this one (below). AI panic seems to be in first place at the moment.]

Tuesday, May 16, 2023

Nitty Gritty Dirt Band

via:

The Worst Crime of the 21st Century

The United States’ war on Iraq remains the deadliest act of aggressive warfare in our century, and a strong candidate for the worst crime committed in the last 30 years. It was, as George W. Bush said in an unintentional slip of the tongue, “wholly unjustified and brutal.” At least 500,000 Iraqis died as a result of the U.S. war. At least 200,000 of those were violent deaths—people who were blown to pieces by coalition airstrikes, or shot at checkpoints, or killed by suicide bombers from the insurgency unleashed by the U.S. invasion and occupation. Others died as a result of the collapse of the medical system—doctors fled the country in droves, since their colleagues were being killed or abducted. Childhood mortality and infant mortality in the country rose, and so did malnutrition and starvation. Millions of people were displaced, and a “generation of orphans” was created, hundreds of thousands of children having lost parents with many being left to wander the streets homeless. The country’s infrastructure collapsed, its libraries and museums were looted, and its university system was decimated, with professors being assassinated. For years, residents of Baghdad had to deal with suicide bombings as a daily feature of life, and of course, for every violent death, scores more people were left injured or traumatized for life. In 2007 the Red Cross said that there were “mothers appealing for someone to pick up the bodies on the street so their children will be spared the horror of looking at them on their way to school.” Acute malnutrition doubled within 20 months of the occupation of Iraq, to the level of Burundi, well above Haiti or Uganda, a figure that “translates to roughly 400,000 Iraqi children suffering from ‘wasting,’ a condition characterized by chronic diarrhea and dangerous deficiencies of protein.” The amount of death, misery, suffering, and trauma is almost inconceivable. In many places, the war created an almost literal hell on earth. (...)

Very few mainstream criticisms of the war call it what it was: a criminal act of aggression by a state seeking to exert regional control through the use of violence. A great deal of this criticism has focused on the costs of the war to the United States, with barely any attention paid to the cost to Iraq and the surrounding countries.

Those who critique the execution are not actually opposing the crime of the war itself. When we apply to ourselves the standards that we apply to others, we see just how little principled opposition to the Iraq War there has actually been, and how little acknowledgement that the war was fundamentally wrong and immoral from the outset.

by Nathan J. Robinson and Noam Chomsky, Current Affairs |  Read more:
Image: uncredited
[ed. It still makes my blood boil. But hey, at least Condi finally broke through the male-only membership at Augusta National (The Masters). So there's that.]

The Mother Who Changed

A Story of Dementia

In the philosophical literature on dementia, scholars speak of a contest between the “then-self” before the disease and the “now-self” after it: between how a person with dementia seems to want to live and how she previously said she would have wanted to live.

Many academic papers on the question begin in the same way: by telling the story of a woman named Margo, who was the subject of a 1991 article in The Journal of the American Medical Association (JAMA), by a physician named Andrew Firlik. Margo, according to the article, was 55 and had early-onset Alzheimer’s disease and couldn’t recognize anyone around her, but she was very happy. She spent her days painting and listening to music. She read mystery novels too: often the same book day after day, the mystery remaining mysterious because she would forget it. “Despite her illness, or maybe somehow because of it,” Firlik wrote, “Margo is undeniably one of the happiest people I have known.”

A couple of years after the JAMA article was published, the philosopher and constitutional jurist Ronald Dworkin revisited the happy Margo in his 1993 book, “Life’s Dominion.” Imagine, he asked readers, that years ago, when she was fully competent, Margo had written a formal document explaining that if she ever developed Alzheimer’s disease, she should not be given lifesaving medical treatment. “Or even that in that event she should be killed as soon and as painlessly as possible?” What was an ethical doctor to do? Should he kill now-Margo, even though she was happy, because then-Margo would have wanted to be dead?

In Dworkin’s view, it was then-Margo whose wishes deserved moral weight. In his book, he made a distinction between two kinds of interests: “experiential” and “critical.” An experiential interest was reactive and bodily: the pleasure of eating ice cream, say. A critical interest was much more cerebral; it reflected the character of a person and how she wanted her life to be lived. In the case of advanced Alzheimer’s disease, Dworkin argued, there is a danger that critical interests will be usurped by experiential ones. Still, it was the critical interests, previously stated, that deserved to be satisfied, because it was those interests that gave human life its meaning and its dignity — and even made it sacred, in a kind of secular way. A person was respected if she was helped to live out her chosen course, not if her life trajectory was allowed to be derailed by the amnesiac whims of her diseased self.

Some philosophers have devoted themselves to reconsidering Margo. They accuse Dworkin of holding too limited a view of meaning. Couldn’t a life of tiny pleasures be meaningful, even if it wasn’t the product of some sophisticated life plan? Critics have asked why we should privilege the decisions of a person who effectively no longer exists over the expressed choices of the person who is sitting before us, here and now. On a practical level, what authority could the then-self possibly exert over the now-self?

And while Dworkin’s theory might apply to those in the advanced stages of the disease, it speaks less to a majority of patients in the mild and moderate phases. The in-between Margos. Dworkin’s theory also distinguishes between selves in a way that strikes some critics as misguided. A person is not like Plutarch’s Ship of Theseus: replaced, plank by plank, over the course of her voyages, leaving those aboard to wonder if she is still the old ship or instead a new one — and, if she is a new one, when exactly she ceased to be the other. A person always is and is not who she used to be.

Still, many adult children cling to an image of a parent’s then-self and work relentlessly to protect it. Adult children “tend to be confident leaning on the side of a Dworkin-type view,” says Matilda Carter, a postdoctoral fellow in philosophy at the University of Glasgow and a former dementia caregiver. They don’t want a parent’s confused, 11th-hour choices to “tarnish the legacy of her life beforehand.” (...)

In our own lives, we insist on the right to make our own choices, even bad ones — what is sometimes called “the right to folly.” As independent agents, we are free to be unreasonable and unwise and to act against our own best interests: maybe because of flawed reasoning, or just because we want to. But with older relatives, we often insist on prudence over passion. “Ageism,” warns a 2016 paper in American Psychologist, “exacerbates the tendency to overprotect older adults.” In the end, this can mean that older people are held to a higher standard than everyone else; they are not allowed to choose poorly.

Any degree of cognitive impairment muddies these waters. When a person with dementia makes a decision that seems misguided, we might assume that the choice is not just bad but pathologically bad: a result of a cognitive failing. Eventually, each new decision — each expression of will — becomes suspect. Is this choice coming from Mom or from her disease? If the former is true, the decision should be honored; if the latter is true, perhaps it should be thwarted. But as the disease progresses, this effort at cognitive sorting becomes less tenable, because how do you separate a person from her diseased brain anyway? The more advanced a person’s dementia, the more her every choice becomes disputable, and thus worthy of custodial intervention.

“The question becomes, for the older adult, what are the barriers to evolving, to changing your opinions, to forming new relationships?” asks Nina Kohn, a law professor at Syracuse University with a specialty in the civil rights of older people. “When you form these new relationships, does that trigger people trying to remove your rights? The answer is: In some cases, it does.”

by Katie Engelhart, NY Times Magazine | Read more:
Image: uncredited/Facebook

James Booker

The Greatest Piano Player You’ve Never Heard (CA)

Booker deserves to be as acclaimed and widely known as Beethoven. The fact that he isn’t may have something to do with the fact that Booker did not play in the royal courts of Vienna. He was a gay, Black heroin addict who played New Orleans dive bars, and never had a hit record (though he played on the hits of others, including Ringo Starr, the Doobie Brothers, and Fats Domino). (...)

Booker had a tough life. When he was 9 years old, he was hit by an ambulance traveling 70 miles an hour. Given morphine to deal with the pain of his injuries, he became an addict, and he was repeatedly busted on drug charges, even serving a stint in Louisiana’s infamous Angola prison. Later in his life, Booker was appreciated in Europe, and some of the best recordings of his work are from live performances at jazz events in Switzerland and Germany. But he spent most of his career drifting around New Orleans, playing its various pianos.
 
One of those pianos was at the district attorney’s house. The DA was an amateur singer, and allegedly offered an arrangement whereby Booker would avoid prison time for drug charges if he would give piano lessons to the prosecutor’s son. The DA, Harry Connick Sr., would become infamous for putting innocent people behind bars, and one of his wrongful convictions led to a major Supreme Court case, Connick v. Thompson.

The son of that infamous DA, the one who received Booker’s piano coaching, was Harry Connick Jr. He would go on to become one of the leading jazz pianists in the world, selling 30 million records and winning three Grammy Awards. (...)

I am not a music writer, so it is beyond my capacity to evocatively describe his sound. All I can say is it’s some of the best piano playing I’ve ever heard, and I think James Booker deserves to be considered the finest piano player of the 20th century.

I think other piano players like Toussaint and Dr. John would agree with that. Connick Jr. himself has said that “There’s nobody that could even remotely come close to his playing ability…I’ve played Chopin Etudes, I’ve done the whole thing, but there is nothing harder than James,” and “he did innumerable things that would be considered revolutionary.”

by Nathan J. Robinson, Current Affairs |  Read more:
Image: uncredited
[ed. See also: The Ed Sheeran Copyright Lawsuit Exposes The Absurdity of Music Ownership (CA) -  especially now that AI is involved (see below).]

Monday, May 15, 2023

AI Grimes


AI Grimes banger drops. Last week I wrote about Grimes’ proposition to anon music producers around the world: produce a banger with her voiceprint, and she’ll split any revenue it generates with the person who made it (she detailed the proposal here). This week a viable banger candidate dropped, though to my ears there’s still some work to do on getting her sound right. Oh, and the lyrics are written by GPT. (@nickwebb)
via:

In the realm of the unseen, 
Where the colors blur and bleed, 
A ghostly whisper calls my name, 
From the ether, I reclaim... 

We are bound by frequencies, 
Synchronicity, secrets we keep, 
Our souls entwined, a cosmic dance, 
In the ether, we advance.

[ed. Scary. Also from the same source (Pirate Wires):]

Excellent GPT usecase demo: saving you money. Really impressive demo from Joshua Browder of GPT-lawyer fame — after he gave AutoGPT access to all his financial accounts and documents, it began to (allegedly) save him money in interesting and unexpected ways. If this is real, I can’t wait until his company DoNotPay’s plugin hits GPT. (@jbrowder1) Link

American Graffiti Revisited


[ed. Watched this again last night for the first time in years (Netflix). What a great movie (like Jaws, Jurassic Park, Butch Cassidy, Back to the Future, etc. - everything fits perfectly). I miss cruising.]

What We Lose When We Push Our Kids to ‘Achieve’

When I was 12, I disappeared into my bedroom with a $40 folk guitar and a giant book of Beatle songs, with elementary, large-type “E-Z” chord diagrams to follow. I had no musical gift, as a series of failed music lessons had assured me — it was actually the teachers who assured me; the lessons were merely dull — and no real musical training. My fingers stung as I tried to press down on the strings without making them buzz and my left hand ached as I tried — and for a long time failed — stretching it across the neck. Nonetheless, I worked my way through “Rain” (abbreviated to two chords) and “Love Me Do” (three) and finally “Yellow Submarine” (four chords, or was it five?) and discovered by myself the matchless thrill of homemade musical harmony.

No one asked me to do this, and surely no one was sorry the door was closed as I strummed and stumbled along after the nirvana of these simplified songs. But the sense of happiness I felt that week — genuine happiness, rooted in absorption in something outside myself — has stayed with me.

Fifty years later, I am still not a very good guitar player, but that week’s work, and the months and years of self-directed practice on the instrument that followed it, became a touchstone of sorts for me, and a model and foundation for almost every meaningful thing I’ve done since. It gave me confidence, often wavering but never entirely extinguished, that perseverance and passion and patience can make one master any task.

So it seems suitable at this season, as the school year ends and graduates walk out into the world, most thinking hard about what they might do with their lives, to talk about a distinction that I first glimpsed in that room and in those chord patterns. It’s the difference between achievement and accomplishment.

Achievement is the completion of the task imposed from outside — the reward often being a path to the next achievement. Accomplishment is the end point of an engulfing activity we’ve chosen, whose reward is the sudden rush of fulfillment, the sense of happiness that rises uniquely from absorption in a thing outside ourselves.

Our social world often conspires to denigrate accomplishment in favor of the rote-work of achievement. All our observation tells us that young people, particularly, are perpetually being pushed toward the next test, or the “best” grammar school, high school or college they can get into. We invent achievement tests designed to be completely immune to coaching, and therefore we have ever more expensive coaches to break the code of the non-coachable achievement test. (Those who can’t afford such luxuries are simply left out.) We drive these young people toward achievement, tasks that lead only to other tasks, into something resembling not so much a rat race as a rat maze, with another hit of sugar water awaiting around the bend, but the path to the center — or the point of it all — never made plain. (...)

I know there are objections to his view: At some moment, all accomplishment, however self-directed, has to become professional, lucrative, real. We can’t play with cards, or chords, forever. And surely many of the things that our kids are asked to achieve can lead to self-discovery; taught well, they may learn to love new and unexpected things for their own sake. The trick may lie in the teaching. My sister Alison Gopnik, a developmental psychologist and author, puts this well: If we taught our kids softball the way we teach them science, they would hate softball as much as they hate science; but if we taught them science as we teach them softball, by practice and absorption, they might love both.

Another objection is that accomplishment is just the name people of good fortune give to things that they have the privilege of doing, which achievement has already put them in a place to pursue. But this is to accept, unconsciously, exactly the distinction between major and minor, significant and insignificant tasks, that social coercion — what we used to call, quaintly but not wrongly, “the system” — has always been there to perpetuate.

Pursuit of a resistant task, if persevered in stubbornly and passionately at any age, even if only for a short time, generates a kind of cognitive opiate that has no equivalent. There are many drugs that we swallow or inject in our veins; this is one drug that we produce in our brains, and to good effect. The hobbyist or retiree taking a course in batik or yoga, who might be easily patronized by achievers, has rocket fuel in her hands. Indeed, the beautiful paradox is that pursuing things we may do poorly can produce the sense of absorption, which is all that happiness is, while persisting in those we already do well does not.

by Adam Gopnik, NY Times |  Read more:
Image: Millennium Images/Gallery Stock


Joro Chen (?), two men wearing headphones, crowded jazz club, small explosions around them, in the style of Nighthawks by Edward Hopper, via DALL-E
via: (The Active Voice)

Sunday, May 14, 2023

Leaving the Cradle

Image: NASA’s Cassini spacecraft during one of its final dives between Saturn and its innermost rings (source: NASA/JPL-Caltech — Public Domain)

Humanity has always wondered what is over the next hill, around the bend, on the other side of the river, or across the sea; and now we are finally at a point where our technology has caught up to our natural inclination and desire to ‘go and see for ourselves.’ (...)

We started off strong. A mere 66 years after the Wright brothers made their historic flight, Neil Armstrong and Buzz Aldrin set foot on the Moon. The 1969 landing on our nearest celestial neighbor marked a turning point in human history. No longer were we confined to the planet that gave us birth; we could step out and start to see for ourselves what wonders, treasures and challenges await us outside the borders of our first and still only home. Sadly, it was not to be. After the triumph of Apollo 11, only 6 more manned missions to the Moon took place; and in 1972 the last of the only 12 humans to have ever set foot on another world left and came home. We have yet to return. (...)

Now that finally seems to be changing.

Scientists Dramatically Extend Cell Lifespan in Anti-Aging Breakthrough

Scientists have achieved a significant breakthrough in the effort to slow the aging process with a novel technique that increased the lifespans of yeast cells by a whopping 82 percent, reports a new study.

By programming cells to constantly switch between two aging pathways, researchers were able to prevent them from fully committing to either deteriorative process, a method that nearly doubled the lifespan of the cells. In other words, rather than the entire cell aging at once, the aging process was toggled between different physical parts of the organism, extending its life. This synthetic “toggle switch” offers a potential roadmap toward treatments that could one day extend human longevity, though that future is highly speculative at this time.

We are born, we age, we die—so goes the story of humanity since time immemorial. However, this familiar progression could be shaken up by enormous advances in genetics that have opened new windows into the underlying biological mechanisms that cause us to age, raising the possibility that they could be rewired to extend our lifespans.

Now, a team of scientists at the University of California, San Diego (UCSD), have developed a new solution to this age-old problem that essentially tricks cells into waffling between two common deteriorative processes in cells. Using synthetic biology, the researchers genetically reprogrammed a circuit that chooses between these divergent paths toward death, causing it to constantly oscillate between its fates instead of actually dedicating itself to one.

These “oscillations increased cellular lifespan through the delay of the commitment to aging,” a result that establishes “a connection between gene network architecture and cellular longevity that could lead to rationally designed gene circuits that slow aging,” according to a study published on Thursday in Science.

“The circuit resembles a toggle switch that drives the fate decision and progression toward aging and death,” said Nan Hao, a professor of molecular biology at UCSD and a senior author of the study, in an email to Motherboard.

“Once the fate of a cell is determined, then it will have accelerated damage accumulation and progression to death,” continued Hao, who also serves as co-director of UCSD’s Synthetic Biology Institute. “[I]t became obvious to us that if we could rewire this naturally-occurring toggle switch circuit to an oscillator, it will make the cell to cycle between the two pre-destined aging paths and prevent the cell from making this fate decision toward deterioration and death, and it will make the cell live longer.”

Hao and his colleagues have been working on cellular aging for seven years, focusing on the concept of an oscillator for much of that time. In 2020, the researchers published another study, also in Science, that identified two major fates for budding yeast cells. About half of the cells they observed aged due to the deterioration of structures within the cellular nucleus, a cluster that holds most of an organism’s genome. The other half aged when the energy production units of the cell, known as mitochondria, started to break down over time.

Those observations transformed the oscillator concept from “an abstract idea to an executable idea,” Hao said. To build on the 2020 findings, the researchers used computer simulations of aging-related genetic circuits to develop a synthetic strain that could spark the desired feedback loop between the nucleolar and mitochondrial aging processes. In the new study, they introduced the synthetic oscillator into cells of the yeast species Saccharomyces cerevisiae, a model organism that has already shed light on many of the genetic factors that influence longevity in complex organisms, such as humans.

The approach resulted in an 82 percent increase in the lifespan of cells with the synthetic oscillators, compared with a control sample of cells that aged under normal circumstances, which is “the most pronounced life-span extension in yeast that we have observed with genetic perturbations,” according to the study.

“A major highlight of the work is our approach to achieve longevity: using computers to simulate the natural aging system and guide the design and rational engineering of the system to extend lifespan,” Hao told Motherboard. “This is the first time this computationally-guided engineering-based approach has been used in aging research. Our model simulations actually predicted that an oscillator can double the lifespan of the cell, but we were happily surprised that it actually did in experiments.”
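
[ed. To make the "toggle switch vs. oscillator" idea concrete, here is a minimal, purely illustrative Python sketch; it is not the UCSD team's circuit or code. It uses two textbook motifs built from the same Hill-type mutual-repression part, a two-gene toggle switch and a three-gene repression ring, with made-up parameter values. The toggle settles with one gene high and the other silenced (it "commits"), while the ring keeps cycling for the whole run; the point is only that the wiring decides whether the circuit commits to a fate or keeps oscillating.]

# Illustrative only: textbook gene-circuit motifs, NOT the UCSD synthetic oscillator.
import numpy as np
from scipy.integrate import solve_ivp

ALPHA, N = 50.0, 3.0  # promoter strength and Hill coefficient (made-up values)

def repress(p):
    """Hill-type repression: production falls as the repressor level p rises."""
    return ALPHA / (1.0 + p**N)

def toggle(t, y):
    # Two genes repressing each other: a bistable switch that "commits" to one fate.
    a, b = y
    return [repress(b) - a, repress(a) - b]

def ring(t, y):
    # Three genes in a repression ring: the same parts rewired so no state can win,
    # so the system cycles indefinitely instead of committing.
    a, b, c = y
    return [repress(c) - a, repress(a) - b, repress(b) - c]

t_eval = np.linspace(0, 60, 2000)
tog = solve_ivp(toggle, (0, 60), [1.1, 1.0], t_eval=t_eval)   # tiny initial bias toward gene A
osc = solve_ivp(ring, (0, 60), [1.1, 1.0, 0.9], t_eval=t_eval)

print("toggle end state (committed, one high / one low):", np.round(tog.y[:, -1], 2))
tail = osc.y[0, len(t_eval) // 2:]  # gene A over the second half of the run
print("ring gene A peak-to-trough late in the run:", round(float(tail.max() - tail.min()), 2))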

The study is part of a growing corpus of mind-boggling research that may ultimately stave off some of the unpleasant byproducts of aging until later in life, while boosting life expectancy in humans overall. Though countless hurdles have to be cleared before these treatments become a reality, Hao thinks his team’s approach could eventually be applied to humans.

by Becky Ferreira, Vice | Read more:
Image: Kateryna Kon/Science Photo Library via Getty Images
[ed. See also: Vice, Decayed Digital Colossus, Files for Bankruptcy (NYT).]

Runway Brings AI Movie-Making to the Masses


Runway AI, the generative AI company whose video editing tools were critical to making "Everything Everywhere All at Once," has made parts of its toolkit available to the public — so anyone can turn images, text or video clips into 15-second reels.

Why it matters: There's grave and justifiable concern about AI's potential for abuse. But the flip side is that tools like Runway's are unleashing all manner of artistic creativity and giving rise to new AI-generated art forms.

Driving the news: Runway just granted public access to its image-to-video model, Gen-1, and, in a more limited release, to its text-to-video product, Gen-2.
  • Gen-1 is available on a mobile app that lets users take or upload a picture or video and instantly generate a short video clip based on it — applying filters like "Claymation" or "Cloudscape."
  • Gen-2, which Runway says is the first publicly available text-to-video product, is available now in limited form via chat platform Discord (and will soon be available more widely).
  • These fun and highly preliminary features are just a small, public-facing snapshot of the powerful suite of back-end tools available to video professionals who've been drawn to Runway's products like catnip.

Backstory: Runway was a co-creator of Stable Diffusion, a breakthrough text-to-image generator.
  • Runway's creative suite has been a loosely kept secret among filmmaking cognoscenti, who use it to save countless hours on editing tasks traditionally done by hand.
  • The company's AI Magic Tools let users edit and generate content in dozens of ways — by removing backgrounds from videos, erasing and replacing parts of a photo, or blurring out faces or backgrounds, for example.

What they're saying: Runway's technology is like "having your own Hollywood production studio in your browser," Cristóbal (Cris) Valenzuela, the company's CEO, tells Axios.
  • Its tools — and others, like ChatGPT — are "democratizing content creation" and "reducing the cost of creation to almost zero," he said.

Where it stands: Runway's tools have been used to help produce "The Late Show with Stephen Colbert," build sequences for MotorTrend's "Top Gear America," design shoes for New Balance, and make videos for Alicia Keys.
  • And they're a growing staple in creating visual effects (VFX) for amateur and professional films.

by Jennifer A. Kingson, Axios |  Read more:
Images: Aïda Amer/Axios; Runway AI