Friday, July 26, 2024

The Next New Thing

I served for a decade on the jury of the Richard H. Driehaus Prize, awarded each year to the architect who best represents the values of traditional and classical design. As Martin C. Pedersen observed recently on his website Common Edge: “The Driehaus is architecture’s traditional-classical design version of the Pritzker Prize. Although it comes with a hefty $200,000 check—twice the size of the Pritzker’s honorarium—and previous winners include such luminaries as Robert A. M. Stern, Michael Graves, Léon Krier, and Andrés Duany and Elizabeth Plater-Zyberk, the award still exists in a sort of media vacuum.”

Pedersen is right. The design press pays scant attention to the Driehaus Prize, probably because its readers—the architectural mainstream—have little interest in traditional/classical architecture. Never mind that this approach accounts for countless private residences nationwide, as well as academic buildings, public libraries, concert halls, a federal courthouse, and a presidential library. One building that should have penetrated the media vacuum is 15 Central Park West, a luxury apartment building in Manhattan whose record-breaking commercial success gained it renown among real-estate mavens; the stately limestone façades consciously recall such prewar classics as the Apthorp and the Beresford.

But popular acclaim counts for little in the closeted architectural world. As the New York architect Peter Pennoyer, this year’s Driehaus winner, told Pedersen, “There is a deep-seated interest—if not delusion—in the idea that the avant-garde, the cutting edge, the next new thing is what we should all be concerned about, at the exclusion of history, tradition, community, and context.”

The next new thing. Précisément! It was a French book published in 1923 that sparked the attitude that Pennoyer describes. The author was a Swiss-born architect, Charles-Édouard Jeanneret, who had recently adopted the pen name Le Corbusier, and the book was his spirited manifesto, Vers une architecture (literally Towards an Architecture). The first English translation was titled Towards a New Architecture, adding a word that was not inaccurate, for despite including illustrations of ancient temples and Hadrian’s Villa and referring to Donato Bramante and Raphael, Le Corbusier’s book was determinedly forward looking in its message. “We do not appreciate sufficiently the deep chasm between our own epoch and earlier periods,” he proclaimed. “If we set ourselves against the past, we are forced to the conclusion that the old architectural code, with its mass of rules and regulations evolved during four thousand years, is no longer of any interest; it no longer concerns us; all the values have been revised; there has been revolution in the conception of what Architecture is.” Stirring stuff.

All the values have been revised. Le Corbusier made it sound as if the modern era, propelled by the Great War, represented an epochal moment, which it did in many ways, but he didn’t count on one thing: how rapidly things would change in the modern age. The ponderous Farman Goliath biplane that he featured in Vers une architecture was out of service in less than a decade, and the majestic Cunard ocean liner Aquitania, whose white superstructure served him as an architectural model, was decommissioned in 1950. Le Corbusier often used his own Voisin C7 automobile as a prop in photographs of the villas he had designed in the Paris suburbs—the boxy car and his boxy architecture were all of a piece. But less than a decade later, Gabriel Voisin introduced the C25 Aérodyne, a streamlined beauty that wasn’t boxy at all. Where did that leave the “new” architecture? Having cut itself off from the old rules and regulations, as Le Corbusier put it, it had no choice but to keep changing with the changing times.

Regularly reinventing architecture is exciting, but it faces a number of challenges. Architecture is an art, hence creativity is important, but it is an applied art, which makes it fundamentally empirical—that is, ruled not by theory but by experience. What works is worth repeating; what doesn’t, isn’t. As a result, building design has traditionally depended on rules of thumb: the dimensions of tread and riser that make for a comfortable stair, the pleasing proportion between the width of a room and its ceiling height, the shapes and details that ensure a roof doesn’t leak. The skill of the architect lies in knowing when to innovate and when to stick to the tried and true.

Architects rushing to discover the next new thing tend to undervalue the tried and true. Willfully ignoring experience and implementing untested new solutions can be risky, as I. M. Pei, a talented architect, found after he designed the East Building of the National Gallery in Washington, D.C. The exterior of the new wing, completed in 1978, is covered in pink Tennessee marble to match the old gallery. There is nothing particularly novel about using marble as a cladding. The ancient Romans covered the brick Pantheon with marble, and more than a thousand years later, John Russell Pope designed basically the same system—thick, self-supporting walls of marble—for the main building of the National Gallery. Pope concealed the necessary expansion joints behind moldings and pilasters, but Pei wanted the surface of his walls to be smooth and unbroken, so he used thin marble panels and supported them individually on stainless steel hangers that were embedded in the concrete structure. Only 33 years after the building opened, the panels started to show signs of buckling, and the entire marble skin—some 16,000 panels—had to be removed and reinstalled on new hangers, at a cost of more than $80 million. The debacle was in sharp contrast to Pope’s enduring building next door.

Ignoring the past often means ignoring the good ideas of one’s immediate predecessors. In the past, copying masters was a valuable part of architectural design—Andrea Palladio copied Bramante, Inigo Jones copied Palladio, Christopher Wren copied Jones. Now copying is taboo. For example, the work of early Scandinavian modernists such as Sigurd Lewerentz and Alvar Aalto, who humanized their stripped-down modern designs with interesting handcrafted details, was ignored by later generations. Similarly, when Louis Kahn produced the sublime skylit vaults of the Kimbell Art Museum in Fort Worth, his ingenious solution was highly praised, but it was never repeated. As a result, instead of a considered evolution, modern architecture has been marked by a succession of fresh starts, some real and many false.

Reinventing architecture faces another, less obvious challenge. When Le Corbusier presented his Plan Voisin—a fictive proposal to rebuild the center of Paris with high-rise office towers—he took it for granted that the new would entirely replace the old. But of course, real cities consist of both new and old buildings. The old buildings are not historic relics but functioning places where people live, work, study, and in the case of old concert halls, listen to music. For most people, old buildings are as much a part of modern life as flat-screen televisions and smartphones. Le Corbusier maintained that the old architectural values need no longer concern us. But the contrary is true: the old buildings are often cherished, not primarily—or at least not only—because they are old, but because they are, well, beautiful.

It was perhaps inevitable that a reaction to ahistorical modern architecture would emerge at some point. This happened in the 1980s, and the reaction was largely facilitated by the architectural movement known as postmodernism. Although short-lived, this facile flirtation with history opened the door to a more serious reconsideration of the past. This included not only the American Renaissance of the late 19th and early 20th centuries, which is a touchstone for many classicists, but also the work of inventive masters such as Bertram Goodhue, who designed the Nebraska State Capitol; Ralph Adams Cram, who was responsible for Rice University; and Raymond Hood, the lead architect of Rockefeller Center.

It turns out that there are advantages to reconnecting with history. Without the imperative to constantly innovate, which can lead to risky experimentation and construction failures, architects can rely on time-tested methods of construction, and traditional materials and details. The modern steel frames of the Nebraska State Capitol and the buildings of Rockefeller Center, for example, are clad in traditional limestone. Architects who are free to find inspiration in their predecessors and contemporaries produce buildings that not only work but also gain the affection of the general public: libraries and courthouses that don’t look like flashy casinos, academic buildings that cannot be mistaken for workaday office buildings, and places of worship that don’t resemble utilitarian industrial plants.

It would be inaccurate to say that people don’t like modern architecture. After more than a century, it’s an accepted feature of contemporary life, almost a tradition. Office workers expect their workplaces to be sleek; shoppers expect high-fashion boutiques and automobile showrooms to be minimalist exercises in bare concrete and industrial details; and museumgoers expect galleries to resemble artists’ lofts, and museum cafés to have chic furniture and Zen-like décor.

But there are limits. It’s okay to have a minimalist kitchen or bathroom, but a living room shouldn’t look like an Apple store, and a house shouldn’t look like an upscale health spa. Nor should a college campus be mistaken for a suburban office park. 

by Witold Rybczynski, The American Scholar | Read more:
Image: 15 Central Park West, NY. Thomas Craven, Wikipedia

Thursday, July 25, 2024

[ed. Pretty good day.]

Deep Reading Will Save Your Soul

Higher ed is at an impasse. So much about it sucks, and nothing about it is likely to change. Colleges and universities do not seem inclined to reform themselves, and if they were, they wouldn’t know how, and if they did, they couldn’t. Between bureaucratic inertia, faculty resistance, and the conflicting agendas of a heterogeneous array of stakeholders, concerted change appears to be impossible. Besides, business is good, at least at selective schools. The notion, floated now in certain quarters, that students and parents will turn from the Harvards and Yales in disgust is a fantasy. As long as elite institutions remain the principal pipeline to elite employers (and they will), the havers and strivers will crowd toward their gates. Everything else—the classes, the politics, the arts and sciences—is incidental.

Which is not to say that interesting things aren’t happening in post-secondary (and post-tertiary) education. They just aren’t happening, for the most part, on campus. People write to me about this: initiatives they’ve started or are starting or have taken part in. These come, as far as I can tell, in two broad types, corresponding to the two fundamental complaints that people voice about their undergraduate experience. The first complaint is that college did not prepare them for the real world: that the whole exercise—papers, busywork, pointless requirements; siloed disciplines and abstract theory—seemed remote from anything that they actually might want to do with their lives.

Programs that address this discontent exhibit a remarkably consistent set of characteristics. They are interdisciplinary, integrating methods and perspectives—from, say, engineering and the social sciences—that are normally kept apart. They are informal, eschewing frontal instruction and traditional modes of evaluation. They are experiential, more about doing—creating, collaborating—than reading and writing. They are extramural, bringing students into the community for service projects, internships, artistic installations or performances. They are directed to specific purposes, usually to do with social amelioration or environmental rescue. Above all, they are student-centered. Participants are enabled (and expected) to direct their education by constructing bespoke curricula out of the resources the program gives them access to. In a word, these endeavors emphasize “engagement.”

All this is fine, as far as it goes. It has analogues and precedents in higher ed (Evergreen, Bennington, Antioch, Hampshire) as well as in the practice of progressive education, especially at the secondary level. High schools will focus on “project-based learning,” with assessment conducted through portfolios and public exhibitions. A student will identify a problem (a human need, an injustice, an instance of underrepresentation), then devise and implement a response (a physical system, a community-facing program, an art project).

Again, I see the logic; it is just what many students want. But what bothers me about this educational approach—the “problem” approach, the “STEAM” (STEM + arts) approach—is what it leaves out. It leaves out the humanities. It leaves out books. It leaves out literature and philosophy, history and art history and the history of religion. It leaves out any mode of inquiry—reflection, speculation, conversation with the past—that cannot be turned to immediate practical ends. Not everything in the world is a problem, and to see the world as a series of problems is to limit the potential of both world and self. What problem does a song address? What problem will reading Voltaire help you solve, in any predictable way? The “problem” approach—the “engagement” approach, the save-the-world approach—leaves out, finally, what I’d call learning.

And that is the second complaint that graduates tend to express: that they finished college without the feeling that they had learned anything, in this essential sense. That they hadn’t been touched. That they hadn’t been changed. That there is a treasure out there—call it the Great Books or just great books, the wisdom of the ages or the best that has been thought and said—that its purpose is to activate the treasure inside them, that they had come to one of these splendid institutions (whose architecture speaks of culture, whose age gives earnest of depth) to be initiated into it, but that they had been denied, deprived. For unclear reasons, cheated.

I had students like this at Columbia and Yale. There were never a lot of them, and to judge from what’s been happening to humanities enrollments, there are fewer and fewer. (From 2013 to 2022, the number of people graduating with bachelor’s degrees in English fell by 36%. As a share of all degrees, it fell by 42%, to less than 1 in 60.) They would tell me—these pilgrims, these intellectuals in embryo, these kindled souls—how hard they were finding it to get the kind of education they had come to college for. Professors were often preoccupied, with little patience for mentorship, the open-ended office-hours exploration. Classes, even in fields like philosophy, felt lifeless, impersonal, like engineering but with words instead of numbers. Worst of all were their fellow undergraduates, those climbers and careerists. “It’s hard to build your soul,” as one of my students once put it to me, “when everyone around you is trying to sell theirs.”

That student’s name was Matthew Strother. It was through Matthew—he was in his early thirties by this point, and still seeking—that I learned about perhaps the two most prominent initiatives to have sprung up off-campus of late in response to the hunger for serious study.

by William Deresiewicz, Persuasion | Read more:
Image: Matthew Strother (courtesy of Berta Willisch).

The Flattening Machine

A wonder of the internet is that, from the right perch, you can watch information wash over people in real time. I happened to check X on Saturday only minutes after the attempted assassination of Donald Trump, and I experienced immediate disbelief. Surely the stills and live-feed screenshots were fake—AI-generated or Photoshopped.

But the sheer volume of information in a high-stakes news event such as this one has a counterintuitive effect: Distinguishing real from fake is actually quite easy when the entire world focuses its attention on the same thing. Amid a flurry of confusion and speculation, the basic facts of this horrifying event emerged quickly. The former president was shot at. He was injured but is recovering. For a brief moment, the online information apparatus worked to deliver important information—a terrifying shared reality of political violence.

Our information ecosystem is actually pretty good while the dust is up. But the second it begins to settle, that same system creates chaos. As my own shock wore off, leaving me to contemplate the enormity of the moment, I could sense a familiar shift on Reddit, X, and other platforms.

The basic facts held attention for only so long before being supplanted by wild speculation—people were eager to post about the identity of the shooter, his possible motives, the political ramifications of the event, the specter of more violence. It may be human nature to react this way in traumatic moments—to desperately attempt to fill an information void—but the online platforms so many of us frequent have monetized and gamified this instinct, rewarding those who create the most compelling stories. Within the first four hours, right-wing politicians, perhaps looking to curry favor with Trump, hammered out reckless posts blaming Joe Biden’s campaign for the shooting; Elon Musk suggested that the Secret Service may have let the shooting happen on purpose; as soon as the shooter’s name was released, self-styled online investigators dug up his voter registration, eager for information they could retrofit to their worldview. Yesterday, conspiracy theorists pointed to a two-year-old promotional video from BlackRock that was filmed at the shooter’s school and features the shooter for a moment—proof, they said, of some inexplicable globalist conspiracy. As my colleague Ali Breland noted in an article on Sunday, conspiracy theorizing has become the “default logic for many Americans in understanding all major moments.”

An attempted assassination became a mass attentional event like any other. Right-wing hucksters, BlueAnon posters, politicians, news outlets, conspiracy shock jocks, ironic trolls, and Instagram dropshippers all knew how to mobilize and hit their marks. Musk let only about 30 minutes pass before he brought attention back to himself by endorsing Trump for president. It took just 86 minutes for Barstool Sports’ Dave Portnoy to post a link to a black T-shirt with the immediately iconic image of a bloodied Trump raising a fist. Trolls made fake online accounts to dupe people into thinking the shooter was part of the anti-fascist movement.

Some may wish to see the conspiracy peddling, cynical politicking, and information warfare as a kind of gross aberration or the unintended consequences and outputs of a system that’s gone awry. This is wrong. What we are witnessing is an information system working as designed. It is a machine that rewards speed, bravado, and provocation. It is a machine that goads people into participating as the worst version of themselves. It is a machine that is hyperefficient, ravenous, even insatiable—a machine that can devour any news cycle, no matter how large, and pick it apart until it is an old, tired carcass.

All of these people are following old playbooks honed by years of toxic online politics and decades of gun violence in schools, grocery stores, nightclubs, and movie theaters. But what feels meaningful in the days after this assassination attempt is the full embrace of the system as somehow virtuous by the bad actors who exploit it; unabashed, reckless posting is now something like a political stance in and of itself, encouraged by the owners, funders, and champions of the tech platforms that have created these incentives. (...)

The overall effect of this transformation is a kind of flattening. Online, the harrowing events of Saturday weren’t all that distinguishable from other mass shootings or political scandals. On X, I saw a post in my feed suggesting, ironically or not, “I know this sounds insane now but everyone will totally forget about this in ten days.” The line has stuck in my head for the past few days, not because I think it’s true, but because it feels like it could be. The flattening—of time, of consequence, of perspective—more than the rage or polarization or mistrust, is the main output of our modern information ecosystem. 

by Charlie Warzel, The Atlantic |  Read more:
Image: Illustration by The Atlantic. Source: Jabin Botsford/The Washington Post/Getty.

Wednesday, July 24, 2024

Angel Nenov

Project 2025: J.D. Vance Writes Foreword

As Trump desperately tries to separate his campaign from Project 2025, users on X have noted one big problem: J.D. Vance wrote the foreword to a forthcoming book by the plan’s lead author, Heritage Foundation President Kevin Roberts.

On the Amazon product page, the promotional material for the book, titled Dawn’s Early Light, highlights Roberts’s role in composing Project 2025, the Heritage Foundation proposal for a conservative overhaul of the federal government.

The product page also includes a favorable review from Vance. “Never before has a figure with Roberts’s depth and stature within the American Right tried to articulate a genuinely new future for conservatism,” the review says. “We are now all realizing that it’s time to circle the wagons and load the muskets. In the fights that lay ahead, these ideas are an essential weapon.”

When the book first became available for pre-order on June 19, Vance promoted it on X, writing, “I was thrilled to write the foreword for this incredible book, which contains a bold new vision for the future of conservatism in America.”

On the Amazon page for Dawn’s Early Light, the subtitle reads, “Taking Back Washington to Save America,” but an archived version of the page from June 19 indicates it was initially “Burning Down Washington to Save America.”

Inflammatory language in the blurb has also apparently been tamped down.

A sentence on the archived page that says the book “blazes a warpath for the American people to take back their country” now says it “blazes a promising path.” Another fiery sentence on the archived page read, “Just as a controlled burn preserves the longevity of a forest, conservatives need to burn down these institutions [the FBI, The New York Times, the Department of Education, etc.] if we’re to preserve the American Way of life.” It now says that those institutions “need to be dissolved if the American way of life is to be passed down to future generations.”

by Robert McCoy, Yahoo News | Read more:
Image: uncredited
[ed. When your own candidates disavow and try to hide your Party's agenda - not a good sign. See also: What is Project 2025? (Yahoo News):]
***
What is Project 2025? Conversations, both online and off, surrounding the conservative agenda have exploded recently — more than a year after the policy proposal was published.

Project 2025 is a 922-page proposed blueprint for the next Republican administration produced by conservative think tank The Heritage Foundation.

Critics have labeled it “an authoritarian takeover of the United States,” while supporters call it a plan to return “our federal government to one ‘of the people, by the people, and for the people.’” (...)

What is Project 2025, and what is it calling for?

Project 2025 bills itself as “a policy agenda, personnel, training and a 180-day playbook” to be implemented “on day one” by the next Republican president, outlining various agenda items, including which bills to propose, laws to revoke and government agencies to restructure. (...)

Some of its directives include:
  • Overhaul the Department of Justice and FBI, the former of which it labels "a bloated bureaucracy" with employees "who are infatuated with the perpetuation of a radical liberal agenda."
  • Implement Schedule F, a Trump-era executive order (repealed by the Biden administration) that would allow the reclassification — and potential replacement — of thousands of government workers.
  • Eliminate the Department of Education.
  • Impose wide restrictions on abortion access, including reversing federal approval of the abortion pill mifepristone.
  • Allocate funding for “construction of additional border wall systems.”
  • Ban pornography and imprison anyone who produces or distributes it.
  • Promote "Sabbath Rest" by encouraging Congress to amend the Fair Labor Standards Act to require that people who work on the Sabbath be paid time and a half.
  • Have the federal government promote “biblically based, social science reinforced” heterosexual marriages.
  • Call on the new Health and Human Services secretary to “reverse the Biden Administration’s focus on 'LGBTQ+ equity'" and “subsidizing single-motherhood.”
  • Remove sexual orientation, gender identity, diversity, equity, inclusion and gender equality from any federal rule, regulation or legislation.
  • Revive Trump’s plan to open most of the National Petroleum Reserve of Alaska to leasing and development.

The Knotty Death of the Necktie

Not long ago, on a Times podcast, Paul Krugman breezily announced (and if we can’t trust Paul Krugman in a breezy mood, whom can we trust?) that, though it’s hard to summarize the economic consequences of the pandemic with certainty, one sure thing is that it killed off ties. He meant not the strong social ties beloved of psychologists, nor the weak ties beloved of sociologists, nor even the railroad ties that once unified a nation. No, he meant, simply, neckties—the long, colored bands of fabric that men once tied around their collars before going to work or out to dinner or, really, to any kind of semi-formal occasion. Zoom meetings and remote work had sealed their fate, and Krugman gave no assurance that they would ever come back.

Actual facts—and that near-relation of actual facts, widely distributed images—seem to confirm this view. Between 1995 and 2008, necktie sales plummeted from more than a billion dollars to less than seven hundred million, and, if a fashion historian on NPR is to be believed (and if you can’t believe NPR . . . ), ties are now “reserved for the most formal events—for weddings, for graduations, job interviews.” Post-pandemic, there is no sign of a necktie recovery: a now famous photograph from the 2022 G-7 summit shows the group’s leaders, seven men, all in open collars, making them look weirdly ready for a slightly senescent remake of “The Hangover.” As surely as the famous, supposedly hatless Inauguration of John F. Kennedy was said to have been the end of the hat, and Clark Gable’s bare chest in “It Happened One Night” was said to have been the end of the undershirt, the pandemic has been the end of the necktie.

Such truths are always at best half-truths. Sudden appearances and disappearances tend to reflect deeper trends, and, when something ends abruptly, it often means it was already ending, slowly. (Even the dinosaurs, a current line of thinking now runs, were extinguished by that asteroid only after having been diminished for millennia by volcanoes.) In “Hatless Jack,” a fine and entertaining book published several years ago, the Chicago newspaperman Neil Steinberg demonstrated that the tale of Kennedy’s killing off the hat was wildly overstated. The hat had been on its way out for a while, and Jack’s hatless Inauguration wasn’t, in any case, actually hatless: he wore a top hat on his way to the ceremony but removed it before making his remarks. Doubtless the same was true of the undershirt that Gable didn’t have on. They were already starting to feel like encumbrances, which might explain why Gable didn’t wear one. And so with the necktie. Already diminishing in ubiquity by the Obama years, it needed only a single strong push to fall into the abyss. (...)

What we now think of as the necktie—cut on the bias, made of three or four pieces of fabric, and faced with a lining—was actually a fairly recent, and local, invention, that of a New York schmatte tradesman named Jesse Langsdorf. What we call “ties” generically are, specifically, Langsdorf ties.

The Langsdorf necktie that emerged early in the twentieth century was, to be sure, hideously uncomfortable. (It is no accident that a necktie party was a grotesque nickname for a hanging.) Its constriction made it perhaps the masculine counterpart of the yet more uncomfortable fashion regime—high heels—forced upon women. (...)

Examine any now unused collection of ties, and you will find that they are full of tightly compressed meanings—once instantly significant to the spectator of the time and still occultly visible now. Not only the specific meanings of club membership but also the broader semiotics of style. In any vintage closet, there are likely to be knitted neckties that still reside within the eighties style of “American Gigolo”—which, believe it or not, helped bring Armani to America. The knit tie meant Italy, sports cars, daring, and a slight edge of the criminal. There are probably ties from Liberty of London—beautiful, flowered-print ties whose aesthetic ultimately derives from the Arts and Crafts movement, with its insistence on making the surfaces of modern life as intricate and complexly ornamental as a medieval tapestry or Pre-Raphaelite painting. If the closet is old enough, its ties will show a whole social history of the pallid fifties turning into the ambivalent sixties turning into the florid seventies. The New Yorker cartoonist Charles Saxon captured these transitions as they occurred, in a career that can be seen as a dazzling study of ties and their meanings. The neatly knotted ties of Cheeveresque commuters give way in the early seventies to the ever-broadening ties of advertising men, flags they waved to show off their desire to simultaneously woo the counterculture and keep out of it.

The tie could sometimes get so compressed in its significance as to lose its witty, stealthy character and become overly and unambiguously “loaded.” There is no better story of suicide-by-semiotics than that of the rise and death of the bow tie, which, beginning in the nineteen-eighties, became so single-mindedly knotted up with neoconservatism, in the estimable hands of George Will, that to wear one was to declare oneself a youngish fogy, a reader of the National Review, and a skeptic of big government. The wider shores of bow-tie-dom—the dashing, jaunty, self-mocking P. G. Wodehouse side of them—receded, and were lost. It became impossible to wear a bow tie and vote Democratic. (...)

Of course, the human appetite for display will never end, and, so, as the concentrated symbolism of the necktie evaporates, the rest of our clothes must carry its messages. The purposes of Warburgian pattern have now spread everywhere: to the cut of your jogging pants and the choice of your sneakers and, well, the cock of your snook. Where once the necktie blazoned out a specific identity from the general background of tailored gray, now everything counts. The most obvious successor garment to the necktie is the baseball cap, which declares its owner’s identity and affiliation not with some tantalizing occult pattern but the painful unsubtlety of actual text—the club named on the cap.

by Adam Gopnik, New Yorker |  Read more:
Image: Jaedoo Lee
[ed. Cock of your snook? Look it up yourself, I'm not doing it for you.] 

Tuesday, July 23, 2024

Dark Oxygen

In the total darkness of the depths of the Pacific Ocean, scientists have discovered oxygen being produced not by living organisms but by strange potato-shaped metallic lumps that give off almost as much electricity as AA batteries.

The surprise finding has many potential implications and could even require rethinking how life first began on Earth, the researchers behind a study said on Monday.

It had been thought that only living things such as plants and algae were capable of producing oxygen via photosynthesis – which requires sunlight.

But four kilometres (2.5 miles) below the surface of the Pacific Ocean, where no sunlight can reach, small mineral deposits called polymetallic nodules have been recorded making so-called dark oxygen for the first time.

The discovery was made in the Clarion-Clipperton Zone (CCZ), an abyssal plain stretching between Hawaii and Mexico, where mining companies have plans to start harvesting the nodules.

The lumpy nodules – often called “batteries in a rock” – are rich in metals such as cobalt, nickel, copper and manganese, which are all used in batteries, smartphones, wind turbines and solar panels.

The researchers then noticed that the nodules were carrying a startling electric charge.

On the surface of the nodules, the team “amazingly found voltages almost as high as are in an AA battery”, said Andrew Sweetman of the Scottish Association for Marine Science (SAMS), who led the research. This charge could split seawater into hydrogen and oxygen in a process called seawater electrolysis, the researchers said.

This chemical reaction occurs at about 1.5 volts – approximately the voltage of an AA battery.
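
[ed. For reference, the textbook half-reactions for splitting water, general chemistry rather than anything taken from the article itself:

  Cathode (reduction): 2H₂O + 2e⁻ → H₂ + 2OH⁻
  Anode (oxidation):   4OH⁻ → O₂ + 2H₂O + 4e⁻
  Overall:             2H₂O → 2H₂ + O₂   (E° ≈ 1.23 V)

The thermodynamic minimum is 1.23 volts, and real cells need extra voltage (overpotential) on top of that, which is how you get to the roughly 1.5 volts the article says is needed in practice.]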

Nicholas Owens, the SAMS director, said it was “one of the most exciting findings in ocean science in recent times”.

by Agence France-Presse in Paris | Read more:
Image: GSR/Reuters

Monday, July 22, 2024

Why Biden Finally Quit

The Saturday night decision that ended Biden’s reelection campaign.

For 23 days, President Joe Biden insisted on pushing forward with his reelection bid in the face of calls from Democratic lawmakers and donors for him to step aside.

And then, almost on a dime, things changed.

Early Saturday, Biden told senior aides it was “full steam ahead” for the campaign. But by later that evening, he had changed his mind following a long discussion with his two closest aides.

Steve Ricchetti, who’s been with Biden since his days in the Senate, drove to see the president at his house on the Delaware shore on Friday. Mike Donilon arrived on Saturday. The two men, both of whom had been by Biden’s side during key decisions about whether to seek the presidency in 2016 and 2020, sat at a distance from the president, still testing positive for Covid, and presented damning new information in a meeting that would hasten the end of Biden’s political career.

In addition to presenting new concerns from lawmakers and updates on a fundraising operation that had slowed considerably, they carried the campaign’s own polls, which came back this week and showed his path to victory in November was gone, according to five people familiar with the matter, who, like others interviewed for this article, were granted anonymity to discuss private conversations. Biden asked several questions during the exchange.

The only other people with Biden in the residence when he arose Sunday were first lady Jill Biden and two other trusted aides: deputy chief of staff Annie Tomasini and assistant to the first lady Anthony Bernal. At 1:45 p.m., he notified a somewhat larger group of close aides that he had decided the night before to end his quest for another term, reading his letter and thanking them for their service. A minute later, before any other campaign and White House staffers could be notified, he posted the historic letter from his campaign account on the social media site X.

The announcement, which shocked the political world, almost immediately flipped the narrative around Biden: His own party, after three weeks of deriding him privately as an isolated, deluded lion in winter dragging other Democrats down with him, was showering him with loving tributes, praising his record, career of public service and a selfless decision they said put his country first.

It wasn’t that the president had grown tired of the drip of defections from within his own party — although he had. Rather, it was that Biden himself was finally convinced of what so many other Democrats had come to believe since his poor debate performance last month: He couldn’t win.

When the campaign commissioned new battleground polling over the last week, it was the first time they had done surveys in some key states in more than two months, according to two people familiar with the surveys. And the numbers were grim, showing Biden not just trailing in all six critical swing states but collapsing in places like Virginia and New Mexico where Democrats had not planned on needing to spend massive resources to win.

With that knowledge and the awareness that more party elders, including more of his former Senate colleagues, would pile on the public pressure campaign, a sudden exit offered the president his best chance to make it appear that the decision came on his own terms. It was a face-saving move of high importance to Jill Biden, who, according to people familiar with recent conversations, was adamant that her husband’s dignity be preserved.

Senior Biden aides were bracing for former House Speaker Nancy Pelosi (D-Calif.), who’d worked behind the scenes to encourage others in the party toward the kind of collective action that might finally push the president to end his campaign, to go public this week and possibly even disclose Democratic polling clarifying Biden’s dire political straits.

“Nancy made clear that they could do this the easy way or the hard way,” said one Democrat familiar with private conversations who was granted anonymity to speak candidly. “She gave them three weeks of the easy way. It was about to be the hard way.”

With Biden vowing in a statement to return to the campaign trail next week, some in the party came to believe that more direct and public opposition might be the only way left to convince Biden to step aside. At least a half-dozen House and Senate Democrats — including senior lawmakers — had planned to call for the president to leave the campaign on Monday and Tuesday, according to one lawmaker who had a pre-drafted statement.

“We were giving him the respect of the weekend to make his decision. We were hopeful that this is the decision we would make,” the Democrat said. This lawmaker, who had personally spoken with dozens of lawmakers in recent weeks about their district-level polling and voter concerns back home, said they had already been sharing that data with the Biden campaign team on a regular basis.

On Capitol Hill, Democratic leadership sensed Biden’s decision was coming. A lawmaker close to leadership, granted anonymity to speak candidly, said the president had “gone offline” in recent days as he spent time with his family, a signal that he was digesting several weeks of firm Democratic messages that he needed to step aside.

“He got the message,” said the House Democrat, granted anonymity to speak frankly. Referring to the Senate Majority Leader, House Minority Leader and Speaker Emeritus, the lawmaker said: “It was from Chuck, Hakeem, Pelosi.”

This account of what led to the president’s reversal is based on conversations with 22 people who were granted anonymity to discuss sensitive matters.

by Eli Stokols, Jonathan Lemire, Elena Schneider, and Sarah Harris, Politico |  Read more:
Image: Illustration by Bill Kuchman/Politico (source images via Getty)
[ed. For the best. Now he can be remembered for his accomplishments (of which there are many) rather than blamed for a lost election (and whatever follows). See also: The Men Who Gave Trump His Brutal Worldview (Politico)]

Jack White


"Put bluntly, No Name is a rock record – an incredibly satisfying one. It sounds more like the White Stripes than anything White has cut since that band’s demise – its 13 songs are driven by the blues, his playing sounding like the bastard son of Elmore James and Jimmy Page, swinging between bare-knuckled riffs and sweet slide-guitar with a switchblade edge. The instrumentation is pared back to only what matters, what’s necessary. The drumming often channels the magical primordial stomp of the sorely missed Meg White’s poetic, bone-simple playing.

The album is dark, heavy, thrilling, beautiful." ~  Jack White: No Name review (Guardian)

[ed. Yow. A great one. Reminds me a bit of Jon Spencer Blues Explosion (Orange). Listen to Side A and Side B in their entirety.]

Can Glen Powell be a Movie Star in a Post-Movie-Star Era?

The Twisters actor’s career explains a lot about the state of the industry.

A few weeks ago, a Reddit poster decided to ask which actors audiences were being “force fed to accept” as movie stars. They had what they felt was a prime example at their fingertips: Glen Powell.

“I feel like this guys [sic] is everywhere doing anything,” the poster mused. Yet they found Powell’s work to be “all just Meh.”

This is Glen Powell’s summer. After spending decades in the Hollywood trenches, Powell is now the star of Twisters, out this week, and of Hit Man, now streaming on Netflix, which he also co-wrote and produced. He’s got big glossy profiles in GQ, the Hollywood Reporter, and Vanity Fair. He’s been anointed, crowned, and feted as the next big thing. (...)

Part of why Powell’s sudden rise feels so notable is its strangely retro vibe. Today’s ambitious young actors, like Timothée Chalamet and Florence Pugh, usually flit back and forth between Marvel or some other big action series — to build their names and paychecks — and quirky off-beat films made by auteurs that will get them critical recognition. Powell, in contrast, has stuck to the genres that conventional wisdom has long held were dead: Romantic comedies. Middlebrow adult dramas not based on an existing franchise. You know, ’90s kind of stuff.

“I’m working to try to be you,” Powell told Tom Cruise when he was cast in a supporting role in Top Gun: Maverick, according to an interview in the Hollywood Reporter earlier this year. But Powell also seems to know that his dream is unlikely because the industry doesn’t really make Tom Cruises anymore.

“First of all, there will never be another Tom Cruise,” he continued in the profile. “That is a singular career in a singular moment, but also movie stars of the ’80s, ’90s, early 2000s, those will never be re-created.”

All the same, Powell looks an awful lot like he’s going to make a play for it — by sheer force of will, if necessary. After all, he’s had a lot of practice. (...)

Tom Cruise became a movie star in the raunchy coming-of-age sex comedy Risky Business, his signature commitment powering him through the iconic scene where he dances around in his underwear. Julia Roberts became a movie star when she flashed her megawatt smile at the camera in the cheesy-but-satisfying Mystic Pizza. These were movies that weren’t stupid but weren’t particularly challenging either, simple and goofy mid-budget fare that almost anyone would want to see.

In the late 2000s going into the 2010s, Hollywood pretty much stopped making that kind of movie. DVDs and then streaming, along with the rise of prestigious cable shows, eroded the audience. As the domestic box office collapsed, the international market became more important, driving a push toward spectacle-laden action franchises. The only thing reliably making money anymore was the ascendant Marvel Cinematic Universe, which in the early 2010s was just entering the so-called Phase 2.

The new financial path for studios became: Focus most of your money on a big flashy action franchise, ideally one based on familiar IP with a built-in fanbase. Allow some money on the side for movies that have a solid chance at the Oscars. Let a more intimate movie get made here and there, but give it a budget that looks like a rounding error, which means it won’t have any stars. Mid-budget movies? Those are for streaming. (...)

Powell, meanwhile, had his sights set on the biggest ’90s throwback of all: Tom Cruise’s new Top Gun sequel. Powell auditioned for the crucial role of Goose’s son and, once again, got close, he told GQ. Not close enough: The part went to Miles Teller. Still, Cruise, who liked Powell’s screen test, offered him the part of Slayer, the equivalent of the Val Kilmer role from the original movie.

Powell said no. He didn’t think Slayer worked in the script. The kid in the tux in him who had put in a lot of time analyzing the way movies worked foresaw himself ending up all over the cutting room floor.

Cruise felt strongly enough about Powell’s potential that he personally called him to give him career guidance. If Powell really wanted to be the next Tom Cruise, he told him, the key wasn’t to pick a great role. It was to pick a great project and then make the role great. He got Powell to sign on as Slayer, and then he got Slayer rewritten into a new character, now called Hangman, who would fit Powell’s smarmy golden boy skill set.

Top Gun: Maverick was the first blockbuster of the post-pandemic era. It was also definitively Tom Cruise’s hit. Powell’s turn as Hangman wasn’t on the cutting room floor, but it wasn’t central enough to the film to be part of the narrative of its success. (...)

If Hollywood stops making movie stars, can you DIY one?

If this story makes it sound like Glen Powell is an underdog, that’s inaccurate, in the same way it was inaccurate to push that narrative about Armie Hammer a few years back. Powell is a tall and handsome white dude who could afford to stick it out through a decade or so of under-employment because he was getting mentored by Denzel Washington and Tom Cruise. He’s not an underdog. He’s doing a different thing.

The thing about Glen Powell that comes through most strongly in profiles is this: You have never read a more earnest celebrity interview than the ones he gives. This man keeps a bingo board where he tracks all the character types he wants to play. He’s currently finishing his final college credits because he thinks it would mean a lot to his mom. He’s got a book he calls an icon wisdom journal he fills with advice from his mentors, most notably Cruise. He wore that tux. He’s a hard worker who is very earnest about the value of hard work.

Powell mostly masks this earnestness by playing insufferable assholes, less a Chris Pratt than a Matt Czuchry. It may be that the closest fit onscreen to Powell’s real personality is Hit Man’s mild-mannered philosophy professor Gary, before he transforms himself into a cold-blooded killer.

Yet ironically, Gary pre-transformation is one of Powell’s least convincing performances. Powell doesn’t seem to know how to fold his broad shoulders in or soften his big Hollywood grin so that he looks less than confident, even when the character he’s playing is lecturing a bored class of college students or letting his co-workers mock him to his face. Part of the reason Powell pops is that whenever he shows up on camera, he gives every evidence of believing he belongs there.

by Constance Grady, Vox | Read more:
Image: Glen Powell’s parents hold up signs behind him at a screening of Hit Man at the Paramount Theatre in Austin, Texas, May 15, 2024. Sergio Flores/AFP via Getty Images
[ed. See also: Netflix’s totally delightful Set It Up proves just how durable the romcom formula is (Vox). And, the original Hit Man article here (Texas Monthly):]
***
"On a nice, quiet street in a nice, quiet neighborhood just north of Houston lives a nice, quiet man. He is 54 years old, tall but not too tall, thin but not too thin, with short brown hair that has turned gray around the sideburns. He has soft brown eyes. He sometimes wears wire-rimmed glasses that give him a scholarly appearance.

The man lives alone with his two cats. Every morning, he pads barefoot into the kitchen to feed his cats, then he steps out the back door to feed the goldfish that live in a small pond. He takes a few minutes to tend to his garden, which is filled with caladiums and lilies, gardenias and wisteria, a Japanese plum tree, and rare green roses. Sometimes the man sits silently on a little bench by the goldfish pond, next to a small sculpture of a Balinese dancer. He breathes in and out, calming his mind. Or he goes back inside his house, where he sits in his recliner in the living room and reads. He reads Shakespeare, psychiatrist Carl Jung, and Gandhi. He even keeps a book of Gandhi’s quotations on his coffee table. One of his favorites is “Non-violence is the greatest force at the disposal of mankind. It is mightier than the mightiest weapon of destruction devised by the ingenuity of man.”

He is always polite, his neighbors say. He smiles when they see him, and he says hello in a light, gentle voice. But he reveals little about himself, they say. When he is asked what he does for a living, he says only that he works in “human resources” at a company downtown. Then he smiles one more time, and he heads back inside his house.

What the neighbors don’t know is that in his bedroom, next to his four-poster bed, the man has a black telephone, on which he receives very unusual calls.

“We’ve got something for you,” a voice says when he answers. “A new client.”

“Okay,” the man says.

The voice on the other end of the line tells him that a husband is interested in ending his marriage or that a wife would like to be single again or that an entrepreneur is ready to dissolve a relationship with a partner.

The man hangs up and returns to his recliner. He thinks about what service he should offer his new client. A car bombing, perhaps. Or maybe a drive-by shooting. Or he can always bring up the old standby, the faked residential burglary.

As he sits in his recliner, his cats jump onto his lap. They purr as he strokes them behind their ears. The man sighs, then he returns to his reading. “Always aim at complete harmony of thought and word and deed,” wrote Gandhi. “Always aim at purifying your thoughts and everything will be well.”

The man’s name is Gary Johnson, but his clients know him by such names as Mike Caine, Jody Eagle, and Chris Buck. He is, they believe, the greatest professional hit man in Houston, the city’s leading expert in conflict resolution. For the past decade, more than sixty Houston-area residents have hired him to shoot, stab, chop, poison, or suffocate their enemies, their romantic rivals, or their former loved ones." (...)

“Except for one or two instances, the people I meet are not ex-cons,” says Johnson. “If ex-cons want somebody dead, they know what to do. My people have spent their lives living within the law. A lot of them have never even gotten a traffic ticket. Yet they have developed such a frustration with their place in the world that they think they have no other option but to eliminate whoever is causing their frustration. They are all looking for the quick fix, which has become the American way. Today people can pay to get their televisions fixed and their garbage picked up, so why can’t they pay me, a hit man, to fix their lives?”