Saturday, November 14, 2020

“The Queen’s Gambit” Is the Most Satisfying Show on Television

In 1884, the American star chess player Paul Morphy was found dead in his bathtub, at the age of forty-seven. “The pride and the sorrow of chess is gone forever,” the Austrian chess master Wilhelm Steinitz wrote in an elegy, the following year. Morphy had begun winning citywide tournaments in his native New Orleans at the age of nine. By the time he was twenty, he was the United States champion, and by the time he was twenty-one, many considered him to be the best player on earth. In 1858, Morphy held a famed “blindfold” exhibition, in Paris, at the Café de la Régence: he sat in one room while eight opponents sat in another and called out his moves without looking at a single board. He played for ten hours straight, without stopping to eat, and ended the night with six wins and two draws. But Morphy grew bored; he was so gifted at chess that he began to consider it a child’s game. He walked away from competition and opened a law office, but the business quickly failed. He spent his final two decades living as a vagabond on family money, growing increasingly paranoid and haunted by his former fame.

The parable of Paul Morphy and his squandered genius pops up halfway through the fifth episode of “The Queen’s Gambit,” Scott Frank and Allan Scott’s handsome, dexterous new Netflix miniseries, based on Walter Tevis’s 1983 novel of the same name, about a female chess prodigy from Lexington, Kentucky, and her pursuit of a world title in the late nineteen-sixties. The prodigy in question, Beth Harmon (Anya Taylor-Joy), is speaking to her friend and sometimes lover Harry Beltik (Harry Melling), a local chess champion with capped teeth and a nebbishy demeanor. Harry, who has been living with Beth to help her train for an upcoming match against the dominant Russian champion, Borgov, in Paris, announces that he will be moving out. He realizes that he has taught Beth all he knows, and that, in turn, she has taught him that his own passion for the game will never compare to hers. As he leaves, he hands her a tattered copy of the book “Paul Morphy and the Golden Age of Chess,” by William Ewart Napier. “You think that’s gonna be me?” Beth sneers, with a diffident jutting of her chin. “I think that is you,” Harry replies.

In Beth’s case, the sorrow that threatens to undercut her pride is not boredom with the game. She loves the game, and has since, as a young orphan, she began sneaking away from classes at the Methuen Home For Girls to play chess in the basement with the gruff janitor, Mr. Shaibel. Chess, for her, is a refuge; her trouble is everything else. When Beth was nine years old, her birth mother killed herself by crashing a car, with Beth still in it. As a teen-ager, Beth is adopted by an unhappy couple, the Wheatleys, who divorce soon after she moves in. Alma Wheatley (played, in a quietly devastating performance, by the film director and sometimes actor Marielle Heller, with a period-accurate, stilted manner of speaking that includes phrases like “my tranquility needs to be refurbished”) is a familiar type, the depressive midcentury housewife, chipper at the department store but a mess at home. She and Beth have an addiction in common: inhaling fistfuls of green tranquilizer pills in order to maintain a façade of equilibrium. Beth was first given the medication at Methuen, where she would take it at night and hallucinate chess games on the ceiling, the pieces dancing above her head like tipsy débutantes.

This premise might suggest that “The Queen’s Gambit” will be a predictable variation on the trope of the damaged genius, the poor brilliant maverick who is held back only by buried trauma. Yet the show, which begins with Beth as a child (played with a placid scowl by Isla Johnston) and then jumps forward to follow her through her teen years and early adulthood, proceeds less like a dark psychological drama or a gritty underdog sporting tale and more like the origin story of a wizard, or a superheroine. Beth is assiduous, serious, well-read, almost nunlike in her studies of strategies such as the Ruy Lopez and the Sicilian Defense (as a teen-ager, she’s so desperate to read chess magazines that she shoplifts them from the local pharmacy). We see her decimate her opponents, easily winning tournaments against men twice her age. She learns to speak Russian in anticipation of facing the Soviets one day. She plays through games by herself for hours—in her head, on the edge of the bathtub, on the kitchen counter. She rarely wavers in her confidence, and can often come across as arrogant, or at least disarmingly unflappable. When Alma discovers that Beth’s talent could be a ticket to a better life for them both—prize money, international travel, fame—Beth doesn’t resent the fact that she is soon supporting her adoptive mother (and her fondness for Gibson cocktails) but, instead, embraces their arrangement as business partners. Alma doesn’t really understand chess, but she has other skills to teach—through her, Beth is introduced to cosmetics, classical music, and her first beer, and gains a companion who is kind, if not quite maternal.

What makes “The Queen’s Gambit” so satisfying comes in large measure from the character Taylor-Joy brings to screen: a charming, elegant weirdo who delivers her lines with a cool, wintergreen snap, and never really reacts the way one might expect. Taylor-Joy, whose breakout role was in the 2015 horror film “The Witch,” starred in Autumn de Wilde’s eminently lovable new adaptation of “Emma,” and she will step into the role of Furiosa in the anticipated “Mad Max” prequel that starts filming next year. But “The Queen’s Gambit” is her star-making performance, a showcase for her particular oddball brand of elegance. Nearly every review of the series has mentioned Taylor-Joy’s eyes, which are the size of silver dollars and set far apart, giving her the appearance of a beautiful hammerhead shark. But what she brings to “The Queen’s Gambit” is a peculiar poise, a capricious hauteur that is willowy but never weak. Taylor-Joy’s background is in ballet, and as Beth she brings a subtle physicality to the game of chess. The chess masters Garry Kasparov and Bruce Pandolfini, who consulted on the show, taught Taylor-Joy how professionals move the pieces along the board, but, as she told the magazine Chess Life, she developed her own way of gliding her hands across the board. When she captures a piece, she floats her long fingers above it and then gently flicks it into her palm, like she is fishing a shiny stone out of a river. When she begins a game, she rests her chin on her delicate folded hands, like a female mantis preparing to feast, staring at her opponent with such unblinking intensity that at least once I had to glance away from the screen.

There has been some grumbling that Taylor-Joy’s twiggy beauty—as svelte, white, and doll-like as it is—undercuts the story “The Queen’s Gambit” tells; is it not enough for a woman to beat the boys without her having to look like a Richard Avedon editorial? But for me the glamour of the series is another of its quiet subversions. In life and on screen, chess is considered the domain of hoary men in moth-eaten cardigans, playing in smoky gymnasia that reek of stale coffee. “The Queen’s Gambit,” instead, finds an unlikely synergy between the heady interiority of chess and the sensual realm of style. Beth develops a prim, gamine flair for fashion with the same studied meticulousness that she brings to the chessboard, and in the course of the show her look evolves apace with her game. She spends her first chess winnings on a new plaid dress; she grows out her blunt baby bangs and adopts a more feminine, Rita Hayworth-esque waved bob. Her covetable wardrobe, of mod minidresses and boxy crepe blouses in creamy shades of mint green and eggshell, makes her a press darling, who gets asked about her look at chess junkets. Her clothing, along with the show’s dazzling interiors (not only the Wheatleys’ home, a sumptuous parade of sherbet-colored floral wallpaper, but the many swanky hotel suites that Alma and Beth stay in on the road), calls to mind the aesthetic enchantments of “Mad Men,” the nineteen-sixties period piece against which every other nineteen-sixties period piece will forever be measured (and the fervor around “The Queen’s Gambit” ’s costume design has similarly hijacked the zeitgeist, inspiring close readings in Vogue and a zazzy virtual exhibition at the Brooklyn Museum). In turn, the game of chess, in “The Queen’s Gambit,” sheds its schlubbiness and reveals a bewitching (and, it must be said, sometimes erotic) elegance. When a reporter from LIFE presses Beth for a juicy quote on what it’s like to be a girl competing among men, Beth demurs. “Chess isn’t always competitive,” she says. “Chess can also be beautiful.”

by Rachel Syme, New Yorker | Read more:
Image: Netflix

Friday, November 13, 2020


Tano Festa
(Italian, 1938-1988), Senza titolo

How Enduring the Promise?

Economic Myths in a Time of Rupture

Thinking about their lives, their livelihoods, and their identities, Americans have long placed great stock in morally charged economic mythologies. The “land of opportunity,” the “American Dream,” the “pursuit of happiness”—all have survived crushing depressions, civil strife, and global wars to give meaning and value to Americans’ daily strivings. While the growing diversity of the American social landscape has made Norman Rockwell–like depictions of faith and family seem almost obsolete, the equally venerable imagery of the disciplined work ethic has demonstrated far more staying power, with only the occupations and demographics of workers requiring updating. The short-lived coworking space startup WeWork drew attention in 2018 for the pro-work messages inscribed on the produce floating in its watercoolers. “Don’t stop when you’re tired, stop when you’re done,” was just one of the exhortations etched into floating cucumber slices, intended to inspire indefatigable “creatives” beavering away at their MacBooks.

Such mythic notions draw on a different dimension of economics from the one presented in most textbooks. Yet this cultural dimension may ultimately be as vital to the well-being of society as all of the projections and prescriptions derived from highly mathematized models of econometric theory. As Max Weber and other social theorists have shown, the fundamental human need to confer meaning on events, particularly those bearing on livelihoods, makes people depend on mythic frameworks and “webs of meaning” not only to ennoble their daily efforts but to ward off the threat of nihilistic randomness or chaotic disorder. These frameworks serve to make sense of economic success, failure, prosperity, and devastation, not just for individuals but for groups as a whole.

In America, these frameworks have undergone gradual but significant changes. Early colonial settlers viewed their trade and commerce as contributing to the commonwealth and a wider covenantal order, subject to religious and civil authorities alike. Likewise, southern planters and others justified their exploitation of enslaved laborers through frameworks of a racialized hierarchical order and Christian paternalism. It was only toward the middle of the nineteenth century, as the American market economy was being transformed by advances in transportation and technology, that a new and more characteristically modern economic mythology took form. This one emphasized individual effort and reward as the basis of a new covenant, this one with sovereign market forces rather than the sacredly ordained hierarchical order. Adherents to this covenant believed they inhabited an economic world that was controllable, predictable, and largely fair. Provided that individuals were committed to working hard and playing by the rules, they were assured at least a fighting chance to survive the “prevailing gales of creative destruction.”

The 1843 McGuffey Reader confidently pronounced to millions of American schoolchildren that the “road to wealth” was “open to all, and all who will, may enter upon it with the almost certain prospect of success.” This became the “American assumption,” as W.E.B. Du Bois labeled it, which posited that wealth was mainly a result of a person’s effort and that “any average worker can by thrift become a capitalist.” While the relative inequality of American society always left such claims open to doubt and debate, such thinking had at least an aspirational role in the American experience throughout the twentieth century. In his 1992 campaign speeches Bill Clinton would continue to speak of America’s promise, that “if you work hard and play by the rules you should be given a chance to go as far as your God-given abilities will take you.”

But problems can result from tying individual and national identities to economic mythologies, particularly in our age of global capitalism, when extensive economic dislocations and disruptions have created uncertainty and precarity among wide swaths of the population. Since the passing of the postwar era’s “Golden Age of Capitalism” (generally pegged as the time between 1945 and 1973, or what the French have dubbed les trente glorieuses), many Americans have begun to doubt that the economic system is keeping up its side of the bargain. Stagnant wages since the 1970s alongside instability brought on by oil supply shocks and stagflation were followed by the expanded influence of a profit-hungry financial sector and globalized trade that yielded even greater volatility of wages and employment. The disruptive force of the 2008 recession—and the relatively minimal consequences suffered by those leading the responsible institutions—unleashed new fires of anti-elite populism.

Our present moment finds not only workers but also politicians and business leaders seeking to renew the reigning economic mythologies in a manner that can either make sense of the post-2008 mode of capitalism or successfully counter its costly deviations from its earlier mode. Yet this task now confronts the social realities that were papered over in the simpler times of the McGuffey Reader. The growing recognition of institutionalized inequality and the disproportional power of financial elites seem to counter the doctrine that the road to prosperity is open to all Americans.

Writing about cultural systems and meaning, the anthropologist Clifford Geertz suggested that three sorts of events can send a society’s cultural frameworks into a tailspin: bafflement, suffering, and a sense of intractable ethical paradox. It is at these points that the interpretability of life begins to break down, that individuals lose confidence that they can effectively orient themselves. Societies are then pushed to the limits of understanding and endurance, as “chaos” threatens to “break in” upon them.

The last two decades provide ample evidence that conventional American mythologies have now reached such a breaking point. The rhetoric of populist politicians and surveys of the populace as a whole reveal a common sentiment: that the system is “rigged.” This sentiment has disproportionately taken hold among lower-income workers across both political parties, populations struggling to keep their heads above water as they try to cope with rising health-care costs and the disappearance of benefit-bearing jobs. Many workers in publicly traded corporations also see the stagnation of their own wages in sharp contrast with the relative stability and prosperity enjoyed by their employers.

But extending well beyond these populations is a more general skepticism toward institutions tasked with leading the economic and financial sectors and overseeing their well-ordered functioning. The last ten years have seen a growing suspicion that the “invisible hand” of global capitalism does not appear to be disbursing the spoils of free trade in a fair manner. Where the old economic mythologies preached submission to “the system” because of its essential fairness, a new politics of resentment now draws attention to those not-so-invisible hands that seem to have weighted the scales to favor the wealthy few. This has raised the possibility that perhaps the American Dream is not actually dying a peaceful death, the victim of job automation and declining American economic hegemony, but is instead being killed off and replaced with a walled garden of success that denies access to all but a select few. (...)

Indeed, in many respects, the present moment seems to recapitulate certain features of the late-nineteenth-century Populist movement. Then, too, dedication to hard work and its value as a source of personal dignity coexisted with concerns that the system was becoming “rigged.” The rise of industrialism and the consolidation of smaller economic entities into larger ones placed new stresses on once largely independent American workers, roughly two thirds of whom had been absorbed into wage labor positions by the end of the nineteenth century. As historian Daniel Rodgers observed of these workers, “No amount of sheer hard work would open the way to self-employment or wealth.” Yet the historical record reveals no rejection of the value of hard work among those objecting to the conditions of wage labor. Rather, criticism and protests were directed against “the system,” which workers believed was abandoning the principles of competition and fair access to opportunity. An 1891 editorial in a Nebraska Populist publication for farmers lambasted conditions that subjected workers to fourteen to sixteen hours of work a day while reducing their relationship to their employer to that of “servant to master, of a machine to its director.” The “competitive system” had become not the pathway to success and wealth but the source of alienation, with the survival of the fittest now serving as a “satanic creed.” Populist writers also blamed the industrial system for breaking its own rules by permitting an “artificial individual”—the publicly traded corporation—to bend the law and the conditions of competition to its own interests.

Calling out the transgressors did not signify a loss of faith in a fair economy, as Populists continued to praise the merits of the “productive classes” of “farmers, laborers, merchants, and all other people who produce wealth,” in the words of an 1896 Populist manifesto. It was the monopolists and financiers of the time who no longer complied with the dictates of fair and competitive capitalism. The Populists sought government remedies precisely to restore the competition and openness assured by the economic myths in which they continued to believe.

The Populist resistance to wage labor and corporate power had little staying power in the twentieth century. As the historian Christopher Lasch explained it, agrarian populism represented the last stand of “producerism” in American history, an ideology that equally valorized small proprietors, shopkeepers, and the yeoman farmer. Increased wages and rising standards of living offered by the Fordist production regimes led to general acceptance of the expanded power of corporations. But as it turned out, the Populist response to transgressions against the American economic mythology in the Gilded Age adumbrated some of the ways American working people since the early 1970s have kept their faith in the American Dream even while economic transformations have made it hard to attain.

Sadly, one of the signal defeats of the Populist movement—its failure to adequately address the continued exclusion of most African Americans from the full range of opportunities—also foreshadowed another problem in the present struggles against a “rigged” system. To be sure, the Populists at times made surprising inroads across racial lines, forming alliances with organizations of black farmers. Their gatherings occasionally brought together both black and white southerners to rally support for Populist candidates and organize strikes for higher farm wages. But as the historian Lawrence Goodwyn observed, the southern blacks joining these associations quickly saw that the protests against vicious corporate monopoly were not sufficient to challenge the underlying racial caste system. Economic improvement by way of higher commodity prices and flexible currency certainly offered some relief from their conditions, but it was not enough to ensure the needed level of protection from prejudice and the threat of violence. There was “no purely ‘economic’ way out,” as Goodwyn wrote.

It is fair to say that a new economic populism—at least as a coherent political movement with significant voting power—has been rendered impotent by cultural identity markers that shape voting patterns. But even more detrimental to any political organizing is the large number of the economically alienated who are swayed by neither left-wing nor right-wing appeals to populism. Their perception of a “rigged” economy goes hand in hand with perceptions of a “rigged” political system offering little hope for change. The best data on the 40 percent of eligible adults who do not vote in US elections (around 100 million Americans in 2016) suggests that they are, in comparison to voters, more likely to be people of color, more likely to make less than $30,000 a year, more likely to have had trouble paying bills in the past twelve months, less likely to have a savings account, and less likely to have any kind of retirement account or health care.

This political reality blunts the chance of real populist challenges to the monopolistic behavior of many of the large corporations that have in recent years consistently pursued anticompetitive acquisitions and mergers, exploited vendor lock-in powers, forced no-compete clauses on low-wage workers, and taken advantage of general economies of scale to eliminate competition. A 2014 Wall Street Journal guest editorial by PayPal founder Peter Thiel sums up the view of the new monopolists: Competition, he wrote, is now only for “losers.” Winners, in other words, no longer have to play on a level playing field. Perhaps what Americans need from their leaders is not so much moralistic scolding about personal responsibility but greater reflection on why figures like Thiel have come to profess an economic mythology so contrary to traditional mythic commitments to fairness and equal opportunity.

by Andrew Lynn, The Hedgehog Review |  Read more:
Image: CalypsoArt/Alamy Stock Photo
[ed. See also: The Next Decade Could Be Even Worse (The Atlantic).]

Oregon’s Decriminalization Vote Might Be Biggest Step Yet to Ending War on Drugs

Thanks to voters, Oregon will be the first state in the country to decriminalize the personal possession of all drugs, including heroin and cocaine. Oregonians passed Ballot Measure 110, also known as the Drug Decriminalization and Addiction Treatment Initiative, with 59 percent of the vote; it’s the most far-reaching of numerous successful drug-related measures on ballots nationwide, including the legalization of recreational marijuana in New Jersey, Arizona, Montana, and South Dakota. Every one of these victories constitutes a long overdue challenge to the racist, carceral logic of drug criminalization. Oregonians have taken a historic step in recognizing that if a carceral approach to drug use is harmful, it is harmful regardless of the drug in question.

In a general election between two candidates proudly committed to “law and order” politics and aggressive policing — regardless of Joe Biden’s minimal gestures towards marijuana reform — Oregon’s ballot decision offered a glimmer of hope for those interested in significant criminal justice changes. The success of Measure 110, alongside other decriminalization and legalization efforts, is a rebuke to the notion that any person who uses illegalized drugs, no matter what the substance, is best served by an interaction with the police and prison system.

“This is part of how we reform policing: by getting them out of the drug business,” wrote Brooklyn College sociology professor Alex Vitale, author of “The End of Policing,” on Twitter. Vitale was referring to the four states that voted to legalize recreational marijuana, but he added that “Measure 110 in Oregon is even better.”

Oregon’s decriminalization measure should not be confused with the legalization of all drugs; it instead entails the removal of criminal penalties for the possession of small amounts of illegal substances. After February 1, the penalty for drug possession will be akin to a hefty traffic ticket: a $100 fine. Those who cannot or do not want to pay can choose to agree to a “health assessment” at an addiction recovery center. The ballot measure also includes the expansion of access to recovery treatments, housing, and harm reduction services, to be funded through the reallocation of tens of millions of dollars from Oregon’s cannabis tax. Money saved by not arresting, prosecuting, and caging people found with drugs will also be redirected to a fund for treatment services.

Measure 110 would reduce convictions for drug possession by nearly 90 percent: from 4,057 convictions in 2019 to an estimated 378 in the coming year, according to the Oregon Criminal Justice Commission. The same report found that racial disparities in drug arrests could drop by 95 percent, and that convictions of Black and Indigenous Oregonians could drop by a staggering 94 percent. These figures alone show the unerringly racist bent of drug arrests and convictions — just further proof of an insupportable system.

While a first step in the U.S., all-drug decriminalization is not an untested experiment. The Drug Policy Alliance, among other decriminalization and legalization advocates, points to Portugal, where all drugs have been decriminalized for the last two decades. Drug use rates in Portugal have remained lower than the average use rates in Europe, and far lower than those in the U.S. — even while Portugal has suffered through years of heightened economic crisis and extreme spikes in unemployment. Meanwhile, rates of HIV infection have fallen, from 1,575 cases in 2000 to 78 cases in 2013. Following decriminalization, arrest rates for drug-related offenses dropped by 60 percent, while the number of people enrolled in drug treatment programs increased by 60 percent in turn. The number of annual drug overdose deaths has plummeted.

“Usually the focus is on the decriminalization itself, but it worked because there were other services, and the coverage increased for needle replacement, detox, therapeutic communities, and employment options for people who use drugs,” Ricardo Fuertes, a Lisbon-based project coordinator with an outreach organization founded by people living with HIV, told Vice News in 2016. “It was the combination of the law and these services that made it a success. It’s very difficult to find people in Portugal who disagree with this model.”

by Natasha Lennard, The Intercept | Read more:
Image: Yes on Measure 110

Thursday, November 12, 2020

Red Shirt

Kirk: All right, men, this is a dangerous mission. And it's likely one of us will be killed. The landing party will consist of myself, Mr. Spock, Dr. McCoy, and Ensign Ricky.
Ensign Ricky: Aw, crap.
      ~Family Guy

This is the Good Counterpart of Evil Minions and Mooks — set filler for our heroes' side. Their purpose is almost exclusively to give the writers someone to kill who isn't a main character, although they can also serve as Spear Carriers. In a series where The Main Characters Do Everything, if you suddenly see someone else who you've never seen before involved in the main story, they are probably Redshirts.

They are used to show how the monster works, and demonstrate that it is indeed a deadly menace, without having to lose anyone important. Expect someone to say "He's Dead, Jim", lament this "valued crew member's senseless death", and then promptly forget him. Security personnel in general fall victim to the worst shade of this trope, as most of the time their deaths aren't even acknowledged at all; according to Hollywood, you could walk into a bank and shoot a security guard right in the face without anyone making a fuss. If you shot anyone else afterward, the headline would just read "Bank Customers Killed".

Please note: this Trope is actually very inaccurate when you compare it to Real Life. If you were to watch every episode of Star Trek: The Original Series, count the number of casualties that the Enterprise had, and then compare that to an actual military, you'd see that Kirk's record as a leader in this regard is excellent, far better than any general in U.S. history. Even war heroes like George Washington and Dwight D. Eisenhower had proportionately more casualties among their troops.

Also note that while this trope was true in a strictly numerical sense for the original Star Trek series (25 crew members died with red shirts on, 10 with gold, and 8 with blue), it is not true in terms of the percentage of red shirts shown. As a percentage of total crew, 10 percent of red shirts died, versus 18 percent of gold shirts.

In mass quantities, they make up the Red Shirt Army. Frequently overlaps with Men Are the Expendable Gender and Black Dude Dies First.

by TV Tropes |  Read more:
Image: Star Trek: A Red Shirt in his natural state.
[ed. We're all red shirts in this crazy reality show.]

Wednesday, November 11, 2020

'Make America Rake Again'


We begin in many people’s happy place, at Four Seasons Total Landscaping. As you may know, Donald Trump’s losing presidential campaign held a press conference that has passed immediately into the annals of political comedy. And also the annals of horticultural business marketing. Consider this Philadelphia gardening establishment the world’s leading purveyor of seasonal colour.

If you somehow missed the Four Seasons Total Landscaping story, it was truly the quattro stagioni of political events. Each time it seemed it couldn’t get any better, there turned out to be some new quarter of it to enjoy.

But let me briefly summarise. On Saturday, the current US president tweeted that a “big press conference” would be held that morning at the Four Seasons in Philadelphia. Shortly thereafter, his account offered clarification – that wasn’t the hotel, but somewhere called Four Seasons Total Landscaping. Double-taking at their satnavs, reporters scrambled to this prestige location in a suburban business park, where Trump branding had been hastily affixed to the roller door of a single-storey building. Then again, the backdrop was really the best of it. Pan out, and the venue lay next door to a sex shop and a crematorium.

Clearly this was … unconventional. Yet amazingly, the world’s media would indeed end up being addressed there. Not by Trump, but by his personal lawyer, Rudy Giuliani. Dead people were always voting in Philadelphia, Rudy claimed. Joe Frazier, and Will Smith’s dad (twice).

And as he said all this, he was flanked by a long line of unsmiling campaign guys trying to look like nothing could be more normal than standing in a forgotten corner of suburbia in front of some garden hoses. There are millions of potential captions to the picture. Let’s go with something befitting the tragedy: They Were Four Years In Power.

Perhaps the biggest question to come out of the Four Seasons Total Landscaping press conference is: why did they carry on with it? Some sort of mistake had clearly been made, so why did they persist and pretend it hadn’t? Many speculate it was down to fear of not obeying the will of the White House idiot, however lunatic the reality of it may appear. Others simply think that by the time the campaign staff stopped screaming, they felt they were in too deep to turn around.

Either way, the upshot is the same: no matter the absurdity of any situation, no matter how ridiculous it looks when you get there, there will ALWAYS be a line of guys ready to butch it out like it was their plan all along. There will ALWAYS be a line of guys who feel that it is somehow less ridiculous to look completely ridiculous than it is to simply say: “Oh wait, we made a mistake – give us half an hour and we’ll tell you the new venue.” There will ALWAYS be a line of guys who, even if they walked over a cliff, would leave very specific last words echoing behind them. “I meant to do that.”

by Marina Hyde, The Guardian |  Read more:
Image: Mark Makela/Reuters

Masters 2020: A Hole by Hole Guide


Masters 2020: Tommy Fleetwood's hole-by-hole guide to Augusta (see more)

[ed. It's Masters Week (delayed 7 months)]

Facebook, QAnon and the World's Slackening Grip on Reality

It’s hard to describe the movement that Philip fell into. QAnon has its roots in the “pizzagate” conspiracy, which emerged four years ago after users poring over hacked Democratic party emails on the message board 4chan said that, if you replaced the word “pizza” with “little girl”, it looked as if they were discussing eating children. That claim – whether it was made in jest or sincerity is impossible to tell – spiralled into allegations of a vast paedophilic conspiracy centred on Comet Pizza, a restaurant in Washington DC.

A year later, a 4chan user with the handle “Q Clearance Patriot” appeared, claiming to be a government insider tasked with sharing “crumbs” of intel about Donald Trump’s planned counter-coup against the deep state forces frustrating his presidency. As Q’s following grew, the movement became known as the Storm – as in, “the calm before …” – and then QAnon, after its founder and prophet. At that point, QAnon was a relatively understandable conspiracy theory: it had a clear set of beliefs rooted in support for Trump and in the increasingly cryptic posts attributed to Q (by then widely believed to be a group of people posting under one name).

Now, though, it’s less clearcut. There’s no one set of beliefs that define a QAnon adherent. Most will claim some form of mass paedophilic conspiracy; some, particularly in the US, continue to focus on Trump’s supposed fightback. But the web of beliefs has become all-encompassing. One fan-produced map of all the “revelations” linked to the group includes references to Julius Caesar, Atlantis and the pharaohs of Egypt in one corner, former Google CEO Eric Schmidt and 5G in another, the knights of Malta in a third, and the Fukushima meltdown in a fourth – all tied together with a generous helping of antisemitism, from the Protocols of the Elders of Zion to hatred of George Soros. QAnon isn’t one conspiracy theory any more: it’s all of them at once.

In September, BuzzFeed News made the stylistic decision to refer to the movement as a “collective delusion”. “There’s more to the convoluted entity than the average reader might realise,” wrote BuzzFeed’s Drusilla Moorhouse and Emerson Malone. “But delusion does illustrate the reality better than conspiracy theory does. We are discussing a mass of people who subscribe to a shared set of values and debunked ideas, which inform their beliefs and actions.”

At first, QAnon was a largely US phenomenon, with limited penetration in the UK. The pandemic, however, has changed that. According to recent polling by Hope Not Hate, one in four people in Britain now agree with some of the basic conspiracies it has promulgated: that “secret satanic cults exist and include influential elites”, and that “elites in Hollywood, politics, the media” are secretly engaging in large-scale child trafficking and abuse. Nearly a third believe there is “a single group of people who secretly control events and rule the world together”, and almost a fifth say that Covid-19 was intentionally released as part of a “depopulation plan”. (...)

For many, the existence of Facebook – and its sister products, including WhatsApp, Instagram and Messenger – has been a lifeline in this period. The social network has always prided itself on connecting people, and when the ability to socialise in person, or even leave the house, was curtailed, Facebook was there to pick up the slack.

But those same services have also enabled the creation of what one professional factchecker calls a “perfect storm for misinformation”. And with real-life interaction suppressed to counter the spread of the virus, it’s easier than ever for people to fall deep down a rabbit hole of deception, where the endpoint may not simply be a decline in vaccination rates or the election of an unpleasant president, but the end of consensus reality as we know it. What happens when your basic understanding of the world is no longer the same as your neighbour’s? And can Facebook stop that fate coming to us all? (...)

Facebook has become a centre point of civil society. It’s more than just a place to share photos and plan parties: it’s where people read news, arrange protests, engage in debate, play games and watch bands. And that means that all the problems of civil society are now problems for Facebook: bullying, sexual abuse, political polarisation and conspiracy theorists all existed before the social network, but all took on new contours as they moved online.

And this year they really moved online. As the initial lockdown was imposed across much of the world, people’s relationship to the internet, and to Facebook in particular, evolved rapidly. Stuck socially distancing, people turned to social networking to fill an emotional void.

Suddenly, the company found itself staring at unprecedented demands. “Our busiest time of the year is New Year’s Eve,” says Nicola Mendelsohn, Facebook’s vice-president for Europe, the Middle East and Africa, over a Zoom call from her London home. “And we were seeing the equivalent of New Year’s Eve every single day.” It was, she says, the inevitable result of having “almost the entire planet at home at the same time”.

Rachel agrees. “I believe the lockdown played a huge part in altering people’s perception of reality,” she says. When Covid restrictions came in, the rules of social interaction were rewritten. We suddenly stopped meeting friends in pubs, at the coffee point or by the school gates, and our lives moved online. And for many of us, “online” meant “on Facebook”.

by Alex Hern, The Guardian |  Read more:
Image: Stephanie Keith/Getty Images

Tuesday, November 10, 2020

The Swamp Gets Nervous

Over the weekend, people started making lists.

Alexandria Ocasio-Cortez kicked things off on Friday with a tweet that terrified Trumpworld.

“Is anyone archiving these Trump sycophants for when they try to downplay or deny their complicity in the future?” she wrote. “I foresee decent probability of many deleted Tweets, writings, photos in the future.”

A group calling itself the Trump Accountability Project sprang up to heed AOC’s call.

“Remember what they did,” the group’s sparse website declares. “We should not allow the following groups of people to profit from their experience: Those who elected him. Those who staffed his government. Those who funded him.”

Rarely a healthy sign in any democracy, the enemies lists started to freak out some normally unflappable Trump officials in the White House.

“At first I brushed it off as ridiculous, but what is scary is that she’s serious,” said a White House official of AOC’s tweet. “That is terrifying that a sitting member of Congress is calling for something like that. I believe there is a life after this in politics for Trump officials, but the idea that a sitting member of Congress wants to purge from society and ostracize us should scare the American people. It definitely should scare the American people more than it scares me. That type of rhetoric is terrifying when you have 70 million Americans who voted for this president. It might start with Trump officials but what if they go further?”

Before the election, when polls suggested an anti-Trump rout, some current and former Trump officials seemed to be positioning themselves for a new era when they would be forced to shed their association with the president. One top official at the White House became a bit of an inside joke among Washington reporters for sending conspicuous private texts taking digs at the administration and claiming to crave post-election life without Trump.

And some Republicans found it curious when a recent RNC official suddenly tweeted his support of Biden the day before the election. Was he having trouble finding a new job? Did he move to Silicon Valley or Portland, Ore.?

But the results, at least for the moment, have changed that conversation, with more Republicans on the Hill and Trump officials now insisting there may be less of a penalty for service to Trump.

Many top Trump advisers now say they’re not worried, and they point to the aftermaths of similarly controversial administrations as reassurance. They argue that if the Bush-era politicians and staffers who led the country to war in Iraq survived without being purged from politics, media and corporate America, then Trump’s advisers won’t be purged, either.

“The Bush people faced this,” said one of the president’s closest advisers. “Bush left office very unpopular, people thought thousands of people died in an unnecessary war and he was responsible for it. Everybody forgets that now that he’s an artist who doesn’t do partisan politics.”

This person pointed to the wealth accumulated by the two main architects of the war since Bush left office. “Don Rumsfeld did very well for himself when he left government,” said the close Trump adviser, who already has an unannounced book deal in hand. “Dick Cheney? I’ve been to his house in Wyoming!”

The close Trump adviser did allow that some staffers could have trouble in pockets of corporate America or Hollywood but the adviser isn’t personally concerned about finding work. “For somebody like me, I'm writing a book, I'm going to write a sequel,” the close Trump adviser said. “I get paid handsomely to give speeches. I have my corporate consulting. Maybe that’s not everyone else. But I can’t imagine I’m alone in that way. Are people going to say, ‘Oh shit, Mike Pompeo, you’re not secretary of State anymore so we can’t talk to you!’ Even the younger staffers — people still want people who worked in the White House. You have breathed rarefied air.”

Interviews with numerous current and former Trump officials reveal that while the talk of lists and permanent cancellation bubbling up on social media is worrisome, few are taking it seriously. Most Trump officials feel that the president’s better-than-expected showing in the election, the history of Bush-era “warmongers” (as one Trump official called them) easily re-integrating into polite society, and the myopia of both the news media and the loudest voices on the left will all conspire to allow even the most controversial Trump aides to continue working in politics and the private sector. (...)

While Trump officials with good reputations and bipartisan relationships will likely land well in the private sector, other mid-level Trump aides might have to launder their experience by working on another campaign or two, or finding a job on Capitol Hill.

“The easier path for some of the people might be to go work on a campaign or go work for another official and then you have another line on your resume,” said a Senate GOP aide, adding that his LinkedIn page “has been getting lit up” with Trump aides seeking to chat about potential jobs.

by Ryan Lizza, Daniel Lippman and Meridith McGraw, Politico | Read more:
Image:J. Scott Applewhite/AP Photo
[ed. Screw these people and screw Politico. Getting rid of the rot in this system is definitely a sign of a healthy democracy. To be charitable, I hope they at least find a nice place under some warm overpass to pitch their tents. See also: Who Goes Nazi (Dorothy Thompson/Harpers).]

Monday, November 9, 2020


Hanna Putz
via:

Tony Joe White

[ed. Brook Benton version here (Cornell Dupree, guitar). Boz Scaggs version here.]

What to Know as ACA Heads to Supreme Court — Again

The Supreme Court on Tuesday will hear oral arguments in a case that, for the third time in eight years, could result in the justices striking down the Affordable Care Act.

The case, California v. Texas, is the result of a change to the health law made by Congress in 2017. As part of a major tax bill, Congress reduced to zero the penalty for not having health insurance. But it was that penalty — a tax — that the high court ruled made the law constitutional in a 2012 decision, argues a group of Republican state attorneys general. Without the tax, they say in their suit, the rest of the law must fall, too.

After originally contending that the entire law should not be struck down when the suit was filed in 2018, the Trump administration changed course in 2019 and joined the GOP officials who brought the case.

Here are some key questions and answers about the case:

What Are the Possibilities for How the Court Could Rule?

There is a long list of ways this could play out.

The justices could declare the entire law unconstitutional — which is what a federal district judge in Texas ruled in December 2018. But legal experts say that’s not the most likely outcome of this case.

First, the court may avoid deciding the case on its merits entirely, by ruling that the plaintiffs do not have “standing” to sue. The central issue in the case is whether the requirement in the law to have insurance — which remains even though Congress eliminated the penalty or tax — is constitutional. But states are not subject to the so-called individual mandate, so some analysts suggest the Republican officials have no standing. In addition, questions have been raised about the individual plaintiffs in the case, two consultants from Texas who argue that they felt compelled to buy insurance even without a possible penalty.

The court could also rule that by eliminating the penalty but not the rest of the mandate (which Congress could not do in that 2017 tax bill for procedural reasons), lawmakers “didn’t mean to coerce anyone to do anything, and so there’s no constitutional problem,” University of Michigan law professor Nicholas Bagley said in a recent webinar for the NIHCM Foundation, the Commonwealth Fund and the University of Southern California’s Center for Health Journalism.

Or, said Bagley, the court could rule that, without the tax, the requirement to have health insurance is unconstitutional, but the rest of the law is not. In that case, the justices might strike the mandate only, which would have basically no impact.

It gets more complicated if the court decides that, as the plaintiffs argue, the individual mandate language without the penalty is unconstitutional and so closely tied to other parts of the law that some of them must fall as well.

Even there the court has choices. One option would be, as the Trump administration originally argued, to strike down the mandate and just the pieces of the law most closely related to it — which happen to be the insurance protections for people with preexisting conditions, an extremely popular provision of the law. The two parts are connected because the original purpose of the mandate was to make sure enough healthy people sign up for insurance to offset the added costs to insurers of sicker people.

Another option, of course, would be for the court to follow the lead of the Texas judge and strike down the entire law.

While that’s not the most likely outcome, said Bagley, if it happens it could be “a hot mess” for the nation’s entire health care system. As just one example, he said, “every hospital is getting paid pursuant to changes made by the ACA. How do you even go about making payments if the thing that you are looking to guide what those payments ought to be is itself invalid?”

What Impact Will New Justice Amy Coney Barrett Have?

Perhaps a lot. Before the death of Justice Ruth Bader Ginsburg, most court observers thought the case was highly unlikely to result in the entire law being struck down. That’s because Chief Justice John Roberts voted to uphold the law in 2012, and again when it was challenged in a less sweeping way in 2015.

But with Barrett replacing Ginsburg, even if Roberts joined the court’s remaining three liberals they could still be outvoted by the other five conservatives. Barrett was coy about her views on the Affordable Care Act during her confirmation hearings in October. But she has written that she thinks Roberts was wrong to uphold the law in 2012.

Could a New President and Congress Make the Case Go Away?

Many have suggested that, if Joe Biden assumes the presidency, his Justice Department could simply drop the case. But the administration did not bring the case; the GOP state officials did. And while normally the Justice Department’s job is to defend existing laws in court, in this case the ACA is being defended by a group of Democratic state attorneys general. A new administration could change that position, but that’s not the same as dropping the case.

Congress, on the other hand, could easily make the case moot. It could add back even a nominal financial penalty for not having insurance. It could eliminate the mandate altogether, although that would require 60 votes in the Senate under current rules. Congress could also pass a “severability” provision, saying that, if any portion of the law is struck down, the rest should remain.

“The problem is not technical,” said Bagley. “It’s political.”

by Julie Rovner, Kaiser Health News |  Read more:
Image: Yegor Aleyev/TASS via Getty Images

Halalu and Sharks at Kaimana Beach (Waikiki)


[ed. For whatever reasons, there seem to be larger than usual runs of halalu (akule - or juvenile scad) and sardines in Hawaii this year. Here's Kaimana Beach at the south end of Waikiki. Everyone seems to want a piece of the action.  See also: The Fish — And Fishermen — Are Back At Ala Moana Beach. Swimmers Are Not Happy (Honolulu Civil Beat).]

Sunday, November 8, 2020

Alex Trebek (July 1940 - November 2020)


What Is The End Of An Era? 'Jeopardy!' Host Alex Trebek Dies At 80 (NPR)
Image: Sony
[ed. What can one say? I'm sure we'll see many tributes in the days to follow. Alex Trebek transformed a mostly trivial medium (game shows) into an intelligent and transfixing art form, and in the process became a cultural touchstone for generations.]

Saturday, November 7, 2020

What Did We Learn?


[ed. At long last, at least a part of our national nightmare appears to be over. But what did we learn?]

The Electoral College Is Close. The Popular Vote Isn’t.

As the presidential race inches agonizingly toward a conclusion, it might be easy to miss the fact that the results are not really close.

With many ballots still left to count in heavily Democratic cities, former Vice President Joseph R. Biden Jr. was leading President Trump on Friday by more than 4.1 million votes. Amid all the anxiety over the counts in Pennsylvania and Georgia, and despite Americans’ intense ideological divisions, there was no question that — for the fourth presidential election in a row, and the seventh of the past eight — more people had chosen a Democrat than a Republican.

Only once in the past 30 years have more Americans voted for a Republican: in 2004, when President George W. Bush beat John Kerry by about three million votes. But three times, a Republican has been elected.

Mr. Biden is very likely to win the Electoral College, avoiding another split with the popular vote. But the prolonged uncertainty in spite of the public’s fairly decisive preference — Mr. Biden’s current vote margin is larger than the populations of more than 20 states, and larger than Hillary Clinton’s margin in 2016 — has intensified some Americans’ anger at a system in which a minority of people can often claim a majority of power.

“We look at a map of so-called red and blue states and treat that map as land and not people,” said Carol Anderson, a professor of African-American studies at Emory University who researches voter suppression. “Why, when somebody has won millions more votes than their opponent, are we still deliberating over 10,000 votes here, 5,000 votes there?”

In principle, the Electoral College could benefit either party depending on the geographic distribution of its supporters. As recently as four years ago, it looked like it would help Democrats, and in 2004, if Mr. Kerry had won just 119,000 more votes in Ohio out of more than 5.6 million cast there, he would have won the presidency despite losing the popular vote.

But in practice, it has overwhelmingly benefited Republicans in recent years despite the national electorate tilting the other way. And the potential for the Electoral College to diverge from the popular vote has only grown as more Americans have come to live in urban areas and many communities have become more ideologically homogeneous.

In 2000, when Al Gore won the popular vote by about 550,000 votes but Mr. Bush won the Electoral College, such a split hadn’t happened in more than a century. Now, it has happened twice in 20 years and come close to happening a third time, despite much larger popular-vote margins. What used to be an extreme rarity has begun to feel common.

Therein lies a more serious concern than partisan politics: the potential delegitimization of the United States’ democratic systems in the eyes of its citizens.

“The more this happens, the more you get the sense that voters don’t have a say in the choice of their leaders,” said Norman Ornstein, a resident scholar at the American Enterprise Institute, a conservative think tank. “And you cannot have a democracy over a period of time that survives if a majority of people believe that their franchise is meaningless.”

The United States, as supporters of the Electoral College often note, is a republic, meaning decisions are made through elected representatives rather than by direct vote. But “the fundamental of a republican form of democracy,” Dr. Ornstein said, “is that voters choose their representatives, who then make decisions on their behalf.”

The prospect of minority rule is certainly not new, and the fact that the Constitution allows it is by design, not accident. Most obviously, the Constitution originally allowed only white men to vote, and most states required voters to own property, too, disenfranchising most Americans.

The three-fifths clause, which counted slaves as three-fifths of a person for the purposes of congressional apportionment, gave Southern states more representation on the backs of people who couldn’t vote for and weren’t represented by their ostensible representatives. By 1820, Dr. Anderson said, the South had 18 to 20 extra seats in the House as a result.

The framers also consciously made the Senate unrepresentative, giving each state two seats regardless of population and leaving it to state legislators to fill them. The intent was for the Senate to serve as a check on the will of the people, which was to be represented in the House.

But the 17th Amendment established direct election of senators in 1913, and the difference in population between the largest and smallest states has vastly increased since the Constitution was written. The current Democratic minority in the Senate was elected with more votes than the Republican majority, and by 2040, based on population projections, about 70 percent of Americans will be represented by 30 percent of senators. (...)

John Koza, the chairman of National Popular Vote Inc., said his group — which has been pushing state legislatures for years to sign on to a compact in which states would pledge to award their electors to the winner of the national popular vote — planned to lobby intensively next year in states including Arizona, Minnesota, North Carolina and Pennsylvania. The compact has already been signed by states, mainly blue, totaling 196 electoral votes, but it will not take effect unless that number reaches 270.

by Maggie Astor, NY Times |  Read more:
Image: Adriana Zehbrauskas
[ed. See also: Victory for Joe Biden, at Last (NYT Editorial Board).]

Friday, November 6, 2020

U.S. Shattered Records For New Coronavirus Cases This Week As Hospitalizations Climb

Image: via:

Stressed? Pick a Color


Aegean Teal

As with so many things, the pandemic has altered the way we see color, and specifically, what colors we do — and do not — want to surround ourselves with while hunkered down at home. Some color trends have accelerated during the pandemic. Other long-popular shades are suddenly all wrong.

“There is a huge wave away from gray,” said Joa Studholme, the color curator for Farrow & Ball, the fancy English paint company. “There’s nothing about gray that evokes wellness.”

by Steven Kurutz, NY Times | Read more:
Image: Benjamin Moore

No Refs, No Teams, Few Rules

In the Forty-Niners Hockey Club, friendships matter more than championships.

“It’s the ultimate pond hockey,” said Steve Carlson of Anchorage.

He should know. The 69-year-old lifelong Alaskan got on the ice when he was 6, played at Dimond High in the late 1960s and hasn’t stopped chasing the puck with his buddies since.

Created for men over the age of 49, the club has no teams, few rules and no refs. The icing on the cake?

“We don’t keep score,” said 82-year-old Jimmy Reese.

But they do keep skating.

“Hockey’s a good workout,” said Reese, a retired truck driver.

The club was created in 2000 so aging players — the average age is about 65 — wouldn’t have to skate with young hotheads who don’t play well with others.

Its members come from all walks of life — plenty of blue-collar workers, city and state employees, coaches, a doctor, fisherman, dentist, lawyer, pilot, judge, teacher and one guy who allegedly wore a skate over his ankle bracelet for a while.

“We don’t do background checks,” Doug Webster said while pulling on his gear in the locker room at the O’Malley Sports Complex.

The experience and maturity of the players means respect for the game takes precedence over egos. Games feel like old-time hockey at its best, the way kids play when there are no adults around.

With teams determined by the traditional method of throwing sticks in the middle of the ice and dividing them equally at random, games begin quietly, without whistle-blowing refs.

Offsides and icing are called by participants. With no faceoffs, perpetual motion is the norm.

Since the club has enough members for two games simultaneously, it rents both sheets of ice. One rink is for the guys feeling their oats. The other is for the fellas who are feeling their age. Careers are extended by allowing players to age gracefully while not slowing down the action for everyone.

There’s a waiting list to get in, and there’s more to it than just waiting your turn. Hopefuls have to pay their dues.

Sweat equity is the currency. Dedication is noted as well.

For those who want to be considered, pickup games in the summer and general reputation weigh heavily in being invited. Regulars in the summer who are deemed a good fit maintain their eligibility.

When someone decides to step away from the club, a wait-lister gets the call.

The 49ers have a committee that considers prospective members. It doesn’t matter if you’re the mayor or a millionaire. How you handle yourself on the ice has more to do with acceptance than who you are or how you stickhandle.

The number fluctuates, but about 75 players are on the club roster.

Carlson, one of the club’s founders, said the group doesn’t take itself too seriously, as illustrated by an email used to welcome new members:

“With diminished skills and/or speed, you have demonstrated your ability to play down to our level. You are age-appropriate and marginal competence is all that we are looking for. That being said, the membership committee has selected you in the 1st round of the supplemental draft.”

by Casey Brogan, ADN | Read more:
Image: Emily Mesner / ADN