Wednesday, October 14, 2020

Total Collapse Of Democracy So Horrifying America Decides It Hasn’t Happened Yet


WASHINGTON—As citizens across the nation sought to insulate themselves from mounting evidence to the contrary, several reports indicated Monday that the idea of the total collapse of democracy was so horrifying that America decided it hadn’t happened yet. “We can’t let them take away our democracy,” said Prescott, AZ insurance agent Daniel Cross, echoing the concerns of a terrified American populace that imagined a future in which the nation’s democratic ideals were hopelessly compromised, and determined that in the meantime, the Electoral College, U.S. Senate, unelected Supreme Court, increased power concentrated in the presidency, lack of universal suffrage, frequent executive overrides of decisions that had majority support of the American populace, the manipulation of voting boundaries on the federal, state, and local levels, a strict two-party system that used legislative means to effectively prevent additional parties from gaining traction, widespread voter suppression, corporate control of the media, massive lobbying sector, outsourcing of public services to profit-driven private firms, concentration of power among a few wealthy individuals, complex legal labyrinths designed to prevent regular people from exercising their basic rights, deregulation that led to widespread health, environmental, and economic hardship, unfettered campaign donations, effective legal immunity on the basis of status, wealth, or membership in a state police force, legal and economic obstacles to free assembly and free speech, poor education in both critical thinking and democratic ideas, unelected local councils and boards with significant influence over the distribution of public resources without fair notice or inclusion of the general populace, and the repeated efforts by the United States to undermine democracy in foreign countries at the expense of undermining its own democratic processes at home didn’t currently exist. 
“This election is a make-or-break moment for our democracy. It’s the most important election of our lifetimes.” Additional reports suggested that the prospects of a badly compromised political system in the United States were so disturbing to contemplate that Americans decided that real democracy had at some point actually existed.

by The Onion |  Read more:
Image: Uncredited

Why the George Floyd Protests Succeeded Where Others Failed

Yves here. This post describes some of the factors that pushed the Black Lives Matter protests past a tipping point. Recall that Black Lives Matter looked about to achieve a breakout into national consciousness after the killing of Eric Garner. “Die-ins” in high-profile places like Grand Central, which regularly had substantial non-black participation, were getting traction. Lambert was tracking Black Lives Matter intensely at the time and saw how Democratic party operatives moved quickly to try to assume leadership or at least influential positions, and succeeded often enough to co-opt the movement. The tactically brilliant die-ins inexplicably stopped.

The reason for featuring this post is to look at an example of activism/movement building that made headway. But having said that, I am not enthusiastic about the author’s claims about the supposed effectiveness of the “defunding the police” demand. It was taken too literally by opponents as a push for no police at all, and some supporters even backed that idea. That demand has generated pushback from the Dems, with Biden promising to increase police budgets (and I don’t buy the idea that embedding social workers will produce more judicious policing; look at how embedding journalists in military forces is a deliberate strategy by governments to generate friendly accounts).

It would have been more persuasive to give concrete examples of what “defunding” would mean, like entirely reconstituting the police force, as was done successfully in Camden, NJ, or getting rid of the budget for military toys and related training. Note that the Pentagon recently confirmed that giving military equipment to local police forces was unproductive, save that some sold it for real money to other police units.

Although correlation is not causation, the transfer of military equipment to police parallels a change in police training toward a paramilitary approach. From a June Atlantic article:
To be sure, federal military-surplus transfers like those through the Defense Department’s 1033 Program do little good, and much harm: Police departments obtaining used Army filing cabinets at cost isn’t cause for concern, but there’s no earthly reason for small-town cops to wear military fatigues, ride around in mine-resistant Humvees, or carry bayonets. Studies suggest that police departments that receive such equipment see no measurable improvement in officer safety or crime rates, but greater quantities do seem to correlate with higher rates of officer-involved shootings and reduced public trust.
By Nara Roberta Silva, a sociologist with a Ph.D. from the State University of Campinas, Brazil. She is a Core Faculty Member at the Brooklyn Institute for Social Research, where she researches and teaches on social movements and democracy, global Marxism, post/anti-colonialism, and social theory. Originally published at openDemocracy

Grievances are commonly pointed out as great protest and movement sparkers. When protests erupted after George Floyd’s death, it was common to hear in news analyses and protesters’ testimonials that Floyd’s fate was “the last straw”. But while blatant racism certainly inspired the protests, it is crucial to consider other elements for a more accurate picture of their rise.

The “last straw” argument is based on two factors. First, Minneapolis diligently adopted the Obama-era reform playbook following other high-profile police killings, including Jamar Clark’s in 2015 and Philando Castile’s in 2016. Over five years, the city implemented a range of measures, including training officers in de-escalation tactics, hiring more African-American cops, and setting up early-warning systems to identify problem officers. But none of that prevented a white officer with a record of complaints from placing his knee on Floyd’s neck for almost nine minutes.

Second, by late May, it was evident that the toll of the COVID-19 pandemic was distributed unequally. African Americans and Latinos are getting sick and dying at higher rates, even in areas where they make up a smaller portion of the population. In this context, the death of George Floyd was a strong symbolic representation of the scandalous disposal of racialised bodies in the current United States.

The current struggle for racial justice does not imply that racism has gotten worse – even in light of the atrocious state of affairs. Floyd’s murder and the disproportionate number of COVID-19 deaths among African Americans could be considered sparkers, but what made the spark become a fire?

by Nara Roberta Silva and Yves Smith, openDemocracy via Naked Capitalism |  Read more:

Tuesday, October 13, 2020

School Is a Form of Child Care

This summer, as debate raged among lawmakers, school districts and parents about whether it was safe to send kids back to school, something strange happened in Howard County, Md.

The Howard County school system decided to remain remote for at least the first semester. But to help parents deal with the lack of in-person care for their children, the county offered elementary school students a spot in parks and recreation programs, which provide “support for virtual learning assignments” along with “work sessions” and “crafts, physical activities, and games” — activities not totally unlike, say, school.

Little mention was made of the adults who will supervise the children during this child care. There was no hand-wringing about classroom configurations or safety guidelines. The catch? Unlike regular public school, which is guaranteed and free, the spots were limited — and cost $219 each week for a full day.

Similar programs have been created in places such as Texas, Vermont, Boston and New York City. And even before the school year began, child care providers in many places never shut down at all; they were mostly left on their own to figure out how to keep their doors open while keeping everyone safe.

What is going on? Why would we endlessly debate the safety of putting children in classrooms, but put them in day care settings without anyone batting an eye — and certainly with no national debate?

In reality, there is no magical distinction that leaves children and adults immune to the coronavirus in a child care setting but not in a school. And yet throughout the pandemic we’ve operated as if there is.

It has taken a once-in-a-lifetime crisis to reveal what was always true: School is — whisper it — a form of child care; child care, at its best, fosters children’s development.

We have long drawn a sharp distinction where there shouldn’t be one. School is, first and foremost, about education. But it is also a safe place for parents to send their children while they’re at work. That fact became torturously clear as school was yanked away, throwing families into chaos. At the same time, while child care allows parents of young children to go to work, the early years are also critical for children’s development, making the educational aspects crucial.

The dichotomy we’ve set up between the two doesn’t serve anyone now, but it didn’t work under normal circumstances, either. Separating child care from the larger K-12 educational system forces many of us to live with an expensive, patchwork, private system for children up to age 5. And ignoring the fact that school is a place where children both learn and are kept safe while their parents work means we haven’t reconciled short school days and academic calendars with a typical working parent’s schedule.

How did we ever let such a bifurcated system grow so established in the first place? The story begins, according to University of Pennsylvania education historian Jonathan Zimmerman, when formal school was first established in this country. In the 1800s, school was transformed state by state from a few weeks of instruction by a teenage girl in a one-room house into a system of formal classrooms with grades and professional teachers. Educational reformers like Horace Mann and Henry Barnard insisted that children needed more formal schooling in order to be informed citizens upholding democracy.

But that same reform push also excluded the nation’s youngest children. When “school” still meant a one-room schoolhouse, working mothers were happy to send babies along with their siblings — which doesn’t mean that this was a good outcome. “They would park the babies in front of the fireplace and the babies would fall asleep and fall backward,” Mr. Zimmerman said. “Kids were injured.” Horace Mann insisted that babies ought to remain at home, to allow the other children to get a better shot at a real education.

“Nobody’s wrong here,” Mr. Zimmerman said. “Mom’s right that we need a place for the 6-month-old, and Horace Mann is right that kids don’t actually belong in the schoolhouse.”

But no one swooped in to help Mom get what she needed — even as a broad consensus formed across the country that all children above the age of 6 had a right to a publicly financed education. “Public support for education starting in grade 1 has never really been a question,” Mr. Zimmerman said, even if we’ve frequently — and vehemently — fought over the particulars of how it’s carried out. School was never about alleviating a burden for parents. It was, for a new republic, based on the idea that the country needed an educated population to make democracy function. It was a civic service.

Child care, on the other hand, was guided by an entirely different logic. “There’s a common thread that runs through the history of child care in this country,” said Chris Herbst, an Arizona State University associate professor who studies child care policy. “And that’s employment.”

Mothers have always worked outside the home. During Mann’s time, they were in mills and factories alongside men. Their presence in the workplace then was controversial, as it remains today. Convincing the public that there should be a service available to take care of their kids while they worked was “a really heavy lift,” Mr. Zimmerman said.

In fact, mothers didn’t get any help from the government until the government needed them to work. During the Great Depression, the Works Progress Administration operated a network of child care centers aimed at helping unemployed mothers get jobs. Then, in the midst of World War II, President Franklin Roosevelt built a network of centers across the country. By then, men were fighting abroad and the ranks of childless women were running thin. But without child care, the nation was subjected to an array of horror stories about what happened when mothers went to work, such as children locked in cars near factories or chained to trailer homes.

And so, between 1943 and 1946, the country had a publicly funded, universal child care system built on the idea that women needed to leave the home to keep America running. It was “one of the earliest examples of child care policy in this country,” Mr. Herbst said. And it sprang directly from an economic need. “It was not designed specifically with education goals or child development goals in mind,” Mr. Herbst said. “It was all about parent employment.”

The centers shut down when the war ended, and the country has yet to create a universal child care program since. It has barely even invested public resources in it. But the few times it has, work has nearly always been the driving priority. In the 1990s, the federal government created programs like the Child Care and Development Block Grant and child care subsidies in the Temporary Assistance for Needy Families program with the goal of “quickly moving mostly unmarried mothers with young children from welfare to work,” Mr. Herbst said. “These were employment programs.”

So child care has always by default been ensnared in the thicket of debate over the role women are meant to play at home and in public. School has rarely come into conflict with that debate. The basic premise that children deserve to learn — and that education is important for the country as a whole — hasn’t been disputed.

And yet the two suffer from being divided from each other. “In both realms, we’re only having half the necessary conversation,” Mr. Herbst said.

by Bryce Covert, NY Times | Read more:
Image: FatCamera/E+, via Getty Images

The Big Boys Are Back: Financializing Single-Family Houses

In many parts of the country, house buying has turned into a drunken land rush. Sales of new houses in August jumped 45% from a year ago, after having jumped 50% in July, to the highest sales rate since 2006, which was the end of the prior housing bubble. Sales of existing homes jumped to the highest rate since early 2007, particularly concentrated on single-family houses.

This land rush has been ascribed to different factors, including people leaving rental apartments in big cities to move to the suburbs or exurbs; people suddenly deciding to start families; people – especially those that benefited from the crisis and made a killing with the Fed’s schemes – buying a second home outside the city; and people buying a home in a hurry without selling their old home first, hoping for a higher price later.

And this is happening during a massive unemployment crisis; a time when 7% of all mortgages have been moved into forbearance; a time when 17% of the FHA-insured mortgages are 30 days or more delinquent, though many of those loans have also been moved into forbearance and put on ice.

Clearly the housing market has split into two, with one part being red hot, and the other part crumbling before our eyes.

But there is something else going on too: A surge of big money into single-family houses as an asset class.

This is now taking different forms: There are the buy-to-rent companies that grew out of the Financial Crisis. And there are companies that are now building new houses specifically as rental houses; and there are the iBuyers – companies that buy houses to then sell them, a business model that is intended to replace the brokerage model though it has done nothing but lose money, but no problem. And then there are companies buying houses and leasing them back to the former homeowners so that they can resolve a mortgage delinquency without having to move.

And these companies have raised many billions of dollars since April in the capital markets that have gone totally nuts.

Homebuilder ResiBuild was set up specifically to build houses as rental properties. The Atlanta-based company is now raising $1.2 billion, including $400 million in equity and $800 million in debt, to build 5,000 houses with three or four bedrooms and a two-car garage that rent for about $1,850 a month. And it plans to manage those rentals.

Co-founder Jay Byce comes out of Colony Capital, which got into the buy-to-rent business that the Fed so hotly encouraged during the mortgage crisis in 2011 and 2012, when companies with cheaply borrowed billions of dollars swooped in and bought tens of thousands of houses out of foreclosure to then rent them out.

Jay Byce told Bloomberg: “We were already seeing both boomers and millennials move to rental communities because they wanted more room and a low-maintenance lifestyle. Covid has accelerated the shifts that were already happening.”

The theory is that people want more space and want to live in the suburbs, but prefer renting to owning. (...)

Despite the malaise in the big-city rental apartment sector, the buy-to-rent companies, such as Invitation Homes, have reported record high occupancy rates and on-time rent collections that are roughly in line with pre-Covid averages.

Invitation Homes said in an investor presentation in September that it would be getting into the sale-leaseback market – buying single-family houses from homeowners and leasing them back to the former homeowners, who would do this to cash out without having to move.

The sale-leaseback method of raising cash has been practiced for a long time by owners of commercial property, airplanes, equipment, etc. But in terms of single-family houses, it’s fairly new.

But now there is an angle to the sale-leaseback model that is totally new: 7% of mortgages are in forbearance and others are delinquent but are at the moment protected by a moratorium on foreclosures. These homeowners will eventually have to deal with reality, which could mean a forced sale or foreclosure.

But with a sale-leaseback, they could sell the home and lease it back, so they’d become renters and wouldn’t have to move. Startups are getting into this deal, raising lots of money to do this.

In other words, companies are lining up to take advantage of the mess that will ensue when the forbearance period and the foreclosure moratorium end, which is when these mortgages become officially delinquent and face foreclosure – and it will produce another wave of homes being taken over by investors and becoming an asset class.

by Wolf Richter, Wolf Street |  Read more:

To Mend a Broken Internet, Create Online Parks

As we head into the most consequential, contentious election in our history, it’s time to fix some of the structural problems that led us to this moment.

Let’s face it: Our digital public sphere has been failing for some time. Technologies designed to connect us have instead inflamed our arguments and torn our social fabric.

It doesn’t have to be this way. History offers a proven template for how to build healthier public spaces. As wild as it sounds, part of the solution is no further than your nearest public park.

For my family, that’s Fort Greene Park, a 30-acre square of elm trees, winding paths, playgrounds and monuments in Brooklyn. The park serves as an early-morning romper room, mid-day meeting point, festival ground and farmstand. There are house-music dance parties, soccer games during which you can hear cursing in at least five languages, and, of course, the world-famous Great Pupkin Halloween Dog Costume Contest. In short, the park allows very different people to gather, see each other, and co-exist in the same space. When it’s all working, Fort Greene Park can feel like an ode to pluralistic democracy itself.

That’s not a coincidence—it’s by design. In 1846, Walt Whitman envisioned Fort Greene Park to serve this precise purpose. New York City had no public parks at the time—only walled commercial pleasure gardens for those who could afford to enter. Whitman, then an up-and-coming newspaper editor, used the Brooklyn Eagle’s front page to advocate for a space that would accommodate everyone, especially the working-class immigrants crowded into shanty towns along nearby Myrtle Avenue.

Whitman saw public spaces as critical elements of the new American democracy. They were spaces to celebrate individuality and build collective identity. Public parks, he argued, could help weave a greater, more egalitarian “we.”

In Fort Greene Park, this project—the building of a collective identity, the weaving of a social fabric—is ongoing. That the Park was the rallying point for one of New York’s first major Black Lives Matter protests after George Floyd’s murder is not incidental. Conflict and contestation are important parts of how healthy democracies progress, as long as there are structures that facilitate them. Functional public spaces are central to this work. They allow us to assemble, to share common experiences, and to demonstrate that what might have seemed like individual struggles are actually the result of unjust systems that demand correction.

Now, accelerated by the pandemic, we spend much of our time living and conversing with others in a different location: digital space. But social media and messaging platforms weren't designed to serve as public spaces. They were designed to monetize attention. (...)

Private spaces and businesses are critical for a flourishing digital life, just as cafes, bars and bookstores are critical for a flourishing urban life. But no communities have ever survived and grown with private entities alone. Just as bookstores will never serve all the same community needs as a public library branch, it’s unreasonable to expect for-profit corporations built with “addressable markets” in mind to accommodate every digital need.

Alongside and between the digital corporate empires, we need what scholars like Ethan Zuckerman are calling “digital public infrastructure.” We need parks, libraries, and truly public squares on the Internet.

by Eli Pariser, Wired |  Read more:
Image: Getty

Monday, October 12, 2020

The Best Movies on Amazon Prime Video Right Now


The Best Movies on Amazon Prime Video Right Now (NY Times)

[ed. Placeholder, to remember which ones to check out.]

Record Subsidies to Farmers Ahead of Election Day


Trump Funnels Record Subsidies to Farmers Ahead of Election Day (NY Times):

Federal payments to farmers are projected to hit a record $46 billion this year as the White House funnels money to Mr. Trump’s rural base in the South and Midwest ahead of Election Day.

The gush of funds has accelerated in recent weeks as the president looks to help his core supporters who have been hit hard by the double whammy of his combative trade practices and the coronavirus pandemic. According to the American Farm Bureau, debt in the farm sector is projected to increase by 4 percent to a record $434 billion this year and farm bankruptcies have continued to rise across the country. (...)


The breadth of the payments means that government support will account for about 40 percent of total farm income this year. If not for those subsidies, U.S. farm income would be poised to decline in 2020. 

(Image: Rose Marie Cromwell for The New York Times)
[ed. Socialism? No, just bribery. Buying votes, pure and simple.]

Kafka in Pieces

The unfinished draft of The Castle, Franz Kafka’s third and final novel, ends mid-sentence. But when the manuscript made its initial entrée into the world, the text had been edited into completion. Max Brod, Kafka’s friend and literary executor, who prepared the original 1926 edition, later reflected that his “aim was to present in accessible form an unconventional, disturbing work which had not been quite finished: thus every effort was made to avoid anything that might have emphasized its fragmentary state.” To accomplish this obfuscation of the novel’s incomplete form, Brod redacted nearly a fifth of the text. He eventually thought better of the choice, and in the second edition restored most of what he’d cut—but by then, his success at attracting interest in Kafka’s work had led to its placement on the Nazis’ “List of Harmful and Undesirable Literature.” This prevented the more faithful edition from reaching a wide German audience until the fall of the Reich. Meanwhile, Kafka’s readership grew abroad thanks to the 1930 English translations by Willa and Edwin Muir, who based their rendering of The Castle on Brod’s original edition—presenting the novel not as a fragment, but as a completed whole.

The state in which Kafka left The Castle is representative of the condition of his entire oeuvre. During his life, he published a few stories in periodicals, released one collection of fiction, and prepared another that appeared only posthumously. But he left behind the vast majority of his work incomplete—infamously, with a note beseeching Brod to burn every word. Brod approached the other novels he declined to destroy much as he did The Castle, omitting unfinished chapters from The Trial and altering the ending of Kafka’s first novel, The Man Who Disappeared, which he renamed Amerika. As for the reams of stories and aphorisms, Brod bestowed titles on many pieces that lacked them and amended aborted conclusions.

Over the intervening decades, generations of scholars and translators have contested the comparatively polished Kafka that Brod constructed. (...) Poet and translator Michael Hofmann has been another advocate of the rougher-edged Kafka, through his version of Kafka’s first novel, given the compromise title Amerika: The Man Who Disappeared (2004); a collection of Kafka’s stories, Investigations of a Dog: And Other Creatures (2017); and now The Lost Writings.

This new volume collects seventy-four short pieces—few longer than two pages, many unconcluded—curated by Reiner Stach, author of the definitive three-volume biography of Kafka. In his afterword, Stach argues that, though “the fragile, fragmentary quality” of Kafka’s work has been extremely influential, even “caus[ing] us to begin to take the literary fragment seriously,” the writer’s most fragmentary material has remained hidden from view: rarely translated, often out of print, barely read, and thus, even if not completely absent, still fundamentally “lost.” (...)

Though Stach does admit to certain Brodian concessions—he writes that the “selection seeks above all to be accessible,” presenting “highly approachable, ‘readable’ pieces”—The Lost Writings thrillingly foregoes any organizational infrastructure that might help us orient ourselves, from titles to sections to a table of contents. Besides the selections themselves and the afterword, there is only an index of first lines. This minimalist approach submerges readers immediately into Kafka’s world. The very first page fixes us in one of his classic traps: “I lay on the ground at the foot of a wall, writhing in agony, trying to burrow down into the damp earth.”

We might divide the pieces collected in The Lost Writings into gems and shards—the former seemingly conceptually complete, the latter obviously broken-off—each with their own particular kinds of obscurity. Within that scheme, this first text would be a gem; the few sentences that comprise the rest of the tale briefly sketch the scene, unfurling the plight contained in the opening. There’s the narrator as prey, a coach equipped with a driver and dogs already bored with the kill, and a hunter “greedily pinching [the narrator’s] calves.” Most startlingly, there is also a brutal expression of desire thwarted: “athirst, with open mouth, I breathed in clouds of dust.”

This opening parable of doomed impotence stutters into the next—another gem, in which the circumstances and mode are completely transformed, but the essential dynamics of predator and prey are unchanged. Here the narrator assumes the cruel, empowered position occupied by the hunter’s retinue in the first piece. He addresses the ensnared:
So, you want to leave me? Well, one decision is as good as another. Where will you go? Where is away-from-me? The moon? Not even that is far enough, and you’ll never get there. So why the fuss? Wouldn’t you rather sit down in a corner somewhere, quietly? Wouldn’t that be an improvement? A warm, dark corner? Aren’t you listening? You’re feeling for the door. Well, where is it? So far as I remember, this room doesn’t have one.
Absent exits recur throughout the collection, as do impassable doorways, the portal’s useless presence only heightening the atmosphere of entrapment. In a later fragment, a shard, a pack of beasts return home from stealing a drink of water from a pond, only to be chased by punishers wielding whips into “the ancestral gallery, where the door was slammed shut, and we were left alone.” In another, a gem, the narrator inexplicably paces “an averagely large hall softly lit by electric light.” “The room had doors,” he reports, “but if you opened them, you found yourself facing a dark wall of sheer rock barely a hand’s breadth from the threshold, and running straight up to either side, as far as one could see. There was no way out there.”

This same quietly suffocating phrase, “no way out”—a Kafkan fragment in itself—appears in another text, a dialogue between an unnamed interlocutor and a chimpanzee named Red Peter who has been seized from his jungle home by humans and eventually learns to behave like one. The ape is also the narrator of “A Report to an Academy,” one of the few stories Kafka finished and published in his lifetime. In the full story, Red Peter expresses his desire for a “way out” and devotes some time to glossing the meaning of the phrase: “I fear that perhaps you do not quite understand what I mean by ‘way out,’” he tells his audience in Willa and Edwin Muir’s translation. “I use the expression in its fullest and most popular sense. I deliberately do not use the word ‘freedom.’ I do not mean the spacious feeling of freedom on all sides”—leaving us to deduce what sort of comparatively inhibited liberty he seeks instead. In the version in The Lost Writings, a dejected Red Peter recounts the story of his capture, before which, he says, he “hadn’t known what that means: to have no way out.” He goes on to explain that he was contained not in “a four-sided cage with bars”; rather, “there were only three walls, and they were made fast to a chest, the chest constituted the fourth wall.” Everything hinges, it seems, on this fourth wall. In the Kafkan cosmology, three walls are presumed. The frightening tragedy is the way one’s own existence constitutes the final barrier.

The specter of this absent fourth wall haunts The Lost Writings. It returns explicitly in one other fragment, in which the narrator describes the space in which he’s held captive. “It was no prison cell, because the fourth wall was completely open,” he tells us in the first line. Here self-obstruction is expressed not in the character’s actual physical predicament, but in his interpretation of it: the fourth wall’s openness serves only to accentuate that he doesn’t even attempt to break free. In fact, the narrator says it’s for the best that his nudity prevents him from fashioning an escape rope out of garments; given the possible “catastrophic results,” it’s “better to have nothing and do nothing.”

Elsewhere, Kafka finds other ways to represent the structure of self-defeat. In one fragment, the narrator tells of being mysteriously unable to stay with a girl he loves. “It was as though she was surrounded by a ring of armed men who held out their lances in all directions,” he claims at first, but then revises this account: “I too was ringed by armed men, though they pointed their lances backward, in my direction. As I moved toward the girl, I was immediately caught in the lances of my own men and could make no further progress.” Another piece draws the self-abnegation deeper into the speaker, and exhilaratingly accelerates the velocity of its expression. “I can swim as well as the others,” the narrator says, “only I have a better memory than they do, so I have been unable to forget my formerly not being able to swim. Since I have been unable to forget it, being able to swim doesn’t help me, and I can’t swim after all.”

by Nathan Goldman, The Baffler |  Read more:
Image: Colin Laurel

Louise Glück Should Refuse the Nobel Prize for Literature

I don't know what the American poet Louise Glück said when the Swedish Academy informed her that she won this year’s Nobel Prize for Literature, but I know what she should have said: “Thanks, but no thanks.”

October is the season of the Nobel Prizes, when a handful of people are catapulted into fame and fortune due to the philanthropic legacy of the inventor of dynamite. Four of the six prizes named after Alfred Nobel are generally uncontroversial — physics, chemistry, medicine, and economics — but the peace and literature prizes arouse passions. There is good reason to be dubious of the peace prize, which has gone to some great people and organizations but also went to Henry Kissinger and Aung San Suu Kyi. Yet it’s the literature prize that, in its current form, has definitely outlived its usefulness and caused great damage.

Last year, the Nobel Prize for Literature was awarded to Peter Handke, an Austrian writer who created a set of impressive literary works in the first part of his career but who, since the 1990s, has fallen into a morass of genocide denial. In recent decades, Handke wrote at least a half-dozen books and plays that downplayed and denied the genocide committed by Serbs against Muslims during Bosnia’s war. Handke even attended the funeral of, and delivered a eulogy for, the former leader of Serbia, Slobodan Milosevic, who died while on trial for war crimes. (...)

The Swedish Academy is a strange organization. It has just 18 members who are appointed for life and who select new members by secret ballot — and the country’s king must approve them. The decision to give the 2019 prize to Handke is not the only evidence of the organization’s unfitness to manage the literature prize. The Academy had to postpone the 2018 award because of revelations that for decades it had abetted sexual harassment and rape by the husband of one of its members. Once that scandal broke open, thanks to the investigative work of journalist Matilda Gustavsson of Dagens Nyheter, the dismal response of the male-dominated Academy included forcing out a female member, Sara Danius, who was pushing for sweeping reforms in its ranks.

In a way, we can be thankful for these scandals because they are reminders of the need to implement a root-and-branch reform of the Nobel literature prize. For much of its existence, the prize generally served as a referendum on the best in Western literature. For that task, the 18 members of the Swedish Academy were a serviceable jury. But more than ever, the reach and aspiration of the Nobel literature prize are truly global. It is laughable and tragic that an award of such influence should be controlled by a tiny and secretive group of Swedes, let alone ones who have shown themselves to be abettors of sexual assault and genocide denial.

by Peter Maass, The Intercept |  Read more:
Image: Robin Marchant/Getty Images

Sunday, October 11, 2020

Jack White


Jack White Gives A Thrilling Performance On 'SNL' — On 2 Days' Notice (NPR)

This week's Saturday Night Live musical guest was supposed to be Morgan Wallen, before the country singer got himself disinvited. On Friday morning, SNL creator Lorne Michaels announced that Jack White — whose best-known band, The White Stripes, releases a greatest-hits album in December — would show up to perform in Wallen's place.

Though we're only two episodes into Season 46, it's hard to imagine that White's turn won't come out near the top when it's time to rank SNL's 2020-21 musical guests. Unencumbered by new material to promote, White kicked off with a fantastic medley and cranked out a few scorching career highlights, including 2014's solo hit "Lazaretto" — which he performed with a guitar designed for him by the late Eddie Van Halen.

The earlier performance fused unexpected pieces in remarkable ways, kicking off with "Don't Hurt Yourself" — the song he co-wrote and performed on Beyoncé's Lemonade — before shifting into a version of The White Stripes' "Ball and Biscuit" that incorporated lyrics from Blind Willie Johnson's "Jesus Is Coming Soon." So, on two days' notice, we got an Eddie Van Halen tribute, Beyoncé, The White Stripes, a Jack White solo song and Blind Willie Johnson. Not bad!

It was the sort of performance SNL could stand to showcase more often: a veteran star without much to promote or prove, popping by with the sole objective of putting on the best show humanly possible.



Andy Warhol, Knives, 1982
via: here and here

Political Economy After Neoliberalism

If anything could have dislodged the neoliberal doctrine of freeing the market from the government, you might have expected the coronavirus pandemic to do the trick. Of course, the same was said about the global financial crisis, which was supposed to transform everything from macroeconomic policy to financial regulation and the social safety net.

Now we are facing a particularly horrifying moment, defined by the triple shock of the Trump presidency, the pandemic, and the economic disasters that followed from it. Perhaps these—if combined with a change in power in the upcoming election—could offer a historic window of opportunity. Perhaps. But seizing the opportunity will require a new kind of political-economic thinking. Instead of starting from a stylized view of how the world ought to work, we should consider what policies have proved effective in different societies experiencing similar challenges. This comparative way of thinking increases the menu of options and may suggest novel solutions to our problems that lie outside the narrow theoretical assumptions of market-fundamentalist neoliberalism.

Neoliberalism implies a one-size-fits-all set of policy solutions: less government and more market, as if the “free market” were a single equilibrium. To the contrary, we know that there have been multiple paths to economic growth and multiple solutions to economic crises in different societies. By recognizing that there is not one single path to good outcomes, that real-world markets are complex human constructions—governed in different places by different laws, practices, and norms—we open up the possibility that policies that seem objectionable in light of neoliberal abstractions may deliver high performance along both social and economic dimensions. 

We know about these possibilities from the work of economic sociologists, who stress the political, cultural, and social embedding of real-world markets. From work in comparative political economy, demonstrating how the relationships between government and industry and among firms, banks, and unions vary from one country to another. From political and economic geographers, who place regional economies in their spatial contexts and natural environments. From economic historians, who explore the transformation of the institutions of capitalism over time. From an emergent Law and Political Economy (LPE) movement that aspires to shift priorities from efficiency to power, from neutrality to equality, and from apolitical governance to democracy. And from economists—often villainized as the agents of neoliberalism—who are exploring novel approaches to the problem of inequality and the slowdown in productivity, and show renewed concern with the economic dominance of a few large firms.

The challenge is to bring these insights together.

As a step in this direction, we propose three core principles of an alternative political economy. We then illustrate these principles by discussing the dynamics of the American political economy, focusing particularly on the rise of “shareholder capitalism” in the 1980s. Finally, we apply the principles to the ongoing national policy responses to the COVID-19 pandemic, comparing the United States to Germany.

We recognize that these principles do not resolve the very real problem of the dominance of business in U.S. politics and the political gridlock produced by this configuration of power. Still, they point in new and urgent directions.
***
First, then, governments and markets are co-constituted. Government regulation is not an intrusion into the market but rather a prerequisite for a functioning market economy. Critics of neoliberalism often make the case for government “intervention” in the market. But why refer to government action as intervention? The language of intervention implies that government action contaminates a market otherwise free of public action. To the contrary, the alternative to government action is not a perfect market, but rather real-world markets thoroughly sullied by collusion, fraud, imbalances of power, and the production of substandard or dangerous products, and prone to crises due to excessive risk-taking.

Likewise, critics of neoliberalism often adopt the fictional “free market” as a reference point even as they make the case for deviation from it. For example, they follow the standard practice of economists by identifying market failures and proposing solutions to those failures. To be fair, this can be a useful way to see how government action can remedy specific problems, and to assess when action may be helpful or not. But this approach also risks obscuring the fact that market failure is the rule and not the exception. More fundamentally, the government is not a repair technician for a market economy that functions reasonably well, but rather the master craftsperson of market infrastructure.

Thus, governments pacify a territory and centralize the means of violence, making investment safer and trade less precarious. They create ways to write and enforce contracts via the rule of law. They provide public goods like education and transport infrastructure. No neoliberal denies the value of these things.

Beyond these basic functions, governments establish the conditions for the emergence of new markets, provide the architecture to stabilize existing ones, and manage crises to limit damage and facilitate recovery. Historically, governments fostered many of the largest markets, such as housing and banking, by designing new market structures that enabled the mass expansion of goods and services. In the case of the housing market, the U.S. federal government created the 30-year fixed interest rate mortgage as the standard mortgage product. It also stabilized the savings and loan industry by creating rules about paying interest on bank accounts and deposit insurance.

In the postwar era, this system helped propel home ownership from around 40 percent to 64 percent. More recently, many policy failures, such as the financial crisis of 2007–2009, occurred because governments shirked their role of making markets work through “deregulation.” Essentially, the U.S. government allowed financial institutions to enter whichever businesses they liked and with little oversight. In the wake of the Great Recession, predictably, the government re-established control and oversight over the banking sector with the Dodd-Frank Act. One of the provisions of that act was to give the Federal Reserve the ability to ask the largest banks to undergo stress tests every year to determine whether or not they could manage a serious downturn.

Governments also support knowledge creation and dissemination and underwrite the cost of innovation in the private sector. They facilitate the organization of market activity by establishing the legal basis for corporations and by setting the rules for fair and efficient trading practices on stock exchanges. A political economy that does not value the role of government along these different dimensions distorts how markets do contribute to society.

by Neil Fligstein and Steven Vogel, Boston Review | Read more:
Image: Timothy A. Clary/AFP via Getty Images
[ed. See also: Trump’s America Remains Stuck in the Shadow of Reagan (Boston Review):
But in the end, Trump’s most enduring deformation of U.S. political life may derive from his slavish devotion to unchecked corporate power and his work in further consolidating power in the hands of a few billionaires. As Christian Lorentzen recently wrote in Bookforum, the Republican Party under Trump should primarily be understood as “an electoral entity that reliably obtains tax cuts for the wealthy, deregulation for big business, increased budgets for the military, and little of anything else for anyone else.”

Being Eaten


As high tide inundates the muddy shallows of the Fraser river delta in British Columbia, what looks like a swarm of mosquitoes quivers in the air above. Upon closer inspection, the flitting mass turns out to be a flock of small shorebirds. The grey-brown wings and white chests of several thousand Pacific dunlins move in synchrony, undulating low over the water, then rising up like a rippling wave, sometimes for hours on end. Staying aloft like this is exhausting, especially in midwinter when the internal furnaces of these small birds, weighing less than a tennis ball, must be refuelled continuously. But setting down to rest and digest their mud-dug meals in the adjacent coastal marshes comes at a cost: an obscured, fearsome view of lurking predators like the skydiving peregrine falcon. The dunlins won’t alight until the ebbing tide buys them back their safer, open vistas.

The evidence that fear motivates dunlin flocking is circumstantial, but compelling. In the 1970s, when populations of peregrine falcons were depressed due to pesticides, dunlins spent less time flying and more time roosting. But as pesticides such as DDT have waned due to regulation, the peregrines have returned — and the dunlins have gone back to spending their winters on the wing.

Fear is a powerful force not just for wintering dunlins, but across the natural world. Ecologists have long known that predators play a key role in ecosystems, shaping whole communities with the knock-on effects of who eats whom. But a new approach is revealing that it’s not just getting eaten, but also the fear of getting eaten, that shapes everything from individual brains and behaviour to whole ecosystems. This new field, exploring the non-consumptive effects of predators, is known as fear ecology.

by Lesley Evans Ogden, Aeon |  Read more:
Image: Robbie George/The National Geographic Image Collection

Saturday, October 10, 2020

Nirvana

Load up on guns, bring your friends. It's fun to lose and to pretend. She's over-bored and self-assured. Oh no, I know a dirty word. Hello, hello, hello, how low. Hello, hello, hello, how low. Hello, hello, hello, how low. Hello, hello, hello. With the lights out, it's less dangerous. Here we are now, entertain us. I feel stupid and contagious. Here we are now, entertain us. A mulatto, an albino, a mosquito, my libido. Yeah. I'm worse at what I do best. And for this gift I feel blessed. Our little group has always been. And always will until the end. Hello, hello, hello, how low. Hello, hello, hello, how low. Hello, hello, hello, how low. Hello, hello, hello. With the lights out, it's less dangerous. Here we are now, entertain us. I feel stupid and contagious. Here we are now, entertain us. A mulatto, an albino, a mosquito, my libido. Yeah, hey. And I forget just why I taste. Oh yeah, I guess it makes me smile. I found it hard, it's hard to find. Oh well, whatever, never mind. Hello, hello, hello, how low. Hello, hello, hello, how low. Hello, hello, hello, how low. Hello, hello, hello. With the lights out, it's less dangerous. Here we are now, entertain us. I feel stupid and contagious. Here we are now, entertain us. A mulatto, an albino, a mosquito, my libido. A denial, a denial, a denial, a denial, a denial. A denial, a denial, a denial, a denial.

[ed. I was practicing the other day and ran through this old song (which, along with the video, pretty much killed the 80s). The comments section is still going strong, 11 years later.]



The Wall Between What’s Private and What’s Not is Dissolving

A celebrity story broke last week that gave me, as my fellow young people would say, all the feels. But they were not good feels. In fact, they were pretty much every feel except the good kind: sad for the celebrity, bad about myself, uncertain about the world today.

This story was about Chrissy Teigen, a model and the wife of the singer John Legend, although neither of those descriptors really explains her popularity. Rather, that is down to what is frequently described as her “relatability”, or her willingness to share her personal life with the world. This, according to current thinking, makes this extremely beautiful and wealthy woman more real to the public. Over several days, she posted videos of herself on Twitter and Instagram, talking about how she’d been having heavy bleeding while pregnant. “Chrissy Teigen shares updates from hospital bed as she prepares for second blood transfusion” and “Pregnant Chrissy Teigen’s horror scare as she scrambled to hear baby’s heartbeat” were just two of the newspaper headlines, as if it were totally normal that a woman’s intimate pregnancy issues should be international news.

Normally, I ignore news stories ripped from a celebrity’s social media feed, as they are little more than press releases, given that they were written by the celebrity. But it turns out Teigen is more relatable than I thought. Last year, I also had some bleeding while pregnant, and went to the same hospital as Teigen. As I waited for the scan, I cried and blamed myself: for travelling to Los Angeles while pregnant, for being 41, for maybe losing yet another baby. I also thought of Ariel Levy’s 2013 article about her stillbirth: “I knew that this change in fortune was my fault. I had boarded a plane out of vanity and selfishness, and the dark Mongolian sky had punished me.”

In the end my baby was fine, but seeing Teigen’s daily posts was like watching the past unspool and knowing that the future is not as guaranteed as Teigen’s followers seemed to think (“You got this, girl!” “Stay strong! You’re amazing!”). Then, last Thursday morning, Teigen posted more photos from hospital: she had lost her baby.

How should we talk about this kind of loss? I admit, my first thought on seeing the photos on every news website of Teigen bent forward and weeping – photos taken from her social media – was “Maybe not like this?” It reminded me of a time last year when Alec Baldwin’s wife, Hilaria, posted a video of herself telling their young daughter that she’d just had a miscarriage. So-called mumfluencers are praised for taking whatever stigma there still is out of breast-feeding, fertility trouble and more. But when I saw a photo of one – taken by whom, her husband? – sitting on a toilet and crying, with a long caption about her miscarriage, I wondered if the cost of stigma removal was self-exploitation. It felt not intimate but voyeuristic, and I know too well how long it takes to recover from these things.

Is it helpful to these women to have these images, taken in the heat of shock and grief, follow them around for ever? (...)

We live in a performative age. We’re rewarded for revealing our private lives to strangers, for exaggerating our emotions online, for sharing every crisis that happens in our bodies, every thought that passes through our heads. So many of us now depend on the reactions of strangers for our own identity. Why, four months later, did I need to post a photo on Instagram of my baby the day after she was born? I tell myself that it’s so family and friends can know all is well, but I’d be lying if I said I didn’t get a kick of validation from seeing the likes rack up. When you can’t even go through one of the most intimate experiences of your life without seeing how others react, how do you know what you feel about anything any more?

It is the height of narcissism (more so than taking a selfie) to assume that my feelings are applicable to all women. I can see online that many women find Teigen’s openness helpful, with some realising, at last, that it wasn’t their fault after all. And while I would have killed anyone who responded to my own miscarriage with an emoji (“So sad! Sadface!”), I can also imagine how, for some, to have millions commenting on a personal loss might be helpful – liberating, even. Once the wall between your private and public lives has dissolved, as it now has for so many, then what’s the difference?

by Hadley Freeman, The Guardian |  Read more:
Image: The Project Twins/Synergy


Kurppa Hosk / Wrapp / Printed Matter / 2018

Julian Lage


The Vocabulary of Violence

Terrorists (noun): evil brown people.

Thugs (noun): violent black people.

Militia (noun): misunderstood white men. Groups of heavily armed individuals whose actions, while not exactly ideal, deserve compassion and should be looked at within a wider socioeconomic context. Instead of rushing to judgment or making generalisations, one must consider the complex causes (economic anxiety, video games, mental health issues) that have triggered these poor guys into committing mass murder, conspiring to violently overthrow the state or plotting to kidnap government officials.

I’m afraid to say that the misunderstood white men have struck again – or attempted to, at least. On Thursday 13 men were charged in relation to an alleged plot to kidnap Michigan’s Democratic governor, Gretchen Whitmer. The plan was to “grab the bitch”, as they put it, and then try her for “treason”. The eventual goal was to create “a society that followed the US Bill of Rights and where they could be self-sufficient”.

Much of the media coverage of Whitmer’s would-be kidnappers referred to them as members of a Michigan militia group called Wolverine Watchmen. The wolverine, by the way, isn’t just a Marvel character – it’s an animal that looks like a small bear but is actually part of the weasel family. This seems appropriate because “militia” is very much a weasel word. It’s a way to avoid putting white extremists in the same bucket as brown people. It lends them legitimacy. It obfuscates what these people really are.

Governor Whitmer, to her immense credit, was having none of it. “They’re not ‘militias’,” she tweeted on Friday morning. “They’re domestic terrorists endangering and intimidating their fellow Americans. Words matter.”

Donald Trump’s words, in particular, matter. In April the president tweeted “LIBERATE MICHIGAN!” as far-right protesters, many of them armed, railed against stay-at-home orders imposed by Whitmer. Protesters waving semi-automatic rifles later tried to storm the state capitol. “The Governor of Michigan should give a little, and put out the fire,” Trump wrote on 1 May. “These are very good people, but they are angry.”

Trump’s words, Whitmer said in televised comments on Thursday, had served as a “rallying cry” to far-right extremists. Not only had the president refused to condemn white supremacists, he stood on the debate stage last week and told the Proud Boys, a violently racist gang, to “stand back and stand by”. When our leaders “stoke and contribute to hate speech, they are complicit”, Whitmer said.

It’s not just the White House that’s complicit, it’s the media. Kyle Rittenhouse, for example, the 17-year-old accused of killing two protesters in Wisconsin last month, was celebrated as a vigilante by rightwing outlets. “How shocked are we that 17-year-olds with rifles decided they had to maintain order when no one else would?” Tucker Carlson asked on Fox News. Far-right pundit Ann Coulter tweeted that she wanted the teenager “as my president”. The New York Post, meanwhile, published photos of Rittenhouse cleaning up graffiti; he was framed as a concerned citizen rather than a cold-blooded killer.

To be clear: double standards aren’t just a rightwing media problem. A study conducted by Georgia State University last year found that terror attacks carried out by Muslims receive on average 357% more media coverage than those committed by other groups. While this is clearly racist, it’s also dangerous. White supremacists, plenty of evidence shows, are the deadliest domestic threat facing the US. By downplaying the threat of white nationalist terrorism, by finding politer ways to refer to it, the media have allowed it to proliferate. So please, let’s call things by their name. Enough with the “militias”, these people are terrorists.

by Arwa Mahdawi, The Guardian | Read more:
Image: Jeff Kowalsky/AFP/Getty Images
[ed. And labelling them "terrorists" means what? (they should be locked up in Guantanamo without legal recourse for years?)]