Thursday, March 5, 2020

Coronavirus Might Make Americans Miss Big Government

In South Korea, the number of people who are confirmed to have been infected with Covid-19, the pandemic disease commonly known as coronavirus, has ballooned to over 5,000 as of the time of this writing and will certainly continue to rise. In the U.S. the official number infected is only 118. But much of this difference may be an illusion, due to differences in how many people are getting tested. South Korea has made a concerted effort to identify all the people infected with the virus, creating drive-through testing stations. The U.S.’ testing efforts, in contrast, look almost comically bungled.

The list of ways that U.S. institutions have fumbled the crisis reads like something out of a TV comedy: The number of test kits issued in the U.S. has been a tiny fraction of the number issued in South Korea. An early testing kit from the U.S. Centers for Disease Control and Prevention (CDC) contained a faulty ingredient and had to be withdrawn. Regulatory hurdles have slowed the rollout of tests, with officials from the CDC and the Food and Drug Administration only now discussing what to do. There are stories of possible coronavirus patients being denied testing due to maddeningly strict CDC limits on who can get a test. Some cities may have to wait weeks for tests to become widely available, during which time the populace will be left in the dark. Worst of all, the CDC has now stopped disclosing the number of people being tested, a move that seems likely to spread panic while reducing awareness.

What are the reasons for this institutional breakdown? It’s tempting to blame politics – President Donald Trump is obviously mainly concerned with the health of the stock market, and conservative media outlets have worked to downplay the threat. But the failures of the U.S.’ coronavirus response happened far too quickly to lay most of the blame at the feet of the administration. Instead, they point to long-term decay in the quality of the country’s bureaucracy.

Political scientist Francis Fukuyama has been sounding the alarm about the weakening of our bureaucracy for years now. He notes that the civil service has always been less powerful in the U.S. than in other advanced nations in Europe and East Asia, as the U.S. relies more on the courts. That may lend an imprimatur of fairness to government decisions, but courts are obviously ill-equipped to handle acute threats like pandemics. Fukuyama also points out that less bureaucratic power means more direct political control, which allows the wishes of civil servants to be overridden by the desires of lobbyists. All of these tendencies, he argues, have become worse in recent decades.

A decline in institutional competence is hard to measure. But the amount of resources that the U.S. as a whole devotes to its bureaucracy is shrinking, in terms of both wages and number of employees.

Very little of this decline is a function of our shrinking military. The civilian federal workforce is a much bigger factor, having fallen from about 4% of GDP in 1970 to under 2% today. Much of the work is now handled by contractors, or not done at all. State government cutbacks have also been big.

A bureaucracy whose size doesn’t grow in step with the overall economy means that civil service is not a promising career for young college graduates. Inflexible government pay structures also make it hard to reward excellence or get ahead. Salaries are uncompetitive at the top end, topping out at just under $200,000 for the most highly-paid senior bureaucrats – only slightly more than an entry-level engineer makes at Google.

Part of the problem is that it’s very difficult for the U.S. to compete with the private sector when the latter is so efficient. But that’s not true everywhere: Singapore manages to have a thriving private sector alongside a highly effective bureaucracy that recruits the best and brightest. So it’s largely a question of political will.

In the U.S. the public sector has come under sustained attack from the political right for many years. President Ronald Reagan famously declared that “government is the problem,” while anti-tax activist Grover Norquist stated that he wanted to “drown [government] in a bathtub.” Republican and Democratic presidents alike have enacted pay freezes for federal workers.

Trump is continuing the assault on the civil service, proposing yet another pay freeze. In 2018 he fired an executive branch team responsible for responding to pandemics and attempted to decrease funding for public health.

    by Noah Smith, Bloomberg |  Read more:
    Image: Stefani Reynolds/Bloomberg
    [ed. If you wanted to make a point about how ineffective government is, just keep strangling it in the name of fiscal restraint and spending cuts. Eventually all that's left is a barely functional bureaucracy and a self-fulfilling prophecy. Republicans have been pushing this strategy for decades (disassembling a powerful engine, piece by piece).]

    Haiku Stairs


    [ed. The Folly Of Spending Tax Dollars To Tear Down The Haiku Stairs (Honolulu Civil Beat).]
    Image: uncredited

    America Punished Elizabeth Warren for Her Competence

    In November 2019, as the Democratic presidential candidates prepared for the primaries that had been taking place unofficially for more than a year and that would begin in earnest in February, FiveThirtyEight’s Clare Malone profiled Pete Buttigieg. In the process, Malone spoke with two women at a Buttigieg event in New Hampshire. One liked Joe Biden, but felt he was a bit too old for the presidency. The other liked Buttigieg, without qualification: “I feel he’s well positioned,” she explained. “The country is ready for a more gentle approach.”

    As for Elizabeth Warren? “When I hear her talk, I want to slap her, even when I agree with her.”

    A version of that sentiment—Warren inspiring irrational animus among those whom she has sought as constituents—was a common refrain about the candidate, who announced today that she was suspending her campaign after a poor showing on Super Tuesday. This complaint tends to take on not the substance of Warren’s stated positions, but instead the style with which she delivers them. And it has been expressed by pundits as well as voters. Politico, in September, ran an article featuring quotes from Obama-administration officials calling Warren “sanctimonious” and a “narcissist.” The Boston Herald ran a story criticizing Warren’s “self-righteous, abrasive style.” The New York Times columnist Bret Stephens, in October, described Warren as “intensely alienating” and “a know-it-all.” Donny Deutsch, the MSNBC commentator, has dismissed Warren, the person and the candidate, as “unlikable”—and has attributed her failure to ingratiate herself to him specifically to her “high-school principal” demeanor. (“This is not a gender thing,” Deutsch insisted, perhaps recognizing that his complaint might read as very much a gender thing. “This is just kind of [a] tone and manner thing.”)

    The campaigns of those who deviate from the traditional model of the American president—the campaign of anyone who is not white and Christian and male—will always carry more than their share of weight. But Warren had something about her, apparently: something that galled the pundits and the public in a way that led to assessments of her not just as “strident” and “shrill,” but also as “condescending.” The matter is not merely that the candidate is unlikable, these deployments of condescending imply. The matter is instead that her unlikability has a specific source, beyond bias and internalized misogyny. Warren knows a lot, and has accomplished a lot, and is extremely competent, condescending acknowledges, before twisting the knife: It is precisely because of those achievements that she represents a threat. Condescending attempts to rationalize an irrational prejudice. It suggests the lurchings of a zero-sum world—a physics in which the achievements of one person are insulting to everyone else. When I hear her talk, I want to slap her, even when I agree with her.

    To run for president is to endure a series of controlled humiliations. It is to gnaw on bulky pork products, before an audience at the Iowa State Fair. It is to be asked about one’s skin-care routine, and to be prepared to defend the answer. The accusation of condescension, however, is less about enforced humiliation than it is about enforced humility. It cannot be disentangled from Warren’s gender. The paradox is subtle, but punishing all the same: The harder she works to prove to the public that she is worthy of power—the more evidence she offers of her competence—the more “condescending,” allegedly, she becomes. And the more that other anxious quality, likability, will be called into question. Warren’s “‘my way or the highway’ approach to politics,” Joe Biden argued in November, attempting to turn what might also be called principle into a liability, is “condescending to the millions of Democrats who have a different view.” (...)

    One of the truisms of the 2020 campaign—just as it was a truism in 2016, and in 2008—is that women candidates are punished, still, for public displays of ambition. (One resonant fact of Hillary Clinton’s political life is that she was much more popular, in opinion polls, during her tenure as secretary of state—a role for which she did not campaign, and in which she served at the pleasure of the president—than she was when, just a few years after that, she sought the presidency herself.) American culture has maintained a generally awkward relationship with political self-promotion: That George Washington was conscripted into the presidency rather than campaigning for it remains a foundational bit of lore. When women are the ones doing the promoting, the tension gets ratcheted up.

    Kate Manne, a philosopher at Cornell University, describes misogyny as an ideology that serves, ultimately, to reinforce a patriarchal status quo. “Misogyny is the law-enforcement branch of patriarchy,” Manne argues. It rewards those who uphold the existing order of things; it punishes those who fight against it. It is perhaps the mechanism at play when a woman puts herself forward as a presidential candidate and finds her attributes—her intelligence, her experience, her compassion—understood as threats. It is perhaps that mechanism at play when a woman says, “I believe in us,” and is accused of being “self-righteous.”

    by Megan Garber, The Atlantic |  Read more:
    Image: Drew Angerer
    [ed. I hope Elizabeth will have a major influence in politics and public policy for years to come. See also: Why Michael Bloomberg Spent Half a Billion Dollars to Be Humiliated (The Atlantic). Sad, satisfying, and definitely schadenfreudic (sp?).]

    On The Market

    At the museum, I am standing with my spouse in front of a Flemish vanitas scene. There is an old man hunched over his accounting books, surrounded by gold coins and jewels; a skull sits on his desk, and Death himself perches undetected above his shoulder. What, I ask her, is the “takeaway” of such scenes supposed to be? That one would do well to start thinking of one’s soul, she says. And I think, but do not say: I thought of nothing but my soul for forty years, never learned the first thing about how money works, and now time is much shorter than in our youth, and I’ve managed to save so little money, and I am worried about leaving you alone in this world without me, with only the small amounts we’ve been able to put away for us, for you, as we move about from country to country, renting one modest apartment after another, like dry old students. O my love, I hate to envision you alone and frightened. Is it wrong for me now to count our coins and to keep our accounting books? Am I compromising the fate of my soul? Is this vanity?

    In November of last year, I opened a brokerage account. I had been reading simple, bullet-pointed introductions to financial literacy for a few months before that, manuals “for dummies” of the sort that I am conditioned to hold in contempt when their subject is, say, Latin, or the Protestant Reformation. After this period of study, I determined I was ready to invest the bulk of the money I had to my name, around $150,000, in the stock market (an amount large enough to make me already worthy of the guillotine, for some who have nothing, and small enough to burn or to lose with no consequences, for some who have much more). The fact that I had that amount of money in the first place was largely a bureaucratic mistake. When I quit my job at a university in Canada after nine years of working there, the human-resources people closed my retirement account and sent me the full amount in a single check. That check—the “retirement” I unwittingly took with severe early-withdrawal penalties at the age of forty-one when in fact I was only moving to a job in another country—plus some of the money I had saved over just the past few years from book-contract advances, was to be the seed funding for what I hoped, and still hope, might grow into something much larger through the alchemy of capital gains.

    It was driven home to me repeatedly in my early efforts to build an investment strategy that, quite apart from the question of whether the quest for wealth is sinful in the sense understood by the painters of vanitas scenes, it is most certainly and irredeemably unethical. All of the relatively low-risk index funds that are the bedrock of a sound investment portfolio are spread across so many different kinds of companies that one could not possibly keep track of all the ways each of them violates the rights and sanctity of its employees, of its customers, of the environment. And even if you are investing in individual companies (while maintaining healthy risk-buffering diversification, etc.), you must accept that the only way for you as a shareholder to get ahead is for those companies to continue to grow, even when the limits of whatever good they might do for the world, assuming they were doing good for the world to begin with, have been surpassed. That is just how capitalism works: an unceasing imperative for growth beyond any natural necessity, leading to the desolation of the earth and the exhaustion of its resources. I am a part of that now, too. I always was, to some extent, with every purchase I made, every light switch I flipped. But to become an active investor is to make it official, to solemnify the contract, as if in blood.

    When I was eleven, I learned that a check is the form of currency you use when you do not have any other. My mother, recently divorced and under severe financial strain trying to get her family-law practice off the ground, used to take us to Kentucky Fried Chicken when cash reserves were depleted, since that was the only fast-food restaurant in town that accepted these strange promissory notes as a form of payment (and in those days there was no possibility of immediate verification of the availability of funds). We kept careful track of which KFC locations in town might have got a bounced check from us, and avoided them by moving out to ever more peripheral neighborhoods in search of dinner. This was among my earliest and most vivid lessons in what I now think of as my first financial education. When I turned eighteen, with no understanding at all of how interest works, I got my first credit card; when I ran it up to its limit, I got my second credit card; then I got a third. When I finished my undergraduate studies, and was admitted to several graduate programs, I decided I simply had to go to the only one of them that did not offer me a financial package, including tuition remission. So instead, I took student loans. I spent my twenties and thirties under constant pressure from collection agencies. Routine robocalls terrified me, calls from live agents often induced me to either break down in tears or fight back with ridiculous counterthreats, or some combination of the two. This condition, too, I can attest, is something like sin, and something like disease. I carry it with me still, in my body, as if as a child I had suffered from polio, and now must go through the world with a slight recurvatum in my gait, always announcing that because my freedom was delayed, I will never fully be free.

    by Justin E.H. Smith, Cabinet |  Read more:
    Image: uncredited

    Wednesday, March 4, 2020


    Geometric Shapes / 200223
    via:

    The Party Cannot Hold

    In early January, as Democratic voters began to focus more intently on the approaching primary season, New York magazine published a profile of Representative Alexandria Ocasio-Cortez. The writer, David Freedlander, spoke with her about the divisions within the Democratic Party, and asked what sort of role she envisioned for herself in a possible Joe Biden presidency. “Oh, God,” Ocasio-Cortez replied (“with a groan,” Freedlander noted). “In any other country, Joe Biden and I would not be in the same party, but in America, we are.”

    This was in some respects an impolitic, even impolite, thing for the first-term politician to say. AOC, a democratic socialist, had endorsed Bernie Sanders the previous October, so it was no secret where her loyalties lay. Still, Biden was at that point the clear front-runner for the presidential nomination, and freshman members of Congress don’t usually make disparaging remarks about their party’s front-runner. Her comment thus carried a considerable charge—a suggestion that if Biden were the nominee, this luminary and her 6.3 million Twitter followers might not just placidly go along.

    And yet, she is correct. In a parliamentary system, Biden would be in the main center-left party and AOC in a smaller, left-wing party. So her comment was an accurate description of an oddity of American politics that has endured since just before the Civil War—the existence of our two, large-tent parties battling for primacy against each other, but often battling within themselves. (...)

    The current divide is not about one war. It is about capitalism—whether it can be reformed and remade to create the kind of broad prosperity the country once knew, but without the sexism and racism of the postwar period, as liberals hope; or whether corporate power is now so great that we are simply beyond that, as the younger socialists would argue, and more radical surgery is called for. Further, it’s about who holds power in the Democratic Party, and the real and perceived ways in which the Democrats of the last thirty years or so have failed to challenge that power. These questions are not easily resolved, so this internal conflict is likely to last for some time and grow very bitter indeed. If Sanders wins the nomination, he will presumably try to unify the party behind his movement—but many in the party establishment will be reluctant to join, and a substantial number of his most fervent supporters wouldn’t welcome them anyway. It does not seem to me too alarmist to wonder if the Democrats can survive all this; if 2020 will be to the Democrats as 1852 was to the Whigs—a schismatic turning point that proved that the divisions were beyond bridging.

    When did it begin, this split in the Democratic Party over these most basic questions of our political economy? One could trace it back to William Jennings Bryan and the Free Silver Movement (an early rebellion against the eastern bankers), or perhaps even earlier. But if pressed to name a modern starting point, I would choose the mid-1980s: the crushing 1984 defeat of Walter Mondale, and Al From’s creation the next year of the Democratic Leadership Council, which was founded to move the party away from statism and unions and toward positions friendlier to the free market. Mondale was the last old-fashioned Keynesian to capture the Democratic nomination. Ever since, the party’s nominees have offered, to one degree or another, hybrids of Keynesianism and neoliberalism.

    Bill Clinton, the 1992 nominee, probably tilted more toward neoliberalism than any other Democrat, although wholesale dismissals of him as a neoliberal sellout aren’t fair or accurate. People forget, for example, that he rolled the dice on government shutdowns in 1995 and 1996 because he refused to sign a budget Newt Gingrich and Bob Dole pressed on him with enormous domestic spending cuts. It was by no means a given when the first shutdown started that he would win that fight politically (which he did, even if he lost in another way, because of the intern he met who brought him pizza while the White House staff was furloughed). Clinton was a Keynesian at times, but in broad strokes, on trade and financial deregulation, he pushed the Democrats much closer to that then-aborning creature, the global financial elite.

    Like Clinton, Al Gore had been a “New Democrat,” as the more centrist Democrats of the day called themselves, most of his career, but as the nominee in 2000, he tried on both suits. I was at the convention in Los Angeles for his surprisingly high-octane, populist speech announcing that his campaign would rest on the idea of “the people versus the powerful.” But over the next few weeks, the powerful must have started calling. Gore toned that rhetoric down. We never got to see him govern, of course, as he won the election by 500,000 votes but lost it by one at the Supreme Court. John Kerry continued in a similar style in 2004. He proposed new health care and jobs spending, to be paid for by rescinding the Bush tax cuts. He also pledged to cut the deficit in half in four years. But the 2004 election turned more on national security—Iraq and the September 11 attacks—than the economy, and he narrowly lost.

    None of these candidates really had to worry about “the left.” It certainly existed. There was a fairly robust movement against free trade, backed by the labor unions, though it never succeeded in nominating a president. And there were numerous columnists and policy intellectuals who protested every time a Democratic president or congressional leader emphasized the importance of deficit reduction, or otherwise embraced austerity. But electorally, Democrats could get by just paying occasional lip service to the economic left.

    Then came the meltdown of 2008 and the Great Recession. As thrilled as millions were by Barack Obama’s election victory, the activists and intellectuals who cared most about breaking the neoliberal grip on the party were appalled by his appointments of Tim Geithner, Larry Summers, and Rahm Emanuel (not an economic adviser per se but a brutish enforcer of centrist orthodoxy), among others. To be fair, Obama had never done anything to indicate, on the campaign trail or in his short career, that he would govern as a left populist. Adam Tooze, in Crashed, his authoritative book on the financial crisis, notes that in April 2006, Senator Obama was selected for the rare privilege of speaking at the founding meeting of the Hamilton Project, a group of centrist economists brought together by Robert Rubin, Clinton’s Treasury secretary and the bête noire of the left populists. Presidential ambitions no doubt on his mind during this important audition, he carefully walked the Keynes-neoliberal line: he reminded his audience of the people the global economy had left behind in Illinois towns like Decatur and Galesburg, yet he also nodded toward two Hamilton Project priorities when he spoke of “keep[ing] the deficit low” and keeping US debt low and “out of the hands of foreign nations.”

    In the early years of Obama’s presidency, the only anger most of the media noticed emanated from the right, in the form of the Tea Party movement, supported financially by figures like the Koch brothers and promoted by the Fox News Channel. The angry left, lacking such resources, was less visible, but it was always there. It found its avatar in Elizabeth Warren, named by then Senate majority leader Harry Reid to chair a congressional oversight panel on emergency economic relief. It was from this perch that she became such a thorn in Geithner’s, and Obama’s, side—and such a star of the progressive left.

    Outside of official Washington circles, the impatience, and the insurgency, were building, especially among young people born since about the early 1990s. They had grown up under a capitalism very different from the one Baby Boomers experienced; they’d seen a rigged game all their adult lives—a weak job market and heavy college debt for them, more and more riches for the one percent, and no one seeming to do anything about it. In 2010 a young leftist named Bhaskar Sunkara started Jacobin, a socialist journal that became an immediate surprise success. The next year, the Occupy Wall Street demonstrations began, making it clear that anger was real and widespread, and eventually having a strong influence on debate within the Democratic Party. The Democratic Socialists of America, founded in 1982, saw its membership rise from 6,000 in 2016 to 40,000 in 2018. Two other movements of the left, while not mainly concerned with economics, became potent political forces—the Black Lives Matter movement, founded in 2013, and the movement seeking permanent legal status for the so-called Dreamers, undocumented immigrants who came to the United States as children.

    All this activity might have remained inchoate had Sanders not decided to run for president against Hillary Clinton in 2016 (he deferred at first to Warren, who declined to run). Sanders had been inveighing against the banks and rigged political system in exactly the same language for years, but his general ineffectiveness on Capitol Hill, and his comprehensive lack of interest in schmoozing, reduced him to background noise as far as most of Washington was concerned. Now, however, people were coming out by the tens of thousands to hear him speak bluntly about the banks and the billionaires in a way Clinton never would have. And he gave this movement a figurehead, a cynosure around which to rally; his conveniently uncommon first name seemed to dance joyfully out of his supporters’ mouths.

    There is no harsher spotlight in the world than the one shone on major-party candidates for president of the United States, and he handled it with a skill that not everyone thrust into that position could. His critics—and I have been one, especially when I felt in 2016 that he attacked Clinton too viciously for too long, well after he was mathematically eliminated—cannot deny him that. Whatever happens with this nominating process and election, he has gone from being an afterthought backbencher to a historical figure.

    To what extent was all this left-wing anger at mainstream Democrats justified? It’s a complicated question. The left was correct that Obama could have been far more aggressive on mortgage rescues and penalties imposed on the banks that brought on the financial crisis, as well as in its criticisms (which I joined) of Obama’s lamentable embrace of deficit reduction. It is also correct that Democrats have, since the 1990s, gotten themselves far too indebted to certain donor groups, notably Wall Street and the tech industry.

    Yet the left, in its critiques, sometimes acts as if Republicans don’t exist and have no say in political outcomes. Leftists tend to interpret the policy failures of the Obama era as a function of his own lack of will, or his reliance on corporate interests, rather than what they more often were, in my view—a reflection of the facts that in the Senate, a unified and dug-in minority can thwart a majority, and even a majority can pass legislation only as progressive as the sixtieth senator will allow because of the super-majority voting rules. I recall several conversations with administration officials who had worked for months on certain policy matters but who knew that the ideas would never get through the Senate. And presidents just don’t have endless political capital.

    I’ve always found this a useful heuristic: imagine Obama in his first term with LBJ-like majorities in Congress, sixty-eight senators and nearly three hundred House members. What would he have passed? It’s useful because our answers define the limits of mainstream liberalism—what it would be willing to push for, and the interests it would be hesitant to take on.

    by Michael Tomasky, NYRB |  Read more:
    Image: Tom Bachtell
    [ed. Sorry for all the politics lately, but it and the virus are saturating the news.]

    The Great Wall Street Housing Grab

    One of Ellingwood’s goals had always been to buy a house by the time he turned 30 — a birthday that unceremoniously came and went six months earlier. When Ellingwood began speaking to lenders, he realized he could easily get a loan, even two; this was the height of the bubble, when mortgage brokers were keen to generate mortgages, even risky ones, because the debt was being bundled together, securitized and spun into a dizzying array of bonds for a hefty profit. The house was $840,000. He put down $15,000 and sank the rest of his savings into a $250,000 bedroom addition and kitchen remodel, reasoning that this would increase the home’s value.

    Suddenly adulthood was upon him. He married on New Year’s Eve, and his wife gave birth to their first child, a son, in April. When his 88-year-old grandfather, an emeritus professor of electrical engineering at the University of Houston, had a bad fall, Ellingwood urged him to move into the house for sale just across his backyard. The grandfather bought the house with his daughter, Ellingwood’s mother, and the first thing they did was tear down the fence between the two properties, creating one big family compound. In 2009, Ellingwood’s older sister bought a house around the corner.

    But shortly after the birth of Ellingwood’s second son, in June 2010, his marriage fell apart. He and his wife each sued for sole custody. To pay his lawyer, he planned to refinance his house, and his grandfather advanced him his inheritance. By 2012, Ellingwood had paid his lawyer more than $80,000, and in the chaos of fighting for his children, he stopped making his mortgage payments. He consulted with several professionals, who urged him to file for bankruptcy protection so that he could get an automatic stay preventing the sale of his house.

    In May 2012, Ellingwood was driving his two boys to the beach, desperate to make the most of his limited time with them, when he got a call. He pulled over and, with cars whizzing by and his boys babbling excitedly in the back seat, learned that he had lost his house. He had dispatched a friend to stop the auction with a check for $27,000 — the amount he was behind on his mortgage — but there was nothing to be done. Because Ellingwood began to file for bankruptcy and then didn’t go through with it, a lien was put on his house, his “vortex of love” as he called it, that precluded him from settling his debt. The house sold within a couple of minutes for $486,000, which was $325,000 less than what he owed on it.

    In the months after, though, Ellingwood was graced with what seemed like a bit of luck. The company that bought his home offered to sell it back to him for $100,000 more than it paid to acquire it. He told the company, Strategic Acquisitions, that he just needed a little time to get together a down payment. In the meantime, the company asked him to sign a two-page rental agreement with a two-page addendum.

    It was clear from the beginning that there was something a little unusual about his new landlords. Instead of mailing his rent checks to a management company, men would swing by to pick them up. Within a few months, Ellingwood noticed that one of the checks he had written for $2,000 wasn’t accounted for on his rental ledger, though it had been cashed. He called and emailed and texted to resolve the problem, and finally emailed to say that he wouldn’t pay more rent until the company could explain where his $2,000 went. For more than three months, he withheld rent, waiting for a response. Instead, the company posted an eviction notice to his door. (...)

    Wall Street’s latest real estate grab has ballooned to roughly $60 billion, representing hundreds of thousands of properties. In some communities, it has fundamentally altered housing ecosystems in ways we’re only now beginning to understand, fueling a housing recovery without a homeowner recovery. “That’s the big downside,” says Daniel Immergluck, a professor of urban studies at Georgia State University. “During one of the greatest recoveries of land value in the history of the country, from 2010 and 2011 at the bottom of the crisis to now, we’ve seen huge gains in property values, especially in suburbs, and instead of that accruing to many moderate-income and middle-income homeowners, many of whom were pushed out of the homeownership market during the crisis, that land value has accrued to these big companies and their shareholders.”

    Before 2010, institutional landlords didn’t exist in the single-family-rental market; now there are 25 to 30 of them, according to Amherst Capital, a real estate investment firm. From 2007 to 2011, 4.7 million households lost homes to foreclosure, and a million more to short sale. Private-equity firms developed new ways to secure credit, enabling them to leverage their equity and acquire an astonishing number of homes. The housing crisis peaked in California first; inventory there promised to be some of the most lucrative. But the Sun Belt and Sand Belt were full of opportunities, too. Homes could be scooped up by the dozen in Phoenix, Atlanta, Las Vegas, Sacramento, Miami, Charlotte, Los Angeles, Denver — places with an abundance of cheap housing stock and high employment and rental demand. “Strike zones,” as Fred Tuomi, the chief executive of Colony Starwood Homes, would later describe them.

    Jade Rahmani, one of the first analysts to write about this trend, started going to single-family-rental industry networking events in Phoenix and Miami in 2011 and 2012. “They were these euphoric conferences with all of these individual investors,” he told me — solo entrepreneurs who could afford a house but not an apartment complex, or perhaps a small group of doctors or dentists — “representing small pools of capital that they had put together, loans from regional banks, and they were buying homes as early as 2010, 2011.” But in later years, he said, the balance began to shift: Individual and smaller investor groups still made up, say, 80 percent of the attendees, but the other 20 percent were very visible institutional investors, usually subsidiaries of large private-equity firms. Jonathan D. Gray, the head of real estate at Blackstone, one of the world’s largest private-equity firms and the one with the strongest real estate holdings, thought he could “professionalize” the fragmented single-family-rental market and partnered with a British property-investment firm, Regis Group P.L.C., as well as a local Phoenix company, Treehouse Group. Blackstone “would show up with teams of people and would look for portfolio acquisitions,” recalled Rahmani, who works for the firm Keefe, Bruyette & Woods, known as K.B.W. (K.B.W. sold some shares of Invitation Homes during its public offering.)

    Throughout the country, the firms created special real estate investment trusts, or REITs, to pool funds to buy bundles of foreclosed properties. A REIT enables investors to buy shares of real estate in much the same way that they buy shares of corporate stocks. REITs typically target office buildings, warehouses, multifamily apartment buildings and other centralized properties that are easy to manage. But after the crash, the unprecedented supply of cheap housing in good neighborhoods made corporate single-family home management feasible for the first time. The REITs were funded with money from all over the world. An investment company in Qatar, the Korea Exchange Bank on behalf of the country’s national pension, shell companies in California, the Cayman Islands and the British Virgin Islands — all contributed to Colony American Homes. Columbia University and G.I. Partners (on behalf of the California Public Employees’ Retirement System) invested $25 million and $250 million in the REIT Waypoint Homes. By the middle of 2013, private-equity companies had raised or spent nearly $20 billion on single-family real estate, and more than 100,000 homes were in the hands of institutional investors. Blackstone’s Invitation Homes REIT accounted for half of that spending. Today, the number of homes is roughly 260,000, according to Amherst Capital. (...)

    Landlords can be rapacious creatures, but this new breed of private-equity landlord has proved itself to be particularly so, many experts say. That’s partly because of the imperative for growth: Private-equity firms chase double-digit returns within 10 years. To get that, they need credit: The more borrowed, the higher the returns.

    by Francesca Mari, NY Times | Read more:
    Image: Nix + Gerber Studio for The New York Times

    Bob Dylan


    [ed. I've posted this before but couldn't remember the song. Love this video. Just a tight shot of five guys, no theatrics. Dylan with that just-got-out-of-a-mental-institution vibe, like he's on lithium or something. Never acknowledges the camera (except once, after turning around to say - nothing? - to his guitarist at 3:40), wears his cowboy hat backwards...and they rock! Who says Dylan doesn't have a sense of humor?] 

    Tuesday, March 3, 2020

    What The Stakes Are

    I feel like I’m going crazy. I have a pit of terror in my stomach that never goes away. I am stressed and afraid at every moment.

    To me, a set of facts about the world is difficult to deny:
    1. If Donald Trump is reelected in November, very bad things will happen to a large number of people. Climate change will worsen. The brutalization of immigrants will escalate, with dementia patients and diabetics deported to their deaths. Workplace safety and labor protections will be gutted. Public assets from the national parks to the postal service will be sold off to corporations. A global arms race will intensify, possibly with civilization-ending weapons placed in outer space, waiting to destroy us at a moment’s notice.
    2. To stop these things from happening, we have exactly one chance on exactly one day: Nov. 3, 2020. On that day, something extremely difficult must be done: well over 60 million people must be motivated enough to put aside whatever else they are doing in their lives in order to go to polling stations and cast ballots. 
    3. Donald Trump will do whatever it possibly takes to prevent this from happening. He has a colossal amount of money. He is ruthless. He will say anything. Do anything. He will attack candidates from the left if he has to. He will mock their physical appearance. He will lie about them shamelessly. And he is the most powerful man in the world. Trump has the triple advantages of incumbency, low unemployment, and a decent approval rating. It will be incredibly difficult for anyone to beat him. 
    4. The Democratic party “establishment,” meaning the people who have been in leadership positions in the party, does not actually understand Trump. They do not see why his message is appealing. They don’t understand how talented he is. They think he is stupid. They don’t know why he thrives, and they don’t understand why they’re failing to effectively oppose him. When his approval rating rises, it mystifies them. When nobody comes to their rallies, they don’t know why. They didn’t get what was going on in 2016, when their own message was totally out of touch with ordinary people’s concerns. They will not admit that his State of the Union address was terrifyingly effective. They think that by pointing out that Trump is a liar and a cad, they can hurt him.
    5. Even in a concerningly out-of-touch and inept party, Joe Biden stands out as uniquely out of touch and inept. It’s not just that he seems mentally not-that-with-it, but that he fundamentally can’t organize people. He certainly can’t inspire them. In fact, Biden’s political instincts are atrocious: he constantly told Iowa voters to “go vote for someone else,” and 85% of them did. He tells millennials he has “no empathy” for them. He promises no change. He is a serial liar who fabricates absurd details about his life story, like fictitious arrests and a history of civil rights activism.
    6. The only Democratic candidate other than Joe Biden with a viable chance at the nomination is Bernie Sanders. This is almost universally accepted. 
    7. Between the two of them, Bernie Sanders is the only one with even a chance of beating Trump. As in 2016, Bernie is different from other Democrats in that he knows how to speak to Trump’s own voters. Not only does he beat Trump consistently in head-to-head polling, but he offers ordinary people an ambitious social democratic agenda that is designed to deal with their real-world problems. He has a decades-long record of fighting hard for them to get healthcare, decent wages, and family leave. He has waged an often lonely struggle on behalf of those whose interests are too frequently ignored in Washington, even taking on the Obama administration over cuts to Social Security. When Bernie tells working people he is in their corner, they can believe him, because he has acted on the same clear set of values for decades. Plus, Bernie’s supporters are motivated. They get out and knock doors for him in the cold. They will do whatever it takes for him. (And on the flipside, if Joe Biden was nominated, millions of them would probably not only decline to put in the same level of organizing energy, but would simply stay home, unwilling to assist a candidate who has made it clear he has no empathy for them.) 
    8. Many wealthy and powerful Democrats will do whatever it takes to stop Bernie Sanders from being the nominee. This means that they will do whatever it takes to make sure that Joe Biden is the nominee. Already, Pete Buttigieg and Amy Klobuchar have dropped out and thrown their support behind Biden. Barack Obama has apparently “sent the signal” to Democrats that they need to come together behind Biden. Some Democrats even appear to be funneling money to supporting Elizabeth Warren’s campaign, so that she can continue to siphon enough votes away from Bernie Sanders to keep him from winning the nomination.
    9. If these Democrats succeed in stopping Bernie, perhaps through a contested convention in which superdelegates override the plurality vote, and they put the feeble and uninspiring Biden at the top of the ticket, it will be an absolute calamity. Bernie’s supporters, many of whom already dislike the party for working hard to stop Bernie in 2016 and the incredibly fishy Iowa caucus shenanigans, will simply give up on the Democrats. Millennials will leave the party in droves, feeling that their votes don’t matter. Some will probably support a third-party candidacy. Others will argue that in the interests of pragmatism, they should still vote for a dishonest and weak candidate who says he has no empathy for them. Their appeals will mostly fail. The party will be riven with bitter conflict. Biden will have no clear message, no strategy. He will perform embarrassingly in debates with Trump, forgetting his words and seeming to wonder why he is even on the stage. (He will also have no good explanation for what his son Hunter was doing for that Ukrainian gas company, which will be the subject of constant discussion.) Trump, being a bully, will seize his advantage and relentlessly mock Biden’s performance. Trump will (as he has before) talk a lot about how Sanders was “robbed” by a “rigged” primary, delegitimize Biden’s nomination, and stoke the intra-party conflict. Biden will look dazed and confused on Election Night, as Democrats wonder yet again how they managed to lose to Donald Trump of all people. 
    10. If Bernie is nominated, things will go differently, though we do not yet know quite how. Trump’s propaganda machine will try to brand Sanders a communist who hates America. Will this work? It is not clear. Sanders has been an open socialist in the public eye for a long time without it affecting his popularity, but the war that is waged against him will be relentless. And, of course, liberals might not pitch in to help Sanders. Many of them repeat right-wing talking points about him already, scaring people by implying Sanders wants to leave them uninsured. Sanders and his army of organizers will do their damnedest to expose Trump for the fraud he is, to unite working-class people behind a candidacy that truly speaks to their interests, and behind an ambitious agenda for single-payer healthcare, a comprehensive climate plan, a living wage, and an end to the indentured servitude of student debt. Will they succeed? This I do not know. Everything else here seems clear as day to me. But how exactly a Sanders-Trump race will play out is mystifying indeed. There are strong reasons to believe Sanders will win, like his strong fundraising in Obama-Trump swing counties, voters’ high assessments of his honesty and credibility, his declining to antagonize conservatives on some cultural issues and ability to speak to conservative audiences, and of course, all of the actual polls. But I have never thought that it was certain Sanders will beat Trump. What I think is that it is certain any other Democrat will lose.
    I run all these facts through my head all day, every day. If Trump gets reelected, untold horrors will be released. Unless Sanders prevails, Trump will get reelected. Therefore Sanders must prevail. We must do everything possible to get Sanders the nomination. There is no alternative.

    This same reasoning seemed just as obvious to me in 2016, when Democrats didn’t notice that nominating Hillary Clinton was a catastrophic blunder, and proceeded to lose to Donald Trump, ignoring the warnings of people like me and Michael Moore. And when I say I feel like I’m “going crazy,” it’s because it’s really hard for me to believe that after all these years, the lessons have still not been learned. “Oh my God,” I think. “They’re really going to do it again. They’re still not going to nominate Bernie. They’re going to put up another establishment candidate, this time an even weaker one who doesn’t even have the promise of ‘historic change’ that Hillary would have represented.” They’re literally going to fight Bernie to the death, even if it very obviously would result in the suicide of the Democratic Party as an institution. (...)

    It’s so weird to me that people don’t get this. Do they really believe the idiotic attacks on Bernie’s “radicalism”? Look at Bernie’s agenda: a national health insurance plan, of the kind that exists successfully all over the world. A giant ambitious climate investment plan, of the kind that we absolutely need if we are going to save the earth because this is a fucking emergency. A living wage that allows people to actually afford to pay their rent and feed themselves. What is the problem here? Why are people like Barack Obama and Beto O’Rourke prepared to destroy the Democratic Party and put the entire future of the planet at risk in order to stop this? What exactly is the threat that Bernie poses? (...)

    The charitable answer, and the one they would probably give themselves, is that they do not share my view of point #7 on my above list. They simply do not think Bernie is “electable.” They think he would lose to Donald Trump, that because he is too “far left” he would be the equivalent of George McGovern in 1972, and would lose in a landslide. They think he would hurt the prospects of “down ballot” Democrats, with Democratic members of Congress in conservative districts being forced to share the ticket with a socialist. They will insist that it is not Bernie’s agenda that they despise. They simply believe he threatens the party. He must be stopped at all costs in order to save democracy. I think many Democrats have probably convinced themselves of this, which is why some have been willing to entertain the prospect of nominating Mike Bloomberg to stop Sanders. If it takes a racist, sexist, transphobic Republican to save the party, so be it. Better victory with Bloomberg than defeat with Bernie. (...)

    The fact that many high-up people in the Democratic party think this way is frightening. Because if they are completely convinced that Bernie can’t beat Trump, they’re not going to step aside at any point and let him be the nominee. They will fight him to the bitter end, because they will tell themselves that in doing so they are being pragmatic. If their actions result in tearing the party apart through a disastrous brokered convention, they will still insist that their actions were right, because they think anything that stops Bernie has to be done. Yes, even if that means overriding the popular vote with superdelegates.

    But I don’t actually think it is harder for a left-wing candidate to win, and I think people who assume this assume it in part because they don’t really understand what the “left” is or what our theory is. Socialist values pose a significant threat to the wealth and power of certain people in society who have a strong self-interest in making sure people misunderstand and distrust socialists. But actually, the left stands for ideas that, once people understand them clearly and see through all the myths, have the possibility of mass appeal. Medicare for All is popular, and it would probably be far more popular if you explained to people exactly how it worked and what it would mean for them, and showed them how it would affect their pocketbooks and their experience with the healthcare system. Instead, pollsters ask things like “Would you support Medicare For All even if it took away your private insurance and increased your taxes?” and people get jittery, because they think that means they’re going to be uninsured and have less money. People try to mislead the public about what the left is trying to do, then when the public swallows the misconception, we are told that America rejects left ideas. It’s silly.

    by Nathan J. Robinson, Current Affairs |  Read more:
    [ed. Exactly. So-called moderates risk losing a good segment of their party's base and generations of future voters because they're too pussified to support someone who's not down with business as usual. It's like abused spouse syndrome. See also: This Psychological Concept Could Be Shaping the Presidential Election (hint: pluralistic ignorance). Nautilus.]

    via:
    [ed. Good times.]

    The President Is Winning His War on American Institutions

    When Donald Trump came into office, there was a sense that he would be outmatched by the vast government he had just inherited.

    The new president was impetuous, bottomlessly ignorant, almost chemically inattentive, while the bureaucrats were seasoned, shrewd, protective of themselves and their institutions. They knew where the levers of power lay and how to use them or prevent the president from doing so. Trump’s White House was chaotic and vicious, unlike anything in American history, but it didn’t really matter as long as “the adults” were there to wait out the president’s impulses and deflect his worst ideas and discreetly pocket destructive orders lying around on his desk.

    After three years, the adults have all left the room—saying just about nothing on their way out to alert the country to the peril—while Trump is still there.

    James Baker, the former general counsel of the FBI, and a target of Trump’s rage against the state, acknowledges that many government officials, not excluding himself, went into the administration convinced “that they are either smarter than the president, or that they can hold their own against the president, or that they can protect the institution against the president because they understand the rules and regulations and how it’s supposed to work, and that they will be able to defend the institution that they love or served in previously against what they perceive to be, I will say neutrally, the inappropriate actions of the president. And I think they are fooling themselves. They’re fooling themselves. He’s light-years ahead of them.”

    The adults were too sophisticated to see Trump’s special political talents—his instinct for every adversary’s weakness, his fanatical devotion to himself, his knack for imposing his will, his sheer staying power. They also failed to appreciate the advanced decay of the Republican Party, which by 2016 was far gone in a nihilistic pursuit of power at all costs. They didn’t grasp the readiness of large numbers of Americans to accept, even relish, Trump’s contempt for democratic norms and basic decency. It took the arrival of such a leader to reveal how many things that had always seemed engraved in monumental stone turned out to depend on those flimsy norms, and how much the norms depended on public opinion. Their vanishing exposed the real power of the presidency. Legal precedent could be deleted with a keystroke; law enforcement’s independence from the White House was optional; the separation of powers turned out to be a gentleman’s agreement; transparent lies were more potent than solid facts. None of this was clear to the political class until Trump became president.

    But the adults’ greatest miscalculation was to overestimate themselves—particularly in believing that other Americans saw them as selfless public servants, their stature derived from a high-minded commitment to the good of the nation.

    When Trump came to power, he believed that the regime was his, property he’d rightfully acquired, and that the 2 million civilians working under him, most of them in obscurity, owed him their total loyalty. He harbored a deep suspicion that some of them were plotting in secret to destroy him. He had to bring them to heel before he could be secure in his power. This wouldn’t be easy—the permanent government had defied other leaders and outlasted them. In his inexperience and rashness—the very qualities his supporters loved—he made early mistakes. He placed unreliable or inept commissars in charge of the bureaucracy, and it kept running on its own.

    But a simple intuition had propelled Trump throughout his life: Human beings are weak. They have their illusions, appetites, vanities, fears. They can be cowed, corrupted, or crushed. A government is composed of human beings. This was the flaw in the brilliant design of the Framers, and Trump learned how to exploit it. The wreckage began to pile up. He needed only a few years to warp his administration into a tool for his own benefit. If he’s given a few more years, the damage to American democracy will be irreversible.

    This is the story of how a great republic went soft in the middle, lost the integrity of its guts and fell in on itself—told through government officials whose names under any other president would have remained unknown, who wanted no fame, and who faced existential questions when Trump set out to break them.

    by George Packer, The Atlantic |  Read more:
    Image: Patrick White
    [ed. Must read. See also: What The Stakes Are, and The Failure of Democratic Opposition (Current Affairs).]

    The Tyranny of Terrazzo: Will the Millennial Aesthetic Ever End?

    You walk beneath a white molded archway. You’ve entered a white room.

    A basketlike lamp hangs overhead; other lamps, globes of brass and glass, glow nearby. Before you is a couch, neatly tufted and boxy, padded with an assortment of pillows in muted geometric designs. Circles of faded terra-cotta and pale yellow; mint-green and mustard confetti; white, with black half-circles and two little dots — aha. Those are boobs. You look down. Upon the terrazzo nougat of the coffee table, a glass tray trimmed in brass. It holds a succulent in a lumpy ceramic pot, a scented candle with a matte-pink label. A fiddle-leaf fig somewhere looms. Above a bookshelf (spines organized by color), a poster advises you to WORK HARD & BE NICE TO PEOPLE. In the far corner, within the shrine of an arched alcove, atop a marble plinth: one lonely, giant cartoon jungle leaf, tilting from a pink ceramic tube. You sense — in a way you could neither articulate nor explain — the presence of a mail-order foam mattress somewhere close at hand.

    All that pink. All those plants. All that white. It’s so clean! Everything’s fun, but not too much fun. And there, in the round mirror above the couch: It’s you. You know where you are. Or do you?

    Search your brain. Swap out the monstera leaf for waxy red anthurium, WORK HARD & BE NICE TO PEOPLE for GOOD VIBES ONLY. Maybe the pillows were succulent-print; maybe the ceramics had boobs. IT WAS ALL A DREAM, says a neon sign in schoolgirl cursive. You hadn’t noticed that before.

    Maybe it is a dream, this room you do and don’t know, assembled from cliché and half-recollected spare parts; a fever dream — or, no, that’s too much. This room functions more like a CBD seltzer, something you might buy in a salmon-pink can. There’s not a lot of distinctive taste, but still, it’s hard to resist when you’re on a permanent search for ways to feel better. The ambience is palliative — simple but not severe. Even the palette faintly suggests a medicine cabinet: powdery pharmaceutical pastels, orange pill bottles, Band-Aid pink. (...)

    Ever since modernism brought industry into design, tastes have cycled between embracing and rejecting what it wrought. A forward-looking, high-tech style obsessed with mass commercial appeal will give way to one that’s backward-looking, handmade, authenticity-obsessed — which will then give way to some new variation on tech-forward mass style. (Furniture dealers joke that “brown” goes in and out with every generation.) It’s a logic that gets filtered through the reliable desire for the world the way it looked when we were young, and lately this has meant looking back 30 or so years to the Memphis-inflected pastel pop of the ’80s and ’90s. We might call the latest iteration of the cycle the “millennial aesthetic” — not to say that it was embraced by all millennials, just that it came to prominence alongside them and will one day be a recognizable artifact of their era.

    Consider a previous youth-style shorthand: the hipster, preeminent cultural punch line of the aughts. Both hipster and millennial were terms that drifted away from strict definitions (hipsters being subcultural, millennials being generational) to become placeholders for “whatever fussy young people seem to like.” It is strange, now, to remember a time when chunky-framed glasses were understood as a hipster affectation; today they just look like Warby Parker. The hipster aesthetic harked back to a grimy past: Its spaces were wood-paneled, nostalgic; perhaps they contained taxidermy. Behind lumberjack beards and ’70s rec-room mustaches, there was a desire for something preindustrial or at least pre-internet.

    The hipster was Vice; the millennial is virtue, or at least virtuous consumption. The hipster aesthetic was capable of rendering even plain cotton T-shirts a little gross under the gaze of Dov Charney — the grossness was, indeed, part of the appeal. The millennial aesthetic, meanwhile, could take something disgusting and attempt — through sheer force of branding — to make it cute and fun. One such product, called “Come&Gone” (sans-serif logo, pastel website, friendly gifs), attracted notice on Twitter last year. Essentially a sponge on a stick, it was marketed as an “after-sex cleanup” device by a start-up “on a mission to ban the dripping, forever.”

    Sometimes the hipster flirted with racism and misogyny, couched as irony or provocation — a certain performance of exclusivity (even just daring your audience not to get the joke or know the band) was central to the hipster aesthetic’s appeal. But the millennial aesthetic aims its appeal at everyone. Propagated by brands and advertisements, it is a fundamentally commercial aesthetic — and why alienate any potential customer? Millennial marketing showcases models of many races and body types, and the products on offer are obvious in their charms. Every sofa and soft-cup bra presents itself not as evidence of distinctive taste but as the most elegant, economical, and ethical solution to the problem of sofas or soft-cup bras. Simplicity of design encourages an impression that all errors and artifice have fallen away. The millennial aesthetic promises a kind of teleology of taste: as if we have only now, finally, thanks to innovation and refinement, arrived at the objectively correct way for things to look.

    If you simultaneously can’t afford any frills and can’t afford any failure, you end up with millennial design: crowd-pleasing, risk-averse, calling just enough attention to itself to make it clear that you tried. For a cohort reared to achieve and then released into an economy where achievement held no guarantees, the millennial aesthetic provides something that looks a little like bourgeois stability, at least. This is a style that makes basic success cheap and easy; it requires little in the way of special access, skills, or goods. It is style that can be borrowed, inhabited temporarily or virtually.

    by Molly Fischer, The Cut | Read more:
    Image: Fala Atelier

    Monday, March 2, 2020

    Coronavirus Will Test Our New Way of Life

    Constant connectivity defines 21st-century life, and the infrastructure undergirding it all is both digital (the internet and our social media platforms) and physical (the gig economy, e-commerce, global workplaces). Despite a tumultuous first two decades of the century, much of our connected way of life has evaded the stress of a singular global event. The possibility of a global pandemic currently posed by the new coronavirus threatens to change that altogether. Should the virus reach extreme levels of infection globally, it would very likely be the first true test of the 21st-century way of life, laying bare the hidden fragility of a system that has long felt seamless.

    The most obvious example is our global and connected economy, which has already weathered a deep recession. There could be shortages of crucial imports.

    On Thursday, the Food and Drug Administration reported one of its first shortages of a drug for human use (it did not specify which) as a result of supply chain disruptions. The agency is monitoring 63 manufacturers in China supplying medical devices “that may be prone to potential shortage if there is a supply disruption.”

    Worries about the future of the global economy have had interest rates headed toward record lows while oil prices have dropped. This past week the United States saw its worst weekly decline for stocks since the 2008 financial crisis. Major indexes around the world fell between 4 percent and 12 percent.

    “It’s common when thinking about networks to talk about the trade-off between efficiency versus resilience,” Jon Stokes, a founder of Ars Technica and a deputy editor at The Prepared, an emergency preparedness site, told me recently. “Computers enable us to dial in the efficiency and complexity to insane degrees but we lose resilience in the system.”

    “We design systems presuming a steady state of normalcy,” Mr. Stokes argued. “But now, we’re about to hit this big ball of stress imminently. It will flex the system in weird ways that will cause parts to snap. And it’s impossible to predict what will snap.”

    A global pandemic also threatens to test other systems in ways that are harder to quantify. Chief among them: our complex information ecosystem. In the event of widespread illness, we’ll need to rely on accurate, vetted information to keep us safe. While the internet has made distribution easier than ever before, the democratization of information has created platforms and advertising economies built to reward misinformation.

    When it comes to the coronavirus, misinformation, hoaxes, and rumors about the outbreak in China have plagued YouTube and Facebook while adapting to new platforms. As BuzzFeed News’s Ryan Broderick recently explained, “unverified videos from Chinese social media are shared by local Twitter influencers, viral WhatsApp forwards warn users of government advisories that don’t actually exist, and people share bogus cures for the virus.” Literal virality and online virality begin to mimic and influence each other.

    Over the past few years, it has become clear that our social media ecosystem is easily hijacked to incentivize behavior from the worst actors, further amplifying existing tensions and disagreements. The result? A volatile political climate, where news is weaponized for political gain — a state further exacerbated by black-box algorithms protected as corporate secrets that dictate the information we see. Their unknowable nature breeds conspiratorial ideas about the flow and control of information. Trust in what we see online decreases, and news fatigue grows more widespread, especially among the least engaged political-news consumers. Those who are checked out become even more susceptible to cynicism and deception.

    A global pandemic and its attendant fear and uncertainty will only add more strain to an already flawed and complex system. Politically, we can already see the contours of the information war around the coronavirus. For Democrats, the response to the virus is a demonstration of the failure of America’s health care and private insurance systems — and a way to highlight the incompetence of the Trump administration. At the same time, the Trump administration and the pro-Trump media ecosystem are invoking factual reporting about the seriousness of the virus and concern about government ineptitude to claim political bias and downplay the risks to Americans. A legitimate public health crisis becomes yet another choose-your-own-reality event, a wedge to amplify divisions.

    Information pollution bleeds into our online commerce systems as well. Conspiracy grifters like the website Infowars are already stoking fears of government-caused food shortages, using fear to drive product sales. (...)

    Amazon, which is the biggest retailer on the planet, operating in over 180 countries, also represents the connection between the digital and the physical. With its Prime same-day and two-day delivery, it has reshaped shopping behaviors, supported by millions of workers managing the logistics of delivery, package sorting and fulfillment warehouses.

    The company’s labor practices have already come under fire for long, demanding shifts, dangerous expectations for delivery drivers and wage issues. Such concerns would certainly be exacerbated by a global pandemic. Increased desire to prepare to shelter from the virus will no doubt drive up grocery and essentials orders. Should U.S. coronavirus cases spike, demand would probably increase drastically, forcing low-wage employees — even those who may feel sick — to report to work, subjecting them and others to contagions. But the reverse scenario also creates problems: Imagine an impending pandemic scenario where quarantines or shelter-in-place orders require warehouse and delivery employees to stay home, causing panic when Amazon can no longer guarantee or fulfill orders. (...)

    Each example — and there are legions more, including our current campaign and election system, which is predicated on large public gatherings — is but one node in an enormous and extremely fragile network. It’s a network that has been building for centuries but that in the past two decades has grown through seamless connection to modern technology. Our way of life has shifted — from individuals to markets, from localized to globalized. So far, this interconnectivity has largely been a strength, creating a network so big that each of its smaller nodes can be imperfect or fail while the others persist. But much like a virus exploits a small vulnerability, creating a chain of reactions that allow it to weaken its host, a true global pandemic could work its way through the interconnected ecosystems that support our present way of life.

    by Charlie Warzel, NY Times | Read more:
    Image: Elizabeth R. Fischer/National Institute of Allergy and Infectious Diseases’ Rocky Mountain Laboratories

    Sunday, March 1, 2020

    Unbuttoned

    I’ve been writing about my father for ages, but when it comes to the details of his life, the year he graduated from college, etc., I’m worthless. Even his job remains a mystery to me. He was an engineer, and I like to joke that up until my late teens I thought that he drove a train. “I don’t really know all that much about him,” I said, scooting my chair closer to his recliner. He looked twenty years older than he had on my last visit to Raleigh, six months earlier. One change was his nose. The skin covering it was stretched tight, revealing facets I’d never before noticed. His eyes were shaped differently, like the diamonds you’d find on playing cards, and his mouth looked empty, though it was in fact filled with his own teeth. He did this thing now, opening wide and stretching out his lips, as if pantomiming a scream. I kept thinking it was in preparation for speech, but then he’d say nothing.

    I was trying to push the obituary off on Lisa when we heard him call for water.

    Hugh got a cup, filled it from the tap in the bathroom, and stirred in some cornstarch to thicken it. My father’s oxygen tube had fallen out of his nose, so we summoned a nurse, who showed us how to reattach it. When she left, he half raised his hand, which was purpled with spots and resembled a claw.

    “What’s on your . . . mind?” he asked Amy, who had always been his favorite, and was seated a few yards away. His voice couldn’t carry for more than a foot or two, so Hugh repeated the question.

    “What’s on your mind?”

    “You,” Amy answered. “I’m just thinking of you and wanting you to feel better.”

    My father looked up at the ceiling, and then at us. “Am I . . . real to you kids?” I had to lean in close to hear him, especially the last half of his sentences. After three seconds he’d run out of steam, and the rest was just breath. Plus the oxygen machine was loud.

    “Are you what?”

    “Real.” He gestured to his worn-out body, and the bag on the floor half filled with his urine. “I’m in this new . . . life now.”

    “It’ll just take some getting used to,” Hugh said.

    My father made a sour face. “I’m a zombie.”

    I don’t know why I insisted on contradicting him. “Not really,” I said. “Zombies can walk and eat solid food. You’re actually more like a vegetable.”

    “I know you,” my father said to me. He looked over at Amy, and at the spot that Gretchen had occupied until she left. “I know all you kids so well.”

    I wanted to say that he knew us superficially at best. It’s how he’d have responded had I said as much to him: “You don’t know me.” Surely my sisters felt the way I did, but something—most likely fatigue—kept them from mentioning it.

    As my father struggled to speak, I noticed his fingernails, which were long and dirty.

    “If I just . . . dropped out of the sky like this . . . you’d think I was a freak.”

    “No,” I said. “You’d think you were a freak, or at least a loser.”

    Amy nodded in agreement, and I plowed ahead. “It’s what you’ve been calling your neighbors here, the ones parked in the hall who can’t walk or feed themselves. It’s what you’ve always called weak people.”

    “You’re a hundred per cent right,” he said.

    I didn’t expect him to agree with me. “You’re vain,” I continued. “Always were. I was at the house this morning and couldn’t believe all the clothes you own. Now you’re this person, trapped in a chair, but you’re still yourself to us. You’re like . . . like you were a year ago, but drunk.”

    “That’s a very astute . . . observation,” my father said. “Still, I’d like to . . . apologize.”

    “For being in this condition?” I asked.

    He looked over at Amy, as if she had asked the question, and nodded.

    Then he turned to me. “David,” he said, as if he’d just realized who I was. “You’ve accomplished so many fantastic things in your life. You’re, well . . . I want to tell you . . . you . . . you won.”

    A moment later he asked for more water, and drifted mid-sip into that neither-here-nor-there state. Paul arrived, and I went for a short walk, thinking, of course, about my father, and about the writer Russell Baker, who had died a few weeks earlier. He and I had had the same agent, a man named Don Congdon, who was in his mid-seventies when I met him, in 1994, and who used a lot of outdated slang. “The blower,” for instance, was what he called the phone, as in “Well, let me get off the blower. I’ve been gassing all morning.”

    “Russ Baker’s mother was a tough old bird,” Don told me one rainy afternoon, in his office on Fifth Avenue. “A real gorgon to hear him tell it, always insisting that her son was a hack and would never amount to anything. So on her deathbed he goes to her saying, ‘Ma, look, I made it. I’m a successful writer for the New York Times. My last book won the Pulitzer.’ ”

    “She looked up at him, her expression blank, and said, ‘Who are you?’ ”

    I’ve been told since then that the story may not be true, but still it struck a nerve with me. Seek approval from the one person you desperately want it from, and you’re guaranteed not to get it.

    As for my dad, I couldn’t tell if he meant “You won” as in “You won the game of life,” or “You won over me, your father, who told you—assured you when you were small and then kept reassuring you—that you were worthless.” Whichever way he intended those two faint words, I will take them, and, in doing so, throw down this lance I’ve been hoisting for the past sixty years. For I am old myself now, and it is so very, very heavy.

    by David Sedaris, New Yorker |  Read more:
    Image: Ross MacDonald

    China’s Bookstores Band Together To Survive the Epidemic


    via Sixth Tone | Read more:
    [ed. Cool pics of China's bookstores. Also, businesses that "rely heavily on social gatherings and in-person experiences" are likely to be more vulnerable than others.]

    Bloomberg Has Hired the Vice Chairs of the Texas and California Democratic Parties

    Former New York City Mayor Mike Bloomberg has hired two state Democratic party vice chairs in Super Tuesday states — two of the three states with the largest numbers of pledged delegates. Bloomberg hired Texas Democratic Party Vice Chair Carla Brailey as a senior adviser to his campaign in December, and he hired California State Democratic Party Vice Chair Alexandra Rooker for a similar role in January.

    Both Brailey and Rooker are superdelegates who will likely vote for the Democratic presidential nominee at the party’s national convention this summer. Hiring the leadership of a state party doesn’t appear to break any campaign laws, but it indicates Bloomberg’s intent to effectively purchase political support, said Brendan Fischer, the federal reform program director at the Campaign Legal Center. “This does seem to fit a longstanding pattern of Bloomberg using his billions to help generate support among political elites,” he said.

    Rooker is one of two members of Bloomberg’s campaign staff who also sit on the Democratic National Committee’s rules committee, which recommends rules for the convention, the convention agenda, the convention’s permanent officers, amendments to the party’s charter, and other resolutions. In November, the month he entered the presidential race, Bloomberg gave $320,000 to the DNC, his first contributions to the committee since 1998. (He was a registered Republican from 2001 to 2007, after which he became an independent. He registered as a Democrat in 2018.) He also donated $10,000 to the Texas Democratic Party, where Brailey has been vice chair since June 2018, as well as $10,000 to the California Democratic Party. Brailey, Rooker, and the Bloomberg campaign did not respond to requests for comment on their hiring. (...)

    Bloomberg will appear on the ballot for the first time on Super Tuesday, March 3. His campaign has poured tens of millions of dollars into both Texas and California, where there are 228 and 416 delegates up for grabs, respectively.

    by Akela Lacy, The Intercept |  Read more:
    Image: Steve Breen, via
    [ed. Swamp people. So far, everything I've read about the DNC's so-called leadership is pretty disgusting. Speaking of which, superdelegates: DNC Superdelegate Promoting Brokered Convention is a Significant GOP Donor, Health Care Lobbyist (The Intercept); and A Field Guide for Bloomberg-Campaign Deputy Digital Organizers (New Yorker)]