Monday, March 9, 2020

Look Out Below


Here’s how the plunging stock market could cause a recession (MarketWatch)
Image: Getty
[ed. Shock and Awe. Well, that didn't take long. One circuit breaker and we're already seeing calls for various forms of fiscal stimulus (and the White House appears to be listening) (ZH). Calculated Risk has additional suggestions.]

Sunday, March 8, 2020

How To Surf Alaska's Bore Tide


How To Surf Alaska's Bore Tide (Smithsonian)
Image: Streeter Lecka/Getty Images

We Are All Irrational Panic Shoppers

Costco diehards know that weekends are for amateurs. The true connoisseur of bulk condiments and discount televisions and multi-pack leggings avoids the choke of the Saturday-and-Sunday crowd, commandeering her oversized shopping cart only on weekdays, and especially on weekday mornings: the store opens its doors at nine, and it generally takes an hour of customer inflow for the winding aisles to reach a critical mass of mayhem. Being one of the first people inside Costco can be an oddly peaceful experience: the quiet hum of the refrigerator cases, the sweeping linoleum canyons with walls of neatly stacked inventory.

Panic, perhaps unsurprisingly, throws this rhythm into disarray. Fear of covid-19, the flu-like illness that has made its way from China to cities around the world, including a growing number in the U.S., has sent many New Yorkers into crisis mode. (The city, as of Thursday, has two reported cases.) Health authorities, including the C.D.C., have advised the public to prepare for a period of social distancing; based on transmission-prevention efforts in China, South Korea, Italy, and Iran, it seems possible that Americans in some cities will eventually face full-on quarantine. This involves certain elements of what emergency managers call “shelter in place” and doomsday preppers refer to as “bugging in”: enough supplies for a household to remain isolated for at least a few days, and ideally a few weeks. Where better to get your fix of canned beans than a store where the cans are gallon-size?

At eight-forty-five on a recent Monday morning, fifteen minutes before Costco’s official opening time, the crowd waiting to get inside the warehouse in Brooklyn was already about a hundred strong. Bodies and carts were jammed together inside the store’s open vestibule, pressing up against the still-locked doors; outside, spilling into the parking lot and blocking the flow of traffic, nearly twice as many shoppers fanned out around the vestibule entrances, aiming their carts with the tense energy of bobsledders waiting for the starting gun. An employee pushing a pallet cart shouted for people to get off the asphalt and instead wrap around the building, which everyone ignored, so instead he called out to a co-worker standing closer to the front of the throng: “Tell them they’ve got to open the doors early!” The second employee snaked his way through the crowd to the doors. As he turned sideways to fit between two logjammed carts, he muttered, “This is a madhouse.”

Inside the store, chaos descended almost instantly. Just past the entrance, and then again by the escalators to the second floor, hundreds of shrink-wrapped five-packs of Clorox Disinfecting Wipes (“Kills 99.9% of Viruses & Bacteria!”) were stacked like battlements, the walls slowly eroding as shoppers threw them into their carts. There was a traffic jam near the medical-grade nitrile gloves—hundreds of boxes remained in the small and medium sizes, but the store’s stock of large gloves (which are sized to fit most men’s hands) was down to the last dozen. The antibacterial soap and hand sanitizer were long gone before any of us had set foot in the store that morning, cleaned out by the weekend hordes.

In the past week, shoppers across the country have been sharing pictures of Costco’s denuded shelves on social media, with captions simultaneously horrified and entertained: the store is premised on a fantasy of endless abundance, and there’s something intensely satisfying in finding the bottom of the allegedly bottomless pit. There’s an element of fantasy in the panic, as well. Apocalyptic bug-in plans require emergency stores of water, but staying home for a few weeks isn’t quite nuclear holocaust: why, then, were shoppers filling entire carts with cases upon cases of Poland Spring? (Curiously, those bottles seemed to be moving more swiftly than Costco’s less expensive, in-house Kirkland brand of water—a Giffen-good irrationality also present in the toilet-paper department, where a fortress of name-brand Scott rolls was dismantled by shoppers in the space of forty minutes while a far more imposing fortress of Kirkland Signature Bath Tissue barely had a dent.) (...)

There’s an undeniable logic to bulk shopping in times of bulk need (or, at the very least, times of bulk panic-induced compulsion), but not everyone has a Costco membership. The same day I visited Costco, an appointment brought me to the Upper West Side, where I stopped by the Fairway on Seventy-fourth and Broadway. The store was day-before-Thanksgiving crowded: the first thing I saw when I stepped through the door was the tail end of a checkout line, one of three wending through the narrow aisles. The verdant jungle of the produce section was at stark odds with the dry goods and grocery shelves, which had been ravaged. All the beans, dried and canned, were completely gone; the dried pastas were well picked over, and what remained was in violent disarray. There was a closed-off squareness to people’s posture, a bulldozeriness to their gait—a clear transmission that no, sorry, we’re not actually all in this together, so you’d better back away slowly from the granola bars. (The mood probably wasn’t helped by the marquee for the Beacon Theatre, directly across the street from Fairway, promoting that night’s unfortunately timed rock show by flashing the words “Widespread Panic” in huge red letters.)

As more and more covid-19 cases are being confirmed in the United States (and as it becomes clear that the federal government’s ability to properly inform and protect the population is, at best, deeply flawed), health experts and epidemiologists are emphasizing that individual risk is quite low, as long as basic precautions are taken, and that those of us who are in general good health should refrain from hoarding masks and other resources that are more vital to those urgently in need of care. This is rational, well-considered advice; unfortunately, man is not a terribly rational creature. Fear is contagious: when we see people go out of their way to protect themselves from disaster, no matter how unlikely, we don’t want to be the only ones left undefended. The ultra-rich are ditching first class in order to fly private, where at least the germs are more rarefied; the D.I.Y.-minded are mixing rubbing alcohol with aloe gel to approximate the effects of hand sanitizer. In recent days, I’ve heard stories of friends and friends of friends making their own frantic grocery trips, walking in the door of Wegmans or C-Town knowing intellectually that this kind of bunker mentality is unwarranted, it’s silly, it’s probably counterproductive—but coming home nevertheless with stacks of SpaghettiOs and canned green beans, or filling their freezers with Lean Cuisines. “I jokingly call it calamity capitalism,” David Sanders, the owner of Doomsday Prep, a company that sells survival supplies and gear, told Slate, about this deeply human drive to soothe uncertainty by buying and buying and buying. There is, of course, a German word for it: Hamsterkäufe, meaning to shop like a nervous, bulging-cheeked hamster.

by Helen Rosner, New Yorker |  Read more:
Image: Tayfun Coskun / Anadolu Agency / Getty
[ed. I learned a new term today: Giffen Goods (Investopedia). And here (Wikipedia).]
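
[ed. For reference — this is the standard textbook formulation, not anything from Rosner's piece — a Giffen good is one whose demand rises with its price. In the Slutsky decomposition of a price change,

\[
\frac{\partial x}{\partial p} = \underbrace{\frac{\partial x^{h}}{\partial p}}_{\text{substitution}\,\le\,0} - \underbrace{x\,\frac{\partial x}{\partial m}}_{\text{income effect}},
\]

the substitution term is always negative; only when a good is so strongly inferior (\(\partial x / \partial m < 0\)) that the income term dominates does \(\partial x / \partial p\) turn positive. Rosner's usage is looser — name-brand toilet paper outselling the cheaper Kirkland — but the inversion of "cheaper sells more" is the same.]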

Can the Center Hold? It Never Does

Here we go again. Something is slouching in the general direction of Bethlehem, or whatever passes for “Bethlehem” these days, and as the slouching proceeds, a lot of people are saying that the center is coming undone.

America has suffered “a hollowing out of the political center,” one writer says. Another notes that revolution is “creeping closer as the political center collapses.” The New York Times is on the case, publishing a column helpfully titled, “The Center Cannot Hold.” Times readers get the reference.

What they get, of course, is a line from a William Butler Yeats poem that, over many years of earnest post-collegiate flogging, has been stripped of bark and flower, leaving it a bare ruined choir.

More than Yeats, however, it’s Joan Didion who deserves credit for this endlessly recycled trope about an uncentered center. Didion’s 1968 essay collection, “Slouching Towards Bethlehem,” borrowing another phrase from the same poem, is a cultural landmark much beloved by the sorts of people who write opinion pieces and headlines. (“Bare Ruined Choirs” is a book title stolen from another poet, Shakespeare, by another journalist who gained fame in the 1960s, Garry Wills.) (...)

“Of course not all of the pieces in this book have to do, in a ‘subject’ sense, with the general breakup, with things falling apart,” Didion writes in the preface to her book. Maybe not. But if you’re inclined to see unraveling, Didion gives you plenty of what you came for. She chose the title, and chose to begin the book with the Yeats poem that stalks us still. You won’t find a middle to cling to here.

Indeed, the uneasy anchor of the book is the title essay, the collection’s longest, which occupies the physical center of a slender volume. It’s the void at the core of Didion’s creation.

“Slouching Towards Bethlehem” is great journalism today. It must have induced chills when it was published. At a time when others went to the Haight-Ashbury neighborhood of San Francisco in search of flower power, as if such a thing could exist, Didion descended in 1967 with another purpose in mind. “San Francisco,” she writes, “was where the social hemorrhaging was showing up. San Francisco was where the missing children were gathering and calling themselves ‘hippies’.”

Perhaps there was peace and love to be found. But Didion chronicled social disintegration with a side of youthful brutality, presaging the Manson murders that took place 380 miles and two years down the road. She quotes a pamphleteer who chronicled the hippie scene on mimeographed sheets.
Pretty little 16-year-old middle-class chick comes to the Haight to see what it’s all about & gets picked up by a 17-year-old dealer who spends all day shooting her full of speed again & again, then feeds her 3,000 mikes & raffles off her temporarily unemployed body for the biggest Haight Street gangbang since the night before last. The politics and ethics of ecstasy.
Didion provides no context for the violence, no judgment on the horror. It’s just an event that occurred in a particular time and place — Haight-Ashbury, year of the Summer of Love.

Eventually, after Max and Sharon and Tom and Barbara get pretty high on hash, and the Grateful Dead show up with groupies to spare, or share, and various characters trip or eat or simply surrender to the collective confusion, Didion gets down to business. “We were seeing the desperate attempt of a handful of pathetically unequipped children to create a community in a social vacuum,” she writes. “Once we had seen these children, we could no longer overlook the vacuum, no longer pretend that the society’s atomization could be reversed.”

Having fled the world of adults, Didion’s children constructed an alternative filled with hopes and feelings and attitudes (and drugs) but lacking coherence, direction, rigor, order — a future. Wandering the psychedelic labyrinth, Didion eventually arrives at her story’s end, her ultimate lost child. She is guided to a girl named Susan who likes ice cream and wants a bicycle for Christmas. Susan is wearing white lipstick and reading a comic book on a living-room floor. She is 5 years old and high on acid.

In a documentary about Didion — it’s called “The Center Will Not Hold,” in case you had hoped otherwise — she is asked what she thought of that eerie scene. She pauses before offering this: “Let me tell you, it was gold. You live for moments like that if you’re doing a piece.”

It’s a horrifying revelation, a jolting, sickening aftershock, decades removed from the original quake. It turns out Didion was not a detached observer. She was an enthusiastic witness. Watching centrifugal forces pulling at the center, she rooted for the unraveling. Crashes make good copy.

Yet Didion’s perceptions were genuine, her focus on children shrewd. Her fragile subjects, and her insights about them, give her essay a disturbing, undiminished power. Indeed, the plight of youth might be what links the uncentering of 2020 to the anomie of 1967. It’s not just the rapier politics of race and resentment, the disorientation of high-speed culture or the pervasive, species-wide threat — more biological than nuclear at the moment. It’s their harrowing effects on a young generation reeling from overdoses of debt and doubt.

To be young in America in 2020 might be more perilous than it was in the late 1960s. The federal government keeps some migrant children in cages, awaiting processing. Others it stole from parents, then passed on to new homes without leaving a forwarding address. A bad trip.

Children born in America are also under duress. The cost of education is exorbitant; the dominion of drugs, conveyed by doctor’s prescription, pervasive. In 2015-2016, 18% of children under age 12, and 27% of adolescents ages 12 to 19, had used prescription drugs in the past 30 days.

George Wallace and Richard Nixon, who in 1968 both sought to carve up the nation for political advantage, have been supplanted by Donald Trump, who combines their most vicious instincts and venal appetites. Trump’s racial aggression, his most authentic contribution to public life, is felt most keenly by the young. The most common age of white Americans is 58. For black America, it’s 27. For Hispanics, 11.

As older whites cling to power, climate change bears down on the young — who wonder what, exactly, they stand to inherit. Nihilism, the unbranded version of Trumpism, is manifested in a deathly enthusiasm for carbon, by the president’s sabotage of any attempt to ameliorate its effects, and by his eagerness, even impatience, to release additional toxins, both chemical and metaphorical, into the world. The president amuses his followers by publicly attacking a teenage girl from Sweden whose sole offense is a desire for a healthier planet.

Between the bomb and the war, the Weathermen and the drugs, the youth surveyed by Didion came to their untethering honestly enough. But a generation coming of age today under planetary stress and political demagogy has every reason to suspect that the center has no place for it, and may well not hold its own. Investing their hopes in an odd and angry old man from Vermont seems as plausible a strategy as any.

So will the center hold? In truth, it never does. It collapses, reemerges, reconstitutes, adapting to new forces and new realities. The humans who occupy it, or orbit around it, evolve as well, recalibrating to align themselves to an emergent nucleus. That’s how societies keep from dying. At least until the center neither holds nor reconstitutes, but implodes altogether.

by Francis Wilkinson, Bloomberg |  Read more:
Image: Robyn Beck/AFP

Tourism's Free Fall

This is all happening just weeks before high season gets under way. But with millions and millions of tourists voting with their feet by staying at home, one of Europe’s most important and (until four weeks ago) fastest growing industries is taking a hammering.

The world right now is full of places that should be teeming with people but are not, including many iconic tourist landmarks and attractions. In Italy, home to Europe’s third biggest tourism industry, large parts of the country are on lockdown after being hit by the biggest outbreak of COVID-19 outside of Asia. Many of the most famous tourist attractions have been closed and big international events, including the Venice Carnival, have been cancelled.

The impact on the country’s tourism industry has been brutal, prompting panicked representatives to warn that a “generalized panic” over coronavirus could “sink” the sector. “There is a risk that Italy will drop off the international tourism map altogether,” said Carlo Sangalli, president of Milan’s Chamber of Commerce. “The wave of contagions over the past week is causing huge financial losses that will be difficult to recoup.”

Even by late February, when the outbreak was still in its infancy, €200 million worth of travel and accommodation bookings in March had already been cancelled, reported Italian tourism association Assoturismo Confesercenti. That figure, based on data provided by Italy’s hotels, B&Bs and travel agencies, doesn’t include lost tourist revenue for transport, tour guides, bars, restaurants and shops. Bookings are also “sharply down” until June.

In the three most affected regions — Lombardy, Veneto and Emilia-Romagna (in descending order) — cancellation rates on bookings of hotels, flights and apartments have reached as high as 90%. These three regions also happen to be the main motor of Italy’s economy, accounting for 40% of national GDP. The country’s financial capital (and capital of Lombardy), Milan, is like a ghost town, with many of its most important landmarks, including the Teatro alla Scala opera house, closed to visitors.

“In recent history Italian tourism has never experienced a crisis like this,” Vittorio Messina, National President of Assoturismo, stated in a press release. “It is the darkest moment. Not even 9/11 affected it so heavily.”

The industry group Confturismo-Confcommercio has forecast that the sector, which accounts for 13% of Italian GDP, will suffer total losses of €7.4 billion in the second quarter. It’s the small businesses that are most at risk, warns Messina: “If the situation of generalized panic continues, thousands of businesses, especially small ones, will first enter a liquidity crisis, then close their doors. We urgently need to work towards normalization.”

This unprecedented slowdown of a sector as vital as tourism does not bode well for a country whose economy has barely grown for 20 years and whose banking sector continues to be plagued by systemic problems, including dangerously high levels of non-performing loans (NPL). While the NPL ratio has fallen from a peak of 17% in late 2015 to 8.2% (in September 2019), it’s still way too high for comfort. In the coming months it’s likely to undergo a sharp resurgence as businesses and households struggle to generate enough income to cover their liabilities and service their debt. And that is the last thing that Italy’s already fragile financial sector needs.

In Spain, tourism is even more important to the national economy, generating approximately €180 billion a year — close to 15% of GDP. In 2019, Spain was the second most visited country in the world, attracting 83.7 million foreign tourists.

by Nick Corbishley, Wolfstreet |  Read more:
Image: Susan Wright
[ed. Interesting tourism numbers. See also: Shock and Awe (Wolfstreet).]

Saturday, March 7, 2020

The People of Las Vegas

It’s February in Las Vegas, and because I have managed to step on my glasses and break them, as I do at least once a year, I have gone to the LensCrafters at the Boulevard Mall, a faux deco artifact of midcentury Vegas that, like so many malls in America, is a mere husk of its former self. In a faculty meeting a few days earlier, I’d watched as one of my colleagues bent and manipulated a paper clip, then used it to refasten the left bow of his glasses, creating a tiny antenna at his temple. That’s not a look I’m after, so I am here, obsessively trying on frame after frame, as the young Iranian man who is helping me on this quiet Monday afternoon patiently nods or shakes his head: yes, yes; no, no, no.

I order two pairs. LensCrafters, the movie theater chain of eyeglasses, is always offering deals: half off a second set of frames, a supersize popcorn for fifty cents more. While I wait, I walk around the mall, a 1.2-million-square-foot monstrosity built on seventy-five acres, with a 31,000-square-foot SeaQuest aquarium and a 28,000-square-foot Goodwill.

Next door to LensCrafters, there’s a shop that sells gemstones, crystals, sage, and pink Himalayan salt lamps. The burning sage makes that end of the mall smell musky, animalistic—a strangely feral odor in this synthetic environment. Snaking its lazy way around the scuffed tile floor is an automated miniature train, the sort children might ride at the zoo, driven by an adult man dressed as a conductor; it toots loudly and gratingly at regular intervals. JCPenney and Macy’s and Dillard’s closed months and years ago, while Sears is limping along in its fire-sale days. At Foot Locker, I try on black-and-white Vans in an M. C. Escher print. At Hot Topic, I browse the cheap T-shirts printed with sayings like Keep Calm and Drink On and Practice Safe Hex. I eat a corn dog, fried and delicious, at a place called Hot Dog on a Stick. (I really do.) The atmosphere is depressing, in all its forced cheerfulness and precise evocation of the empty material promises of my ’80s-era youth.

I am almost three miles east of the Strip, but I could be anywhere, at any ailing mall in America. The only clues that I am in Las Vegas are a couple of clothing shops that carry items like six-inch Lucite stilettos and pearl-encrusted crop tops. And then, outside, a well-worn swimsuit someone has discarded on a pedestal near the entrance, where Fleetwood Mac’s “Rhiannon” blares. The swimsuit has a built-in corset-like bra, an exoskeleton of sorts—it could probably stand on its own—and it’s as if someone has left a part of her body behind. There’s no pool, I think. Who undressed here? Such odd Vegas-y details are everywhere in this city—the Elvis impersonator shopping in full-spangled regalia at my local health food store, the pink vibrator melting on Maryland Parkway in 110-degree heat—and I assume you eventually become blind to them, but after four years here, I still see them.

Las Vegas is a place about which people have ideas. They have thoughts and generalizations, takes and counter-takes, most of them detached from any genuine experience and uninformed by any concrete reality. This is true of many cities—New York, Paris, Prague in the 1990s—owing to books and movies and tourism bureaus, but it is particularly true of Las Vegas. It is a place that looms large in popular culture as a setting for blowout parties and high-stakes gambling, a place where one might wed a stripper with a heart of gold, like Ed Helms does in The Hangover, or hole up in a hotel room and drink oneself to death, as Nicolas Cage does in Leaving Las Vegas. Even those who would never go to Las Vegas are in the grip of its mythology. Yet roughly half of all Americans, or around 165 million people, have visited, and one slivery weekend glimpse bestows on them a sense of ownership and authority.

by Amanda Fortini, The Believer |  Read more:
Image: Carrole Barraud

Thursday, March 5, 2020

My Brilliant Friend: Exploring the Joyousness of Dogs

The infectious joy of dogs figures large in On Dogs: An Anthology, introduced by the actor and comedian Tracey Ullman. Although she is a devoted dog-lover, who hopes to die “covered in cashmere blankets and lots and lots of dogs”, the selections in the anthology are not all feel-good. Several are dark or poignant pieces on a dog’s death; others offer sour or sardonic comments on pet dogs. The collection is an eclectic mixture drawn from fiction, poems, anecdotes, and scientific or philosophical essays. (...)

For several of the contributors, the most prominent thread that runs through the book is love – both the love dogs have for people and the love that people return. Our love of dogs is in part a response to their happiness but also, as the legendary French actor and animal welfare activist Brigitte Bardot observes, to their wanting us to be happy. Our love, in effect, responds to their love. “Response”, perhaps, is not the ideal word. Certainly, love for a dog need not be an unconsidered, mechanical reaction to their affection. As Monty Don pointed out in his book on his golden retriever Nigel, a dog is an “opportunity” for a person to develop, shape and manifest love for a being that is not going to reject or betray this love. In a fine essay in the anthology, the late Roger Scruton argues that while dogs may rightly invite love, it must be of the right kind. Although dogs have been “raised to the edge of personhood”, they are not persons, and to ignore this will damage dog and owner alike. The owner will have unreasonable expectations that the dog is bound to disappoint, or a dog may suffer longer than necessary when an owner, viewing the pet as a person, refuses to have it euthanized.

If our love is a response to the dog’s, so, in turn, is the dog’s love a response to ours. It is not true, as Ullman maintains, echoing the popular view, that dogs “offer unconditional love and loyalty, no matter how badly we behave”. It is possible, indeed horribly frequent, for people to forfeit a dog’s love. Those dull-eyed, mangy and broken animals whose owners chain them up and ignore them no longer love these people. It is true that dogs do not impose conditions on us, but this, as Scruton explains, is because they cannot do so, not because they generously refrain from doing so. For a similar reason, it is questionable for Alice Walker to praise her labrador, Marley, for “how swiftly she forgives me”. Marley neither forgives nor blames, for these are actions that presuppose a range of concepts – responsibility, intention, negligence and so on – that are not in her or any other dog’s repertoire. A term better than “unconditional” for characterizing a dog’s love might be “uncomplicated” or “unreflecting”, neither of which is intended to detract from what Lorenz called the “immeasurability” of this love.

by David E. Cooper, TLS |  Read more:
Images: markk, Lucile


Coronavirus Might Make Americans Miss Big Government

In South Korea, the number of people who are confirmed to have been infected with Covid-19, the pandemic disease commonly known as coronavirus, has ballooned to over 5,000 as of the time of this writing and will certainly continue to rise. In the U.S. the official number infected is only 118. But much of this difference may be an illusion, due to differences in how many people are getting tested. South Korea has made a concerted effort to identify all the people infected with the virus, creating drive-through testing stations. The U.S.’s testing efforts, in contrast, look almost comically bungled.

The list of ways that U.S. institutions have fumbled the crisis reads like something out of a TV comedy: The number of test kits issued in the U.S. has been a tiny fraction of the number issued in South Korea. An early testing kit from the U.S. Centers for Disease Control and Prevention (CDC) contained a faulty ingredient and had to be withdrawn. Regulatory hurdles have slowed the rollout of tests, with officials from the CDC and the Food and Drug Administration only now discussing what to do. There are stories of possible coronavirus patients being denied testing due to maddeningly strict CDC limits on who can get a test. Some cities may have to wait weeks for tests to become widely available, during which time the populace will be left in the dark. Worst of all, the CDC has now stopped disclosing the number of people being tested, a move that seems likely to spread panic while reducing awareness.

What are the reasons for this institutional breakdown? It’s tempting to blame politics—President Donald Trump is obviously mainly concerned with the health of the stock market, and conservative media outlets have worked to downplay the threat. But the failures of the U.S.’s coronavirus response happened far too quickly to lay most of the blame at the feet of the administration. Instead, they point to long-term decay in the quality of the country’s bureaucracy.

Political scientist Francis Fukuyama has been sounding the alarm about the weakening of our bureaucracy for years now. He notes that the civil service has always been less powerful in the U.S. than in other advanced nations in Europe and East Asia, as the U.S. relies more on the courts. That may lend an imprimatur of fairness to government decisions, but courts are obviously ill-equipped to handle acute threats like pandemics. Fukuyama also points out that less bureaucratic power means more direct political control, which allows the wishes of civil servants to be overridden by the desires of lobbyists. All of these tendencies, he argues, have become worse in recent decades.

A decline in institutional competence is hard to measure. But the amount of resources that the U.S. as a whole devotes to its bureaucracy is shrinking, in terms of both wages and number of employees.

Very little of this decline is a function of our shrinking military. The civilian federal workforce is a much bigger factor, having fallen from about 4% of GDP in 1970 to under 2% today. Much of the work is now handled by contractors, or not done at all. State government cutbacks have also been big.

A bureaucracy whose size doesn’t grow in step with the overall economy means that civil service is not a promising career for young college graduates. Inflexible government pay structures also make it hard to reward excellence or get ahead. Salaries are uncompetitive at the top end, topping out at just under $200,000 for the most highly paid senior bureaucrats—only slightly more than an entry-level engineer makes at Google.

Part of the problem is that it’s very difficult for the U.S. to compete with the private sector when the latter is so efficient. But that’s not true everywhere: Singapore manages to have a thriving private sector alongside a highly effective bureaucracy that recruits the best and brightest. So it’s largely a question of political will.

In the U.S. the public sector has come under sustained attack from the political right for many years. President Ronald Reagan famously declared that “government is the problem,” while anti-tax activist Grover Norquist stated that he wanted to “drown [government] in a bathtub.” Republican and Democratic presidents alike have enacted pay freezes for federal workers.

Trump is continuing the assault on the civil service, proposing yet another pay freeze. In 2018 he fired an executive branch team responsible for responding to pandemics and attempted to decrease funding for public health.

by Noah Smith, Bloomberg |  Read more:
Image: Stefani Reynolds/Bloomberg
[ed. If you wanted to make a point about how ineffective government is, just keep strangling it in the name of fiscal restraint and spending cuts. Eventually all that's left is a barely functional bureaucracy and a self-fulfilling prophecy. Republicans have been pushing this strategy for decades (disassembling a powerful engine, piece by piece).]

Haiku Stairs


The Folly Of Spending Tax Dollars To Tear Down The Haiku Stairs (Honolulu Civil Beat)
Image: uncredited

America Punished Elizabeth Warren for Her Competence

In November 2019, as the Democratic presidential candidates prepared for the primaries that had been taking place unofficially for more than a year and that would begin in earnest in February, FiveThirtyEight’s Clare Malone profiled Pete Buttigieg. In the process, Malone spoke with two women at a Buttigieg event in New Hampshire. One liked Joe Biden, but felt he was a bit too old for the presidency. The other liked Buttigieg, without qualification: “I feel he’s well positioned,” she explained. “The country is ready for a more gentle approach.”

As for Elizabeth Warren? “When I hear her talk, I want to slap her, even when I agree with her.”

A version of that sentiment—Warren inspiring irrational animus among those whom she has sought as constituents—was a common refrain about the candidate, who announced today that she was suspending her campaign after a poor showing on Super Tuesday. This complaint tends to take on not the substance of Warren’s stated positions, but instead the style with which she delivers them. And it has been expressed by pundits as well as voters. Politico, in September, ran an article featuring quotes from Obama-administration officials calling Warren “sanctimonious” and a “narcissist.” The Boston Herald ran a story criticizing Warren’s “self-righteous, abrasive style.” The New York Times columnist Bret Stephens, in October, described Warren as “intensely alienating” and “a know-it-all.” Donny Deutsch, the MSNBC commentator, has dismissed Warren, the person and the candidate, as “unlikable”—and has attributed her failure to ingratiate herself to him as a result, specifically, of her “high-school principal” demeanor. (“This is not a gender thing,” Deutsch insisted, perhaps recognizing that his complaint might read as very much a gender thing. “This is just kind of [a] tone and manner thing.”)

The campaigns of those who deviate from the traditional model of the American president—the campaign of anyone who is not white and Christian and male—will always carry more than their share of weight. But Warren had something about her, apparently: something that galled the pundits and the public in a way that led to assessments of her not just as “strident” and “shrill,” but also as “condescending.” The matter is not merely that the candidate is unlikable, these deployments of condescending imply. The matter is instead that her unlikability has a specific source, beyond bias and internalized misogyny. Warren knows a lot, and has accomplished a lot, and is extremely competent, condescending acknowledges, before twisting the knife: It is precisely because of those achievements that she represents a threat. Condescending attempts to rationalize an irrational prejudice. It suggests the lurchings of a zero-sum world—a physics in which the achievements of one person are insulting to everyone else. When I hear her talk, I want to slap her, even when I agree with her.

To run for president is to endure a series of controlled humiliations. It is to gnaw on bulky pork products, before an audience at the Iowa State Fair. It is to be asked about one’s skin-care routine, and to be prepared to defend the answer. The accusation of condescension, however, is less about enforced humiliation than it is about enforced humility. It cannot be disentangled from Warren’s gender. The paradox is subtle, but punishing all the same: The harder she works to prove to the public that she is worthy of power—the more evidence she offers of her competence—the more “condescending,” allegedly, she becomes. And the more that other anxious quality, likability, will be called into question. Warren’s “‘my way or the highway’ approach to politics,” Joe Biden argued in November, attempting to turn what might also be called principle into a liability, is “condescending to the millions of Democrats who have a different view.” (...)

One of the truisms of the 2020 campaign—just as it was a truism in 2016, and in 2008—is that women candidates are punished, still, for public displays of ambition. (One resonant fact of Hillary Clinton’s political life is that she was much more popular, in opinion polls, during her tenure as secretary of state—a role for which she did not campaign, and in which she served at the pleasure of the president—than she was when, just a few years after that, she sought the presidency herself.) American culture has maintained a generally awkward relationship with political self-promotion: That George Washington was conscripted into the presidency rather than campaigning for it remains a foundational bit of lore. When women are the ones doing the promoting, the tension gets ratcheted up.

Kate Manne, a philosopher at Cornell University, describes misogyny as an ideology that serves, ultimately, to reinforce a patriarchal status quo. “Misogyny is the law-enforcement branch of patriarchy,” Manne argues. It rewards those who uphold the existing order of things; it punishes those who fight against it. It is perhaps the mechanism at play when a woman puts herself forward as a presidential candidate and finds her attributes—her intelligence, her experience, her compassion—understood as threats. It is perhaps that mechanism at play when a woman says, “I believe in us,” and is accused of being “self-righteous.”

by Megan Garber, The Atlantic |  Read more:
Image: Drew Angerer
[ed. I hope Elizabeth will have a major influence in politics and public policy for years to come. See also: Why Michael Bloomberg Spent Half a Billion Dollars to Be Humiliated (The Atlantic). Sad, satisfying, and definitely schadenfreudic (sp?).]

On The Market

At the museum, I am standing with my spouse in front of a Flemish vanitas scene. There is an old man hunched over his accounting books, surrounded by gold coins and jewels; a skull sits on his desk, and Death himself perches undetected above his shoulder. What, I ask her, is the “takeaway” of such scenes supposed to be? That one would do well to start thinking of one’s soul, she says. And I think, but do not say: I thought of nothing but my soul for forty years, never learned the first thing about how money works, and now time is much shorter than in our youth, and I’ve managed to save so little money, and I am worried about leaving you alone in this world without me, with only the small amounts we’ve been able to put away for us, for you, as we move about from country to country, renting one modest apartment after another, like dry old students. O my love, I hate to envision you alone and frightened. Is it wrong for me now to count our coins and to keep our accounting books? Am I compromising the fate of my soul? Is this vanity?

In November of last year, I opened a brokerage account. I had been reading simple, bullet-pointed introductions to financial literacy for a few months before that, manuals “for dummies” of the sort that I am conditioned to hold in contempt when their subject is, say, Latin, or the Protestant Reformation. After this period of study, I determined I was ready to invest the bulk of the money I had to my name, around $150,000, in the stock market (an amount large enough to make me already worthy of the guillotine, for some who have nothing, and small enough to burn or to lose with no consequences, for some who have much more). The fact that I had that amount of money in the first place was largely a bureaucratic mistake. When I quit my job at a university in Canada after nine years of working there, the human-resources people closed my retirement account and sent me the full amount in a single check. That check—the “retirement” I unwittingly took with severe early-withdrawal penalties at the age of forty-one when in fact I was only moving to a job in another country—plus some of the money I had saved over just the past few years from book-contract advances, was to be the seed funding for what I hoped, and still hope, might grow into something much larger through the alchemy of capital gains.

It was driven home to me repeatedly in my early efforts to build an investment strategy that, quite apart from the question of whether the quest for wealth is sinful in the sense understood by the painters of vanitas scenes, it is most certainly and irredeemably unethical. All of the relatively low-risk index funds that are the bedrock of a sound investment portfolio are spread across so many different kinds of companies that one could not possibly keep track of all the ways each of them violates the rights and sanctity of its employees, of its customers, of the environment. And even if you are investing in individual companies (while maintaining healthy risk-buffering diversification, etc.), you must accept that the only way for you as a shareholder to get ahead is for those companies to continue to grow, even when the limits of whatever good they might do for the world, assuming they were doing good for the world to begin with, have been surpassed. That is just how capitalism works: an unceasing imperative for growth beyond any natural necessity, leading to the desolation of the earth and the exhaustion of its resources. I am a part of that now, too. I always was, to some extent, with every purchase I made, every light switch I flipped. But to become an active investor is to make it official, to solemnify the contract, as if in blood.

When I was eleven, I learned that a check is the form of currency you use when you do not have any other. My mother, recently divorced and under severe financial strain trying to get her family-law practice off the ground, used to take us to Kentucky Fried Chicken when cash reserves were depleted, since that was the only fast-food restaurant in town that accepted these strange promissory notes as a form of payment (and in those days there was no possibility of immediate verification of the availability of funds). We kept careful track of which KFC locations in town might have got a bounced check from us, and avoided them by moving out to ever more peripheral neighborhoods in search of dinner. This was among my earliest and most vivid lessons in what I now think of as my first financial education. When I turned eighteen, with no understanding at all of how interest works, I got my first credit card; when I ran it up to its limit, I got my second credit card; then I got a third. When I finished my undergraduate studies, and was admitted to several graduate programs, I decided I simply had to go to the only one of them that did not offer me a financial package, including tuition remission. So instead, I took student loans. I spent my twenties and thirties under constant pressure from collection agencies. Routine robocalls terrified me, calls from live agents often induced me to either break down in tears or fight back with ridiculous counterthreats, or some combination of the two. This condition, too, I can attest, is something like sin, and something like disease. I carry it with me still, in my body, as if as a child I had suffered from polio, and now must go through the world with a slight recurvatum in my gait, always announcing that because my freedom was delayed, I will never fully be free.

by Justin E.H. Smith, Cabinet |  Read more:
Image: uncredited

Wednesday, March 4, 2020


Geometric Shapes / 200223

The Party Cannot Hold

In early January, as Democratic voters began to focus more intently on the approaching primary season, New York magazine published a profile of Representative Alexandria Ocasio-Cortez. The writer, David Freedlander, spoke with her about the divisions within the Democratic Party, and asked what sort of role she envisioned for herself in a possible Joe Biden presidency. “Oh, God,” Ocasio-Cortez replied (“with a groan,” Freedlander noted). “In any other country, Joe Biden and I would not be in the same party, but in America, we are.”

This was in some respects an impolitic, even impolite, thing for the first-term politician to say. AOC, a democratic socialist, had endorsed Bernie Sanders the previous October, so it was no secret where her loyalties lay. Still, Biden was at that point the clear front-runner for the presidential nomination, and freshman members of Congress don’t usually make disparaging remarks about their party’s front-runner. Her comment thus carried a considerable charge—a suggestion that if Biden were the nominee, this luminary and her 6.3 million Twitter followers might not just placidly go along.

And yet, she is correct. In a parliamentary system, Biden would be in the main center-left party and AOC in a smaller, left-wing party. So her comment was an accurate description of an oddity of American politics that has endured since just before the Civil War—the existence of our two large-tent parties battling for primacy against each other, but often battling within themselves. (...)

The current divide is not about one war. It is about capitalism—whether it can be reformed and remade to create the kind of broad prosperity the country once knew, but without the sexism and racism of the postwar period, as liberals hope; or whether corporate power is now so great that we are simply beyond that, as the younger socialists would argue, and more radical surgery is called for. Further, it’s about who holds power in the Democratic Party, and the real and perceived ways in which the Democrats of the last thirty years or so have failed to challenge that power. These questions are not easily resolved, so this internal conflict is likely to last for some time and grow very bitter indeed. If Sanders wins the nomination, he will presumably try to unify the party behind his movement—but many in the party establishment will be reluctant to join, and a substantial number of his most fervent supporters wouldn’t welcome them anyway. It does not seem to me too alarmist to wonder if the Democrats can survive all this; if 2020 will be to the Democrats as 1852 was to the Whigs—a schismatic turning point that proved that the divisions were beyond bridging.

When did it begin, this split in the Democratic Party over these most basic questions of our political economy? One could trace it back to William Jennings Bryan and the Free Silver Movement (an early rebellion against the eastern bankers), or perhaps even earlier. But if pressed to name a modern starting point, I would choose the mid-1980s: the crushing 1984 defeat of Walter Mondale, and Al From’s creation the next year of the Democratic Leadership Council, which was founded to move the party away from statism and unions and toward positions friendlier to the free market. Mondale was the last old-fashioned Keynesian to capture the Democratic nomination. Ever since, the party’s nominees have offered, to one degree or another, hybrids of Keynesianism and neoliberalism.

Bill Clinton, the 1992 nominee, probably tilted more toward neoliberalism than any other Democrat, although wholesale dismissals of him as a neoliberal sellout aren’t fair or accurate. People forget, for example, that he rolled the dice on government shutdowns in 1995 and 1996 because he refused to sign a budget Newt Gingrich and Bob Dole pressed on him with enormous domestic spending cuts. It was by no means a given when the first shutdown started that he would win that fight politically (which he did, even if he lost in another way, because of the intern he met who brought him pizza while the White House staff was furloughed). Clinton was a Keynesian at times, but in broad strokes, on trade and financial deregulation, he pushed the Democrats much closer to that then-aborning creature, the global financial elite.

Like Clinton, Al Gore had been a “New Democrat,” as the more centrist Democrats of the day called themselves, most of his career, but as the nominee in 2000, he tried on both suits. I was at the convention in Los Angeles for his surprisingly high-octane, populist speech announcing that his campaign would rest on the idea of “the people versus the powerful.” But over the next few weeks, the powerful must have started calling. Gore toned that rhetoric down. We never got to see him govern, of course, as he won the election by 500,000 votes but lost it by one at the Supreme Court. John Kerry continued in a similar style in 2004. He proposed new health care and jobs spending, to be paid for by rescinding the Bush tax cuts. He also pledged to cut the deficit in half in four years. But the 2004 election turned more on national security—Iraq and the September 11 attacks—than the economy, and he narrowly lost.

None of these candidates really had to worry about “the left.” It certainly existed. There was a fairly robust movement against free trade, backed by the labor unions, though it never succeeded in nominating a president. And there were numerous columnists and policy intellectuals who protested every time a Democratic president or congressional leader emphasized the importance of deficit reduction, or otherwise embraced austerity. But electorally, Democrats could get by just paying occasional lip service to the economic left.

Then came the meltdown of 2008 and the Great Recession. As thrilled as millions were by Barack Obama’s election victory, the activists and intellectuals who cared most about breaking the neoliberal grip on the party were appalled by his appointments of Tim Geithner, Larry Summers, and Rahm Emanuel (not an economic adviser per se but a brutish enforcer of centrist orthodoxy), among others. To be fair, Obama had never done anything to indicate, on the campaign trail or in his short career, that he would govern as a left populist. Adam Tooze, in Crashed, his authoritative book on the financial crisis, notes that in April 2006, Senator Obama was selected for the rare privilege of speaking at the founding meeting of the Hamilton Project, a group of centrist economists brought together by Robert Rubin, Clinton’s Treasury secretary and the bête noire of the left populists. Presidential ambitions no doubt on his mind during this important audition, he carefully walked the Keynes-neoliberal line: he reminded his audience of the people the global economy had left behind in Illinois towns like Decatur and Galesburg, yet he also nodded toward two Hamilton Project priorities when he spoke of “keep[ing] the deficit low” and keeping US debt low and “out of the hands of foreign nations.”

In the early years of Obama’s presidency, the only anger most of the media noticed emanated from the right, in the form of the Tea Party movement, supported financially by figures like the Koch brothers and promoted by the Fox News Channel. The angry left, lacking such resources, was less visible, but it was always there. It found its avatar in Elizabeth Warren, named by then Senate majority leader Harry Reid to chair a congressional oversight panel on emergency economic relief. It was from this perch that she became such a thorn in Geithner’s, and Obama’s, side—and such a star of the progressive left.

Outside of official Washington circles, the impatience, and the insurgency, were building, especially among young people born since about the early 1990s. They had grown up under a capitalism very different from the one Baby Boomers experienced; they’d seen a rigged game all their adult lives—a weak job market and heavy college debt for them, more and more riches for the one percent, and no one seeming to do anything about it. In 2010 a young leftist named Bhaskar Sunkara started Jacobin, a socialist journal that became an immediate surprise success. The next year, the Occupy Wall Street demonstrations began, making it clear that anger was real and widespread, and eventually having a strong influence on debate within the Democratic Party. The Democratic Socialists of America, founded in 1982, saw its membership rise from 6,000 in 2016 to 40,000 in 2018. Two other movements of the left, while not mainly concerned with economics, became potent political forces—the Black Lives Matter movement, founded in 2013, and the movement seeking permanent legal status for the so-called Dreamers, undocumented immigrants who came to the United States as children.

All this activity might have remained inchoate had Sanders not decided to run for president against Hillary Clinton in 2016 (he deferred at first to Warren, who declined to run). Sanders had been inveighing against the banks and rigged political system in exactly the same language for years, but his general ineffectiveness on Capitol Hill, and his comprehensive lack of interest in schmoozing, reduced him to background noise as far as most of Washington was concerned. Now, however, people were coming out by the tens of thousands to hear him speak bluntly about the banks and the billionaires in a way Clinton never would have. And he gave this movement a figurehead, a cynosure around which to rally; his conveniently uncommon first name seemed to dance joyfully out of his supporters’ mouths.

There is no harsher spotlight in the world than the one shone on major-party candidates for president of the United States, and he handled it with a skill that not everyone thrust into that position could. His critics—and I have been one, especially when I felt in 2016 that he attacked Clinton too viciously for too long, well after he was mathematically eliminated—cannot deny him that. Whatever happens with this nominating process and election, he has gone from being an afterthought backbencher to a historical figure.

To what extent was all this left-wing anger at mainstream Democrats justified? It’s a complicated question. The left was correct that Obama could have been far more aggressive on mortgage rescues and penalties imposed on the banks that brought on the financial crisis, as well as in its criticisms (which I joined) of Obama’s lamentable embrace of deficit reduction. It is also correct that Democrats have, since the 1990s, gotten themselves far too indebted to certain donor groups, notably Wall Street and the tech industry.

Yet the left, in its critiques, sometimes acts as if Republicans don’t exist and have no say in political outcomes. Leftists tend to interpret the policy failures of the Obama era as a function of his own lack of will, or his reliance on corporate interests, rather than what they more often were, in my view—a reflection of the facts that in the Senate, a unified and dug-in minority can thwart a majority, and even a majority can pass legislation only as progressive as the sixtieth senator will allow because of the super-majority voting rules. I recall several conversations with administration officials who had worked for months on certain policy matters but who knew that the ideas would never get through the Senate. And presidents just don’t have endless political capital.

I’ve always found this a useful heuristic: imagine Obama in his first term with LBJ-like majorities in Congress, sixty-eight senators and nearly three hundred House members. What would he have passed? It’s useful because our answers define the limits of mainstream liberalism—what it would be willing to push for, and the interests it would be hesitant to take on.

by Michael Tomasky, NYRB |  Read more:
Image: Tom Bachtell
[ed. Sorry for all the politics lately, but it and the virus are saturating the news.]

The Great Wall Street Housing Grab

One of Ellingwood’s goals had always been to buy a house by the time he turned 30 — a birthday that unceremoniously came and went six months earlier. When Ellingwood began speaking to lenders, he realized he could easily get a loan, even two; this was the height of the bubble, when mortgage brokers were keen to generate mortgages, even risky ones, because the debt was being bundled together, securitized and spun into a dizzying array of bonds for a hefty profit. The house was $840,000. He put down $15,000 and sank the rest of his savings into a $250,000 bedroom addition and kitchen remodel, reasoning that this would increase the home’s value.
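
[ed. The “bundled together, securitized and spun into a dizzying array of bonds” step is, mechanically, a payment waterfall: pooled mortgage collections are paid out in order of seniority, so the senior bonds can look safe even when the underlying loans are not. A minimal sketch — the numbers, names, and function here are invented for illustration, not taken from the article:

def waterfall(collected, senior_due, junior_due):
    # One period's pooled mortgage collections, paid out in order of
    # seniority: senior bondholders are made whole first, and the
    # junior tranche absorbs any shortfall.
    senior_paid = min(collected, senior_due)
    junior_paid = min(collected - senior_paid, junior_due)
    shortfall = senior_due + junior_due - senior_paid - junior_paid
    return senior_paid, junior_paid, shortfall

# A $100 pool split into an $80 senior and a $20 junior tranche.
# If defaults cut one period's collections to $90, the senior bond
# is still paid in full; the junior tranche eats the whole $10 loss.
print(waterfall(90.0, 80.0, 20.0))  # -> (80.0, 10.0, 10.0)

That asymmetry is why originators could stay “keen to generate mortgages, even risky ones”: the risk was sliced up and sold down the waterfall.]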

Suddenly adulthood was upon him. He married on New Year’s Eve, and his wife gave birth to their first child, a son, in April. When his 88-year-old grandfather, an emeritus professor of electrical engineering at the University of Houston, had a bad fall, Ellingwood urged him to move into the house for sale just across his backyard. The grandfather bought the house with his daughter, Ellingwood’s mother, and the first thing they did was tear down the fence between the two properties, creating one big family compound. In 2009, Ellingwood’s older sister bought a house around the corner.

But shortly after the birth of Ellingwood’s second son, in June 2010, his marriage fell apart. He and his wife each sued for sole custody. To pay his lawyer, he planned to refinance his house, and his grandfather advanced him his inheritance. By 2012, Ellingwood had paid his lawyer more than $80,000, and in the chaos of fighting for his children, he stopped making his mortgage payments. He consulted with several professionals, who urged him to file for bankruptcy protection so that he could get an automatic stay preventing the sale of his house.

    In May 2012, Ellingwood was driving his two boys to the beach, desperate to make the most of his limited time with them, when he got a call. He pulled over and, with cars whizzing by and his boys babbling excitedly in the back seat, learned that he had lost his house. He had dispatched a friend to stop the auction with a check for $27,000 — the amount he was behind on his mortgage — but there was nothing to be done. Because Ellingwood had begun to file for bankruptcy and then didn’t go through with it, a lien had been put on his house (his “vortex of love,” as he called it) that precluded him from settling his debt. The house sold within a couple of minutes for $486,000, which was $325,000 less than what he owed on it.

    In the months after, though, Ellingwood was graced with what seemed like a bit of luck. The company that bought his home offered to sell it back to him for $100,000 more than it paid to acquire it. He told the company, Strategic Acquisitions, that he just needed a little time to get together a down payment. In the meantime, the company asked him to sign a two-page rental agreement with a two-page addendum.

    It was clear from the beginning that there was something a little unusual about his new landlords. Instead of mailing his rent checks to a management company, men would swing by to pick them up. Within a few months, Ellingwood noticed that one of the checks he had written for $2,000 wasn’t accounted for on his rental ledger, though it had been cashed. He called and emailed and texted to resolve the problem, and finally emailed to say that he wouldn’t pay more rent until the company could explain where his $2,000 went. For more than three months, he withheld rent, waiting for a response. Instead, the company posted an eviction notice to his door. (...)

    Wall Street’s latest real estate grab has ballooned to roughly $60 billion, representing hundreds of thousands of properties. In some communities, it has fundamentally altered housing ecosystems in ways we’re only now beginning to understand, fueling a housing recovery without a homeowner recovery. “That’s the big downside,” says Daniel Immergluck, a professor of urban studies at Georgia State University. “During one of the greatest recoveries of land value in the history of the country, from 2010 and 2011 at the bottom of the crisis to now, we’ve seen huge gains in property values, especially in suburbs, and instead of that accruing to many moderate-income and middle-income homeowners, many of whom were pushed out of the homeownership market during the crisis, that land value has accrued to these big companies and their shareholders.”

    Before 2010, institutional landlords didn’t exist in the single-family-rental market; now there are 25 to 30 of them, according to Amherst Capital, a real estate investment firm. From 2007 to 2011, 4.7 million households lost homes to foreclosure, and a million more to short sale. Private-equity firms developed new ways to secure credit, enabling them to leverage their equity and acquire an astonishing number of homes. The housing crisis peaked in California first; inventory there promised to be some of the most lucrative. But the Sun Belt and Sand Belt were full of opportunities, too. Homes could be scooped up by the dozen in Phoenix, Atlanta, Las Vegas, Sacramento, Miami, Charlotte, Los Angeles, Denver — places with an abundance of cheap housing stock and high employment and rental demand. “Strike zones,” as Fred Tuomi, the chief executive of Colony Starwood Homes, would later describe them.

    Jade Rahmani, one of the first analysts to write about this trend, started going to single-family-rental industry networking events in Phoenix and Miami in 2011 and 2012. “They were these euphoric conferences with all of these individual investors,” he told me — solo entrepreneurs who could afford a house but not an apartment complex, or perhaps a small group of doctors or dentists — “representing small pools of capital that they had put together, loans from regional banks, and they were buying homes as early as 2010, 2011.” But in later years, he said, the balance began to shift: Individual and smaller investor groups still made up, say, 80 percent of the attendees, but the other 20 percent were very visible institutional investors, usually subsidiaries of large private-equity firms. Jonathan D. Gray, the head of real estate at Blackstone, one of the world’s largest private-equity firms and the one with the strongest real estate holdings, thought he could “professionalize” the fragmented single-family-rental market and partnered with a British property-investment firm, Regis Group P.L.C., as well as a local Phoenix company, Treehouse Group. Blackstone “would show up with teams of people and would look for portfolio acquisitions,” recalled Rahmani, who works for the firm Keefe, Bruyette & Woods, known as K.B.W. (K.B.W. sold some shares of Invitation Homes during its public offering.)

    Throughout the country, the firms created special real estate investment trusts, or REITs, to pool funds to buy bundles of foreclosed properties. A REIT enables investors to buy shares of real estate in much the same way that they buy shares of corporate stocks. REITs typically target office buildings, warehouses, multifamily apartment buildings and other centralized properties that are easy to manage. But after the crash, the unprecedented supply of cheap housing in good neighborhoods made corporate single-family home management feasible for the first time. The REITs were funded with money from all over the world. An investment company in Qatar, the Korea Exchange Bank on behalf of the country’s national pension, shell companies in California, the Cayman Islands and the British Virgin Islands — all contributed to Colony American Homes. Columbia University and G.I. Partners (on behalf of the California Public Employees’ Retirement System) invested $25 million and $250 million, respectively, in the REIT Waypoint Homes. By the middle of 2013, private-equity companies had raised or spent nearly $20 billion on single-family real estate, and more than 100,000 homes were in the hands of institutional investors. Blackstone’s Invitation Homes REIT accounted for half of that spending. Today, the number of homes is roughly 260,000, according to Amherst Capital. (...)

    Landlords can be rapacious creatures, but this new breed of private-equity landlord has proved itself to be particularly so, many experts say. That’s partly because of the imperative for growth: Private-equity firms chase double-digit returns within 10 years. To get that, they need credit: The more borrowed, the higher the returns.

    by Francesca Mari, NY Times | Read more:
    Image: Nix + Gerber Studio for The New York Times
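    [ed. That closing line about credit ("The more borrowed, the higher the returns") is just arithmetic. Here's a minimal back-of-the-envelope sketch in Python, with purely hypothetical numbers and ignoring interest, fees, taxes, and vacancies, of how leverage amplifies the return on an investor's own equity:]

        # Back-of-the-envelope: how leverage amplifies equity returns.
        # All figures below are hypothetical, not data from the article.

        def equity_return(price, down_pct, appreciation_pct):
            """Return on the investor's own cash after the asset changes in value.

            Ignores interest, fees, taxes, and rent; the point is only that
            gains (and losses) on the whole asset accrue to a thin slice of equity.
            """
            equity = price * down_pct            # investor's own cash in the deal
            gain = price * appreciation_pct      # change in value of the whole house
            return gain / equity                 # return on the equity alone

        price = 200_000  # hypothetical purchase price

        # All-cash buyer: a 10% rise in value is a 10% return.
        print(f"all cash, +10%: {equity_return(price, 1.00, 0.10):.0%}")    # 10%

        # Leveraged buyer: 25% down, so the same 10% rise is a 40% return on equity.
        print(f"25% down, +10%: {equity_return(price, 0.25, 0.10):.0%}")    # 40%

        # Leverage cuts both ways: a 10% drop is a 40% loss on that equity.
        print(f"25% down, -10%: {equity_return(price, 0.25, -0.10):.0%}")   # -40%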

    Bob Dylan


    [ed. I've posted this before but couldn't remember the song. Love this video. Just a tight shot of five guys, no theatrics. Dylan with that just-got-out-of-a-mental-institution vibe, like he's on lithium or something. He never acknowledges the camera (except once, after turning around to say... nothing? to his guitarist at 3:40), wears his cowboy hat backwards... and they rock! Who says Dylan doesn't have a sense of humor?]