Tuesday, June 30, 2020

Seattle Pride (Revisited)


Repost: Seattle Pride (photos from 2015)
Images: markk
[ed. Life used to be so much fun when we actually had something called Culture. See also: Virtual Seattle Pride Fest Has Businesses Yearning for Years Past (Bloomberg).]

Private Gain, Public Loss

Putting public services in private hands is bad economics. Worse, it undermines our bonds as a political community

Barbara and Mary are happily married. Barbara wants to buy Mary a new ring for her birthday. The problem is that Barbara knows nothing about jewellery. Fortunately, their neighbour John is an expert. Barbara can ask John to select the ring, or she could invest the time and energy to learn about gemstones and alloys herself, and choose the ring on the basis of her own judgment. What should Barbara do?

One answer might be that, given John’s expertise, Barbara should delegate the power to select the jewellery to John. After all, he’ll probably make a better choice. Another equally compelling answer, though, is that Mary might care not only about the beauty and quality of the chosen ring, but also about who selected it. Mary might want Barbara to engage in the task of buying the ring, even if Barbara’s choice is ultimately inferior to John’s.

This little fable illustrates something that’s often missed in debates about a very different subject: privatisation. The project of selling state or public assets to be owned or run by private businesses has always been controversial. What characterises the controversy, though, is that both advocates and opponents tend to cast it in instrumental terms. That is, the identity of the body or entity doesn’t matter in and of itself; what matters is whether or not they achieve a good outcome or do a better job. Whether or not something should be privatised, then, appears to depend on who is more likely to make the right decisions for the right ends. What’s more, the mainstream conversation about privatisation assumes that civil servants and public institutions are mere tools, more or less, for making these decisions.

But that view is shortsighted. We don’t just care about what the decision is and whether a decision is right, just, efficient or good. We also care about who makes the decision. As the story of Barbara, Mary and John shows, we often feel strongly not just about which ring gets chosen, but also about who chooses the ring. Similarly, a public institution differs from a private one not only in the quality or justness of the outcome, but also because decisions made by a public body are attributable to citizens – as a matter of fact, they are the decisions of the citizens. Only a public agent can speak in our name. So mass privatisation doesn’t simply shift decision-making away from public institutions to unaccountable, private entities; it also undermines shared civic responsibility and the very existence of collective political will.

At the level of the state, the equivalent of allowing John to choose Mary’s ring could theoretically be a nation in which all parks, public museums, prisons, forests, health services and other institutions are private. In such a world, citizens couldn’t really affect how these services operate. What’s more, they’d be unlikely to feel responsible for these institutions – to sense that these institutions are theirs. They might have grudges or complaints but, ultimately, the power of making decisions would rest with the private owner of the service or institution. In this scenario, people would lose their very sense of belonging to a political unit, whose future they should control through collective effort. (...)

When private actors take over state functions, they’re authorised to act for reasons beyond the public interest – to make money for themselves, for example, or to improve the value of their stock for shareholders. Within the boundaries set by law, private entities don’t need to defer to the state. Indeed, privatisation presupposes that companies and other private entities are empowered to act in their own interests, within the terms of their contract, to perform a service for citizens. In the absence of this power there would be no difference between private and public entities. By its very nature, then, privatisation deprives the public of control. By privatising the provision of a good or a service, the state distances itself from the activity, or, at least, from the decisions of a company (or another private entity) acting within the limits set by law. In contrast, by acting as a unified public – by using civil servants to perform certain tasks – citizens remain responsible, and are more likely to regard the acts of the political community as their own.

On this view, privatisation undermines an important dimension of our moral practices: the taking of responsibility on the part of citizens. In particular, privatisation downplays the political dimension of responsibility by absolving citizens of their duty to be involved in important choices. This doesn’t rely on a particular view of human psychology, but stems from the fact that being involved in political decisions is part of what facilitates collective responsibility. Spheres of activity that are privatised are excluded from this collective undertaking and are hidden away behind a corporate veil; they become the exclusive business of the private entity tasked with making the decision.

by Alon Harel, Aeon | Read more:
Image: John Moore/Getty

The Great Filter

When water was discovered on Mars, people got very excited. Where there is water, there may be life. Scientists are planning new missions to study the planet up close. NASA’s next Mars rover is scheduled to arrive in 2010. In the decade following, a Mars Sample Return mission might be launched, which would use robotic systems to collect samples of Martian rocks, soils, and atmosphere, and return them to Earth. We could then analyze the sample to see if it contains any traces of life, whether extinct or still active. Such a discovery would be of tremendous scientific significance. What could be more fascinating than discovering life that had evolved entirely independently of life here on Earth? Many people would also find it heartening to learn that we are not entirely alone in this vast cold cosmos.

But I hope that our Mars probes will discover nothing. It would be good news if we find Mars to be completely sterile. Dead rocks and lifeless sands would lift my spirit.

Conversely, if we discovered traces of some simple extinct life form—some bacteria, some algae—it would be bad news. If we found fossils of something more advanced, perhaps something looking like the remnants of a trilobite or even the skeleton of a small mammal, it would be very bad news. The more complex the life we found, the more depressing the news of its existence would be. Scientifically interesting, certainly, but a bad omen for the future of the human race.

How do I arrive at this conclusion? I begin by reflecting on a well-known fact. UFO-spotters, Raelian cultists, and self-certified alien abductees notwithstanding, humans have, to date, seen no sign of any extraterrestrial intelligent civilization. We have not received any visitors from space, nor have our radio telescopes detected any signals transmitted by any extraterrestrial civilization. The Search for Extra-Terrestrial Intelligent Life (SETI) has been going for nearly fifty years, employing increasingly powerful telescopes and data mining techniques, and has so far consistently corroborated the null hypothesis. As best we have been able to determine, the night sky is empty and silent—the question “Where are they?” thus being at least as pertinent today as it was when Enrico Fermi first posed it during a lunch discussion with some of his physicist colleagues back in 1950.

Here is another fact: There are on the order of 100 billion stars in our galaxy alone, and the observable universe contains on the order of 100 billion galaxies. In the last couple of decades, we have learnt that many of these stars have planets circling around them. By now, several hundred exoplanets have been discovered. Most of these are gigantic, but this is due to a selection effect: It is very difficult to detect smaller exoplanets with current observation methods. (In most cases, the planets cannot be directly observed. Their existence is inferred from their gravitational influence on their parent sun, which wobbles slightly when pulled towards a large orbiting planet; or alternatively by a slight fluctuation in their sun’s perceived luminosity which occurs when it is partially eclipsed by the exoplanet.) We have every reason to believe that the observable universe contains vast numbers of solar systems, including many that have planets that are Earth-like at least in the sense of having a mass and temperature similar to those of our own orb. We also know that many of these solar systems are much older than ours.
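[ed. A quick aside on why that selection effect is so strong (my numbers and formula, not the essay's): in the transit method, the measured dimming of the star scales with the square of the planet-to-star radius ratio,

  \[
    \frac{\Delta F}{F} \;\approx\; \left(\frac{R_p}{R_\star}\right)^{2},
  \]

so a Jupiter-sized planet crossing a Sun-like star blocks roughly (0.1)^2 = 1% of its light, while an Earth-sized planet blocks only about 0.01%, which is far harder to pick out and helps explain why the early catalogues skewed toward giants.]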

From these two facts it follows that there exists a “Great Filter”. The Great Filter can be thought of as a probability barrier. It consists of one or more highly improbable evolutionary transitions or steps whose occurrence is required in order for an Earth-like planet to produce an intelligent civilization of a type that would be visible to us with our current observation technology. You start with billions and billions of potential germination points for life, and you end up with a sum total of zero extraterrestrial civilizations that we can observe. The Great Filter must therefore be powerful enough—which is to say, the critical steps must be improbable enough—that even with many billions of rolls of the dice, one ends up with nothing: no aliens, no spacecraft, no signals, at least none that we can detect in our neck of the woods. Now, an important question for us is, just where might this Great Filter be located? There are two basic possibilities: It might be behind us, somewhere in our distant past. Or it might be ahead of us, somewhere in the millennia or decades to come. Let us ponder these possibilities in turn.
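[ed. To put the "probability barrier" in rough numbers (a back-of-the-envelope sketch in my own notation, not Bostrom's): suppose there are N candidate germination points and each one independently yields a civilization detectable by us with probability q. Then the expected number of detectable civilizations is

  \[
    \mathbb{E}[\text{detectable civilizations}] \;=\; N\,q,
    \qquad
    q \;=\; p_1 p_2 \cdots p_k ,
  \]

where the p_i are the success probabilities of the k individual Filter steps. Using the essay's own figures of roughly 10^11 stars per galaxy and 10^11 galaxies, N is on the order of 10^22, and observing zero civilizations is only unsurprising if Nq is well below 1, i.e. if q is well below 10^-22. That in turn forces at least one p_i to be tiny: even nine equally hard steps would each need p_i of roughly 10^(-22/9), a few chances in a thousand.]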

by Nick Bostrom, Future of Humanity Institute, Oxford University |  Read more (pdf):
Image: NASA, ESA, and the Hubble Heritage Team [STScI/AURA]
[ed. From 2008 but still relevant as we seem to be acquiring more objects of Great Filtering potential. Various possibilities include things like nuclear holocaust, bioterrorism, cyberterrorism, uncontrollable AI, climate change, etc. (for a scary list, see: Global Catastrophic Risk). Also, for another completely different possibility, see: The Dark Forest (Duck Soup). Recommended.

By the way, the 9 steps involved in the Great Filter include:

With no evidence of intelligent life other than ourselves, it appears that the process of starting with a star and ending with "advanced explosive lasting life" must be unlikely. This implies that at least one step in this process must be improbable. Robin Hanson's list, while incomplete, describes the following nine steps in an "evolutionary path" that results in the colonization of the observable universe:
  1. The right star system (including organics and potentially habitable planets)
  2. Reproductive molecules (e.g. RNA)
  3. Simple (prokaryotic) single-cell life
  4. Complex (eukaryotic) single-cell life
  5. Sexual reproduction
  6. Multi-cell life
  7. Tool-using animals with intelligence
  8. A civilization advancing toward the potential for a colonization explosion (where we are now)
  9. Colonization explosion
According to the Great Filter hypothesis, at least one of these steps—if the list were complete—must be improbable. If it's not an early step (i.e., in our past), then the implication is that the improbable step lies in our future and our prospects of reaching step 9 (interstellar colonization) are still bleak. If the past steps are likely, then many civilizations would have developed to the current level of the human species. However, none appear to have made it to step 9, or the Milky Way would be full of colonies. So perhaps step 9 is the unlikely one, and the only things that appear likely to keep us from step 9 are some sort of catastrophe, an underestimation of the impact of procrastination as technology increasingly unburdens existence, or resource exhaustion leading to the impossibility of making the step due to consumption of the available resources (for example, highly constrained energy resources). (Wikipedia).]

Monday, June 29, 2020

14 Veteran Touring Artists on Life Without Concerts

With concerts indefinitely shut down thanks to COVID-19, musicians who’ve spent much of their lives on tour are stuck at home this summer, and pondering an uncertain future. “I just don’t even know what is realistic at this moment,” says Cheap Trick’s Tom Petersson.

Buddy Guy

I haven’t picked up the guitar since they canceled me in Arizona almost two months ago. I was born on a farm down in Louisiana, and this is a flashback, because this time of year we were sharecropping in the fields all day. And then we would stay locked in the house, trying to stay home as much as you can. I grew up distancing from people except for the family in my house. Even before I got the chance to make a living playing music, I was driving a tow truck. This is the longest I’ve been home in 50 years, maybe a little longer. I want to get back out there. People are so mad at the world, but when I play music, I see them smiling. I own the largest blues club in the city, they closed that down. Before that, [business] was fine. My next birthday, I’ll be 84, so when you get up in that kinda age, people say “I better go check him out.” I hope they come up with a vaccine, so I can get back out there and let them know I’m alive and well and trying to keep the blues alive. I don’t know what else to do now. I can’t go looking for a bus-driving job.

Stevie Nicks

All we have right now, if you’re home in quarantine, is time, unless you’re taking care of kids. So, really, you could do anything that you’ve been wanting to do your whole life. That’s how I’m trying to look at it. But, even though I didn’t have a tour planned, my brain doesn’t know that yet. My brain is like, “OK, you came off the road, and usually you would be going to rehearse.” It’s still bugging me that I should be getting ready for something, and I’m not. This has never happened to me ever in my life. The second I come off the tour with one career, the phone’s ringing off the hook from the other career, saying, “Are you ready to do something cool?” This is the year I was going to talk to everybody about making my movie and do some recording and meet new people. Well, you’re not going to meet any new people, because you can’t leave your house.

John Fogerty

The coronavirus is so real and so scary and life-threatening. I haven’t seen yet a solution that will work until we get a vaccine. I guess I’m more patient than some. I keep telling my family, if it was lions and tigers roaming out there, you could see that, so that prepares you psychologically, so you realize you don’t want to go out there and be reckless. All of this opening-up talk is pretty scary to me. I’m afraid we’ll probably end up going backwards. And I don’t want to be the guy who contributes to that. You go do a concert with 10,000 people, and then find out afterwards that some of them died? I don’t think any of us will really be ready until after we have a vaccine and people feel safe again. I’m an older person, and a lot of people my age have died. Maybe some other guy thinks it’s a good idea, but I’m not dying for Donald Trump. I’m not dying for the economy. How can you have any kind of a crowd?

Sammy Hagar

I’ll be comfortable playing a show before there’s a vaccine, if it’s declining and seems to be going away. I’m going to make a radical statement here. This is hard to say without stirring somebody up, but truthfully, I’d rather personally get sick and even die, if that’s what it takes. We have to save the world and this country from this economic thing that’s going to kill more people in the long run. I would rather see everyone go back to work. If some of us have to sacrifice on that, OK. I will die for my children and my grandchildren to have a life anywhere close to the life that I had in this wonderful country. That’s just the way that I feel about it. I’m not going to go around spreading the disease. But there may be a time where we have to sacrifice. I mean, how many people die on the Earth every day? I have no idea. I’m sorry to say it, but we all gotta die, man.

David Crosby


I’m not making any money from anywhere and [my house] is in jeopardy. I’m not whining about it, though — it’s what we have to do, or we can’t beat the coronavirus. But I don’t think most people know what it’s done to the music business. It’s everyone that I know. They’re completely out of work, and a lot of them don’t make a lot of money. Everyone is like, “You’re a rock star and you drive in a Cadillac and you burn money.” Bullshit. Ninety percent of us are working people, and our job is gone. I hope I’m back on tour next year, but I’m not sure I’ve got a next year. That’s the thing: I’m almost 80 years old. When you take away my next year, you might have just taken the last one I got. That’s a bitch. I think they are doing the right thing to not have aggregations of people, but don’t kid yourself about the effect. To us? To the musicians? It’s a goddamn disaster.

by Jonathan Bernstein, David Browne, Patrick Doyle, Andy Greene, Kory Grow & Brian Hiatt, Rolling Stone |  Read more:
Image: Buddy Guy by Lyndon French
[ed. And these are successful musicians who at least have some royalties coming in. Musicians as a whole are going to be in bad shape for a long time.]

Employer-Based Health Care, Meet Massive Unemployment

In the early months of 2020, Americans were engaged in the perennial election-year debate over how best to reform the nation’s health care system. As usual, the electorate was torn and confused. Polling indicated that a small majority of likely voters favored a new universal system that would cover everyone. But that support evaporated when it was made clear that any such overhaul would involve abolishing the private insurance market. At the time, nearly 160 million Americans received their health benefits through an employer, and the vast majority of them liked that coverage just fine — maybe not enough to sing about it, but enough to be wary of a potential replacement.

Then came the pandemic of the century. And the highest level of unemployment since the Great Depression. And the most concentrated wave of job loss in the nation’s history — more than 40 million Americans filed new unemployment claims between mid-March and late May. It will take time to ascertain the full impact of those losses on the nation’s health insurance rate, but an early survey from the Commonwealth Fund is not encouraging: 41 percent of those who lost a job (or whose spouse lost a job) because of the pandemic relied on that job for health insurance; 20 percent of those people have not managed to secure alternative coverage.

Nothing illuminates the problems with an employer-based health care system quite like massive unemployment in the middle of a highly contagious and potentially deadly disease outbreak. For one thing, uninsured people are less likely to seek medical care, making this coronavirus that much more difficult to contain. Also, people with chronic or immune-compromising medical conditions are particularly susceptible to this new contagion — which means the people most in need of employer-sponsored health benefits are the same ones who can least afford to return to work at the moment.

“The pandemic has amplified all the vulnerabilities in our health care system,” says Drew Altman, president of the nonpartisan Kaiser Family Foundation, including “the uninsured, racial disparities, the crisis of unmanaged chronic conditions and the general lack of national planning.”

As dire as the crisis is, though, it’s also an opportunity to look at health care reform with fresh eyes — and to maybe, finally, rebuild the nation’s health care system in a way that works for all Americans, not just the wealthy and the well employed.

The first step will be acknowledging the problems of our current system. If American health care were its own country, it would be the fourth largest in the world by gross domestic product. The nation spends an average of $3.5 trillion per year on health care — more than Japan, Germany, France, China, the United Kingdom, Italy, Canada, Brazil, Spain and Australia combined — and still loses more people to preventable and treatable medical conditions than any of those countries do.

In other words, America has created the most expensive, least effective health care system in the modern world, and the most vulnerable Americans have been paying for that failure with their lives since long before the coronavirus came to town.

In many ways, of course, that system is no system at all. It’s a patchwork in which access to care depends on a roster of factors, including age, employment status and state of residence. It’s a free-for-all in which the prices of life-or-death essentials like insulin and heart surgery are set at whatever the market will bear, and efforts to check those prices are routinely bludgeoned by interest groups that hold enormous sway over lawmakers. It’s a labyrinth in which consultants, billing clerks and administrators vastly outnumber medical professionals. And it’s a voracious beast that feeds American households with well-paying jobs, then devours them with insurmountable medical bills — often at their weakest moments. (...)

By 1960, roughly two-thirds of all Americans were insured by their employers, by 1970 health insurance had become big business, and by the 1980s health care costs were soaring. Some of that increase can be attributed to advances in technology that made care more expensive. But a great deal of the spike resulted from what economists refer to as “price insensitivity” and what the rest of us might call obliviousness. “If the insurer is paying, nobody looks at the bill,” says Zack Cooper, a health economist at the Yale School of Medicine. “So you can raise prices as much as you want, and you can create a much more luxurious system overall, to justify it.”

Unencumbered by the demands of a cost-conscious clientele, hospitals ramped up equipment purchases, expanded hospital wings and workforces, created specialty clinics — and then increased their reimbursement rates to pay for it all. Rather than scrutinize those price hikes, which were passed from hospitals to insurers to customers, employers simply accepted them. And why wouldn’t they? The more generous the insurance package and the nicer the hospitals and clinics, the bigger the tax break for the companies paying the tab. “For employers, it’s essentially the house’s money,” Mr. Cooper says. “But then, for anyone not on that raft of good coverage, it’s enormous costs or nothing.” (...)

To change this system, Americans will have to change their thinking. There is a tendency among workers with good health insurance to see those benefits as something that’s purely earned, through work. But employer-based insurance is heavily subsidized by the federal government. Those subsidies are not much different than the ones granted to low-income Americans through Medicaid and the Affordable Care Act, but through the lens of American politics the latter are frequently derided as an outrageous form of welfare, while the former are accepted as par for the course.

by Jeneen Interlandi, NY Times | Read more:
Image: Illustration by Alicia Tatone; Photographs by Win McNamee/Getty Images

Zander Blom, Untitled 1.672, May 2014
via:

Tom Leighton
via:

Poker and the Psychology of Uncertainty

"You are going to be a gambler?

That’s my grandmother Baba Anya speaking. My last living grandparent. I’ve come to Boston for a family visit, nearly bouncing with excitement at my new project, and she is not impressed. To call her lukewarm would be the understatement of the hour. She has a way of setting her jaw that makes it jut out like it’s about to slice through stone. The chiseled expression of a conquering hero atop a pedestaled horse. A conquering hero—or an angry general. I can feel the full brunt of grandmotherly disappointment gather on my shoulders. She has almost (though not quite) come to forgive me for not wanting kids after over a decade of my persistent explanations, but this—this is a new low. If you think you know the kind of disappointment a five-foot-some-odd 92-year-old is capable of, think again. She was a Soviet-era schoolteacher. She’s had more practice than an army drill sergeant.

She shakes her head.

“Masha,” she says—my Russian nickname. “Masha.” The word is laden with so much sadness, so much regret for the life I’m about to throw away. In a single word, she has managed to convey that I’m on the brink of ruin, about to make a decision so momentously bad that it is beyond comprehension. A Harvard education and this, this, is what I’m choosing to do?

“Masha,” she repeats. “You are going to be a gambler?”

My grandmother’s reaction may be extreme—nothing is quite as personal as your grandchildren heading out to ruin on your watch; you have to throw your body in the breach—but it is far from atypical. In the coming months, I’ll be accused of being responsible for a society-wide “sin slide” for advocating for poker as a teaching tool. I’ll be called a moral degenerate by strangers. A group of highly intelligent people at a retreat will tell me playing poker is all well and good, but how do I feel about encouraging people—children even!—to lie?

The world of poker is laden with misconceptions. And first among them is the very one I’m seeing from a stricken Baba Anya: equating poker with gambling. To my mind, the journey was well motivated: Of course people would understand that poker was an important way to learn about decisionmaking. I mean, think of John von Neumann! One of the great polymaths of the 20th century, father of the computer, one of the inventors of the hydrogen bomb, the creator of game theory. And a poker player! Not just a poker player, but someone for whom poker inspired brilliant insights into human decisionmaking, someone who considered it the ultimate game for approximating the strategic challenges of life. Let’s get to the tables! But looking at Baba Anya, I realize that the battle for support—and the justification for poker as not just a learning tool, but as one of the best tools there is for making decisions that have nothing to do with the game itself—is going to take a bit more fighting. I’m going to be explaining this over and over, so I may as well get it right.

Poker, to the untrained eye, is easy. Just like everyone who meets me seems to have “a book in them,” which they’ll write just as soon as they get a chance, so everyone who meets my coach, Erik Seidel—one of the most legendary poker players in the world—thinks they are just a hop away from becoming a poker pro or, at the very least, a badass poker bro. Most of us underestimate the skill involved. It just seems so simple: get good cards and rake in the dough. Or bluff everyone blind and rake in the dough once more. Either way, you’re raking it in.

And poker does have an element of chance, to be sure—but what doesn’t? Are poker professionals “gamblers” any more than the man signing away his life on a professional football contract, who may or may not be injured the next week, or find himself summarily dropped from the team in a year because he failed to live up to his promise? We judge the poker player for gambling; we respect the stockbroker for doing the same thing with far less information. In some ways, poker players gamble less than most. After all, even if they lose an arm, they can still play.

But the misperception is ingrained in the popular mind for one simple reason. Unlike, say, Go or chess, poker involves betting. And betting involves money. And as soon as that enters the picture, you might as well be playing craps or baccarat—games that truly are gambling. And so I tell my grandmother the words that I’ve come to repeat so often they are like my own private mantra: In poker, you can win with the worst hand and you can lose with the best hand. In every other game in a casino—and in games of perfect information like chess and Go—you simply must have the best of it to win. No other way is possible. And that, in a nutshell, is why poker is a skilled endeavor rather than a gambling one.

Imagine two players at a table. The cards are dealt. Each player must look at her cards and decide whether or not the cards on their own are good enough to bet. If she wishes to play, she must at minimum “call” the big blind—that is, place as much into the pot as the highest bet that already exists. She may also choose to fold (throw out her cards and sit this hand out) or raise (bet more than the big blind). But who knows what factors she’s using to make her decision? Maybe she has a premium hand. Maybe she has a mediocre hand but thinks she can outplay her opponent and so chooses to engage anyway. Maybe she has observed that the other player views her as conservative because she doesn’t play many hands, and she’s taking advantage of that image by opening up with worse cards than normal. Or maybe she’s just bored out of her mind. Her reasoning, like her cards, is known only to her.

The other player observes the action and reacts accordingly: If she bets big, she may have a great hand—or be bluffing with a bad one. If she simply calls, is it because her hand is mediocre or because she’s a generally passive player or because she wants to do what’s known as “slow playing”—masking an excellent hand by playing it in a restrained fashion, as Johnny Chan did in that 1988 World Series of Poker matchup with Erik Seidel? Each decision throws off signals, and the good player must learn to read them. It’s a constant back-and-forth interpretive dance: How do I react to you? How do you react to me? More often than not, it’s not the best hand that wins. It’s the best player. This nuance, this back-and-forth, this is why von Neumann saw the answer to military strategy in the cards. Not because everyone is a gambler, but because to be a winning player, you have to have superior skill, in a very human sense.

Indeed, when the economist Ingo Fiedler analyzed hundreds of thousands of hands played on several online poker sites over a six-month period, he found that the actual best hand won, on average, only 12 percent of the time, and less than a third of hands went to showdown (meaning that players were skillful enough to persuade others to let go of their cards prior to the end of the hand). In mid-stakes games, with blinds of 1/2 and 5/10—that is, where the blind bets two players are forced to pay each round to start the action are $1 and $2, or $5 and $10, respectively—there were some players who were consistent winners, and as stakes went to nosebleed, 50/100 and up, the variability in skill went down significantly. That is, the higher the amount of money for which people played, the greater their actual skill edge. When Chicago economists Steven Levitt and Thomas Miles looked at live play and compared the ROI, or return on investment, for two groups of players at the 2010 WSOP, they found that recreational players lost, on average, over 15 percent of their buy-ins (roughly $400), while professionals won over 30 percent (roughly $1,200). They write, “The observed differences in ROIs are highly statistically significant and far larger in magnitude than those observed in financial markets where fees charged by the money managers viewed as being most talented can run as high as 3 percent of assets under management and 30 percent of annual returns.” Success in poker, in other words, implies far more skill than does success in that far more respectable profession, investing.
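[ed. A minimal toy model of the dynamic described above (my own illustrative Python, not from the article, and not calibrated to Fiedler's numbers): two players get random hand "strengths," weaker hands fold to pressure more often, and the pot goes to whoever is left standing, so the best starting hand often doesn't take the pot.

  import random

  def play_hand(fold_pressure=0.6):
      """One toy hand: returns (best_hand_won_pot, went_to_showdown)."""
      a, b = random.random(), random.random()               # private hand strengths
      a_folds = random.random() < fold_pressure * (1 - a)   # weaker hands fold more often
      b_folds = random.random() < fold_pressure * (1 - b)
      if a_folds == b_folds:
          # Both stay in (or both give up and check it down): showdown, best hand wins.
          return True, True
      # Exactly one player folds: the remaining player takes the pot,
      # whether or not she actually held the stronger hand.
      remaining, folded = (b, a) if a_folds else (a, b)
      return remaining > folded, False

  trials = 100_000
  results = [play_hand() for _ in range(trials)]
  print(f"pots won by the best starting hand: {sum(r[0] for r in results) / trials:.1%}")
  print(f"hands decided at showdown:          {sum(r[1] for r in results) / trials:.1%}")

Even this crude sketch illustrates the article's point: once folding enters the picture, who wins a pot depends on the players' decisions, not just on the cards.]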

by Maria Konnikova, Wired | Read more:
Image: Penguin Press

Sunday, June 28, 2020

Why is the New York Times Threatening to Reveal Blogger Scott Alexander’s True Identity?

The old adage about online anonymity goes: “On the internet no one knows if you’re a dog.” It hasn’t stood the test of time, not least because it has proved possible, and often easy, to work out not just the species, but the full identity of someone who tries to hide online.

That is what has happened to “Scott Alexander”, the author of the much-loved blog Slate Star Codex, which became a nexus for the rationalist community and others who seek to apply reason to debates about situations, ideas and moral quandaries.

Scott Alexander are the real first and middle names of the author, a psychiatrist based in California, who had kept his full identity secret. However, as he revealed in a post this week, a New York Times tech reporter decided to write about his blog and the community around it, and intended to publish Scott Alexander’s full name. In response, Alexander decided to close down Slate Star Codex, claiming that revealing his identity would undermine his ability to treat his patients, and expose him to death threats, something he said he had already received in small numbers.

The response on Twitter, where many of the blog’s readers often dwell, has been one of outrage. Luminaries such as Steven Pinker described it as a “tragedy on the blogosphere”. Others such as software inventor and investor Paul Graham talked of cancelling their NYT subscriptions. The title’s “threat” has been widely described as “doxxing”, a term more commonly used for posting online the personal details of an individual behind a social media account than publishing someone’s name in a newspaper story.

The NYT has so far been tight-lipped on the matter, saying only in a statement that: “We do not comment on what we may or may not publish in the future. But when we report on newsworthy or influential figures, our goal is always to give readers all the accurate and relevant information we can.”

By most industry standards, the NYT’s rules on the use of anonymous sources are pretty stringent, especially after they were tightened in 2016. The essence is that anonymous sourcing should be used as “a last resort, for situations in which The Times could not otherwise publish information it considers newsworthy and reliable”. The tightening involved a signoff process from senior editors dependent on how important the anonymous sourcing was to the story, in response to complaints that such sources were being used too frequently. Alexander says the reporter told him it was policy to name those involved in a story, and others contacted by the reporter have said that he told them his editor wouldn’t publish the story without the blogger’s true identity.

That policy on anonymity is in many ways a good one. Anonymous sourcing is overused, often protecting paid spokespeople who want to spin without repercussions, or those who want to make allegations without having to substantiate them. Yet those rules are primarily aimed at anonymous people commenting on stories about something other than themselves, not the central subject of a story.

At first glance the most obvious parallel in the UK is with NightJack, the Orwell Prize-winning anonymous police blogger who was exposed by the Times in 2009. The blog was, as lawyer David Allen Green described it for the New Statesman in 2012, an “unflinchingly personal account of front-line police work”.

But there are key differences. Alexander isn’t a whistleblower revealing important information about his profession. The writing he produces could be done without the job (psychiatrist) he is concerned about undermining (a profession that frowns upon its practitioners revealing much about their work, anonymous or not).

It subsequently transpired that the Times had discovered NightJack’s identity via some rudimentary hacking. There is no suggestion that the NYT has committed any crime in discovering Alexander’s real name, and indeed Alexander has admitted that it isn’t hard to identify him. It took me about 30 minutes, and for many it won’t take more than five.

It’s also worth pointing out that Slate Star Codex is not uncontroversial. Its fans describe it as a place for people to disagree with each other reasonably, a space to think about and discuss big topics. But reason can lead down some dark rabbit holes, and prioritising it over pretty much everything else inevitably means that some of what is published offends various communities. That is even more so in the case of those commenting on the site’s posts.

But none of the above means the NYT has any imperative to reveal Alexander’s identity. So why would it?

by Jasper Jackson, New Statesman |  Read more:
Image: Ramin Talaie/Getty Images
[ed. The site is indeed down, and that's a terrible shame. Most people would probably think wtf... who cares? But. Just plug Slate Star Codex into the search function on this blog to see how many informative and interesting subjects Scott has covered over the years. He's one of the most interesting writers on the internet today (and it's not even his day job!). I've been following SSC for years - it's one of the few sites I visit nearly every day - and this makes no obvious sense.] 

'We Could Be Feeling This for the Next Decade’: Virus Hits College Towns

The community around the University of California, Davis, used to have a population of 70,000 and a thriving economy. Rentals were tight. Downtown was jammed. Hotels were booked months in advance for commencement. Students swarmed to the town’s bar crawl, sampling the trio of signature cocktails known on campus as “the Davis Trinity.”

Then came the coronavirus. When the campus closed in March, an estimated 20,000 students and faculty left town.

With them went about a third of the demand for goods and services, from books to bikes to brunches. City officials are expecting most of that demand to stay gone even as the economy reopens.

Fall classes will be mostly remote, the university announced last week, with “reduced density” in dorms. Davis’s incoming vice mayor, Lucas Frerichs, said the city was anticipating “a huge impact” with a majority of the university’s 39,000-plus students still dispersed in September.

For “townies,” rules require congregation to remain limited, too, as confirmed coronavirus cases continue to climb in California. One of the Davis Trinity bars has closed, with no plan to reopen. On a recent Sunday, downtown was filled with “takeout only” signs and half-empty, far-flung cafe tables. Outside the closed theater, a lone busker stood on a corner playing “Swan Lake” on a violin to virtually no one.

Efforts to stem the pandemic have squeezed local economies across the nation, but the threat is starting to look existential in college towns.

Reliant on institutions that once seemed impervious to recession, “town and gown” communities that have evolved around rural campuses — Cornell, Amherst College, Penn State — are confronting not only Covid-19 but also major losses in population, revenue and jobs.

Where business as usual has been tried, punishment has followed: This week, Iowa health authorities reported case spikes among young adults in its two largest college towns, Ames and Iowa City, after the governor allowed bars to reopen. And on campuses across the country, attempts to bring back football teams for preseason practice have resulted in outbreaks.

More than 130 coronavirus cases have been linked to athletic departments at 28 Division 1 universities. At Clemson, at least 23 football players and two coaches have been infected. At Arkansas State University, seven athletes across three teams tested positive. And at the University of Houston, the athletic department stopped off-season workouts after an outbreak was discovered.

Sports are not the only source of outbreaks in college towns. Mississippi officials tied several cases to fraternity rush parties that apparently flouted social distancing rules. In Baton Rouge, La., at least 100 cases were linked to bars in the Tigerland nightlife district near Louisiana State’s campus. And in Manhattan, Kan., home to Kansas State, officials said Wednesday that there had been two recent outbreaks: one on the football team, and another in the Aggieville entertainment district just off campus.

For the cities involved, the prognosis is also daunting. In most college towns, university students, faculty and staff are a primary market. Local economies depend on their numbers and dollars, from sales taxes to football weekends to federal funds determined by the U.S. census.

by Shawn Hubler, NY Times | Read more:
Image: Tommy Ly
[ed. It's possible smaller state universities might survive by retaining more in-state students simply because they're closer to home and a financial bargain compared to so-called elite (and expensive) big colleges. See: Why Some State Universities Are Seeing An Influx (NY Times).]

Saturday, June 27, 2020

The Decline of the American World

“He hated America very deeply,” John le Carré wrote of his fictional Soviet mole, Bill Haydon, in Tinker Tailor Soldier Spy. Haydon had just been unmasked as a double agent at the heart of Britain’s secret service, one whose treachery was motivated by animus, not so much to England but to America. “It’s an aesthetic judgment as much as anything,” Haydon explained, before hastily adding: “Partly a moral one, of course.”

I thought of this as I watched the scenes of protest and violence over the killing of George Floyd spread across the United States and then here in Europe and beyond. The whole thing looked so ugly at first—so full of hate, and violence, and raw, undiluted prejudice against the protesters. The beauty of America seemed to have gone, the optimism and charm and easy informality that entrances so many of us from abroad.

At one level, the ugliness of the moment seems a trite observation to make. And yet it gets to the core of the complicated relationship the rest of the world has with America. In Tinker Tailor, Haydon at first attempts to justify his betrayal with a long political apologia, but, in the end, as he and le Carré’s hero, the master spy George Smiley, both know, the politics are just the shell. The real motivation lies underneath: the aesthetic, the instinct. Haydon—upper class, educated, cultured, European—just could not stand the sight of America. For Haydon and many others like him in the real world, this visceral loathing proved so great that it blinded them to the horrors of the Soviet Union, ones that went far beyond the aesthetic.

Le Carré’s reflections on the motivations of anti-Americanism—bound up, as they are, with his own ambivalent feelings about the United States—are as relevant today as they were in 1974, when the novel was first published. Where there was then Richard Nixon, there is now Donald Trump, a caricature of what the Haydons of this world already despise: brash, grasping, rich, and in charge. In the president and first lady, the burning cities and race divides, the police brutality and poverty, an image of America is beamed out, confirming the prejudices that much of the world already has—while also serving as a useful device to obscure its own injustices, hypocrisies, racism, and ugliness.

It is hard to escape the feeling that this is a uniquely humiliating moment for America. As citizens of the world the United States created, we are accustomed to listening to those who loathe America, admire America, and fear America (sometimes all at the same time). But feeling pity for America? That one is new, even if the schadenfreude is painfully myopic. If it’s the aesthetic that matters, the U.S. today simply doesn’t look like the country that the rest of us should aspire to, envy, or replicate.

Even in previous moments of American vulnerability, Washington reigned supreme. Whatever moral or strategic challenge it faced, there was a sense that its political vibrancy matched its economic and military might, that its system and democratic culture were so deeply rooted that it could always regenerate itself. It was as if the very idea of America mattered, an engine driving it on whatever other glitches existed under the hood. Now, something appears to be changing. America seems mired, its very ability to rebound in question. A new power has emerged on the world stage to challenge American supremacy—China—with a weapon the Soviet Union never possessed: mutually assured economic destruction.

China, unlike the Soviet Union, is able to offer a measure of wealth, vibrancy, and technological advancement—albeit not yet to the same level as the United States—while protected by a silk curtain of Western cultural and linguistic incomprehension. In contrast, if America were a family, it would be the Kardashian clan, living its life in the open glare of a gawping, global public—its comings and goings, flaws and contradictions, there for all to see. Today, from the outside, it looks as if this strange, dysfunctional, but highly successful upstart of a family were suffering a sort of full-scale breakdown; what made that family great is apparently no longer enough to prevent its decline.

The U.S.—uniquely among nations—must suffer the agony of this existential struggle in the company of the rest of us. America’s drama quickly becomes our drama. Driving to meet a friend here in London as the protests first erupted in the States, I passed a teenager in a basketball jersey with jordan 23 emblazoned on the back; I noticed it because my wife and I had been watching The Last Dance on Netflix, a documentary about an American sports team, on an American streaming platform. The friend told me he’d spotted graffiti on his way over: i can’t breathe. In the weeks since, protesters have marched in London, Berlin, Paris, Auckland, and elsewhere in support of Black Lives Matter, reflecting the extraordinary cultural hold the United States continues to have over the rest of the Western world. (...)

To understand how this moment in U.S. history is being seen in the rest of the world, I spoke to more than a dozen senior diplomats, government officials, politicians, and academics from five major European countries, including advisers to two of its most powerful leaders, as well as to the former British Prime Minister Tony Blair. From these conversations, most of which took place on the condition of anonymity to speak freely, a picture emerged in which America’s closest allies are looking on with a kind of stunned incomprehension, unsure of what will happen, what it means, and what they should do, largely bound together with angst and a shared sense, as one influential adviser told me, that America and the West are approaching something of a fin de siècle. “The moment is pregnant,” this adviser said. “We just don’t know what with.” (...)

Those that I spoke to divided their concerns, implicitly or explicitly, into those caused by Trump and those exacerbated by him—between the specific problems of his presidency that, in their view, can be rectified, and those that are structural and much more difficult to solve. Almost everyone I spoke to agreed that the Trump presidency has been a watershed not just for the U.S. but for the world itself: It is something that cannot be undone. Words once said cannot be unsaid; images that are seen are unable to be unseen.

The immediate concern for many of those I interviewed was the apparent hollowing out of American capacity. Lawrence Freedman, a professor of war studies at King’s College London, told me the institutions of American power themselves have been “battered.” The health system is struggling, the municipalities are financially broke, and, beyond the police and the military, little attention is being paid to the health of the state itself. Worst of all, he said, “they don’t know how to fix it.”

by Tom McTague, The Atlantic | Read more:
Image: The Atlantic

Vitamin D Might Help Prevent COVID-19 Infections

Abstract

COVID-19, the disease caused by SARS-CoV-2 (1), was declared a pandemic by the World Health Organization (WHO) in March 2020 (2). While awaiting a vaccine, several antivirals are being used to manage the disease with limited success (3, 4). To expand this arsenal, we screened 4 compound libraries: a United States Food and Drug Administration (FDA) approved drug library, an angiotensin converting enzyme-2 (ACE2) targeted compound library, a flavonoid compound library as well as a natural product library. Of the 121 compounds identified with activity against SARS-CoV-2, 7 were shortlisted for validation. We show for the first time that the active form of Vitamin D, calcitriol, exhibits significant potent activity against SARS-CoV-2. This finding paves the way for consideration of host-directed therapies for ring prophylaxis of contacts of SARS-CoV-2 patients.

by Chee Keng Mok, Yan Ling Ng, Bintou Ahmadou Ahidjo, Regina Ching Hua Lee, Marcus Wing Choy Loe, Jing Liu, Kai Sen Tan, Parveen Kaur, Wee Joo Chng, John Eu-Li Wong, De Yun Wang, Erwei Hao, Xiaotao Hou, Yong Wah Tan, Tze Minn Mak, Cui Lin, Raymond Lin, Paul Tambyah, JiaGang Deng, Justin Jang Hann Chu | bioRxiv | Read more:

[ed. Calcitriol (Vitamin D). Can't hurt I guess (pre-print, not peer reviewed). Here's the full text.]

Thursday, June 25, 2020


The American Nursing Home Is a Design Failure

With luck, either you will grow old or you already have. That is my ambition and probably yours, and yet with each year we succeed in surviving, we all face a crescendo of mockery, disdain, and neglect. Ageism is the most paradoxical form of bigotry. Rather than expressing contempt for others, it lashes out at our own futures. It expresses itself in innumerable ways — in the eagerness to sacrifice the elderly on the altar of the economy, in the willingness to keep them confined while everyone else emerges from their shells, and in a popular culture that sees old age (when it sees it at all) as a purgatory of bingo nights. Stephen Colbert turned the notion of a 75-year-old antifa into a comic riff on geriatric terrorists, replete with images of octogenarians innocently locomoting with walkers, stair lifts, and golf carts.

In Sweden, elderly COVID patients were denied hospitalization, and in some cases palliative care edged over into “active euthanasia,” which seems barely distinguishable from execution. The Wall Street Journal quotes a nurse, Latifa Löfvenberg: “People suffocated, it was horrible to watch. One patient asked me what I was giving him when I gave him the morphine injection, and I lied to him. Many died before their time. It was very, very difficult.”

In this country, we have erected a vast apparatus of last-stop living arrangements that, during the pandemic, have proven remarkably successful at killing the very people they were supposed to care for. The disease that has roared through nursing homes is forcing us to look hard at a system we use to store large populations and recognize that, like prisons and segregated schools, it brings us shame.

The job of housing the old sits at the juncture of social services, the medical establishment, the welfare system, and the real-estate business. Those industries have come together to spawn another, geared mostly to affluent planners-ahead. With enough money and foresight, you can outfit your homes for your changing needs, hire staff, or perhaps sell some property to pay for a move into a deluxe assisted-living facility, a cross between a condo and a hotel with room-service doctors. “I don’t think the industry has pushed itself to advocate for the highly frail or the people needing higher levels of care and support,” USC architecture professor Victor Regnier told an interviewer in 2018. “Many providers are happy to settle for mildly impaired individuals that can afford their services.” In other words, if you’re an old person who’s not too old, not too sick, and not too poor, you’re golden. For everyone else, there are nursing homes.

The nursing-home system is an obsolete mess that emerged out of a bureaucratic misconception. In 1946, Congress passed the Hill-Burton Act, which paid to modernize hospitals that agreed to provide free or low-cost care. In 1954, the law was expanded to cover nursing homes, which consolidated the medicalization of senior care. Federal money summoned a wave of new nursing homes, which were built like hospitals, regulated by public-health authorities, and designed to deliver medical care with maximal efficiency and minimal cost. They reflect, reinforce, and perhaps resulted in, a society that pathologizes old age.

The government sees its mission as preventing the worst outcomes: controlling waste, preventing elder abuse, and minimizing unnecessary death. Traditional nursing homes, with their medical stations and long corridors, are designed for a constantly changing staff to circulate among residents who, ideally, remain inert, confined to beds that take up most of their assigned square footage. As in hospitals, two people share a room and a mini-bathroom with a toilet and a sink. Social life, dining, activities, and exercise are mostly regimented and take place in common areas, where dozens, even hundreds, of residents can get together and swap deadly germs. The whole apparatus is ideally suited to propagating infectious disease. David Grabowski, a professor of health-care policy at Harvard Medical School, and a team of researchers analyzed the spread of COVID-19 in nursing homes, and concluded that it didn’t matter whether they were well or shoddily managed, or if the population was rich or poor; if the virus was circulating outside the doors, staff almost invariably brought it inside. This wasn’t a bad-apples problem; it was systemic dysfunction.

Even when there is no pandemic to worry about, most of these places have pared existence for the long-lived back to its grim essentials. These are places nobody would choose to die. More important, they are places nobody would choose to live. “People ask me, ‘After COVID, is anyone going to want to go into a nursing home ever again?’ The answer is: Nobody ever wanted to go to one,” Grabowski says. And yet 1.6 million people do, mostly because they have no other choice. “If we’d seen a different way, maybe we’d have a different attitude about them,” Grabowski adds.

The fact that we haven’t represents a colossal failure of imagination — worse, it’s the triumph of indifference. “We baby boomers thought we would die without ever getting old,” says David Reingold, the CEO of Riverspring Health, which runs the Hebrew Home in Riverdale. “We upended every other system — suburbia, education, child-rearing, college campuses — but not long-term care. Now the pandemic is forcing us to take care of the design and delivery of long-term care just as the baby boomers are about to overwhelm the system.”

Most of us fantasize about aging in place: dying in the homes we have lived in for decades, with the occasional assist from friends, family, and good-hearted neighbors. The problem is not just that home care can be viciously expensive, or that stairs, bathtubs, and stoves pose new dangers as their owners age. It’s also that, in most places, living alone is deadly. When a longtime suburbanite loses the ability to drive, a car-dependent neighborhood can turn into a verdant prison, stranding the elderly indoors without access to public transit, shops, or even sidewalks. “Social isolation kills people,” Reingold says. “It’s the equivalent of smoking two packs a day. A colleague said something profound: ‘A lot of people are going to die of COVID who never got the coronavirus.’ ”

by Justin Davidson, Intelligencer |  Read more:
Image: C.F. Møller

Wednesday, June 24, 2020

Why Reopening Isn't Enough To Save The Economy

Brooklyn Heights sits across the East River from Lower Manhattan. It's filled with multimillion-dollar brownstones and — usually — Range Rovers, Teslas and BMWs. These days it's easy to find parking. The brownstones are mostly dark at night. The place is a ghost town. And the neighborhood's sushi restaurants, Pilates studios, bistros and wine bars are either closed or mostly empty. It's a microcosm for what has been the driver of the pandemic recession: Rich people have stopped going out, destroying millions of jobs.

That's one of the key insights of a blockbuster study that was dropped late last week by a gang of economists led by Harvard University's Raj Chetty. If you don't know who Chetty is, he's sort of like the Michael Jordan of policy wonks. He's a star economist. He and his colleagues assemble and crunch massive data sets and deliver insights that regularly shift core economic debates about inequality and opportunity. This new study focuses on the economic impact of COVID-19 and the government response. To us nerds, this is like Game 7 of the NBA Finals, and Chetty just swooped in at a crucial moment to drop some threes.

On the day the study came out, Chetty participated in a Zoom webinar sponsored by Princeton University's Bendheim Center for Finance. Dressed in a white-collared shirt with bookshelves as his background, Chetty took us all through the study. The data? Good lord. They've assembled several gigantic new data sets from private companies, including credit and debit card processors and national payroll companies. The data are all freely available online, updated in real time and presented in an easily digestible form. Chetty and his team have crunched it all to give some precise insights about consumer spending, jobs and the geographic impact of the crisis. The study represents an advance for economics as a science, and it has got some bombshells.

First up, consumer spending. Typically, Chetty said, recessions are driven by a drop in spending on durable goods, like refrigerators, automobiles and computers. This recession is different. It's driven primarily by a decline in spending at restaurants, hotels, bars and other service establishments that require in-person contact. We kinda already knew that. But what the team's data show is that this decline in spending is mostly in rich ZIP codes, whose businesses saw a 70% drop-off in their revenue. That compares with a 30% drop in revenue for businesses in poorer ZIP codes.

Second, jobs. This 70% fall in revenue at businesses in rich ZIP codes led those businesses to lay off nearly 70% of their employees. These employees are mostly low-wage workers. Businesses in poorer ZIP codes laid off about 30% of their employees. The bottom line, Chetty said in his presentation, is that "reductions in spending by the rich have led to loss in jobs mostly for low-income individuals working in affluent areas."

Third, the government rescue effort. They find it has mostly failed. The $500 billion Paycheck Protection Program, which has given forgivable loans to businesses with fewer than 500 employees, doesn't appear to have done much to save jobs. When the researchers compare employment trends at businesses with fewer than 500 employees against trends at larger firms, the smaller businesses eligible for PPP show no relative boost after the program went into effect. Meanwhile, the stimulus checks did increase spending, but the money mostly flowed to big companies like Amazon and Walmart rather than to the in-person service businesses in rich ZIP codes hit hardest by the downturn. Overall, they find, the federal rescue package has failed to rescue the businesses and jobs getting hammered most by the pandemic.

Finally, there are state-permitted reopenings: They don't seem to boost the economy either. Chetty and his team compare, for example, Minnesota and Wisconsin. Minnesota allowed reopening weeks before Wisconsin, but if you look at spending patterns in both states, Minnesota did not see any boost compared with Wisconsin after it reopened. "The fundamental reason that people seem to be spending less is not because of state-imposed restrictions," Chetty said. "It's because high-income folks are able to work remotely, are choosing to self-isolate and are being cautious given health concerns. And unless you fundamentally address that concern, I think there's limited capacity to restart the economy."

As long as rich people are scared of the virus, they won't go out and spend money, and workers in the service sector will continue to suffer. Low-income workers — especially those whose jobs focused on providing services in rich urban areas — are in for a period of turbulence. Many of these workers are getting a lifeline in the form of unemployment insurance, but some of these benefits will expire soon if the federal government doesn't act.

by Greg Rosalsky, NPR |  Read more:
Image: Geoff Caddick/AFP via Getty Images
[ed. See also: COVID-19: This is when life will return to normal, according to the experts (World Economic Forum).]

Surfing the Wave of Reopenings

Segway Will Stop Making Its Iconic Self-Balancing Scooter

It’s the end of an unusual era in transportation. Fast Company has learned that the Segway brand will stop producing the Segway PT (Personal Transporter) at its Bedford, New Hampshire plant, where most production has taken place, on July 15th. The move will result in 25 people being laid off, and reflects the long-term struggles of a product that was supposed to revolutionize transportation, but never really took off.

Inventor Dean Kamen launched the Segway PT in December 2001 with promises that it would revolutionize city transport — the self-balancing two-wheeler was supposed to cover the middle ground between walking and driving in a way that bikes couldn’t. However, it never sold in huge numbers, managing just 140,000 units in nearly 20 years. It ultimately found the most use among security teams (immortalized by Paul Blart: Mall Cop) and tourists. Kamen sold the company in 2009, and Chinese mobility firm Ninebot acquired it in 2015.

Company executives were quick to acknowledge that the basic concept had its issues. VP Tony Ho told FC that the classic Segway design was still seen as “very novel” and required a learning curve (as this writer can attest) that kick scooters and other forms of transportation never really did. And as Segway president Judie Cai added, the PT’s design may have been too durable for its own good. Its highly redundant nature may have been great for reliability and safety, but it also meant that a customer might not have to replace their transporter for decades.

by Jon Fingas, Engadget |  Read more:
Image: Robert Alexander/Getty Images

Anyone Can Play Guitar


[ed. One of my new favorite guitar instructors - Adrian Woodward. Nice dry sense of humor and solid lesson technique. I'm working on this one right now. Here's his website: Anyone Can Play Guitar.]

Tuesday, June 23, 2020

The Double Pandemic Of Social Isolation And COVID-19: Cross-Sector Policy Must Address Both

The struggle to balance literal survival with all the things that make surviving worthwhile has never been so clear, with the COVID-19 pandemic forcing many to sacrifice social connections – and therefore quality of life – for life itself. And yet, as I wrote in a recent Health Affairs policy brief, Social Isolation and Health (released June 22, 2020), being socially connected in meaningful ways is actually key to human health and survival.

The COVID-19 pandemic and the need to slow the virus’ spread have highlighted the pervasiveness of social contact within, and social relevance of, nearly every sector of our lives, including employment, education, entertainment, travel, transportation, and recreation. The pandemic has also highlighted the underlying weaknesses of our current social “support systems” for older adults, students, families, workers, and at-risk populations. As such, COVID-19 has underscored the necessity of strengthening the local and federal systems that support the social and emotional needs of the population – a task that will be critical to the nation’s public health recovery from the pandemic.

In this post, I explain why concerns about social isolation are heightened during the pandemic; discuss why policy responses must consider the impact of reduced or changed social connection across all sectors; highlight possible unintended consequences of improved digital connection; and, emphasize the importance of prioritizing social needs in recovery efforts.

Population-Wide Social Isolation Due To COVID-19

While social isolation and loneliness were prevalent in the population prior to COVID-19, efforts to reduce the virus’ spread via stay-at-home orders, quarantine, and social distancing recommendations have exacerbated an already serious problem. For all but “essential workers,” the pandemic has meant restricting in-person contact to the people one lives with. For the 28 percent of Americans who live alone, this has meant little to no human contact for months. Regardless of living situation, interactions with anyone outside the home have been severely limited. Preliminary surveys suggest that within the first month of COVID-19, loneliness increased by 20 to 30 percent, and emotional distress tripled. While several surveys are still ongoing to capture the full extent of the problem, current evidence suggests the pre-existing public health crisis of social isolation and loneliness may be far more widespread than previously estimated. (...)

Social Isolation Carries Long-Term And Immediate Risks To Survival

With a highly infectious and deadly novel virus circulating, why should we care about social isolation and loneliness? As described in my brief, robust evidence links social isolation to increased risk of death from all causes and to higher morbidity across a variety of physical health outcomes. These well-established risks are the result of chronic effects that accumulate over time, so it is understandable that restrictions addressing the immediate risks of the coronavirus were prioritized for public health. Nonetheless, social isolation and loneliness also have immediate, health-relevant effects that should not be ignored.

The increase in distress that many Americans are experiencing due to social distancing is a normal response. Given that humans are a social species, this is our biology signaling a need to reconnect socially, just like hunger signals us to eat and thirst signals us to drink water. Proximity to others, particularly trusted others, signals safety. When we lack proximity to trusted others, our brain and body may respond with a state of heightened alert. This can result in increases in blood pressure, stress hormones, and inflammatory responses, which, if experienced on an ongoing basis, can put us at increased risk for a variety of chronic illnesses. Among those with pre-existing health conditions, these changes in physiology could potentially exacerbate the condition, precipitate the onset of an acute event, or hasten disease progression.

Immediate effects of social isolation related to the pandemic have already been observed, with surges in mental health concerns, substance abuse, and domestic violence. Early observations suggest that problematic health behaviors, including substance use, poorer sleep, and emotional eating or overeating, may increase. Further, more than two million Americans purchased guns during the month of March (the second-highest monthly total in the decades since such records have been kept), raising concerns about an increased risk of suicide. Both short-term and long-term public health concerns will emerge if steps are not taken to mitigate these effects.

Social Contact Is A Key Component Of Every Sector

The pandemic has shown the world how fundamental social contact is in our lives, as almost every aspect of life has changed to create social distance. These social distancing efforts have led to remote working; remote or online education; cancellation of sporting, entertainment, and professional events; and closures of museums, parks, churches, and much more. Going forward, we are likely to see sustained changes to the way we live, work, and play, and even to the way we age. In fact, we have already seen calls for permanent changes in social norms, policies, and physical environments – all of which are social determinants of health. The Centers for Disease Control and Prevention defines the social determinants of health as “the conditions in which people are born, grow, live, work and age, as well as the complex, interrelated social structures and economic systems that shape these conditions.” The substantial changes in our social behavior as a result of this pandemic are clearly far-reaching, but we do not yet know what lingering longer-term public health effects the pandemic may foreshadow. If the prevalence rates of social isolation and loneliness remain elevated or increase, such changes are likely to lead to a greater public health burden in the longer term. (...)

Addressing The Digital And Social Divides

Maintaining connections to others outside the home during the quarantine has increased our reliance upon phones and digital technologies. Of course, reliance on technology was rapidly increasing prior to the pandemic, but with increasing demand for telehealth, telework, and online education, issues of connectivity and the digital divide have been catapulted to the forefront of many policy discussions. Access to the internet is more crucial than ever, but we must pause to ask the bigger question: what is the full scope of consequences that may result from scaling digital capabilities and solutions?

While digital tools have clear benefits, including the capability to provide access to information and resources and to bridge distances, there are potential tradeoffs. It is unclear to what extent digital tools approximate the human experience of in-person contact, or whether our biological needs for human connection can be satisfied through such tools. There is some evidence of a “loneliness paradox” wherein tech and social media that should make us more socially connected actually increase loneliness. In addition, the pandemic has highlighted limitations of video conferencing tools that go beyond Zoom fatigue. For example, anyone who has attended a virtual funeral or wedding, or even just a virtual happy hour, realizes that it may be better than nothing but feels drastically inadequate. According to one survey, these virtual social gatherings failed to reduce loneliness for 48 percent of respondents and actually increased it for 10 percent. Permanent scaling of digital solutions may create a different kind of digital divide, such that human contact becomes a luxury, exacerbating economic disparities. (...)

Social Needs Must Be Prioritized In Pandemic And Recovery Policy

Concerns about the secondary ramifications of the pandemic have focused nearly exclusively on a global economic recession. There should be similar concern about a social recession. Just as an economic recession can have lasting effects even after the economy begins to grow, the social restrictions put in place during the pandemic may have profound long-term consequences, even after those restrictions are lifted.

by Julianne Holt-Lunstad, Health Affairs | Read more:
Image: uncredited
[ed. See also: the following post.]