Repost
Thursday, August 20, 2020
The American Nursing Home Is a Design Failure
With luck, either you will grow old or you already have. That is my ambition and probably yours, and yet with each year we succeed in surviving, we all face a crescendo of mockery, disdain, and neglect. Ageism is the most paradoxical form of bigotry. Rather than expressing contempt for others, it lashes out at our own futures. It expresses itself in innumerable ways — in the eagerness to sacrifice the elderly on the altar of the economy, in the willingness to keep them confined while everyone else emerges from their shells, and in a popular culture that sees old age (when it sees it at all) as a purgatory of bingo nights. Stephen Colbert turned the notion of a 75-year-old antifa into a comic riff on geriatric terrorists, replete with images of octogenarians innocently locomoting with walkers, stair lifts, and golf carts.
In Sweden, elderly COVID patients were denied hospitalization, and in some cases palliative care edged over into “active euthanasia,” which seems barely distinguishable from execution. The Wall Street Journal quotes a nurse, Latifa Löfvenberg: “People suffocated, it was horrible to watch. One patient asked me what I was giving him when I gave him the morphine injection, and I lied to him. Many died before their time. It was very, very difficult.”
In this country, we have erected a vast apparatus of last-stop living arrangements that, during the pandemic, have proven remarkably successful at killing the very people they were supposed to care for. The disease that has roared through nursing homes is forcing us to look hard at a system we use to store large populations and recognize that, like prisons and segregated schools, it brings us shame.
The job of housing the old sits at the juncture of social services, the medical establishment, the welfare system, and the real-estate business. Those industries have come together to spawn another, geared mostly to affluent planners-ahead. With enough money and foresight, you can outfit your homes for your changing needs, hire staff, or perhaps sell some property to pay for a move into a deluxe assisted-living facility, a cross between a condo and a hotel with room-service doctors. “I don’t think the industry has pushed itself to advocate for the highly frail or the people needing higher levels of care and support,” USC architecture professor Victor Regnier told an interviewer in 2018. “Many providers are happy to settle for mildly impaired individuals that can afford their services.” In other words, if you’re a sick, old person who’s not too old, not too sick, and not too poor, you’re golden. For everyone else, there are nursing homes.
The nursing-home system is an obsolete mess that emerged out of a bureaucratic misconception. In 1946, Congress passed the Hill-Burton Act, which paid to modernize hospitals that agreed to provide free or low-cost care. In 1954, the law was expanded to cover nursing homes, which consolidated the medicalization of senior care. Federal money summoned a wave of new nursing homes, which were built like hospitals, regulated by public-health authorities, and designed to deliver medical care with maximal efficiency and minimal cost. They reflect, reinforce, and perhaps helped produce a society that pathologizes old age.
The government sees its mission as preventing the worst outcomes: controlling waste, preventing elder abuse, and minimizing unnecessary death. Traditional nursing homes, with their medical stations and long corridors, are designed for a constantly changing staff to circulate among residents who, ideally, remain inert, confined to beds that take up most of their assigned square footage. As in hospitals, two people share a room and a mini-bathroom with a toilet and a sink. Social life, dining, activities, and exercise are mostly regimented and take place in common areas, where dozens, even hundreds, of residents can get together and swap deadly germs. The whole apparatus is ideally suited to propagating infectious disease. David Grabowski, a professor of health-care policy at Harvard Medical School, and a team of researchers analyzed the spread of COVID-19 in nursing homes, and concluded that it didn’t matter whether they were well or shoddily managed, or if the population was rich or poor; if the virus was circulating outside the doors, staff almost invariably brought it inside. This wasn’t a bad-apples problem; it was systemic dysfunction.
Even when there is no pandemic to worry about, most of these places have pared existence for the long-lived back to its grim essentials. These are places nobody would choose to die. More important, they are places nobody would choose to live. “People ask me, ‘After COVID, is anyone going to want to go into a nursing home ever again?’ The answer is: Nobody ever wanted to go to one,” Grabowski says. And yet 1.5 million people do, mostly because they have no other choice. “If we’d seen a different way, maybe we’d have a different attitude about them,” Grabowski adds.
The fact that we haven’t represents a colossal failure of imagination — worse, it’s the triumph of indifference. “We baby boomers thought we would die without ever getting old,” says Dan Reingold, the CEO of RiverSpring Health, which runs the Hebrew Home in Riverdale. “We upended every other system — suburbia, education, child-rearing, college campuses — but not long-term care. Now the pandemic is forcing us to take care of the design and delivery of long-term care just as the baby boomers are about to overwhelm the system.”
Most of us fantasize about aging in place: dying in the homes we have lived in for decades, with the occasional assist from friends, family, and good-hearted neighbors. The problem is not just that home care can be viciously expensive, or that stairs, bathtubs, and stoves pose new dangers as their owners age. It’s also that, in most places, living alone is deadly. When a longtime suburbanite loses the ability to drive, a car-dependent neighborhood can turn into a verdant prison, stranding the elderly indoors without access to public transit, shops, or even sidewalks. “Social isolation kills people,” Reingold says. “It’s the equivalent of smoking two packs a day. A colleague said something profound: ‘A lot of people are going to die of COVID who never got the coronavirus.’ ”
It’s not as if the only alternative to staying at home is a soul-sapping institution. Back when people traveled for pleasure, tourists regularly visited the Royal Hospital Chelsea in London, where, since the end of the 17th century, veterans have been able to trade in a military pension for a lifelong berth in a soldiers’ collective on an architecturally exquisite campus, located amid some of the city’s most expensive real estate. Those who can work tend the grounds, staff the small museum, and lead tours. When health crises hit, they can move into the care home, which is on the grounds, overlooking immaculate gardens.
The example of an institution so humane that it seems almost wastefully archaic suggests that we don’t need to reinvent the nursing home, only build on humane principles that already succeed.
by Justin Davidson, NY Mag/Intelligencer | Read more:
Image: C.F. Møller
[ed. Personally, I'd prefer an endless supply of good drugs, or something like the euthanasia scene in Soylent Green - Death of Sol (not available on YouTube for some reason).]
Labels:
Architecture,
Business,
Culture,
Design,
Health
Wednesday, August 19, 2020
'One-Shot' Radiotherapy As Good For Breast Cancer As Longer Course
Women with breast cancer who receive one shot of radiotherapy immediately after surgery experience the same benefits as those who have up to 30 doses over three to six weeks, an international medical study has found.
The technique, known as targeted intraoperative radiotherapy, is increasingly being used around the world instead of women having to undergo weeks of painful and debilitating treatment.
Eight out of 10 of the 2,298 participants in the study, women over 45 with early-stage breast cancer who had had surgery to remove a lump of up to 3.5cm, needed no further radiotherapy after having the single dose, researchers on the British-led study found.
The findings are based on results from 32 hospitals in 10 countries including the UK. During the treatment, carried out immediately after a lumpectomy, a ball-shaped device measuring a few centimetres is placed into the area of the breast where the cancer had been and a single dose of radiotherapy is administered. The procedure takes 20 to 30 minutes.
The 80% of patients for whom it works thus avoid going back to hospital between 15 and 30 times over the following weeks to have further sessions of radiotherapy.
by Denis Campbell, The Guardian | Read more:
Image: Rui Vieira/PA

Obama and the Beach House Loopholes
[ed. Magnum P.I.'s old property. Obama P.I.? Just doesn't have the same ring to it.]
A home in the nearby neighborhood of Kailua had served as the winter White House for the Obama family every Christmas, and photographers often captured shots of Obama and Nesbitt strolling on the beach or golfing over the holidays.
The prospective property was located just down the shore in the Native Hawaiian community of Waimanalo. Wedged between the Koʻolau mountains that jut 1,300 feet into the sky and a stunning turquoise ocean, the beachfront estate sprawled across 3 acres, featuring a five-bedroom manse, gatehouse, boat house and tennis courts. Fronting the property was a historic turtle pond that used to feed Hawaiian chiefs. Local families took their children to splash and swim in its calm waters.

But the sellers of the Waimanalo property found a way to ensure the seawall remained in place for another generation. They asked state officials for something called an easement, a real estate tool that allows private property owners to essentially lease the public land that sits under the seawall. The cost: a one-time payment of $61,400. Officials with the state Department of Land and Natural Resources approved the permit, which authorized the wall for another 55 years, and Nesbitt purchased the property.
State officials and community members say the Obamas will be among the future occupants.
The easement paved the way for building permits and allowed developers to exploit other loopholes built into Hawaii’s coastal planning system. Nesbitt went on to win another environmental exemption from local officials and is currently pursuing a third — to expand the seawall. According to building permits, the Obamas’ so-called First Friend is redeveloping the land into a sprawling estate that will include three new single-family homes, two pools and a guard post. The beach fronting the seawall is nearly gone, erased completely at high tide.
Community members are now rallying against the proposed seawall expansion. Some are directing their criticism at Obama, who staked his legacy, in part, on fighting climate change and promoting environmental sustainability.
Obama’s personal office declined to comment, referring inquiries to Nesbitt. And Nesbitt, who declined to be interviewed, would not directly address questions about ownership, only saying that he and his wife bought the land and were “the developers” of the estate.
In written responses to questions, Nesbitt, now chair of the Obama Foundation board and co-CEO of a Chicago-based private-equity firm, said the steps he’s taken to redevelop the property and expand the seawall are “consistent with and informed by the analysis of our consultants, and the laws, regulations and perspectives of the State of Hawaii.” Any damage the structure caused to the Waimanalo beach, he added, occurred decades ago “and is no longer relevant.”
In Hawaii, beaches are a public trust, and the state is constitutionally obligated to preserve and protect them. But across the islands, officials have routinely favored landowners over shorelines, granting exemptions from environmental laws as the state loses its beaches. (...)
Intended to protect homeowners’ existing properties, easements have also helped fuel building along portions of Hawaii’s most treasured coastlines, such as Lanikai on Oahu and west side beaches on Maui. Scores of property owners have renovated homes and condos on the coast while investors have redeveloped waterfront lots into luxury estates. Meanwhile, the seawalls protecting these properties have diminished the shorelines. With nowhere to go, beaches effectively drown as sea levels rise against the walls and waves claw away the sand fronting them, moving it out to sea.
Researchers estimate that roughly a quarter of the beaches on Oahu, Maui and Kauai have already been lost or substantially narrowed because of seawalls over the past century. That has left less coastal habitat for endangered monk seals to haul out and rest and for sea turtles to lay eggs. By midcentury, experts predict, the state will be down to just a handful of healthy beaches as climate change causes sea levels to rise at unprecedented rates. (...)
Beaches and open coastlines have always been central to Hawaii’s way of life. For centuries, Native Hawaiians enjoyed access to the ocean’s life-sustaining resources. Natural sand dunes provided protection against strong storms and served as a place for Native Hawaiians to bury their loved ones.
After Hawaii became a state in 1959, development of homes and hotels along the coastlines exploded as investors sought to capitalize on what was becoming some of the most valuable real estate in the country. An environmental review commissioned by the state in the 1970s found that three-quarters of the state’s sandy coastlines were now hugged by private property, curtailing public access to shorelines. Many property owners erected seawalls to try to hold back the ocean.
By the 1990s, scientists were warning that those seawalls were causing significant beach loss on all the Hawaiian islands.
Alarmed by these losses, state officials in 1997 released a roadmap for protecting the state’s beaches. The report emphasized that the seawalls were destroying coastal ecosystems, threatening the state’s tourist-driven economy and limiting the public’s access to beaches and the ocean, a right enshrined in the Hawaii Constitution.
If beaches continue to disappear throughout the state, the report warned, “the fabric of life in Hawaii will change and the daily miracle of living among these islands will lose its luster.”
by Sophie Cocke, ProPublica/Honolulu Star Advertiser | Read more:
Image: Darryl Oumi, special to Honolulu Star-Advertiser
[ed. How many houses do the Obamas own? Let's see, there's that one in Washington D.C., the recent one in Martha's Vineyard, and wasn't there one in Chicago? I can't keep track. Being ex-president can be a pretty lucrative gig if you protect the status quo.]
Get Ready for a Teacher Shortage Like We’ve Never Seen Before
Usually on the first day back to work after summer break, there’s this buzzing, buoyant energy in the air. My school is a small school-within-a-school designated to serve gifted children, so there are only 16 teachers and staff members. We typically meet in a colleague’s tidy classroom, filled with natural light and the earthy smell of coffee.
We hug, remark on one another’s new haircuts. Sure, there’s an element of sadness about not being able to sleep in or pee on our own schedules anymore, but for the most part, we’re eager to get back to doing work that we believe is the most important work in the world.
Coming back this year was different.
It was Thursday, Aug. 6, the same day that the Houston area reported its new single-day high for deaths from Covid-19. Instead of gathering, we all tuned in to a Zoom meeting from our separate classrooms.
There was no buzz in the air, and we weren’t hugging and chatting. We were talking about how long we had: a few weeks of virtual teaching before students returned to our classrooms on Sept. 8. Or maybe sooner. We’ve been told our start date is subject to change at any time.
We asked about short- vs. long-term disability plans on our insurance. We silently worried about a colleague who has an autoimmune disease. We listened as our counselor, who, along with her daughters, tested positive for the coronavirus the week before, shared how they were doing. We tried not to react from inside each of our little Zoom squares as we began to realize there was no way of maintaining true social distancing when school reopened.
“We’re a family,” one of our administrators kept saying while talking about the measures we would need to take to reduce our and our students’ exposure. “We’re a family.”
I know what he meant — that our tight-knit community would get through this year together — but I kept wondering, “Wouldn’t it be safer for our family to stay home?”
I invite you to recall your worst teacher. Mine was my seventh-grade science teacher, whose pedagogical approach consisted of our reading silently from our textbooks. Once, when I asked if I could do a project on Pompeii, she frowned and said: “This is science class. Your project has to be on a real thing.”
She sent a message loud and clear: “I really, really don’t want to be here.”
We are about to see schools in America filled with these kinds of teachers.
Even before Covid-19, teachers were leaving the profession in droves. According to a report by the Economic Policy Institute, the national teacher shortage is looking dire. Every year, fewer and fewer people want to become teachers.
You would think states would panic upon hearing this. You would think they’d take steps to retain quality teachers and create a competitive system that attracts the best, brightest and most passionate to the profession.
That’s not what they do.
They slash the education budget, which forces districts to cut jobs (increasing class size), put off teacher raises and roll back the quality of teachers’ health care. They ignore teachers’ pleas for buildings without black mold creeping out of ceiling tiles, for sensible gun legislation, and for salaries we can live on without having to pick up two to three additional part-time jobs.
So, a lot of good and talented teachers leave. When state leaders realized they couldn’t actually replace these teachers, they started passing legislation lowering the qualifications, ushering underqualified people into classrooms.
This has been happening for years. We’re about to see it get a lot worse.
by Kelly Treleaven, NY Times | Read more:
Image: Olivia Fields
Tuesday, August 18, 2020
Jack Kirby. From a golden age story reprinted in an early ‘70s “Marvel Premiere” comic
[ed. Living in the bubble]
Deceptively Bright, in an Up & Coming Area
Bunker: Building for the End Times
By Bradley Garrett
What is a bunker? The term derives from an Old Swedish word meaning ‘boards used to protect the cargo of a ship’. But if we take it, as we usually do, to mean a defended structure, often underground, intended to shield people and important goods through a period of strife, then it is one of the oldest building types made by humans. In Cappadocia, central Turkey, there are twenty-two subterranean settlements made by Hittite peoples around 1200 BC. As their empire faltered, the Hittites dug into soft hillsides to shelter themselves. As many as twenty thousand people lived at Derinkuyu, the deepest complex.
But the word ‘bunker’ also has the scent of modernity about it. As Bradley Garrett explains in his book, it was a corollary of the rise of air power, as a result of which the battlefield became three-dimensional. With the enemy above and equipped with high explosives, you had to dig down and protect yourself with metres of concrete. Garrett’s previous book, Explore Everything, was a fascinating insider’s look at illicit ‘urban exploration’, and he kicks off Bunker with an account of time spent poking around the Burlington Bunker, which would have been used by the UK government in the event of a nuclear war. The Cold War may have ended, but governments still build bunkers, as Garrett shows: Chinese contractors have recently completed a 23,000-square-metre complex in Djibouti. But these grand, often secret manifestations of official fear are not the main focus of the book. Instead, Garrett is interested in private bunkers and the people who build them, people like Robert Vicino, founder of the Vivos Group, who purchased the Burlington Bunker with the intent of making a worldwide chain of apocalypse retreats.
Garrett calls these people the ‘dread merchants’. Dread differs from fear in that it has no object: it is fear that has not yet found a focus. And if dread is your business, business has never been better, with the sustaining structures of modern life seeming ever more fragile and challenged. The dark charisma of the bunker is probably what will attract readers to this book, but the energetic and gregarious Garrett keeps the story focused on people rather than buildings. Much of the emphasis is on his native USA, where ‘prepping’ – disaster and Armageddon preparedness – has become a significant subculture, though there are also excursions to Australia, where ecological precarity is fuelling the bunker biz, and New Zealand and Thailand, favoured global ‘bug-out’ locations of the elite.
The first wave of private bunker-building followed the Cuban Missile Crisis of 1962, during which the American government made it plain that it had no intention of providing for the shelter of more than the military and political elite. The rest of the population got the message: if the worst happens, you’re on your own. Since then, American society appears to have been locked in a spiral of mistrust. In the 1990s, religiously minded ‘survivalist’ movements sought to divorce themselves from what they saw as an increasingly controlling federal state by forming autonomous fortified communities. Alarmed at these splinter groups walling themselves up and stockpiling weapons, the government reacted with overwhelming force, resulting in multiple deaths at Ruby Ridge and at the Branch Davidian compound in Waco, Texas. This bloodshed did nothing but confirm survivalists’ worst fears.
After the 9/11 attacks, survivalism entered the mainstream, giving birth to the modern prepper movement. As bunker salesman Gary Lynch tells Garrett, 9/11 was good for business on two fronts, as some Americans began to fear further terrorist attacks while others became alarmed by the prospect of increasing domestic authoritarianism. (...)
Buried, seemingly secure, as much a target for robbers as protection against them, the bunker shares many characteristics with the tomb. Both structures mediate with a kind of afterlife: the tomb ferries the dead to the hereafter, while the bunker is designed to deliver the still-living through a period of calamity to a safer future. Hope and survival are, in theory, uplifting themes, but Bunker is, in some ways, rather depressing. The people who want bunkers have, in one form or another, given up on society, taking a dim view of its prospects and seeing it as a thin veneer of order laid over Hobbesian chaos. The salespeople naturally promote this view: ‘dread merchants’ is the right phrase for them, since dread is really the product they’re selling.
by Will Wiles, Literary Review | Read more:
Image: via
Labels:
Culture,
Government,
Journalism,
Psychology,
Security
Love Letter To A Vanishing World
1.
I have several times been within spitting distance: to the Philippines—as far south as Panay; to the court cities of central Java and to the highlands of Sulawesi, in Indonesia. I’ve spent many happy days on Peninsular Malaysia. Have lived in Tokyo, Hong Kong, and Kaohsiung~~~But as they say, “Close, but no cigar!”

A cruel young woman, I vetoed Borneo – and dragged him off to Kashmir instead. And to make matters worse, a year later, Gavin Young came out with his highly acclaimed book, In Search of Conrad, in which he does just what my boyfriend had wanted to do: follow Conrad to that famed trading post up “an Eastern river.”
2.
Recently, I re-read Eric Hansen’s travel classic, Stranger in the Forest. The book came out in the mid-80s. This was about ten years before I vetoed our trip to Borneo. It was also a time before the Internet and GPS. To prepare for his trip, Hansen had to go to a university library and read books, flip through journals, and consult maps—and to his great delight, he discovered there were still uncharted areas. And these were the very spots he wanted to see! Beginning his journey on the Malaysian side of Borneo, in Kuching, he traveled upriver on the Rajang (every bit as legendary as the Mahakam), and made his way inland toward the Highlands, where the indigenous Dayak peoples lived.
Did I mention he was mainly going on foot?
His trip occurred just a few years before Bruno Manser’s legendary ramble across Borneo. You’ve heard the expression “Fact is stranger than fiction”? Well, that term was invented for the life story of Swiss environmentalist Bruno Manser. Arriving in Borneo in the mid-80s, he was, within a year, living with one of the most elusive tribes in the highlands, the Penan. Carl Hoffman (who wrote the best seller Savage Harvest) has just come out with a double biography called The Last Wild Men of Borneo, about Bruno Manser and American tribal art dealer Michael Palmieri. The cover of the book has a photograph of Manser, and I did not realize he was a white man until I was nearly finished reading. Dressed in a loincloth and carrying a poison arrow quiver and blowpipe, his hair cut in the Dayak fashion, he is shown squatting on a rock near the river’s edge. It is a touching photograph of a man who gave his life to fight for the rights of the indigenous peoples of the highlands.

Even as early as 1980, logging was already a huge issue. In Japan, especially, environmentalists rightly bemoaned the destruction being caused by the timber industry—so much of that wood being imported into Japan (the majority is now being imported into China). Logging was pushing the indigenous Dayak peoples of the highlands into greater and greater peril as the land they considered to be theirs was being destroyed. Water was contaminated and animals were dying in great numbers. Manser realized that a people who had lived harmoniously in the interior of the island for thousands of years were now in grave danger of being pushed out – all in the name of corporate greed.
And so he fought valiantly to bring their plight to the attention of the world—including climbing up a 30-foot-tall London lamppost outside of the media center covering the 1991 G7 Summit and unfurling a banner about Dayak rights and then, the following year, paragliding into a crowded stadium during the Earth Summit in Rio de Janeiro. In 1992, after meeting Manser, then-Senator Al Gore introduced a resolution in the Senate calling upon the government of Malaysia to protect the rights of the indigenous peoples and for Japan to look into its logging companies’ practices. By the mid-90s, Manser had become a serious headache to the huge logging industry in Malaysia and an embarrassment to the government. Manser was to disappear in 2000 and was officially pronounced dead in 2005 (though his body was never found).
3.
It is a tragic story, with the only possible silver lining being that at least Manser was not around to see what happened next, when the palm oil industry came to town. I had begun wondering how much of that Borneo my boyfriend dreamt of was left. So, I picked up The Wasting of Borneo, by Alex Shoumatoff (2017), and quickly realized the situation was far worse than I was imagining. A staff writer for the New Yorker, Shoumatoff has been a contributing editor at Vanity Fair and Condé Nast Traveler, among others. A travel writer and environmentalist, he has been to Borneo several times. In this latest book, he begins his Borneo journey with a visit to Birute Galdikas at her Orangutan Care Center near the Tanjung Puting National Park in Central Kalimantan.
Have you heard of Leakey’s Angels?
by Leanne Ogasawara, 3 Quarks Daily | Read more:
Images: uncredited
Monday, August 17, 2020
La Caravana del Diablo
La Caravana del Diablo: a migrant caravan in Mexico (The Guardian)
Image: Ada Trillo
[ed. Photo essay.]
Labels:
Economics,
Government,
History,
Politics,
Security
The Fully Industrialized Modern Chicken
A century ago, Americans would not recognise our modern hunger for chicken. The year-long market for tender but relatively bland chicken meat is a newish phenomenon, and without it the idea of chicken cutlets, $5 rotisseries, or the McNugget would be a fantasy.
How did America go from thinking of chicken as an “alternative” meat to consuming it more than any other meat?
The story starts with corn.
How American corn fueled a taste for chicken
At the turn of the 20th century, chicken was almost always eaten in the spring. The priority for chicken raisers at the time was egg production, so after the eggs hatched, all the male birds would be fed up and then quickly harvested as “spring chickens” – young, tender birds that were sold whole for roasting or broiling (hence the term “broilers”). Outside the spring rush, you might be buying a bigger, fatter fryer or an old hen for stewing.
“Farmers were sending chickens of all sorts of ages, different feather colours, and tremendous variety to the marketplace in the early 20th century,” says Roger Horowitz, food historian and author of Putting Meat on the American Table. But almost all chickens in the market were simply surplus to egg production, making them relatively uncommon – even rare. Tender spring chickens in particular could fetch a good price. But it is worth noting, Horowitz says, that the higher price wasn’t necessarily coming from pent-up demand.
“It’s not as if consumers were clamoring for broilers,” he says. Though there was some consumer demand for chickens, the relatively high price for broilers likely had more to do with the limited, seasonal supply than a passion for poultry.
During the second world war, however, red meat was rationed, and a national campaign encouraged the consumption of poultry and fish to save “meat” (beef, pork and lamb) for “the army and our allies”. Eating chicken became more common, but the preference for young broilers, and white breast meat, persisted.
As the war drew to a close, feed millers, which buy and grind corn and other grains to feed livestock, saw a big opportunity to spur that demand for meat chickens, which consume large amounts of corn. When traditional banks refused to finance new-fangled “chicken farms”, the feed companies themselves offered farmers loans to buy feed and equipment, putting the pieces of the modern contract poultry system in place.
Consumer acceptance of broilers out of season was not automatic. In the 1930s, the average American ate 10lbs (4.5kg) or less of chicken annually; by 2017 that had risen to 64lbs (29kg), according to the Economic Research Service at the United States Department of Agriculture (USDA). For decades chicken battled to be seen as a “meat”, and did not surpass its most expensive competitor, beef, in terms of overall consumption until 2010. A strong USDA-funded marketing campaign helped out.
“In the 50s and 60s, you see where these agricultural extension operations start pushing out recipes very aggressively about broilers,” Horowitz says, and as feed companies and hatcheries (most of which would eventually become so-called “integrators”, which own several of the businesses involved in chicken production) continued to consolidate the industry, they were able to more carefully calibrate the chicken itself to what would sell most profitably, focusing on lowering costs and raising proportions of the highest-demand cuts, namely breast meat.
Don Tyson, the late president of Tyson Foods, famously said: “If breast meat is worth two dollars a pound and dark meat is worth one dollar, which would I rather have?” But for generations, the idea of buying just the most coveted cuts of chicken was foreign to most consumers. It wasn’t until the 1980s that preferences began to switch to cuts of meat over the whole bird.
These companies owned and understood their chickens from egg to table and were able to exert unprecedented control over the biology of their flocks. Now, not only are they able to fine tune the birds’ characteristics with incredible accuracy, they can also map interactions with feed, environment, and processing to maximise profits.
For integrators and corn farmers alike, the investment paid off. In 2019, 9.2 billion 6lb (2.7kg) broiler chickens were harvested in the US, consuming about 1.8lbs (820g) of grain for every pound of chicken.
But the impact on chickens from the changes in production is troubling.
The modern industrial chicken
Over the past 70 years, the poultry industry has measured its success in terms of how many pounds of meat a chicken can produce for a given amount of feed. Modern chickens are more efficient than ever, with producers able to calculate to the ounce how much “input” of food, water, air and time are required to get a set amount of white and dark meat.
The modern chicken is fully industrialised.
With more than 500 chicken breeds existing on Earth, it might surprise you to learn that every nugget, breast, and cup of chicken noodle soup you’ve ever eaten likely came from one breed, a specialised cross between a Cornish and a white rock.
by Sarah Mock, The Guardian | Read more:
Image: Glowimages/Getty
What's Up With the USPS
Donald Trump has never hidden his intention to destroy the United States Postal Service (USPS) as we know it. The administration released plans openly declaring that its long-term aim was to privatize the USPS, enriching private investors by handing them a valuable public asset. Now, Trump’s postmaster general, Louis DeJoy, is under fire for internal changes that are hindering the USPS’s ability to deliver mail efficiently, and Trump himself has implied that he is reluctant to fund the USPS due to his longstanding opposition to mail-in voting.
DeJoy is a prototypical “political crony” appointee, a Republican party donor who never worked for the postal service and has financial interest in private delivery competitors to the USPS. The Intercept discovered that when DeJoy was in the private sector, he had a long history of overseeing labor violations. DeJoy has admitted that his changes to the USPS have caused delays to service, though he insists it has been unintentional. Trump has targeted the USPS for years, threatening to jack up prices and treating it as in need of an overhaul, one that DeJoy is now ruthlessly implementing.
The postal service has long been a target for Republicans, in part because a successful USPS is a threat to Republican ideology. After all, the conservative argument is that efficient public services are essentially impossible, that most government functions should be handed over to the private sector. A popular socialized mail service threatens to severely undercut this case. After all, if people are satisfied with the government delivering their mail, they might turn out to be satisfied with the government providing their health insurance. It could be a slippery slope toward socialism. A number of other countries have privatized their postal services.
Trump did not actually start the war on the USPS. Barack Obama actually pushed austerity measures, including a plan to eliminate Saturday delivery and cut the service’s budget. Obama’s former Office of Management and Budget director, Peter Orszag, endorsed full privatization. The ideology that government should be “lean” and “run like a business”, and that the private sector is inherently superior to the public sector, is a bipartisan delusion.
The postal service’s infamous financial woes are not actually hard to fix. While Trump tries to suggest it is all a result of inefficiency and mismanagement, we know that it mostly boils down to an absurdly unnecessary requirement imposed on the USPS which required it to put away billions of dollars each year for future retirement benefits. It would be easy to get the USPS shipshape again, but it would require a commitment to building an excellent public service, one that Obama didn’t really show and Trump certainly doesn’t have.
We should also remember, though, that talk of the USPS “losing money” is inherently a bit misleading and strange. Public services do not “lose money”, because they’re not designed to make money. If people said that the public library, or the school system, or the fire department was “losing money”, it would be very strange. Of course they are: they don’t take in revenue, because their purpose is to give everyone a free service, paid for out of government funds. It’s not like that money just goes into a pit or is frittered away. The money pays for a service that we then all get to enjoy. So even though we should point out that the USPS’s financial distress is in an important way politically manufactured, we should also be careful about embracing the logic that a government agency needs to “break even”. That’s not what the government is for. (...)
A very clever Republican tactic is to mismanage the government, and then point to government mismanagement as a case for privatization. (Hence hobbling the USPS with absurd budgetary requirements and then pointing out holes in its budget.) To counter that, it’s very important to make the general public aware of whose fault the problem is. If people see their mail delayed, and become frustrated, they need to understand that it’s Trump, not their local letter carrier, who is at fault. Trump is going to try to turn the agency into the villain of the story, because the USPS’s popularity is one of the reasons it has been relatively safe.
[ed. When I first moved to Washington state a few years ago and got to vote by mail I wondered, why haven't we been doing this forever? It's so simple and easy. You get a ballot in the mail along with a detailed brochure providing both pro and con arguments by advocates on either sides of the issues, fill in your votes, sign it and send it off (no postage necessary), or drop off at libraries, postal and county offices, etc. Easy peasy. Sure beats standing in long lines after work. See also:
Almost every citizen is at least inconvenienced. I’ve been corresponding throughout the day with readers from around the country who have gotten mail delivery half of the days this week, who are waiting for overdue prescriptions, waiting on packages who are two weeks overdue, Social Security checks which are sole sources of income. For many life saving prescriptions are delayed or lost. Critical medical tests are being invalidated because they spend to line in the mail. Businesses already battered by COVID are imperiled because shipments are late. These all apply to citizens from the far right to the far left.
The Post Office isn’t some newfangled federal responsibility. It is one of very few federal responsibilities and agencies of government explicitly referenced in the federal constitution.
President Trump is far from the first corrupt American President. But it is genuinely hard to think of a case in almost a quarter millennium of US history in which a chief executive has inconvenienced, damaged and imperiled so many citizens so directly for the sole purpose of corruptly maintaining power in defiance of the constitutional order. There’s really nothing comparable.
DeJoy is a prototypical “political crony” appointee, a Republican party donor who never worked for the postal service and has financial interest in private delivery competitors to the USPS. The Intercept discovered that when DeJoy was in the private sector, he had a long history of overseeing labor violations. DeJoy has admitted that his changes to the USPS have caused delays to service, though he insists it has been unintentional. Trump has targeted the USPS for years, threatening to jack up prices and treating it as in need of an overhaul, one that DeJoy is now ruthlessly implementing.

Trump did not actually start the war on the USPS. Barack Obama pushed austerity measures, including a plan to eliminate Saturday delivery and cut the service’s budget. Obama’s former Office of Management and Budget director, Peter Orszag, endorsed full privatization. The ideology that government should be “lean” and “run like a business”, and that the private sector is inherently superior to the public sector, is a bipartisan delusion.
The postal service’s infamous financial woes are not actually hard to fix. While Trump tries to suggest it is all a result of inefficiency and mismanagement, we know that it mostly boils down to an absurd and unnecessary requirement that the USPS put away billions of dollars each year for future retirement benefits. It would be easy to get the USPS shipshape again, but it would require a commitment to building an excellent public service, one that Obama didn’t really show and Trump certainly doesn’t have.
We should also remember, though, that talk of the USPS “losing money” is inherently a bit misleading and strange. Public services do not “lose money”, because they’re not designed to make money. If people said that the public library, or the school system, or the fire department was “losing money”, it would be very strange. Of course they are: they don’t take in revenue, because their purpose is to give everyone a free service, paid for out of government funds. It’s not like that money just goes into a pit or is frittered away. The money pays for a service that we then all get to enjoy. So even though we should point out that the USPS’s financial distress is in an important way politically manufactured, we should also be careful about embracing the logic that a government agency needs to “break even”. That’s not what the government is for. (...)
A very clever Republican tactic is to mismanage the government, and then point to government mismanagement as a case for privatization. (Hence hobbling the USPS with absurd budgetary requirements and then pointing out holes in its budget.) To counter that, it’s very important to make the general public aware of whose fault the problem is. If people see their mail delayed, and become frustrated, they need to understand that it’s Trump, not their local letter carrier, who is at fault. Trump is going to try to turn the agency into the villain of the story, because the USPS’s popularity is one of the reasons it has been relatively safe.
by Nathan J. Robinson, The Guardian | Read more:
Image: Rob Latour/Rex/Shutterstock
[ed. When I first moved to Washington state a few years ago and got to vote by mail I wondered, why haven't we been doing this forever? It's so simple and easy. You get a ballot in the mail along with a detailed brochure providing both pro and con arguments by advocates on either side of the issues, fill in your votes, sign it and send it off (no postage necessary), or drop it off at libraries, postal and county offices, etc. Easy peasy. Sure beats standing in long lines after work. See also:
Almost every citizen is at least inconvenienced. I’ve been corresponding throughout the day with readers from around the country who have gotten mail delivery half of the days this week, who are waiting for overdue prescriptions, waiting on packages that are two weeks overdue, waiting on Social Security checks that are their sole source of income. For many, life-saving prescriptions are delayed or lost. Critical medical tests are being invalidated because they spend too long in the mail. Businesses already battered by COVID are imperiled because shipments are late. These all apply to citizens from the far right to the far left.
The Post Office isn’t some newfangled federal responsibility. It is one of very few federal responsibilities and agencies of government explicitly referenced in the federal constitution.
President Trump is far from the first corrupt American President. But it is genuinely hard to think of a case in almost a quarter millennium of US history in which a chief executive has inconvenienced, damaged and imperiled so many citizens so directly for the sole purpose of corruptly maintaining power in defiance of the constitutional order. There’s really nothing comparable.
Make Him Own It. (Talking Points Memo).]
Saturday, August 15, 2020
The Unraveling of America
Never in our lives have we experienced such a global phenomenon. For the first time in the history of the world, all of humanity, informed by the unprecedented reach of digital technology, has come together, focused on the same existential threat, consumed by the same fears and uncertainties, eagerly anticipating the same, as yet unrealized, promises of medical science.
In a single season, civilization has been brought low by a microscopic parasite 10,000 times smaller than a grain of salt. COVID-19 attacks our physical bodies, but also the cultural foundations of our lives, the toolbox of community and connectivity that is for the human what claws and teeth represent to the tiger.
Our interventions to date have largely focused on mitigating the rate of spread, flattening the curve of morbidity. There is no treatment at hand, and no certainty of a vaccine on the near horizon. The fastest vaccine ever developed was for mumps. It took four years. COVID-19 killed 100,000 Americans in four months. There is some evidence that natural infection may not imply immunity, leaving some to question how effective a vaccine will be, even assuming one can be found. And it must be safe. If the global population is to be immunized, lethal complications in just one person in a thousand would imply the death of millions.
Pandemics and plagues have a way of shifting the course of history, and not always in a manner immediately evident to the survivors. In the 14th century, the Black Death killed close to half of Europe’s population. A scarcity of labor led to increased wages. Rising expectations culminated in the Peasants’ Revolt of 1381, an inflection point that marked the beginning of the end of the feudal order that had dominated medieval Europe for a thousand years.
The COVID pandemic will be remembered as such a moment in history, a seminal event whose significance will unfold only in the wake of the crisis. It will mark this era much as the 1914 assassination of Archduke Ferdinand, the stock market crash of 1929, and the 1933 ascent of Adolf Hitler became fundamental benchmarks of the last century, all harbingers of greater and more consequential outcomes.
COVID’s historic significance lies not in what it implies for our daily lives. Change, after all, is the one constant when it comes to culture. All peoples in all places at all times are always dancing with new possibilities for life. As companies eliminate or downsize central offices, employees work from home, restaurants close, shopping malls shutter, streaming brings entertainment and sporting events into the home, and airline travel becomes ever more problematic and miserable, people will adapt, as we’ve always done. Fluidity of memory and a capacity to forget is perhaps the most haunting trait of our species. As history confirms, it allows us to come to terms with any degree of social, moral, or environmental degradation.
To be sure, financial uncertainty will cast a long shadow. Hovering over the global economy for some time will be the sober realization that all the money in the hands of all the nations on Earth will never be enough to offset the losses sustained when an entire world ceases to function, with workers and businesses everywhere facing a choice between economic and biological survival. (...)
In the wake of the war, with Europe and Japan in ashes, the United States with but 6 percent of the world’s population accounted for half of the global economy, including the production of 93 percent of all automobiles. Such economic dominance birthed a vibrant middle class, a trade union movement that allowed a single breadwinner with limited education to own a home and a car, support a family, and send his kids to good schools. It was not by any means a perfect world but affluence allowed for a truce between capital and labor, a reciprocity of opportunity in a time of rapid growth and declining income inequality, marked by high tax rates for the wealthy, who were by no means the only beneficiaries of a golden age of American capitalism.
But freedom and affluence came with a price. The United States, virtually a demilitarized nation on the eve of the Second World War, never stood down in the wake of victory. To this day, American troops are deployed in 150 countries. Since the 1970s, China has not once gone to war; the U.S. has not spent a day at peace. President Jimmy Carter recently noted that in its 242-year history, America has enjoyed only 16 years of peace, making it, as he wrote, “the most warlike nation in the history of the world.” Since 2001, the U.S. has spent over $6 trillion on military operations and war, money that might have been invested in the infrastructure of home. China, meanwhile, built its nation, pouring more cement every three years than America did in the entire 20th century.
As America policed the world, the violence came home. On D-Day, June 6th, 1944, the Allied death toll was 4,414; in 2019, domestic gun violence had killed that many American men and women by the end of April. By June of that year, guns in the hands of ordinary Americans had caused more casualties than the Allies suffered in Normandy in the first month of a campaign that consumed the military strength of five nations.
More than any other country, the United States in the post-war era lionized the individual at the expense of community and family. It was the sociological equivalent of splitting the atom. What was gained in terms of mobility and personal freedom came at the expense of common purpose. In wide swaths of America, the family as an institution lost its grounding. By the 1960s, 40 percent of marriages were ending in divorce. Only six percent of American homes had grandparents living beneath the same roof as grandchildren; elders were abandoned to retirement homes.
For the first time, the international community felt compelled to send disaster relief to Washington. For more than two centuries, reported the Irish Times, “the United States has stirred a very wide range of feelings in the rest of the world: love and hatred, fear and hope, envy and contempt, awe and anger. But there is one emotion that has never been directed towards the U.S. until now: pity.” As American doctors and nurses eagerly awaited emergency airlifts of basic supplies from China, the hinge of history opened to the Asian century.
by Wade Davis, Rolling Stone | Read more:
Image: Gary Hershorn/Getty Images
[ed. Let's note again: "President Jimmy Carter recently noted that in its 242-year history, America has enjoyed only 16 years of peace, making it, as he wrote, 'the most warlike nation in the history of the world.' Since 2001, the U.S. has spent over $6 trillion on military operations and war, money that might have been invested in the infrastructure of home. China, meanwhile, built its nation, pouring more cement every three years than America did in the entire 20th century."
See also:
When you’re in a pessimistic cast of mind, which developments of the COVID era do you find yourself dwelling on, and what grim scenarios do you imagine them portending? And then, when you’re in better spirits, what causes for optimism do you see?
I do think that the American project, the American experiment, is on the rack right now. We don’t know how things are going to go in the next 90 days. We really need to know whether this electoral process will go smoothly and whether it will deliver what it is supposed to, which is a decisive vote of the American public that confirms somebody to the presidency and thereby demonstrates the capacity of this place to govern itself.
And there is a very distinct possibility that that won’t happen. Or that the decision will fall in favor of the candidate and party that has demonstrated its incapacity to govern — and has in fact demonstrated its capacity to drive this country to ever-greater degrees of ungovernability. I never thought I would live under curfew. I’ve lived under curfew now in New York. It was insane. It made me indignant and outraged, and I didn’t think I would ever experience that.
The counterpart to the American election, globally, is obviously Hong Kong. They, too, have elections. And the brutality Beijing is capable of is shocking. For all of my advocacy for détente — in fact, because of my advocacy for détente — I’m haunted by memories of the 1930s and 1940s and the naïveté of many people who advocated for collective security and Popular Front collaboration with the Soviet Union, all for very good reasons that I would have certainly endorsed. We have to reckon with what we now know about the violence of which the Soviet Union was capable. And we have to reckon with what the Chinese Communist regime is capable of too. So those are the two advanced economy problems that are most on my mind.
I recently had the chance to be involved in conversations with a bunch of colleagues in South Africa. If COVID were to become yet another devastating shock to the developmental possibilities of sub-Saharan Africa, in terms of the humanitarian crisis, that has the makings of a truly catastrophic drama. Already, the economic and social news out of South Africa is biblically bad. They started the year with a 30 percent unemployment rate. They think they will have a 50 percent unemployment rate in the townships by the end of the year. Coming of age when I did, the end of apartheid and the advent of multiracial democracy in South Africa stood out as one of the great triumphs of humanity. And if South Africa becomes a basket case, then this is a disaster of traumatic proportions.
But the good news is … (?)
Oh, right. Hopeful signs. Well, let me try. At the risk of sounding trite, I actually do still marvel at the lockdown. And this actually goes back to our earlier discussion — to the question of the extent to which history is determined by the capitalist pursuit of profit. I’m enough of an economic historian to think that it’s a hugely important variable. But there was something really extraordinary that happened in March, in which nearly the entire world — individually and collectively — made this decision to shut down the economy to preserve human life. Politicians and businesses and citizens and trade unions — the whole mass of collective actors — made this decision. The vast majority of humanity was subject to it.
And it may have been a catastrophic mistake. I don’t think we can rule that possibility out. We can’t run it again. We don’t know what the consequences would have been. We’ve ended up with what we’ve ended up with. But part of what we ended up with was this collective decision — and as costly and painful as it was, there’s something truly spectacular about that moment.
And then, of course, all hell breaks loose. Inequalities make themselves dramatically felt. We can’t hold it together. It’s a shitshow. None of that struck me as surprising. But March was a different story.
How Will the Covid-19 Pandemic Change World History? (NY Mag/Intelligencer).]
Labels: Critical Thought, Government, Health, history, Politics
Office Noise Simulators
During the first few days of quarantine, many displaced office workers likely enjoyed the peace and quiet of working from home. Now enough time has passed for them to miss the typing, chatter, and other background noises they would have complained about less than two months ago. If you're feeling nostalgic for the bustle of your workplace, this website, designed by Reichenbergerstr 121, can keep you company.
This tool, spotted by Lifehacker, simulates the ordinary, sometimes distracting noises of office life. When you visit imisstheoffice.eu and press the play button in the bottom left corner, a track of soft typing and muffled conversations fills your speakers. To adjust the number of colleagues sharing your space, toggle the tool in the bottom right corner.
Clicking the objects animating the page will add more sounds to the mix. A scanner, a water cooler, and a ping pong table are just a few of the office noise-makers you can activate to make your home feel less empty (or maybe remind you that working in silence isn't that bad).
People used to working outside an office before quarantine may be missing other sounds right now, like those of public spaces. This tool recreates the ambient noises of cafés around the globe.
by Michele Debczak, Lifehacker | Read more:
Image: Oli Scarff

Last Decade Was Earth's Hottest on Record, Exposing Grim Reality of Climate Change
A new report released Wednesday details how 2019 was another year of extremes for Earth's climate, adding to a litany of evidence exposing the grim reality of our warming world.
Last year saw devastating wildfires burn through Australia; large regions including Europe, Japan, Pakistan, and India experienced deadly heat waves; almost 100 tropical cyclones created havoc; glaciers and sea ice continued to melt at worrying levels; and drought and floods destroyed vital crops and infrastructure.
Among the key findings of the State of the Climate in 2019, published by the American Meteorological Society, were that 2019 was among the warmest years on record, that greenhouse gases in the Earth's atmosphere are at their highest recorded levels, and that this decade was the hottest since records began in the mid-1800s.
"Each decade since 1980 has been successively warmer than the preceding decade, with the most recent (2010-1019) being around 0.2°C warmer than the previous (2000-2009)," the report said. "As a primary driver for our changing climate, the abundance of many long-lived greenhouse gases continues to increase."
The study also reported other key findings:
- The six warmest years on record have all occurred in the past six years, since 2014.
- 2019 was among the three hottest years since records began in the mid-1800s. Only 2016, and for some datasets 2015, were warmer than 2019.
- Average sea surface temperatures in 2019 were the second highest on record, surpassed only by 2016.
- Sea levels rose to a new record high for the eighth consecutive year.
- Surface air temperatures for the Arctic were the second highest in 120 years of records, trailing only 2016. In the Antarctic, 2019 was the second warmest year for the continent since 1979.
- Glaciers continued to melt at a concerning rate for the 32nd straight year.
Global carbon dioxide concentrations, which represent the bulk of the gases' warming power, rose during 2019 to a record 409.8 parts per million, the study found. That was the "highest in the modern 61-year measurement record as well as the highest ever measured in ice core records dating back as far as 800,000 years," the report said.
The report was led by the National Oceanic and Atmospheric Administration's Centers for Environmental Information and was based on contributions from more than 520 scientists from 60 countries. The annual report is often described by meteorologists as the "annual physical of the climate system."
Robert Dunn, one of the report's lead editors from the UK Met Office, said in a statement that, "The view for 2019 is that climate indicators and observations show that the global climate is continuing to change rapidly."
"A number of extreme events, such as wildfires, heatwaves and droughts, have at least part of their root linked to the rise in global temperature. And of course the rise in global temperature is linked to another climate indicator: the ongoing rise in emissions of greenhouse gases, notably carbon-dioxide, nitrous oxide and methane," Dunn said.
by Helen Regan, CNN | Read more:
Image: NOAA NCEI Climate
Friday, August 14, 2020
What Does the Kamala Harris Pick Signal to the Sanders-Warren Wing of the Party?
In one of the least surprising moments of what has so far been an uncommonly anticlimactic race, Joe Biden on Tuesday did what everyone was already expecting him to do: he chose Senator Kamala Harris of California as his running mate.
The pick comes on the heels of a slew of leaks and on- and off-the-record comments from Biden allies wishing to trash Harris and downplay her chances in the press. The Florida Democratic donor John Morgan lamented to CNBC that Harris “would be running for president the day of the inauguration”. Former Senator Chris Dodd complained that Harris showed “no remorse” after attacking Biden based on his racial justice record. In retrospect, these comments in the media read less like realistic dispatches from within the VP vetting process than attempts to influence it from the outside, perhaps from Biden allies still angry at Harris over the primary. That anger, evidently, isn’t shared by the candidate himself.
But more than evidence of mended fences between Biden and Harris, the pick reflects a strategic decision over which sections of voters, and which factions of the Democratic party, the Biden team feels it needs to prioritize in order to win in November. And with the Harris pick, they are resoundingly signaling that it is the centrist and pragmatic voters – particularly older Black voters – and not the younger progressive left, that they feel they have the most to gain from appealing to.
Harris was the early frontrunner for the VP slot in part precisely because her political record reveals only spotty and inconsistent ideological commitments. During her own presidential bid in the primary cycle, she moved left on Medicare for All, Bernie Sanders’ signature issue, but then backtracked right. She claimed to have evolved her thinking on law enforcement and incarceration in one instance, then touted her record as a prosecutor in another. Harris was by no means alone in this ideological shape-shifting: she was no more willing to alter her positions for the sake of convenience than, say, Mayor Pete Buttigieg. But the shifts signaled that what Harris was selling to the American people was not so much an ideological commitment, like Sanders and Elizabeth Warren to her left, or Amy Klobuchar to her right. What Harris was running on was more cultural and affective. She was not selling a policy platform. She was selling her character; namely, the carefully projected impression that she was thick-skinned, intelligent and unwilling to suffer fools.
For Biden, himself light on policy and heavy on appeals to his own affable familiarity, presumed competence and promises to return the country to a pre-Trump “normal”, this made Harris a good fit. But after a heated, if not especially close, last few months of the primary contest against the party’s progressive standard bearer, Bernie Sanders, there was one line of thinking that posited that a Harris vice-presidential nomination would be risky. As a noncommittal but generally center-left contender, Harris would potentially alienate and certainly fail to excite the younger, more progressive voters who had backed Sanders. Could Biden afford to turn off Bernie’s base by not picking a progressive?
Evidently, he thinks he can, and there is some evidence that he’s right. Though the left raised huge amounts of money for the Sanders campaign, they couldn’t drum up votes: after a long and contentious primary season, Biden won overwhelmingly, in spite of Sanders’ superior fundraising. Bernie’s failure – namely, his campaign’s inability to transform money and significant online enthusiasm into actual voter turnout – may have undercut the left’s ability to build leverage more broadly. It didn’t help matters that Bernie’s base, though enthusiastic, was hostile to overtures from other candidates: when Elizabeth Warren made gestures to Sanders voters, she was met with vitriol, derision and misogynist contempt. Democratic strategists may have begun to understand the Sanders base as an unreliable voting bloc, one that doesn’t deliver turnout and can’t take yes for an answer. From that perspective, the Biden campaign had few incentives to pick a progressive running mate or to make many policy overtures to the party’s left wing.
But perhaps the more morally grievous downside to the Harris pick lies in her potential to alienate the emergent movement against police brutality that has gained traction this summer in the wake of the murder of George Floyd in Minneapolis. The filmed killing ignited protests that the New York Times says were likely the largest popular demonstrations in American history. Under the rallying cry of Black Lives Matter, these uprisings crystallized a growing distrust of the police and a consensus around racial justice issues that is emerging in a large and surprisingly multiracial contingent of the country. The choice of Harris – a former prosecutor and attorney general whose career has included uncomfortably collegial relationships with the police and a comfort with incarceration as a punishment for even non-violent crimes – risks appearing to dismiss this movement’s righteous and morally urgent demands. But here, too, is a place where the Biden team may feel comfortable taking the left for granted: in a contest against the sadistic and racist Donald Trump, Black Lives Matter protesters have no meaningful choice except to support Biden.
And yet in spite of a policy history that places her in opposition to the policy demands of the country’s largest and most energetic movements for racial justice, Harris’ vice-presidential nomination is also seen as an acknowledgment of the outsized role played by Black voters, and particularly Black women voters, in Democratic electoral victories.
Despite pat political punditry that says otherwise, the Black vote is not monolithic, and nor is the progressive wing of the party uniformly white. Within the Democratic party, ideological differences fall much more neatly along generational lines than racial ones, and younger Black voters often have very different political instincts than their parents and grandparents. To understand the factional divide within the Democratic party as being between progressive voters, on the one hand, and Black voters, on the other, would be to fundamentally misdiagnose the issue.
But the Harris pick is part of a growing consensus among establishment Democratic strategists that many Democrats owe their electoral victories to the party’s most reliable constituency: the older Black voter, and specifically, the older Black woman voter. It is turnout among such voters that has propelled Democratic candidates to victory in many recent contests, but for too long the party has seemed to take them for granted, relying on the growing racism of the Republican party as a guarantee of Black votes they presumed they did not have to earn. The Harris pick can be seen as an attempt, if a relatively symbolic and shortsighted one, to correct that neglect by putting a Black woman at the center of a party they have long helped to maintain.
Tactically, it’s not hard to see why the Biden team thought that attempting to appeal to Black voters would be a winning strategy. Older Black voters in particular have been reliably loyal to the Democratic party, and are crucially much more likely to vote than younger people of all races. Perhaps this is because, given America’s long history of state-sanctioned racist violence and state-enacted racist neglect, these voters feel they have more on the line. With much to lose, some older Black voters find themselves picking candidates for tactical reasons more so than ideological ones. In the primary, Biden won them by promising them that he could win. It was their support that gave him the nomination.
by Moira Donegan, The Guardian | Read more:
Image: Bebeto Matthews/AP
[ed. Same as it ever was.]
Dolly Parton: 'Of Course Black Lives Matter!'
US country music star Dolly Parton has come out in support of Black Lives Matter, in a rare comment on politics.
She told Billboard Magazine: "Do we think our little white asses are the only ones that matter? No!"
With a broad fan base that spans the right and the left, the singer generally eschews political subjects.
Her comments come amid a nationwide reckoning on race that has impacted all of US society, including country music.
Although Ms Parton has not attended Black Lives Matter marches, she said she supported anti-racism activists' right to protest.
"I understand people having to make themselves known and felt and seen," she told the music magazine.
What did Dolly say about Dixie?
The entertainment mogul - who owns Dollywood amusement park in her home state of Tennessee as well as other attractions - also spoke about her decision in 2018 to drop the "Dixie" from her Dixie Stampede attraction.
A 2017 article in Slate critiqued Ms Parton's attraction, calling it a "lily-white kitsch extravaganza".
"Dixie" was often used as a nickname for the southern states that made up the Confederate States of America during the US Civil War era.
"There's such a thing as innocent ignorance, and so many of us are guilty of that," she told Billboard. "When they said 'Dixie' was an offensive word, I thought, 'Well, I don't want to offend anybody. This is a business. We'll just call it The Stampede.'
"As soon as you realise that [something] is a problem, you should fix it. Don't be a dumbass. That's where my heart is. I would never dream of hurting anybody on purpose."
by BBC | Read more:
Image: Getty
[ed. How refreshing to see someone say "I would never dream of hurting someone on purpose". How many people can honestly say that about themselves these days?]
Let Russ Cook
Russell Wilson didn’t don an apron and a chef’s hat for his first Zoom news conference since training camp began. But the Seahawks’ quarterback didn’t exactly slam shut the oven door on the Twitter sentiment that boiled throughout the offseason:
Let Russ Cook.
For the uninitiated, that phrase, preceded by a hashtag, is a plea to Pete Carroll to take the shackles off Wilson. To lessen the coach’s long-standing reliance on the running game in order to accentuate the team’s best asset — Wilson with the ball in his hands.
Wilson, of course, is far too much the diplomat to ever state that so directly. In response to the question of whether he ever retweeted a #LetRussCook missive, Wilson laughed and said, “No, I never retweeted it.”
But when asked Thursday if he agreed with the sentiment that he needed to be involved sooner, and at a higher pace, in the Seahawks’ offense, Wilson clicked the metaphorical “like” button.
“Yeah, I definitely think so,’’ he said. “I mean, rather than us having to be in the fourth quarter to be able to make stuff happen. I think we have a crazy stat of 56 and 0 when we have the lead by halftime. I think getting ahead is the key.”
The stat is actually that the Seahawks are 57-0 when leading by four or more points at halftime since Wilson took over as starting quarterback in 2012. Last year in many ways was a historical outlier; they won six games when trailing at halftime — tied for the second-highest total of any team since the 1970 merger.
Many of those wins were achieved by finally turning Wilson loose in the fourth quarter, when the situation got dire. In many of their close losses, they failed to execute a similar blueprint — including the one that ended their season, a 28-23 playoff defeat to Green Bay in which the Seahawks trailed 28-10 midway through the third quarter before Wilson was unleashed.
Logic and a decades-long body of statistical evidence in the NFL says that there’s going to be a regression to the mean when it comes to second-half rallies to victory. As legendary as Wilson has become in fourth quarter and overtime comebacks, it would behoove them to stop relying so heavily on his late magic.
All the #LetRussCook movement is saying, if I’m interpreting it correctly, is let him weave some magic early, too. And then you might not need him to pull a win out of his hat.
by Larry Stone, Seattle Times | Read more:
Image: John Froschauer/AP
[ed. It's understandable management would want to protect their (very large) investment, but every Seahawks fan has been saying this for years. Russ is probably the best running quarterback in the league, let him use all his talents.]
Small Town Colleges May Pose a Public Health Threat
There's a lot riding on a kickoff set for 6 p.m. Saturday, Sept. 12.
The Sterling College Warriors are scheduled to take on the McPherson College Bulldogs at home. If that familiar thud of shoe against football and cheer from the stands doesn't happen, the college that keeps the central Kansas town's economy humming, that gives it cultural vitality, and that separates Sterling from the hollowing out that defines so many other small Midwestern towns, might not survive.
The school, after 133 years, could die and doom the town that takes such pride in the football squad and embraces the student body like family.
"If COVID defeats the athletic season this year, it will probably defeat a lot of small colleges," said Jeb Miller, a non-traditional senior at Sterling College. "And, as a result, harm a lot of small towns. Badly."
Small town institutions
Hundreds of small colleges dotting the country rely on students paying tens of thousands of dollars a year in exchange for a distinctive, personal, high-touch college experience.
Many of those colleges hung on year-to-year even before the pandemic. Now COVID-19 threatens to cut off the oxygen sustaining these schools, and the sports programs that drive enrollment.
But the very thing small colleges need to stay afloat — students coming in, spending money, playing sports — also poses a major risk to relatively isolated little towns that, so far, have dodged major coronavirus outbreaks.
Only about 2,200 people live in Sterling out on the flat, flat plains of south-central Kansas. But this small city boasts an almost idyllic downtown. New office buildings. Two good coffee shops. A nice grocery store, a bowling alley, you name it.
Sterling has good schools, competitive sports teams. Locals say school plays, games and concerts draw big crowds. Without the college, the money, diversity and energy that defines life in Sterling could evaporate quickly.
"There is just so much overlap," said Kyler Comley, a Sterling College senior who's lived in the town all his life. "The community supports the college. The college supports the community. You know, you just see how everything's intertwined and how people are just so overly giving and involved."
Every student attending Sterling College gets paired with a family in town. Those families speak endearingly about their adopted scholars.
The students left in March. Most haven't come back. Like many people here, Sterling criminal justice professor Mark Tremaine said that starting classes up again in person this month is make or break for Sterling College.
"The bottom line is, we've got to get students back to campus. If we're going to survive," he said."We have to accept whatever the risks are and do it."
And that's the plan. Sterling doesn't have much of a choice.
by Frank Morris, NPR | Read more:
Image: Frank Morris