Saturday, August 22, 2020
Basketball Was Filmed Before a Live Studio Audience
I knew there were way more important things than basketball, and I was all for canceling everything back in March: in-person school, sports, plays, concerts, conferences, just shut it down. But, in order to hold this line, I had to force myself to stop thinking all the damn time about the interruption of the Milwaukee Bucks’ magical season, the second consecutive MVP season of Giannis Antetokounmpo, their certain progress toward their first NBA Finals in decades. This was supposed to be Milwaukee’s summer, with a long playoff run for the basketball team followed by the Democratic National Convention in the same new arena built for just such moments. Months later, when it was official that the NBA season would resume at Disney World, encased in a quarantined bubble, tears formed in my eyes. From mid-March until the beginning of summer, I watched no live TV. The news was too awful, and sports were all reruns. Since late July, I’ve been watching the Bucks again, and like everything else in America, it’s been strange.
As sports, the competitions from the NBA bubble, like the football (soccer), baseball, and ice hockey games I’ve watched, are more or less the same. But as television shows, as a variety of broadcast media, and as an aesthetic experience made up of images and sounds, the NBA games so far have been a departure from the usual, and nothing feels right. It’s been a bit like another newly familiar experience: getting takeout from a restaurant where you previously would dine in. The food might taste like you remember it, but the sensory and social environment of the meal makes you realize how much context matters. (...)
The NBA bubble games have had a particularly sitcommy feel. The courts at Disney’s Wide World of Sports are walled in on three sides by tall video displays, obscuring whatever seats or walls are beyond the court except for rare glimpses when the director cuts to a camera behind the scorer’s table for a referee’s call. The images almost always stay on one side of the action facing these displays, and unlike the usual games from the before times, there are no camera operators on the court itself under the basket. The visual array is reminiscent of the kind of three-wall sets that television comedies adopted from the stage, with their proscenium effect of positioning the viewer across an invisible fourth wall. In a typical American sitcom, you hear but do not see an audience. Many are recorded with a live audience in the studio, and sometimes begin with a voice-over telling you as much (“Cheers was filmed before a live studio audience”). The combination of the three-wall set and audience audio makes the television comedy much more like theater than many kinds of television (the difference between this aesthetic and the “single-camera” comedy style of shows like The Office often prompts comparisons of the latter to cinema).
The sitcom “laugh track” is an old convention. It has sometimes been held up as the epitome of commercial television’s basically fraudulent nature. In the absence of a live audience, or when the audience isn’t demonstrative in the way the producers would like, the sound track of a comedy can be massaged by sweetening the recording or adding canned laughter. This isn’t that different from an older tradition in live performance of the claque, the audience members hired to applaud. But in any event, the sounds of the audience recreate for the viewer at home a sense of participation in a live event among members of a community who experience the show together. This is true for sports just as much as it is for scripted comedy or late-night variety shows. The audible audience for televised sports is always manipulated to be an accompaniment that suggests the space of a live event. A sports stadium or arena is a big television studio in the first place, a stage for the cameras with a raucous in-person audience. Your ticket gets you into the show as an extra. The sensory pandemonium of the live event is never really captured on TV, the blaring music and sound effects are kept low in the mix to keep the booth broadcasters’ voices loud and centered, and no one shoots a T-shirt cannon in your direction when you’re watching at home. But the crowd is essential to the visual and auditory qualities of sports, and the missing elements in these games from Florida have been a present absence. (...)
The video displays are part of what makes each game have a “home team,” as the imagery conveys the identity of one of the two competitors with the text, colors, and advertisements you would find in their home arena. The displays, expansive like digital billboards, also show images of the home team’s fans, which is a nice touch in theory. But the way this works in practice is bizarre. The low-res webcam images of the individual faces are abstracted against backgrounds that look like arena seats, and these are arrayed in a grid to create a large rectangle of spectators. The images are presumably live, but they could be out of sync for all we know as the fans seldom react to anything in the moment, have no way of feeding off one another, and are not audible. The arena has set up a grade of rows that recede away from the court, and some fans are more visible than others as they are courtside or behind the bench or scorer’s table. The close proximity of fans, separated by no barrier from the stars, is one of the thrills of watching live basketball. These virtual fans are by contrast one big upright surface of blurry, laggy heads, and they are reminiscent of the Hollywood Squares of meeting attendees now all too familiar from Zoom’s gallery view. Like many elements of live television of the past few months, these visuals of the NBA’s bubble games are the optics of a pandemic that has turned our lives inside out. (...)
These bubble games remind us, minute by minute, what life is like now. They afford us the dreamworld of a space where you can safely breathe heavily, unmasked, indoors with nine other players and three refs on the same basketball court. But they also televise this newly risky world of facemasks and six feet, of conversations mediated by plexiglass and video screens. I have felt for the NBA players whose season was abruptly arrested as it was getting good, but now I also envy the careful setup that their filthy rich sports league can afford, while my cash-strapped public university takes its chances and opens its dorms and classrooms without such a luxury of frequent testing and exceptional security.
by Michael Z. Newman, LARB | Read more:
Image: CNN
Friday, August 21, 2020
Jerry Falwell Jr. and the Evangelical Redemption Story
Two weeks ago, Jerry Falwell Jr., the president of Liberty University, the largest evangelical college in America, posted an Instagram photo of himself on a yacht with his arm around a young woman whose midriff was bare and whose pants were unzipped. This would have been remarkable by itself, but it was all the more so because Falwell’s midriff was also bare and his pants also unzipped. In his hand, Falwell held a plastic cup of what he described winkingly in his caption as “black water.”
The aesthetics of the photo would be familiar to anyone who’s ever been to a frat party, but they were jarringly out of place for the son of Moral Majority cofounder Jerry Falwell Sr. and a professional evangelical Christian whose public rhetoric is built on a scaffolding of sexual conservatism and an antagonism to physical pleasure more generally.
The backdrop of a yacht represents an entirely different hypocrisy, arguably a more egregious one: the embrace of materialism and the open accumulation of enormous wealth. Falwell, who has a net worth estimated to be more than $100 million, is not formally a “prosperity gospel” adherent, but he has nonetheless jettisoned those inconvenient parts of Christian theology that preach the virtues of living modestly and using wealth to help the less fortunate.
But for his public, the problem with the photo was the optics of carnal sin—the attractive young woman who was not his wife, the recreational drinking, the unzipped pants—none of which would be acceptable at Liberty University, where coed dancing is penalized with a demerit. In the moral hierarchy of white evangelical Christianity, carnal sin is the worst, and this thinking drives the social conservatism that allows evangelicals to justify persecuting LGBTQ people, opposing sexual education in schools, distorting the very real problem of sex trafficking to punish sex workers, restricting access to abortion, eliminating contraception from employer-provided healthcare, and prosecuting culture wars against everything from medical marijuana to pop music. Evangelicalism’s official morality treats all pleasure as inherently suspect, the more so when those pleasures might belong to women or people of color.
Fortunately for Falwell, evangelicalism has built-in insurance for reputational damage, should a wealthy white man make the mistake of public licentiousness widely shared on the Web: the worst sins make for the best redemption stories. Even better, a fall from grace followed by a period of regret and repentance can be turned into a highly remunerative rehabilitation. That, in fact, has been many a traveling preacher’s grift from time immemorial.
I grew up hearing such “testimonies,” personal stories that articulate a life in sin and a coming to Jesus, firsthand. I was raised in the 1980s and 1990s in a family of Southern Baptists who viewed Episcopalians as raging liberals and Catholics, of which we knew precisely two, as an alien species. These were perfectly ordinary sentiments in the rural Alabama town we lived in. My dad was a local lineman for Alabama Power, and my mom worked at my school, first as a janitor and, later, as a lunch lady. Nobody in my family had gone to college.
Besides school and Little League, church was the primary basis of our social existence. As a child and into my early teens, my own religiosity was maybe a tick above average for our community. I went on mission trips to parts of the US that were more economically distressed than my hometown, handed out Chick tracts (named for the publisher and cartoonist Jack Chick) with as much zeal and sincerity as a twelve-year-old could muster, and on one occasion destroyed cassette tapes of my favorite bands (Nirvana, the Dead Kennedys, the Beastie Boys) in a fit of self-righteousness, only to re-buy them weeks later because, well, my faith had its limits.
All the while, I was—to use a word evangelicals like to misapply to any sort of secular education—“indoctrinated” by teachers, family, church staff, ministry organizations, and other members of the community to view everything I encountered in the world through an evangelical lens. If I went to the mall and lost my friends for a few minutes, I briefly suspected everyone had been raptured away except me, a particular brand of eschatological fantasy that we were taught was perpetually in danger of happening. Even my scandalous moments, which, do-goody overachiever that I was, were few and far between, were colored by the church. My first real kiss, at fourteen, was an epic make-out session on a sidewalk during a mission trip to a suburb of Orlando, with an eighteen-year-old assistant youth pastor named Matt.
I was ten or eleven when I was baptized—or in Southern Baptist parlance, “born again”—and part of this process involved constructing my own redemption narrative: I lived in sin and would be saved by Christ. I recently rediscovered my own handwritten testimony on a visit to my mom’s house. In a child’s rounded, looping handwriting, I had confessed that I used to “cheat at games,” something I don’t remember doing at all. The likely explanation for this is that because sin is such an important prerequisite for redemption, my ten-year-old self had to fabricate one to conform to the required convention (never mind that such a falsification would be sinful itself).
by Elizabeth Spiers, NY Review | Read more:
Image: Instagram
Thursday, August 20, 2020
Chart House
An iconic restaurant in Waikiki has closed its doors for good.
Management of Chart House Waikiki said they decided to stop operations, citing coronavirus hardships. It’s unlikely the restaurant will reopen, as many businesses, especially in Waikiki, continue to struggle.
The eatery has served customers for the past 52 years with beautiful views of the small boat harbor and stunning south shore sunsets.
In a simple statement on their website, Joey Cabell and Scott Okamoto said, “At this time we would like to say Mahalo to everyone who has supported us over the past 52 years.”
by HNN Staff, Hawaii News Now | Read more:
Image: Charthouse
[ed. Oh no. I'm grief-stricken. My all-time favorite bar, overlooking the Ala Wai Boat Harbor in Waikiki. So many great memories. It's the only place I make a special point of visiting every time I go back.]
Plastilina Mosh, El Guincho, Odisea
Repost
The American Nursing Home Is a Design Failure
With luck, either you will grow old or you already have. That is my ambition and probably yours, and yet with each year we succeed in surviving, we all face a crescendo of mockery, disdain, and neglect. Ageism is the most paradoxical form of bigotry. Rather than expressing contempt for others, it lashes out at our own futures. It expresses itself in innumerable ways — in the eagerness to sacrifice the elderly on the altar of the economy, in the willingness to keep them confined while everyone else emerges from their shells, and in a popular culture that sees old age (when it sees it at all) as a purgatory of bingo nights. Stephen Colbert turned the notion of a 75-year-old antifa into a comic riff on geriatric terrorists, replete with images of octogenarians innocently locomoting with walkers, stair lifts, and golf carts.
In Sweden, elderly COVID patients were denied hospitalization, and in some cases palliative care edged over into “active euthanasia,” which seems barely distinguishable from execution. The Wall Street Journal quotes a nurse, Latifa Löfvenberg: “People suffocated, it was horrible to watch. One patient asked me what I was giving him when I gave him the morphine injection, and I lied to him. Many died before their time. It was very, very difficult.”
In this country, we have erected a vast apparatus of last-stop living arrangements that, during the pandemic, have proven remarkably successful at killing the very people they were supposed to care for. The disease that has roared through nursing homes is forcing us to look hard at a system we use to store large populations and recognize that, like prisons and segregated schools, it brings us shame.
The job of housing the old sits at the juncture of social services, the medical establishment, the welfare system, and the real-estate business. Those industries have come together to spawn another, geared mostly to affluent planners-ahead. With enough money and foresight, you can outfit your homes for your changing needs, hire staff, or perhaps sell some property to pay for a move into a deluxe assisted-living facility, a cross between a condo and a hotel with room-service doctors. “I don’t think the industry has pushed itself to advocate for the highly frail or the people needing higher levels of care and support,” USC architecture professor Victor Regnier told an interviewer in 2018. “Many providers are happy to settle for mildly impaired individuals that can afford their services.” In other words, if you’re a sick, old person who’s not too old, not too sick, and not too poor, you’re golden. For everyone else, there are nursing homes.
The nursing-home system is an obsolete mess that emerged out of a bureaucratic misconception. In 1946, Congress passed the Hill-Burton Act, which paid to modernize hospitals that agreed to provide free or low-cost care. In 1954, the law was expanded to cover nursing homes, which consolidated the medicalization of senior care. Federal money summoned a wave of new nursing homes, which were built like hospitals, regulated by public-health authorities, and designed to deliver medical care with maximal efficiency and minimal cost. They reflect, reinforce, and perhaps helped produce a society that pathologizes old age.
The government sees its mission as preventing the worst outcomes: controlling waste, preventing elder abuse, and minimizing unnecessary death. Traditional nursing homes, with their medical stations and long corridors, are designed for a constantly changing staff to circulate among residents who, ideally, remain inert, confined to beds that take up most of their assigned square footage. As in hospitals, two people share a room and a mini-bathroom with a toilet and a sink. Social life, dining, activities, and exercise are mostly regimented and take place in common areas, where dozens, even hundreds, of residents can get together and swap deadly germs. The whole apparatus is ideally suited to propagating infectious disease. David Grabowski, a professor of health-care policy at Harvard Medical School, and a team of researchers analyzed the spread of COVID-19 in nursing homes, and concluded that it didn’t matter whether they were well or shoddily managed, or if the population was rich or poor; if the virus was circulating outside the doors, staff almost invariably brought it inside. This wasn’t a bad-apples problem; it was systemic dysfunction.
Even when there is no pandemic to worry about, most of these places have pared existence for the long-lived back to its grim essentials. These are places nobody would choose to die. More important, they are places nobody would choose to live. “People ask me, ‘After COVID, is anyone going to want to go into a nursing home ever again?’ The answer is: Nobody ever wanted to go to one,” Grabowski says. And yet 1.5 million people do, mostly because they have no other choice. “If we’d seen a different way, maybe we’d have a different attitude about them,” Grabowski adds.
The fact that we haven’t represents a colossal failure of imagination — worse, it’s the triumph of indifference. “We baby boomers thought we would die without ever getting old,” says Dan Reingold, the CEO of RiverSpring Health, which runs the Hebrew Home in Riverdale. “We upended every other system — suburbia, education, child-rearing, college campuses — but not long-term care. Now the pandemic is forcing us to take care of the design and delivery of long-term care just as the baby boomers are about to overwhelm the system.”
Most of us fantasize about aging in place: dying in the homes we have lived in for decades, with the occasional assist from friends, family, and good-hearted neighbors. The problem is not just that home care can be viciously expensive, or that stairs, bathtubs, and stoves pose new dangers as their owners age. It’s also that, in most places, living alone is deadly. When a longtime suburbanite loses the ability to drive, a car-dependent neighborhood can turn into a verdant prison, stranding the elderly indoors without access to public transit, shops, or even sidewalks. “Social isolation kills people,” Reingold says. “It’s the equivalent of smoking two packs a day. A colleague said something profound: ‘A lot of people are going to die of COVID who never got the coronavirus.’ ”
It’s not as if the only alternative to staying at home is a soul-sapping institution. Back when people traveled for pleasure, tourists regularly visited the Royal Hospital Chelsea in London, where, since the end of the 17th century, veterans have been able to trade in a military pension for a lifelong berth in a soldiers’ collective on an architecturally exquisite campus, located amid some of the city’s most expensive real estate. Those who are able work tending the grounds, staffing the small museum, and leading tours. When health crises hit, they can move into the care home, which is on the grounds, overlooking immaculate gardens.
The example of an institution so humane that it seems almost wastefully archaic suggests that we don’t need to reinvent the nursing home, only build on humane principles that already succeed.
by Justin Davidson, NY Mag/Intelligencer | Read more:
Image: C.F. Møller
[ed. Personally, I'd prefer an endless supply of good drugs, or something like the euthanasia scene in Soylent Green - Death of Sol (not available on YouTube for some reason).]
Labels:
Architecture,
Business,
Culture,
Design,
Health
Wednesday, August 19, 2020
'One-Shot' Radiotherapy As Good For Breast Cancer As Longer Course
Women with breast cancer who receive one shot of radiotherapy immediately after surgery experience the same benefits as those who have up to 30 doses over three to six weeks, an international medical study has found.
The technique, known as targeted intraoperative radiotherapy, is increasingly being used around the world instead of women having to undergo weeks of painful and debilitating treatment.
Eight out of 10 of the 2,298 participants in the study, women over 45 with early-stage breast cancer who had had surgery to remove a lump of up to 3.5cm, needed no further radiotherapy after having the single dose, researchers on the British-led study found.
The findings are based on results from 32 hospitals in 10 countries including the UK. During the treatment, carried out immediately after a lumpectomy, a ball-shaped device measuring a few centimetres is placed into the area of the breast where the cancer had been and a single dose of radiotherapy is administered. The procedure takes 20 to 30 minutes.
The 80% of patients for whom it works thus avoid going back to hospital between 15 and 30 times over the following weeks to have further sessions of radiotherapy.
by Denis Campbell, The Guardian | Read more:
Image: Rui Vieira/PA

Obama and the Beach House Loopholes
[ed. Magnum P.I.'s old property. Obama P.I.? Just doesn't have the same ring to it.]
A home in the nearby neighborhood of Kailua had served as the winter White House for the Obama family every Christmas, and photographers often captured shots of Obama and Nesbitt strolling on the beach or golfing over the holidays.
The prospective property was located just down the shore in the Native Hawaiian community of Waimanalo. Wedged between the Koʻolau mountains that jut 1,300 feet into the sky and a stunning turquoise ocean, the beachfront estate sprawled across 3 acres, featuring a five-bedroom manse, gatehouse, boat house and tennis courts. Fronting the property was a historic turtle pond that used to feed Hawaiian chiefs. Local families took their children to splash and swim in its calm waters.

But the sellers of the Waimanalo property found a way to ensure the seawall remained in place for another generation. They asked state officials for something called an easement, a real estate tool that allows private property owners to essentially lease the public land that sits under the seawall. The cost: a one-time payment of $61,400. Officials with the state Department of Land and Natural Resources approved the permit, which authorized the wall for another 55 years, and Nesbitt purchased the property.
State officials and community members say the Obamas will be among the future occupants.
The easement paved the way for building permits and allowed developers to exploit other loopholes built into Hawaii’s coastal planning system. Nesbitt went on to win another environmental exemption from local officials and is currently pursuing a third — to expand the seawall. According to building permits, the Obamas’ so-called First Friend is redeveloping the land into a sprawling estate that will include three new single-family homes, two pools and a guard post. The beach fronting the seawall is nearly gone, erased completely at high tide.
Community members are now rallying against the proposed seawall expansion. Some are directing their criticism at Obama, who staked his legacy, in part, on fighting climate change and promoting environmental sustainability.
Obama’s personal office declined to comment, referring inquiries to Nesbitt. And Nesbitt, who declined to be interviewed, would not directly address questions about ownership, only saying that he and his wife bought the land and were “the developers” of the estate.
In written responses to questions, Nesbitt, now chair of the Obama Foundation board and co-CEO of a Chicago-based private-equity firm, said the steps he’s taken to redevelop the property and expand the seawall are “consistent with and informed by the analysis of our consultants, and the laws, regulations and perspectives of the State of Hawaii.” Any damage the structure caused to the Waimanalo beach, he added, occurred decades ago “and is no longer relevant.”
In Hawaii, beaches are a public trust, and the state is constitutionally obligated to preserve and protect them. But across the islands, officials have routinely favored landowners over shorelines, granting exemptions from environmental laws as the state loses its beaches. (...)
Intended to protect homeowners’ existing properties, easements have also helped fuel building along portions of Hawaii’s most treasured coastlines, such as Lanikai on Oahu and west side beaches on Maui. Scores of property owners have renovated homes and condos on the coast while investors have redeveloped waterfront lots into luxury estates. Meanwhile, the seawalls protecting these properties have diminished the shorelines. With nowhere to go, beaches effectively drown as sea levels rise against the walls and waves claw away the sand fronting them, moving it out to sea.
Researchers estimate that roughly a quarter of the beaches on Oahu, Maui and Kauai have already been lost or substantially narrowed because of seawalls over the past century. That has left less coastal habitat for endangered monk seals to haul out and rest and sea turtles to lay eggs. By midcentury, experts predict, the state will be down to just a handful of healthy beaches as climate change causes sea levels to rise at unprecedented rates. (...)
Beaches and open coastlines have always been central to Hawaii’s way of life. For centuries, Native Hawaiians enjoyed access to the ocean’s life-sustaining resources. Natural sand dunes provided protection against strong storms and served as a place for Native Hawaiians to bury their loved ones.
After Hawaii became a state in 1959, development of homes and hotels along the coastlines exploded as investors sought to capitalize on what was becoming some of the most valuable real estate in the country. An environmental review commissioned by the state in the 1970s found that three-quarters of the state’s sandy coastlines were now hugged by private property, curtailing public access to shorelines. Many property owners erected seawalls to try to hold back the ocean.
By the 1990s, scientists were warning that those seawalls were causing significant beach loss on all the Hawaiian islands.
Alarmed by these losses, state officials in 1997 released a roadmap for protecting the state’s beaches. The report emphasized that the seawalls were destroying coastal ecosystems, threatening the state’s tourist-driven economy and limiting the public’s access to beaches and the ocean, a right enshrined in the Hawaii Constitution.
If beaches continue to disappear throughout the state, the report warned, “the fabric of life in Hawaii will change and the daily miracle of living among these islands will lose its luster.”
by Sophie Cocke, ProPublica/Honolulu Star Advertiser | Read more:
Image: Darryl Oumi, special to Honolulu Star-Advertiser
[ed. How many houses do the Obamas own? Let's see, there's that one in Washington D.C., the recent one in Martha's Vineyard, and wasn't there one in Chicago? I can't keep track. Being ex-president can be a pretty lucrative gig if you protect the status quo.]
Get Ready for a Teacher Shortage Like We’ve Never Seen Before
Usually on the first day back to work after summer break, there’s this buzzing, buoyant energy in the air. My school is a small school-within-a-school designated to serve gifted children, so there are only 16 teachers and staff members. We typically meet in a colleague’s tidy classroom, filled with natural light and the earthy smell of coffee.
We hug, remark on one another’s new haircuts. Sure, there’s an element of sadness about not being able to sleep in or pee on our own schedules anymore, but for the most part, we’re eager to get back to doing work that we believe is the most important work in the world.
Coming back this year was different.
It was Thursday, Aug. 6, the same day that the Houston area reported its new single-day high for deaths from Covid-19. Instead of gathering, we all tuned in to a Zoom meeting from our separate classrooms.
There was no buzz in the air, and we weren’t hugging and chatting. We were talking about how long we had: a few weeks of virtual teaching before students returned to our classrooms on Sept. 8. Or maybe sooner. We’ve been told our start date is subject to change at any time.
We asked about short- vs. long-term disability plans on our insurance. We silently worried about a colleague who has an autoimmune disease. We listened as our counselor, who, along with her daughters, tested positive for the coronavirus the week before, shared how they were doing. We tried not to react from inside each of our little Zoom squares as we began to realize there was no way of maintaining true social distancing when school reopened.
“We’re a family,” one of our administrators kept saying while talking about the measures we would need to take to reduce our and our students’ exposure. “We’re a family.”
I know what he meant — that our tight-knit community would get through this year together — but I kept wondering, “Wouldn’t it be safer for our family to stay home?”
I invite you to recall your worst teacher. Mine was my seventh-grade science teacher, whose pedagogical approach consisted of our reading silently from our textbooks. Once, when I asked if I could do a project on Pompeii, she frowned and said: “This is science class. Your project has to be on a real thing.”
She sent a message loud and clear: “I really, really don’t want to be here.”
We are about to see schools in America filled with these kinds of teachers.
Even before Covid-19, teachers were leaving the profession in droves. According to a report by the Economic Policy Institute, the national teacher shortage is looking dire. Every year, fewer and fewer people want to become teachers.
You would think states would panic upon hearing this. You would think they’d take steps to retain quality teachers and create a competitive system that attracts the best, brightest and most passionate to the profession.
That’s not what they do.
They slash the education budget, which forces districts to cut jobs (increasing class size), put off teacher raises and roll back the quality of teachers’ health care. They ignore teachers’ pleas for buildings without black mold creeping out of ceiling tiles, for sensible gun legislation, and for salaries we can live on without having to pick up two to three additional part-time jobs.
So, a lot of good and talented teachers leave. When state leaders realized they couldn’t actually replace these teachers, they started passing legislation lowering the qualifications, ushering underqualified people into classrooms.
This has been happening for years. We’re about to see it get a lot worse.
by Kelly Treleaven, NY Times | Read more:
Image: Olivia Fields
Tuesday, August 18, 2020
Jack Kirby. From a golden age story reprinted in an early ‘70s “Marvel Premiere” comic
[ed. Living in the bubble]
Deceptively Bright, in an Up & Coming Area
Bunker: Building for the End Times
By Bradley Garrett
What is a bunker? The term derives from an Old Swedish word meaning ‘boards used to protect the cargo of a ship’. But if we take it, as we usually do, to mean a defended structure, often underground, intended to shield people and important goods through a period of strife, then it is one of the oldest building types made by humans. In Cappadocia, central Turkey, there are twenty-two subterranean settlements made by Hittite peoples around 1200 BC. As their empire faltered, the Hittites dug into soft hillsides to shelter themselves. As many as twenty thousand people lived at Derinkuyu, the deepest complex.
But the word ‘bunker’ also has the scent of modernity about it. As Bradley Garrett explains in his book, it was a corollary of the rise of air power, as a result of which the battlefield became three-dimensional. With the enemy above and equipped with high explosives, you had to dig down and protect yourself with metres of concrete. Garrett’s previous book, Explore Everything, was a fascinating insider’s look at illicit ‘urban exploration’, and he kicks off Bunker with an account of time spent poking around the Burlington Bunker, which would have been used by the UK government in the event of a nuclear war. The Cold War may have ended, but governments still build bunkers, as Garrett shows: Chinese contractors have recently completed a 23,000-square-metre complex in Djibouti. But these grand, often secret manifestations of official fear are not the main focus of the book. Instead, Garrett is interested in private bunkers and the people who build them, people like Robert Vicino, founder of the Vivos Group, who purchased the Burlington Bunker with the intent of making a worldwide chain of apocalypse retreats.
Garrett calls these people the ‘dread merchants’. Dread differs from fear in that it has no object: it is fear that has not yet found a focus. And if dread is your business, business has never been better, with the sustaining structures of modern life seeming ever more fragile and challenged. The dark charisma of the bunker is probably what will attract readers to this book, but the energetic and gregarious Garrett keeps the story focused on people rather than buildings. Much of the emphasis is on his native USA, where ‘prepping’ – disaster and Armageddon preparedness – has become a significant subculture, though there are also excursions to Australia, where ecological precarity is fuelling the bunker biz, and New Zealand and Thailand, favoured global ‘bug-out’ locations of the elite.
The first wave of private bunker-building followed the Cuban Missile Crisis of 1962, during which the American government made it plain that it had no intention of providing for the shelter of more than the military and political elite. The rest of the population got the message: if the worst happens, you’re on your own. Since then, American society appears to have been locked in a spiral of mistrust. In the 1990s, religiously minded ‘survivalist’ movements sought to divorce themselves from what they saw as an increasingly controlling federal state by forming autonomous fortified communities. Alarmed at these splinter groups walling themselves up and stockpiling weapons, the government reacted with overwhelming force, resulting in multiple deaths at Ruby Ridge and at the Branch Davidian compound in Waco, Texas. This bloodshed did nothing but confirm survivalists’ worst fears.
After the 9/11 attacks, survivalism entered the mainstream, giving birth to the modern prepper movement. As bunker salesman Gary Lynch tells Garrett, 9/11 was good for business on two fronts, as some Americans began to fear further terrorist attacks while others became alarmed by the prospect of increasing domestic authoritarianism. (...)
Buried, seemingly secure, as much a target for robbers as protection against them, the bunker shares many characteristics with the tomb. Both structures mediate with a kind of afterlife: the tomb ferries the dead to the hereafter, while the bunker is designed to deliver the still-living through a period of calamity to a safer future. Hope and survival are, in theory, uplifting themes, but Bunker is, in some ways, rather depressing. The people who want bunkers have, in one form or another, given up on society, taking a dim view of its prospects and seeing it as a thin veneer of order laid over Hobbesian chaos. The salespeople naturally promote this view: ‘dread merchants’ is the right phrase for them, since dread is really the product they’re selling.
by Will Wiles, Literary Review | Read more:
Image: via
Labels:
Culture,
Government,
Journalism,
Psychology,
Security
Love Letter To A Vanishing World
1.
I have several times been within spitting distance: to the Philippines—as far south as Panay; to the court cities of central Java and to the highlands of Sulawesi, in Indonesia. I’ve spent many happy days in Peninsular Malaysia. I have lived in Tokyo, Hong Kong, and Kaohsiung. But as they say, “Close, but no cigar!”

A cruel young woman, I vetoed Borneo – and dragged him off to Kashmir instead. And to make matters worse, a year later, Gavin Young came out with his highly acclaimed book, In Search of Conrad, in which he does just what my boyfriend had wanted to do: follow Conrad to that famed trading post up “an Eastern river.”
2.
Recently, I re-read Eric Hansen’s travel classic, Stranger in the Forest. The book came out in the mid-80s. This was about ten years before I vetoed our trip to Borneo. It was also a time before the Internet and GPS. To prepare for his trip, Hansen had to go to a university library and read books, flip through journals, and consult maps—and to his great delight, he discovered there were still uncharted areas. And these were the very spots he wanted to see! Beginning his journey on the Malaysian side of Borneo, in Kuching, he traveled upriver on the Rajang (every bit as legendary as the Mahakam), and made his way inland toward the Highlands, where the indigenous Dayak peoples lived.
Did I mention he was mainly going on foot?
His trip occurred just a few years before Bruno Manser’s legendary ramble across Borneo. You’ve heard the expression “Fact is stranger than fiction”? Well, that expression could have been invented for the life story of Swiss environmentalist Bruno Manser. Arriving in Borneo in the mid-80s, he was, within a year, living with one of the most elusive tribes in the highlands, the Penan. Carl Hoffman (who wrote the best seller Savage Harvest) has just come out with a double biography called The Last Wild Men of Borneo, about Bruno Manser and American tribal art dealer Michael Palmieri. The cover of the book has a photograph of Manser, and I did not realize the man in it was white until I was nearly finished reading. Dressed in a loincloth, carrying a quiver of poison arrows and a blowpipe, his hair cut in the Dayak fashion, he squats on a rock near the river’s edge. It is a touching photograph of a man who gave his life to fight for the rights of the indigenous peoples of the highlands.

Even as early as 1980, logging was already a huge issue. In Japan, especially, environmentalists rightly bemoaned the destruction being caused by the timber industry—so much of that wood was being imported into Japan (the majority now goes to China). Logging was pushing the indigenous Dayak peoples of the highlands into greater and greater peril as the land they considered to be theirs was destroyed. Water was contaminated and animals were dying in great numbers. Manser realized that a people who had lived harmoniously in the interior of the island for thousands of years were now in grave danger of being pushed out – all in the name of corporate greed.
And so he fought valiantly to bring their plight to the attention of the world—including climbing a 30-foot-tall London lamppost outside the media center covering the 1991 G7 Summit and unfurling a banner about Dayak rights, and then, the following year, paragliding into a crowded stadium during the Earth Summit in Rio de Janeiro. In 1992, after meeting Manser, then-Senator Al Gore introduced a resolution in the Senate calling upon the government of Malaysia to protect the rights of the indigenous peoples and upon Japan to look into its logging companies’ practices. By the mid-90s, Manser had become a serious headache for the huge logging industry in Malaysia and an embarrassment to the government. Manser was to disappear in 2000 and was officially pronounced dead in 2005 (though his body was never found).
3.
It is a tragic story, with the only possible silver lining being that at least Manser was not around to see what happened next, when the palm oil industry came to town. I had begun wondering how much of the Borneo my boyfriend dreamt of was left. So I picked up The Wasting of Borneo, by Alex Shoumatoff (2017), and quickly realized the situation was far worse than I had imagined. A staff writer for the New Yorker, Shoumatoff has been a contributing editor at Vanity Fair and Condé Nast Traveler, among others. A travel writer and environmentalist, he has been to Borneo several times. In this latest book, he begins his Borneo journey with a visit to Birute Galdikas at her Orangutan Care Center near Tanjung Puting National Park in Central Kalimantan.
Have you heard of Leakey’s Angels?
by Leanne Ogasawara, 3 Quarks Daily | Read more:
Images: uncredited
Monday, August 17, 2020
La Caravana del Diablo
La Caravana del Diablo: a migrant caravan in Mexico (The Guardian)
Image: Ada Trillo
[ed. Photo essay.]
Labels:
Economics,
Government,
history,
Politics,
Security
The Fully Industrialized Modern Chicken
A century ago, Americans would not recognise our modern hunger for chicken. The year-long market for tender but relatively bland chicken meat is a newish phenomenon, and without it the idea of chicken cutlets, $5 rotisseries, or the McNugget would be a fantasy.
How did America go from thinking of chicken as an “alternative” meat to consuming it more than any other meat?
The story starts with corn.
How American corn fueled a taste for chicken
At the turn of the 20th century, chicken was almost always eaten in the spring. The priority for chicken raisers at the time was egg production, so after the eggs hatched, all the male birds would be fed up and then quickly harvested as “spring chickens” – young, tender birds that were sold whole for roasting or broiling (hence the term “broilers”). Outside the spring rush, you might be buying a bigger, fatter fryer or an old hen for stewing.
“Farmers were sending chickens of all sorts of ages, different feather colours, and tremendous variety to the marketplace in the early 20th century,” says Roger Horowitz, food historian and author of Putting Meat on the American Table. But almost all chickens in the market were simply surplus to egg production, making them relatively uncommon – even rare. Tender spring chickens in particular could fetch a good price. But it is worth noting, Horowitz says, that the higher price wasn’t necessarily coming from pent-up demand.
“It’s not as if consumers were clamoring for broilers,” he says. Though there was some consumer demand for chickens, the relatively high price for broilers likely had more to do with the limited, seasonal supply than a passion for poultry.
During the second world war, however, red meat was rationed, and a national campaign encouraged the consumption of poultry and fish to save “meat” (beef, pork and lamb) for “the army and our allies”. Eating chicken became more common, but the preference for young broilers, and white breast meat, persisted.
As the war drew to a close, feed millers, which buy and grind corn and other grains to feed livestock, saw a big opportunity to spur that demand for meat chickens, which consume large amounts of corn. When traditional banks refused to finance new-fangled “chicken farms”, the feed companies themselves offered farmers loans to buy feed and equipment, putting the pieces of the modern contract poultry system in place.
Consumer acceptance of broilers out of season was not automatic. In the 1930s, the average American ate 10lbs (4.5kg) or less of chicken annually; by 2017 that had risen to 64lbs (29kg), according to the Economic Research Service at the United States Department of Agriculture (USDA). For decades chicken battled to be seen as a “meat”, and did not surpass its most expensive competitor, beef, in terms of overall consumption until 2010. A strong USDA-funded marketing campaign helped out.
“In the 50s and 60s, you see where these agricultural extension operations start pushing out recipes very aggressively about broilers,” Horowitz says, and as feed companies and hatcheries (most of which would eventually become so-called “integrators”, which own several of the businesses involved in chicken production) continued to consolidate the industry, they were able to more carefully calibrate the chicken itself to what would sell most profitably, focusing on lowering costs and raising proportions of the highest-demand cuts, namely breast meat.
Don Tyson, the late president of Tyson Foods, famously said: “If breast meat is worth two dollars a pound and dark meat is worth one dollar, which would I rather have?” But for generations, the idea of buying just the most coveted cuts of chicken was foreign to most consumers. It wasn’t until the 1980s that preferences began to switch to cuts of meat over the whole bird.
These companies owned and understood their chickens from egg to table and were able to exert unprecedented control over the biology of their flocks. Now, not only are they able to fine tune the birds’ characteristics with incredible accuracy, they can also map interactions with feed, environment, and processing to maximise profits.
For integrators and corn farmers alike, the investment paid off. In 2019, 9.2 billion 6lb (2.7kg) broiler chickens were harvested in the US, consuming about 1.8lbs (820g) of grain for every pound of chicken.
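For a sense of scale, here is a quick back-of-the-envelope calculation in TypeScript. The inputs (9.2 billion birds, roughly 6lb of meat per bird, about 1.8lb of grain per pound of chicken) are the figures quoted above; the derived totals are my own arithmetic, not numbers from the article.

```typescript
// Back-of-the-envelope totals for the 2019 US broiler figures quoted above.
// Inputs come from the article; the derived totals are my own arithmetic.
const broilersHarvested = 9.2e9; // birds harvested in the US in 2019
const avgWeightLb = 6;           // approximate pounds of chicken per bird
const grainPerLbChicken = 1.8;   // pounds of grain per pound of chicken

const totalChickenLb = broilersHarvested * avgWeightLb;  // ≈ 55 billion lb of chicken
const totalGrainLb = totalChickenLb * grainPerLbChicken; // ≈ 99 billion lb of grain

console.log(`Chicken produced: ${(totalChickenLb / 1e9).toFixed(1)} billion lb`);
console.log(`Grain consumed:   ${(totalGrainLb / 1e9).toFixed(1)} billion lb`);
```

On those assumptions, US broilers alone consumed on the order of 100 billion pounds of grain in 2019 – which is exactly why the feed industry had such a stake in growing the appetite for chicken.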
But the impact on chickens from the changes in production is troubling.
The modern industrial chicken
Over the past 70 years, the poultry industry has measured its success in terms of how many pounds of meat a chicken can produce for a given amount of feed. Modern chickens are more efficient than ever, with producers able to calculate to the ounce how much “input” of food, water, air and time are required to get a set amount of white and dark meat.
The modern chicken is fully industrialised.
With more than 500 chicken breeds existing on Earth, it might surprise you to learn that every nugget, breast, and cup of chicken noodle soup you’ve ever eaten likely came from one breed, a specialised cross between a Cornish and a white rock.
by Sarah Mock, The Guardian | Read more:
Image: Glowimages/Getty
What's Up With the USPS
Donald Trump has never hidden his intention to destroy the United States Postal Service (USPS) as we know it. The administration released plans openly declaring that its long-term aim was to privatize the USPS, enriching private investors by handing them a valuable public asset. Now, Trump’s postmaster general, Louis DeJoy, is under fire for internal changes that are hindering the USPS’s ability to deliver mail efficiently, and Trump himself has implied that he is reluctant to fund the USPS due to his longstanding opposition to mail-in voting.
DeJoy is a prototypical “political crony” appointee, a Republican party donor who never worked for the postal service and has a financial interest in private delivery competitors to the USPS. The Intercept discovered that when DeJoy was in the private sector, he had a long history of overseeing labor violations. DeJoy has admitted that his changes to the USPS have caused delays to service, though he insists they have been unintentional. Trump has targeted the USPS for years, threatening to jack up prices and treating it as in need of an overhaul, one that DeJoy is now ruthlessly implementing.
The postal service has long been a target for Republicans, in part because a successful USPS is a threat to Republican ideology. After all, the conservative argument is that efficient public services are essentially impossible, that most government functions should be handed over to the private sector. A popular socialized mail service threatens to severely undercut this case. After all, if people are satisfied with the government delivering their mail, they might turn out to be satisfied with the government providing their health insurance. It could be a slippery slope toward socialism. A number of other countries have privatized their postal services.
Trump did not actually start the war on the USPS. Barack Obama actually pushed austerity measures, including a plan to eliminate Saturday delivery and cut the service’s budget. Obama’s former Office of Management and Budget director, Peter Orszag, endorsed full privatization. The ideology that government should be “lean” and “run like a business”, and that the private sector is inherently superior to the public sector, is a bipartisan delusion.
The postal service’s infamous financial woes are not actually hard to fix. While Trump tries to suggest it is all a result of inefficiency and mismanagement, we know that it mostly boils down to an absurdly unnecessary requirement imposed on the USPS which required it to put away billions of dollars each year for future retirement benefits. It would be easy to get the USPS shipshape again, but it would require a commitment to building an excellent public service, one that Obama didn’t really show and Trump certainly doesn’t have.
We should also remember, though, that talk of the USPS “losing money” is inherently a bit misleading and strange. Public services do not “lose money”, because they’re not designed to make money. If people said that the public library, or the school system, or the fire department was “losing money”, it would be very strange. Of course they are: they don’t take in revenue, because their purpose is to give everyone a free service, paid for out of government funds. It’s not like that money just goes into a pit or is frittered away. The money pays for a service that we then all get to enjoy. So even though we should point out that the USPS’s financial distress is in an important way politically manufactured, we should also be careful about embracing the logic that a government agency needs to “break even”. That’s not what the government is for. (...)
A very clever Republican tactic is to mismanage the government, and then point to government mismanagement as a case for privatization. (Hence hobbling the USPS with absurd budgetary requirements and then pointing out holes in its budget.) To counter that, it’s very important to make the general public aware of whose fault the problem is. If people see their mail delayed, and become frustrated, they need to understand that it’s Trump, not their local letter carrier, who is at fault. Trump is going to try to turn the agency into the villain of the story, because the USPS’s popularity is one of the reasons it has been relatively safe.
by Nathan J. Robinson, The Guardian | Read more:
Image: Rob Latour/Rex/Shutterstock
[ed. When I first moved to Washington state a few years ago and got to vote by mail I wondered, why haven't we been doing this forever? It's so simple and easy. You get a ballot in the mail along with a detailed brochure providing pro and con arguments by advocates on either side of the issues, fill in your votes, sign it, and send it off (no postage necessary), or drop it off at libraries, postal and county offices, etc. Easy peasy. Sure beats standing in long lines after work. See also:
Almost every citizen is at least inconvenienced. I’ve been corresponding throughout the day with readers from around the country who have gotten mail delivery half of the days this week, who are waiting for overdue prescriptions, waiting on packages that are two weeks overdue, waiting on Social Security checks that are their sole source of income. For many, life-saving prescriptions are delayed or lost. Critical medical tests are being invalidated because they spend too long in the mail. Businesses already battered by COVID are imperiled because shipments are late. These all apply to citizens from the far right to the far left.
The Post Office isn’t some newfangled federal responsibility. It is one of very few federal responsibilities and agencies of government explicitly referenced in the federal constitution.
President Trump is far from the first corrupt American President. But it is genuinely hard to think of a case in almost a quarter millennium of US history in which a chief executive has inconvenienced, damaged and imperiled so many citizens so directly for the sole purpose of corruptly maintaining power in defiance of the constitutional order. There’s really nothing comparable.
Make Him Own It. (Talking Points Memo).]
Saturday, August 15, 2020
The Unraveling of America
Never in our lives have we experienced such a global phenomenon. For the first time in the history of the world, all of humanity, informed by the unprecedented reach of digital technology, has come together, focused on the same existential threat, consumed by the same fears and uncertainties, eagerly anticipating the same, as yet unrealized, promises of medical science.
In a single season, civilization has been brought low by a microscopic parasite 10,000 times smaller than a grain of salt. COVID-19 attacks our physical bodies, but also the cultural foundations of our lives, the toolbox of community and connectivity that is for the human what claws and teeth represent to the tiger.
Our interventions to date have largely focused on mitigating the rate of spread, flattening the curve of morbidity. There is no treatment at hand, and no certainty of a vaccine on the near horizon. The fastest vaccine ever developed was for mumps. It took four years. COVID-19 killed 100,000 Americans in four months. There is some evidence that natural infection may not imply immunity, leaving some to question how effective a vaccine will be, even assuming one can be found. And it must be safe. If the global population is to be immunized, lethal complications in just one person in a thousand would imply the death of millions.
Pandemics and plagues have a way of shifting the course of history, and not always in a manner immediately evident to the survivors. In the 14th Century, the Black Death killed close to half of Europe’s population. A scarcity of labor led to increased wages. Rising expectations culminated in the Peasants Revolt of 1381, an inflection point that marked the beginning of the end of the feudal order that had dominated medieval Europe for a thousand years.
The COVID pandemic will be remembered as such a moment in history, a seminal event whose significance will unfold only in the wake of the crisis. It will mark this era much as the 1914 assassination of Archduke Ferdinand, the stock market crash of 1929, and the 1933 ascent of Adolf Hitler became fundamental benchmarks of the last century, all harbingers of greater and more consequential outcomes.
COVID’s historic significance lies not in what it implies for our daily lives. Change, after all, is the one constant when it comes to culture. All peoples in all places at all times are always dancing with new possibilities for life. As companies eliminate or downsize central offices, employees work from home, restaurants close, shopping malls shutter, streaming brings entertainment and sporting events into the home, and airline travel becomes ever more problematic and miserable, people will adapt, as we’ve always done. Fluidity of memory and a capacity to forget is perhaps the most haunting trait of our species. As history confirms, it allows us to come to terms with any degree of social, moral, or environmental degradation.
To be sure, financial uncertainty will cast a long shadow. Hovering over the global economy for some time will be the sober realization that all the money in the hands of all the nations on Earth will never be enough to offset the losses sustained when an entire world ceases to function, with workers and businesses everywhere facing a choice between economic and biological survival. (...)
In the wake of the war, with Europe and Japan in ashes, the United States with but 6 percent of the world’s population accounted for half of the global economy, including the production of 93 percent of all automobiles. Such economic dominance birthed a vibrant middle class, a trade union movement that allowed a single breadwinner with limited education to own a home and a car, support a family, and send his kids to good schools. It was not by any means a perfect world but affluence allowed for a truce between capital and labor, a reciprocity of opportunity in a time of rapid growth and declining income inequality, marked by high tax rates for the wealthy, who were by no means the only beneficiaries of a golden age of American capitalism.
But freedom and affluence came with a price. The United States, virtually a demilitarized nation on the eve of the Second World War, never stood down in the wake of victory. To this day, American troops are deployed in 150 countries. Since the 1970s, China has not once gone to war; the U.S. has not spent a day at peace. President Jimmy Carter recently noted that in its 242-year history, America has enjoyed only 16 years of peace, making it, as he wrote, “the most warlike nation in the history of the world.” Since 2001, the U.S. has spent over $6 trillion on military operations and war, money that might have been invested in the infrastructure of home. China, meanwhile, built its nation, pouring more cement every three years than America did in the entire 20th century.
As America policed the world, the violence came home. On D-Day, June 6th, 1944, the Allied death toll was 4,414; in 2019, domestic gun violence had killed that many American men and women by the end of April. By June of that year, guns in the hands of ordinary Americans had caused more casualties than the Allies suffered in Normandy in the first month of a campaign that consumed the military strength of five nations.
More than any other country, the United States in the post-war era lionized the individual at the expense of community and family. It was the sociological equivalent of splitting the atom. What was gained in terms of mobility and personal freedom came at the expense of common purpose. In wide swaths of America, the family as an institution lost its grounding. By the 1960s, 40 percent of marriages were ending in divorce. Only six percent of American homes had grandparents living beneath the same roof as grandchildren; elders were abandoned to retirement homes.
For the first time, the international community felt compelled to send disaster relief to Washington. For more than two centuries, reported the Irish Times, “the United States has stirred a very wide range of feelings in the rest of the world: love and hatred, fear and hope, envy and contempt, awe and anger. But there is one emotion that has never been directed towards the U.S. until now: pity.” As American doctors and nurses eagerly awaited emergency airlifts of basic supplies from China, the hinge of history opened to the Asian century.
by Wade Davis, Rolling Stone | Read more:
Image: Gary Hershorn/Getty Images
[ed. Let's note again: "President Jimmy Carter recently noted that in its 242-year history, America has enjoyed only 16 years of peace, making it, as he wrote, 'the most warlike nation in the history of the world.' Since 2001, the U.S. has spent over $6 trillion on military operations and war, money that might have been invested in the infrastructure of home. China, meanwhile, built its nation, pouring more cement every three years than America did in the entire 20th century."
See also:
When you’re in a pessimistic cast of mind, which developments of the COVID era do you find yourself dwelling on, and what grim scenarios do you imagine them portending? And then, when you’re in better spirits, what causes for optimism do you see?
I do think that the American project, the American experiment, is on the rack right now. We don’t know how things are going to go in the next 90 days. We really need to know whether this electoral process will go smoothly and whether it will deliver what it is supposed to, which is a decisive vote of the American public that confirms somebody to the presidency and thereby demonstrates the capacity of this place to govern itself.
And there is a very distinct possibility that that won’t happen. Or that the decision will fall in favor of the candidate and party that has demonstrated its incapacity to govern — and has in fact demonstrated its capacity to drive this country to ever-greater degrees of ungovernability. I never thought I would live under curfew. I’ve lived under curfew now in New York. It was insane. It made me indignant and outraged, and I didn’t think I would ever experience that.
The counterpart to the American election, globally, is obviously Hong Kong. They, too, have elections. And the brutality Beijing is capable of is shocking. For all of my advocacy for détente — in fact, because of my advocacy for détente — I’m haunted by memories of the 1930s and 1940s and the naïveté of many people who advocated for collective security and Popular Front collaboration with the Soviet Union, all for very good reasons that I would have certainly endorsed. We have to reckon with what we now know about the violence of which the Soviet Union was capable. And we have to reckon with what the Chinese Communist regime is capable of too. So those are the two advanced economy problems that are most on my mind.
I recently had the chance to be involved in conversations with a bunch of colleagues in South Africa. If COVID were to become yet another devastating shock to the developmental possibilities of sub-Saharan Africa, in terms of the humanitarian crisis, that has the makings of a truly catastrophic drama. Already, the economic and social news out of South Africa is biblically bad. They started the year with a 30 percent unemployment rate. They think they will have a 50 percent unemployment rate in the townships by the end of the year. Coming of age when I did, the end of apartheid and the advent of multiracial democracy in South Africa stood out as one of the great triumphs of humanity. And if South Africa becomes a basket case, then this is a disaster of traumatic proportions.
But the good news is … (?)
Oh, right. Hopeful signs. Well, let me try. At the risk of sounding trite, I actually do still marvel at the lockdown. And this actually goes back to our earlier discussion — to the question of the extent to which history is determined by the capitalist pursuit of profit. I’m enough of an economic historian to think that it’s a hugely important variable. But there was something really extraordinary that happened in March, in which nearly the entire world — individually and collectively — made this decision to shut down the economy to preserve human life. Politicians and businesses and citizens and trade unions — the whole mass of collective actors — made this decision. The vast majority of humanity was subject to it.
And it may have been a catastrophic mistake. I don’t think we can rule that possibility out. We can’t run it again. We don’t know what the consequences would have been. We’ve ended up with what we’ve ended up with. But part of what we ended up with was this collective decision — and as costly and painful as it was, there’s something truly spectacular about that moment.
And then, of course, all hell breaks loose. Inequalities make themselves dramatically felt. We can’t hold it together. It’s a shitshow. None of that struck me as surprising. But March was a different story.
How Will the Covid-19 Pandemic Change World History? (NY Mag/Intelligencer).]
Labels:
Critical Thought,
Government,
Health,
history,
Politics
Office Noise Simulators
During the first few days of quarantine, many displaced office workers likely enjoyed the peace and quiet of working from home. Now enough time has passed for them to miss the typing, chatter, and other background noises they would have complained about less than two months ago. If you're feeling nostalgic for the bustle of your workplace, this website, designed by Reichenbergerstr 121, can keep you company.
This tool, spotted by Lifehacker, simulates the ordinary, sometimes distracting noises of office life. When you visit imisstheoffice.eu and press the play button in the bottom left corner, a track of soft typing and muffled conversations fills your speakers. To adjust the number of colleagues sharing your space, toggle the tool in the bottom right corner.
Clicking the objects animating the page will add more sounds to the mix. A scanner, a water cooler, and a ping pong table are just a few of the office noise-makers you can activate to make your home feel less empty (or maybe remind you that working in silence isn't that bad).
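For the curious, here is a minimal sketch of how this kind of ambient mixer could be wired up in the browser with TypeScript and HTMLAudioElement. The file names and volume levels are my own assumptions for illustration, not the site's actual implementation.

```typescript
// A minimal sketch of an ambient "office noise" mixer: each sound source is a
// looping audio track that can be faded in or out when its icon is clicked.
// File names and volumes are illustrative assumptions, not imisstheoffice.eu's code.
const sources = ["typing.mp3", "chatter.mp3", "water-cooler.mp3", "ping-pong.mp3"];

const tracks = sources.map((file) => {
  const audio = new Audio(file); // HTMLAudioElement
  audio.loop = true;             // keep the ambience running continuously
  audio.volume = 0;              // start silent until toggled on
  return audio;
});

// Toggle one "office object" on or off, e.g. from a click handler on its icon.
function toggleSource(index: number, on: boolean): void {
  const track = tracks[index];
  track.volume = on ? 0.5 : 0;
  if (on) {
    void track.play(); // play() returns a Promise; browsers require a prior user gesture
  } else {
    track.pause();
  }
}

// Example: the site's main play button could simply switch the base "chatter" track on.
toggleSource(1, true);
```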
People used to working outside an office before quarantine may be missing other sounds right now, like those of public spaces. This tool recreates the ambient noises of cafés around the globe.
by Michele Debczak, Lifehacker | Read more:
Image: Oli Scarff

Last Decade Was Earth's Hottest on Record, Exposing Grim Reality of Climate Change
A new report released Wednesday details how 2019 was another year of extremes for Earth's climate, adding to a litany of evidence exposing the grim reality of our warming world.
Last year saw devastating wildfires burn through Australia; large regions including Europe, Japan, Pakistan, and India experienced deadly heat waves; almost 100 tropical cyclones created havoc; glaciers and sea ice continued to melt at worrying levels; and drought and floods destroyed vital crops and infrastructure.
Among the key findings of the State of the Climate in 2019, published by the American Meteorological Society, were that 2019 was among the warmest years on record, that greenhouse gases in the Earth's atmosphere are at their highest recorded levels, and that the past decade was the hottest since records began in the mid-1800s.
"Each decade since 1980 has been successively warmer than the preceding decade, with the most recent (2010-2019) being around 0.2°C warmer than the previous (2000-2009)," the report said. "As a primary driver for our changing climate, the abundance of many long-lived greenhouse gases continues to increase."
The study also reported other key findings:
- The six warmest years on record have all occurred in the past six years, since 2014.
- 2019 was among the three hottest years since records began in the mid-1800s. Only 2016, and for some datasets 2015, were warmer than 2019.
- Average sea surface temperatures in 2019 were the second highest on record, surpassed only by 2016.
- Sea levels rose to a new record high for the eighth consecutive year.
- Surface air temperatures for the Arctic were the second highest in 120 years of records, trailing only 2016. In the Antarctic, 2019 was the second warmest year for the continent since 1979.
- Glaciers continued to melt at a concerning rate for the 32nd straight year.
Global carbon dioxide concentrations, which represent the bulk of the gases' warming power, rose during 2019 to a record 409.8 parts per million, the study found. That was the "highest in the modern 61-year measurement record as well as the highest ever measured in ice core records dating back as far as 800,000 years," the report said.
The report was led by the National Oceanic and Atmospheric Administration's National Centers for Environmental Information and was based on contributions from more than 520 scientists from 60 countries. The annual report is often described by meteorologists as the "annual physical of the climate system."
Robert Dunn, one of the report's lead editors from the UK Met Office, said in a statement that, "The view for 2019 is that climate indicators and observations show that the global climate is continuing to change rapidly."
"A number of extreme events, such as wildfires, heatwaves and droughts, have at least part of their root linked to the rise in global temperature. And of course the rise in global temperature is linked to another climate indicator: the ongoing rise in emissions of greenhouse gases, notably carbon-dioxide, nitrous oxide and methane," Dunn said.
by Helen Regan, CNN | Read more:
Image: NOAA NCEI Climate