Monday, August 24, 2020

Zephy


George Booth
(photo: markk)
[ed. R.I.P.]

Why Every City Feels the Same Now

Some time ago, I woke up in a hotel room unable to determine where I was in the world. The room was like any other these days, with its neutral bedding, uncomfortable bouclé lounge chair, and wood-veneer accent wall—tasteful, but purgatorial. The eerie uniformity extended well beyond the interior design too: The building itself felt like it could’ve been located in any number of metropolises across the globe. From the window, I saw only the signs of ubiquitous brands, such as Subway, Starbucks, and McDonald’s. I thought about phoning down to reception to get my bearings, but it felt too much like the beginning of an episode of The Twilight Zone. I travel a lot, so it was not the first or the last time that I would wake up in a state of placelessness or the accompanying feeling of déjà vu.

The anthropologist Marc Augé gave the name non-place to the escalating homogeneity of urban spaces. In non-places, history, identity, and human relation are not on offer. Non-places used to be relegated to the fringes of cities in retail parks or airports, or contained inside shopping malls. But they have spread. Everywhere looks like everywhere else and, as a result, anywhere feels like nowhere in particular.

The opposite of placelessness is place, and all that it implies—the resonances of history, folklore, and environment; the qualities that make a location deep, layered, and idiosyncratic. Humans are storytelling creatures. If a place has been inhabited for long enough, the stories will already be present, even if hidden. We need to uncover and resurface them, to excavate the meanings behind street names, to unearth figures lost to obscurity, and to rediscover architecture that has long since vanished. A return to vernacular architecture—the built environment of the people, tailored by and for local culture and conditions—is overdue. It can combat the placelessness that empires and corporations have imposed. (...)

Commercial builders also emulate architecture that conveys a desirable image. At the turn of the 20th century, the administrators and businessmen of Meiji Japan commissioned Western architects to modernize their country, adopting the structures of supposed Western progress. So did the sultan of Zanzibar, whose House of Wonders has European characteristics, along with a front entrance large enough to ride his elephant through.

It was only a matter of time before corporations began to construct their own hegemonic visions of urban life. In 1928, an American town sailed up a tributary of the Amazon. It came in pieces, to be assembled into shingled houses with lawns and picket fences, a Main Street, a dance hall, a cinema, and a golf course. Henry Ford was the visionary behind the development; his aim: to control the rubber industry via exported Americanism. He named it Fordlândia.

The settlement failed dramatically. The jungle was unforgiving, and the settlers were unprepared for malarial fevers and snake attacks. Cement and iron were unsuited to the humidity. Blight spread through the rubber plantation, which had been cultivated too intensively. Ford’s promises of free health care and fair wages were undermined by puritanical surveillance, cruelty, and incompetence. Eventually, the workers rioted. As a utopia, Fordlândia was probably doomed from the start, given its founding in neocolonial arrogance. But despite its failure almost a century ago, Fordlândia successfully predicted the future of cities: utter sameness, exported globally.

In the decades that followed, corporate architecture of the sort outside my hotel room adopted designs that expressed corporate power. It became slick and monolithic. Ruthlessly rational, it exudes aloofness—its denizens exist high above the streets in glass-and-steel boxes that maximize the expensive floor space. The earliest of these structures were inspired by Ludwig Mies van der Rohe’s 1958 Seagram Building, which set the archetype until the 1980s. The New Formalists tried to temper this model with humanizing, historical touches—the tall, pseudo-gothic arches with which Minoru Yamasaki embellished the World Trade Center, for instance—but even then, the style often harked back to earlier symbols of dominating power, much as Greco-Roman classicism had done.

Eventually, aware of appearing cold and remote, corporate architecture underwent an image change. Its buildings now resemble its brands: cooler, cuter, greener, more knowing and ironic. The doughnut-shaped mothership of Apple Park or the biodome spheres of Amazon’s Seattle campus offer examples.

But these structures might be worse than the indifferent, modernist monoliths they replaced. At least the glass towers made clear that their occupants didn’t care about you, or maybe anyone. Now headquarters buildings express the hypocrisy of corporate gentility. Apple Park, with its circular form and large central garden, telegraphs connection and collaboration. But its real message is power: It is one of the most valuable corporate headquarters in the world, echoing the Pentagon in size and ambition. Shaped like a spaceship, it also suggests to the local community, which grants Apple huge tax breaks, that the company could take off and relocate anywhere in the world, whenever it wants. (...)

Vernacular is an umbrella term architects and planners use to describe local styles. Vernacular architecture arises when the people indigenous to a particular place use materials they find there to build structures that become symptomatic of and attuned to that particular environment. Augé called it “relational, historical and concerned with identity.” It aims for harmonious interaction with the environment, rather than setting itself apart from it. (...)

Creativity often works according to a dialectic process. Frank Lloyd Wright sought to “break the box” of Western architecture by shifting geometries, letting the outside in, and designing architecture within a natural setting, as he did with Fallingwater, one of his most famous designs. Wright was inspired by a love of the Japanese woodblock prints of Hiroshige and Hokusai—an influence he would later repay by training Japanese architects such as Nobuko and Kameki Tsuchiura, who reinterpreted European modernist design in Japan. The goal is not to replace glass skyscrapers with thatch huts, but to see vernacular as the future, like Wright did, rather than abandoning it to the past.

by Darran Anderson, The Atlantic | Read more:
Image: Justin Sullivan / Getty
[ed. I've been thinking about this a lot lately as so many cultural institutions die left and right, victims of pandemic economics.]

Defunding the Police: Seattle's Stumbling Blocks

Just last month, Seattle was on the verge of taking one of the most radical steps toward large-scale police reform of any city in the US.

In the wake of the police killing of George Floyd in Minneapolis in May, and widespread police brutality and anti-racist protests, a veto-proof majority of council members voiced their support for defunding the police, slashing 50% of the department’s budget.

But since then, they’ve faced a series of logistical roadblocks and clashed with other city leaders, and ultimately all but one of them have walked back their statements.

The council instead voted for a much smaller round of cuts, including reducing the salaries of Seattle’s police chief, Carmen Best, and members of her command staff, and trimming about 100 of the department’s 1,400 police officers.

Mere hours after the vote, Best, the first African American leader of the department, who had held the position for only two years, announced her retirement.

“The idea of letting, after we worked so incredibly hard to make sure that our department was diverse, that reflects the community that we serve, to just turn that all on a dime and hack it off without having a plan in place to move forward, it’s highly distressful to me,” she said during a news conference last week. “It goes against my principles and my conviction and I really couldn’t do it.” (...)

Stephen Page, associate professor at the University of Washington’s Evans School of Public Policy and Governance, told the Guardian that what appears to be missing in Seattle, Minneapolis and New York is leaders transforming police reform from a rallying cry to a precise plan.

“None of those discussions in any of those cities at this point seem to be taking seriously these questions of what, exactly, are we doing if we’re not funding the police and how are we going to do it,” he said. (...)

In Seattle, one of the key challenges during this process has been collaboration. While city council members have said they’ve tried to work with the police chief and mayor during the defunding process, at last week’s news conference the mayor, Jenny Durkan, characterized the last few weeks as an “absolute breakdown of collaboration and civil dialogue”. (...)

The council president M Lorena González, council member Teresa Mosqueda and council member Tammy J Morales said in a statement that they were sorry to see Best go, and again stressed the importance of city leaders working together during the law enforcement reform process. But they also made it clear that this has in no way deterred their efforts.

“The council will remain focused on the need to begin the process of transforming community safety in our city,” the statement said. “This historic opportunity to transition the SPD from reform to transformation will continue.”

Isaac Joy, an organizer with King County Equity Now, one of the coalitions that has pushed to defund the department by 50%, said there is potentially a silver lining to Best’s departure: it presents an opportunity to find someone to lead the department who can be a “thought partner on listening and responding to the community’s demands, to divest from our police force, demilitarize our police force and start reinvesting and making Seattle a city that everyone can thrive in”.

He also stressed that the new leader should be Black.

Joy explained that’s because of the “police history, specifically as it relates to the enslaved, the Black population and that being the root of the police force. And so, in order to rectify and address that root, you do need Black leadership, you just, along with Black leadership, you need the support of the department, the support of the mayor, the support of the council.”

Earlier this month, the coalition released a blueprint for cutting the police budget and reinvesting that money into such groups as those developing alternatives to policing and providing housing for people in need.

On Monday, the council unanimously approved a resolution that includes a variety of goals for 2021, including creating a civilian-led department of community safety and violence prevention, and moving the city’s 911 dispatch out of the police department.

by Hallie Golden, The Guardian |  Read more:
Image: Karen Ducey/Getty Images
[ed. This once beloved city has gone absolutely nuts. Forcing your first black female police chief into retirement is not, as they say, a good look. Good luck to Carmen Best, she tried her best.] 

Saturday, August 22, 2020


Jesse Mockrin, Syrinx
via:

Basketball Was Filmed Before a Live Studio Audience

I knew there were way more important things than basketball, and I was all for canceling everything back in March: in-person school, sports, plays, concerts, conferences, just shut it down. But, in order to hold this line, I had to force myself to stop thinking all the damn time about the interruption of the Milwaukee Bucks’ magical season, the second consecutive MVP season of Giannis Antetokounmpo, their certain progress toward their first NBA Finals in decades. This was supposed to be Milwaukee’s summer, with a long playoff run for the basketball team followed by the Democratic National Convention in the same new arena built for just such moments. Months later, when it was official that the NBA season would resume at Disney World, encased in a quarantined bubble, tears formed in my eyes. From mid-March until the beginning of summer, I watched no live TV. The news was too awful, and sports were all reruns. Since late July, I’ve been watching the Bucks again, and like everything else in America, it’s been strange.

As sports, the competitions from the NBA bubble, like the football (soccer), baseball, and ice hockey games I’ve watched, are more or less the same. But as television shows, as a variety of broadcast media, and as an aesthetic experience made up of images and sounds, the NBA games so far have been a departure from the usual, and nothing feels right. It’s been a bit like another newly familiar experience: getting takeout from a restaurant where you previously would dine in. The food might taste like you remember it, but the sensory and social environment of the meal makes you realize how much context matters. (...)

The NBA bubble games have had a particularly sitcommy feel. The courts at Disney’s Wide World of Sports are walled in on three sides by tall video displays, obscuring whatever seats or walls are beyond the court except for rare glimpses when the director cuts to a camera behind the scorer’s table for a referee’s call. The images almost always stay on one side of the action facing these displays, and unlike the usual games from the before times, there are no camera operators on the court itself under the basket. The visual array is reminiscent of the kind of three-wall sets that television comedies adopted from the stage, with their proscenium effect of positioning the viewer across an invisible fourth wall. In a typical American sitcom, you hear but do not see an audience. Many are recorded with a live audience in the studio, and sometimes begin with a voice-over telling you as much (“Cheers was filmed before a live studio audience”). The combination of the three-wall set and audience audio makes the television comedy much more like theater than many kinds of television (the difference between this aesthetic and the “single-camera” comedy style of shows like The Office often prompts comparisons of the latter to cinema).

The sitcom “laugh track” is an old convention. It has sometimes been held up as the epitome of commercial television’s basically fraudulent nature. In the absence of a live audience, or when the audience isn’t demonstrative in the way the producers would like, the sound track of a comedy can be massaged by sweetening the recording or adding canned laughter. This isn’t that different from an older tradition in live performance of the claque, the audience members hired to applaud. But in any event, the sounds of the audience recreate for the viewer at home a sense of participation in a live event among members of a community who experience the show together. This is true for sports just as much as it is for scripted comedy or late-night variety shows. The audible audience for televised sports is always manipulated to be an accompaniment that suggests the space of a live event. A sports stadium or arena is a big television studio in the first place, a stage for the cameras with a raucous in-person audience. Your ticket gets you into the show as an extra. The sensory pandemonium of the live event is never really captured on TV, the blaring music and sound effects are kept low in the mix to keep the booth broadcasters’ voices loud and centered, and no one shoots a T-shirt cannon in your direction when you’re watching at home. But the crowd is essential to the visual and auditory qualities of sports, and the missing elements in these games from Florida have been a present absence. (...)

The video displays are part of what makes each game have a “home team,” as the imagery conveys the identity of one of the two competitors with the text, colors, and advertisements you would find in their home arena. The displays, expansive like digital billboards, also show images of the home team’s fans, which is a nice touch in theory. But the way this works in practice is bizarre. The low-res webcam images of the individual faces are abstracted against backgrounds that look like arena seats, and these are arrayed in a grid to create a large rectangle of spectators. The images are presumably live, but they could be out of sync for all we know as the fans seldom react to anything in the moment, have no way of feeding off one another, and are not audible. The arena has set up a grade of rows that recede away from the court, and some fans are more visible than others as they are courtside or behind the bench or scorer’s table. The close proximity of fans, separated by no barrier from the stars, is one of the thrills of watching live basketball. These virtual fans are by contrast one big upright surface of blurry, laggy heads, and they are reminiscent of the Hollywood Squares of meeting attendees now all too familiar from Zoom’s gallery view. Like many elements of live television of the past few months, these visuals of the NBA’s bubble games are the optics of a pandemic that has turned our lives inside out. (...)

These bubble games remind us, minute by minute, what life is like now. They afford us the dreamworld of a space where you can safely breathe heavily, unmasked, indoors with nine other players and three refs on the same basketball court. But they also televise this newly risky world of facemasks and six feet, of conversations mediated by plexiglass and video screens. I have felt for the NBA players whose season was abruptly arrested as it was getting good, but now I also envy the careful setup that their filthy rich sports league can afford, while my cash-strapped public university takes its chances and opens its dorms and classrooms without such a luxury of frequent testing and exceptional security.

by Michael Z. Newman, LARB | Read more:
Image: CNN

Friday, August 21, 2020

Jerry Falwell Jr. and the Evangelical Redemption Story

Two weeks ago, Jerry Falwell Jr., the president of Liberty University, the largest evangelical college in America, posted an Instagram photo of himself on a yacht with his arm around a young woman whose midriff was bare and whose pants were unzipped. This would have been remarkable by itself, but it was all the more so because Falwell’s midriff was also bare and his pants also unzipped. In his hand, Falwell held a plastic cup of what he described winkingly in his caption as “black water.”

The aesthetics of the photo would be familiar to anyone who’s ever been to a frat party, but they were jarringly out of place for the son of Moral Majority cofounder Jerry Falwell Sr. and a professional evangelical Christian whose public rhetoric is built on a scaffolding of sexual conservatism and an antagonism to physical pleasure more generally.

The backdrop of a yacht represents an entirely different hypocrisy, arguably a more egregious one: the embrace of materialism and the open accumulation of enormous wealth. Falwell, who has a net worth estimated to be more than $100 million, is not formally a “prosperity gospel” adherent, but he has nonetheless jettisoned those inconvenient parts of Christian theology that preach the virtues of living modestly and using wealth to help the less fortunate.

But for his public, the problem with the photo was the optics of carnal sin—the attractive young woman who was not his wife, the recreational drinking, the unzipped pants—none of which would be acceptable at Liberty University, where coed dancing is penalized with a demerit. In the moral hierarchy of white evangelical Christianity, carnal sin is the worst, and this thinking drives the social conservatism that allows evangelicals to justify persecuting LGBTQ people, opposing sexual education in schools, distorting the very real problem of sex trafficking to punish sex workers, restricting access to abortion, eliminating contraception from employer-provided healthcare, and prosecuting culture wars against everything from medical marijuana to pop music. Evangelicalism’s official morality treats all pleasure as inherently suspect, the more so when those pleasures might belong to women or people of color.

Fortunately for Falwell, evangelicalism has built-in insurance for reputational damage, should a wealthy white man make the mistake of public licentiousness widely shared on the Web: the worst sins make for the best redemption stories. Even better, a fall from grace followed by a period of regret and repentance can be turned into a highly remunerative rehabilitation. That, in fact, has been many a traveling preacher’s grift from time immemorial.

I grew up hearing such “testimonies,” personal stories that articulate a life in sin and a coming to Jesus, firsthand. I was raised in the 1980s and 1990s in a family of Southern Baptists who viewed Episcopalians as raging liberals and Catholics, of whom we knew precisely two, as an alien species. These were perfectly ordinary sentiments in the rural Alabama town we lived in. My dad was a local lineman for Alabama Power, and my mom worked at my school, first as a janitor and, later, as a lunch lady. Nobody in my family had gone to college.

Besides school and Little League, church was the primary basis of our social existence. As a child and into my early teens, my own religiosity was maybe a tick above average for our community. I went on mission trips to parts of the US that were more economically distressed than my hometown, handed out Chick tracts (named for the publisher and cartoonist Jack Chick) with as much zeal and sincerity as a twelve-year-old could muster, and on one occasion destroyed cassette tapes of my favorite bands (Nirvana, the Dead Kennedys, the Beastie Boys) in a fit of self-righteousness, only to re-buy them weeks later because, well, my faith had its limits.

All the while, I was—to use a word evangelicals like to misapply to any sort of secular education—“indoctrinated” by teachers, family, church staff, ministry organizations, and other members of the community to view everything I encountered in the world through an evangelical lens. If I went to the mall and lost my friends for a few minutes, I briefly suspected everyone had been raptured away except me, a particular brand of eschatological fantasy that we were taught was perpetually in danger of happening. Even my scandalous moments, which, do-goody overachiever that I was, were few and far between, were colored by the church. My first real kiss, at fourteen, was an epic make-out session on a sidewalk during a mission trip to a suburb of Orlando, with an eighteen-year-old assistant youth pastor named Matt.

I was ten or eleven when I was baptized—or in Southern Baptist parlance, “born again”—and part of this process involved constructing my own redemption narrative: I lived in sin and would be saved by Christ. I recently rediscovered my own handwritten testimony on a visit to my mom’s house. In a child’s rounded, looping handwriting, I had confessed that I used to “cheat at games,” something I don’t remember doing at all. The likely explanation for this is that because sin is such an important prerequisite for redemption, my ten-year-old self had to fabricate one to conform to the required convention (never mind that such a falsification would be sinful itself).

by Elizabeth Spiers, NY Review | Read more:
Image: Instagram

Thursday, August 20, 2020

Chart House


An iconic restaurant in Waikiki has closed its doors for good.

Management of Chart House Waikiki said they decided to stop operations, citing coronavirus hardships. It’s unlikely they will reopen, as many businesses, especially in Waikiki, continue to struggle.

The eatery had served customers for the past 52 years, with beautiful views of the small boat harbor and stunning south shore sunsets.

In a simple statement on their website, Joey Cabell and Scott Okamoto said, “At this time we would like to say Mahalo to everyone who has supported us over the past 52 years.”

by HNN Staff, Hawaii News Now |  Read more:
Image: Charthouse
[ed. Oh no. I'm grief-stricken. My all-time favorite bar, overlooking the Ala Wai Boat Harbor in Waikiki. So many great memories. It's the only place I make a special point of visiting every time I go back.]

Akira Kurosawa - Composing Movement


[ed. See also: The Highs and Lows of High and Low (Dissolve).]
Repost

Plastilina Mosh, El Guincho, Odisea


Repost

The American Nursing Home Is a Design Failure

With luck, either you will grow old or you already have. That is my ambition and probably yours, and yet with each year we succeed in surviving, we all face a crescendo of mockery, disdain, and neglect. Ageism is the most paradoxical form of bigotry. Rather than expressing contempt for others, it lashes out at our own futures. It expresses itself in innumerable ways — in the eagerness to sacrifice the elderly on the altar of the economy, in the willingness to keep them confined while everyone else emerges from their shells, and in a popular culture that sees old age (when it sees it at all) as a purgatory of bingo nights. Stephen Colbert turned the notion of a 75-year-old antifa into a comic riff on geriatric terrorists, replete with images of octogenarians innocently locomoting with walkers, stair lifts, and golf carts.

In Sweden, elderly COVID patients were denied hospitalization, and in some cases palliative care edged over into “active euthanasia,” which seems barely distinguishable from execution. The Wall Street Journal quotes a nurse, Latifa Löfvenberg: “People suffocated, it was horrible to watch. One patient asked me what I was giving him when I gave him the morphine injection, and I lied to him. Many died before their time. It was very, very difficult.”

In this country, we have erected a vast apparatus of last-stop living arrangements that, during the pandemic, have proven remarkably successful at killing the very people they were supposed to care for. The disease that has roared through nursing homes is forcing us to look hard at a system we use to store large populations and recognize that, like prisons and segregated schools, it brings us shame.

The job of housing the old sits at the juncture of social services, the medical establishment, the welfare system, and the real-estate business. Those industries have come together to spawn another, geared mostly to affluent planners-ahead. With enough money and foresight, you can outfit your homes for your changing needs, hire staff, or perhaps sell some property to pay for a move into a deluxe assisted-living facility, a cross between a condo and a hotel with room-service doctors. “I don’t think the industry has pushed itself to advocate for the highly frail or the people needing higher levels of care and support,” USC architecture professor Victor Regnier told an interviewer in 2018. “Many providers are happy to settle for mildly impaired individuals that can afford their services.” In other words, if you’re a sick, old person who’s not too old, not too sick, and not too poor, you’re golden. For everyone else, there are nursing homes.

The nursing-home system is an obsolete mess that emerged out of a bureaucratic misconception. In 1946, Congress passed the Hill-Burton Act, which paid to modernize hospitals that agreed to provide free or low-cost care. In 1954, the law was expanded to cover nursing homes, which consolidated the medicalization of senior care. Federal money summoned a wave of new nursing homes, which were built like hospitals, regulated by public-health authorities, and designed to deliver medical care with maximal efficiency and minimal cost. They reflect, reinforce, and perhaps helped produce a society that pathologizes old age.

The government sees its mission as preventing the worst outcomes: controlling waste, preventing elder abuse, and minimizing unnecessary death. Traditional nursing homes, with their medical stations and long corridors, are designed for a constantly changing staff to circulate among residents who, ideally, remain inert, confined to beds that take up most of their assigned square footage. As in hospitals, two people share a room and a mini-bathroom with a toilet and a sink. Social life, dining, activities, and exercise are mostly regimented and take place in common areas, where dozens, even hundreds, of residents can get together and swap deadly germs. The whole apparatus is ideally suited to propagating infectious disease. David Grabowski, a professor of health-care policy at Harvard Medical School, and a team of researchers analyzed the spread of COVID-19 in nursing homes, and concluded that it didn’t matter whether they were well or shoddily managed, or if the population was rich or poor; if the virus was circulating outside the doors, staff almost invariably brought it inside. This wasn’t a bad-apples problem; it was systemic dysfunction.

Even when there is no pandemic to worry about, most of these places have pared existence for the long-lived back to its grim essentials. These are places nobody would choose to die. More important, they are places nobody would choose to live. “People ask me, ‘After COVID, is anyone going to want to go into a nursing home ever again?’ The answer is: Nobody ever wanted to go to one,” Grabowski says. And yet 1.5 million people do, mostly because they have no other choice. “If we’d seen a different way, maybe we’d have a different attitude about them,” Grabowski adds.

The fact that we haven’t represents a colossal failure of imagination — worse, it’s the triumph of indifference. “We baby boomers thought we would die without ever getting old,” says Dan Reingold, the CEO of RiverSpring Health, which runs the Hebrew Home in Riverdale. “We upended every other system — suburbia, education, child-rearing, college campuses — but not long-term care. Now the pandemic is forcing us to take care of the design and delivery of long-term care just as the baby boomers are about to overwhelm the system.”

Most of us fantasize about aging in place: dying in the homes we have lived in for decades, with the occasional assist from friends, family, and good-hearted neighbors. The problem is not just that home care can be viciously expensive, or that stairs, bathtubs, and stoves pose new dangers as their owners age. It’s also that, in most places, living alone is deadly. When a longtime suburbanite loses the ability to drive, a car-dependent neighborhood can turn into a verdant prison, stranding the elderly indoors without access to public transit, shops, or even sidewalks. “Social isolation kills people,” Reingold says. “It’s the equivalent of smoking two packs a day. A colleague said something profound: ‘A lot of people are going to die of COVID who never got the coronavirus.’ ”

It’s not as if the only alternative to staying at home is a soul-sapping institution. Back when people traveled for pleasure, tourists regularly visited the Royal Hospital Chelsea in London, where, since the end of the 17th century, veterans have been able to trade in a military pension for a lifelong berth in a soldiers’ collective on an architecturally exquisite campus, located amid some of the city’s most expensive real estate. Those who can work tend the grounds, staff the small museum, and lead tours. When health crises hit, they can move into the care home, which is on the grounds, overlooking immaculate gardens.

The example of an institution so humane that it seems almost wastefully archaic suggests that we don’t need to reinvent the nursing home, only build on humane principles that already succeed.

by Justin Davidson, NY Mag/Intelligencer |  Read more:
Image: C.F. Møller
[ed. Personally, I'd prefer an endless supply of good drugs, or something like the euthanasia scene in Soylent Green - Death of Sol (not available on YouTube for some reason).]

Wednesday, August 19, 2020

'One-Shot' Radiotherapy As Good For Breast Cancer As Longer Course

Women with breast cancer who receive one shot of radiotherapy immediately after surgery experience the same benefits as those who have up to 30 doses over three to six weeks, an international medical study has found.

The technique, known as targeted intraoperative radiotherapy, is increasingly being used around the world instead of women having to undergo weeks of painful and debilitating treatment.

Eight out of 10 of the 2,298 participants in the study, women over 45 with early-stage breast cancer who had had surgery to remove a lump of up to 3.5cm, needed no further radiotherapy after having the single dose, researchers on the British-led study found.

The findings are based on results from 32 hospitals in 10 countries including the UK. During the treatment, carried out immediately after a lumpectomy, a ball-shaped device measuring a few centimetres is placed into the area of the breast where the cancer had been and a single dose of radiotherapy is administered. The procedure takes 20 to 30 minutes.

The 80% of patients for whom it works thus avoid going back to hospital between 15 and 30 times over the following weeks to have further sessions of radiotherapy.

by Denis Campbell, The Guardian | Read more:
Image: Rui Vieira/PA

Obama and the Beach House Loopholes

[ed. Magnum P.I.'s old property. Obama P.I.? Just doesn't have the same ring to it.]

As Barack Obama entered the home stretch of his presidency, his close friend Marty Nesbitt was scouting an oceanfront property on Oahu, the Hawaiian island where the two regularly vacationed together with their families.

A home in the nearby neighborhood of Kailua had served as the winter White House for the Obama family every Christmas, and photographers often captured shots of Obama and Nesbitt strolling on the beach or golfing over the holidays.

The prospective property was located just down the shore in the Native Hawaiian community of Waimanalo. Wedged between the Koʻolau mountains that jut 1,300 feet into the sky and a stunning turquoise ocean, the beachfront estate sprawled across 3 acres, featuring a five-bedroom manse, gatehouse, boat house and tennis courts. Fronting the property was a historic turtle pond that used to feed Hawaiian chiefs. Local families took their children to splash and swim in its calm waters.

The property had one major problem though: a century-old seawall. While the concrete structure had long protected the estate from the sea, it now stood at odds with modern laws designed to preserve Hawaii’s natural coastlines. Scientists and environmental experts say seawalls are the primary cause of beach loss throughout the state. Such structures interrupt the natural flow of the ocean, preventing beaches from migrating inland.

But the sellers of the Waimanalo property found a way to ensure the seawall remained in place for another generation. They asked state officials for something called an easement, a real estate tool that allows private property owners to essentially lease the public land that sits under the seawall. The cost: a one-time payment of $61,400. Officials with the state Department of Land and Natural Resources approved the permit, which authorized the wall for another 55 years, and Nesbitt purchased the property.

State officials and community members say the Obamas will be among the future occupants.

The easement paved the way for building permits and allowed developers to exploit other loopholes built into Hawaii’s coastal planning system. Nesbitt went on to win another environmental exemption from local officials and is currently pursuing a third — to expand the seawall. According to building permits, the Obamas’ so-called First Friend is redeveloping the land into a sprawling estate that will include three new single-family homes, two pools and a guard post. The beach fronting the seawall is nearly gone, erased completely at high tide.

Community members are now rallying against the proposed seawall expansion. Some are directing their criticism at Obama, who staked his legacy, in part, on fighting climate change and promoting environmental sustainability.

Obama’s personal office declined to comment, referring inquiries to Nesbitt. And Nesbitt, who declined to be interviewed, would not directly address questions about ownership, only saying that he and his wife bought the land and were “the developers” of the estate.

In written responses to questions, Nesbitt, now chair of the Obama Foundation board and co-CEO of a Chicago-based private-equity firm, said the steps he’s taken to redevelop the property and expand the seawall are “consistent with and informed by the analysis of our consultants, and the laws, regulations and perspectives of the State of Hawaii.” Any damage the structure caused to the Waimanalo beach, he added, occurred decades ago “and is no longer relevant.”

In Hawaii, beaches are a public trust, and the state is constitutionally obligated to preserve and protect them. But across the islands, officials have routinely favored landowners over shorelines, granting exemptions from environmental laws as the state loses its beaches. (...)

Intended to protect homeowners’ existing properties, easements have also helped fuel building along portions of Hawaii’s most treasured coastlines, such as Lanikai on Oahu and west side beaches on Maui. Scores of property owners have renovated homes and condos on the coast while investors have redeveloped waterfront lots into luxury estates. Meanwhile, the seawalls protecting these properties have diminished the shorelines. With nowhere to go, beaches effectively drown as sea levels rise against the walls and waves claw away the sand fronting them, moving it out to sea.

Researchers estimate that roughly a quarter of the beaches on Oahu, Maui and Kauai have already been lost or substantially narrowed because of seawalls over the past century. That has left less coastal habitat for endangered monk seals to haul up and rest and sea turtles to lay eggs. By midcentury, experts predict, the state will be down to just a handful of healthy beaches as climate change causes sea levels to rise at unprecedented rates. (...)

Beaches and open coastlines have always been central to Hawaii’s way of life. For centuries, Native Hawaiians enjoyed access to the ocean’s life-sustaining resources. Natural sand dunes provided protection against strong storms and served as a place for Native Hawaiians to bury their loved ones.

After Hawaii became a state in 1959, development of homes and hotels along the coastlines exploded as investors sought to capitalize on what was becoming some of the most valuable real estate in the country. An environmental review commissioned by the state in the 1970s found that three-quarters of the state’s sandy coastlines were now hugged by private property, curtailing public access to shorelines. Many property owners erected seawalls to try to hold back the ocean.

By the 1990s, scientists were warning that those seawalls were causing significant beach loss on all the Hawaiian islands.

Alarmed by these losses, state officials in 1997 released a roadmap for protecting the state’s beaches. The report emphasized that the seawalls were destroying coastal ecosystems, threatening the state’s tourist-driven economy and limiting the public’s access to beaches and the ocean, a right enshrined in the Hawaii Constitution.

If beaches continue to disappear throughout the state, the report warned, “the fabric of life in Hawaii will change and the daily miracle of living among these islands will lose its luster.”

by Sophie Cocke, ProPublica/Honolulu Star Advertiser | Read more:
Image: Darryl Oumi, special to Honolulu Star-Advertiser
[ed. How many houses do the Obamas own? Let's see, there's that one in Washington D.C., the recent one in Martha's Vineyard, and wasn't there one in Chicago? I can't keep track. Being ex-president can be a pretty lucrative gig if you protect the status quo.]

Get Ready for a Teacher Shortage Like We’ve Never Seen Before

Usually on the first day back to work after summer break, there’s this buzzing, buoyant energy in the air. My school is a small school-within-a-school designated to serve gifted children, so there are only 16 teachers and staff members. We typically meet in a colleague’s tidy classroom, filled with natural light and the earthy smell of coffee.

We hug, remark on one another’s new haircuts. Sure, there’s an element of sadness about not being able to sleep in or pee on our own schedules anymore, but for the most part, we’re eager to get back to doing work that we believe is the most important work in the world.

Coming back this year was different.

It was Thursday, Aug. 6, the same day that the Houston area reported its new single-day high for deaths from Covid-19. Instead of gathering, we all tuned in to a Zoom meeting from our separate classrooms.

There was no buzz in the air, and we weren’t hugging and chatting. We were talking about how long we had: a few weeks of virtual teaching before students returned to our classrooms on Sept. 8. Or maybe sooner. We’ve been told our start date is subject to change at any time.

We asked about short- vs. long-term disability plans on our insurance. We silently worried about a colleague who has an autoimmune disease. We listened as our counselor, who, along with her daughters, tested positive for the coronavirus the week before, shared how they were doing. We tried not to react from inside each of our little Zoom squares as we began to realize there was no way of maintaining true social distancing when school reopened.

“We’re a family,” one of our administrators kept saying while talking about the measures we would need to take to reduce our and our students’ exposure. “We’re a family.”

I know what he meant — that our tight-knit community would get through this year together — but I kept wondering, “Wouldn’t it be safer for our family to stay home?”

I invite you to recall your worst teacher. Mine was my seventh-grade science teacher, whose pedagogical approach consisted of our reading silently from our textbooks. Once, when I asked if I could do a project on Pompeii, she frowned and said: “This is science class. Your project has to be on a real thing.”

She sent a message loud and clear: “I really, really don’t want to be here.”

We are about to see schools in America filled with these kinds of teachers.

Even before Covid-19, teachers were leaving the profession in droves. According to a report by the Economic Policy Institute, the national teacher shortage is looking dire. Every year, fewer and fewer people want to become teachers.

You would think states would panic upon hearing this. You would think they’d take steps to retain quality teachers and create a competitive system that attracts the best, brightest and most passionate to the profession.

That’s not what they do.

They slash the education budget, which forces districts to cut jobs (increasing class size), put off teacher raises and roll back the quality of teachers’ health care. They ignore teachers’ pleas for buildings without black mold creeping out of ceiling tiles, for sensible gun legislation, and for salaries we can live on without having to pick up two to three additional part-time jobs.

So, a lot of good and talented teachers leave. When state leaders realized they couldn’t actually replace these teachers, they started passing legislation lowering the qualifications, ushering underqualified people into classrooms.

This has been happening for years. We’re about to see it get a lot worse.

by Kelly Treleaven, NY Times | Read more:
Image: Olivia Fields

Takahashi, Hiroaki (Shotei). Mice, Radish, and Carrot, 1926

Tuesday, August 18, 2020


Jack Kirby. From a golden age story reprinted in an early ‘70s “Marvel Premiere” comic
[ed. Living in the bubble]

Deceptively Bright, in an Up & Coming Area

Bunker: Building for the End Times
By Bradley Garrett

What is a bunker? The term derives from an Old Swedish word meaning ‘boards used to protect the cargo of a ship’. But if we take it, as we usually do, to mean a defended structure, often underground, intended to shield people and important goods through a period of strife, then it is one of the oldest building types made by humans. In Cappadocia, central Turkey, there are twenty-two subterranean settlements made by Hittite peoples around 1200 BC. As their empire faltered, the Hittites dug into soft hillsides to shelter themselves. As many as twenty thousand people lived at Derinkuyu, the deepest complex.

But the word ‘bunker’ also has the scent of modernity about it. As Bradley Garrett explains in his book, it was a corollary of the rise of air power, as a result of which the battlefield became three-dimensional. With the enemy above and equipped with high explosives, you had to dig down and protect yourself with metres of concrete. Garrett’s previous book, Explore Everything, was a fascinating insider’s look at illicit ‘urban exploration’, and he kicks off Bunker with an account of time spent poking around the Burlington Bunker, which would have been used by the UK government in the event of a nuclear war. The Cold War may have ended, but governments still build bunkers, as Garrett shows: Chinese contractors have recently completed a 23,000-square-metre complex in Djibouti. But these grand, often secret manifestations of official fear are not the main focus of the book. Instead, Garrett is interested in private bunkers and the people who build them, people like Robert Vicino, founder of the Vivos Group, who purchased the Burlington Bunker with the intent of making a worldwide chain of apocalypse retreats.

Garrett calls these people the ‘dread merchants’. Dread differs from fear in that it has no object: it is fear that has not yet found a focus. And if dread is your business, business has never been better, with the sustaining structures of modern life seeming ever more fragile and challenged. The dark charisma of the bunker is probably what will attract readers to this book, but the energetic and gregarious Garrett keeps the story focused on people rather than buildings. Much of the emphasis is on his native USA, where ‘prepping’ – disaster and Armageddon preparedness – has become a significant subculture, though there are also excursions to Australia, where ecological precarity is fuelling the bunker biz, and New Zealand and Thailand, favoured global ‘bug-out’ locations of the elite.

The first wave of private bunker-building followed the Cuban Missile Crisis of 1962, during which the American government made it plain that it had no intention of providing for the shelter of more than the military and political elite. The rest of the population got the message: if the worst happens, you’re on your own. Since then, American society appears to have been locked in a spiral of mistrust. In the 1990s, religiously minded ‘survivalist’ movements sought to divorce themselves from what they saw as an increasingly controlling federal state by forming autonomous fortified communities. Alarmed at these splinter groups walling themselves up and stockpiling weapons, the government reacted with overwhelming force, resulting in multiple deaths at Ruby Ridge and at the Branch Davidian compound in Waco, Texas. This bloodshed did nothing but confirm survivalists’ worst fears.

After the 9/11 attacks, survivalism entered the mainstream, giving birth to the modern prepper movement. As bunker salesman Gary Lynch tells Garrett, 9/11 was good for business on two fronts, as some Americans began to fear further terrorist attacks while others became alarmed by the prospect of increasing domestic authoritarianism. (...)

Buried, seemingly secure, as much a target for robbers as protection against them, the bunker shares many characteristics with the tomb. Both structures mediate with a kind of afterlife: the tomb ferries the dead to the hereafter, while the bunker is designed to deliver the still-living through a period of calamity to a safer future. Hope and survival are, in theory, uplifting themes, but Bunker is, in some ways, rather depressing. The people who want bunkers have, in one form or another, given up on society, taking a dim view of its prospects and seeing it as a thin veneer of order laid over Hobbesian chaos. The salespeople naturally promote this view: ‘dread merchants’ is the right phrase for them, since dread is really the product they’re selling.

by Will Wiles, Literary Review |  Read more:
Image: via

Love Letter To A Vanishing World

1.

Of all the places I’ve never been, Borneo is my favorite.

I have several times been within spitting distance: to the Philippines—as far south as Panay; to the court cities of central Java and to the highlands of Sulawesi, in Indonesia. I’ve spent many happy days on Peninsular Malaysia. Have lived in Tokyo, Hong Kong, and Kaohsiung~~~But as they say, “Close, but no cigar!”

My college boyfriend was a great fan of Joseph Conrad. He wanted to follow in the great man’s footsteps. He planned it all out. We’d go up the Mahakam River. “More than a river, it’s like a huge muddy snake,” his eyes danced with excitement, “Slithering through the dense forest.” We talked about Borneo endlessly. He promised I would see Borneo’s great hornbills, wearing their bright orange helmets– with bills to match. And primates: maybe we would see a gibbon in the tangle of thick foliage –or an orangutan. There would be noisy parrots in the trees and huge butterflies with indigo wings like peacock feathers, fluttering figments of our imagination. He told me that nothing would make him happier than to see the forests of Borneo.

A cruel young woman, I vetoed Borneo –and dragged him off to Kashmir instead. And to make matters worse, a year later, Gavin Young came out with his highly acclaimed book, In Search of Conrad, in which he does just what my boyfriend had wanted to do: follow Conrad to that famed trading post up “an Eastern river.”

2.

Recently, I re-read Eric Hansen’s travel classic, Stranger in the Forest. The book came out in the mid-80s. This was about ten years before I vetoed our trip to Borneo. It was also a time before the Internet and GPS. To prepare for his trip, Hansen had to go to a university library and read books, flip through journals, and consult maps—and to his great delight, he discovered there were still uncharted areas. And these were the very spots he wanted to see! Beginning his journey on the Malaysian side of Borneo, in Kuching, he traveled upriver on the Rajang (every bit as legendary as the Mahakam), and made his way inland toward the Highlands, where the indigenous Dayak peoples lived.

Did I mention he was mainly going on foot?

His trip occurred just a few years before Bruno Manser’s legendary ramble across Borneo. You’ve heard the expression “Fact is stranger than fiction”? Well, that expression could have been invented for the life story of the Swiss environmentalist Bruno Manser. Arriving in Borneo in the mid-80s, he was within a year living with one of the most elusive tribes in the highlands, the Penan. Carl Hoffman (who wrote the best seller Savage Harvest) has just come out with a double biography called The Last Wild Men of Borneo, about Bruno Manser and the American tribal art dealer Michael Palmieri. The cover of the book has a photograph of Manser, and I did not realize he was a white man until I was nearly finished reading. He is shown squatting on a rock near the river’s edge, dressed in a loincloth and carrying a blowpipe and a quiver of poison arrows, his hair cut in the Dayak fashion. It is a touching photograph of a man who gave his life to fight for the rights of the indigenous peoples of the highlands.

For those who want to see what walking in the forest is actually like, they can take a look at the fourth documentary, “Dream Wanderers of Borneo,” in Lorne and Lawrence Blair’s Ring of Fire films. The films came out in 1988 and the book in 2003, based on the brothers’ several-month-long journey in the early 80s to find and stay with the nomadic Punan Dayaks. The brothers were themselves following in the footsteps of the Victorian naturalist Alfred Russel Wallace, who in his Malay Archipelago had discovered over 2,000 species of insects in the region. This is well shown in the brothers’ video, as the insects are relentless. One of the brothers wonders if that is not the reason why Borneo was left alone for so long, for who could tolerate the bugs? At one point Lorne becomes temporarily blind for a half hour when something stings the back of his neck.

Even as early as 1980, logging was already a huge issue. In Japan, especially, environmentalists rightly bemoaned the destruction being caused by the timber industry—so much of that wood being imported into Japan (the majority is now imported into China). Logging was pushing the indigenous Dayak peoples of the highlands into greater and greater peril as the land they considered to be theirs was being destroyed. Water was contaminated and animals were dying in great numbers. Manser realized that a people who had lived harmoniously in the interior of the island for thousands of years were now in grave danger of being pushed out–all in the name of corporate greed.

And so he fought valiantly to bring their plight to the attention of the world—including climbing up a 30-foot-tall London lamppost outside the media center covering the 1991 G7 Summit and unfurling a banner about Dayak rights, and then, the following year, paragliding into a crowded stadium during the Earth Summit in Rio de Janeiro. In 1992, after meeting Manser, then-Senator Al Gore introduced a resolution in the Senate calling upon the government of Malaysia to protect the rights of the indigenous peoples and for Japan to look into its logging companies’ practices. By the mid-90s, Manser had become a serious headache to the huge logging industry in Malaysia and an embarrassment to the government. Manser was to disappear in 2000 and was officially pronounced dead in 2005 (though his body was never found).

3.

It is a tragic story, with the only possible silver lining being that at least Manser was not around to see what happened next, when the palm oil industry came to town. I had begun wondering how much of that Borneo my boyfriend dreamt of was left. So, I picked up The Wasting of Borneo, by Alex Shoumatoff (2017), and quickly realized the situation was far worse than I was imagining. A staff writer for the New Yorker, Shoumatoff has been a contributing editor at Vanity Fair and Condé Nast Traveler, among others. A travel writer and environmentalist, he has been to Borneo several times. In this latest book, he begins his Borneo journey with a visit to Birute Galdikas at her Orangutan Care Center near the Tanjung Puting National Park in Central Kalimantan.

Have you heard of Leakey’s Angels?

by Leanne Ogasawara, 3 Quarks Daily | Read more:
Images: uncredited

Monday, August 17, 2020

La Caravana del Diablo


La Caravana del Diablo: a migrant caravan in Mexico (The Guardian)
Image: Ada Trillo
[ed. Photo essay.]

The Fully Industrialized Modern Chicken

A century ago, Americans would not recognise our modern hunger for chicken. The year-long market for tender but relatively bland chicken meat is a newish phenomenon, and without it the idea of chicken cutlets, $5 rotisseries, or the McNugget would be a fantasy.

How did America go from thinking of chicken as an “alternative” meat to consuming it more than any other meat?

The story starts with corn.

How American corn fueled a taste for chicken

At the turn of the 20th century, chicken was almost always eaten in the spring. The priority for chicken raisers at the time was egg production, so after the eggs hatched, all the male birds would be fed up and then quickly harvested as “spring chickens” – young, tender birds that were sold whole for roasting or broiling (hence the term “broilers”). Outside the spring rush, you might be buying a bigger, fatter fryer or an old hen for stewing.

“Farmers were sending chickens of all sorts of ages, different feather colours, and tremendous variety to the marketplace in the early 20th century,” says Roger Horowitz, food historian and author of Putting Meat on the American Table. But almost all chickens in the market were simply surplus to egg production, making them relatively uncommon – even rare. Tender spring chickens in particular could fetch a good price. But it is worth noting, Horowitz says, that the higher price wasn’t necessarily coming from pent-up demand.

“It’s not as if consumers were clamoring for broilers,” he says. Though there was some consumer demand for chickens, the relatively high price for broilers likely had more to do with the limited, seasonal supply than a passion for poultry.

During the second world war, however, red meat was rationed, and a national campaign encouraged the consumption of poultry and fish to save “meat” (beef, pork and lamb) for “the army and our allies”. Eating chicken became more common, but the preference for young broilers, and white breast meat, persisted.

As the war drew to a close, feed millers, which buy and grind corn and other grains to feed livestock, saw a big opportunity to spur that demand for meat chickens, which consume large amounts of corn. When traditional banks refused to finance new-fangled “chicken farms”, the feed companies themselves offered farmers loans to buy feed and equipment, putting the pieces of the modern contract poultry system in place.

Consumer acceptance of broilers out of season was not automatic. In the 1930s, the average American ate 10lbs (4.5kg) or less of chicken annually; by 2017 that had risen to 64lbs (29kg), according to the Economic Research Service at the United States Department of Agriculture (USDA). For decades chicken battled to be seen as a “meat”, and did not surpass its most expensive competitor, beef, in terms of overall consumption until 2010. A strong USDA-funded marketing campaign helped out.

“In the 50s and 60s, you see where these agricultural extension operations start pushing out recipes very aggressively about broilers,” Horowitz says, and as feed companies and hatcheries (most of which would eventually become so-called “integrators”, which own several of the businesses involved in chicken production) continued to consolidate the industry, they were able to more carefully calibrate the chicken itself to what would sell most profitably, focusing on lowering costs and raising proportions of the highest-demand cuts, namely breast meat.

Don Tyson, the late president of Tyson Foods, famously said: “If breast meat is worth two dollars a pound and dark meat is worth one dollar, which would I rather have?” But for generations, the idea of buying just the most coveted cuts of chicken was foreign to most consumers. It wasn’t until the 1980s that preferences began to switch to cuts of meat over the whole bird.

These companies owned and understood their chickens from egg to table and were able to exert unprecedented control over the biology of their flocks. Now, not only are they able to fine tune the birds’ characteristics with incredible accuracy, they can also map interactions with feed, environment, and processing to maximise profits.

For integrators and corn farmers alike, the investment paid off. In 2019, 9.2 billion 6lb (2.7kg) broiler chickens were harvested in the US, consuming about 1.8lbs (820g) of grain for every pound of chicken.

But the impact on chickens from the changes in production is troubling.

The modern industrial chicken

Over the past 70 years, the poultry industry has measured its success in terms of how many pounds of meat a chicken can produce for a given amount of feed. Modern chickens are more efficient than ever, with producers able to calculate to the ounce how much “input” of food, water, air and time are required to get a set amount of white and dark meat.

The modern chicken is fully industrialised.

With more than 500 chicken breeds existing on Earth, it might surprise you to learn that every nugget, breast, and cup of chicken noodle soup you’ve ever eaten likely came from one breed, a specialised cross between a Cornish and a white rock.

by Sarah Mock, The Guardian |  Read more:
Image: Glowimages/Getty