Wednesday, August 26, 2020
Suburbia, Reconsidered
It’s a weird time for the American suburbs.
As the Trump administration attempts to secure votes in the lead-up to the 2020 election, the president has leaned into a not-so-subtle tactic: promising to protect suburban America from the supposedly harmful influence of low-income housing by abolishing an Obama-era rule designed to combat racial segregation.
But Trump’s suburban rhetoric — and his apparent conviction that suburbia is the exclusive domain of affluent white housewives enjoying the “Suburban Lifestyle Dream” — no longer holds water. Suburban America is more diverse than ever, and poverty is rising in the suburbs at a faster pace than in urban or rural areas.
“I honestly don’t think that guy has ever been to a real suburb — aside from like, golf courses,” says Jason Diamond, the author of The Sprawl: Reconsidering the Weird American Suburbs, a new examination of the suburbs and their influence on American culture. As he writes in the introduction to his book, out Aug. 25 from Coffee House Press, “we try to pigeonhole suburbia, act like it’s a great big boring monolith of conformity and tract housing, but there’s so much more to it than that, and we need to understand it better.”
As in his previous book, 2016’s Searching for John Hughes, Diamond mined his 1980s childhood in the suburbs of Chicago for material. But he also traveled to suburbs throughout the U.S. to try to understand how they went from being perceived as utopian enclaves to bland wastelands. Along the way, he discovered that hackneyed ideas about the homogeneity of suburbia don’t hold up.
“I started noticing how much some of these places are different from the other ones — like some suburbs are suburbs, but they’re more country,” he says. “I was like, that’s interesting, because we’re taught that suburbs are one thing, and all the houses look alike. That’s not necessarily true.”
The book is also an examination of how the suburbs have influenced popular culture and vice versa, through the work of artists like Steven Spielberg (raised in the Phoenix suburb of Arcadia, Arizona), TV shows like “Twin Peaks” and “Fresh Off the Boat,” and authors like John Cheever, Shirley Jackson and William Gibson.
None of this is to say the suburbs aren’t worthy of critique; as Diamond writes in the introduction, “the suburbs were a smart, practical idea that was put into practice in all the wrong ways.” He finds plenty to scrutinize in the racist policies that established patterns of segregation and inequity that persist to this day, and in the strain of suburban NIMBYism that defends them. In addition, the car-centric geography of many suburbs takes a terrible environmental toll. But, Diamond argues, it’s worth fighting those forces and making suburbia more welcoming for all. “Whether we like it or not, the future is in suburbia,” he writes. “We just need to reclaim it.”
We spoke with Diamond about the cultural power of the American suburb, why stereotypes about it persist, and how life among the cul-de-sacs could change. The following conversation has been condensed and edited. (...)
The concept of place in your book is really interesting — you write about the way the suburbs are designed, and how that can foster creativity. What’s the connection between the suburbs as a place and art?
I am always curious about how people hit a certain point and are still creative and curious about things. I started realizing that it wasn’t so much the specific suburb they were from; it was mostly the suburban way of life that influenced them. I would talk to a lot of people and everyone had the same experience: “Yeah, I was really bored, and would just draw all day.” That is a thing that unites all the people I know from the suburbs; boredom was a great connector.
I didn’t want to write a book about the architecture of the suburbs; that’s not something I know a lot about. From the get-go, the art coming out of the suburbs was going to be the focus. We can pooh-pooh the suburbs, but we’ll call Blue Velvet one of the great cinematic masterpieces of the last 40 years, or “The Simpsons” will get voted the greatest show of all time. There’s a reason. It’s because this stuff connects to us. (...)
You also say that one way to fix the suburbs and make them more livable would be to “decrease the ease” that people who live there have gotten used to. How do you sell that to suburbanites when part of the appeal is the ease of living?
I don’t think you’re going to sell it. As we’ve learned with trying to get people to wear masks, I don’t think we’re going to sell anything. I think you change the culture. You’re going to see people moving from the cities back to the suburbs — which was happening before Covid — and [those] people are like, “I want what I had in the city, I want more of that.” It’s not going to be widespread, but it’s going to impact the culture of certain suburbs. And that’s a good thing.
Spycams Are Becoming Ubiquitous
[ed. See also: Activists find camera inside mysterious box on power pole near union organizer’s home (Fox 13).]
Tuesday, August 25, 2020
Clothing As Platform
Long before the Covid-19 pandemic halted fashion shows and shuttered malls, the harsh realities of the fashion industry’s race-to-the-bottom production practices were becoming all too clear: unsustainable for the environment and lethally dangerous for textile producers and garment workers. Yet despite news stories of dangerous working conditions and tragedies such as the Rana Plaza factory collapse, tons of garments continued to enter the market, resulting in record amounts of textile waste. In fact, according to the EPA, Americans bury 10.5 million tons of clothing in landfills each year. And it’s not just the mass-market H&Ms and Zaras of the retail world that are implicated; the luxury segment of the market is flooded with unwanted products too: Burberry was discovered to have burned £90 million worth of unsold stock over a five-year period rather than see it devalue its brand image in discount stores.
In tandem, responsible retail alternatives have also boomed, and America’s bloated clothes retail sector has led to a burgeoning resale market. Fueled by online opportunities for peer-to-peer commerce, companies such as Thredup (which bills itself as the largest online consignment and thrift store), Depop, and Vestiaire Collective have multiplied — so much so that according to a 2020 report, the resale market is estimated to grow 21 times faster than that of regular apparel, with the secondhand market reaching a projected value of $51 billion within five years. Add to this the luxury-rental options (Rent the Runway) as well as monthly rental subscriptions (such as Le Tote for bags and Armoire for designer fashion) and it becomes clear that clothes have begun to circulate beyond the traditional control of luxury-fashion conglomerates.
Unsurprisingly, traditional fashion brands perceive these new distribution models as a threat, jeopardizing revenue and their well-honed prestige, cachet, and financial value. In an attempt to regain a monopoly over the sale of their goods, some have made efforts to discredit non-affiliated resale. (...)
In addition, some luxury brands have started adding surveillance to their arsenal, turning to blockchains to undermine the emergence of secondary markets in a way that pays lip service to sustainability and labor ethics concerns. In 2019, LVMH launched Aura, a blockchain-enabled platform for authenticating products from the Louis Vuitton, Christian Dior, Marc Jacobs, and Fenty brands, among others. Meanwhile, fashion label Stella McCartney began a transparency and data-monitoring partnership with Google for tracking garment provenance, discouraging fakes and promising to ensure the ethical integrity of supply chains. Elsewhere, a host of fashion blockchain startups, including Loomia, Vechain, and Faizod, have emerged, offering tracking technologies to assuage customer concerns over poor labor conditions and manufacturing-related pollution by providing transparency on precisely where products are made and by which subcontractors.
However, as promising as these technologies may be for holding a mirror to the industry’s production methods, their impact on consumers won’t simply be to reassure them. When it comes to garments, surveillance isn’t simply a matter of placing the supply chain under new scrutiny. Companies such as Arianee, Dentsu and Evrythng also aim to track clothes on consumers’ bodies and in their closets. At the forefront of this trend is Eon, which, with backing from Microsoft and buy-in from mainstream fashion brands such as H&M and Target, has begun rolling out small, unobtrusive RFID tags — currently used for everything from tracking inventory to timing runners on a marathon course — embedded in garments and designed to transmit data without human intervention.
Eon’s primary stated goal sits squarely within the realm of sustainability: It wants to help implement a global digital-identity protocol so the information from everybody who touches or owns the product is uploaded in a standardized way, potentially encouraging better labor practices through transparency and increased rental and resale opportunities. Tracking sensors (along with apps developed to make use of them) could feasibly be used to extend the life of a garment, ensuring its provenance and making it a better long-term investment, encouraging resale, and allowing for proper recycling.
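[ed. To make that "standardized upload" idea concrete, here is a minimal sketch in Python of what a per-garment event log might look like. It is purely illustrative: the field names, the tag-ID format, and the GarmentIdentity structure are assumptions made for the sake of the example, not Eon's actual protocol.]

# Hypothetical sketch of a garment "digital identity": a unique tag ID
# plus an append-only log of standardized provenance events.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class GarmentEvent:
    actor: str       # who touched the item: "factory", "retailer", "owner", "recycler"
    action: str      # e.g. "manufactured", "sold", "resold", "repaired", "recycled"
    timestamp: str   # ISO 8601 timestamp, UTC

@dataclass
class GarmentIdentity:
    tag_id: str      # unique ID read from the embedded RFID tag (format assumed)
    brand: str
    events: list = field(default_factory=list)

    def record(self, actor: str, action: str) -> None:
        """Append a provenance event in a standardized shape."""
        self.events.append(GarmentEvent(
            actor=actor,
            action=action,
            timestamp=datetime.now(timezone.utc).isoformat(),
        ))

# Every party in the chain appends to the same log, which is what would
# make resale provenance and recycling eligibility checkable later on.
jacket = GarmentIdentity(tag_id="E200-3412-0123-4567", brand="ExampleBrand")
jacket.record("factory", "manufactured")
jacket.record("retailer", "sold")
jacket.record("owner", "resold")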
But its technology would also connect products and their wearers to the internet of things. According to the future depicted by Eon and its partners, garments would become datafied brand assets administering access to surveillance-enabled services, benefits, and experiences. The people who put on these clothes would become “users” rather than wearers. In some respects, this would simply extend some of the functionality of niche wearables to garments in general. Think: swimsuits able to detect UV light and prevent overexposure to the sun, yoga pants that prompt the wearer to hold the right pose, socks that monitor for disease risks, and fitness trackers embedded into sports shirts. At the same time, it would extend the symbolic functions of clothing to one’s online networks, offering consumers the potential cultural capital and social currency of having one’s outfit and location broadcast automatically to their social circle and beyond. Digital identity tags would also allow consumers to purchase physical and augmented-reality products simultaneously: e.g., the owner of a pair of Nike Cryptokicks could wear them on the street and as an avatar in a video game.
These benefits, such as they are, pale in comparison to what companies stand to gain from implementing ubiquitous fashion surveillance. As described by consultant Chris Grantham, this “new dynamic channel for marketing … and even new customer acquisition” would afford “seamless and personalized marketing strategies,” “continued conversation with the consumer post-sale,” “new business models such as subscription, rental and second-market offerings,” and even “tailored shopping/outfit planning services effectively incentivizing customers to share their data.” Simply put, clothes would become a digital platform for engaging consumers in branded, monetized experiences and tapping them as recurring revenue streams.
It’s unclear what consumers would get from so much “engagement,” other than a constant seep of ads. According to one potential scenario outlined by Eon partners, a running shoe could send a stream of usage data to the manufacturer so that it could notify the consumer when the shoe “nears the end of its life.” In another, sensors would determine when a garment needs repairing and trigger an online auction among competing menders. Finally, according to another, sensors syncing with smart mirrors would offer style advice and personalized advertising. All these open the door to myriad behavioral nudges, frictionless repeat orders, push notifications, and exhortations to update, repurchase, or repair on the manufacturer’s timetable — like a Check Engine light for a garment.
Given these ambitions, mainstream “smart” fashion (as with most things “smart”) appears as little more than an alibi for collecting personal behavioral data — not to mention a form of greenwashed techno-solutionism that ignores the realities of today’s surveillance economy. After all, sensor-laden garments would become part of the economic system described by Shoshana Zuboff as “surveillance capitalism,” or what digital theorist Mark Andrejevic has called the “digital enclosure,” an entanglement of “free” services from the likes of Facebook and Google and household products with networking capabilities, for which access “requires willing submission to increasingly detailed forms of data collection and online monitoring.”
As Zuboff illustrates, even well-intentioned privacy guidelines and “stylized disclosure agreements” don’t entirely protect users — opaque, exploitative terms of service still allow for data sharing and, for example, the monetization of patients’ private information from mobile health apps. Within this greater picture, the assetization of garments puts fashion brands on the same economic path as big tech, employing a monopolistic business rationale Nick Srnicek calls “platform capitalism,” or “ecosystems of goods and services that close off competitors: apps that only work with Android, services that require Facebook logins.” It would be inescapable unless you make your own clothes or remove embedded tags — potentially at a penalty. Using the economic playbook developed by Google, Facebook, Spotify, and Netflix, fashion brands would be poised to leverage users for financial gain, either selling them as audiences to other brands or collecting subscription revenue from them directly. In either case, a conventional material good (clothing) becomes reimagined as a service for which use is contingent upon regular payment, with either data or cash.
by Rachel Huber, Real Life | Read more:
Image: Farah Al Qasimi
Monday, August 24, 2020
Why Every City Feels the Same Now
Some time ago, I woke up in a hotel room unable to determine where I was in the world. The room was like any other these days, with its neutral bedding, uncomfortable bouclé lounge chair, and wood-veneer accent wall—tasteful, but purgatorial. The eerie uniformity extended well beyond the interior design too: The building itself felt like it could’ve been located in any number of metropolises across the globe. From the window, I saw only the signs of ubiquitous brands, such as Subway, Starbucks, and McDonald’s. I thought about phoning down to reception to get my bearings, but it felt too much like the beginning of an episode of The Twilight Zone. I travel a lot, so it was not the first or the last time that I would wake up in a state of placelessness or the accompanying feeling of déjà vu.
The anthropologist Marc Augé gave the name non-place to the escalating homogeneity of urban spaces. In non-places, history, identity, and human relation are not on offer. Non-places used to be relegated to the fringes of cities in retail parks or airports, or contained inside shopping malls. But they have spread. Everywhere looks like everywhere else and, as a result, anywhere feels like nowhere in particular.
The opposite of placelessness is place, and all that it implies—the resonances of history, folklore, and environment; the qualities that make a location deep, layered, and idiosyncratic. Humans are storytelling creatures. If a place has been inhabited for long enough, the stories will already be present, even if hidden. We need to uncover and resurface them, to excavate the meanings behind street names, to unearth figures lost to obscurity, and to rediscover architecture that has long since vanished. A return to vernacular architecture—the built environment of the people, tailored by and for local culture and conditions—is overdue. It can combat the placelessness that empires and corporations have imposed. (...)
Commercial builders also emulate architecture that conveys a desirable image. At the turn of the 20th century, the administrators and businessmen of Meiji Japan commissioned Western architects to modernize their country, adopting the structures of supposed Western progress. So did the sultan of Zanzibar, whose House of Wonders has European characteristics, along with a front entrance large enough to ride his elephant through.
It was only a matter of time before corporations began to construct their own hegemonic visions of urban life. In 1928, an American town sailed up a tributary of the Amazon. It came in pieces, to be assembled into shingled houses with lawns and picket fences, a Main Street, a dance hall, a cinema, and a golf course. Henry Ford was the visionary behind the development; his aim: to control the rubber industry via exported Americanism. He named it Fordlândia.
The settlement failed dramatically. The jungle was unforgiving, and the settlers were unprepared for malarial fevers and snake attacks. Cement and iron were unsuited to the humidity. Blight spread through the rubber plantation, which had been cultivated too intensively. Ford’s promises of free health care and fair wages were undermined by puritanical surveillance, cruelty, and incompetence. Eventually, the workers rioted. As a utopia, Fordlândia was probably doomed from the start, given its founding in neocolonial arrogance. But despite its failure almost a century ago, Fordlândia successfully predicted the future of cities: utter sameness, exported globally.
In the decades that followed, corporate architecture of the sort outside my hotel room adopted designs that expressed corporate power. It became slick and monolithic. Ruthlessly rational, it exudes aloofness—its denizens exist high above the streets in glass-and-steel boxes that maximize the expensive floor space. The earliest of these structures were inspired by Ludwig Mies van der Rohe’s 1959 Seagram Building, which set the archetype until the 1980s. The New Formalists tried to temper this model with humanizing, historical touches—the tall, pseudo-gothic arches with which Minoru Yamasaki embellished the World Trade Center, for instance—but even then, it often harked back to earlier symbols of dominating power, like Greco-Roman classicism had done.
Eventually, aware of appearing cold and remote, corporate architecture underwent an image change. Its buildings now resemble its brands: cooler, cuter, greener, more knowing and ironic. The doughnut-shaped mothership of Apple Park or the biodome spheres of Amazon’s Seattle campus offer examples.
But these structures might be worse than the indifferent, modernist monoliths they replaced. At least the glass towers made clear that their occupants didn’t care about you, or maybe anyone. Now headquarters buildings express the hypocrisy of corporate gentility. Apple Park, with its circular form and large central garden, telegraphs connection and collaboration. But its real message is power: It is one of the most valuable corporate headquarters in the world, echoing the Pentagon in size and ambition. Shaped like a spaceship, it also suggests to the local community, which grants Apple huge tax breaks, that the company could take off and relocate anywhere in the world, whenever it wants. (...)
Vernacular is an umbrella term architects and planners use to describe local styles. Vernacular architecture arises when the people indigenous to a particular place use materials they find there to build structures that become symptomatic of and attuned to that particular environment. Augé called it “relational, historical and concerned with identity.” It aims for harmonious interaction with the environment, rather than setting itself apart from it. (...)
Creativity often works according to a dialectic process. Frank Lloyd Wright sought to “break the box” of Western architecture by shifting geometries, letting the outside in, and designing architecture within a natural setting, as he did with Fallingwater, one of his most famous designs. Wright was inspired by a love of the Japanese woodblock prints of Hiroshige and Hokusai—an influence he would later repay by training Japanese architects such as Nobuko and Kameki Tsuchiura, who reinterpreted European modernist design in Japan. The goal is not to replace glass skyscrapers with thatch huts, but to see vernacular as the future, like Wright did, rather than abandoning it to the past.
by Darran Anderson, The Atlantic | Read more:
Image: Justin Sullivan / Getty
[ed. I've been thinking about this a lot lately as so many cultural institutions die left and right, victims of pandemic economics.]
Defunding the Police: Seattle's Stumbling Blocks
Just last month, Seattle was on the verge of taking one of the most radical steps toward large-scale police reform of any city in the US.
In the wake of the police killing of George Floyd in Minneapolis in May, and widespread police brutality and anti-racist protests, a veto-proof majority of council members voiced their support for defunding the police, slashing 50% of the department’s budget.
But since then, they’ve faced a series of logistical roadblocks and clashed with other city leaders, and ultimately all but one of them have walked back their statements.
The council instead voted for a much smaller round of cuts, including reducing the salaries of Carmen Best, Seattle’s chief of police, and members of her command staff, as well as trimming about 100 of the department’s 1,400 police officers.
Mere hours after the vote, Best, the first African American leader of the department who has held the position for only two years, announced her retirement.
“The idea of letting, after we worked so incredibly hard to make sure that our department was diverse, that reflects the community that we serve, to just turn that all on a dime and hack it off without having a plan in place to move forward, it’s highly distressful to me,” she said during a news conference last week. “It goes against my principles and my conviction and I really couldn’t do it.” (...)
Stephen Page, associate professor at the University of Washington’s Evans School of Public Policy and Governance, told the Guardian that what appears to be missing in Seattle, Minneapolis and New York is leaders transforming police reform from a rallying cry to a precise plan.
“None of those discussions in any of those cities at this point seem to be taking seriously these questions of what, exactly, are we doing if we’re not funding the police and how are we going to do it,” he said. (...)
In Seattle, one of the key challenges during this process has been collaboration. While city council members have said they’ve tried to work with the police chief and mayor during the defunding process, at last week’s news conference Seattle’s mayor, Jenny Durkan, characterized the last few weeks as an “absolute breakdown of collaboration and civil dialogue”. (...)
The council president M Lorena González, council member Teresa Mosqueda and council member Tammy J Morales said in a statement that they were sorry to see Best go, and again stressed the importance of city leaders working together during the law enforcement reform process. But they also made it clear that this has in no way deterred their efforts.
“The council will remain focused on the need to begin the process of transforming community safety in our city,” the statement said. “This historic opportunity to transition the SPD from reform to transformation will continue.”
Isaac Joy, an organizer with King County Equity Now, one of the coalitions that has pushed to defund the department by 50%, said there is potentially a silver lining to Best’s departure: it presents an opportunity to find someone to lead the department who can be a “thought partner on listening and responding to the community’s demands, to divest from our police force, demilitarize our police force and start reinvesting and making Seattle a city that everyone can thrive in”.
He also stressed that the leader should be Black.
Joy explained that’s because of the “police history, specifically as it relates to the enslaved, the Black population and that being the root of the police force. And so, in order to rectify and address that root, you do need Black leadership, you just, along with Black leadership, you need the support of the department, the support of the mayor, the support of the council.”
Earlier this month, the coalition released a blueprint for cutting the police budget and reinvesting that money into such groups as those developing alternatives to policing and providing housing for people in need.
On Monday, the council unanimously approved a resolution that includes a variety of goals for 2021, including creating a civilian-led department of community safety and violence prevention, and moving the city’s 911 dispatch out of the police department.
by Hallie Golden, The Guardian | Read more:
Image: Karen Ducey/Getty Images
[ed. This once beloved city has gone absolutely nuts. Forcing your first black female police chief into retirement is not, as they say, a good look. Good luck to Carmen Best, she tried her best.]
Sunday, August 23, 2020
Saturday, August 22, 2020
Basketball Was Filmed Before a Live Studio Audience
I knew there were way more important things than basketball, and I was all for canceling everything back in March: in-person school, sports, plays, concerts, conferences, just shut it down. But, in order to hold this line, I had to force myself to stop thinking all the damn time about the interruption of the Milwaukee Bucks’ magical season, the second consecutive MVP season of Giannis Antetokounmpo, their certain progress toward their first NBA Finals in decades. This was supposed to be Milwaukee’s summer, with a long playoff run for the basketball team followed by the Democratic National Convention in the same new arena built for just such moments. Months later, when it was official that the NBA season would resume at Disney World, encased in a quarantined bubble, tears formed in my eyes. From mid-March until the beginning of summer, I watched no live TV. The news was too awful, and sports were all reruns. Since late July, I’ve been watching the Bucks again, and like everything else in America, it’s been strange.
As sports, the competitions from the NBA bubble, like the football (soccer), baseball, and ice hockey games I’ve watched, are more or less the same. But as television shows, as a variety of broadcast media, and as an aesthetic experience made up of images and sounds, the NBA games so far have been a departure from the usual, and nothing feels right. It’s been a bit like another newly familiar experience: getting takeout from a restaurant where you previously would dine in. The food might taste like you remember it, but the sensory and social environment of the meal makes you realize how much context matters. (...)
The NBA bubble games have had a particularly sitcommy feel. The courts at Disney’s Wide World of Sports are walled in on three sides by tall video displays, obscuring whatever seats or walls are beyond the court except for rare glimpses when the director cuts to a camera behind the scorer’s table for a referee’s call. The images almost always stay on one side of the action facing these displays, and unlike the usual games from the before times, there are no camera operators on the court itself under the basket. The visual array is reminiscent of the kind of three-wall sets that television comedies adopted from the stage, with their proscenium effect of positioning the viewer across an invisible fourth wall. In a typical American sitcom, you hear but do not see an audience. Many are recorded with a live audience in the studio, and sometimes begin with a voice-over telling you as much (“Cheers was filmed before a live studio audience”). The combination of the three-wall set and audience audio makes the television comedy much more like theater than many kinds of television (the difference between this aesthetic and the “single-camera” comedy style of shows like The Office often prompts comparisons of the latter to cinema).
The sitcom “laugh track” is an old convention. It has sometimes been held up as the epitome of commercial television’s basically fraudulent nature. In the absence of a live audience, or when the audience isn’t demonstrative in the way the producers would like, the sound track of a comedy can be massaged by sweetening the recording or adding canned laughter. This isn’t that different from an older tradition in live performance of the claque, the audience members hired to applaud. But in any event, the sounds of the audience recreate for the viewer at home a sense of participation in a live event among members of a community who experience the show together. This is true for sports just as much as it is for scripted comedy or late-night variety shows. The audible audience for televised sports is always manipulated to be an accompaniment that suggests the space of a live event. A sports stadium or arena is a big television studio in the first place, a stage for the cameras with a raucous in-person audience. Your ticket gets you into the show as an extra. The sensory pandemonium of the live event is never really captured on TV, the blaring music and sound effects are kept low in the mix to keep the booth broadcasters’ voices loud and centered, and no one shoots a T-shirt cannon in your direction when you’re watching at home. But the crowd is essential to the visual and auditory qualities of sports, and the missing elements in these games from Florida have been a present absence. (...)
The video displays are part of what makes each game have a “home team,” as the imagery conveys the identity of one of the two competitors with the text, colors, and advertisements you would find in their home arena. The displays, expansive like digital billboards, also show images of the home team’s fans, which is a nice touch in theory. But the way this works in practice is bizarre. The low-res webcam images of the individual faces are abstracted against backgrounds that look like arena seats, and these are arrayed in a grid to create a large rectangle of spectators. The images are presumably live, but they could be out of sync for all we know as the fans seldom react to anything in the moment, have no way of feeding off one another, and are not audible. The arena has set up a grade of rows that recede away from the court, and some fans are more visible than others as they are courtside or behind the bench or scorer’s table. The close proximity of fans, separated by no barrier from the stars, is one of the thrills of watching live basketball. These virtual fans are by contrast one big upright surface of blurry, laggy heads, and they are reminiscent of the Hollywood Squares of meeting attendees now all too familiar from Zoom’s gallery view. Like many elements of live television of the past few months, these visuals of the NBA’s bubble games are the optics of a pandemic that has turned our lives inside out. (...)
These bubble games remind us, minute by minute, what life is like now. They afford us the dreamworld of a space where you can safely breathe heavily, unmasked, indoors with nine other players and three refs on the same basketball court. But they also televise this newly risky world of facemasks and six feet, of conversations mediated by plexiglass and video screens. I have felt for the NBA players whose season was abruptly arrested as it was getting good, but now I also envy the careful setup that their filthy rich sports league can afford, while my cash-strapped public university takes its chances and opens its dorms and classrooms without such a luxury of frequent testing and exceptional security.
by Michael Z. Newman, LARB | Read more:
Image: CNN
Friday, August 21, 2020
Jerry Falwell Jr. and the Evangelical Redemption Story
Two weeks ago, Jerry Falwell Jr., the president of Liberty University, the largest evangelical college in America, posted an Instagram photo of himself on a yacht with his arm around a young woman whose midriff was bare and whose pants were unzipped. This would have been remarkable by itself, but it was all the more so because Falwell’s midriff was also bare and his pants also unzipped. In his hand, Falwell held a plastic cup of what he described winkingly in his caption as “black water.”
The aesthetics of the photo would be familiar to anyone who’s ever been to a frat party, but they were jarringly out of place for the son of Moral Majority cofounder Jerry Falwell Sr. and a professional evangelical Christian whose public rhetoric is built on a scaffolding of sexual conservatism and an antagonism to physical pleasure more generally.
The backdrop of a yacht represents an entirely different hypocrisy, arguably a more egregious one: the embrace of materialism and the open accumulation of enormous wealth. Falwell, who has a net worth estimated to be more than $100 million, is not formally a “prosperity gospel” adherent, but he has nonetheless jettisoned those inconvenient parts of Christian theology that preach the virtues of living modestly and using wealth to help the less fortunate.
But for his public, the problem with the photo was the optics of carnal sin—the attractive young woman who was not his wife, the recreational drinking, the unzipped pants—none of which would be acceptable at Liberty University, where coed dancing is penalized with a demerit. In the moral hierarchy of white evangelical Christianity, carnal sin is the worst, and this thinking drives the social conservatism that allows evangelicals to justify persecuting LGBTQ people, opposing sexual education in schools, distorting the very real problem of sex trafficking to punish sex workers, restricting access to abortion, eliminating contraception from employer-provided healthcare, and prosecuting culture wars against everything from medical marijuana to pop music. Evangelicalism’s official morality treats all pleasure as inherently suspect, the more so when those pleasures might belong to women or people of color.
Fortunately for Falwell, evangelicalism has built-in insurance for reputational damage, should a wealthy white man make the mistake of public licentiousness widely shared on the Web: the worst sins make for the best redemption stories. Even better, a fall from grace followed by a period of regret and repentance can be turned into a highly remunerative rehabilitation. That, in fact, has been many a traveling preacher’s grift from time immemorial.
I grew up hearing such “testimonies,” personal stories that articulate a life in sin and a coming to Jesus, firsthand. I was raised in the 1980s and 1990s in a family of Southern Baptists who viewed Episcopalians as raging liberals and Catholics, of whom we knew precisely two, as an alien species. These were perfectly ordinary sentiments in the rural Alabama town we lived in. My dad was a local lineman for Alabama Power, and my mom worked at my school, first as a janitor and, later, as a lunch lady. Nobody in my family had gone to college.
Besides school and Little League, church was the primary basis of our social existence. As a child and into my early teens, my own religiosity was maybe a tick above average for our community. I went on mission trips to parts of the US that were more economically distressed than my hometown, handed out Chick tracts (named for the publisher and cartoonist Jack Chick) with as much zeal and sincerity as a twelve-year-old could muster, and on one occasion destroyed cassette tapes of my favorite bands (Nirvana, the Dead Kennedys, the Beastie Boys) in a fit of self-righteousness, only to re-buy them weeks later because, well, my faith had its limits.
All the while, I was—to use a word evangelicals like to misapply to any sort of secular education—“indoctrinated” by teachers, family, church staff, ministry organizations, and other members of the community to view everything I encountered in the world through an evangelical lens. If I went to the mall and lost my friends for a few minutes, I briefly suspected everyone had been raptured away except me, a particular brand of eschatological fantasy that we were taught was perpetually in danger of happening. Even my scandalous moments, which, do-goody overachiever that I was, were few and far between, were colored by the church. My first real kiss, at fourteen, was an epic make-out session on a sidewalk during a mission trip to a suburb of Orlando, with an eighteen-year-old assistant youth pastor named Matt.
I was ten or eleven when I was baptized—or in Southern Baptist parlance, “born again”—and part of this process involved constructing my own redemption narrative: I lived in sin and would be saved by Christ. I recently rediscovered my own handwritten testimony on a visit to my mom’s house. In a child’s rounded, looping handwriting, I had confessed that I used to “cheat at games,” something I don’t remember doing at all. The likely explanation for this is that because sin is such an important prerequisite for redemption, my ten-year-old self had to fabricate one to conform to the required convention (never mind that such a falsification would be sinful itself).
by Elizabeth Spiers, NY Review | Read more:
Image: Instagram
Thursday, August 20, 2020
Chart House
An iconic restaurant in Waikiki has closed its doors for good.
Management of Chart House Waikiki said they decided to stop operations, citing coronavirus hardships. It’s unlikely the restaurant will reopen, as many businesses, especially in Waikiki, continue to struggle.
The eatery served customers for 52 years, offering beautiful views of the small boat harbor and stunning south shore sunsets.
In a simple statement on their website, Joey Cabell and Scott Okamoto said, “At this time we would like to say Mahalo to everyone who has supported us over the past 52 years.”
by HNN Staff, Hawaii News Now | Read more:
Image: Charthouse
[ed. Oh no. I'm grief-stricken. My all-time favorite bar, overlooking the Ala Wai Boat Harbor in Waikiki. So many great memories. It's the only place I make a special point of visiting every time I go back.]
Plastilina Mosh, El Guincho, Odisea
Repost
The American Nursing Home Is a Design Failure
With luck, either you will grow old or you already have. That is my ambition and probably yours, and yet with each year we succeed in surviving, we all face a crescendo of mockery, disdain, and neglect. Ageism is the most paradoxical form of bigotry. Rather than expressing contempt for others, it lashes out at our own futures. It expresses itself in innumerable ways — in the eagerness to sacrifice the elderly on the altar of the economy, in the willingness to keep them confined while everyone else emerges from their shells, and in a popular culture that sees old age (when it sees it at all) as a purgatory of bingo nights. Stephen Colbert turned the notion of a 75-year-old antifa into a comic riff on geriatric terrorists, replete with images of octogenarians innocently locomoting with walkers, stair lifts, and golf carts.
In Sweden, elderly COVID patients were denied hospitalization, and in some cases palliative care edged over into “active euthanasia,” which seems barely distinguishable from execution. The Wall Street Journal quotes a nurse, Latifa Löfvenberg: “People suffocated, it was horrible to watch. One patient asked me what I was giving him when I gave him the morphine injection, and I lied to him. Many died before their time. It was very, very difficult.”
In this country, we have erected a vast apparatus of last-stop living arrangements that, during the pandemic, have proven remarkably successful at killing the very people they were supposed to care for. The disease that has roared through nursing homes is forcing us to look hard at a system we use to store large populations and recognize that, like prisons and segregated schools, it brings us shame.
The job of housing the old sits at the juncture of social services, the medical establishment, the welfare system, and the real-estate business. Those industries have come together to spawn another, geared mostly to affluent planners-ahead. With enough money and foresight, you can outfit your home for your changing needs, hire staff, or perhaps sell some property to pay for a move into a deluxe assisted-living facility, a cross between a condo and a hotel with room-service doctors. “I don’t think the industry has pushed itself to advocate for the highly frail or the people needing higher levels of care and support,” USC architecture professor Victor Regnier told an interviewer in 2018. “Many providers are happy to settle for mildly impaired individuals that can afford their services.” In other words, if you’re a sick, old person who’s not too old, not too sick, and not too poor, you’re golden. For everyone else, there are nursing homes.
The nursing-home system is an obsolete mess that emerged out of a bureaucratic misconception. In 1946, Congress passed the Hill-Burton Act, which paid to modernize hospitals that agreed to provide free or low-cost care. In 1954, the law was expanded to cover nursing homes, which consolidated the medicalization of senior care. Federal money summoned a wave of new nursing homes, which were built like hospitals, regulated by public-health authorities, and designed to deliver medical care with maximal efficiency and minimal cost. They reflect, reinforce, and perhaps helped create a society that pathologizes old age.
The government sees its mission as preventing the worst outcomes: controlling waste, preventing elder abuse, and minimizing unnecessary death. Traditional nursing homes, with their medical stations and long corridors, are designed for a constantly changing staff to circulate among residents who, ideally, remain inert, confined to beds that take up most of their assigned square footage. As in hospitals, two people share a room and a mini-bathroom with a toilet and a sink. Social life, dining, activities, and exercise are mostly regimented and take place in common areas, where dozens, even hundreds, of residents can get together and swap deadly germs. The whole apparatus is ideally suited to propagating infectious disease. David Grabowski, a professor of health-care policy at Harvard Medical School, and a team of researchers analyzed the spread of COVID-19 in nursing homes, and concluded that it didn’t matter whether they were well or shoddily managed, or if the population was rich or poor; if the virus was circulating outside the doors, staff almost invariably brought it inside. This wasn’t a bad-apples problem; it was systemic dysfunction.
Even when there is no pandemic to worry about, most of these places have pared existence for the long-lived back to its grim essentials. These are places nobody would choose to die. More important, they are places nobody would choose to live. “People ask me, ‘After COVID, is anyone going to want to go into a nursing home ever again?’ The answer is: Nobody ever wanted to go to one,” Grabowski says. And yet 1.5 million people do, mostly because they have no other choice. “If we’d seen a different way, maybe we’d have a different attitude about them,” Grabowski adds.
The fact that we haven’t represents a colossal failure of imagination — worse, it’s the triumph of indifference. “We baby boomers thought we would die without ever getting old,” says Dan Reingold, the CEO of RiverSpring Health, which runs the Hebrew Home in Riverdale. “We upended every other system — suburbia, education, child-rearing, college campuses — but not long-term care. Now the pandemic is forcing us to take care of the design and delivery of long-term care just as the baby boomers are about to overwhelm the system.”
Most of us fantasize about aging in place: dying in the homes we have lived in for decades, with the occasional assist from friends, family, and good-hearted neighbors. The problem is not just that home care can be viciously expensive, or that stairs, bathtubs, and stoves pose new dangers as their owners age. It’s also that, in most places, living alone is deadly. When a longtime suburbanite loses the ability to drive, a car-dependent neighborhood can turn into a verdant prison, stranding the elderly indoors without access to public transit, shops, or even sidewalks. “Social isolation kills people,” Reingold says. “It’s the equivalent of smoking two packs a day. A colleague said something profound: ‘A lot of people are going to die of COVID who never got the coronavirus.’ ”
It’s not as if the only alternative to staying at home is a soul-sapping institution. Back when people traveled for pleasure, tourists regularly visited the Royal Hospital Chelsea in London, where, since the end of the 17th century, veterans have been able to trade in a military pension for a lifelong berth in a soldiers’ collective on an architecturally exquisite campus, located amid some of the city’s most expensive real estate. Those who can work tend the grounds, staff the small museum, and lead tours. When health crises hit, they can move into the care home, which is on the grounds, overlooking immaculate gardens.
The example of an institution so humane that it seems almost wastefully archaic suggests that we don’t need to reinvent the nursing home, only build on humane principles that already succeed.
by Justin Davidson, NY Mag/Intelligencer | Read more:
Image: C.F. Møller
[ed. Personally, I'd prefer an endless supply of good drugs, or something like the euthanasia scene in Soylent Green - Death of Sol (not available on YouTube for some reason).]
Wednesday, August 19, 2020
'One-Shot' Radiotherapy As Good For Breast Cancer As Longer Course
Women with breast cancer who receive one shot of radiotherapy immediately after surgery experience the same benefits as those who have up to 30 doses over three to six weeks, an international medical study has found.
The technique, known as targeted intraoperative radiotherapy, is increasingly being used around the world instead of women having to undergo weeks of painful and debilitating treatment.
Eight out of 10 of the 2,298 participants in the study, women over 45 with early-stage breast cancer who had had surgery to remove a lump of up to 3.5cm, needed no further radiotherapy after having the single dose, researchers on the British-led study found.
The findings are based on results from 32 hospitals in 10 countries including the UK. During the treatment, carried out immediately after a lumpectomy, a ball-shaped device measuring a few centimetres is placed into the area of the breast where the cancer had been and a single dose of radiotherapy is administered. The procedure takes 20 to 30 minutes.
The 80% of patients for whom it works thus avoid going back to hospital between 15 and 30 times over the following weeks to have further sessions of radiotherapy.
by Denis Campbell, The Guardian | Read more:
Image: Rui Vieira/PA
Obama and the Beach House Loopholes
[ed. Magnum P.I.'s old property. Obama P.I.? Just doesn't have the same ring to it.]
A home in the nearby neighborhood of Kailua had served as the winter White House for the Obama family every Christmas, and photographers often captured shots of Obama and Nesbitt strolling on the beach or golfing over the holidays.
The prospective property was located just down the shore in the Native Hawaiian community of Waimanalo. Wedged between the Koʻolau mountains that jut 1,300 feet into the sky and a stunning turquoise ocean, the beachfront estate sprawled across 3 acres, featuring a five-bedroom manse, gatehouse, boat house and tennis courts. Fronting the property was a historic turtle pond that used to feed Hawaiian chiefs. Local families took their children to splash and swim in its calm waters.
The property had one major problem though: a century-old seawall. While the concrete structure had long protected the estate from the sea, it now stood at odds with modern laws designed to preserve Hawaii’s natural coastlines. Scientists and environmental experts say seawalls are the primary cause of beach loss throughout the state. Such structures interrupt the natural flow of the ocean, preventing beaches from migrating inland.
But the sellers of the Waimanalo property found a way to ensure the seawall remained in place for another generation. They asked state officials for something called an easement, a real estate tool that allows private property owners to essentially lease the public land that sits under the seawall. The cost: a one-time payment of $61,400. Officials with the state Department of Land and Natural Resources approved the permit, which authorized the wall for another 55 years, and Nesbitt purchased the property.
State officials and community members say the Obamas will be among the future occupants.
The easement paved the way for building permits and allowed developers to exploit other loopholes built into Hawaii’s coastal planning system. Nesbitt went on to win another environmental exemption from local officials and is currently pursuing a third — to expand the seawall. According to building permits, the Obamas’ so-called First Friend is redeveloping the land into a sprawling estate that will include three new single-family homes, two pools and a guard post. The beach fronting the seawall is nearly gone, erased completely at high tide.
Community members are now rallying against the proposed seawall expansion. Some are directing their criticism at Obama, who staked his legacy, in part, on fighting climate change and promoting environmental sustainability.
Obama’s personal office declined to comment, referring inquiries to Nesbitt. And Nesbitt, who declined to be interviewed, would not directly address questions about ownership, only saying that he and his wife bought the land and were “the developers” of the estate.
In written responses to questions, Nesbitt, now chair of the Obama Foundation board and co-CEO of a Chicago-based private-equity firm, said the steps he’s taken to redevelop the property and expand the seawall are “consistent with and informed by the analysis of our consultants, and the laws, regulations and perspectives of the State of Hawaii.” Any damage the structure caused to the Waimanalo beach, he added, occurred decades ago “and is no longer relevant.”
In Hawaii, beaches are a public trust, and the state is constitutionally obligated to preserve and protect them. But across the islands, officials have routinely favored landowners over shorelines, granting exemptions from environmental laws as the state loses its beaches. (...)
Intended to protect homeowners’ existing properties, easements have also helped fuel building along portions of Hawaii’s most treasured coastlines, such as Lanikai on Oahu and west side beaches on Maui. Scores of property owners have renovated homes and condos on the coast while investors have redeveloped waterfront lots into luxury estates. Meanwhile, the seawalls protecting these properties have diminished the shorelines. With nowhere to go, beaches effectively drown as sea levels rise against the walls and waves claw away the sand fronting them, moving it out to sea.
Researchers estimate that roughly a quarter of the beaches on Oahu, Maui and Kauai have already been lost or substantially narrowed because of seawalls over the past century. That has left less coastal habitat for endangered monk seals to haul out and rest and sea turtles to lay eggs. By midcentury, experts predict, the state will be down to just a handful of healthy beaches as climate change causes sea levels to rise at unprecedented rates. (...)
Beaches and open coastlines have always been central to Hawaii’s way of life. For centuries, Native Hawaiians enjoyed access to the ocean’s life-sustaining resources. Natural sand dunes provided protection against strong storms and served as a place for Native Hawaiians to bury their loved ones.
After Hawaii became a state in 1959, development of homes and hotels along the coastlines exploded as investors sought to capitalize on what was becoming some of the most valuable real estate in the country. An environmental review commissioned by the state in the 1970s found that three-quarters of the state’s sandy coastlines were now hugged by private property, curtailing public access to shorelines. Many property owners erected seawalls to try to hold back the ocean.
By the 1990s, scientists were warning that those seawalls were causing significant beach loss on all the Hawaiian islands.
Alarmed by these losses, state officials in 1997 released a roadmap for protecting the state’s beaches. The report emphasized that the seawalls were destroying coastal ecosystems, threatening the state’s tourist-driven economy and limiting the public’s access to beaches and the ocean, a right enshrined in the Hawaii Constitution.
If beaches continue to disappear throughout the state, the report warned, “the fabric of life in Hawaii will change and the daily miracle of living among these islands will lose its luster.”
by Sophie Cocke, ProPublica/Honolulu Star Advertiser | Read more:
Image: Darryl Oumi, special to Honolulu Star-Advertiser
[ed. How many houses do the Obamas own? Let's see, there's that one in Washington D.C., the recent one in Martha's Vineyard, and wasn't there one in Chicago? I can't keep track. Being ex-president can be a pretty lucrative gig if you protect the status quo.]
Get Ready for a Teacher Shortage Like We’ve Never Seen Before
Usually on the first day back to work after summer break, there’s this buzzing, buoyant energy in the air. My school is a small school-within-a-school designated to serve gifted children, so there are only 16 teachers and staff members. We typically meet in a colleague’s tidy classroom, filled with natural light and the earthy smell of coffee.
We hug, remark on one another’s new haircuts. Sure, there’s an element of sadness about not being able to sleep in or pee on our own schedules anymore, but for the most part, we’re eager to get back to doing work that we believe is the most important work in the world.
Coming back this year was different.
It was Thursday, Aug. 6, the same day that the Houston area reported its new single-day high for deaths from Covid-19. Instead of gathering, we all tuned in to a Zoom meeting from our separate classrooms.
There was no buzz in the air, and we weren’t hugging and chatting. We were talking about how long we had: a few weeks of virtual teaching before students returned to our classrooms on Sept. 8. Or maybe sooner. We’ve been told our start date is subject to change at any time.
We asked about short- vs. long-term disability plans on our insurance. We silently worried about a colleague who has an autoimmune disease. We listened as our counselor, who, along with her daughters, tested positive for the coronavirus the week before, shared how they were doing. We tried not to react from inside each of our little Zoom squares as we began to realize there was no way of maintaining true social distancing when school reopened.
“We’re a family,” one of our administrators kept saying while talking about the measures we would need to take to reduce our and our students’ exposure. “We’re a family.”
I know what he meant — that our tight-knit community would get through this year together — but I kept wondering, “Wouldn’t it be safer for our family to stay home?”
I invite you to recall your worst teacher. Mine was my seventh-grade science teacher, whose pedagogical approach consisted of our reading silently from our textbooks. Once, when I asked if I could do a project on Pompeii, she frowned and said: “This is science class. Your project has to be on a real thing.”
She sent a message loud and clear: “I really, really don’t want to be here.”
We are about to see schools in America filled with these kinds of teachers.
Even before Covid-19, teachers were leaving the profession in droves. According to a report by the Economic Policy Institute, the national teacher shortage is looking dire. Every year, fewer and fewer people want to become teachers.
You would think states would panic upon hearing this. You would think they’d take steps to retain quality teachers and create a competitive system that attracts the best, brightest and most passionate to the profession.
That’s not what they do.
They slash the education budget, which forces districts to cut jobs (increasing class size), put off teacher raises and roll back the quality of teachers’ health care. They ignore teachers’ pleas for buildings without black mold creeping out of ceiling tiles, for sensible gun legislation, and for salaries we can live on without having to pick up two to three additional part-time jobs.
So, a lot of good and talented teachers leave. When state leaders realized they couldn’t actually replace these teachers, they started passing legislation lowering the qualifications, ushering underqualified people into classrooms.
This has been happening for years. We’re about to see it get a lot worse.
by Kelly Treleaven, NY Times | Read more:
Image: Olivia Fields
Tuesday, August 18, 2020
Jack Kirby. From a Golden Age story reprinted in an early ‘70s “Marvel Premiere” comic
[ed. Living in the bubble]