Sunday, October 11, 2020
Political Economy After Neoliberalism
If anything could have dislodged the neoliberal doctrine of freeing the market from the government, you might have expected the coronavirus pandemic to do the trick. Of course, the same was said about the global financial crisis, which was supposed to transform everything from macroeconomic policy to financial regulation and the social safety net.
Now we are facing a particularly horrifying moment, defined by the triple shock of the Trump presidency, the pandemic, and the economic disasters that followed from it. Perhaps these—if combined with a change in power in the upcoming election—could offer a historic window of opportunity. Perhaps. But seizing the opportunity will require a new kind of political-economic thinking. Instead of starting from a stylized view of how the world ought to work, we should consider what policies have proved effective in different societies experiencing similar challenges. This comparative way of thinking increases the menu of options and may suggest novel solutions to our problems that lie outside the narrow theoretical assumptions of market-fundamentalist neoliberalism.
Neoliberalism implies a one-size-fits-all set of policy solutions: less government and more market, as if the “free market” were a single equilibrium. To the contrary, we know that there have been multiple paths to economic growth and multiple solutions to economic crises in different societies. By recognizing that there is not one single path to good outcomes, that real-world markets are complex human constructions—governed in different places by different laws, practices, and norms—we open up the possibility that policies that seem objectionable in light of neoliberal abstractions may deliver high performance along both social and economic dimensions.
We know about these possibilities from the work of economic sociologists, who stress the political, cultural, and social embedding of real-world markets. From work in comparative political economy, demonstrating how the relationships between government and industry and among firms, banks, and unions vary from one country to another. From political and economic geographers, who place regional economies in their spatial contexts and natural environments. From economic historians, who explore the transformation of the institutions of capitalism over time. From an emergent Law and Political Economy (LPE) movement that aspires to shift priorities from efficiency to power, from neutrality to equality, and from apolitical governance to democracy. And from economists—often villainized as the agents of neoliberalism—who are exploring novel approaches to the problem of inequality and the slowdown in productivity, and show renewed concern with the economic dominance of a few large firms.
The challenge is to bring these insights together.
As a step in this direction, we propose three core principles of an alternative political economy. We then illustrate these principles by discussing the dynamics of the American political economy, focusing particularly on the rise of “shareholder capitalism” in the 1980s. Finally, we apply the principles to the ongoing national policy responses to the COVID-19 pandemic, comparing the United States to Germany.
We recognize that these principles do not resolve the very real problem of the dominance of business in U.S. politics and the political gridlock produced by this configuration of power. Still, they point in new and urgent directions.
***
First, then, governments and markets are co-constituted. Government regulation is not an intrusion into the market but rather a prerequisite for a functioning market economy. Critics of neoliberalism often make the case for government “intervention” in the market. But why refer to government action as intervention? The language of intervention implies that government action contaminates a market otherwise free of public action. To the contrary, the alternative to government action is not a perfect market, but rather real-world markets thoroughly sullied by collusion, fraud, imbalances of power, and the production of substandard or dangerous products, and prone to crises driven by excessive risk-taking.

Likewise, critics of neoliberalism often adopt the fictional “free market” as a reference point even as they make the case for deviation from it. For example, they follow the standard practice of economists by identifying market failures and proposing solutions to those failures. To be fair, this can be a useful way to see how government action can remedy specific problems, and to assess when action may be helpful or not. But this approach also risks obscuring the fact that market failure is the rule and not the exception. More fundamentally, the government is not a repair technician for a market economy that functions reasonably well, but rather the master craftsperson of market infrastructure.
Thus, governments pacify a territory and centralize the means of violence, making investment safer and trade less precarious. They create ways to write and enforce contracts via the rule of law. They provide public goods like education and transport infrastructure. No neoliberal denies the value of these things.
Beyond these basic functions, governments establish the conditions for the emergence of new markets, provide the architecture to stabilize existing ones, and manage crises to limit damage and facilitate recovery. Historically, governments fostered many of the largest markets, such as housing and banking, by designing new market structures that enabled the mass expansion of goods and services. In the case of the housing market, the U.S. federal government created the 30-year fixed-rate mortgage as the standard mortgage product. It also stabilized the savings and loan industry by setting rules on the interest paid on bank accounts and by instituting deposit insurance.
In the postwar era, this system helped propel home ownership from around 40 percent to 64 percent. More recently, many policy failures, such as the financial crisis of 2007–2009, occurred because governments shirked their role of making markets work through “deregulation.” Essentially, the U.S. government allowed financial institutions to enter whichever businesses they liked, with little oversight. In the wake of the Great Recession, predictably, the government re-established control and oversight over the banking sector with the Dodd-Frank Act. Among its provisions, the act gave the Federal Reserve the authority to require the largest banks to undergo annual stress tests to determine whether they could withstand a serious downturn.
Governments also support knowledge creation and dissemination and underwrite the cost of innovation in the private sector. They facilitate the organization of market activity by establishing the legal basis for corporations and by setting the rules for fair and efficient trading practices on stock exchanges. A political economy that does not value the role of government along these different dimensions distorts our understanding of how markets contribute to society.
by Neil Fligstein and Steven Vogel, Boston Review | Read more:
Image: Timothy A. Clary/AFP via Getty Images
[ed. See also: Trump’s America Remains Stuck in the Shadow of Reagan (Boston Review):
But in the end, Trump’s most enduring deformation of U.S. political life may derive from his slavish devotion to unchecked corporate power and his work in further consolidating power in the hands of a few billionaires. As Christian Lorentzen recently wrote in Bookforum, the Republican Party under Trump should primarily be understood as “an electoral entity that reliably obtains tax cuts for the wealthy, deregulation for big business, increased budgets for the military, and little of anything else for anyone else.”]
Being Eaten
As high tide inundates the muddy shallows of the Fraser river delta in British Columbia, what looks like a swarm of mosquitoes quivers in the air above. Upon closer inspection, the flitting mass turns out to be a flock of small shorebirds. The grey-brown wings and white chests of several thousand Pacific dunlins move in synchrony, undulating low over the water, then rising up like a rippling wave, sometimes for hours on end. Staying aloft like this is exhausting, especially in midwinter when the internal furnaces of these small birds, weighing less than a tennis ball, must be refuelled continuously. But setting down to rest and digest their mud-dug meals in the adjacent coastal marshes comes at a cost: an obscured view of fearsome lurking predators like the skydiving peregrine falcon. The dunlins won’t alight until the ebbing tide buys them back their safer, open vistas.
The evidence that fear motivates dunlin flocking is circumstantial, but compelling. In the 1970s, when peregrine falcon populations were depressed by pesticides, dunlins spent less time flying and more time roosting. But as pesticides such as DDT have waned under regulation, the peregrines have returned.
Fear is a powerful force not just for wintering dunlins, but across the natural world. Ecologists have long known that predators play a key role in ecosystems, shaping whole communities with the knock-on effects of who eats whom. But a new approach is revealing that it’s not just getting eaten, but also the fear of getting eaten, that shapes everything from individual brains and behaviour to whole ecosystems. This new field, exploring the non-consumptive effects of predators, is known as fear ecology.
by Lesley Evans Ogden, Aeon | Read more:
Image: Robbie George/The National Geographic Image Collection
Saturday, October 10, 2020
Nirvana
Load up on guns, bring your friends
It's fun to lose and to pretend
She's over-bored and self-assured
Oh no, I know a dirty word

Hello, hello, hello, how low
Hello, hello, hello, how low
Hello, hello, hello, how low
Hello, hello, hello

With the lights out, it's less dangerous
Here we are now, entertain us
I feel stupid and contagious
Here we are now, entertain us
A mulatto, an albino, a mosquito, my libido
Yeah

I'm worse at what I do best
And for this gift I feel blessed
Our little group has always been
And always will until the end

Hello, hello, hello, how low
Hello, hello, hello, how low
Hello, hello, hello, how low
Hello, hello, hello

With the lights out, it's less dangerous
Here we are now, entertain us
I feel stupid and contagious
Here we are now, entertain us
A mulatto, an albino, a mosquito, my libido
Yeah, hey

And I forget just why I taste
Oh yeah, I guess it makes me smile
I found it hard, it's hard to find
Oh well, whatever, never mind

Hello, hello, hello, how low
Hello, hello, hello, how low
Hello, hello, hello, how low
Hello, hello, hello

With the lights out, it's less dangerous
Here we are now, entertain us
I feel stupid and contagious
Here we are now, entertain us
A mulatto, an albino, a mosquito, my libido

A denial, a denial, a denial, a denial, a denial
A denial, a denial, a denial, a denial
[ed. I was practicing the other day and ran through this old song (which, along with the video, pretty much killed the 80s). The comments section is still going strong, 11 years later.]
The Wall Between What’s Private and What’s Not is Dissolving
A celebrity story broke last week that gave me, as my fellow young people would say, all the feels. But they were not good feels. In fact, they were pretty much every feel except the good kind: sad for the celebrity, bad about myself, uncertain about the world today.
This story was about Chrissy Teigen, a model and the wife of the singer John Legend, although neither of those descriptors really explains her popularity. Rather, that is down to what is frequently described as her “relatability”, or her willingness to share her personal life with the world. This, according to current thinking, makes this extremely beautiful and wealthy woman more real to the public. Over several days, she posted videos of herself on Twitter and Instagram, talking about how she’d been having heavy bleeding while pregnant. “Chrissy Teigen shares updates from hospital bed as she prepares for second blood transfusion” and “Pregnant Chrissy Teigen’s horror scare as she scrambled to hear baby’s heartbeat” were just two of the newspaper headlines, as if it were totally normal that a woman’s intimate pregnancy issues should be international news.
Normally, I ignore news stories ripped from a celebrity’s social media feed, as they are little more than press releases, given that they were written by the celebrity. But it turns out Teigen is more relatable than I thought. Last year, I also had some bleeding while pregnant, and went to the same hospital as Teigen. As I waited for the scan, I cried and blamed myself: for travelling to Los Angeles while pregnant, for being 41, for maybe losing yet another baby. I also thought of Ariel Levy’s 2013 article about her stillbirth: “I knew that this change in fortune was my fault. I had boarded a plane out of vanity and selfishness, and the dark Mongolian sky had punished me.”
In the end my baby was fine, but seeing Teigen’s daily posts was like watching the past unspool and knowing that the future is not as guaranteed as Teigen’s followers seemed to think (“You got this, girl!” “Stay strong! You’re amazing!”). Then, last Thursday morning, Teigen posted more photos from hospital: she had lost her baby.
How should we talk about this kind of loss? I admit, my first thought on seeing the photos on every news website of Teigen bent forward and weeping – photos taken from her social media – was “Maybe not like this?” It reminded me of a time last year when Alec Baldwin’s wife, Hilaria, posted a video of herself telling their young daughter that she’d just had a miscarriage. So-called mumfluencers are praised for taking whatever stigma there still is out of breast-feeding, fertility trouble and more. But when I saw a photo of one – taken by whom, her husband? – sitting on a toilet and crying, with a long caption about her miscarriage, I wondered if the cost of stigma removal was self-exploitation. It felt not intimate but voyeuristic, and I know too well how long it takes to recover from these things.
Is it helpful to these women to have these images, taken in the heat of shock and grief, follow them around for ever? (...)
We live in a performative age. We’re rewarded for revealing our private lives to strangers, for exaggerating our emotions online, for sharing every crisis that happens in our bodies, every thought that passes through our heads. So many of us now depend on the reactions of strangers for our own identity. Why, four months later, did I need to post a photo on Instagram of my baby the day after she was born? I tell myself that it’s so family and friends can know all is well, but I’d be lying if I said I didn’t get a kick of validation from seeing the likes rack up. When you can’t even go through one of the most intimate experiences of your life without seeing how others react, how do you know what you feel about anything any more?
It is the height of narcissism (more so than taking a selfie) to assume that my feelings are applicable to all women. I can see online that many women find Teigen’s openness helpful, with some realising, at last, that it wasn’t their fault after all. And while I would have killed anyone who responded to my own miscarriage with an emoji (“So sad! Sadface!”), I can also imagine how, for some, to have millions commenting on a personal loss might be helpful – liberating, even. Once the wall between your private and public lives has dissolved, as it now has for so many, then what’s the difference?
by Hadley Freeman, The Guardian | Read more:
Image: The Project Twins/Synergy
The Vocabulary of Violence
Terrorists (noun): evil brown people.
Thugs (noun): violent black people.
Militia (noun): misunderstood white men. Groups of heavily armed individuals whose actions, while not exactly ideal, deserve compassion and should be looked at within a wider socioeconomic context. Instead of rushing to judgment or making generalisations, one must consider the complex causes (economic anxiety, video games, mental health issues) that have triggered these poor guys into committing mass murder, conspiring to violently overthrow the state or plotting to kidnap government officials.
I’m afraid to say that the misunderstood white men have struck again – or attempted to, at least. On Thursday 13 men were charged in relation to an alleged plot to kidnap Michigan’s Democratic governor, Gretchen Whitmer. The plan was to “grab the bitch”, as they put it, and then try her for “treason”. The eventual goal was to create “a society that followed the US Bill of Rights and where they could be self-sufficient”.
Much of the media coverage of Whitmer’s would-be kidnappers referred to them as members of a Michigan militia group called Wolverine Watchmen. The wolverine, by the way, isn’t just a Marvel character – it’s an animal that looks like a small bear but is actually part of the weasel family. This seems appropriate because “militia” is very much a weasel word. It’s a way to avoid putting white extremists in the same bucket as brown people. It lends them legitimacy. It obfuscates what these people really are.
Governor Whitmer, to her immense credit, was having none of it. “They’re not ‘militias’,” she tweeted on Friday morning. “They’re domestic terrorists endangering and intimidating their fellow Americans. Words matter.”
Donald Trump’s words, in particular, matter. In April the president tweeted “LIBERATE MICHIGAN!” as far-right protesters, many of them armed, railed against stay-at-home orders imposed by Whitmer. Protesters waving semi-automatic rifles later tried to storm the state capitol. “The Governor of Michigan should give a little, and put out the fire,” Trump wrote on 1 May. “These are very good people, but they are angry.”
Trump’s words, Whitmer said in televised comments on Thursday, had served as a “rallying cry” to far-right extremists. Not only had the president refused to condemn white supremacists, he stood on the debate stage last week and told the Proud Boys, a violently racist gang, to “stand back and stand by”. When our leaders “stoke and contribute to hate speech, they are complicit”, Whitmer said.
It’s not just the White House that’s complicit, it’s the media. Kyle Rittenhouse, for example, the 17-year-old accused of killing two protesters in Wisconsin last month, was celebrated as a vigilante by rightwing outlets. “How shocked are we that 17-year-olds with rifles decided they had to maintain order when no one else would?” Tucker Carlson asked on Fox News. Far-right pundit Ann Coulter tweeted that she wanted the teenager “as my president”. The New York Post, meanwhile, published photos of Rittenhouse cleaning up graffiti; he was framed as a concerned citizen rather than a cold-blooded killer.
To be clear: double standards aren’t just a rightwing media problem. A study conducted by Georgia State University last year found that terror attacks carried out by Muslims receive on average 357% more media coverage than those committed by other groups. While this is clearly racist, it’s also dangerous. White supremacists, plenty of evidence shows, are the deadliest domestic threat facing the US. By downplaying the threat of white nationalist terrorism, by finding politer ways to refer to it, the media have allowed it to proliferate. So please, let’s call things by their name. Enough with the “militias”, these people are terrorists.
by Arwa Mahdawi, The Guardian | Read more:
Image: Jeff Kowalsky/AFP/Getty Images
[ed. And labelling them "terrorists" means what? (they should be locked up in Guantanamo without legal recourse for years?)]
Friday, October 9, 2020
It Took One Year to Build My Dream Studio
[ed. Paul Davids is one of my favorite YouTube guitar instructors. Here he describes his new home studio (mine is quite a bit more utilitarian).]
An Earlier Universe Existed Before the Big Bang
An earlier universe existed before the Big Bang and can still be observed today, Sir Roger Penrose has said, as he received the Nobel Prize for Physics.
Sir Roger, 89, who won the honour for his seminal work proving that black holes exist, said he had found six ‘warm’ points in the sky (dubbed ‘Hawking Points’) which are around eight times the diameter of the Moon.
They are named after Prof Stephen Hawking, who theorised that black holes ‘leak’ radiation and eventually evaporate away entirely.
The timescale for the complete evaporation of a black hole is huge, possibly longer than the age of our current universe, making them impossible to detect.
However, Sir Roger believes that ‘dead’ black holes from earlier universes or ‘aeons’ are observable now. If true, it would prove Hawking’s theories were correct.
Sir Roger shared the Wolf Prize in physics with Prof Hawking in 1988 for their work on black holes.
Speaking from his home in Oxford, Sir Roger said: “I claim that there is observation of Hawking radiation.
“The Big Bang was not the beginning. There was something before the Big Bang and that something is what we will have in our future.
“We have a universe that expands and expands, and all mass decays away, and in this crazy theory of mine, that remote future becomes the Big Bang of another aeon.
“So our Big Bang began with something which was the remote future of a previous aeon and there would have been similar black holes evaporating away, via Hawking evaporation, and they would produce these points in the sky, that I call Hawking Points.
“We are seeing them. These points are about eight times the diameter of the Moon and are slightly warmed up regions. There is pretty good evidence for at least six of these points.”
The idea is controversial, although many scientists do believe that the universe operates in a perpetual cycle in which it expands, before contracting back in a ‘Big Crunch’ followed by a new Big Bang. (...)
Sir Roger proved that when objects become too dense they suffer gravitational collapse to a point of infinite density where all known laws of nature cease, called the singularity.
His groundbreaking article is still regarded as the most important contribution to the theory of relativity since Einstein, and strengthened the evidence for the Big Bang. (...)
Commenting on the prize, Prof Martin Rees, Astronomer Royal and Fellow of Trinity College, University of Cambridge, said it was sad that Prof Hawking had not been alive to share the prize.
“Penrose is amazingly original and inventive, and has contributed creative insights for more than 60 years.
“There would, I think, be a consensus that Penrose and Hawking are the two individuals who have done more than anyone else since Einstein to deepen our knowledge of gravity.
“Sadly, this award was too much delayed to allow Hawking to share the credit with Penrose."
by Sarah Knapton, The Telegraph/Yahoo News | Read more:
Image: APA Picturedesk GmbH/Shutterstock
Burning Injustice
Fifty-four degrees centigrade is the highest temperature ever reliably recorded on earth. Registered in California’s Death Valley only two months ago, it signalled what was to come. The next day fires erupted in the north of the state that eventually snowballed into the largest single fire in its history. Among the shocking scenes of red skies and destroyed homes, we might forget that it was as little as two years ago that the last fire season records in California were broken. The smoke from those flames clouded the skies as far away as New York City. Yet the vision it presented of our future could not have been clearer.
Whether it is the flames of the wet Amazon or the fires of the frozen Arctic, wildfires have become the canary in the coal mine. The urgency of a fire is a far cry from the dry scientific language of global warming. They represent everything that is terrifying about climate change. Fire rips through the natural and physical world, leaving behind a blackened and uninhabitable landscape, like watching the next century play out on fast forward. All that is left is a wasteland, showing us, in the words of T.S. Eliot’s poem, “fear in a handful of dust”.
Of the 295,000 people that were evacuated in the 2018 California inferno, two names in particular hit the news. Kim Kardashian and Kanye West were forced to abandon their $60 million mansion in the serene gated community just outside of Los Angeles, known as the Hidden Hills. The Hills are home to several Hollywood stars and celebrities, including Kylie Jenner (the world’s youngest billionaire), Miley Cyrus and Britney Spears.
When the fire finally started to die down, the couple found themselves having to put out the flames of their own publicity crisis. Reports started to emerge that the couple had hired a private fire team to protect their mansion, a decision they were publicly burned for as critics raged that they should not be able to pay for protection. In an attempt to stem the crisis, Kim Kardashian appeared on ‘The Ellen Show’ to present a $100,000 donation to a firefighter and his wife who had lost their homes in the fire, in a declaration of their devotion to the public Californian firefighting service.
Whether Kim and Kanye were wrong for going private is not really the issue here. But it does raise the question – why couldn’t they rely on the public fire service to protect their home? In answering this question, we will see that the climate crisis is a class crisis. As the world warms and becomes ever more hostile to human life, class divides will be sharpened. This is not inevitable. But there are many features of the 2018 Californian wildfires that show the path we are on, an allegory for a century which will be defined by its relationship to the elements.
Fire Services Run by Insurers
During the fires, the Californian fire service was stretched well beyond capacity, having to call in backup from seventeen other states. This was in part due to the gutting of the public service in the era of privatisation. Starting in the 1980s, the US began to promote more and more private actors in the fire industry, under the neoliberal idea that going private would improve efficiency. By 2018, the National Wildfire Suppression Association – the main lobby group representing over 250 private fire-fighting companies – claimed that 40% of the country’s fire service had been privatised.
If there was one company responsible for pioneering the private fire service, it would be the American International Group (AIG) – the world’s largest insurance company. In 2005, AIG kickstarted the business model of getting rich people to pay a massive premium in exchange for a bespoke team. According to the group’s press release, the ‘Wildfire Defence Service’ serves thousands of homes across California and has been taken up by nearly half of the Forbes 400 richest Americans. That AIG was behind these developments is telling. Alongside its bespoke service, the company was also developing a financial product that would help to ultimately set the global economy on fire.
Insurance companies may sound like boring places of little importance, but they played a major part in bringing about the 2008 financial crisis. In the lead up to the crisis, AIG was making billions from reckless financial speculation. When things turned sour, AIG had to turn to the US government for a bailout, with taxpayers forking out $182.3 billion of public money to save the insurance giant. Many of the dodgy deals that led to AIG’s problems trace back to a division in their London office, run by a man called Joseph Cassano, or as the papers call him, “the man who crashed the world”. Despite losing billions, he left AIG without being held to account for his actions and with a massive financial payout: $280 million in cash and an additional $34 million in bonuses.
The story of the Californian wildfires is not just the usual story of privilege paying for protection. To fill the void left by 40 years of privatisation, the government had to rely on its bulging prison population to put out the flames. To this day prisoners make up a vast chunk of the Californian fire service, and they are not just a token part of the force – nearly 40% of Californian firefighters are inmates. That is over 4,000 people. For their services, they are paid a token $1 an hour; receive no benefits; and if they die on the job, their families are given no compensation. Employing prisoners for barely a wage saves the state of California $100 million a year.
California is infamous for its dramatically oversized and inflated prison population, having grown by 750% since the mid-1970s. According to academic Ruth Wilson Gilmore, the cause of this growth has nothing to do with rising crime rates, which actually fell during this period. The prison population increased because the government built new prisons, in an incarceration construction frenzy that developers proudly called “the biggest in the history of the world”.
The new prisons, paid for largely out of public debt which was never intended to be repaid, provided a new meaning for a state bureaucracy that was under the threat of privatisation. We can see the legacy of this today: California spends six times the amount to put a person behind bars than it does to put them through school. There are now more women in prison in California alone than there were in the United States as a whole in 1970.
From flooding to rising sea levels, fires are not the only ecological threat facing us and science tells us that the damaging effects of climate change will intensify over the coming years. How we respond to these crises will depend on the economic and political institutions that now govern us. What we are witnessing in California is a particularly dystopian vision of the relationship between climate change and class. There, a millionaire class is protected for a steep fee by a multinational corporation that crashed the global economy but was bailed out regardless by taxpayers – who, in turn, have to rely on crumbling state protection. Meanwhile, growing numbers of the poor are locked up and risk their lives fighting the problem for just $1 an hour.
by Ben Tippet, OpenDemocracy via Naked Capitalism | Read more:
Image: PA Images
[ed. Not to mention that former prisoners with a criminal record have significantly reduced chances of finding employment (fire-fighting or otherwise).]
Even When It’s a Big Fat Lie
In October 2017, two months after white supremacists had held a ‘Unite the Right’ rally in Charlottesville, Virginia, Donald Trump’s (then) chief of staff, John Kelly, went on Fox News and delivered a history lesson. ‘The lack of an ability to compromise led to the Civil War,’ he said. ‘Men and women of good faith on both sides made their stand where their conscience had them make their stand.’ Kelly’s comments echoed the president’s remarks in the rally’s immediate aftermath. (‘Some very fine people on both sides,’ Trump said, comparing the marchers – who carried torches and chanted ‘Jews will not replace us’ – with those who had come out to protest against their presence.) In many quarters Kelly was taken to task. But when Trump’s (then) press secretary, Sarah Huckabee Sanders, was asked about it, she concurred. ‘I don’t know that I’m going to get into debating the Civil War,’ she said. ‘But I do know that many historians, including Shelby Foote, in Ken Burns’s famous Civil War documentary, agree that a failure to compromise was a cause of the Civil War.’
Sanders was right: Kelly’s comments could have come straight out of Burns’s documentary, which gave a sympathetic hearing to the notion of the ‘Lost Cause’. ‘Basically,’ Foote said at the start, ‘it was a failure on our part to find a way not to fight that war. It was because we failed to do the thing we really have a genius for, which is compromise. Americans like to think of themselves as uncompromising. Our true genius is for compromising. Our whole government’s founded on it. And it failed.’
Foote was right, too, in a way: the history of federal compromise with the slave states went all the way back to America’s founding: Southern colonies refused to ratify the Declaration of Independence until Thomas Jefferson struck out a clause attacking the slave trade; the constitution counted each slave as three-fifths of a person in the Federal census, granting slave owners disproportionate representation in Congress; the Missouri Compromise admitted Missouri and Maine into the US as a slave and a free state, respectively; the Kansas-Nebraska Act allowed settlers to decide whether or not slavery would be allowed in their territories; and so on. But what Foote – a novelist and popular historian who never held a position at a university – didn’t say was that slavery lay at the heart of every one of these compromises, that all of them favoured the status quo, and that, when the process finally broke down, it did so despite Lincoln’s best efforts to preserve slavery in the South, on condition that it not be allowed to expand into new territories. By this reckoning (which most historians outside the neo-Confederate fringe agree on) the North didn’t fail to compromise; it compromised all the way to the edge of a cliff.
But the Lost Cause – which holds that the war was fought to defend states’ rights and so to save the Southern way of life – won out anyway in the South, where monuments to soldiers who fell in ‘the war of Northern Aggression’ still stand in town squares, because it allowed white Southerners to pretend that men on both sides of the Mason-Dixon Line had fought honourably for their own noble causes. It won out in the North because, in theory, it paved the way for national reconciliation. Despite the best efforts of scholars such as W.E.B. Du Bois, it worked its way into the textbooks, where it remained well into the 20th century. (When I was at school in New York in the 1980s, I was taught that the war had been fought over states’ rights; slavery wasn’t much mentioned.) Now Trump’s White House was invoking the Lost Cause again.
To his credit, Burns was quick to respond to Kelly’s statements. ‘Many factors contributed to the Civil War,’ he said on Twitter. ‘One caused it: slavery.’ By and large, his documentary had made the same point. But Americans who watched The Civil War (39 million of them when the series first aired, and many more since) could have been forgiven for drawing other conclusions – in part, because the avuncular Foote was given so much time to make the opposite case. It left the impression that reasonable people on both sides could have reasonable disagreements about the war’s causes.
None of this went unnoticed when The Civil War was released in 1990. Historians wrote papers. Symposiums were held. In 1996, Oxford published Ken Burns’s ‘The Civil War’: Historians Respond. Two of the essays, by Eric Foner and Leon Litwack, were scathing, but, for the most part, the book’s tone was measured; Burns and Geoffrey Ward, who had written the film’s script, contributed replies. But a funny thing happened as Burns made more documentaries: instead of making more of the views of historians, he shunted them to the sidelines. For all its faults, The Civil War featured 24 historians. Burns’s 2007 film on the Second World War, The War, had 15. The Vietnam War (2017) included two in its chorus of 79 talking heads, and Country Music – which premiered in the States last September and aired, in edited form, on BBC 4 two months later – has only one: Bill Malone, whose book Country Music USA: A Fifty-Year History (1968) provided the template for Burns’s documentary.
‘Going to a dance was sort of like going back home to mama’s, or to grandma’s, for Thanksgiving,’ Malone says, eight minutes in.
Country music is full of songs about little old log cabins that people had never lived in, the old country church that people have never attended. But it spoke for a lot of people who were being forgotten – or felt they were being forgotten. Country music’s staple, above all, is nostalgia. Just a harkening back to the old way of life, either real or imagined.

Burns’s producers interviewed Malone in 2014. Two years later, a lot of Americans who were being forgotten, or felt they were being forgotten, voted for Trump, who promised to return them to the ‘old way of life, either real or imagined’. They were the people country songs spoke to; the people Burns’s new film seems to speak for. ‘It depicts our entire history,’ the singer-songwriter Vince Gill said when he appeared with Burns at the 92nd Street Y in New York last September. ‘And what’s beautiful about the way it’s been depicted is that it’s finally given the respect that it’s never had. As someone that’s kind of given their life to it, to finally see our story told with that is – it’s amazing.’ The filmmakers ‘weren’t part of the culture’, Gill said. ‘They weren’t part of the fibre. They weren’t part of the history. But they told it in such a profound and honest way that it’s light years more compelling than if we could have told it ourselves. I think we would have lied.’
For the most part, they do tell it themselves: Bobby Bare, Garth Brooks, Rosanne Cash, Charlie Daniels, Little Jimmy Dickens, Merle Haggard, Emmylou Harris, Rhiannon Giddens, Kris Kristofferson, Loretta Lynn, Willie Nelson, Dolly Parton, Charley Pride, Randy Scruggs, Connie Smith, Marty Stuart, Dwight Yoakam – a who’s who of Nashville, Austin and Bakersfield turned out for Burns’s camera, 85 strong. They’re a pleasure to watch, and if they’re dishonest, they’re disarming about it. ‘Truth-telling,’ Ketch Secor says in the first episode, ‘which country music at its best is. Truth-telling, even when it’s a big fat lie.’ It’s the stuff in between interviews that’s a drag, because it’s dishonest, too, but in more insidious ways. (...)
There’s a lot more Burns gets wrong, or sweeps under the carpet, and that may be unavoidable, given the scale of his projects. (He tends to work on several at a time.) This one took six years to make, whittling six hundred hours of footage down to sixteen. The credits are 174 names long, not counting the interviewees. Surely, any mistakes must pale next to the effort and service that these films provide. (‘More Americans get their history from Ken Burns than from any other source,’ Stephen Ambrose is supposed to have said, and for better or worse, I believe him.) But you notice, after a while, that the errors all face in a certain direction, and serve to make the same points, while all the things that are supposed to stay under the carpet keep reappearing. That may be unavoidable, too, when you try to make apolitical films about highly charged subjects. But country music is about as politically charged as an American cultural subject could be because, in a sense, it’s the Lost Cause set to a I-IV-V chord progression: the broken heart longing for simpler times, mother and home, and some sense of stability (stand-ins for the old Southern manse, where the log cabins were also slave shacks); the lip-service paid to Christian values (coupled with belligerence, blood-lust, knee-jerk patriotism, and a native distrust of authority); the lingering persistence of minstrel-show stereotypes, melodies and songs.
Wednesday, October 7, 2020
Grapefruit Is One of the Weirdest Fruits on the Planet
In 1989, David Bailey, a researcher in the field of clinical pharmacology (the study of how drugs affect humans), accidentally stumbled on perhaps the biggest discovery of his career, in his lab in London, Ontario. Follow-up testing confirmed his findings, and today there is not really any doubt that he was correct. “The hard part about it was that most people didn’t believe our data, because it was so unexpected,” he says. “A food had never been shown to produce a drug interaction like this, as large as this, ever.”
That food was grapefruit, a seemingly ordinary fruit that is, in truth, anything but ordinary. Right from the moment of its discovery, the grapefruit has been a true oddball. Its journey started in a place where it didn’t belong, and ended up in a lab in a place where it doesn’t grow. Hell, even the name doesn’t make any sense. (...)
Grapefruit has long been associated with health. Even in the 1800s and before, early chroniclers of fruit in the Caribbean described it as being good for you. Perhaps it’s something about the combination of bitter, sour, and sweet that reads as vaguely medicinal.
This is especially ironic, because the grapefruit, as Bailey would show, is actually one of the most destructive foes of modern medicine in the entire food world.
Bailey works with the Canadian government, among others, testing various medications in different circumstances to see how humans react to them. In 1989, he was working on a blood pressure drug called felodipine, trying to figure out whether alcohol affected response to the drug. The obvious way to test that sort of thing is to have a control group and an experimental group—one that takes the drug with alcohol and one that takes it with water or nothing at all. But good clinical science calls for the study to be double-blind—that is, neither the testers nor the subjects know who belongs to which group. But how do you disguise the taste of alcohol so thoroughly that subjects don’t know they’re drinking it?
“It was really my wife Barbara and I, one Saturday night, we decided to try everything in the refrigerator,” says Bailey. They mixed pharmaceutical-grade booze with all kinds of juices, but nothing was really working; the alcohol always came through. “Finally at the very end, she said, ‘You know, we’ve got a can of grapefruit juice. Why don’t you try that?’ And by golly, you couldn’t tell!” says Bailey. So he decided to give his experimental subjects a cocktail of alcohol and grapefruit juice (a greyhound, when made with vodka), and his control group a glass of unadulterated grapefruit juice.
The blinding worked, but the results of the study were … strange. There was a slight difference in blood pressure between the groups, which isn’t that unusual, but then Bailey looked at the amount of the drug in the subjects’ bloodstreams. “The levels were about four times higher than I would have expected for the doses they were taking,” he says. This was true of both the control and experimental groups. Bailey checked every possible thing that could have gone wrong—his figures, whether the pharmacist gave him the wrong dosage—but nothing was off. Except the grapefruit juice.
Bailey first tested a new theory on himself. Felodipine doesn’t really have any ill effects at high dosage, so he figured it’d be safe, and he was curious. “I remember the research nurse who was helping me, she thought this was the dumbest idea she’d ever heard,” he recalls. But after taking his grapefruit-and-felodipine cocktail, his bloodstream showed that he had a whopping five times as much felodipine in his system as he should have had. More testing confirmed it. Grapefruit was screwing something up, and screwing it up good.
Eventually, with Bailey leading the effort, the mechanism became clear. The human body has mechanisms to break down stuff that ends up in the stomach. The one involved here is cytochrome P450, a group of enzymes that are tremendously important for converting various substances to inactive forms. Drugmakers factor this into their dosage formulation as they try to figure out what’s called the bioavailability of a drug, which is how much of a medication gets to your bloodstream after running the gauntlet of enzymes in your stomach. For most drugs, it is surprisingly little—sometimes as little as 10 percent.
Grapefruit has a high volume of compounds called furanocoumarins, which are designed to protect the fruit from fungal infections. When you ingest grapefruit, those furanocoumarins permanently take your cytochrome P450 enzymes offline. There’s no coming back. Grapefruit is powerful, and those cytochromes are donezo. So the body, when it encounters grapefruit, basically sighs, throws up its hands, and starts producing entirely new sets of cytochrome P450s. This can take over 12 hours.
This rather suddenly takes away one of the body’s main defense mechanisms. If you have a drug with 10 percent bioavailability, for example, the drugmakers, assuming you have intact cytochrome P450s, will prescribe you 10 times the amount of the drug you actually need, because so little will actually make it to your bloodstream. But in the presence of grapefruit, without those cytochrome P450s, you’re not getting 10 percent of that drug. You’re getting 100 percent. You’re overdosing.
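[ed. The arithmetic is worth making explicit. Here is a minimal sketch in Python, using toy numbers (the 10 percent figure from the article, not data on any real drug), of how disabling first-pass metabolism turns a normal prescription into an overdose:]

# Toy model of the grapefruit effect on drug exposure (illustrative only).
# Assumes the simple picture described above: a drug is over-prescribed to
# compensate for low oral bioavailability (F), and grapefruit disables the
# CYP450 enzymes responsible for that low F, pushing F toward 1.0.

def systemic_dose_mg(prescribed_mg: float, bioavailability: float) -> float:
    """Amount of drug actually reaching the bloodstream."""
    return prescribed_mg * bioavailability

target_mg = 10.0                      # dose the bloodstream is meant to receive
normal_f = 0.10                       # only 10% survives first-pass metabolism
prescribed_mg = target_mg / normal_f  # so 100 mg is prescribed

print(systemic_dose_mg(prescribed_mg, normal_f))  # 10.0 mg reaches the blood, as intended
print(systemic_dose_mg(prescribed_mg, 1.0))       # 100.0 mg with grapefruit: a 10x overdose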
And it does not take an excessive amount of grapefruit juice to have this effect: Less than a single cup can be enough, and the effect doesn’t seem to change as long as you hit that minimum.
None of this is a mystery, at this point, and it’s shockingly common. Here’s a brief and incomplete list of some of the medications that research indicates get screwed up by grapefruit:
- Benzodiazepines (Xanax, Klonopin, and Valium)
- Amphetamines (Adderall and Ritalin)
- Anti-anxiety SSRIs (Zoloft and Paxil)
- Cholesterol-lowering statins (Lipitor and Crestor)
- Erectile-dysfunction drugs (Cialis and Viagra)
- Various over-the-counter meds (Tylenol, Allegra, and Prilosec)
- And about a hundred others.
by Dan Nosowitz, Atlas Obscura | Read more:
Image: Stella Murphy
[ed. We used cytochrome P450 during the Exxon Valdez oil spill as a biomarker for exposure. Interestingly, the by-products of its metabolizing function can be more toxic than the original pollutant.]
Chamber of Commerce Quietly Supports a United Government Led by Democrats
One of the more underappreciated pieces of news in a week that exploded with news — the leak of Trump’s taxes, the presidential debate, the presidential disease — was this: a long-time strategist for the U.S. Chamber of Commerce has resigned over the Chamber’s decision to back 23 vulnerable House Democrats and to reduce financial support for Republican senatorial candidates.
From Politico:
Chamber of Commerce and top political strategist part ways amid turmoil

Scott Reed, who had been with the business organization for most of the past decade, said it was shifting toward Democrats.

Scott Reed, the longtime top political strategist for the U.S. Chamber of Commerce, said Tuesday that he left the organization after a political shift at the business lobbying powerhouse.

The move comes amid mounting fears among Republicans — including many within the organization — that the traditionally conservative Chamber is moving to the left after endorsing roughly two dozen freshman House Democrats for reelection this year.

Reed explained his departure (the Chamber said he was “fired for cause”) this way: “I can no longer be part of this institution as it moves left.”

Putting aside the dispute over whether Reed left or was fired, there are two explanations for what the Chamber is doing, and they’re not the same. Reed says he departed because the Chamber “moved left.” The Politico slugline writer says more simply that the Chamber was “shifting toward Democrats.”

Needless to say, “moving left” is not the same as “supporting Democrats.” Ryan Grim, writing at The Intercept, calls the Chamber’s transformation a “slow migration of the elite wing of the Republican Party into the Democratic fold.” This seems a much better explanation.
Hedging Their Bets or Trying to Influence the Outcome?
As Rising’s Saagar Enjeti has noted, the U.S. Chamber of Commerce, which spends $100 million per year, is the largest lobbyist by far in the United States, doling out 30% more money than its nearest competitor.
In the past, all or almost all of that money went to Republicans — 93%, for example, in 2010. This year the Chamber is not only supporting many more Democrats; it’s supporting Democrats in a way that will make a difference in the partisan makeup of Congress. While the Chamber also supports House Republicans, the 29 House freshmen it is backing “are running in some of the most competitive races in the country, including 14 in districts won by President Donald Trump in 2016” according to CNN.
On the Senate side, the Chamber has greatly reduced its spending on vulnerable Republicans, including Sen. Susan Collins (R-Maine). Politico notes that Reed’s decision to resign “was linked to the Chamber’s unwillingness to spend significant money on Senate races in the closing days of the election” and adds that Ms. Collins is receiving “far less money in 2020 than … in 2014, when [the Chamber] put tens of millions of dollars behind GOP Senate candidates.”
Politico has Reed saying the Chamber is “hedging its bets.” Voices on the libertarian right are much more virulent, calling this a “betrayal” and abandonment of “free market principles.” At the same time Republican leaders see the Chamber as, in House minority leader Kevin McCarthy’s words, “part of this socialist agenda that is driving this country out, and … fighting the president.”
Those are angry, empty words. Biden to Trump at the first debate: “I am not a socialist.” Progressives to world: “It’s true. He’s not. He’s a moderate Republican.”
Three Conclusions
From all this I think we can draw three conclusions, each leading to a different electoral thought.
First, that Ryan Grim is right when he says the elite wing of the Republican Party is being folded into the Democratic Party — not just in theory, but in practice, in dollars, as well. It’s clear that the Chamber and those who give it their money have made the calculation, at least for this presidential cycle, that their interests will be genuinely served by a Biden White House and a unified Democratic Congress.
In other words, they want a united government controlled by the Democratic Party. They know Trump is going to lose (Trump was scheduled to lose even before the recent Covid incident), and they’re working to both maintain a Democratic House majority and to sabotage the current Republican Senate majority.
There’s really no other way to read this news.
Second, as stated above, the Chamber of Commerce and the big-league donors who support it know that a Biden White House and Democratic Congress will further their interests far more than a Trump-led divided or Republican government.

If the Chamber is right, progressives looking to “move Biden left” after the election have their work cut out for them. The only “moving left” the administration will do is on identity issues. On issues involving money, it will “move left” only at the margins and for show.

For example, will Biden ban fracking? Of course not; there are too many big-donor dollars (and banking dollars) involved in that industry. For all his recent words, Biden seeks a “middle ground” on climate issues. It’s easy to promise “carbon-free power by 2035,” fifteen years into a future in which he’ll be dead.
Finally, Biden will almost certainly be the next president.
I mentioned a “Trump-led government” above for a reason. Earlier I wrote (“Civil War? What Civil War?”) that almost everyone in the establishment regardless of party, from the military to the national security apparatus to the media to, now, the Chamber of Commerce, opposes a return of Donald Trump to the White House. While they’re not working directly against him — that would be a bridge too far — they’re not helping out; in fact, they’re working to give him a Congress he can’t work with.

The truth is this: Donald Trump is such a terrible, unpredictable and embarrassing steward of the American hegemony project that no one with Establishment power wants to see him back. #NeverTrumpers are just the tip of the Republican side of that iceberg. This “betrayal” by the Chamber of Commerce, one of the Republican Party’s most stalwart and reliable supporters, strongly supports that contention. (...)
You can bet that if the election is closer than the number of disputed ballots in key electoral-college states, there will be a way to hand the election to whichever candidate the Roberts Court prefers. Will John Roberts, a Republican, give the election to MAGA Republicans or to Chamber Republicans, if he could pick one or the other? John Roberts is a Chamber Republican.
by Thomas Neuburger, Naked Capitalism/DownWithTyranny! | Read more:
[ed. Assuming he survives, of course. See also: Washington’s worst-kept secret (Politico).]
The Making of an Hermès Kelly Bag
The Making of an Hermès Kelly Bag (NY Times)
Named for the actress Grace Kelly, who popularized the style in the 1950s, and modeled after a tote long ago used to carry horse saddles, the Hermès Kelly bag requires between 20 and 25 hours of handiwork by a single artisan to create. Such attention to craftsmanship, as illustrated in this video, is representative of the 183-year-old French house’s respect for the time and care that excellence often requires.