Vocal: Lucas A. Engel; Drums: Gonzalo Díaz; Bass: Andi Schneir; Guitars: David Contreras; Keyboards & Programming: Marcelo Nuñez
Sunday, July 26, 2020
A Bunch of Guys in a Band
Vocal: Lucas A. Engel; Drums: Gonzalo Díaz; Bass: Andi Schneir; Guitars: David Contreras; Keyboards & Programming: Marcelo Nuñez
Saturday, July 25, 2020
What’s Happening?
In 1974, Doris Lessing published Memoirs of a Survivor, a postapocalyptic novel narrated by an unnamed woman, almost entirely from inside her ground-floor apartment in an English suburb. In a state of suspended disbelief and detachment, the woman describes the events happening outside her window as society slowly collapses, intermittently dissociating from reality and lapsing into dream states. At first, the basic utilities begin to cut out, then the food supply runs short. Suddenly, rats are everywhere. Roving groups from neighboring areas pass through the yard, ostensibly escaping even worse living conditions and heading somewhere they imagine will be better. Her neighbors disappear, either dead or gone, leaving children behind—children who become feral and increasingly violent. Over the course of a few years, even the children’s language devolves into almost unintelligible jargon and cursing, as if the polite words they have been taught to communicate with no longer suit the survivalist demands of their situation.
The narrator’s myopic view of the outside world reflects the shortsightedness of her culture at large. Nobody, apparently, can admit how bad things are until conditions become completely unlivable, and meanwhile nobody can bear to name “it,” this slow, ongoing collapse with unidentifiable origins. The narrator spends considerable time trying and failing to define “it,” this never-quite-climactic but steady disintegration of life as she knew it. The news barely addresses “it,” and neither do the authorities, who, instead of offering aid, send troops in to police the newly homeless. To the narrator, “it” had never been “felt as an immediate threat”—because it always seemed like a problem elsewhere, relevant to somebody else, but never at the doorstep, until it was far too late. She explains: “While everything, all forms of social organization, broke up, we lived on, adjusting our lives as if nothing fundamental was happening. It was amazing how determined, how stubborn, how self-renewing were the attempts to lead an ordinary life. When nothing, or very little, was left of what we had been used to, had taken for granted even ten years before, we went on talking and behaving as if those old forms were still ours.”
Lessing sticks to the pronoun and describes “it” from an oblique angle, but writers of dystopian fiction have given “it” all sorts of names and causes. These turning points, which many science fiction plotlines hinge on, are similar to what the critic Darko Suvin has called “the novum”—the event or technological novelty that signals the fictional world is different from our own. The event that destroyed the earth in Philip K. Dick’s Do Androids Dream of Electric Sheep? is spurred by a major war called “World War Terminus.” Kim Stanley Robinson’s drowned city in New York 2140 is the product of two major “Pulses,” or moments of drastic sea-level rise. Neal Stephenson marks the inexplicable explosion of the moon in Seveneves by starting a new clock for human time, with the lunar destruction as hour A+0.0.0, or simply “Zero.” In P. D. James’s Children of Men, too, the clock starts over, at the point when humans become infertile and are faced with species demise: Year Omega. The titular event in Liz Jensen’s 2009 The Rapture is a major flood instigated by climate change, the biblical name of which is a not uncommon choice that, like the clock at 0, indicates that something has ended and something has begun anew. Such terminology points to the religious (and moralistic) undertones of much science fiction, a genre that supposedly rests on the supremacy of reason and rationality but is often undeniably theological in structure. One could say the same of Western cultural narratives at large.
A particularly inventive recent name for “it” is William Gibson’s “jackpot,” from his 2014 novel The Peripheral (which continues in the 2020 sequel Agency). The jackpot is what future humans call their previous social collapse, initiated partly by antibiotic-resistant bacterial infections. The choice of term is a somewhat ironic comment on the fact that global population decimation resulting from the plague was highly beneficial for some. The scarcity of an overpopulated world became post-jackpot abundance, at least for those who were poised to take advantage of it. As Gibson himself is said to have remarked, the future is already here, it is just unevenly distributed—an adage he updated in recent years to say that dystopia is also here, it’s just unevenly distributed, too.
Gibson’s jackpot seems like an appropriate term for our times and the current “it” the world is undergoing, which has so far been named the COVID-19 pandemic. While the virus can infect anyone, the pandemic disproportionately affects poor and minority communities when it comes to loss of livelihood and morbidity rates: If health care and basic rights are unevenly distributed, we can assume that this disease, this dystopia, will be too. And, as Gibson shows, we can expect that this disparity will perpetuate or widen after the event, as evidenced by choices like the Trump administration’s stimulus package, which supports “the economy” (i.e., the wealthy and their banks) rather than those most vulnerable. In other words, this pandemic may be hell for most but turn out to be a jackpot for some.
This raises the question: What will “after” the pandemic look like? In some ways it is the wrong question to ask, because event-izing the pandemic and giving it an after implies that there was a true before. Yet as writers of dystopian novels know, there was no before, there was only a time when “it” wasn’t quite so unavoidably visible. The circumstances that gave rise to “it” have been in place for quite some time. Yet until now, like Lessing’s narrator, those of us with the privilege to sit safely inside and watch what is happening outside, through the window, have been able to uphold the pretense that it is neither our responsibility nor our calamity. We have successfully outsourced dystopia to somewhere else. But now it is “here,” because it is everywhere. (...)
On the one hand, naming the crisis allows one to apprehend it, grasp it, fight back against it. On the other hand, no word can fully encompass it, and any term is necessarily a reduction—the essence of “it” or “change” is not any singular instance but rather their constancy. For example, while one could call COVID-19 a biological crisis, one could just as accurately call it a health care crisis, a values crisis, or an ecological crisis. Names matter: Think of how Donna Haraway reframed the Anthropocene era as the “Capitalocene,” redirecting blame from the human species as such to humanity’s current economic system of relentless extraction and exploitation. The Capitalocene is in many ways a more optimistic title for our era than the Anthropocene, because it implies that there is another way: Although we might remain anthropos, we can still construct our world according to a different set of priorities and principles than the ones capitalism allows.
Year Zero is a useful concept for a story to hinge on, because it reflects our entrenched desire for moments of rupture that change everything at once. Disasters do shape history and intervene in the narratives we cling to—but in truth they only catalyze and make visible malignant processes that have been ongoing for a long time. The biggest disasters are the ones that are never identified as such—what Rob Nixon calls “Slow Violence,” those occurrences, like gradual environmental devastation, that disproportionately affect those without a megaphone, and which are not deemed newsworthy because they are not sensational single events. (One could also take up Keller Easterling’s use of the term “disposition” to describe the latent violent attitudes of infrastructure design—from electrical grids to legislation—that are only made manifest when the system spectacularly fails.) The pandemic might also be reframed as a form of slow violence, resulting not only from sudden, invasive “foreign,” nonhuman threats, but also from ongoing, pervasive, systemic power imbalances inside and outside the arbitrary borders we draw around places, people, and concepts.
Slow violence is hard to identify, hard to describe, and hard to resist. But this is one thing literature, postapocalyptic or otherwise, can do: to portray how the short and the long, the small and the big, connect. To identify the rot within rather than the threat without. To articulate “it” even when “it” has no name. Fiction can portray ecologies, timescales, catastrophes, and forms of violence that may be otherwise invisible, or more to the point, unnameable. We will never grasp the pandemic in its entirety, just like we will never see the microbe responsible for it with the naked eye. But we can try to articulate how it has changed us—is changing us.
Postapocalyptic literature probably does not dominate library shelves. Yet Lessing suggests that “it,” that apocalyptic pronoun, may be the hidden subject of all literature, precisely because it is the story of human hope and human failure and the coexistence of the two—the simultaneity of heaven/hell that makes up the human condition on earth. “It” is the essence of change, of human experience.
by Elvia Wilk, Bookforum | Read more:
Image: Jill Mulleady, No Hope No Fear, 2016. Courtesy the artist and Freedman Fitzpatrick, Los Angeles/Paris

Labels: Critical Thought, Fiction, Health, Literature, Psychology
Rep. Alexandria Ocasio-Cortez (D-NY) Responds to Rep. Ted Yoho (R-FL)
[ed. Powerful. I'd also imagine this is probably the first time the term "fucking bitch" has been entered into the Congressional Record by a member of Congress. See also: Rep. Yoho's response (such as it is).]
The Horror Novel Lurking in Your Busy Online Life
In early April, at the height of the pandemic lockdown, Gianpiero Petriglieri, an Italian business professor, suggested on Twitter that being forced to conduct much of our lives online was making us sick. The constant video calls and Zoom meetings were draining us because they go against our brain’s need for boundaries: here versus not here. “It’s easier being in each other’s presence, or in each other’s absence,” he wrote, “than in the constant presence of each other’s absence.”
Petriglieri’s widely retweeted post reads like the germ of a horror tale. The liminal space between presence and absence, reality and unreality, is often where the literature of fear unfolds — a place called the “uncanny.” That old aesthetic term for creeping dread, famously dissected by Freud, is typically now applied to disturbing specimens of digital animation said to reside in the “uncanny valley.”
by Margot Harrison, NY Times | Read more:
Image: Julia Dufossé
Friday, July 24, 2020
Are American Colleges and Universities the Next Covid Casualties?
Long before Donald Trump or Covid-19, the eerie resemblance of American higher education to the old Habsburg Empire was hard to miss. At the top a handful of vintage institutions continued to glitter. They exercised a magnetic attraction on the rest of the world that even intellectual disasters on the scale of the economics discipline before the 2008 financial crisis hardly dented. But most every institution below the royals was at least fraying around the edges. Long before the pandemic hit, many showed clear signs of distress.
The root of that distress is not hard to identify: It is the pressures arising from the decline of the American middle class and the soaring income inequalities of the New Gilded Age. While a few US colleges have lineages stretching back centuries, they and their less venerable competitors dramatically reconfigured themselves during the long boom that followed World War II. Historically rapid economic growth, along with major government funding initiatives such as the GI Bill, post-Sputnik spending on defense and R&D, and Lyndon Johnson’s Great Society, fueled a vast expansion of the whole system.
With college degrees the passport to well-paid, stable employment, going to college became the default expectation of middle-class students and parents and the aspiration of many less affluent households. State supported institutions bulked up, but so did most private colleges and universities. Research institutions, private liberal arts colleges, professional schools, state colleges and universities, and junior colleges nearly all added students and faculty. Many also transformed themselves into conglomerates, branching out into wholly new lines of activity and adding layers of administrators.
The fateful fork in the road came in the nineteen seventies, as economic growth slowed and became far more variable. The declines, along with major campaigns for lower taxes, squeezed both federal and state finances. With direct aid from governments constrained, and advances in biotechnology promising high returns, both Democrats and Republicans encouraged colleges and universities to privatize research performed on their campuses and to spin off products to private industry.
As college costs spiraled upward while middle class incomes stagnated, the market for college education stratified more sharply. A handful of private universities and a very few public ones with deep-pocketed alumni spent big to build internationally competitive programs in science, engineering, and professional schools. In a virtuous circle, those successes attracted further outside funding from both government and industry. A few institutions were so successful at this that student tuition eventually became a secondary factor compared to how their endowments fared in the stock markets.
Over time, the search for outside funding turned increasingly desperate as state support continued falling off, especially after economic downturns. State funding now supplies 21% of the budget – a huge decline from the nineteen seventies – and has been replaced by net tuition revenues, which have grown year after year since 1980.
Permanent faculty are higher education’s institutional memory; they are vital for managing the curriculum in departments and programs, deciding who is qualified to teach what, and determining how students should be assessed. But desperate to save money, colleges and universities steadily chopped back full-time academic positions – from 85% in 1970 to less than 25% today. Instead they filled more and more teaching slots with adjuncts, who are paid much less. Many, according to a new report, live on incomes of $25,000 or less. Because the permanent faculty is less than 25% at institutions outside of the top 150 or so ranked public and private colleges, most instruction is now done by part-timers who are given little or no professional guidance about what or how to teach or how to assess students.
Many colleges, including large numbers of state institutions, also turned to recruiting out-of-state students who could pay full cost. They sought to attract students from abroad, including many from China, for the same reason. In large universities, teaching assistants with an uncertain grasp of English often teach many students.
The nature and amount of student services also changed; many schools, for example, found it necessary to add medical, psychological, and other counseling services for non-traditional students. Rising health costs were a constant problem, especially for part-timers. Many institutions also poured scarce resources into sports success, believing that would inspire increased alumni contributions. They also competed for affluent students by offering hotel-like amenities, state-of-the-art gyms, and other expensive facilities. It did not help that many heads of colleges aspired to be paid like corporate CEOs.
In the wake of the 2008 financial crisis, the Obama administration was reluctant to help states out of their budget shortfalls, while national Republicans were openly opposed. State support for higher ed plunged to new lows. In most states, it never really came back. In 2017, for example, the largest governmental source of revenues for public higher education, state general appropriations, amounted to $87 billion – $2.2 billion below the level of 2007.
Throughout this long time of troubles, both governments and universities encouraged students and their parents to make up the revenue shortfall by taking on debt themselves. Private student lenders gleefully helped, often at rates that were astonishing even by the standards of deregulated American finance. After 2008, as interest rates fell to historically low levels, some private lenders still tried to charge double-digit interest rates for student loans. National student loan debt has risen to over $1.6 trillion in 2020.
The result has been a slow-motion train wreck. The steady growth of a dual economy in the US has made middle-class jobs increasingly scarce and destroyed many previously well-paid, secure jobs. As the Sanders and Warren campaigns made obvious, many students now carry heavy loads of debt when they graduate – if they graduate. Dropout rates, especially of minority students, have soared, and many fewer students – again, especially minority students – find college a practical possibility. Rates of college attendance for Black and Hispanic students run far below those of whites, whose rates have also been declining. Whites and Asians earn a college-level credential at rates about 20% higher than Blacks and Hispanics. At the same time, students with diplomas often cannot find anything resembling an old-fashioned entry-level position, because there are so few to be found.
Now, suddenly, with the Covid-19 pandemic, the long-running financial squeeze threatens to turn overnight into genuine insolvency as institutions struggle to figure out how to safely run instructional systems dependent on in-person activities and support systems all too reminiscent of cruise ships. Duke University’s President Vincent Price recently sent the board, faculty, and staff a memo stating that Duke would need to find an additional $150 million to $200 million to get through the upcoming academic year. University of Michigan and Stanford University administrators project losses on a similar scale. Endowments have likely also taken a hit, though the massive Federal Reserve interventions in financial markets have supported portfolios, if not working Americans.
Duke, Michigan, and Stanford, though, are wealthy institutions with established reputations. Many of these, if they must, can operate online for a good while, if not comfortably, and relatively few students will likely fail to show eventually. By contrast, it is painfully obvious that many less well-endowed institutions are grasping at straws to find ways to reopen in person. They fear that students and parents simply will not pay for online instruction at home from less renowned institutions and many need the tuition to survive. In addition, colleges and universities often garner important revenues from student payments for dorm and meal services. More than a few have substantial debts to service. (...)
Many education leaders are pressing for much larger sums in the next CARES legislative package. Figures of $47 billion or more are being tossed around by groups representing only part of American higher education. There is also discussion of measures protecting universities from at least some liability suits.
Not everyone is on board. A celebrated former president of Harvard, known, if guardedly, to be close to the Biden campaign, has proposed that institutions should take advantage of the crisis to accelerate changes that were in train anyway. In his view, that might lead to wider use of online instruction by a few institutions with strong worldwide brand names. Some of his colleagues are more cautious: they recommend it only for courses in some fields.
By contrast, the outgoing President of the University of California system recently stated flatly that Massive Open Online Courses (MOOCs) have not worked well. That is our view, with the important qualification that for highly motivated students, in some sharply defined contexts, well-designed MOOCs or videotapes can be effective. Absent those, we think that the experience of Princeton and other institutions, where students enrolled in MOOCs stayed away in droves, is likely to be repeated.
by Roger Benjamin and Thomas Ferguson, INET | Read more:
Image: uncredited
[ed. Personally, I'd select a state school with reasonable tuition, then transfer later if a "name" college is important to you.]
The Never-Ending War (and Military Budget)
[ed. Just so you know. $740 billion and counting (every year)... and for what? Who cares any more what we're trying to accomplish over there, after 19 years? Whatever it is, it ain't working (except for defense contractors). See also: House Democrats, Working With Liz Cheney, Restrict Trump’s Planned Withdrawal of Troops From Afghanistan and Germany (The Intercept)]
‘Almost Famous’: The Oral History of a Golden God’s Acid Trip
As the coming-of-age rock tale turns 20, the cast and crew remember how a star’s badly aimed jump and other near-misses became a standout scene.
The writer-director Cameron Crowe was already a beloved voice in cinema for high school tales like “Fast Times at Ridgemont High” and “Say Anything.” Then came “Almost Famous” (2000). Also a coming-of-age story, it gave audiences a backstage pass to the 1970s rock ’n’ roll scene and in the process became a classic.
Loosely based on Crowe’s teenage years as a music journalist covering the Allman Brothers Band and Led Zeppelin, “Almost Famous” is the story of the 15-year-old aspiring scribe William Miller (Patrick Fugit), who gets the opportunity of a lifetime when Rolling Stone magazine sends him from his home in San Diego on tour with Stillwater, fictitious rockers on the verge of fame. As they travel, William forges relationships with the guitarist, Russell Hammond (Billy Crudup), and a “Band Aid,” or groupie, Penny Lane (Kate Hudson).
“Almost Famous” was released on Sept. 13, 2000, and gave Hudson and Fugit their big breaks, and Crowe an Oscar for original screenplay. As it nears its 20th anniversary, the movie has been the subject of a stage adaptation (headed to Broadway but delayed because of the pandemic) and a podcast. It’s beloved in part for standout scenes like one riffing on 1970s rock-star hubris: during a tour stop in Topeka, Kan., Russell crashes a party and ends up on the roof of a house tripping on acid, ready to jump into a murky pool. Before he does, he has a few final words: “I am a golden god!” Also, “I’m on drugs!”
I talked to the film’s cast and crew — including Crudup, Fugit, Crowe, the editor Joe Hutshing, the production designer Clay Griffith, and the costume designer Betsy Heimann — about how that scene came to be.
Led Zeppelin’s Robert Plant was the inspiration, and not just for the quote.
CAMERON CROWE There was a famous moment where [Plant] was on the balcony of the Continental Hyatt House, [the Sunset Strip hotel known as] the Riot House. I think there’s even a photo of the moment, and he’s holding his hands out, and he said, “I am a golden god!” It’s Zeppelin lore.
BILLY CRUDUP The reason Robert Plant said it was because he had long, golden hair. Because my hair is brown, I wasn’t making that connection at all. I was just imagining that Russell was thinking of himself as some sort of tribal idol.
CROWE When I was writing it, I was thinking, I love the playful relationship Plant had with his own image. Sometimes he would wear a Robert Plant fan T-shirt that someone had thrown onstage. He just had a wonderful sense of humor about his position as a big-time rock star. Wearing that shirt was saying that he understands the fan experience. In effect he’s one of them. Russell being with fans was doing a version of the same thing. “We understand each other, and I can even collaborate with you about my deepest feeling from your roof!”
CRUDUP Cameron, from his perspective as somebody who had spent time around these [famous] people, wanted to articulate their utter humanness, even with somebody like Russell. What it took was an acid trip for him to expose this childlike experience of being a rock god.
CROWE Russell goes to the fan’s home in search of “what’s real.” This was originally one of the discarded titles for the movie, “Something Real.” Before Russell is pulled back into the hermetically sealed world of Stillwater, I just wanted to make sure that we celebrated the fans there because “Almost Famous” is so much about us as fans of music. To me, the sequence with Billy on the roof, in the pool, that whole joke lives in a little bit of a love letter to the fans, which is what I always wanted it to be.
The film’s Topeka party sequence became an actual party.
CRUDUP We spent, I think, two days or three days in that house with all of the background artists who were playing the partygoers. To have everybody gathered around was quite a joyful experience.
CROWE Mary-Louise Parker [Crudup’s partner for a time] showed up, Nancy Wilson [the Heart rocker who contributed music to the film and Crowe’s wife at the time] showed up, and it really became like a Topeka party.
CLAY GRIFFITH We cast the extras more by their look. We had a bunch of extras one day in San Diego, and they all looked too contemporary. I’m not sure how to say what a 1970s look is, but it definitely was [very] 1990s. So, we went to local places, I can’t remember if it was bars or restaurants, starting to pick people out and asking if they wanted to be in this movie.
FUGIT [That house] was the first time I was around a lot of people closer to my age.
CROWE The people that lived there became part of the party, too. But it was filled with these extras. It was heavenly because, as it happened a few times, life became the movie, which became life, which became the movie. Patrick was that age and entranced with Kate Hudson, just like the character.
FUGIT She came from Hollywood royalty and [her mother, Goldie Hawn, and stepfather, Kurt Russell] would come to set. That was crazy to me because I had grown up watching a lot of their movies. Also Kate is beautiful and talented. It made an impression on 16-year-old me. I crushed on her for two months. As we got more and more into filming, she became much like an older sister.
by Ilana Kaplan, NY Times | Read more:
Image: Almost Famous/DreamWorks Pictures
[ed. I loved this movie (and a later one - Singles - about the Seattle music scene). See also: FILM: BACK TO THE 70's; The Extraordinary Adolescence of Cameron Crowe (NYT).]
Thursday, July 23, 2020
Heat Domes
A perfect storm of crises is forming across the United States. Above our heads, a “heat dome” of high pressure could blast 80 percent of the continental US with temperatures over 90 degrees for the next few weeks. This comes in a summer when the Covid-19 lockdown has trapped people indoors, many without air-conditioning—and mass unemployment may mean that residents with AC units can’t afford to run them. Deeper still, the heat and the pandemic are exacerbating long-standing and deadly inequities that will only get deadlier this summer.
A heat dome “is really just sort of a colloquial term for a persistent and/or strong high-pressure system that occurs during the warm season, with the end result being a lot of heat,” says climate scientist Daniel Swain of UCLA’s Institute of the Environment and Sustainability.
That high-pressure air descends from above and gets compressed as it nears the ground. Think about how much more pressure you experience at sea level than at the top of a mountain—what you’re feeling is the weight of the atmosphere on your shoulders. As the air descends and gets compressed, it heats up. “So the same air that’s maybe 80 degrees a few thousand feet up, you bring that same air—without adding any extra energy to it—down to the surface in a high-pressure system and it could be 90, 95, 100 degrees,” says Swain.
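[ed. A quick back-of-the-envelope check of Swain's numbers, assuming descending air warms at the standard dry adiabatic lapse rate of about 9.8°C per kilometer (roughly 5.4°F per 1,000 feet); the function and constant names below are just illustrative.]

```python
# Rough sketch: warming of a sinking air parcel under a high-pressure system.
# Assumes the standard dry adiabatic lapse rate (~5.4 degF per 1,000 ft).
DRY_ADIABATIC_DEGF_PER_1000FT = 5.4

def surface_temp_degf(temp_aloft_degf: float, altitude_ft: float) -> float:
    """Approximate temperature a parcel reaches after descending to the surface."""
    return temp_aloft_degf + DRY_ADIABATIC_DEGF_PER_1000FT * altitude_ft / 1000

# Swain's example: 80 degF air a few thousand feet up.
print(surface_temp_degf(80, 3000))  # -> 96.2, i.e. mid-90s at the surface
```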
At the same time, a high-pressure system keeps clouds from forming by inhibiting upward vertical motion in the atmosphere. Oddly enough, it’s this same phenomenon that produces extremely cold temperatures in the winter. “If you don’t have that upward vertical motion, you don’t get clouds or storms,” Swain says. “So when it’s already cold and dark, that means the temperatures can get really cold because of clear skies, as things radiate out at night. In the warm season, that lack of clouds and lack of upward motion in the atmosphere means it can get really hot because you have a lot of sunlight.”
That heat can accumulate over days or weeks, turning the heat dome into a kind of self-perpetuating atmospheric cap over the landscape. On a normal day, some of the sun’s energy evaporates water from the soil, meaning that solar energy isn’t put toward further warming the air. But as the heat dome persists, it blasts away the soil’s moisture, and that solar energy now goes full-tilt into heating the air.
“So after a certain point, once it’s been hot enough for long enough, it becomes even easier to get even hotter,” says Swain. “And so that’s why these things can often be really persistent, because once they’ve been around for a little while, they start to feed off of themselves.”
Unfortunately for the southwestern US, this is likely to unfold in the next week or two. Normally at this time of year, monsoons would be drenching the landscape, but no such storms are on the horizon. “And so those super dry land surfaces are going to amplify the heat and the persistence of this heat dome,” says Swain. The central US and mountain states will also be sweltering particularly badly over the next few weeks—heat domes tend to perpetuate inland, where they more easily dry out the surface than in wetter regions—though over three-quarters of the Lower 48 will be under the dome’s influence.
This won’t be the last heat dome, or the most severe one. On a warming planet, the conditions are ripe for these systems to perpetuate themselves. Harsher droughts mean ever-drier soils, so when future heat domes settle over the US, they’ll start from the beginning with more solar energy heating the air instead of the wet ground. And thanks to climate change, those air temperatures will be hotter even before a heat dome arrives.
by Matt Simon, Mother Jones | Read more:
Image: Lloyd Fox/Zuma
‘Superspreading’ Events Appear to be Propelling the Pandemic
It wasn’t until Day 7 of her team’s coronavirus investigation that it dawned on Linda Vail, the health officer for Michigan’s Ingham County, that this was going to be a big one. It had started with just two infections at the college bar on June 18, not long after the state began reopening. But the numbers quickly jumped to 12, then 18, then 34.
As of Friday, she was staring at a spreadsheet with 187 infected at Harper’s Restaurant and Brew Pub.
“The tables were six feet apart, but no one stayed there,” she said. “The DJ was playing music so people were shouting, the dance floor started to get crowded. We had flattened the curve and then boom.”
The East Lansing case is what’s known as a superspreading event — possibly the largest so far in the United States among the general public. Many scientists say such infection bursts — probably sparked by a single, highly infectious individual who may show no signs of illness and unwittingly share an enclosed space with many others — are driving the pandemic. They worry these cases, rather than routine transmission between one infected person and, say, two or three close contacts, are propelling case counts out of control.
More than 1,000 suspected clusters — ranging from the single digits to thousands — have been logged in a database compiled by a coder in the Netherlands. A megachurch in South Korea. A political rally in Madrid. An engagement party in Rio de Janeiro. Nearly all took place indoors, or in indoor-outdoor spaces.
Even as the Trump administration pressures schools to reopen this fall, the latest research suggests that understanding how and why these events occur — and how to prevent them — is key to reopening safely. In recent days, governors from at least 18 states, including Michigan, have backtracked on plans to loosen restrictions due to outbreaks.
But even those efforts may fail if people ignore the most common ways the virus is considered to spread. Transmission, it turns out, is far more idiosyncratic than previously understood. Scientists say they believe it is dependent on such factors as an individual’s infectivity, which can vary person to person by billions of virus particles, whether the particles are contained in large droplets that fall to the ground or in fine vapor that can float much further, and how much the air in a particular space circulates.
Donald Milton, a professor of environmental health at the University of Maryland, and other experts have wondered if superspreading events could be the “Achilles’ heel” of the virus. If we could pinpoint the conditions under which these clusters occur, Milton argued, we could lower the transmission rate enough to extinguish the spread.
“If you could stop these events, you could stop the pandemic,” Milton said. “You would crush the curve.” (...)
As we enter the seventh month of the global pandemic, scientists are still frustratingly in the dark when it comes to key aspects of how the virus is transmitted.
Why, for instance, didn’t the earliest infections in the United States, or the infamous Lake of the Ozarks party, spur lots of cases, while a much smaller gathering at a Michigan bar produced nearly 200? Why out of countless large gatherings held — church services, soccer games, choir rehearsals, and Zumba classes — did only a fraction ignite significant infections?
Part of the uneven spread of the coronavirus — and the phenomenon of superspreading — can be explained by extreme individual variation in infectivity, researchers say.
Some people will not transmit the virus to anyone, contact tracing has shown, while others appear to spread the virus with great efficiency. Overall, researchers have estimated in recent studies that some 10 to 20 percent of the infected may be responsible for 80 percent of all cases. (...)
A growing body of evidence suggests that SARS-CoV-2, like other coronaviruses, expands in a community in fits and starts, rather than more evenly over space and time. Adam Kucharski of the London School of Hygiene and Tropical Medicine has estimated that the value of what’s known as the k-parameter — a measure of how much a virus tends to cluster — indicates that just 10 percent of people may be responsible for 80 percent of novel coronavirus cases.
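[ed. A quick simulation makes the k-parameter concrete. The Python sketch below — using illustrative values of R0 = 2.5 and k = 0.1, assumptions chosen for demonstration rather than figures from Kucharski’s analysis — draws secondary-case counts from a negative binomial distribution, the standard model for overdispersed transmission, and checks what share of new infections trace back to the most infectious 10 percent of cases:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Illustrative parameters (assumptions, not estimates from the study):
R0 = 2.5   # mean number of secondary cases per infection
k = 0.1    # dispersion: small k = highly clustered transmission

# Negative binomial parameterized by mean R0 and dispersion k:
# variance = R0 + R0**2 / k, so small k produces a long right tail.
n_infected = 100_000
secondary = rng.negative_binomial(n=k, p=k / (k + R0), size=n_infected)

# What fraction of all transmission comes from the top 10% of spreaders?
secondary_sorted = np.sort(secondary)[::-1]
top10 = secondary_sorted[: n_infected // 10].sum() / secondary.sum()
never = (secondary == 0).mean()

print(f"Share of cases caused by top 10% of spreaders: {top10:.0%}")
print(f"Share of infected who transmit to no one:      {never:.0%}")
# With k near 0.1, the top decile typically accounts for roughly 80% of
# cases, and well over half of infected people infect no one at all.
```

Runs like this reproduce the skew described below: most chains die out on their own, while a small minority of cases drive nearly all of the spread.]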
Real-world data corroborates the skewed transmission pattern.
In a detailed analysis of outbreaks in Hong Kong, for example, researchers found three distinct groups of incidents. The superspreading individuals, representing 20 percent of the total, were responsible for 80 percent of transmissions. A second group, involving about 10 percent of cases, transmitted the virus to one or two others. The final group, 70 percent, did not infect anyone else at all.
In Israel, investigators looking at 212 cases concluded that they could be linked back to 1 to 10 percent of people. And in an outbreak in a South Korea office tower, investigators found that about 45 percent of 216 workers got the virus from a single person. In the United States, an analysis from five counties in Georgia found that superspreading appeared to be “widespread across space and time,” and that 2 percent of the infected seeded 20 percent of the cases.
Most of these events took place in coronavirus hot spots of which most people are now aware: buildings where people live in close quarters, such as nursing homes, prisons, worker dormitories and cruise ships. There have been a fair number of clusters at meat-processing and frozen food factories, as well as at a curling event in Edmonton, Canada, leading some to speculate that temperatures could be a factor. (...)
The rest of the known superspreading events were set in a hodgepodge of social venues where people gather in crowds: concerts, sports games, weddings, funerals, churches, political rallies, restaurants, shopping centers. And nearly all took place indoors, or in venues with indoor-outdoor spaces.
by Ariana Eunjung Cha, Washington Post | Read more:
GPT-3: AI Takes a Major Leap Forward
This year is likely to be remembered for the Covid-19 pandemic and for a significant presidential election, but there is a new contender for the most spectacularly newsworthy happening of 2020: the unveiling of GPT-3. As a very rough description, think of GPT-3 as giving computers a facility with words that they have had with numbers for a long time, and with images since about 2012.
The core of GPT-3, which is a creation of OpenAI, an artificial intelligence company based in San Francisco, is a general language model designed to perform autofill. It is trained on uncategorized internet writings, and basically guesses what text ought to come next from any starting point. That may sound unglamorous, but a language model built for guessing with 175 billion parameters — 10 times more than previous competitors — is surprisingly powerful.
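[ed. To see in miniature what “guesses what text ought to come next” means, here is a toy next-word predictor in Python. It counts word pairs in a tiny corpus and greedily autofills the most frequent continuation — conceptually the same objective GPT-3 is trained on, though GPT-3 learns 175 billion parameters over vast swaths of internet text rather than a handful of bigram counts. This is an illustrative sketch, not OpenAI’s code:

```python
from collections import Counter, defaultdict

# Toy "autofill": count which word follows which, then extend a prompt
# with the most likely next word, one step at a time.
corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog ."
).split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def autofill(word, steps=4):
    """Greedily extend `word` with the most frequent continuations."""
    out = [word]
    for _ in range(steps):
        followers = bigrams.get(out[-1])
        if not followers:
            break
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

print(autofill("the"))  # -> "the cat sat on the"
```

The leap from this to GPT-3 is one of scale and architecture, not of task: both are just predicting what comes next.]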
The eventual uses of GPT-3 are hard to predict, but it is easy to see the potential. GPT-3 can converse at a conceptual level, translate language, answer email, perform (some) programming tasks, help with medical diagnoses and, perhaps someday, serve as a therapist. It can write poetry, dialogue and stories with a surprising degree of sophistication, and it is generally good at common sense — a typical failing for many automated response systems. You can even ask it questions about God.
Imagine a Siri-like voice-activated assistant that actually did your intended bidding. It also has the potential to outperform Google for many search queries, which could give rise to a highly profitable company.
GPT-3 does not try to pass the Turing test by being indistinguishable from a human in its responses. Rather, it is built for generality and depth, even though that means it will serve up bad answers to many queries, at least in its current state. As a general philosophical principle, it accepts that being weird sometimes is a necessary part of being smart. In any case, like so many other technologies, GPT-3 has the potential to rapidly improve.
It is not difficult to imagine a wide variety of GPT-3 spinoffs, or companies built around auxiliary services, or industry task forces to improve the less accurate aspects of GPT-3. Unlike some innovations, it could conceivably generate an entire ecosystem. (...)
Like all innovations, GPT-3 involves some dangers. For instance, if prompted by descriptive ethnic or racial words, it can come up with unappetizing responses. One can also imagine that a more advanced version of GPT-3 would be a powerful surveillance engine for written text and transcribed conversations. Furthermore, it is not an obvious plus if you can train your software to impersonate you over email. Imagine a world where you never know who you are really talking to — “Is this a verified email conversation?” Still, the hope is that protective mechanisms can at least limit some of these problems.
We have not quite entered the era where “Skynet goes live,” to cite the famous movie phrase about an AI taking over (and destroying) the world. But artificial intelligence does seem to have taken a major leap forward. In an otherwise grim year, this is a welcome and hopeful development. Oh, and if you would like to read more here is an article about GPT-3 written by … GPT-3.
by Tyler Cowen, Bloomberg | Read more:
Image: Wall-E
[ed. From my casual surfing adventures, it does seem like GPT-3 has exploded recently. For example: OpenAI’s new language generator GPT-3 is shockingly good—and completely mindless (MIT Technology Review).]
Who Are Some of the Living Masters of Their Craft?
Who is someone alive, regardless of field, that seems to be a master, one of the best, in their craft?
Musicians: which is the most technically competent?
Writers, actors, directors: who is the most competent?
It's easy for me to think of the greats after they are done - Jackie Chan, Mozart, etc. Who's currently among the greats?
Terence Tao in Mathematics, Elon Musk in Entrepreneurship, Warren Buffett in Investing, the Dalai Lama in Buddhism and Putin in Statecraft.
level 2
Five_Decades
5 points·4 days ago
I agree with some answers but not others.
I admire how Musk has become successful in a wide range of ventures. People shit on him, but most entrepreneurs are lucky to succeed in one endeavor. Musk has at least half a dozen going on.
He put the yellow pages online, then he revolutionized online banking. Then he built electric cars, then he advanced space travel. Then he helped create the hyperloop, and now he is creating space-based broadband. He is also working on computer/brain interfaces. Granted, he himself didn't do it all alone; he had a lot of highly talented workers under him. But the point is that he isn't like most entrepreneurs, who succeed at one thing. He has a list of a few things he deems important (renewable energy, space travel, AI, the internet and genetics). All his companies are designed to advance these five things.
Hopefully before he dies he will have advanced 20+ different kinds of technology.
As far as Putin, I'm not sure. I think Putin may be overplaying his hand and will eventually face a massive pushback from Europe due to his behavior.
level 2
bbqturtle
2 points·5 days ago
Can you elaborate on the dalai lama? I've literally never heard of him outside his title.
Yeah so he passed his Geshe degree (which is something like PhD in Tibetan Buddhism) back in old Lhasa before the Chinese invasion and met the highest standards of scholarship in front of thousands of monks.
He does 5 hours meditation per day, even in his 80's, so yeah he's a complete master of Buddhist theory and practice.
And then as head of the government he overthrew himself and peacefully established a democracy for the Tibetan people in exile.
What's most mad is I was watching a film about him recently and he doesn't harbour ill will towards the Chinese. He said he met Mao and liked that he was a zealous reformer and he believes that equality and economic development are good goals.
He just has this really impartial vision for all human beings to be happy, even those who have caused him and his people so much suffering he just sees them as confused, like a mental patient who ends up hurting those they love without understanding.
And yeah he doesn't proselytise Buddhism and works for religious harmony, he actively promotes scientific understanding and says when science and Buddhism conflict science should take precedence. He talks a lot about non-religious spirituality where the most important thing is to have a warm heart of compassion.
He's not without controversy, but people say power corrupts and he has had intense power all his life and he's come out of it looking pretty amazing.
Guitar: Mark Knopfler
level 1
mrspecial
14 points·5 days ago
As a guitarist and someone who works in the music business, I would posit that most household-name guitarists wouldn’t even crack the top ten. People like Knopfler, Clapton, and Hendrix probably couldn’t keep up when matched against the abilities of studio players like Brent Mason, Dean Parks, and Tommy Tedesco.
level 2
leworthy
4 points·5 days ago
Although in the cases mentioned I agree, I think there is a larger concern here, which is how to define "technically competent". Because I think it is not clear, at the cutting edge of any art, where technique ends and other things begin.
I mean, we all know what "technical skill" is in a guitarist - but we also all know that thousands of session musicians around the world have it maxed out to 100. As do Korean teenagers. Can creativity be a technical skill? Can originality? My issue is, if we measure "technical skill" in its trivial sense, we will end up with maybe 500,000 "best guitarists" in the world - and no way to pick between them (this is a conservative estimate).
I do realise that extending the definition of "technical skill" like this is going beyond the op's question, but I think the wider point the op makes about mastery is more important.
In short, I am saying that technical mastery (even at the highest level) is an insufficient definition for mastery proper in any art - and that adopting it as a working definition will result in an enormous number of "masters" - most of whom will never make a meaningful impact on their area of expertise.
From the SSC sub-reddit (more):
Jacob Collier
Nostalgia Reimagined
He was still too young to know that the heart’s memory eliminates the bad and magnifies the good, and that thanks to this artifice we manage to endure the burden of the past. But when he stood at the railing of the ship and saw the white promontory of the colonial district again, the motionless buzzards on the roofs, the washing of the poor hung out to dry on the balconies, only then did he understand to what extent he had been an easy victim to the charitable deceptions of nostalgia.
– From Love in the Time of Cholera (1985) by Gabriel García Márquez

The other day I caught myself reminiscing about high school with a kind of sadness and longing that can only be described as nostalgia. I felt imbued with a sense of wanting to go back in time and re-experience my classroom, the gym, the long hallways. Such bouts of nostalgia are all too common, but this case was striking because there is something I know for sure: I hated high school. I used to have nightmares, right before graduation, about having to redo it all, and would wake up in sweat and agony. I would never, ever like to go back to high school. So why did I feel nostalgia about a period I wouldn’t like to relive? The answer, as it turns out, requires we rethink our traditional idea of nostalgia.

By the early 20th century, nostalgia was considered a psychiatric rather than neurological illness – a variant of melancholia. Within the psychoanalytic tradition, the object of nostalgia – ie, what the nostalgic state is about – was dissociated from its cause. Nostalgia can manifest as a desire to return home, but – according to psychoanalysts – it is actually caused by the traumatic experience of being removed from one’s mother at birth. This account began to be questioned in the 1940s, with nostalgia once again linked to homesickness. ‘Home’ was now interpreted more broadly to include not only concrete places, such as a childhood town, but also abstract ones, such as past experiences or bygone moments. While disagreements lingered, by the second part of the 20th century, nostalgia began to be characterised as involving three components. The first was cognitive: nostalgia involves the retrieval of autobiographical memories. The second, affective: nostalgia is considered a debilitating, negatively valenced emotion. And third, conative: nostalgia comprises a desire to return to one’s homeland. As I’ll argue, however, this tripartite characterisation of nostalgia is likely wrong.
by Felipe De Brigard, Aeon | Read more:
Image: Winslow Homer
Wednesday, July 22, 2020
Experimental Blood Test Detects Cancer up to Four Years before Symptoms Appear
For years scientists have sought to create the ultimate cancer-screening test—one that can reliably detect a malignancy early, before tumor cells spread and when treatments are more effective. A new method reported today in Nature Communications brings researchers a step closer to that goal. By using a blood test, the international team was able to diagnose cancer long before symptoms appeared in nearly all the people it tested who went on to develop cancer.
“What we showed is: up to four years before these people walk into the hospital, there are already signatures in their blood that show they have cancer,” says Kun Zhang, a bioengineer at the University of California, San Diego, and a co-author of the study. “That’s never been done before.”
Past efforts to develop blood tests for cancer typically involved researchers collecting blood samples from people already diagnosed with the disease. They would then see if they could accurately detect malignant cells in those samples, usually by looking at genetic mutations, DNA methylation (chemical alterations to DNA) or specific blood proteins. “The best you can prove is whether your method is as good at detecting cancer as existing methods,” Zhang says. “You can never prove it’s better.”
In contrast, Zhang and his colleagues began collecting samples from people before they had any signs that they had cancer. In 2007 the researchers began recruiting more than 123,000 healthy individuals in Taizhou, China, to undergo annual health checks—an effort that required building a specialized warehouse to store the more than 1.6 million samples they eventually accrued. Around 1,000 participants developed cancer over the next 10 years.
Zhang and his colleagues focused on developing a test for five of the most common types of cancer: stomach, esophageal, colorectal, lung and liver malignancies. The test they developed, called PanSeer, detects methylation patterns in which a chemical group is added to DNA to alter genetic activity. Past studies have shown that abnormal methylation can signal various types of cancer, including pancreatic and colon cancer.
The PanSeer test works by isolating DNA from a blood sample and measuring DNA methylation at 500 locations previously identified as having the greatest chance of signaling the presence of cancer. A machine-learning algorithm compiles the findings into a single score that indicates a person’s likelihood of having the disease. The researchers tested blood samples from 191 participants who eventually developed cancer, paired with the same number of matching healthy individuals. They were able to detect cancer up to four years before symptoms appeared with roughly 90 percent accuracy and a 5 percent false-positive rate.
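[ed. As a rough illustration of the kind of pipeline the article describes — many methylation measurements compressed into a single risk score — here is a hedged Python sketch using synthetic data and plain logistic regression. The sample sizes, effect sizes, and model here are stand-in assumptions; PanSeer’s actual assay, features, and algorithm differ:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_sites = 400, 500           # 500 methylation sites, as in the article

# Synthetic methylation fractions in [0, 1]; "cases" get a small upward
# shift at a subset of informative sites (a fabricated signal for demo).
X = rng.beta(2, 5, size=(n_samples, n_sites))
y = rng.integers(0, 2, size=n_samples)  # 0 = stayed healthy, 1 = later developed cancer
X[y == 1, :50] += 0.15                  # signal at 50 of the 500 sites
X = np.clip(X, 0, 1)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# The classifier compiles 500 measurements into one per-person score.
risk_score = model.predict_proba(X_te)[:, 1]
print(f"AUC on held-out synthetic samples: {roc_auc_score(y_te, risk_score):.2f}")
```

The real study's harder problem is the data, not the classifier: collecting blood years before diagnosis so the model can be validated on genuinely pre-symptomatic samples.]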
by Rachel Nuwer, Scientific American | Read more:
Image: Getty
[ed. I wonder what happened to all the enthusiasm for AI, CRISPR, Big Data, etc. to solve all our problems. And in latest Covid19 news): The Pandemic May Very Well Last Another Year or More (vaccine production - Bloomberg); and Rapid, Cheap, Less Accurate Coronavirus Testing Has A Place, Scientists Say (NPR).]
What You Need To Know About The Battle of Portland
The city of Portland, Oregon is currently in the national spotlight after video evidence of federal agents driving rented vans and abducting activists went viral. This footage was taken in the early morning hours of July 15, and an Oregon Public Broadcasting article published on the 16th brought the matter out of the local social networks of Portland activists and on to the national stage.
As I write this, mainstream media personalities are beginning to parachute into Portland to cover what some have dubbed the “fascist takeover of Portland”. The word “Gestapo” is trending on Twitter.
The Beginning (...)
At a little before 11 p.m., several dozen protesters began to shatter the windows of the Justice Center. They entered the building, trashing the interior and lighting random fires inside. I watched all this happen from feet away, and it is my opinion that the destruction was unplanned, yet more or less inevitable — you could feel it in the mood of the crowd. The 3rd Precinct in Minneapolis had just burned: there was absolutely no way Portland wasn’t going to try to do the same thing.
Of course, the Portland Police Bureau (PPB) arrived very shortly thereafter. In one of the more gentlemanly moments of the entire uprising, they gave a warning to people who’d brought their families and dogs, urging them to leave. A sizable chunk of more moderate demonstrators went home. A thousand or more protesters ranked up, and began shouting at the police. At a little after midnight, the PPB launched the first of what would eventually be hundreds of tear gas grenades into the crowd.
The crowd scattered, pushed by police in several different directions at once. They split into several groups. One rampaged through a series of downtown banks, shattering windows and lighting fires as they ran from the cops. Another, larger group of demonstrators tore through the luxury shopping district, sacking the Apple Store, Louis Vuitton, H&M and, eventually, looting a Target. The rest of the night was a messy haze of gas, flash-bangs, and burning barricades.
The Portland Police have stated that more than a dozen riots took place over the last fifty days, but May 29th remains the only night that truly felt like the actual people of this city were rioting. (...)
The Edge of All-Out War
On July 4th, Portland’s thirty-ninth consecutive night of protests, more than a thousand people assembled in front of the Justice Center and Federal Courthouse downtown. They began launching dozens of commercial-grade fireworks into the concrete facades of both buildings, prompting a response from the police and federal agents inside both buildings.
What followed resembled nothing so much as a medieval siege. The windows of both government buildings had been covered in plywood weeks ago, after the first riots. Officers inside fired out through murder holes cut in the plywood, pumping rubber bullets, pepper balls and foam rounds into the crowd, while the crowd formed phalanxes of shield-bearers to protect the men and women launching fireworks back in response. Federal agents dumped tear gas into the street, but Portland’s frontline activists had long since lost their fear of gas. The feds and the police were eventually forced to sally out with batons to drive the crowd back.
I reported on the fighting in Mosul back in 2017, and what happened that night in the streets of Portland was, of course, not nearly as brutal or dangerous as actual combat. Yet it was about as close as you can get without using live ammunition. At times, dozens of flash-bangs and fireworks would detonate within feet of us over the course of a few minutes. My ears rang for days afterwards. My hands shook. I could not write for days. (...)
Perhaps this will change as the protests continue. But thus far, the only escalation seen recently has been the federal agents now roaming the streets of downtown Portland in rented vans, arresting activists seemingly at random. These men display no identification, no name tag or badge number or anything else that might be useful identifying them. That fact has rightly shocked Americans across the country, but at this point, it is nothing new to Portland protesters.
Portland Police have been hiding their names for weeks, instead using numbers that cannot be correlated to names by any means available to citizens. Members of multiple different law enforcement agencies, all with different rules of engagement from the PPB, have been policing demonstrations since the very beginning. As Tuck Woodstock, a local reporter, noted on Twitter:
“This is the natural escalation of the last 7 weeks. This is what has come of Portlanders protesting police brutality for 50 days: more bizarre acts of police brutality. Portlanders are risking everything every day. Please notice.”

That is, in the end, what both the Portland press corps and the people out in the streets, protesting every night, seem to want from the rest of the United States. Please pay attention to the videos of officers ripping people’s face masks off to spray mace directly into their mouths. Please pay attention to the video of Donovan LaBella, blood gushing from his head, seizing on the ground. And, yes, please pay attention to the videos of men in full combat gear abducting activists off the street.
Pay attention, because it is my belief that all of this will not stay confined to Portland. Your city might be next.
by Robert Evans, Bellingcat | Read more:
Images: Mason Trinca for The New York Times
[ed. Wow. Total craziness (with videos). Federal intervention is doing nothing but inflame an already bad situation.]
Tuesday, July 21, 2020
New Psychedelic Research Sheds Light on Why Psilocybin-Containing Mushrooms Have Been Consumed for Centuries
A new study from the Center for Psychedelic and Consciousness Research at Johns Hopkins University School of Medicine provides insight into the psychoactive effects that distinguish psilocybin from other hallucinogenic substances. The findings suggest that feelings of spiritual and/or psychological insight play an important role in the drug’s popularity.
The new study has been published in the journal Psychopharmacology.
“Recently there has been a renewal of interest in research with psychedelic drugs,” explained Roland R. Griffiths, a professor of psychiatry and behavioral sciences who is the corresponding author of the new study.
“Studies from the Johns Hopkins Center for Psychedelic and Consciousness Research and elsewhere suggest that psilocybin, a classic psychedelic drug, has significant potential for treating various psychiatric conditions such as depression and drug dependence disorders. This study sought to address a simple but somewhat perplexing question: Why do people use psilocybin?”
“Psilocybin, in the form of hallucinogenic mushrooms, has been used for centuries for the psychoactive effects. Recent US survey studies show that lifetime psilocybin use is relatively modest and quite stable over a period of decades,” Griffiths explained.
“However, the National Institute on Drug Abuse does not consider psilocybin to be addictive because it does not cause uncontrollable drug seeking behavior, does not produce classic euphoria, does not produce a withdrawal syndrome, and does not activate brain mechanisms associated with classic drugs of abuse.”
In the double-blind study, 20 healthy participants with histories of hallucinogen use received doses of psilocybin, dextromethorphan (DXM), and a placebo during five experiment sessions. (...)
The researchers found that most of the participants reported wanting to take psilocybin again. But only 1 in 4 reported wanting to take DXM again.
“The study showed that several subjective features of the drug experience predicted participants’ desire to take psilocybin again: psychological insight, meaningfulness of the experience, increased awareness of beauty, positive social effects (e.g. empathy), positive mood (e.g. inner peace), amazement, and mystical-type effects,” Griffiths explained.
Nearly half of the participants rated their experience following the highest psilocybin dose to be among the top most meaningful and psychologically insightful of their lives.
“The study provides an answer to the puzzle for why psilocybin has been used by people for hundreds of years, yet it does not share any of the features used to define classic drugs of abuse. The answer seems to reside in the ability of psilocybin to produce unique changes in the human conscious experience that give rise to meaning, insight, the experience of beauty and mystical-type effects,” Griffiths said.
by Eric W. Dolan, PsyPost | Read more:
Image: uncredited
Monday, July 20, 2020
Breaking Into The Close-Knit World Of Country Music, While Keeping Distant
The early experiments in COVID-19-era concerts have been watched closely, because the stakes are clearly high on both sides of the coin: the possibility of salvaging lost musical livelihoods has to be balanced against any potential exposure risks for all involved. So far, country acts seem to have outpaced those in other genres when it comes to experimenting with live show layouts, from those that separate pods of attendees in vehicles or outdoor suites to those that let audience members cram in shoulder to shoulder, just like the good, old, pre-pandemic days.
But concerts aren't the only career-furthering events that the Nashville industry has been going without. Another type of gathering routinely happens out of public view, its function to promote new acts and new music to industry gatekeepers, tastemakers and professional peers. Often, a small bar or venue will be rented out for these boozy, schmoozy shindigs. It's about getting face time, as opposed to FaceTime, so artists will work the room making friendly conversation. If they're new to the game, they're likely to have a publicist by their side providing guidance and making introductions.

Of course, lockdown brought those rituals to a halt too, but attempts to safely (and resourcefully) replace them have begun.
Country success tends to require staunch participation in the Nashville community. That's one of the many reasons that Lil Nas X's winking, cowboy-burlesquing, hip-hop virality seemed so out of step with country music's establishment, at least initially — he pulled it off without them.
Today's centralized country music industry is the result of once geographically, culturally and stylistically disparate and distinct threads being woven into a consolidated, popular (and artificially whitewashed) format, with its business and creative infrastructure and towering historical narrative staunchly headquartered in Nashville. It's not easy to launch and sustain a recording and touring career in this world without winning over some of its major players, securing the approval, opportunities, institutional support and media coverage they have to give. Performing know-how matters, but so does being eager and personable.
So the events where up-close access happens do serve a real purpose: The albums that were released rather than postponed just after the nation went into quarantine — which included established names at pivotal points in their careers, like Ashley McBryde, Sam Hunt, Kelsea Ballerini and Maddie & Tae, and promising country-pop newbie Ingrid Andress — likely forfeited some of the recognition they might have gotten if plans to promote them in person hadn't been canceled out of necessity. It's no wonder that record label, management and publicity staff are seeking innovative stopgaps.
Just over a week before the release of the debut country single by Shy Carter — a biracial, Memphis-bred, singer-rapper who had a decade of pop-R&B, hip-hop and country songwriting credits and guest spots under his belt, but was new to the Warner Nashville roster — a label rep emailed with the offer of an "At-Home Artist Visit." The proposed scenario involved Carter giving some sort of brief performance on a flatbed truck that would be parked in front of my residence. I was too curious, about both the extravagance of the scheme and the act it was meant to introduce, to decline.
As it turned out, what rolled up in front of my house around lunchtime that Friday wasn't an industrial hauler at all — just a shiny, silver pickup.
A guy who identified himself as Carter's brother hopped out the passenger side first, followed by Carter himself, both of them extending greetings. I asked Carter how many stops he'd already covered on his mini-tour that day. This was only his second, he reported brightly, implying the specialness of the visit.
The driver walked to the back of the truck, lowered the tailgate and slid a small portable PA onto it, while Carter's sole live accompanist swung his legs over the side of the truck bed and perched there, balancing an acoustic guitar on his lap. The remaining two members of the entourage, one of them a publicist, emerged from a second vehicle, keeping a respectable distance on the sidewalk. A few neighbors walked over and spread out in the yard to watch.
We were all wearing fabric face masks, but Carter removed his to sing into a microphone, revealing a luminous smile. He was a seasoned and charismatic enough entertainer to seem entirely unfazed by the awkwardness of serenading me from my sidewalk.
by Jewly Hight, NPR | Read more:
Image: Jewly Hight