Tuesday, November 27, 2018
Self-Care Won't Save Us
It is somewhere between one and two in the morning and, as per usual, I am flicking through internet tabs. Without really taking anything in, I am dividing my attention between a recipe for broccoli and peanut butter soup (one which has been in my favorites tab for maybe three years, still never attempted), some news story about a terrible event in which many people have needlessly died, and the usual social media sites. Scrolling down my Facebook feed, in between the enviable holiday snaps and the links to more sad news stories—people don’t talk very much on Facebook any more, I’ve noticed; it’s mostly a conduit for the exchanging of links—a picture catches my eye. It’s a cartoon of a friendly-looking blob man, large-eyed and edgeless, wrapped up in blankets. The blob man is saying “It’s okay if all you want to do today is just stay in bed and watch Netflix.” I draw up my covers, nodding to no one in particular, and flick to a tab with my favorite old TV show.
The above story doesn’t refer to any particular night that I can remember. But the general theme is one that I’ve played out again and again. I’m not sure I’m ever going to make that soup.
If you’re a millennial with regular access to the internet, you’ve probably seen images similar to the cartoon I’ve described above. They’re usually painted in comforting primary colors or pastels, featuring simple illustrations, accompanied by text in a non-threatening font. They invite you to practice ‘self-care’, a term that has been prominent in healthcare theory for many decades but has recently increased in visibility online. The term generally refers to a variety of techniques and habits that are supposed to help with one’s physical and mental well-being, reduce stress, and lead to a more balanced lifestyle. “It’s like if you were walking outside in a thunderstorm, umbrella-less, and you walked into a café filled with plush armchairs, wicker baskets full of flowers, and needlepoints on the walls that say things like ‘Be kind to yourself’ and ‘You are enough,’” says The Atlantic. Though the term has a medical tinge to it, the language used in the world of self-care is more aligned with the world of self-help, and much of the advice commonly given in the guise of self-care will be familiar to anyone who has browsed the pop-psychology shelves of a bookstore or listened to the counsel of a kindly coworker—take breaks from work and step outside for fresh air, take walks in the countryside, call a friend for a chat, have a lavender bath, get a good night’s sleep. Light a candle. Stop being so hard on yourself. Take time off if you’re not feeling so well and snuggle under the comforter with a DVD set and a herbal tea. Few people would argue with these tips in isolation (with a few exceptions—I think herbal tea is foul). We should all be making sure we are well-fed, rested, and filling our lives with things that we enjoy. In a time when people—especially millennials, at whom this particular brand of self-care is aimed—are increasingly talking about their struggles with depression, anxiety and insecurities, it’s no wonder that “practicing self-care” is an appealing prospect, even if it does sometimes seem like a fancy way to say “do things you like.” What is concerning is the way that this advice appears to be perfectly designed to fit in with a society that seems to be the cause of so much of the depression, anxiety, and insecurities. By finding the solution to young people’s mental ill-health (be it a diagnosed mental health problem or simply the day-to-day stresses of life) in do-it-yourself fixes, and putting the burden on the target audience to find a way to cope, the framework of self-care avoids having to think about issues on a societal level. In the world of self-care, mental health is not political, it’s individual. Self-care is mental health care for the neoliberal era.
As I write, the U.K. Prime Minister, Theresa May, is tweeting about World Mental Health Day and suicide prevention. She is not the only one; scrolling through the trending hashtags (there are several) one can find lots of comforting words about taking care of yourself, about opening up, confiding in a friend, keeping active, taking a breath. One such tweet is a picture of an arts-and-craftsy cut-out of a bright yellow circle behind dull green paper, designed to look like a cheerful sun. Printed on the sun are the words “everything will be so good so soon just hang in there & don’t worry about it too much.” All of us have probably seen some variation of these words at many points in our lives, and probably found at least a little bit of momentary relief in them. But looking through other tweets about World Mental Health Day reveals a different side of the issue. People talk about the times they did try to seek help, and were left to languish on waiting lists for therapy. They talk about the cuts to their local services (if they’re from somewhere with universal healthcare) or the insurance policies that wouldn’t cover them (if they’re in the United States). They talk about the illnesses left cold and untouched by campaigns that claim to reduce stigma—personality disorders, bipolar disorder, schizophrenia. They talk about homelessness and insecure housing and jobs that leave them exhausted. They talk about loneliness. And, in the case of Theresa May, they talk about how the suicide prevention minister she promises to hire will have to deal with the many people who consider suicide in response to her government’s policies. These are deep material and societal issues that all of us are touched by, to at least some degree. We know it when we see people begging in the streets, when we read yet another report that tells us our planet is dying, when we try to figure out why we feel sad and afraid and put it down to an ‘off day’, trying not to think about just how many ‘off days’ we seem to have. We turn to our TVs, to our meditation apps, and hope we can paper over the cracks. We are in darkness, and when we cry out for light, we are handed a scented candle.
A common sentiment expressed in the world of self-care is that anyone can suffer from mental ill-health. This is true, but it’s not the entire story. In fact, mental health problems are strongly correlated with poverty, vulnerability, and physical health conditions (with the causation going both ways). Furthermore, there is a big difference between those of us who are fortunate enough to be able to take time off work for doctor’s appointments and mental health days, and those who can’t; those of us who have children or other dependents to take care of, and those who don’t; those of us who have the financial independence to take a break from our obligations when we need to, and those who don’t. Not all people have the same access to help, or even access to their own free time—employers increasingly expect workers to be available whenever they are needed, both in white-collar jobs and precarious shift work. Add in the (heavily gendered) responsibilities of being a parent, studying, a night-time Uber gig to cover the bills, or a long commute from the only affordable area in the city, and the stress of life will pile on even as it soaks up the time you’re supposed to set aside to relieve that stress. Funding cuts are in fashion across a plethora of Western countries, both to healthcare and to other services that indirectly affect our health, especially the health of people who need additional support to lead the lives they wish to live, or even just to survive. The rhetoric around self-care is flattering but flattening, treating its audience as though the solution to their problems is believing in themselves and investing in themselves. This picture glosses over the question of what happens when society does not believe or invest in us.
Even for those of us who are relatively lucky in life, self-care does not solve our problems. “It’s okay if all you did today was breathe,” promises a widely-shared image macro of a gentle talking pair of lungs. Well, I hate to break it to you, talking lungs, but it’s 2018. We’re supposed to be walking powerhouses of productivity, using every minute of our time to its best effect. In an economic environment where careers are precarious and competitive, young people are increasingly pressured to give up their free time to take on extracurriculars and unpaid projects “for their resume,” produce creative content “for exposure,” learn skills such as coding, scout for jobs on LinkedIn, write self-promoting posts about their personal qualities, and perhaps worst of all, attend godawful networking events, some of which don’t even have free canapés.
by Aisling McCrae, Current Affairs | Read more:
Image: Lizzy Price
The Crisis
Tom Tomorrow
via:
[ed. See also: World must triple efforts or face catastrophic climate change, says UN. And, (of course) 'I don't believe it'. (The Guardian)]
The Insect Apocalypse Is Here
Sune Boye Riis was on a bike ride with his youngest son, enjoying the sun slanting over the fields and woodlands near their home north of Copenhagen, when it suddenly occurred to him that something about the experience was amiss. Specifically, something was missing.
It was summer. He was out in the country, moving fast. But strangely, he wasn’t eating any bugs.
For a moment, Riis was transported to his childhood on the Danish island of Lolland, in the Baltic Sea. Back then, summer bike rides meant closing his mouth to cruise through thick clouds of insects, but inevitably he swallowed some anyway. When his parents took him driving, he remembered, the car’s windshield was frequently so smeared with insect carcasses that you almost couldn’t see through it. But all that seemed distant now. He couldn’t recall the last time he needed to wash bugs from his windshield; he even wondered, vaguely, whether car manufacturers had invented some fancy new coating to keep off insects. But this absence, he now realized with some alarm, seemed to be all around him. Where had all those insects gone? And when? And why hadn’t he noticed?
Riis watched his son, flying through the beautiful day, not eating bugs, and was struck by the melancholy thought that his son’s childhood would lack this particular bug-eating experience of his own. It was, he granted, an odd thing to feel nostalgic about. But he couldn’t shake a feeling of loss. “I guess it’s pretty human to think that everything was better when you were a kid,” he said. “Maybe I didn’t like it when I was on my bike and I ate all the bugs, but looking back on it, I think it’s something everybody should experience.”
I met Riis, a lanky high school science and math teacher, on a hot day in June. He was anxious about not having yet written his address for the school’s graduation ceremony that evening, but first, he had a job to do. From his garage, he retrieved a large insect net, drove to a nearby intersection and stopped to strap the net to the car’s roof. Made of white mesh, the net ran the length of his car and was held up by a tent pole at the front, tapering to a small, removable bag in back. Drivers whizzing past twisted their heads to stare. Riis eyed his parking spot nervously as he adjusted the straps of the contraption. “This is not 100 percent legal,” he said, “but I guess, for the sake of science.”
Riis had not been able to stop thinking about the missing bugs. The more he learned, the more his nostalgia gave way to worry. Insects are the vital pollinators and recyclers of ecosystems and the base of food webs everywhere. Riis was not alone in noticing their decline. In the United States, scientists recently found the population of monarch butterflies fell by 90 percent in the last 20 years, a loss of 900 million individuals; the rusty-patched bumblebee, which once lived in 28 states, dropped by 87 percent over the same period. With other, less-studied insect species, one butterfly researcher told me, “all we can do is wave our arms and say, ‘It’s not here anymore!’ ” Still, the most disquieting thing wasn’t the disappearance of certain species of insects; it was the deeper worry, shared by Riis and many others, that a whole insect world might be quietly going missing, a loss of abundance that could alter the planet in unknowable ways. “We notice the losses,” says David Wagner, an entomologist at the University of Connecticut. “It’s the diminishment that we don’t see.” (...)
When the investigators began planning the study in 2016, they weren’t sure if anyone would sign up. But by the time the nets were ready, a paper by an obscure German entomological society had brought the problem of insect decline into sharp focus. The German study found that, measured simply by weight, the overall abundance of flying insects in German nature reserves had decreased by 75 percent over just 27 years. If you looked at midsummer population peaks, the drop was 82 percent.
Riis learned about the study from a group of his students in one of their class projects. They must have made some kind of mistake in their citation, he thought. But they hadn’t. The study would quickly become, according to the website Altmetric, the sixth-most-discussed scientific paper of 2017. Headlines around the world warned of an “insect Armageddon.”
Within days of announcing the insect-collection project, the Natural History Museum of Denmark was turning away eager volunteers by the dozens. It seemed there were people like Riis everywhere, people who had noticed a change but didn’t know what to make of it. How could something as fundamental as the bugs in the sky just disappear? And what would become of the world without them?
***
Anyone who has returned to a childhood haunt to find that everything somehow got smaller knows that humans are not great at remembering the past accurately. This is especially true when it comes to changes to the natural world. It is impossible to maintain a fixed perspective, as Heraclitus observed 2,500 years ago: It is not the same river, but we are also not the same people.
A 1995 study, by Peter H. Kahn and Batya Friedman, of the way some children in Houston experienced pollution summed up our blindness this way: “With each generation, the amount of environmental degradation increases, but each generation takes that amount as the norm.” In decades of photos of fishermen holding up their catch in the Florida Keys, the marine biologist Loren McClenachan found a perfect illustration of this phenomenon, which is often called “shifting baseline syndrome.” The fish got smaller and smaller, to the point where the prize catches were dwarfed by fish that in years past were piled up and ignored. But the smiles on the fishermen’s faces stayed the same size. The world never feels fallen, because we grow accustomed to the fall.
By one measure, bugs are the wildlife we know best, the nondomesticated animals whose lives intersect most intimately with our own: spiders in the shower, ants at the picnic, ticks buried in the skin. We sometimes feel that we know them rather too well. In another sense, though, they are one of our planet’s greatest mysteries, a reminder of how little we know about what’s happening in the world around us. (...)
With so much abundance, it very likely never occurred to most entomologists of the past that their multitudinous subjects might dwindle away. As they poured themselves into studies of the life cycles and taxonomies of the species that fascinated them, few thought to measure or record something as boring as their number. Besides, tracking quantity is slow, tedious and unglamorous work: setting and checking traps, waiting years or decades for your data to be meaningful, grappling with blunt baseline questions instead of more sophisticated ones. And who would pay for it? Most academic funding is short-term, but when what you’re interested in is invisible, generational change, says Dave Goulson, an entomologist at the University of Sussex, “a three-year monitoring program is no good to anybody.” This is especially true of insect populations, which are naturally variable, with wide, trend-obscuring fluctuations from one year to the next. (...)
Entomologists also knew that climate change and the overall degradation of global habitat are bad news for biodiversity in general, and that insects are dealing with the particular challenges posed by herbicides and pesticides, along with the effects of losing meadows, forests and even weedy patches to the relentless expansion of human spaces. There were studies of other, better-understood species that suggested that the insects associated with them might be declining, too. People who studied fish found that the fish had fewer mayflies to eat. Ornithologists kept finding that birds that rely on insects for food were in trouble: eight in 10 partridges gone from French farmlands; 50 and 80 percent drops, respectively, for nightingales and turtledoves. Half of all farmland birds in Europe disappeared in just three decades. At first, many scientists assumed the familiar culprit of habitat destruction was at work, but then they began to wonder if the birds might simply be starving. In Denmark, an ornithologist named Anders Tottrup was the one who came up with the idea of turning cars into insect trackers for the windshield-effect study after he noticed that rollers, little owls, Eurasian hobbies and bee-eaters — all birds that subsist on large insects such as beetles and dragonflies — had abruptly disappeared from the landscape.
The signs were certainly alarming, but they were also just signs, not enough to justify grand pronouncements about the health of insects as a whole or about what might be driving a widespread, cross-species decline. “There are no quantitative data on insects, so this is just a hypothesis,” Hans de Kroon, an ecologist at Radboud University in the Netherlands, explained to me — not the sort of language that sends people to the barricades.
Then came the German study. Scientists are still cautious about what the findings might imply about other regions of the world. But the study brought forth exactly the kind of longitudinal data they had been seeking, and it wasn’t specific to just one type of insect. The numbers were stark, indicating a vast impoverishment of an entire insect universe, even in protected areas where insects ought to be under less stress. The speed and scale of the drop were shocking even to entomologists who were already anxious about bees or fireflies or the cleanliness of car windshields.
by Brooke Jarvis, NY Times | Read more:
Image: Photo illustrations by Matt Dorfman. Source photographs: Bridgeman Images
The Case for Dropping Out of College
During the summer, my father asked me whether the money he’d spent to finance my first few years at Fordham University in New York City, one of the more expensive private colleges in the United States, had been well spent. I said yes, which was a lie.
I majored in computer science, a field with good career prospects, and involved myself in several extracurricular clubs. Since I managed to test out of some introductory classes, I might even have been able to graduate a year early—thereby producing a substantial cost savings for my family. But the more I learned about the relationship between formal education and actual learning, the more I wondered why I’d come to Fordham in the first place.
* * *
According to the not-for-profit College Board, the average cost of a school year at a private American university was almost $35,000 in 2017—a figure I will use for purposes of rough cost-benefit analysis. (While public universities are less expensive thanks to government subsidies, the total economic cost per student-year, including the cost borne by taxpayers, typically is similar.) The average student takes about 32 credits worth of classes per year (with a bachelor’s degree typically requiring at least 120 credits in total). So a 3-credit class costs just above $3,000, and a 4-credit class costs a little more than $4,000.
What do students get for that price? I asked myself this question on a class by class basis, and have found an enormous mismatch between price and product in almost all cases. Take the two 4-credit calculus classes I took during freshman year. The professor had an unusual teaching style that suited me well, basing his lectures directly on lectures posted online by MIT. Half the class, including me, usually skipped the lectures and learned the content by watching the original material on MIT’s website. When the material was straightforward, I sped up the video. When it was more difficult, I hit pause, re-watched it, or opened a new tab on my browser so I could find a source that covered the same material in a more accessible way. From the perspective of my own convenience and education, it was probably one of the best classes I’ve taken in college. But I was left wondering: Why should anyone pay more than $8,000 to watch a series of YouTube videos, available online for free, and occasionally take an exam?
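For anyone who wants to check the per-class arithmetic above, here is a minimal sketch in Python; the $35,000 sticker price, the 32-credit year, and the two 4-credit calculus classes are the article’s figures, and the rounding is mine.

```python
# Back-of-envelope cost per credit and per class, using the figures cited above.
annual_cost = 35_000      # average private-university year (College Board, 2017)
credits_per_year = 32     # typical full-time course load

per_credit = annual_cost / credits_per_year      # ~$1,094 per credit
three_credit_class = 3 * per_credit              # ~$3,281 -- "just above $3,000"
four_credit_class = 4 * per_credit               # ~$4,375 -- "a little more than $4,000"
two_calculus_classes = 2 * four_credit_class     # ~$8,750 -- the "more than $8,000" for freshman calculus

print(f"per credit: ${per_credit:,.0f}")
print(f"3-credit class: ${three_credit_class:,.0f}, 4-credit class: ${four_credit_class:,.0f}")
print(f"two 4-credit calculus classes: ${two_calculus_classes:,.0f}")
```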
Another class I took, Philosophical Ethics, involved a fair bit of writing. The term paper, which had an assigned minimum length of 5,000 words, had to be written in two steps—first a full draft and then a revised version that incorporated feedback from the professor. Is $3,250 an appropriate cost for feedback on 10,000 words? That’s hard to say. But consider that the going rate on the web for editing this amount of text is just a few hundred dollars. Even assuming that my professor is several times more skilled and knowledgeable, it’s not clear that this is a good value proposition.
“But what about the lectures?” you ask. The truth is that many students, including me, don’t find the lectures valuable. As noted above, equivalent material usually can be found online for free, or at low cost. In some cases, a student will find that his or her own professor has posted video of his or her own lectures. And the best educators, assisted with the magic of video editing, often put out content that puts even the most renowned college lecturers to shame. If you have questions about the material, there’s a good chance you will find the answer on Quora or Reddit.
Last semester, I took a 4-credit class called Computer Organization. There was a total of 23 lectures, each 75 minutes long—or about 29 hours of lectures. I liked the professor and enjoyed the class. Yet, once the semester was over, I noticed that almost all of the core material was contained in a series of YouTube videos that was just three hours long.
Like many of my fellow students, I spend most of my time in class on my laptop: Twitter, online chess, reading random articles. From the back of the class, I can see that other students are doing likewise. One might think that all of these folks will be in trouble when test time comes around. But watching a few salient online videos generally is all it takes to master the required material. You see the pattern here: The degrees these people get say “Fordham,” but the actual education often comes courtesy of YouTube.
The issue I am discussing is not new, and predates the era of on-demand web video. As far back as 1984, American educational psychologist Benjamin Bloom discovered that an average student who gets individual tutoring will outperform the vast majority of peers taught in a regular classroom setting. Even the best tutors cost no more than $80 an hour—which means you could buy 50 hours of their service for the pro-rated cost of a 4-credit college class that supplies 30 hours of (far less effective) lectures.
All of these calculations are necessarily imprecise, of course. But for the most part, I would argue, the numbers I have presented here underestimate the true economic cost of bricks-and-mortar college education, since I have not imputed the substantial effective subsidies that come through government tax breaks, endowments and support programs run by all levels of government.
So given all this, why are we told that, far from being a rip-off, college is a great deal? “In 2014, the median full-time, full-year worker over age 25 with a bachelor’s degree earned nearly 70% more than a similar worker with just a high school degree,” read one typical online report from 2016. The occasion was Jason Furman, then head of Barack Obama’s Council of Economic Advisers, tweeting out data showing that the ratio of an average college graduate’s earnings to a similarly situated high-school graduate’s earnings had grown from 1.1 in 1975 to more than 1.6 four decades later.
To ask my question another way: What accounts for the disparity between the apparently poor value proposition of college at the micro level and the statistically observed college premium at the macro level? A clear set of answers appears in The Case against Education: Why the Education System Is a Waste of Time and Money, a newly published book by George Mason University economist Bryan Caplan.
One explanation lies in what Caplan calls “ability bias”: From the outset, the average college student is different from the average American who does not go to college. The competitive college admissions process winnows the applicant pool in such a way as to guarantee that those who make it into college are more intelligent, conscientious and conformist than other members of their high-school graduating cohort. In other words, when colleges boast about the “70% income premium” they supposedly provide students, they are taking credit for abilities that those students already had before they set foot on campus, and which they likely could retain and commercially exploit even if they never got a college diploma. By Caplan’s estimate, ability bias accounts for about 45% of the vaunted college premium. Which would mean that a college degree actually boosts income by about 40 points, not the oft-cited 70.
Of course, 40% is still a huge premium. But Caplan digs deeper by asking how that premium is earned. And in his view, the extra income doesn’t come from substantive skills learned in college classrooms, but rather from what he calls the “signaling” function of a diploma: Because employers lack any quick and reliable objective way to evaluate a job candidate’s potential worth, they fall back on the vetting work done by third parties—namely, colleges. A job candidate who also happens to be someone who managed to get through the college admissions process, followed by four years of near constant testing, likely is someone who is also intelligent and conscientious, and who can be relied on to conform to institutional norms. It doesn’t matter what the applicant was tested on, since it is common knowledge that most of what one learns in college will never be applied later in life. What matters is that these applicants were tested on something. Caplan estimates that signaling accounts for around 80% of the 40-point residual college premium described above, which, if true, would leave less than ten percentage points—from the original 70—to be accounted for. (...)
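To make the chain of percentages easier to follow, here is a minimal sketch of that decomposition, assuming only the figures quoted above: a 70-point headline premium, roughly 45% of it attributed to ability bias, and roughly 80% of the remainder attributed to signaling.

```python
# Decomposing the headline college earnings premium, per the figures quoted above.
headline_premium = 70.0      # percentage points: college vs. high-school median earnings
ability_bias_share = 0.45    # Caplan's estimate: ~45% reflects pre-existing ability
signaling_share = 0.80       # Caplan's estimate: ~80% of the remainder is signaling

after_ability_bias = headline_premium * (1 - ability_bias_share)   # 38.5 -> the "about 40 points"
learning_premium = after_ability_bias * (1 - signaling_share)      # 7.7 -> "less than ten percentage points"

print(f"premium net of ability bias: {after_ability_bias:.1f} points")
print(f"premium left for actual learning: {learning_premium:.1f} points")
```

On these numbers, only a small fraction of the original 70-point premium would reflect skills actually acquired in the classroom, which is the point the following paragraphs press further.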
Till now, I have discussed the value of college education in generic fashion. But as everyone on any campus knows, different majors offer different value. In the case of liberal arts, the proportion of the true college premium attributable to signaling is probably close to 100%. It is not just that the jobs these students seek typically don’t require any of the substantive knowledge they acquired during their course of study: They also aren’t really improving students’ analytical skills, either. In their 2011 book Academically Adrift: Limited Learning on College Campuses, sociologists Richard Arum and Josipa Roksa presented data showing that, over their first two years of college, students typically improve their skills in critical thinking, complex reasoning and writing by less than a fifth of a standard deviation.
According to the U.S. Department of Commerce’s 2017 report on STEM jobs, even the substantive educational benefit to be had from degrees in technical fields may be overstated—since “almost two-thirds of the workers with a STEM undergraduate degree work in a non-STEM job.” Signaling likely plays a strong role in such cases. Indeed, since STEM degrees are harder to obtain than non-STEM degrees, they provide an even stronger signal of intelligence and conscientiousness.
However, this is not the only reason why irrelevant coursework pays. Why do U.S. students who want to become doctors, one of the highest paying professions, first need to complete four years of often unrelated undergraduate studies? The American blogger and psychiatrist Scott Alexander, who majored in philosophy as an undergraduate and then went on to study medicine in Ireland, observed in his brilliant 2015 essay Against Tulip Subsidies that “Americans take eight years to become doctors. Irishmen can do it in four, and achieve the same result.” Law follows a similar pattern: While it takes four years to study law in Ireland, and in France it takes five, students in the United States typically spend seven years in school before beginning the separate process of bar accreditation.
by Samuel Knoche, Quillette | Read more:
Image: uncredited
Maybe They’re Just Bad People
Seven years ago, a former aide to Ralph Reed — who also worked, briefly, for Paul Manafort — published a tawdry, shallow memoir that is also one of the more revealing political books I’ve ever read. Lisa Baron was a pro-choice, pro-gay rights, hard-partying Jew who nonetheless made a career advancing the fortunes of the Christian right. She opened her book with an anecdote about performing oral sex on a future member of the George W. Bush administration during the 2000 primary, which, she wrote, “perfectly summed up my groupie-like relationship to politics at that time — I wanted it, I worshiped it, and I went for it.”
It’s not exactly a secret that politics is full of amoral careerists lusting — literally or figuratively — for access to power. Still, if you’re interested in politics because of values and ideas, it can be easier to understand people who have foul ideologies than those who don’t have ideologies at all. Steve Bannon, a quasi-fascist with delusions of grandeur, makes more sense to me than Anthony Scaramucci, a political cipher who likes to be on TV. I don’t think I’m alone. Consider all the energy spent trying to figure out Ivanka Trump’s true beliefs, when she’s shown that what she believes most is that she’s entitled to power and prestige.
Baron’s book, “Life of the Party: A Political Press Tart Bares All,” is useful because it is a self-portrait of a cynical, fame-hungry narcissist, a common type but one underrepresented in the stories we tell about partisan combat. A person of limited self-awareness — she seemed to think readers would find her right-wing exploits plucky and cute — Baron became Reed’s communications director because she saw it as a steppingstone to her dream job, White House press secretary, a position she envisioned in mostly sartorial terms. (“Outfits would be planned around the news of the day,” she wrote.) Reading Baron’s story helped me realize emotionally something I knew intellectually. It’s tempting for those of us who interpret politics for a living to overstate the importance of competing philosophies. We shouldn't forget the enduring role of sheer vanity. (...)
In many ways, the insincere Trumpists are the most frustrating. Because they don’t really believe in Trump’s belligerent nationalism and racist conspiracy theories, we keep expecting them to feel shame or remorse. But they’re not insincere because they believe in something better than Trumpism. Rather, they believe in very little. They are transactional in a way that makes no psychological sense to those of us who see politics as a moral drama; they might as well all be wearing jackets saying, “I really don’t care, do u?”
Baron’s book helped me grasp what public life is about for such people. “I loved being in the middle of something big, and the biggest thing in my life was Ralph,” she wrote in one of her more plaintive passages. “Without him, I was nobody.” Such a longing for validation is underrated as a political motivator. Senator Lindsey Graham, another insincere Trumpist, once justified his sycophantic relationship with the president by saying, “If you knew anything about me, I want to be relevant.” Some people would rather be on the wrong side than on the outside.
by Michelle Goldberg, NY Times | Read more:
How a Japanese Craftsman Lives by the Consuming Art of Indigo Dyeing
Kanji Hama, 69, has quietly dedicated his life to maintaining the traditional Japanese craft of katazome: stencil-printed indigo-dyed kimonos made according to the manner and style of the Edo period. He works alone seven days a week from his home in Matsumoto, Nagano, keeping indigo fermentation vats brewing in his backyard and cutting highly detailed patterns into handmade paper hardened with persimmon tannins to create designs for a craft for which there is virtually no market. Nearly identical-looking garments can be had for a pittance at any souvenir store.
Indigo is one of a handful of blue dyes found in nature, and it’s surprising that it was ever discovered at all, as the plants that yield it reveal no hint of the secret they hold. Unlike other botanical dyestuff, which can be boiled or crushed to release its color, the creation of indigo requires a complex molecular process involving fermentation of the plant’s leaves. (The most common source is the tropical indigo plant, or Indigofera tinctoria, but Japanese dyes are generally made from Persicaria tinctoria, a species of buckwheat.) Everyone who has worked with indigo — from the Tuareg and Yoruba in Africa to the Indians and Japanese across Asia to the prehistoric tribes in the Americas — figured out their own methods for coaxing out the dye, and distinct ways of using it to embellish their clothing, costumes, domestic textiles or ritual objects that were particularly expressive of their own culture and beliefs.
No one knows exactly when indigo arrived in Japan, but beginning around the eighth century, the Japanese began creating a large repertoire of refined traditions for designing with it. Many indigo techniques are intended to hold back, or resist, the dye in certain areas to create designs. Nearly all of these, which include various ways of manipulating the fabric before it is dyed, such as tying it, knotting it, folding it, stitching it, rolling it or applying a gluey substance to it, are used in the great variety of Japanese traditions. But for Hama’s katazome practice, a paste of fermented rice is applied through a stencil laid on top of the fabric. After the fabric has been dipped in an indigo vat, the paste gets washed off and the stenciled design remains. (Resist pastes in other countries often employ local ingredients: Indonesian batik is made with wax, Indian dabu block prints with mud and Nigerian adire with cassava flour.) Katazome, however, unlike the other resist techniques, can yield very intricate and delicate designs because the stencil-making itself, called katagami, is a precise and elaborate craft, unique to Japan.
Matsumoto, which is roughly halfway between Tokyo and Kyoto, was once a center for the Japanese folk craft movement of the 1930s through the 1950s, which recognized and celebrated the beauty of regional, handcrafted everyday objects, or mingei. Hama’s grandfather was part of that movement and a pioneer in reviving natural dyeing after its obsolescence. Hama learned his trade as his father’s apprentice, starting when he was 18, working without salary or holidays, seven days a week for 15 years. (Every evening, from 8 p.m. until about 3 a.m., Hama returned to the studio to practice what he had learned that day.)
Wearing blue work clothes, his hair covered with an indigo scarf and his hands and fingernails stained blue, Hama ushers me to his studio, which occupies the second floor of his house and is outfitted with long, narrow tables built to accommodate lengths of kimono fabric (a standard kimono is about 40 feet long and 16 inches wide). From a back door off the studio, stairs lead to a shed that houses his fermentation vats and a small yard, given over in its entirety to sheaths of dyed kimono fabric, stretched from one end to the other — like long, slender hammocks — to dry.
Of the dozens of steps involved in his process, some are highly complicated and some are simply tedious, such as the repeated washing and starching and rinsing of the fabric, but all are time-consuming. “Craft is doing things with your hands. Once you manufacture things, it is no longer craft,” Hama tells me. As a holdout devoted to maintaining the tradition against all odds, almost to the point of tragic absurdity, Hama is not interested in the easy way. Rather than buy prewashed fabric or premade starch, Hama makes them himself. He sets down one of the stencils he has carved into persimmon-hardened paper called washi — a slight modification of an 18th-century pattern, which he has backed in silk to keep the intricate design intact — onto a length of fabric fastened to one of the tables. (He doesn’t make his own paper or persimmon extract, but only because he doesn’t think the variety of persimmon used today yields the same quality tannins as those from his grandfather’s day. As a result, he has planted a tree from which he hopes one day to make his own.) With a hera, a spatula-like tool, he evenly slathers a glutinous rice paste over the stencil to resist the dye. Because Hama wants a precise consistency to his paste, which varies based on the intricacy of the design and the weather conditions, he mixes his own, a process that takes half a day. He squeegees the excess off the stencil and, by eye, proceeds down the table, lining it up where the previous one left off. The fabric is then hung in the studio to dry before he can do the same work on the other side: Once sewn into a kimono, it won’t even be visible. Next, the fabric is moved outside, where it gets covered in soy milk (also homemade) to help keep the glue in place as it dries in the sun; this is repeated three times on each side before the dyeing can start. We head down to the fermentation dye vats, which are steaming cauldrons cut into the floor of a lean-to shed. Each indigo dyer has his own recipe for adding lime, ash, lye from wood and wheat husks to the sukumo (or composted indigo plant), which must be kept warm and stirred for a couple weeks in order to ferment and become dye in a process called aitate. Hama works according to the seasons. In the summer and monsoon seasons, it is too hot for indigo, as the paste will melt, while in winter, he must rise each morning at 3 a.m. to descend into the cold, adding new coals for a consistent temperature.
Hama is cognizant that what he knows will likely die along with him. Like many masters of traditional crafts in Japan, Hama does not believe in writing down the process, because the craft is understood to be so much more than its individual steps and thus impossible to transmit through written instruction. Indigo dyeing like this is a way of life, and to the extent to which Hama is a master, he possesses not just his own knowledge but, in a very real way, his father’s and his father’s father’s knowledge. This kind of embodied, tacit expertise doesn’t translate easily into English as it involves the very un-Western idea of the body and the intellect working in unison, masterfully and efficiently, as if in a dance. There is a chance his son will take on the business, but Hama thinks this generation is incapable of putting in the time it takes to gain the mastery of a craft like this.
by Deborah Needleman, NY Times | Read more:
Image: Kyoko Hamada. Styled by Theresa Rivera. Photographer’s assistant: Garrett Milanovich. Styling assistant: Sarice Olson. Indigo pieces courtesy of Kanji Hama
[ed. See also: Why Is Japan Still So Attached to Paper?]
Monday, November 26, 2018
An Ecology of Beauty and Strong Drink
According to the theory of cultural evolution, rituals and other cultural elements evolve in the context of human beings. They depend on us for their reproduction, and sometimes help us feel good and accomplish our goals, reproductive and otherwise. Ritual performances, like uses of language, exhibit a high degree of variation; ritual performances change over time, and some changes are copied, some are not. As with genetic mutation, ritual novelty is constantly emerging.
The following presents several ecological metaphors for ritual adaptation: sexual selection, the isolated island, and the clearcut forest. Once these metaphors are established, I will explain how they apply to ritual, and suggest some policy recommendations based on this speculation. (...)
Clearcuts
When a mature natural ecosystem is destroyed by fire, clearcutting, or plowing, a particular process of succession follows. First, plants with a short life history that specialize in colonization emerge; these first-stage plants are often called weeds, or “weedy ephemerals,” and make up a large number of agricultural pest species. But these initial colonizers specialize in colonization at the expense of long-term competitiveness for light. Second, a wave of plants that are not as good at spreading their seed, but a little better at monopolizing light, gain dominance. These are followed by plants that are even better at long-term competition; eventually, absent human interference, the original weeds become rare.
Sometimes, however, the landscape is frozen at the first stage of succession; this is known as agriculture. Second-wave competitive plants are prevented from growing; the land is cleared again and again, and the seeds of a single species planted, providing an optimal environment for short-life-history weeds. Since the survival of humans and their livestock depends on only a few species of plants, other plants that would eventually out-compete the weeds must not be permitted to grow. Instead, herbicides are applied, resulting in selection for better and better weeds.
This is not an indictment of agriculture. Again, without these methods, most humans on earth would die. But the precariousness of the situation is a result of evolutionary processes. Perverse results are common in naive pest management strategies; Kaneshiro (pp. 13-14) suggests that eradication efforts for the Mediterranean fruit fly in California in the 1980s, despite temporarily reducing the population size substantially, paradoxically resulted in the adaptation of the fruit fly to winter conditions and subsequent population explosions. Pesticide resistance in plants and animals (and even diseases) frequently follows a similarly perverse course.
Ritual Ecology
Ecosystems are made up of “selfish” organisms that display variation, and undergo natural and sexual selection. Ecosystems seem to self-repair because any temporarily empty niche will quickly be filled by any organism that shows up to do the job, no matter how ill-suited it may be at first. Economies self-repair in the same manner: a product or service that is not being supplied is an opportunity.
Language appears to be remarkably self-repairing: deaf school children in Nicaragua, provided only with lipreading training of dubious effectiveness, developed their own language, which within two generations acquired the core expressive characteristics of any human language.
While inherited ritual traditions may be extremely useful and highly adapted to their contexts, ritual may exhibit a high degree of self-repair as well. And since the context of human existence has changed so rapidly since the Industrial Revolution, ancestral traditions may be poorly adapted to new contexts; self-repair for new contexts may be a necessity. The human being himself has not changed much, but his environment, duties, modes of subsistence, and social interdependencies have changed dramatically.
Memetic selection is like sexual selection, in that it is based on signal reception by a perceiving organism (another human or group of humans). Rituals are transmitted by preferential copying (with variation); even novel rituals, like the rock concert, the desert art festival, the school shooting, or the Twitter shaming, must be attended to and copied in order to survive and spread.
Some rituals are useful, providing group cohesion and bonding, the opportunity for costly signaling, free-rider detection and exclusion, and similar benefits. Some rituals have aesthetic or affective benefits, providing desirable mental states; these need not be happy, as one of the most popular affective states provided by songs is poignant sadness. Rituals vary in their usefulness, communication efficiency, pleasurability, and prestige; they will be selected for all these qualities.
Ritual is not a single, fungible substance. Rather, an entire human culture has many ritual niches, just like an ecosystem: rituals specialized for cohesion and bonding may display adaptations entirely distinct from rituals that are specialized for psychological self-control or pleasurable feelings. Marriage rituals are different from dispute resolution rituals; healing rituals are distinct from criminal justice rituals. Humans have many signaling and affective needs, and at any time many rituals are in competition to supply them.
Cultural Clearcutting: Ritual Shocks
Ordinarily, rituals evolve slowly and regularly, reflecting random chance as well as changes in context and technology. From time to time, there are shocks to the system, and an entire ritual ecosystem is destroyed and must be repaired out of sticks and twigs.
Recall that in literal clearcutting, short-life-history plants flourish. They specialize in spreading quickly, with little regard for long-term survival and zero regard for participating in relationships within a permanent ecosystem. After a cultural clearcutting occurs, short-life-history rituals such as drug abuse flourish. To take a very extreme example, the Native American genocide destroyed many cultures at one blow. Many peoples who had safely used alcohol in ceremonial contexts for centuries experienced chronic alcohol abuse as their cultures were erased and they were massacred and forcibly moved across the country to the most marginal lands. There is some recent evidence of ritual repair, however; among many Native American groups, alcohol use is lower than among whites, and the ratio of Native American to white alcohol deaths has been decreasing for decades.
Crack cocaine did not spread among healthy, ritually intact communities. It spread among communities that had been “clearcut” by economic problems (including loss of manufacturing jobs), sadistic urban planning practices, and tragic social changes in family structure. Methamphetamine has followed similar patterns.
Alcohol prohibition in the United States constituted both a ritual destruction and a pesticide-style management policy. Relatively healthy ritual environments for alcohol consumption, resulting in substantial social capital, were destroyed, including fine restaurants. American cuisine was set back decades as the legitimate fine restaurants could not survive economically without selling a bottle of wine with dinner. In their place, short-life-history ritual environments, such as the speakeasy, sprang up; they contributed little to social capital, and had no ritual standards for decorum.
During (alcohol) Prohibition, when grain and fruit alcohol was not available, poisonous wood alcohols or other toxic alcohol substitutes were commonly consumed, often (but not always) unknowingly. (It’s surprising that there are drugs more toxic than alcohol, but there you go.) The consumption of poisoned (denatured) or wood alcohol may be the ultimate short-life-history ritual; it contributed nothing to social capital, provided but a brief experience of palliation, and often resulted in death or serious medical consequences. Morgues filled with bodies. The modern-day policy of poisoning prescription opiates with acetaminophen has the same effect as the Prohibition-era policy of “denaturing” alcohol: death and suffering to those in too much pain to pay attention to long-term incentives.
Early 20th century and modern prohibitions clearly don’t eradicate short-life-history drug rituals; rather, they concentrate them in their most harmful forms, and at the same time create a permanent economic niche for distributors. As the recently deceased economist Douglass North said in his Nobel lecture,
The organizations that come into existence will reflect the opportunities provided by the institutional matrix. That is, if the institutional framework rewards piracy then piratical organizations will come into existence; and if the institutional framework rewards productive activities then organizations – firms – will come into existence to engage in productive activities.
If the ritual ecology within a category of ritual provides attractive niches for short-life-history rituals, and the economic ecology provides niches for drug cartels, then these will come into existence and prosper; but if a ritual context is allowed to evolve to encapsulate mind-altering substances, as it has for most human societies in the history of the world, and to direct the use of these substances in specific times, manners, and places, then these longer-life-history rituals specialized for competition rather than short-term palliation will flourish. Prohibition is a pesticide with perverse effects; ritual reforestation is a long-term solution. (...)
I focus on drugs because drugs are interesting, and they provide a tidy example of the processes in ritual ecology. But the same selective effects are present in many domains: music, drama, exercise, food, and the new ritual domain of the internet.
by Sarah Perry, Ribbonfarm | Read more:
Image: Clearcut, Wikipedia
Sunday, November 25, 2018
Of America and the Rise of the Stupefied Plutocrat
At the higher elevations of informed American opinion in the spring of 2018 the voices of reason stand united in their fear and loathing of Donald J. Trump, real estate mogul, reality TV star, 45th president of the United States. Their viewing with alarm is bipartisan and heartfelt, but the dumbfounded question, “How can such things be?” is well behind the times. Trump is undoubtedly a menace, but he isn’t a surprise. His smug and self-satisfied face is the face of the way things are and have been in Washington and Wall Street for the last quarter of a century.
Trump staked his claim to the White House on the proposition that he was “really rich,” embodiment of the divine right of money and therefore free to say and do whatever it takes to make America great again. A deus ex machina descending an escalator into the atrium of his eponymous tower on Manhattan’s Fifth Avenue in June 2015, Trump was there to say, and say it plainly, that money is power, and power, ladies and gentlemen, is not self-sacrificing or democratic. The big money cares for nothing other than itself, always has and always will. Name of the game, nature of the beast.
Not the exact words in Trump’s loud and thoughtless mouth, but the gist of the message that over the next 17 months he shouted to fairground crowd and camera in states red, white and blue. A fair enough share of his fellow citizens screamed, stamped and voted in agreement because what he was saying they knew to be true, knew it not as precept borrowed from the collected works of V.I. Lenin or Ralph Lauren but from their own downwardly mobile experience on the losing side of a class war waged over the past 40 years by America’s increasingly frightened and selfish rich against its increasingly angry and debtbound poor.
Trump didn’t need briefing papers to refine the message. He presented it live and in person, an unscripted and overweight canary flown from its gilded cage, telling it like it is when seen from the perch of the haves looking down on the birdseed of the have-nots. Had he time or patience for looking into books instead of mirrors, he could have sourced his wisdom to Supreme Court Justice Louis Brandeis, who in 1933 presented the case for Franklin D. Roosevelt’s New Deal: “We must make our choice. We may have democracy, or we may have wealth concentrated in the hands of a few, but we can’t have both.”
Not that it would have occurred to Trump to want both, but he might have been glad to know the Supreme Court had excused him from further study under the heading of politics. In the world according to Trump—as it was in the worlds according to Ronald Reagan, George Bush pere et fils, Bill Clinton and Barack Obama—the concentration of wealth is the good, the true and the beautiful. Democracy is for losers.
Ronald Reagan was elected President in 1980 with an attitude and agenda similar to Trump’s—to restore America to its rightful place where “someone can always get rich.” His administration arrived in Washington firm in its resolve to uproot the democratic style of feeling and thought that underwrote FDR’s New Deal. What was billed as the Reagan Revolution and the dawn of a New Morning in America recruited various parties of the dissatisfied right (conservative, neoconservative, libertarian, reactionary and evangelical) under one flag of abiding and transcendent truth—money ennobles rich people, making them healthy, wealthy and wise; money corrupts poor people, making them ignorant, lazy and sick.
Re-branded as neoliberalism in the 1990s the doctrine of enlightened selfishness has served as the wisdom in political and cultural office ever since Reagan stepped onto the White House stage promising a happy return to an imaginary American past—to the home on the range made safe from Apaches by John Wayne, an America once again cowboy-hatted and standing tall, risen from the ashes of defeat in Vietnam, cleansed of its Watergate impurities, outspending the Russians on weapons of mass destruction, releasing the free market from the prison of government regulation, going long on the private good, selling short the public good.
For 40 years under administrations Republican and Democrat, the concentrations of wealth and power have systematically shuffled public land and light and air into a private purse, extended the reach of corporate monopoly, shifted the bulk of the nation’s income to its top-tier fatted calves, let fall into disrepair nearly all the infrastructure—roads, water systems, schools, bridges, hospitals and power plants—that provides a democratic commonwealth with the means of production for its mutual enterprise. The subdivision of America the Beautiful into a nation of the rich and a nation of the poor has outfitted a tenth of the population with three-quarters of the nation’s wealth. The work in progress has been accompanied by the construction of a national security and surveillance state backed by the guarantee of never-ending foreign war and equipped with increasingly repressive police powers to quiet the voices of domestic discontent. In the 1950s the word public indicated a common good (public health, public school, public service, public spirit); private was a synonym for selfishness and greed (plutocrats in top hats, pigs at troughs). The connotations traded places in the 1980s; private to be associated with all things bright and beautiful (private trainer, private school, private plane), public a synonym for all things ugly, incompetent and unclean (public housing, public welfare, public toilet). (...)
The framers of the Constitution, prosperous and well-educated gentlemen assembled in Philadelphia in the summer of 1787, shared with John Adams the suspicion that “democracy will infallibly destroy all civilization,” agreed with James Madison that the turbulent passions of the common man lead to “reckless agitation” for the abolition of debts and “other wicked projects.” With Plato the framers shared the assumption that the best government incorporates the means by which a privileged few arrange the distribution of property and law for the less fortunate many. They envisioned an enlightened oligarchy to which they gave the name of a republic. Adams thought “the great functions of state” should be reserved for “the rich, the well-born, and the able,” the new republic to be managed by men to whom Madison attributed “most wisdom to discern and most virtue to pursue the common good of the society.” (...)
But unlike our present-day makers of money and law, the founders were not stupefied plutocrats. They knew how to read and write (in Latin or French if not also in Greek) and they weren’t preoccupied with the love and fear of money. From their reading of history they understood that oligarchy was well-advised to furnish democracy with some measure of political power because the failure to do so was apt to lead to their being roasted on pitchforks. Accepting of the fact that whereas democracy puts a premium on equality, a capitalist economy does not, the founders looked to balance the divergent ways and means, to accommodate both motions of the heart and the movement of a market. They conceived the Constitution as both organism and mechanism and offered as warranty for its worth the character of men presumably relieved of the necessity to cheat and steal and lie.
The presumption in 1787 could be taken at fair and face value. The framers were endowed with the intellectual energy of the 18th-century Enlightenment, armed with the moral force of the Christian religion. Their idea of law they held to be sacred, a marriage of faith and reason. But good intentions are a perishable commodity, and even the best of oligarchies bear comparison to cheese. Sooner or later they turn rancid in the sun. Wealth accumulates, men decay; a band of brothers that once aspired to form a wise and just government acquires the character of what Aristotle likened to that of “the prosperous fool,” a class of men insatiable in their appetite for more—more banquets, more laurel wreaths and naval victories, more temples, dancing girls and portrait busts—so intoxicated by the love of money “they therefore imagine there is nothing it cannot buy.” (...)
All men were maybe equal in the eye of God, but not in the pews in Boston’s Old North Church, in the streets of Benjamin Franklin’s Philadelphia, in the fields at Jefferson’s Monticello. The Calvinist doctrine of predestination divided the Massachusetts flock of Christian sheep into damned and saved; Cotton Mather in 1696 reminded the servants in his midst, “You are the animate, separate passive instruments of other men . . . your tongues, your hands, your feet, are your masters’ and they should move according to the will of your masters.” Franklin, enlightened businessman and founder of libraries, looked upon the Philadelphia rabble as coarse material that maybe could be brushed and combed into an acceptable grade of bourgeois broadcloth. His Poor Richard’s Almanac offered a program for turning sow’s ears if not into silk purses, then into useful tradesmen furnished with a “happy mediocrity.” For poor white children in Virginia, Jefferson proposed a scheme he described as “raking from the rubbish” the scraps of intellect and talent worth the trouble of further cultivation. A few young illiterates who showed promise as students were allowed to proceed beyond the elementary grades; the majority were released into a wilderness of ignorance and poverty, dispersed over time into the westward moving breeds of an American underclass variously denominated as “mudsill,” “hillbilly,” “cracker,” “Okie,” “redneck,” Hillary Clinton’s “basket of deplorables.”
Nor at any moment in its history has America declared a lasting peace between the haves and have-nots. Temporary cessations of hostilities, but no permanent closing of the moral and social frontier between debtor and creditor. The notion of a classless society derives its credibility from the relatively few periods in the life of the nation during which circumstances encouraged social readjustment and experiment—in the 1830s, 1840s, and 1850s, again in the 1940s, 1950s and 1960s—but for the most part the record will show the game securely rigged in favor of the rich, no matter how selfish or stupid, at the expense of the poor, no matter how innovative or entrepreneurial. During the last 30 years of the 19th century and the first 30 years of the 20th, class conflict furnished the newspaper mills with their best-selling headlines—railroad company thugs quelling labor unrest in the industrial East, the Ku Klux Klan lynching Negroes in the rural South, the U.S. army exterminating Sioux Indians on the Western plains.
Around the turn of the 20th century the forces of democracy pushed forward an era of progressive reform sponsored by both the Republican president, Theodore Roosevelt, and the Democratic president, Woodrow Wilson. During the middle years of the 20th century America at times showed some semblance of the republic envisioned by its 18th-century founders—Franklin D. Roosevelt’s New Deal, a citizen army fighting World War II, the Great Depression replaced with a fully employed economy in which all present shared in the profits.
The civil rights and anti-Vietnam war protests in the 1960s were expressions of democratic objection and dissent intended to reform the country’s political thought and practice, not to overthrow its government. Nobody was threatening to reset the game clock in the Rose Bowl, tear down Grand Central Terminal or remove the Lincoln Memorial. The men, women and children confronting racist tyranny in the South—sitting at a lunch counter in Alabama, riding a bus into Mississippi, going to school in Arkansas—risked their lives and sacred honor on behalf of a principle, not a lifestyle; for a government of laws, not men. The unarmed rebellion led to the enactment in the mid-1960s of the Economic Opportunity Act, the Voting Rights Act, the Medicare and Medicaid programs, eventually to the shutting down of the Vietnam War.
Faith in democracy survived the assassination of President John F. Kennedy in 1963; it didn’t survive the assassinations of Robert Kennedy and Martin Luther King in 1968. The 1960s and 1970s gave rise to a sequence of ferocious and destabilizing change—social, cultural, technological, sexual, economic and demographic—that tore up the roots of family, community and church from which a democratic society draws meaning and strength. The news media promoted the multiple wounds to the body politic (the murders of King and Kennedy, big-city race riots, the killing of college students at Kent State and Jackson State, crime in the streets of Los Angeles, Chicago and Newark) as revolution along the line of Robespierre’s reign of terror. The fantasy of armed revolt sold papers, boosted ratings, stimulated the demand for heavy surveillance and repressive law enforcement that over the last 50 years has blossomed into the richest and most innovative of the nation’s growth industries.
By the end of the 1970s democracy had come to be seen as a means of government gone soft in the head and weak in the knees, no match for unscrupulous Russians, incapable of securing domestic law and order, unable to disperse the barbarians (foreign and native born) at the gates of the gated real estate in Beverly Hills, Westchester County and Palm Beach. The various liberation movements still in progress no longer sought to right the wrongs of government. The political was personal, the personal political. Seized by the appetite for more—more entitlements, privileges and portrait busts—plaintiffs for both the haves and the have-nots agitated for a lifestyle, not a principle. The only constitutional value still on the table was the one constituting freedom as property, property as freedom. A fearful bourgeois society adrift in a sea of troubles was clinging to its love of money as if to the last lifeboat rowing away from the Titanic when Ronald Reagan in 1980 stepped onto the stage of the self-pitying national melodrama with the promise of an America to become great again in a future made of gold.
by Lewis Lapham, LitHub | Read more:
Image: Detail from Jasper Johns, 'White Flag'
America’s Epidemic of Empty Churches
Three blocks from my Brooklyn apartment, a large brick structure stretches toward heaven. Tourists recognize it as a church—the building’s bell tower and stained-glass windows give it away—but worshippers haven’t gathered here in years.
The 19th-century building was once known as St. Vincent De Paul Church and housed a vibrant congregation for more than a century. But attendance dwindled and coffers ran dry by the early 2000s. Rain leaked through holes left by missing shingles, a tree sprouted in the bell tower, and the Brooklyn diocese decided to sell the building to developers. Today, the Spire Lofts boasts 40 luxury apartments with one-bedroom units renting for as much as $4,812 per month. It takes serious cash to make God’s house your own, apparently.
Many of our nation’s churches can no longer afford to maintain their structures—between 6,000 and 10,000 churches die each year in America—and that number will likely grow. Though more than 70 percent of our citizens still claim to be Christian, congregational participation is less central to many Americans’ faith than it once was. Most denominations are declining as a share of the overall population, and donations to congregations have been falling for decades. Meanwhile, religiously unaffiliated Americans, nicknamed the “nones,” are growing as a share of the U.S. population.
Any minister can tell you that the two best predictors of a congregation’s survival are “budgets and butts,” and American churches are struggling by both metrics. As donations and attendance decrease, the cost of maintaining large physical structures that are only in use a few hours a week by a handful of worshippers becomes prohibitive. None of these trends show signs of slowing, so the United States’s struggling congregations face a choice: start packing or find a creative way to stay afloat.
Closure and adaptive reuse often seem like the simplest and most responsible path. Many houses of worship sit on prime real estate, often in the center of towns or cities where inventory is low. Selling the property to the highest bidder is a quick and effective way to cut losses and settle debts. But repurposing a sacred space for secular use has a number of drawbacks. There are zoning issues, price negotiations, and sometimes fierce pushback from the surrounding community and the parish’s former members.
by Jonathan Merritt, The Atlantic | Read more:
Image: Carlos Barria/Reuters
[ed. I wonder at what point they lose their tax exempt status? The article doesn't say.]
Saturday, November 24, 2018
In Praise of Mediocrity
I’m a little surprised by how many people tell me they have no hobbies. It may seem a small thing, but — at the risk of sounding grandiose — I see it as a sign of a civilization in decline. The idea of leisure, after all, is a hard-won achievement; it presupposes that we have overcome the exigencies of brute survival. Yet here in the United States, the wealthiest country in history, we seem to have forgotten the importance of doing things solely because we enjoy them.
Yes, I know: We are all so very busy. Between work and family and social obligations, where are we supposed to find the time?
But there’s a deeper reason, I’ve come to think, that so many people don’t have hobbies: We’re afraid of being bad at them. Or rather, we are intimidated by the expectation — itself a hallmark of our intensely public, performative age — that we must actually be skilled at what we do in our free time. Our “hobbies,” if that’s even the word for them anymore, have become too serious, too demanding, too much an occasion to become anxious about whether you are really the person you claim to be.
If you’re a jogger, it is no longer enough to cruise around the block; you’re training for the next marathon. If you’re a painter, you are no longer passing a pleasant afternoon, just you, your watercolors and your water lilies; you are trying to land a gallery show or at least garner a respectable social media following. When your identity is linked to your hobby — you’re a yogi, a surfer, a rock climber — you’d better be good at it, or else who are you?
Lost here is the gentle pursuit of a modest competence, the doing of something just because you enjoy it, not because you are good at it. Hobbies, let me remind you, are supposed to be something different from work. But alien values like “the pursuit of excellence” have crept into and corrupted what was once the realm of leisure, leaving little room for the true amateur. The population of our country now seems divided between the semipro hobbyists (some as devoted as Olympic athletes) and those who retreat into the passive, screeny leisure that is the signature of our technological moment.
I don’t deny that you can derive a lot of meaning from pursuing an activity at the highest level. I would never begrudge someone a lifetime devotion to a passion or an inborn talent. There are depths of experience that come with mastery. But there is also a real and pure joy, a sweet, childlike delight, that comes from just learning and trying to get better. Looking back, you will find that the best years of, say, scuba-diving or doing carpentry were those you spent on the learning curve, when there was exaltation in the mere act of doing.
by Tim Wu, NY Times | Read more:
Image: markk
‘The Academy Is Largely Itself Responsible for Its Own Peril’
The book was supposed to end with the inauguration of Barack Obama. That was Jill Lepore’s plan when she began work in 2015 on her new history of America, These Truths (W.W. Norton). She had arrived at the Civil War when Donald J. Trump was elected. Not to alter the ending, she has said, would have felt like "a dereliction of duty as a historian."
These Truths clocks in at 789 pages (nearly 1,000 if you include the notes and index). It begins with Christopher Columbus and concludes with you-know-who. But the book isn’t a compendium; it’s an argument. The American Revolution, Lepore shows, was also an epistemological revolution. The country was built on truths that are self-evident and empirical, not sacred and God-given. "Let facts be submitted to a candid world," Thomas Jefferson wrote in the Declaration of Independence. Now, it seems, our faith in facts has been shaken. These Truths traces how we got here.
Lepore occupies a rarefied perch in American letters. She is a professor at Harvard University and a staff writer at The New Yorker. She has written books about King Philip’s War, Wonder Woman, and Jane Franklin, sister of Benjamin Franklin. She even co-wrote an entire novel in mock 18th-century prose. The Princeton historian Sean Wilentz has said of Lepore: "More successfully than any other American historian of her generation, she has gained a wide general readership without compromising her academic standing."
Lepore spoke with The Chronicle Review about how the American founding inaugurated a new way of thinking, the history of identity politics, and whether she's tired of people asking about her productivity. (...)
Q. America’s founding marked not only a new era of politics, but also a new way of thinking.
A. I call the book These Truths to invoke those truths in the Declaration of Independence that Jefferson describes, with the revision provided by Franklin, as "self-evident" — political equality, natural rights, and the sovereignty of the people. But I’m also talking about an unstated fourth truth, which is inquiry itself. Anyone who has spent time with the founding documents and the political and intellectual history in which they were written understands that the United States was founded quite explicitly as a political experiment, an experiment in the science of politics. It was always going to be subject to scrutiny. That scrutiny is done not from above by some commission, but by the citizenry itself.
Q. For democracy to work, of course, the people must be well informed. Yet we live in an age of epistemological mayhem. How did the relationship between truth and fact come unwound?
A. I spend a lot of time in the book getting it wound, to be fair. There’s an incredibly rich scholarship on the history of evidence, which traces its rise in the Middle Ages in the world of law, its migration into historical writing, and then finally into the realm that we’re most familiar with, journalism. That’s a centuries-long migration of an idea that begins in a very particular time and place, basically the rise of trial by jury starting in 1215. We have a much better vantage on the tenuousness of our own grasp of facts when we understand where facts come from.
The larger epistemological shift is how the elemental unit of knowledge has changed. Facts have been devalued for a long time. The rise of the fact was centuries ago. Facts were replaced by numbers in the 18th and 19th centuries as the higher-status unit of knowledge. That’s the moment at which the United States is founded as a demographic democracy. Now what’s considered to be most prestigious is data. The bigger the data, the better.
That transformation, from facts to numbers to data, traces something else: the shifting prestige placed on different ways of knowing. Facts come from the realm of the humanities, numbers represent the social sciences, and data the natural sciences. When people talk about the decline of the humanities, they are actually talking about the rise and fall of the fact, as well as other factors. When people try to re-establish the prestige of the humanities with the digital humanities and large data sets, that is no longer the humanities. What humanists do comes from a different epistemological scale of a unit of knowledge.
Q. How is the academy implicated in or imperiled by this moment of epistemological crisis?
A. The academy is largely itself responsible for its own peril. The retreat of humanists from public life has had enormous consequences for the prestige of humanistic ways of knowing and understanding the world.
Universities have also been complicit in letting sources of federal government funding set the intellectual agenda. The size and growth of majors follows the size of budgets, and unsurprisingly so. After World War II, the demands of the national security state greatly influenced the exciting fields of study. Federal-government funding is still crucial, but now there’s a lot of corporate money. Whole realms of knowing are being brought to the university through commerce.
I don’t expect the university to be a pure place, but there are questions that need to be asked. If we have a public culture that suffers for lack of ability to comprehend other human beings, we shouldn’t be surprised. The resources of institutions of higher learning have gone to teaching students how to engineer problems rather than speak to people. (...)
Q. The last chapter of These Truths is titled "America, Disrupted," and it traces the rise of ideas from the tech world, like innovation. You point out that innovation was traditionally seen as something to be wary of.
A. It’s true that the last chapter is about disruptive innovation, but it’s also true that the book starts with the history of writing as a technology. Reading "America, Disrupted" in isolation might seem like I have some beef with Silicon Valley — which may or may not be the case — but reading that chapter after the 15 that come before makes it clear that what I have is a deep and abiding interest in technology and communication.
Innovation as an idea in America is historically a negative thing. Innovation in politics is what is to be condemned: To experiment recklessly with a political arrangement is fatal to our domestic tranquillity. So there’s a lot of anti-innovation language around the founding, especially because Republicanism — Jeffersonianism — is considered excessively innovative. Innovation doesn’t assume its modern sense until the 1930s, and then only in a specialized literature.
Disruption has a totally different history. It’s a way to avoid the word "progress," which, even when it’s secularized, still implies some kind of moral progress. Disruption emerges in the 1990s as progress without any obligation to notions of goodness. And so "disruptive innovation," which became the buzzword of change in every realm in the first years of the 21st century, including higher education, is basically destroying things because we can and because there can be money made doing so. Before the 1990s, something that was disruptive was like the kid in the class throwing chalk. And that’s what disruptive innovation turned out to really mean. A little less disruptive innovation is called for.
by Evan Goldstein, Chronicle of Higher Education | Read more:
Image: Kayana Szymczak, The New York Times, Redux