Wednesday, November 28, 2018

The Invisible Hit Parade: How Unofficial Recordings Have Flowered in the 21st Century

Most times, Eric Pier-Hocking will get to the venue before you do. It's not because he wants to be in the front row, grab some limited edition merch, or even meet the performing musicians. But all of those sometimes occur in the line of duty.

This evening at Trans-Pecos near the Brooklyn-Queens border, he is in the front, though only because the room is small and the exact center of the stage in front of the performer is the most convenient place to set up his microphones. Plus, there's a booth alongside the nearby wall where he can sit. And he will be acquiring something rare: a high-fidelity recording of an exquisite performance by acoustic guitarist Daniel Bachman. And, in fact, he does meet the artist, as well. "Mostly just to say hi," Pier-Hocking shrugs. The show isn't empty, but it's far from a sell-out. In time, though, more people will be able to hear Bachman's performance. Pier-Hocking is there to preserve the music and share it. In the process, he has become a valuable part of the 21st-century musical ecosystem.

For most of the artists he records, he'll make sure to secure permission in advance, but like an increasing number of touring musicians, the Virginia-based Bachman is fine with audio-obsessed fans like Pier-Hocking. In this case, Pier-Hocking doesn't even ask, just sets up his recording gear.

"I'm all about it," Bachman says, understanding that high-quality recordings of his performances are good calling cards to have out there, his music spreading further when Pier-Hocking posts it online. "I actually record other musicians myself," the guitarist says. "I'll just pull up the voice memo app on my iPhone and record an entire set. I do it on the road a lot so that I can listen back to friends or other people I get to perform with."

Wearing a black denim jacket covered in pins, Pier-Hocking is not a professional audio engineer. The 37-year-old works by day as a production manager at a publishing company. With short hair and a neatly cropped beard, it's easy to peg him for the enthusiastic indie music fan he is. But to call him an amateur wouldn't be accurate either. What he does goes far beyond recording on an iPhone.

Tonight, Pier-Hocking is running a pair of MBHO KA100DK omnidirectional microphone capsules (via a 603A capsule attachment) into a "home-brewed" PFA phantom power adapter by way of a set of newfangled "active" cables, wired up by a colleague on a web forum for live-performance recording aficionados. (Most still refer to them as tapers.) Along with a feed from the venue's soundboard, the microphone signal runs into a Sound Devices MixPre-6, a digital multi-track recorder.

But once his gear is set up and he's sure his levels are OK, Pier-Hocking mostly just sits and listens attentively to Bachman's performance. Occasionally, he glances at the MixPre-6, just to make sure it's still running.

Capturing the music from the two different sources—his own mics and the soundboard feed—as a pair of multitrack WAV files, Pier-Hocking will later align the two recordings in Adobe Audition CC. It gets pretty geeky. "Usually the mics are milliseconds behind the board feed," he says. "I zoom in on the WAV and look for a sharp point I can isolate, like a drum hit, and then shift it all over." He corrects the EQ with iZotope Ozone 5, tracks and tags them with Audacity, and outputs them as high-def, lossless files known as FLACs. Once Bachman has gotten back to him with approval and corrections to the track listing, Pier-Hocking will post the show as FLACs and mp3s to NYCTaper.com, a website established by Dan Lynch in 2007.
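
That zoom-and-shift step is also easy to automate. Below is a minimal sketch of the same alignment done with cross-correlation instead of hunting for a drum hit by eye; it is only an illustration, assuming two mono WAV files at the same sample rate, and the filenames are made up.

# Align an audience-mic recording to a board feed by cross-correlation,
# rather than eyeballing a transient in an editor. Assumes mono WAV files
# at the same sample rate; filenames are hypothetical.
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate

rate_b, board = wavfile.read("board_feed.wav")
rate_m, mics = wavfile.read("mics.wav")
assert rate_b == rate_m, "resample one file first if the rates differ"

board = board.astype(np.float64)
mics = mics.astype(np.float64)

# Correlate only the first minute (enough to catch a sharp hit) to keep it cheap.
n = min(len(board), len(mics), rate_b * 60)
corr = correlate(mics[:n], board[:n], mode="full")
lag = int(corr.argmax()) - (n - 1)  # positive lag: mics trail the board feed

print(f"mics trail board by {lag} samples ({1000 * lag / rate_b:.1f} ms)")

# Trim or pad the mic track so the two sources line up before mixing.
aligned_mics = mics[lag:] if lag >= 0 else np.concatenate([np.zeros(-lag), mics])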

Sometimes, with an artist's permission, Pier-Hocking will also establish a page on the Internet Archive's Live Music Archive, where visitors can listen to shows right in their web browsers, and where files are backed up regularly to locations in Egypt, the Netherlands, and Canada. "I love Archive," he says. "You upload it once, and it sets it up for streaming and all the formats. It saves me a lot of work. And I know when I die, my recordings will still be there." He pauses for half a beat. "Which is comforting, I guess."

Like every other part of the music world, taping has changed utterly in the digital age. Once dismissed as mere bootlegging, the practice has seen its surrounding attitudes, economies, and technologies evolve. It's been a long haul since Dean Benedetti recorded Charlie Parker's solos on a wire recorder. In the '60s and '70s, aspiring preservationists snuck reel-to-reel recorders into venues under battlefield conditions, scaling down to professional-quality handheld cassette decks and eventually to DATs.

The myth and popular image of "the taper" persists, even though there haven't really been tapes since the early 2000s, when most tapers switched from DAT to laptops and finally to portable drives. But old terms are hard to dismiss. Many now prefer "recording" or even "capturing" to "taping," though recent headlines are a good reminder of just how durable "tape" really is. Most just use the term unconsciously and don't have a preference one way or the other—as long as you don't ask them to leave.

Unlike most every other part of the music world, taping has not only thrived in the 21st century but come into its own, from advanced cell phone gadgetry (like DPA's iPhone-ready d:vice MMA-A digital audio interface) to compact handheld recorders (like Zoom's varied line of products), from high-speed distribution to metadata organization. Despite constant radical change, taping has never been disrupted. Rather, it has positively flowered.

by Jesse Jarnow, Backchannel |  Read more:
Image: Vincent Tullo

Tuesday, November 27, 2018

Who Goes Nazi?

It is an interesting and somewhat macabre parlor game to play at a large gathering of one’s acquaintances: to speculate who in a showdown would go Nazi. By now, I think I know. I have gone through the experience many times—in Germany, in Austria, and in France. I have come to know the types: the born Nazis, the Nazis whom democracy itself has created, the certain-to-be fellow-travelers. And I also know those who never, under any conceivable circumstances, would become Nazis.

It is preposterous to think that they are divided by any racial characteristics. Germans may be more susceptible to Nazism than most people, but I doubt it. Jews are barred out, but it is an arbitrary ruling. I know lots of Jews who are born Nazis and many others who would heil Hitler tomorrow morning if given a chance. There are Jews who have repudiated their own ancestors in order to become “Honorary Aryans and Nazis”; there are full-blooded Jews who have enthusiastically entered Hitler’s secret service. Nazism has nothing to do with race and nationality. It appeals to a certain type of mind.

It is also, to an immense extent, the disease of a generation—the generation which was either young or unborn at the end of the last war. This is as true of Englishmen, Frenchmen, and Americans as of Germans. It is the disease of the so-called “lost generation.”

Sometimes I think there are direct biological factors at work—a type of education, feeding, and physical training which has produced a new kind of human being with an imbalance in his nature. He has been fed vitamins and filled with energies that are beyond the capacity of his intellect to discipline. He has been treated to forms of education which have released him from inhibitions. His body is vigorous. His mind is childish. His soul has been almost completely neglected.

At any rate, let us look round the room.

The gentleman standing beside the fireplace with an almost untouched glass of whiskey beside him on the mantelpiece is Mr. A, a descendant of one of the great American families. There has never been an American Blue Book without several persons of his surname in it. He is poor and earns his living as an editor. He has had a classical education, has a sound and cultivated taste in literature, painting, and music; has not a touch of snobbery in him; is full of humor, courtesy, and wit. He was a lieutenant in the World War, is a Republican in politics, but voted twice for Roosevelt, last time for Willkie. He is modest, not particularly brilliant, a staunch friend, and a man who greatly enjoys the company of pretty and witty women. His wife, whom he adored, is dead, and he will never remarry.

He has never attracted any attention because of outstanding bravery. But I will put my hand in the fire that nothing on earth could ever make him a Nazi. He would greatly dislike fighting them, but they could never convert him. . . . Why not?

Beside him stands Mr. B, a man of his own class, graduate of the same preparatory school and university, rich, a sportsman, owner of a famous racing stable, vice-president of a bank, married to a well-known society belle. He is a good fellow and extremely popular. But if America were going Nazi he would certainly join up, and early. Why? . . . Why the one and not the other?

Mr. A has a life that is established according to a certain form of personal behavior. Although he has no money, his unostentatious distinction and education have always assured him a position. He has never been engaged in sharp competition. He is a free man. I doubt whether ever in his life he has done anything he did not want to do or anything that was against his code. Nazism wouldn’t fit in with his standards and he has never become accustomed to making concessions.

Mr. B has risen beyond his real abilities by virtue of health, good looks, and being a good mixer. He married for money and he has done lots of other things for money. His code is not his own; it is that of his class—no worse, no better. He fits easily into whatever pattern is successful. That is his sole measure of value—success. Nazism as a minority movement would not attract him. As a movement likely to attain power, it would.

The saturnine man over there talking with a lovely French emigree is already a Nazi. Mr. C is a brilliant and embittered intellectual. He was a poor white-trash Southern boy, a scholarship student at two universities where he took all the scholastic honors but was never invited to join a fraternity. His brilliant gifts won for him successively government positions, partnership in a prominent law firm, and eventually a highly paid job as a Wall Street adviser. He has always moved among important people and always been socially on the periphery. His colleagues have admired his brains and exploited them, but they have seldom invited him—or his wife—to dinner.

He is a snob, loathing his own snobbery. He despises the men about him—he despises, for instance, Mr. B—because he knows that what he has had to achieve by relentless work men like B have won by knowing the right people. But his contempt is inextricably mingled with envy. Even more than he hates the class into which he has insecurely risen, does he hate the people from whom he came. He hates his mother and his father for being his parents. He loathes everything that reminds him of his origins and his humiliations. He is bitterly anti-Semitic because the social insecurity of the Jews reminds him of his own psychological insecurity.

Pity he has utterly erased from his nature, and joy he has never known. He has an ambition, bitter and burning. It is to rise to such an eminence that no one can ever again humiliate him. Not to rule but to be the secret ruler, pulling the strings of puppets created by his brains. Already some of them are talking his language—though they have never met him.

There he sits: he talks awkwardly rather than glibly; he is courteous. He commands a distant and cold respect. But he is a very dangerous man. Were he primitive and brutal he would be a criminal—a murderer. But he is subtle and cruel. He would rise high in a Nazi regime. It would need men just like him—intellectual and ruthless. But Mr. C is not a born Nazi. He is the product of a democracy hypocritically preaching social equality and practicing a carelessly brutal snobbery. He is a sensitive, gifted man who has been humiliated into nihilism. He would laugh to see heads roll.

I think young D over there is the only born Nazi in the room. Young D is the spoiled only son of a doting mother. He has never been crossed in his life. He spends his time at the game of seeing what he can get away with. He is constantly arrested for speeding and his mother pays the fines. He has been ruthless toward two wives and his mother pays the alimony. His life is spent in sensation-seeking and theatricality. He is utterly inconsiderate of everybody. He is very good-looking, in a vacuous, cavalier way, and inordinately vain. He would certainly fancy himself in a uniform that gave him a chance to swagger and lord it over others.

Mrs. E would go Nazi as sure as you are born. That statement surprises you? Mrs. E seems so sweet, so clinging, so cowed. She is. She is a masochist. She is married to a man who never ceases to humiliate her, to lord it over her, to treat her with less consideration than he does his dogs. He is a prominent scientist, and Mrs. E, who married him very young, has persuaded herself that he is a genius, and that there is something of superior womanliness in her utter lack of pride, in her doglike devotion. She speaks disapprovingly of other “masculine” or insufficiently devoted wives. Her husband, however, is bored to death with her. He neglects her completely and she is looking for someone else before whom to pour her ecstatic self-abasement. She will titillate with pleased excitement to the first popular hero who proclaims the basic subordination of women.

On the other hand, Mrs. F would never go Nazi. She is the most popular woman in the room, handsome, gay, witty, and full of the warmest emotion. She was a popular actress ten years ago; married very happily; promptly had four children in a row; has a charming house, is not rich but has no money cares, has never cut herself off from her own happy-go-lucky profession, and is full of sound health and sound common sense. All men try to make love to her; she laughs at them all, and her husband is amused. She has stood on her own feet since she was a child, she has enormously helped her husband’s career (he is a lawyer), she would ornament any drawing-room in any capital, and she is as American as ice cream and cake.

How about the butler who is passing the drinks? I look at James with amused eyes. James is safe. James has been butler to the ‘ighest aristocracy, considers all Nazis parvenus and communists, and has a very good sense for “people of quality.” He serves the quiet editor with that friendly air of equality which good servants always show toward those they consider good enough to serve, and he serves the horsy gent stiffly and coldly.

Bill, the grandson of the chauffeur, is helping serve to-night. He is a product of a Bronx public school and high school, and works at night like this to help himself through City College, where he is studying engineering. He is a “proletarian,” though you’d never guess it if you saw him without that white coat. He plays a crack game of tennis—has been a tennis tutor in summer resorts—swims superbly, gets straight A’s in his classes, and thinks America is okay and don’t let anybody say it isn’t. He had a brief period of Youth Congress communism, but it was like the measles. He was not taken in the draft because his eyes are not good enough, but he wants to design airplanes, “like Sikorsky.” He thinks Lindbergh is “just another pilot with a build-up and a rich wife” and that he is “always talking down America, like how we couldn’t lick Hitler if we wanted to.” At this point Bill snorts.

Mr. G is a very intellectual young man who was an infant prodigy. He has been concerned with general ideas since the age of ten and has one of those minds that can scintillatingly rationalize everything. I have known him for ten years and in that time have heard him enthusiastically explain Marx, social credit, technocracy, Keynesian economics, Chestertonian distributism, and everything else one can imagine. Mr. G will never be a Nazi, because he will never be anything. His brain operates quite apart from the rest of his apparatus. He will certainly be able, however, fully to explain and apologize for Nazism if it ever comes along. But Mr. G is always a “deviationist.” When he played with communism he was a Trotskyist; when he talked of Keynes it was to suggest improvement; Chesterton’s economic ideas were all right but he was too bound to Catholic philosophy. So we may be sure that Mr. G would be a Nazi with purse-lipped qualifications. He would certainly be purged.

H is an historian and biographer. He is American of Dutch ancestry born and reared in the Middle West. He has been in love with America all his life. He can recite whole chapters of Thoreau and volumes of American poetry, from Emerson to Steve Benet. He knows Jefferson’s letters, Hamilton’s papers, Lincoln’s speeches. He is a collector of early American furniture, lives in New England, runs a farm for a hobby and doesn’t lose much money on it, and loathes parties like this one. He has a ribald and manly sense of humor, is unconventional and lost a college professorship because of a love affair. Afterward he married the lady and has lived happily ever afterward as the wages of sin.

H has never doubted his own authentic Americanism for one instant. This is his country, and he knows it from Acadia to Zenith. His ancestors fought in the Revolutionary War and in all the wars since. He is certainly an intellectual, but an intellectual smelling slightly of cow barns and damp tweeds. He is the most good-natured and genial man alive, but if anyone ever tries to make this country over into an imitation of Hitler’s, Mussolini’s, or Petain’s systems H will grab a gun and fight. Though H’s liberalism will not permit him to say it, it is his secret conviction that nobody whose ancestors have not been in this country since before the Civil War really understands America or would really fight for it against Nazism or any other foreign ism in a showdown.

But H is wrong. There is one other person in the room who would fight alongside H and he is not even an American citizen. He is a young German emigre, whom I brought along to the party. The people in the room look at him rather askance because he is so Germanic, so very blond-haired, so very blue-eyed, so tanned that somehow you expect him to be wearing shorts. He looks like the model of a Nazi. His English is flawed—he learned it only five years ago. He comes from an old East Prussian family; he was a member of the post-war Youth Movement and afterward of the Republican “Reichsbanner.” All his German friends went Nazi—without exception. He hiked to Switzerland penniless, there pursued his studies in New Testament Greek, sat under the great Protestant theologian, Karl Barth, came to America through the assistance of an American friend whom he had met in a university, got a job teaching the classics in a fashionable private school; quit, and is working now in an airplane factory—working on the night shift to make planes to send to Britain to defeat Germany. He has devoured volumes of American history, knows Whitman by heart, wonders why so few Americans have ever really read the Federalist papers, believes in the United States of Europe, the Union of the English-speaking world, and the coming democratic revolution all over the earth. He believes that America is the country of Creative Evolution once it shakes off its middle-class complacency, its bureaucratized industry, its tentacle-like and spreading government, and sets itself innerly free.

The people in the room think he is not an American, but he is more American than almost any of them. He has discovered America and his spirit is the spirit of the pioneers. He is furious with America because it does not realize its strength and beauty and power. He talks about the workmen in the factory where he is employed. . . . He took the job “in order to understand the real America.” He thinks the men are wonderful. “Why don’t you American intellectuals ever get to them; talk to them?”

I grin bitterly to myself, thinking that if we ever got into war with the Nazis he would probably be interned, while Mr. B and Mr. G and Mrs. E would be spreading defeatism at all such parties as this one. “Of course I don’t like Hitler but . . .”

by Dorothy Thompson, Harper's (Aug. 1941) |  Read more:
Image: via
[ed. Repost. See also: Maybe They’re Just Bad People. The Mississippi senate runoff results come in tonight and the full extent of Robert Mueller's investigation should be made public soon. Make of this what you will.]

Kate Wolf

Self-Care Won't Save Us

It is somewhere between one and two in the morning and, as per usual, I am flicking through internet tabs. Without really taking anything in, I am dividing my attention between a recipe for broccoli and peanut butter soup (one which has been in my favorites tab for maybe three years, still never attempted), some news story about a terrible event in which many people have needlessly died, and the usual social media sites. Scrolling down my Facebook feed, in between the enviable holiday snaps and the links to more sad news stories—people don’t talk very much on Facebook any more, I’ve noticed; it’s mostly a conduit for the exchanging of links—a picture catches my eye. It’s a cartoon of a friendly-looking blob man, large-eyed and edgeless, wrapped up in blankets. The blob man is saying “It’s okay if all you want to do today is just stay in bed and watch Netflix.” I draw up my covers, nodding to no one in particular, and flick to a tab with my favorite old TV show.

The above story doesn’t refer to any particular night that I can remember. But the general theme is one that I’ve played out again and again. I’m not sure I’m ever going to make that soup.

If you’re a millennial with regular access to the internet, you’ve probably seen similar images to the cartoon I’ve described above. They’re usually painted in comforting primary colors or pastels, featuring simple illustrations, accompanied by text in a non-threatening font. They invite you to practice ‘self-care’, a term that has been prominent in healthcare theory for many decades but has recently increased in visibility online. The term generally refers to a variety of techniques and habits that are supposed to help with one’s physical and mental well-being, reduce stress, and lead to a more balanced lifestyle. “It’s like if you were walking outside in a thunderstorm, umbrella-less, and you walked into a cafĂ© filled with plush armchairs, wicker baskets full of flowers, and needlepoints on the walls that say things like ‘Be kind to yourself’ and ‘You are enough,’” says The Atlantic.

Though the term has a medical tinge to it, the language used in the world of self-care is more aligned with the world of self-help, and much of the advice commonly given in the guise of self-care will be familiar to anyone who has browsed the pop-psychology shelves of a bookstore or listened to the counsel of a kindly coworker—take breaks from work and step outside for fresh air, take walks in the countryside, call a friend for a chat, have a lavender bath, get a good night’s sleep. Light a candle. Stop being so hard on yourself. Take time off if you’re not feeling so well and snuggle under the comforter with a DVD set and a herbal tea. Few people would argue with these tips in isolation (with a few exceptions—I think herbal tea is foul). We should all be making sure we are well-fed, rested, and filling our lives with things that we enjoy. In a time where people—especially millennials, at whom this particular brand of self-care is aimed—are increasingly talking about their struggles with depression, anxiety and insecurities, it’s no wonder that “practicing self-care” is an appealing prospect, even if it does sometimes seem like a fancy way to say “do things you like.”

What is concerning is the way that this advice appears to be perfectly designed to fit in with a society that appears to be the cause of so much of the depression, anxiety, and insecurities. By finding the solution to young people’s mental ill-health (be it a diagnosed mental health problem or simply the day-to-day stresses of life) in do-it-yourself fixes, and putting the burden on the target audience to find a way to cope, the framework of self-care avoids having to think about issues on a societal level. In the world of self-care, mental health is not political, it’s individual. Self-care is mental health care for the neoliberal era.

As I write, the U.K. Prime Minister, Theresa May, is tweeting about World Mental Health Day and suicide prevention. She is not the only one; scrolling through the trending hashtags (there are several) one can find lots of comforting words about taking care of yourself, about opening up, confiding in a friend, keeping active, taking a breath. One such tweet is a picture of an arts-and-craftsy cut-out of a bright yellow circle behind dull green paper, designed to look like a cheerful sun. Printed on the sun are the words “everything will be so good so soon just hang in there & don’t worry about it too much.” All of us have probably seen some variation of these words at many points in our lives, and probably found at least a little bit of momentary relief in them.

But looking through other tweets about World Mental Health Day reveals a different side of the issue. People talk about the times they did try to seek help, and were left to languish on waiting lists for therapy. They talk about the cuts to their local services (if they’re from somewhere with universal healthcare) or the insurance policies that wouldn’t cover them (if they’re in the United States). They talk about the illnesses left cold and untouched by campaigns that claim to reduce stigma—personality disorders, bipolar disorder, schizophrenia. They talk about homelessness and insecure housing and jobs that leave them exhausted. They talk about loneliness. And, in the case of Theresa May, they talk about how the suicide prevention minister she promises to hire will have to deal with the many people who consider suicide in response to her government’s policies.

These are deep material and societal issues that all of us are touched by, to at least some degree. We know it when we see people begging in the streets, when we read yet another report that tells us our planet is dying, when we try to figure out why we feel sad and afraid and put it down to an ‘off day’, trying not to think about just how many ‘off days’ we seem to have. We turn to our TVs, to our meditation apps, and hope we can paper over the cracks. We are in darkness, and when we cry out for light, we are handed a scented candle.

A common sentiment expressed in the world of self-care is that anyone can suffer from mental ill-health. This is true, but it’s not the entire story. In fact, mental health problems are strongly correlated with poverty, vulnerability, and physical health conditions (with the causation going both ways). Furthermore, there is a big difference between those of us who are fortunate enough to be able to take time off work for doctor’s appointments and mental health days, and those who can’t; those of us who have children or other dependents to take care of, and those who don’t; those of us who have the financial independence to take a break from our obligations when we need to, and those who don’t. Not all people have the same access to help, or even access to their own free time—employers increasingly expect workers to be available whenever they are needed, both in white-collar jobs and precarious shift work. Add in the (heavily gendered) responsibilities of being a parent, studying, a night-time Uber gig to cover the bills, or a long commute from the only affordable area in the city, and the stress of life will pile on even as it soaks up the time you’re supposed to set aside to relieve that stress. Funding cuts are in fashion across a plethora of Western countries, both to healthcare and to other services that indirectly affect our health, especially the health of people who need additional support to lead the lives they wish to live, or even just to survive. The rhetoric around self-care is flattering but flattening, treating its audience as though the solution to their problems is believing in themselves and investing in themselves. This picture glosses over the question of what happens when society does not believe or invest in us.

Even for those of us who are relatively lucky in life, self-care does not solve our problems. “It’s okay if all you did today was breathe,” promises a widely-shared image macro of a gentle talking pair of lungs. Well, I hate to break it to you, talking lungs, but it’s 2018. We’re supposed to be walking powerhouses of productivity, using every minute of our time to its best effect. In an economic environment where careers are precarious and competitive, young people are increasingly pressured to give up their free time to take on extracurriculars and unpaid projects “for their resume,” produce creative content “for exposure,” learn skills such as coding, scout for jobs on LinkedIn, write self-promoting posts about their personal qualities, and perhaps worst of all, attend godawful networking events, some of which don’t even have free canapĂ©s.

by Aisling McCrae, Current Affairs |  Read more:
Image: Lizzy Price

Steve Lawson - Celebrations

The Crisis

Tom Tomorrow
via:
[ed. See also: World must triple efforts or face catastrophic climate change, says UN. And, (of course) 'I don't believe it'. (The Guardian)]

The Insect Apocalypse Is Here

Sune Boye Riis was on a bike ride with his youngest son, enjoying the sun slanting over the fields and woodlands near their home north of Copenhagen, when it suddenly occurred to him that something about the experience was amiss. Specifically, something was missing.

It was summer. He was out in the country, moving fast. But strangely, he wasn’t eating any bugs.

For a moment, Riis was transported to his childhood on the Danish island of Lolland, in the Baltic Sea. Back then, summer bike rides meant closing his mouth to cruise through thick clouds of insects, but inevitably he swallowed some anyway. When his parents took him driving, he remembered, the car’s windshield was frequently so smeared with insect carcasses that you almost couldn’t see through it. But all that seemed distant now. He couldn’t recall the last time he needed to wash bugs from his windshield; he even wondered, vaguely, whether car manufacturers had invented some fancy new coating to keep off insects. But this absence, he now realized with some alarm, seemed to be all around him. Where had all those insects gone? And when? And why hadn’t he noticed?

Riis watched his son, flying through the beautiful day, not eating bugs, and was struck by the melancholy thought that his son’s childhood would lack this particular bug-eating experience of his own. It was, he granted, an odd thing to feel nostalgic about. But he couldn’t shake a feeling of loss. “I guess it’s pretty human to think that everything was better when you were a kid,” he said. “Maybe I didn’t like it when I was on my bike and I ate all the bugs, but looking back on it, I think it’s something everybody should experience.”

I met Riis, a lanky high school science and math teacher, on a hot day in June. He was anxious about not having yet written his address for the school’s graduation ceremony that evening, but first, he had a job to do. From his garage, he retrieved a large insect net, drove to a nearby intersection and stopped to strap the net to the car’s roof. Made of white mesh, the net ran the length of his car and was held up by a tent pole at the front, tapering to a small, removable bag in back. Drivers whizzing past twisted their heads to stare. Riis eyed his parking spot nervously as he adjusted the straps of the contraption. “This is not 100 percent legal,” he said, “but I guess, for the sake of science.”

Riis had not been able to stop thinking about the missing bugs. The more he learned, the more his nostalgia gave way to worry. Insects are the vital pollinators and recyclers of ecosystems and the base of food webs everywhere. Riis was not alone in noticing their decline. In the United States, scientists recently found the population of monarch butterflies fell by 90 percent in the last 20 years, a loss of 900 million individuals; the rusty-patched bumblebee, which once lived in 28 states, dropped by 87 percent over the same period. With other, less-studied insect species, one butterfly researcher told me, “all we can do is wave our arms and say, ‘It’s not here anymore!’ ” Still, the most disquieting thing wasn’t the disappearance of certain species of insects; it was the deeper worry, shared by Riis and many others, that a whole insect world might be quietly going missing, a loss of abundance that could alter the planet in unknowable ways. “We notice the losses,” says David Wagner, an entomologist at the University of Connecticut. “It’s the diminishment that we don’t see.” (...)

When the investigators began planning the study in 2016, they weren’t sure if anyone would sign up. But by the time the nets were ready, a paper by an obscure German entomological society had brought the problem of insect decline into sharp focus. The German study found that, measured simply by weight, the overall abundance of flying insects in German nature reserves had decreased by 75 percent over just 27 years. If you looked at midsummer population peaks, the drop was 82 percent.

Riis learned about the study from a group of his students in one of their class projects. They must have made some kind of mistake in their citation, he thought. But they hadn’t. The study would quickly become, according to the website Altmetric, the sixth-most-discussed scientific paper of 2017. Headlines around the world warned of an “insect Armageddon.”

Within days of announcing the insect-collection project, the Natural History Museum of Denmark was turning away eager volunteers by the dozens. It seemed there were people like Riis everywhere, people who had noticed a change but didn’t know what to make of it. How could something as fundamental as the bugs in the sky just disappear? And what would become of the world without them?
***
Anyone who has returned to a childhood haunt to find that everything somehow got smaller knows that humans are not great at remembering the past accurately. This is especially true when it comes to changes to the natural world. It is impossible to maintain a fixed perspective, as Heraclitus observed 2,500 years ago: It is not the same river, but we are also not the same people.

A 1995 study, by Peter H. Kahn and Batya Friedman, of the way some children in Houston experienced pollution summed up our blindness this way: “With each generation, the amount of environmental degradation increases, but each generation takes that amount as the norm.” In decades of photos of fishermen holding up their catch in the Florida Keys, the marine biologist Loren McClenachan found a perfect illustration of this phenomenon, which is often called “shifting baseline syndrome.” The fish got smaller and smaller, to the point where the prize catches were dwarfed by fish that in years past were piled up and ignored. But the smiles on the fishermen’s faces stayed the same size. The world never feels fallen, because we grow accustomed to the fall.

By one measure, bugs are the wildlife we know best, the nondomesticated animals whose lives intersect most intimately with our own: spiders in the shower, ants at the picnic, ticks buried in the skin. We sometimes feel that we know them rather too well. In another sense, though, they are one of our planet’s greatest mysteries, a reminder of how little we know about what’s happening in the world around us. (...)

With so much abundance, it very likely never occurred to most entomologists of the past that their multitudinous subjects might dwindle away. As they poured themselves into studies of the life cycles and taxonomies of the species that fascinated them, few thought to measure or record something as boring as their number. Besides, tracking quantity is slow, tedious and unglamorous work: setting and checking traps, waiting years or decades for your data to be meaningful, grappling with blunt baseline questions instead of more sophisticated ones. And who would pay for it? Most academic funding is short-term, but when what you’re interested in is invisible, generational change, says Dave Goulson, an entomologist at the University of Sussex, “a three-year monitoring program is no good to anybody.” This is especially true of insect populations, which are naturally variable, with wide, trend-obscuring fluctuations from one year to the next. (...)

Entomologists also knew that climate change and the overall degradation of global habitat are bad news for biodiversity in general, and that insects are dealing with the particular challenges posed by herbicides and pesticides, along with the effects of losing meadows, forests and even weedy patches to the relentless expansion of human spaces. There were studies of other, better-understood species that suggested that the insects associated with them might be declining, too. People who studied fish found that the fish had fewer mayflies to eat. Ornithologists kept finding that birds that rely on insects for food were in trouble: eight in 10 partridges gone from French farmlands; 50 and 80 percent drops, respectively, for nightingales and turtledoves. Half of all farmland birds in Europe disappeared in just three decades. At first, many scientists assumed the familiar culprit of habitat destruction was at work, but then they began to wonder if the birds might simply be starving. In Denmark, an ornithologist named Anders Tottrup was the one who came up with the idea of turning cars into insect trackers for the windshield-effect study after he noticed that rollers, little owls, Eurasian hobbies and bee-eaters — all birds that subsist on large insects such as beetles and dragonflies — had abruptly disappeared from the landscape.

The signs were certainly alarming, but they were also just signs, not enough to justify grand pronouncements about the health of insects as a whole or about what might be driving a widespread, cross-species decline. “There are no quantitative data on insects, so this is just a hypothesis,” Hans de Kroon, an ecologist at Radboud University in the Netherlands, explained to me — not the sort of language that sends people to the barricades.

Then came the German study. Scientists are still cautious about what the findings might imply about other regions of the world. But the study brought forth exactly the kind of longitudinal data they had been seeking, and it wasn’t specific to just one type of insect. The numbers were stark, indicating a vast impoverishment of an entire insect universe, even in protected areas where insects ought to be under less stress. The speed and scale of the drop were shocking even to entomologists who were already anxious about bees or fireflies or the cleanliness of car windshields.

by Brooke Jarvis, NY Times |  Read more:
Image: Photo illustrations by Matt Dorfman. Source photographs: Bridgeman Images

The Case for Dropping Out of College

During the summer, my father asked me whether the money he’d spent to finance my first few years at Fordham University in New York City, one of the more expensive private colleges in the United States, had been well spent. I said yes, which was a lie.

I majored in computer science, a field with good career prospects, and involved myself in several extracurricular clubs. Since I managed to test out of some introductory classes, I might even have been able to graduate a year early—thereby producing a substantial cost savings for my family. But the more I learned about the relationship between formal education and actual learning, the more I wondered why I’d come to Fordham in the first place.
* * *
According to the not-for-profit College Board, the average cost of a school year at a private American university was almost $35,000 in 2017—a figure I will use for purposes of rough cost-benefit analysis. (While public universities are less expensive thanks to government subsidies, the total economic cost per student-year, including the cost borne by taxpayers, typically is similar.) The average student takes about 32 credits worth of classes per year (with a bachelor’s degree typically requiring at least 120 credits in total). So a 3-credit class costs just above $3,000, and a 4-credit class costs a little more than $4,000.
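
As a sanity check, those per-class figures are just the sticker price divided by the credit load. Here is a quick back-of-the-envelope sketch in Python, using only the numbers quoted above:

# Rough per-class cost implied by the figures above.
annual_cost = 35_000        # average private-university year, per the College Board
credits_per_year = 32       # typical annual credit load

per_credit = annual_cost / credits_per_year             # about $1,094
print(f"per credit:      ${per_credit:,.0f}")
print(f"3-credit class:  ${3 * per_credit:,.0f}")       # just above $3,000
print(f"4-credit class:  ${4 * per_credit:,.0f}")       # a little more than $4,000
print(f"two 4-credit classes: ${8 * per_credit:,.0f}")  # the calculus example below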

What do students get for that price? I asked myself this question on a class by class basis, and have found an enormous mismatch between price and product in almost all cases. Take the two 4-credit calculus classes I took during freshman year. The professor had an unusual teaching style that suited me well, basing his lectures directly on lectures posted online by MIT. Half the class, including me, usually skipped the lectures and learned the content by watching the original material on MIT’s website. When the material was straightforward, I sped up the video. When it was more difficult, I hit pause, re-watched it, or opened a new tab on my browser so I could find a source that covered the same material in a more accessible way. From the perspective of my own convenience and education, it was probably one of the best classes I’ve taken in college. But I was left wondering: Why should anyone pay more than $8,000 to watch a series of YouTube videos, available online for free, and occasionally take an exam?

Another class I took, Philosophical Ethics, involved a fair bit of writing. The term paper, which had an assigned minimum length of 5,000 words, had to be written in two steps—first a full draft and then a revised version that incorporated feedback from the professor. Is $3,250 an appropriate cost for feedback on 10,000 words? That’s hard to say. But consider that the going rate on the web for editing this amount of text is just a few hundred dollars. Even assuming that my professor is several times more skilled and knowledgeable, it’s not clear that this is a good value proposition.

“But what about the lectures?” you ask. The truth is that many students, including me, don’t find the lectures valuable. As noted above, equivalent material usually can be found online for free, or at low cost. In some cases, a student will find that his or her own professor has posted video of his or her own lectures. And the best educators, assisted with the magic of video editing, often put out content that puts even the most renowned college lecturers to shame. If you have questions about the material, there’s a good chance you will find the answer on Quora or Reddit.

Last semester, I took a 4-credit class called Computer Organization. There were 23 lectures in total, each 75 minutes long—about 29 hours of lectures. I liked the professor and enjoyed the class. Yet, once the semester was over, I noticed that almost all of the core material was contained in a series of YouTube videos that was just three hours long.

Like many of my fellow students, I spend most of my time in class on my laptop: Twitter, online chess, reading random articles. From the back of the class, I can see that other students are doing likewise. One might think that all of these folks will be in trouble when test time comes around. But watching a few salient online videos generally is all it takes to master the required material. You see the pattern here: The degrees these people get say “Fordham,” but the actual education often comes courtesy of YouTube.

The issue I am discussing is not new, and predates the era of on-demand web video. As far back as 1984, American educational psychologist Benjamin Bloom discovered that an average student who gets individual tutoring will outperform the vast majority of peers taught in a regular classroom setting. Even the best tutors cost no more than $80 an hour—which means you could buy 50 hours of their service for the pro-rated cost of a 4-credit college class that supplies 30 hours of (far less effective) lectures.

All of these calculations are necessarily imprecise, of course. But for the most part, I would argue, the numbers I have presented here underestimate the true economic cost of bricks-and-mortar college education, since I have not imputed the substantial effective subsidies that come through government tax breaks, endowments and support programs run by all levels of government.

So given all this, why are we told that, far from being a rip-off, college is a great deal? “In 2014, the median full-time, full-year worker over age 25 with a bachelor’s degree earned nearly 70% more than a similar worker with just a high school degree,” read one typical online report from 2016. The occasion was Jason Furman, then head of Barack Obama’s Council of Economic Advisers, tweeting out data showing that the ratio of an average college graduate’s earnings to a similarly situated high-school graduate’s earnings had grown from 1.1 in 1975 to more than 1.6 four decades later.

To ask my question another way: What accounts for the disparity between the apparently poor value proposition of college at a micro level with the statistically observed college premium at the macro level? A clear set of answers appears in The Case against Education: Why the Education System Is a Waste of Time and Money, a newly published book by George Mason University economist Bryan Caplan.

One explanation lies in what Caplan calls “ability bias”: From the outset, the average college student is different from the average American who does not go to college. The competitive college admissions process winnows the applicant pool in such a way as to guarantee that those who make it into college are more intelligent, conscientious and conformist than other members of his or her high-school graduating cohort. In other words, when colleges boast about the “70% income premium” they supposedly provide students, they are taking credit for abilities that those students already had before they set foot on campus, and which they likely could retain and commercially exploit even if they never got a college diploma. By Caplan’s estimate, ability bias accounts for about 45% of the vaunted college premium—which would mean that a college degree actually boosts income by about 40 points, not the oft-cited 70.

Of course, 40% is still a huge premium. But Caplan digs deeper by asking how that premium is earned. And in his view, the extra income doesn’t come from substantive skills learned in college classrooms, but rather from what he called the “signaling” function of a diploma: Because employers lack any quick and reliable objective way to evaluate a job candidate’s potential worth, they fall back on the vetting work done by third parties—namely, colleges. A job candidate who also happens to be someone who managed to get through the college admissions process, followed by four years of near constant testing, likely is someone who is also intelligent and conscientious, and who can be relied on to conform to institutional norms. It doesn’t matter what the applicant was tested on, since it is common knowledge that most of what one learns in college will never be applied later in life. What matters is that these applicants were tested on something. Caplan estimates that signaling accounts for around 80% of the 40-point residual college premium described above, which, if true, would leave less than ten percentage points—of the original 70—to be accounted for. (...)
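
Strung together, the decomposition amounts to two multiplications. A short sketch with the round numbers cited here makes the "less than ten points" figure explicit:

# The decomposition summarized above, using the article's round numbers.
raw_premium = 70.0           # percentage-point earnings premium over high school
ability_bias_share = 0.45    # share Caplan attributes to ability bias
signaling_share = 0.80       # share of the residual he attributes to signaling

residual = raw_premium * (1 - ability_bias_share)   # ~38.5, the "about 40" points
learning = residual * (1 - signaling_share)         # ~7.7, "less than ten" points

print(f"premium net of ability bias:     {residual:.1f} points")
print(f"attributable to actual learning: {learning:.1f} points")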

Till now, I have discussed the value of college education in generic fashion. But as everyone on any campus knows, different majors offer different value. In the case of liberal arts, the proportion of the true college premium attributable to signaling is probably close to 100%. It is not just that the jobs these students seek typically don’t require any of the substantive knowledge they acquired during their course of study: They also aren’t really improving students’ analytical skills, either. In their 2011 book Academically Adrift: Limited Learning on College Campuses, sociologists Richard Arum and Josipa Roksa presented data showing that, over their first two years of college, students typically improve their skills in critical thinking, complex reasoning and writing by less than a fifth of a standard deviation.

According to the U.S. Department of Commerce’s 2017 report on STEM jobs, even the substantive educational benefit to be had from degrees in technical fields may be overstated—since “almost two-thirds of the workers with a STEM undergraduate degree work in a non-STEM job.” Signaling likely plays a strong role in such cases. Indeed, since STEM degrees are harder to obtain than non-STEM degrees, they provide an even stronger signal of intelligence and conscientiousness.

However, this is not the only reason why irrelevant coursework pays. Why do U.S. students who want to become doctors, one of the highest paying professions, first need to complete four years of often unrelated undergraduate studies? The American blogger and psychiatrist Scott Alexander, who majored in philosophy as an undergraduate and then went on to study medicine in Ireland, observed in his brilliant 2015 essay Against Tulip Subsidies that “Americans take eight years to become doctors. Irishmen can do it in four, and achieve the same result.” Law follows a similar pattern: While it takes four years to study law in Ireland, and in France it takes five, students in the United States typically spend seven years in school before beginning the separate process of bar accreditation.

by Samuel Knoche, Quillette | Read more:
Image: uncredited

Maybe They’re Just Bad People

Seven years ago, a former aide to Ralph Reed — who also worked, briefly, for Paul Manafort — published a tawdry, shallow memoir that is also one of the more revealing political books I’ve ever read. Lisa Baron was a pro-choice, pro-gay rights, hard-partying Jew who nonetheless made a career advancing the fortunes of the Christian right. She opened her book with an anecdote about performing oral sex on a future member of the George W. Bush administration during the 2000 primary, which, she wrote, “perfectly summed up my groupie-like relationship to politics at that time — I wanted it, I worshiped it, and I went for it.”

It’s not exactly a secret that politics is full of amoral careerists lusting — literally or figuratively — for access to power. Still, if you’re interested in politics because of values and ideas, it can be easier to understand people who have foul ideologies than those who don’t have ideologies at all. Steve Bannon, a quasi-fascist with delusions of grandeur, makes more sense to me than Anthony Scaramucci, a political cipher who likes to be on TV. I don’t think I’m alone. Consider all the energy spent trying to figure out Ivanka Trump’s true beliefs, when she’s shown that what she believes most is that she’s entitled to power and prestige.

Baron’s book, “Life of the Party: A Political Press Tart Bares All,” is useful because it is a self-portrait of a cynical, fame-hungry narcissist, a common type but one underrepresented in the stories we tell about partisan combat. A person of limited self-awareness — she seemed to think readers would find her right-wing exploits plucky and cute — Baron became Reed’s communications director because she saw it as a steppingstone to her dream job, White House press secretary, a position she envisioned in mostly sartorial terms. (“Outfits would be planned around the news of the day,” she wrote.) Reading Baron’s story helped me realize emotionally something I knew intellectually. It’s tempting for those of us who interpret politics for a living to overstate the importance of competing philosophies. We shouldn't forget the enduring role of sheer vanity. (...)

In many ways, the insincere Trumpists are the most frustrating. Because they don’t really believe in Trump’s belligerent nationalism and racist conspiracy theories, we keep expecting them to feel shame or remorse. But they’re not insincere because they believe in something better than Trumpism. Rather, they believe in very little. They are transactional in a way that makes no psychological sense to those of us who see politics as a moral drama; they might as well all be wearing jackets saying, “I really don’t care, do u?”

Baron’s book helped me grasp what public life is about for such people. “I loved being in the middle of something big, and the biggest thing in my life was Ralph,” she wrote in one of her more plaintive passages. “Without him, I was nobody.” Such a longing for validation is underrated as a political motivator. Senator Lindsey Graham, another insincere Trumpist, once justified his sycophantic relationship with the president by saying, “If you knew anything about me, I want to be relevant.” Some people would rather be on the wrong side than on the outside.

by Michelle Goldberg, NY Times |  Read more:
Image: uncredited
[ed. See also: Mia Love: "No real relationships, just convenient transactions".]

How a Japanese Craftsman Lives by the Consuming Art of Indigo Dyeing

Kanji Hama, 69, has quietly dedicated his life to maintaining the traditional Japanese craft of katazome: stencil-printed indigo-dyed kimonos made according to the manner and style of the Edo period. He works alone seven days a week from his home in Matsumoto, Nagano, keeping indigo fermentation vats brewing in his backyard and cutting highly detailed patterns into handmade paper hardened with persimmon tannins to create designs for a craft for which there is virtually no market. Nearly identical-looking garments can be had for a pittance at any souvenir store.

Indigo is one of a handful of blue dyes found in nature, and it’s surprising that it was ever discovered at all, as the plants that yield it reveal no hint of the secret they hold. Unlike other botanical dyestuff, which can be boiled or crushed to release its color, the creation of indigo requires a complex molecular process involving fermentation of the plant’s leaves. (The most common source is the tropical indigo plant, or Indigofera tinctoria, but Japanese dyes are generally made from Persicaria tinctoria, a species of buckwheat.) Everyone who has worked with indigo — from the Tuareg and Yoruba in Africa to the Indians and Japanese across Asia to the prehistoric tribes in the Americas — figured out their own methods for coaxing out the dye, and distinct ways of using it to embellish their clothing, costumes, domestic textiles or ritual objects that were particularly expressive of their own culture and beliefs.

No one knows exactly when indigo arrived in Japan, but beginning around the eighth century, the Japanese began creating a large repertoire of refined traditions for designing with it. Many indigo techniques are intended to hold back, or resist, the dye in certain areas to create designs. Nearly all of these, which include various ways of manipulating the fabric before it is dyed, such as tying it, knotting it, folding it, stitching it, rolling it or applying a gluey substance to it, are used in the great variety of Japanese traditions. But for Hama’s katazome practice, a paste of fermented rice is applied through a stencil laid on top of the fabric. After the fabric has been dipped in an indigo vat, the paste gets washed off and the stenciled design remains. (Resist pastes in other countries often employ local ingredients: Indonesian batik is made with wax, Indian dabu block prints with mud and Nigerian adire with cassava flour.) Katazome, however, unlike the other resist techniques, can yield very intricate and delicate designs because the stencil-making itself, called katagami, is a precise and elaborate craft, unique to Japan.

Matsumoto, which is roughly halfway between Tokyo and Kyoto, was once a center for the Japanese folk craft movement of the 1930s through the 1950s, which recognized and celebrated the beauty of regional, handcrafted everyday objects, or mingei. Hama’s grandfather was part of that movement and a pioneer in reviving natural dyeing after its obsolescence. Hama learned his trade as his father’s apprentice, starting when he was 18, working without salary or holidays, seven days a week for 15 years. (Every evening, from 8 p.m. until about 3 a.m., Hama returned to the studio to practice what he had learned that day.)

Wearing blue work clothes, his hair covered with an indigo scarf and his hands and fingernails stained blue, Hama ushers me to his studio, which occupies the second floor of his house and is outfitted with long, narrow tables built to accommodate lengths of kimono fabric (a standard kimono is about 40 feet long and 16 inches wide). From a back door off the studio, stairs lead to a shed that houses his fermentation vats and a small yard, given over in its entirety to sheaths of dyed kimono fabric, stretched from one end to the other — like long, slender hammocks — to dry.

Of the dozens of steps involved in his process, some are highly complicated and some are simply tedious, such as the repeated washing and starching and rinsing of the fabric, but all are time-consuming. “Craft is doing things with your hands. Once you manufacture things, it is no longer craft,” Hama tells me. As a holdout devoted to maintaining the tradition against all odds, almost to the point of tragic absurdity, Hama is not interested in the easy way. Rather than buy prewashed fabric or premade starch, he makes them himself. He sets down one of the stencils he has carved into persimmon-hardened paper called washi — a slight modification of an 18th-century pattern, which he has backed in silk to keep the intricate design intact — onto a length of fabric fastened to one of the tables. (He doesn’t make his own paper or persimmon extract, but only because he doesn’t think the variety of persimmon used today yields the same quality tannins as those from his grandfather’s day. As a result, he has planted a tree from which he hopes one day to make his own.)

With a hera, a spatula-like tool, he evenly slathers a glutinous rice paste over the stencil to resist the dye. Because Hama wants a precise consistency to his paste, which varies with the intricacy of the design and the weather conditions, he mixes his own, a process that takes half a day. He squeegees the excess off the stencil and, by eye, proceeds down the table, lining each placement up where the previous one left off. The fabric is then hung in the studio to dry before he can do the same work on the other side, which, once the fabric is sewn into a kimono, won’t even be visible. Next, the fabric is moved outside, where it gets covered in soy milk (also homemade) to help keep the glue in place as it dries in the sun; this is repeated three times on each side before the dyeing can start.

We head down to the fermentation dye vats, steaming cauldrons cut into the floor of a lean-to shed. Each indigo dyer has his own recipe for adding lime, ash, lye from wood and wheat husks to the sukumo (or composted indigo plant), which must be kept warm and stirred for a couple of weeks in order to ferment and become dye, in a process called aitate. Hama works according to the seasons. In summer and the monsoon season, it is too hot for indigo, as the paste will melt, while in winter he must rise each morning at 3 a.m. to descend into the cold, adding new coals for a consistent temperature.

Hama is cognizant that what he knows will likely die along with him. Like many masters of traditional crafts in Japan, Hama does not believe in writing down the process, because the craft is understood to be so much more than its individual steps and thus impossible to transmit through written instruction. Indigo dyeing like this is a way of life, and to the extent to which Hama is a master, he possesses not just his own knowledge but, in a very real way, his father’s and his father’s father’s knowledge. This kind of embodied, tacit expertise doesn’t translate easily into English as it involves the very un-Western idea of the body and the intellect working in unison, masterfully and efficiently, as if in a dance. There is a chance his son will take on the business, but Hama thinks this generation is incapable of putting in the time it takes to gain the mastery of a craft like this.

by Deborah Needleman, NY Times |  Read more:
Image: Kyoko Hamada. Styled by Theresa Rivera. Photographer’s assistant: Garrett Milanovich. Styling assistant: Sarice Olson. Indigo pieces courtesy of Kanji Hama

Monday, November 26, 2018


Oleg Tselkov (Russian, b. 1934), Flush Toilet and Agave, 1956
via:

Michael Kidd (1937 - ) Derek Jarman’s Garden
via:

An Ecology of Beauty and Strong Drink

According to the theory of cultural evolution, rituals and other cultural elements evolve in the context of human beings. They depend on us for their reproduction, and sometimes help us feel good and accomplish our goals, reproductive and otherwise. Ritual performances, like uses of language, exhibit a high degree of variation; ritual performances change over time, and some changes are copied, some are not. As with genetic mutation, ritual novelty is constantly emerging.

The following presents several ecological metaphors for ritual adaptation: sexual selection, the isolated island, and the clearcut forest. Once these metaphors are established, I will explain how they apply to ritual, and suggest some policy recommendations based on this speculation. (...)

Clearcuts

When a mature natural ecosystem is destroyed by fire, clearcutting, or plowing, a particular process of succession follows. First, plants with a short life history that specialize in colonization emerge; these first-stage plants are often called weeds, or “weedy ephemerals,” and make up a large number of agricultural pest species. But these initial colonizers specialize in colonization at the expense of long-term competitiveness for light. Second, a wave of plants that are not as good at spreading their seed, but a little better at monopolizing light, gain dominance. These are followed by plants that are even better at long-term competition; eventually, absent human interference, the original weeds become rare.

Sometimes, however, the landscape is frozen at the first stage of succession; this is known as agriculture. Second-wave competitive plants are prevented from growing; the land is cleared again and again, and the seeds of a single species planted, providing an optimal environment for short-life-history weeds. Since the survival of humans and their livestock depends on only a few species of plants, other plants that would eventually out-compete the weeds must not be permitted to grow. Instead, herbicides are applied, resulting in selection for better and better weeds.

This is not an indictment of agriculture. Again, without these methods, most humans on earth would die. But the precariousness of the situation is a result of evolutionary processes. Perverse results are common in naive pest management strategies; Kaneshiro (pp. 13-14) suggests that eradication efforts for the Mediterranean fruit fly in California in the 1980s, despite temporarily reducing the population size substantially, paradoxically resulted in the adaptation of the fruit fly to winter conditions and subsequent population explosions. Pesticide resistance in plants and animals (and even diseases) frequently follows a similarly perverse course.

Ritual Ecology

Ecosystems are made up of “selfish” organisms that display variation, and undergo natural and sexual selection. Ecosystems seem to self-repair because any temporarily empty niche will quickly be filled by any organism that shows up to do the job, no matter how ill-suited it may be at first. Economies self-repair in the same manner: a product or service that is not being supplied is an opportunity.

Language appears to be remarkably self-repairing: deaf school children in Nicaragua, provided only with lipreading training of dubious effectiveness, developed their own language, which within two generations acquired the core expressive characteristics of any human language.

While inherited ritual traditions may be extremely useful and highly adapted to their contexts, ritual may exhibit a high degree of self-repair as well. And since the context of human existence has changed so rapidly since the Industrial Revolution, ancestral traditions may be poorly adapted to new contexts; self-repair for new contexts may be a necessity. The human being himself has not changed much, but his environment, duties, modes of subsistence, and social interdependencies have changed dramatically.

Memetic selection is like sexual selection, in that it is based on signal reception by a perceiving organism (another human or group of humans). Rituals are transmitted by preferential copying (with variation); even novel rituals, like the rock concert, the desert art festival, the school shooting, or the Twitter shaming, must be attended to and copied in order to survive and spread.

Some rituals are useful, providing group cohesion and bonding, the opportunity for costly signaling, free-rider detection and exclusion, and similar benefits. Some rituals have aesthetic or affective benefits, providing desirable mental states; these need not be happy, as one of the most popular affective states provided by songs is poignant sadness. Rituals vary in their usefulness, communication efficiency, pleasurability, and prestige; they will be selected for all these qualities.

Ritual is not a single, fungible substance. Rather, an entire human culture has many ritual niches, just like an ecosystem: rituals specialized for cohesion and bonding may display adaptations entirely distinct from rituals that are specialized for psychological self-control or pleasurable feelings. Marriage rituals are different from dispute resolution rituals; healing rituals are distinct from criminal justice rituals. Humans have many signaling and affective needs, and at any time many rituals are in competition to supply them.

Cultural Clearcutting: Ritual Shocks

Ordinarily, rituals evolve slowly and regularly, reflecting random chance as well as changes in context and technology. From time to time, there are shocks to the system, and an entire ritual ecosystem is destroyed and must be repaired out of sticks and twigs.

Recall that in literal clearcutting, short-life-history plants flourish. They specialize in spreading quickly, with little regard for long-term survival and zero regard for participating in relationships within a permanent ecosystem. After a cultural clearcutting occurs, short-life-history rituals such as drug abuse flourish. To take a very extreme example, the Native American genocide destroyed many cultures at one blow. Many peoples who had safely used alcohol in ceremonial contexts for centuries experienced chronic alcohol abuse as their cultures were erased and they were massacred and forcibly moved across the country to the most marginal lands. There is some recent evidence of ritual repair, however; among many Native American groups, alcohol use is lower than among whites, and the ratio of Native American to white alcohol deaths has been decreasing for decades.

Crack cocaine did not spread among healthy, ritually intact communities. It spread among communities that had been “clearcut” by economic problems (including loss of manufacturing jobs), sadistic urban planning practices, and tragic social changes in family structure. Methamphetamine has followed similar patterns.

Alcohol prohibition in the United States constituted both a ritual destruction and a pesticide-style management policy. Relatively healthy ritual environments for alcohol consumption, such as fine restaurants, which generated substantial social capital, were destroyed; American cuisine was set back decades because legitimate fine restaurants could not survive economically without selling a bottle of wine with dinner. In their place, short-life-history ritual environments such as the speakeasy sprang up; they contributed little to social capital and had no ritual standards for decorum.

During (alcohol) Prohibition, when grain and fruit alcohol was not available, poisonous wood alcohols or other toxic alcohol substitutes were commonly consumed, often (but not always) unknowingly. (It’s surprising that there are drugs more toxic than alcohol, but there you go.) The consumption of poisoned (denatured) or wood alcohol may be the ultimate short-life-history ritual; it contributed nothing to social capital, provided but a brief experience of palliation, and often resulted in death or serious medical consequences. Morgues filled with bodies. The modern-day policy of poisoning prescription opiates with acetaminophen has the same effect as the Prohibition-era policy of “denaturing” alcohol: death and suffering to those in too much pain to pay attention to long-term incentives.

Early 20th century and modern prohibitions clearly don’t eradicate short-life-history drug rituals; rather, they concentrate them in their most harmful forms, and at the same time create a permanent economic niche for distributors. As the recently deceased economist Douglass North said in his Nobel lecture,
The organizations that come into existence will reflect the opportunities provided by the institutional matrix. That is, if the institutional framework rewards piracy then piratical organizations will come into existence; and if the institutional framework rewards productive activities then organizations – firms – will come into existence to engage in productive activities.
If the ritual ecology within a category of ritual provides attractive niches for short-life-history rituals, and the economic ecology provides niches for drug cartels, then these will come into existence and prosper; but if a ritual context is allowed to evolve to encapsulate mind-altering substances, as it has for most human societies in the history of the world, and to direct the use of these substances in specific times, manners, and places, then these longer-life-history rituals specialized for competition rather than short-term palliation will flourish. Prohibition is a pesticide with perverse effects; ritual reforestation is a long-term solution. (...)

I focus on drugs because drugs are interesting, and they provide a tidy example of the processes in ritual ecology. But the same selective effects are present in many domains: music, drama, exercise, food, and the new ritual domain of the internet.

by Sarah Perry, Ribbonfarm |  Read more:
Image: Clearcut, Wikipedia

Sunday, November 25, 2018

Of America and the Rise of the Stupefied Plutocrat

At the higher elevations of informed American opinion in the spring of 2018 the voices of reason stand united in their fear and loathing of Donald J. Trump, real estate mogul, reality TV star, 45th president of the United States. Their viewing with alarm is bipartisan and heartfelt, but the dumbfounded question, “How can such things be?” is well behind the times. Trump is undoubtedly a menace, but he isn’t a surprise. His smug and self-satisfied face is the face of the way things are and have been in Washington and Wall Street for the last quarter of a century.

Trump staked his claim to the White House on the proposition that he was “really rich,” embodiment of the divine right of money and therefore free to say and do whatever it takes to make America great again. A deus ex machina descending an escalator into the atrium of his eponymous tower on Manhattan’s Fifth Avenue in June 2015, Trump was there to say, and say it plainly, that money is power, and power, ladies and gentlemen, is not self-sacrificing or democratic. The big money cares for nothing other than itself, always has and always will. Name of the game, nature of the beast.

Not the exact words in Trump’s loud and thoughtless mouth, but the gist of the message that over the next 17 months he shouted to fairground crowd and camera in states red, white and blue. A fair enough share of his fellow citizens screamed, stamped and voted in agreement because what he was saying they knew to be true, knew it not as precept borrowed from the collected works of V.I. Lenin or Ralph Lauren but from their own downwardly mobile experience on the losing side of a class war waged over the past 40 years by America’s increasingly frightened and selfish rich against its increasingly angry and debt-bound poor.

Trump didn’t need briefing papers to refine the message. He presented it live and in person, an unscripted and overweight canary flown from its gilded cage, telling it like it is when seen from the perch of the haves looking down on the birdseed of the have-nots. Had he time or patience for looking into books instead of mirrors, he could have sourced his wisdom to Supreme Court Justice Louis Brandeis, who in 1933 presented the case for Franklin D. Roosevelt’s New Deal: “We must make our choice. We may have democracy, or we may have wealth concentrated in the hands of a few, but we can’t have both.”

Not that it would have occurred to Trump to want both, but he might have been glad to know the Supreme Court had excused him from further study under the heading of politics. In the world according to Trump—as it was in the worlds according to Ronald Reagan, George Bush pere et fils, Bill Clinton and Barack Obama—the concentration of wealth is the good, the true and the beautiful. Democracy is for losers.

Ronald Reagan was elected President in 1980 with an attitude and agenda similar to Trump’s—to restore America to its rightful place where “someone can always get rich.” His administration arrived in Washington firm in its resolve to uproot the democratic style of feeling and thought that underwrote FDR’s New Deal. What was billed as the Reagan Revolution and the dawn of a New Morning in America recruited various parties of the dissatisfied right (conservative, neoconservative, libertarian, reactionary and evangelical) under one flag of abiding and transcendent truth—money ennobles rich people, making them healthy, wealthy and wise; money corrupts poor people, making them ignorant, lazy and sick.

Re-branded as neoliberalism in the 1990s, the doctrine of enlightened selfishness has served as the wisdom in political and cultural office ever since Reagan stepped onto the White House stage promising a happy return to an imaginary American past—to the home on the range made safe from Apaches by John Wayne, an America once again cowboy-hatted and standing tall, risen from the ashes of defeat in Vietnam, cleansed of its Watergate impurities, outspending the Russians on weapons of mass destruction, releasing the free market from the prison of government regulation, going long on the private good, selling short the public good.

For 40 years under administrations Republican and Democrat, the concentrations of wealth and power have systematically shuffled public land and light and air into a private purse, extended the reach of corporate monopoly, shifted the bulk of the nation’s income to its top-tier fatted calves, and let fall into disrepair nearly all the infrastructure—roads, water systems, schools, bridges, hospitals and power plants—that provides a democratic commonwealth with the means of production for its mutual enterprise. The subdivision of America the Beautiful into a nation of the rich and a nation of the poor has outfitted a tenth of the population with three-quarters of the nation’s wealth. The work in progress has been accompanied by the construction of a national security and surveillance state, backed by the guarantee of never-ending foreign war and equipped with increasingly repressive police powers to quiet the voices of domestic discontent.

In the 1950s the word public indicated a common good (public health, public school, public service, public spirit); private was a synonym for selfishness and greed (plutocrats in top hats, pigs at troughs). The connotations traded places in the 1980s: private came to be associated with all things bright and beautiful (private trainer, private school, private plane), public with all things ugly, incompetent and unclean (public housing, public welfare, public toilet). (...)

The framers of the Constitution, prosperous and well-educated gentlemen assembled in Philadelphia in the summer of 1787, shared with John Adams the suspicion that “democracy will infallibly destroy all civilization,” agreed with James Madison that the turbulent passions of the common man lead to “reckless agitation” for the abolition of debts and “other wicked projects.” With Plato the framers shared the assumption that the best government incorporates the means by which a privileged few arrange the distribution of property and law for the less fortunate many. They envisioned an enlightened oligarchy to which they gave the name of a republic. Adams thought “the great functions of state” should be reserved for “the rich, the well-born, and the able,” the new republic to be managed by men to whom Madison attributed “most wisdom to discern and most virtue to pursue the common good of the society.” (...)

But unlike our present-day makers of money and law, the founders were not stupefied plutocrats. They knew how to read and write (in Latin or French if not also in Greek) and they weren’t preoccupied with the love and fear of money. From their reading of history they understood that oligarchy was well-advised to furnish democracy with some measure of political power, because failure to do so was apt to lead to its being roasted on pitchforks. Accepting that whereas democracy puts a premium on equality, a capitalist economy does not, the founders looked to balance the divergent ways and means, to accommodate both the motions of the heart and the movement of a market. They conceived the Constitution as both organism and mechanism and offered as warranty for its worth the character of men presumably relieved of the necessity to cheat and steal and lie.

The presumption in 1787 could be taken at fair and face value. The framers were endowed with the intellectual energy of the 18th-century Enlightenment, armed with the moral force of the Christian religion. Their idea of law they held to be sacred, a marriage of faith and reason. But good intentions are a perishable commodity, and even the best of oligarchies bear comparison to cheese. Sooner or later they turn rancid in the sun. Wealth accumulates, men decay; a band of brothers that once aspired to form a wise and just government acquires the character of what Aristotle called “the prosperous fool,” a class of men insatiable in their appetite for more—more banquets, more laurel wreaths and naval victories, more temples, dancing girls and portrait busts—so intoxicated by the love of money “they therefore imagine there is nothing it cannot buy.” (...)

All men were maybe equal in the eye of God, but not in the pews in Boston’s Old North Church, in the streets of Benjamin Franklin’s Philadelphia, in the fields at Jefferson’s Monticello. The Calvinist doctrine of predestination divided the Massachusetts flock of Christian sheep into damned and saved; Cotton Mather in 1696 reminded the servants in his midst, “You are the animate, separate passive instruments of other men . . . your tongues, your hands, your feet, are your masters’, and they should move according to the will of your masters.” Franklin, enlightened businessman and founder of libraries, looked upon the Philadelphia rabble as coarse material that maybe could be brushed and combed into an acceptable grade of bourgeois broadcloth. His Poor Richard’s Almanac offered a program for turning sow’s ears if not into silk purses, then into useful tradesmen furnished with a “happy mediocrity.” For poor white children in Virginia, Jefferson proposed a scheme he described as “raking from the rubbish” the scraps of intellect and talent worth the trouble of further cultivation. A few young illiterates who showed promise as students were allowed to proceed beyond the elementary grades; the majority were released into a wilderness of ignorance and poverty, dispersed over time into the westward-moving breeds of an American underclass variously denominated as “mudsill,” “hillbilly,” “cracker,” “Okie,” “redneck,” Hillary Clinton’s “basket of deplorables.”

Nor at any moment in its history has America declared a lasting peace between the haves and have-nots. Temporary cessations of hostilities, but no permanent closing of the moral and social frontier between debtor and creditor. The notion of a classless society derives its credibility from the relatively few periods in the life of the nation during which circumstances encouraged social readjustment and experiment—in the 1830s, 1840s, and 1850s, again in the 1940s, 1950s and 1960s—but for the most part the record will show the game securely rigged in favor of the rich, no matter how selfish or stupid, at the expense of the poor, no matter how innovative or entrepreneurial. During the last 30 years of the 19th century and the first 30 years of the 20th, class conflict furnished the newspaper mills with their best-selling headlines—railroad company thugs quelling labor unrest in the industrial East, the Ku Klux Klan lynching Negroes in the rural South, the U.S. army exterminating Sioux Indians on the Western plains.

Around the turn of the 20th century the forces of democracy pushed forward an era of progressive reform sponsored by both the Republican president, Theodore Roosevelt, and the Democratic president, Woodrow Wilson. During the middle years of the 20th century America at times showed some semblance of the republic envisioned by its 18th-century founders—Franklin D. Roosevelt’s New Deal, a citizen army fighting World War II, the Great Depression replaced with a fully employed economy in which all present shared in the profits.

The civil rights and anti-Vietnam war protests in the 1960s were expressions of democratic objection and dissent intended to reform the country’s political thought and practice, not to overthrow its government. Nobody was threatening to reset the game clock in the Rose Bowl, tear down Grand Central Terminal or remove the Lincoln Memorial. The men, women and children confronting racist tyranny in the South—sitting at a lunch counter in Alabama, riding a bus into Mississippi, going to school in Arkansas—risked their lives and sacred honor on behalf of a principle, not a lifestyle; for a government of laws, not men. The unarmed rebellion led to the enactment in the mid-1960s of the Economic Opportunity Act, the Voting Rights Act, the Medicare and Medicaid programs, eventually to the shutting down of the Vietnam War.

Faith in democracy survived the assassination of President John F. Kennedy in 1963; it didn’t survive the assassinations of Robert Kennedy and Martin Luther King in 1968. The 1960s and 1970s gave rise to a sequence of ferocious and destabilizing change—social, cultural, technological, sexual, economic and demographic—that tore up the roots of family, community and church from which a democratic society draws meaning and strength. The news media promoted the multiple wounds to the body politic (the murders of King and Kennedy, big-city race riots, the killing of college students at Kent State and Jackson State, crime in the streets of Los Angeles, Chicago and Newark) as revolution along the lines of Robespierre’s reign of terror. The fantasy of armed revolt sold papers, boosted ratings, and stimulated the demand for heavy surveillance and repressive law enforcement that over the last 50 years has blossomed into the richest and most innovative of the nation’s growth industries.

By the end of the 1970s democracy had come to be seen as a means of government gone soft in the head and weak in the knees, no match for unscrupulous Russians, incapable of securing domestic law and order, unable to disperse the barbarians (foreign and native born) at the gates of the gated real estate in Beverly Hills, Westchester County and Palm Beach. The various liberation movements still in progress no longer sought to right the wrongs of government. The political was personal, the personal political. Seized by the appetite for more—more entitlements, privileges and portrait busts—plaintiffs for both the haves and the have-nots agitated for a lifestyle, not a principle. The only constitutional value still on the table was the one constituting freedom as property, property as freedom. A fearful bourgeois society adrift in a sea of troubles was clinging to its love of money as if to the last lifeboat rowing away from the Titanic when Ronald Reagan in 1980 stepped onto the stage of the self-pitying national melodrama with the promise of an America to become great again in a future made of gold.

by Lewis Lapham, LitHub |  Read more:
Image: Detail from Jasper Johns 'White Flag'