Monday, March 2, 2015

The Troubled History Of The Foreskin

Circumcision has been practised for millennia. Right now, in America, it is so common that foreskins are somewhat rare, and may become more so. A few weeks before the protests, the Centers for Disease Control and Prevention (CDC) had suggested that healthcare professionals talk to men and parents about the benefits of the procedure, which include protection from some sexually transmitted diseases, and the risks, which the CDC describes as low. But as the protesters wanted drivers to know, there is no medical consensus on this issue. Circumcision isn't advised for health reasons in Europe, for instance, because the benefits remain unclear. Meanwhile, Western organisations are paying for the circumcision of millions of African men in an attempt to rein in HIV – a campaign that critics say is also based on questionable evidence.

Men have been circumcised for thousands of years, yet our thinking about the foreskin seems as muddled as ever. And a close examination of this muddle raises disturbing questions. Is this American exceptionalism justified? Should we really be funding mass circumcision in Africa? Or by removing the foreskins of men, boys and newborns, are we actually committing a violation of human rights?

The tomb of Ankhmahor, a high-ranking official in ancient Egypt, is situated in a vast burial ground just outside Cairo. A picture of a man standing upright is carved into one of the walls. His hands are restrained, and another figure kneels in front of him, holding a tool to his penis. Though there is no definitive explanation of why circumcision began, many historians believe this relief, carved more than four thousand years ago, is the oldest known record of the procedure.

The best-known circumcision ritual, the Jewish ceremony of brit milah, is also thousands of years old. It survives to this day, as do others practised by Muslims and some African tribes. But American attitudes to circumcision have a much more recent origin. As medical historian David Gollaher recounts in his book Circumcision: A History of the World's Most Controversial Surgery, early Christian leaders abandoned the practice, realising perhaps that their religion would be more attractive to converts if surgery wasn't required. Circumcision disappeared from Christianity, and the secular Western cultures that descended from it, for almost two thousand years.

Then came the Victorians. One day in 1870, a New York orthopaedic surgeon named Lewis Sayre was asked to examine a five-year-old boy suffering from paralysis of both legs. Sayre was the picture of a Victorian gentleman: three-piece suit, bow tie, mutton chops. He was also highly respected, a renowned physician at Bellevue Hospital, New York's oldest public hospital, and an early member of the American Medical Association.

After the boy's sore genitals were pointed out by his nanny, Sayre removed the foreskin. The boy recovered. Believing he was on to something big, Sayre conducted more procedures. His reputation was such that when he praised the benefits of circumcision – which he did in the Transactions of the American Medical Association and elsewhere until he died in 1900 – surgeons elsewhere followed suit. Among other ailments, Sayre discussed patients whose foreskins were tightened and could not retract, a condition known as phimosis. Sayre declared that the condition caused a general state of nervous irritation, and that circumcision was the cure.

His ideas found a receptive audience. To Victorian minds many mental health issues originated with the sexual organs and masturbation. The connection had its roots in a widely read 18th-century treatise entitled Onania, or the Heinous Sin of Self-Pollution, and All Its Frightful Consequences, in Both Sexes, Considered. With Spiritual and Physical Advice to Those Who Have Already Injur'd Themselves By This Abominable Practice. The anonymous author warned that masturbation could cause epilepsy, infertility, "a wounded conscience" and other problems. By 1765 the book was in its 80th printing.

Later puritans took a similar view. Sylvester Graham, a preacher, health reformer and creator of the graham cracker, associated any pleasure with immorality. Masturbation turned one into "a confirmed and degraded idiot", he declared in 1834. Men and women suffering from otherwise unlabelled psychiatric issues were diagnosed with masturbatory insanity; treatments included clitoridectomies for women, circumcision for men.

Graham's views were later taken up by another eccentric but prominent thinker on health matters: John Harvey Kellogg, who promoted abstinence and advocated foreskin removal as a cure. (He also worked with his brother to invent the cornflake.) "The operation should be performed by a surgeon without administering anesthetic," instructed Kellogg, "as the brief pain attending the operation will have a salutary effect upon the mind, especially if it be connected with the idea of punishment."

Counter-examples to Sayre's supposed breakthrough could be found in operating theatres across America. Attempts to cure children of paralysis failed. Men, one can assume, continued to masturbate. It mattered not. The circumcised penis came to be seen as more hygienic, and cleanliness was a sign of moral standards. An 1890 journal identified smegma as "infectious material". A few years later, a book for mothers – Confidential Talks on Home and Child Life, by a member of the National Temperance Society – described the foreskin as a "mark of Satan". Another author described parents who did not circumcise their sons at an early age as "almost criminally negligent".

by Jessica Wapner, io9 | Read more:
Image: Flickr user Mararie.

Medicating Women’s Feelings

Women are moody. By evolutionary design, we are hard-wired to be sensitive to our environments, empathic to our children’s needs and intuitive of our partners’ intentions. This is basic to our survival and that of our offspring. Some research suggests that women are often better at articulating their feelings than men because as the female brain develops, more capacity is reserved for language, memory, hearing and observing emotions in others.

These are observations rooted in biology, not intended to mesh with any kind of pro- or anti-feminist ideology. But they do have social implications. Women’s emotionality is a sign of health, not disease; it is a source of power. But we are under constant pressure to restrain our emotional lives. We have been taught to apologize for our tears, to suppress our anger and to fear being called hysterical.

The pharmaceutical industry plays on that fear, targeting women in a barrage of advertising on daytime talk shows and in magazines. More Americans are on psychiatric medications than ever before, and in my experience they are staying on them far longer than was ever intended. Sales of antidepressants and antianxiety meds have been booming in the past two decades, and they’ve recently been outpaced by an antipsychotic, Abilify, that is the No. 1 seller among all drugs in the United States, not just psychiatric ones.

As a psychiatrist practicing for 20 years, I must tell you, this is insane.

At least one in four women in America now takes a psychiatric medication, compared with one in seven men. Women are nearly twice as likely as men to receive a diagnosis of depression or anxiety disorder. For many women, these drugs greatly improve their lives. But for others they aren’t necessary. The increase in prescriptions for psychiatric medications, often by doctors in other specialties, is creating a new normal, encouraging more women to seek chemical assistance. Whether a woman needs these drugs should be a medical decision, not a response to peer pressure and consumerism.

The new, medicated normal is at odds with women’s dynamic biology; brain and body chemicals are meant to be in flux. To simplify things, think of serotonin as the “it’s all good” brain chemical. Too high and you don’t care much about anything; too low and everything seems like a problem to be fixed. (...)

The most common antidepressants, which are also used to treat anxiety, are selective serotonin reuptake inhibitors (S.S.R.I.s) that enhance serotonin transmission. S.S.R.I.s keep things “all good.” But too good is no good. More serotonin might lengthen your short fuse and quell your fears, but it also helps to numb you, physically and emotionally. These medicines frequently leave women less interested in sex. S.S.R.I.s tend to blunt negative feelings more than they boost positive ones. On S.S.R.I.s, you probably won’t be skipping around with a grin; it’s just that you stay more rational and less emotional. Some people on S.S.R.I.s have also reported less of many other human traits: empathy, irritation, sadness, erotic dreaming, creativity, anger, expression of their feelings, mourning and worry.

by Julie Holland, NY Times |  Read more:
Image: Christelle Enault

American Democracy is Doomed

America's constitutional democracy is going to collapse.

Some day — not tomorrow, not next year, but probably sometime before runaway climate change forces us to seek a new life in outer-space colonies — there is going to be a collapse of the legal and political order and its replacement by something else. If we're lucky, it won't be violent. If we're very lucky, it will lead us to tackle the underlying problems and result in a better, more robust, political system. If we're less lucky, well, then, something worse will happen.

Very few people agree with me about this, of course. When I say it, people generally think that I'm kidding. America is the richest, most successful country on earth. The basic structure of its government has survived contested elections and Great Depressions and civil rights movements and world wars and terrorist attacks and global pandemics. People figure that whatever political problems it might have will prove transient — just as happened before. (...)

The breakdown of American constitutional democracy is a contrarian view. But it's nothing more than the view that rather than everyone being wrong about the state of American politics, maybe everyone is right. Maybe Bush and Obama are dangerously exceeding norms of executive authority. Maybe legislative compromise really has broken down in an alarming way. And maybe the reason these complaints persist across different administrations and congresses led by members of different parties is that American politics is breaking down.

The perils of presidential democracy

To understand the looming crisis in American politics, it's useful to think about Germany, Japan, Italy, and Austria. These are countries that were defeated by American military forces during the Second World War and given constitutions written by local leaders operating in close collaboration with occupation authorities. It's striking that even though the US Constitution is treated as a sacred text in America's political culture, we did not push any of these countries to adopt our basic framework of government.

This wasn't an oversight.

In a 1990 essay, the late Yale political scientist Juan Linz observed that "aside from the United States, only Chile has managed a century and a half of relatively undisturbed constitutional continuity under presidential government — but Chilean democracy broke down in the 1970s."

The exact reasons why are disputed among scholars — in part because you can't just randomly assign different governments to people. One issue here is that American-style systems are much more common in the Western Hemisphere and parliamentary ones are more common elsewhere. Latin American countries have experienced many episodes of democratic breakdown, so distinguishing Latin American cultural attributes from institutional characteristics is difficult.

Still, Linz offered several reasons why presidential systems are so prone to crisis. One particularly important one is the nature of the checks and balances system. Since both the president and the Congress are directly elected by the people, they can both claim to speak for the people. When they have a serious disagreement, according to Linz, "there is no democratic principle on the basis of which it can be resolved." The constitution offers no help in these cases, he wrote: "the mechanisms the constitution might provide are likely to prove too complicated and aridly legalistic to be of much force in the eyes of the electorate."

In a parliamentary system, deadlocks get resolved. A prime minister who lacks the backing of a parliamentary majority is replaced by a new one who has it. If no such majority can be found, a new election is held and the new parliament picks a leader. It can get a little messy for a period of weeks, but there's simply no possibility of a years-long spell in which the legislative and executive branches glare at each other unproductively.

But within a presidential system, gridlock leads to a constitutional trainwreck with no resolution. The United States' recent government shutdowns and executive action on immigration are small examples of the kind of dynamic that's led to coups and putsches abroad.

There was, of course, the American exception to the problems of the checks-and-balances system. Linz observed on this score: "The uniquely diffuse character of American political parties — which, ironically, exasperates many American political scientists and leads them to call for responsible, ideologically disciplined parties — has something to do with it."

For much of American history, in other words, US political parties have been relatively un-ideological and un-disciplined. They are named after vague ideas rather than specific ideologies, and neither presidents nor legislative leaders can compel back-bench members to vote with them. This has often been bemoaned (famously, a 1950 report by the American Political Science Association called for a more rigorous party system) as the source of problems. It's also, according to Linz, helped avert the kind of zero-sum conflicts that have torn other structurally similar democracies apart. But that diffuse party structure is also a thing of the past.

by Matthew Yglesias, Vox | Read more:
Image: Pete Souza

Christopher Thompson, The Letter
via:

Gerrymandering Explained


[ed. See also: This report that Republican lawmakers from Arizona have filed an appeal with the Supreme Court (which will be heard today) against the state's voter-approved independent redistricting commission for creating the districts of U.S. House members. A decision striking down the commission probably would doom a similar system in neighboring California, and could affect districting commissions in 11 other states.]

Gerrymandering -- drawing political boundaries to give your party a numeric advantage over an opposing party -- is a difficult process to explain. If you find the notion confusing, check out the chart above -- adapted from one posted to Reddit this weekend -- and wonder no more.
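If you'd rather see the arithmetic than the chart, here's a toy sketch of the same idea. The vote numbers below are invented for illustration (they aren't taken from the chart): the identical 50 voters, districted two different ways, produce opposite outcomes — that's the whole trick of "packing" and "cracking."

```python
def seats(districts):
    """Count districts won by party 'B' (a strict majority of voters in the district)."""
    return sum(1 for d in districts if d.count("B") > len(d) / 2)

# 50 voters: 30 prefer party A, 20 prefer party B (a 60/40 split).

# Plan 1: every district is a balanced 6-4 slice of the electorate,
# so the majority party A wins all five seats.
plan1 = [["A"] * 6 + ["B"] * 4 for _ in range(5)]

# Plan 2: "pack" A voters into two throwaway districts and "crack"
# the rest thinly, letting B -- the 40% minority -- win 3 of 5 seats.
plan2 = [["A"] * 10, ["A"] * 10,
         ["A"] * 4 + ["B"] * 6,
         ["A"] * 3 + ["B"] * 7,
         ["A"] * 3 + ["B"] * 7]

print(seats(plan1))  # 0 -- B wins no seats
print(seats(plan2))  # 3 -- B wins a majority of seats
```

Same voters, same preferences; only the lines on the map change.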

by Christopher Ingraham, Washington Post | Read more:
Image: N8theGr8

Sunday, March 1, 2015

Back Dorm Boys


[ed. I've been digging around in the Music archives and found this Back Dorm Boys video that I hadn't seen in a long time. They even have their own Wikipedia entry now. Still makes me smile. A few other videos after this.]
[ed. Repost]

Jenny O.

[ed. Repost]

Ronnie Earl & The Broadcasters


[ed. Repost]

Frans Post, Brazilian Landscape with Anteater, 1649
via:

Gillian Welch, David Rawlings

[Repost]

When Your Punctuation Says It All (!)

I went out with a guy based on his use of dashes once. Within moments of our first interaction — over text message — I was basically in love.

He didn’t just use the lazy singular dash (“-”) as a pause between his thoughts, or even the more time-consuming double-dash (“--”). Nope. This man used a proper em dash.

That is, the kind that required him to hold down the dash button on his iPhone for that extra second, until the “—” appeared, then choose it from among three options. I don’t remember what his messages actually said. But he obviously really liked me.

I’m a writer; it’s natural I’d have a thing for grammar. But these days, it’s as if our punctuation is on steroids.

It’s not just that each of us is more delicately choosing our characters, knowing that an exclamation point or a colon carries more weight in our 140-character world. Or even that our punctuation suddenly feels like hyperbole (right?!?!?!?!?!) because we’ve lost all audible tone.

Those things are true. But it’s also as if a kind of micro-punctuation has emerged: tiny marks in the smallest of spaces that suddenly tell us more about the person on the other end than the words themselves (or, at least, we think they do).

Take the question mark. Recently, a friend I had dinner plans with sent a text to ask “what time” we were meeting. We’d been organizing this meal for weeks; a half-dozen emails back and forth. And yet the question — sans the mark — felt indifferent, almost cold. Couldn’t she at least bother to insert the necessary character?

Of course, had she inserted too many marks, that may have been a problem, too, as there is suddenly a very fine line between appearing overeager (too much punctuation) and dismissive (not enough).

Even the period, once the most benign of the punctuation spectrum, now feels aggressive. And the exclamation point is so ubiquitous that “when my girlfriends don’t use an exclamation point, I’m like ‘What’s wrong, you O.K.?’ ” said Jordana Narin, a 19-year-old student in New York.

“Girlfriends” may be a key word there, as women are more likely to use emotive punctuation than men are. Yet lately I’ve tried to rein my own effusiveness in, going as far as to insert additional punctuation into existing punctuation in an effort to soften the marks themselves.

So instead of responding to a text with “Cant wait!!” I’ll insert a space or two before the mark — “Cant wait !!” — for that extra little pause. Sometimes I’ll make the exclamation point a parenthetical, as a kind of afterthought (“Can’t wait (!)”). A friend inserts an ellipsis — “Can’t wait … !!” — so, as she puts it, “it’s less intense.”

“At this point, I’ve basically suspended judgment,” said Ben Crair, an editor at the New Republic who recently wrote a column about the new aggression of the period. “You could drive yourself insane trying to decode the hidden messages in other people’s punctuation.”

by Jessica Bennett, NY Times |  Read more:
Image: Ron Barrett

Saturday, February 28, 2015

You Are Listening To...


...a soothing mix of police radio chatter and ambient music. Choose from Los Angeles, New York, San Francisco, Chicago, or my personal recommendation, Montréal. French police chat really blends into the music nicely. You may need to adjust the balance of each stream a bit to find the right mix. 
via:
[ed. Repost]

photo: markk
[ed. Repost]

Erik Satie


[ed. Repost]

Los Amigos Invisibles


Have some fun.
[ed. Repost]

Our Date with Miranda

[ed. One of my favorite movies: Me and You and Everyone We Know.]

I first met Miranda July years ago at a faraway literary conference in Portland, Oregon. Along with Rick Moody and others we were on a panel that was supposed to converse authoritatively about narrative structure. When it came time for July to speak, she stood up and started singing. She was large-eyed and lithe. I don’t remember what song it was—something she had written herself, I believe. I was startled. Who was this woman? (Her performances and short films had not appeared widely enough to have caught my notice.) I was then mortified, not for her, since she seemed completely at ease and the audience was enthralled, but mortified for narrative structure, which had clearly been given the bum’s rush. (Well, fiction writers will do anything to avoid this topic: it is the one about which they are the most clueless and worried and improvisational.)

Sitting next to Ms. July was the brilliant Denis Johnson, who, inspired by his neighbor, when it was his turn (figuring out one’s turn can be the most difficult part of a panel) also began to sing. Also something he had written himself. I may have laughed, thinking it was all supposed to be funny, realizing too late my mistake. There was a tragic aspect to one verse in the Johnson song. I believe he did not sit down because he had not stood to begin with.

Then it was clearly, or unclearly, my turn. If not the wallflower at the orgy then I was the mute at the a cappella operetta (a condition typical of many a July character though not of July herself): I refused to sing. I don’t remember what I said—I believe I read from some notes, silently vowing never to be on another panel. (The next panel I was on, in Boston, I thought went well by comparison. That is, no one burst into random song. But when I said as much to the person sitting next to me, the editor of a prominent literary journal, he said, “Really? This was the worst panel I’ve ever participated in.”) So my introduction to July was one at which I watched her redefine boundaries and hijack something destined to be inert and turn it into something uncomfortably alive, whether you wanted her to or not. This has been my experience of her work ever since.

July’s first feature-length film, the now-famous independent Me and You and Everyone We Know, also upends expectations. July writes, directs, and stars in all her films. In many ways, while remaining a love story, the film is about the boundary-busting that is ruleless sexuality—stalking and sexual transgression—though here the predators and perpetrators are gentle and female. A boy is coercively fellated by two slightly unpleasant teenage girls devising a competition. Low-level sexual harassment is everywhere and July sometimes plays it for laughs. Two kids in a chat room lead someone on a wild goose chase, writing scatological comments in the language of very young children, and despite all this it is hilarious. A shoe salesman named Richard who has set his hand on fire in front of his sons is hounded by a woman named Christine (played by July herself) who does not know him but who is erotically obsessed with him. She has psychically and perhaps correctly marked him as her mate (the telepathic heart is at the center of much of July’s work).

Another character, a middle-aged woman seeking a partner online, finds herself hooked up with a five-year-old boy in the park. Images of flame and precariousness recur—the burning sun, the burning hand, a bright goldfish riding in a plastic bag on a car roof. And yet all is put forward with tenderness and humor. The desire for human love goes unquestioned and its role in individual fate is assumed to be essential. July’s Christine, a struggling artist who works as a driver for ElderCab, possesses a thin-skinned empathy for everyone, and her love for the shoe salesman (who is played in convincingly addled fashion by John Hawkes) is performed with both vulnerability and purity of passion.

In her two feature-length films the chemistry with her male leads is quite strong: they as well as July are like openly soulful children, attaching without reason or guile, and July is quite focused on this quality of connective vulnerability, as well as on children themselves. Her work also engages with the criterion offered up by the character of a museum curator looking at Christine’s own works: “Could this have been made in any era or only now?” With July it is a little of both. She focuses on people living “courageously with grace,” while also quietly arguing with a culture that asks us to do that.

by Lorrie Moore, NY Review of Books |  Read more:
Image: Nick Wall/Eyevine/Redux

A Glorious Distraction

During the two weeks before the Super Bowl there were more than 10,000 news articles written about the slight deviation in air pressure of the footballs used by the New England Patriots in their American Football Conference Championship victory over the Indianapolis Colts. The Patriots quarterback Tom Brady, in an attempt to defuse conspiracy allegations, joked in a press conference, “Things are fine—this isn’t ISIS.”

He was right: it wasn’t ISIS. During those two weeks, the Islamic State of Iraq and Syria was the subject of only seventy-nine articles in The New York Times. “Deflate-gate” was the subject of eighty. These included interviews with football players, who explained why a deflated ball was easier to throw and catch; physicists, who suggested that the deflation might have occurred due to climate effects; logisticians, who opined on the time necessary to deflate a football; and a seamstress of Wilson footballs who vowed, “It’s not Wilson’s fault.” Even the leader of the free world felt obliged to make a statement. “Here’s what I know,” said President Obama on Super Bowl Sunday. “The Patriots were going to beat the Colts regardless of what the footballs looked like.”

In that period Andy Studebaker’s name appeared in only nine articles, all published in sports blogs. Studebaker is the twenty-nine-year-old backup linebacker for the Colts who, while defending a punt return, was blindsided with a gruesome hit to the chest by the Patriots’ backup running back Brandon Bolden. Studebaker’s head jerked back and he landed on his neck. On the sideline after the play Studebaker was seen coughing up blood.

Nor was much made of the fine levied on professional monster Clay Matthews of the Green Bay Packers for illegally smashing into the defenseless head of Seattle Seahawks quarterback Russell Wilson in the National Football Conference Championship game. Matthews’s fine was $22,050, or approximately what he earns every ninety seconds of game play. There was also little attention given to the fact that, in the second half of that game, Seattle cornerback Richard Sherman injured his left arm so badly that he couldn’t straighten it; he played the final quarter with it bent and pressed tightly to his chest like a chicken wing.

Was it broken? Badly sprained? Was he given shockingly powerful illegal or legal drugs in order to endure the pain? The league, and Seattle, were mum on these points. When asked ten days later about the injury, Sherman said, “It’s a little sore, but not too bad.” Then, with a wink: “That’s my story and I’m sticking to it.” Minutes after the Super Bowl ended it was revealed that Sherman had torn ligaments in his elbow and will have to undergo reconstructive surgery. (...)

NFL Commissioner Roger Goodell might have been grateful for the deflation controversy because it distracted from what otherwise would have been the season’s two dominant storylines: the league’s reluctance to discipline players who commit domestic violence and its failure to protect its players from brain damage. But Goodell didn’t need the help. Every thinking fan must, in order to enjoy any NFL game, consent to participate in a formidable suspension of disbelief. We must put aside our knowledge that nearly every current NFL player can expect to suffer from chronic traumatic encephalopathy, a degenerative disease that leads to memory loss, impaired judgment, depression, and dementia.

Football players are also four times more likely both to die from ALS (a fact that Goodell, despite participating in this past year’s ALS ice-bucket challenge, refuses to acknowledge) and to develop Alzheimer’s disease. An NFL player can expect to live twenty years less than the average American male. The average NFL career lasts 3.3 years. By that measure, each season costs an NFL player about six years of his life. Football fans, in other words, must ignore the fact that we are watching men kill themselves.

by Nathaniel Rich, New York Review of Books |  Read more:
Image: Jim Davis/Boston Globe/Getty Images

Hans Erni, Le Dessinateur or Kybernetes, Lithograph in 5 colours, 1956
via: