Tuesday, March 3, 2015
Why We’re All Becoming Independent Contractors
[ed. FedEx? Really? I had no idea this is how they operated.]
FedEx calls its drivers independent contractors. Yet FedEx requires them to pay for the FedEx-branded trucks they drive, as well as the FedEx uniforms they wear and the FedEx scanners they use—along with insurance, fuel, tires, oil changes, meals on the road, maintenance, and workers’ compensation insurance. If they get sick or need a vacation, they have to hire their own replacements. They’re even required to groom themselves according to FedEx standards.
FedEx doesn’t tell its drivers what hours to work, but it tells them what packages to deliver and organizes their workloads to ensure they work between 9.5 and 11 hours every working day.
If this isn’t “employment,” I don’t know what the word means.
In 2005, thousands of FedEx drivers in California sued the company, alleging they were in fact employees and that FedEx owed them the money they shelled out, as well as wages for all the overtime work they put in.
Last summer, a federal appeals court agreed, finding that under California law—which looks at whether a company “controls” how a job is done along with a variety of other criteria to determine the real employment relationship—the FedEx drivers were indeed employees, not independent contractors.
by Robert Reich, Granta | Read more:
Image: WSJ
All You Have Eaten
How much of that will he remember, and for how long, and how well? You might be able to tell me, with some certainty, what your breakfast was, but that confidence most likely diminishes when I ask about two, three, four breakfasts ago—never mind this day last year. (...)
For breakfast on January 2, 2008, I ate oatmeal with pumpkin seeds and brown sugar and drank a cup of green tea. I know because it’s the first entry in a food log I still keep today. I began it as an experiment in food as a mnemonic device. The idea was this: I’d write something objective every day that would cue my memories into the future—they’d serve as compasses by which to remember moments.
Andy Warhol kept what he called a “smell collection,” switching perfumes every three months so he could reminisce more lucidly on those three months whenever he smelled that period’s particular scent. Food, I figured, took this even further. It involves multiple senses, and that’s why memories that surround food can come on so strong.
What I’d like to have is a perfect record of every day. I’ve long been obsessed with this impossibility, that every day be perfectly productive and perfectly remembered. What I remember from January 2, 2008 is that after eating the oatmeal I went to the post office, where an old woman was arguing with a postal worker about postage—she thought what she’d affixed to her envelope was enough and he didn’t. (...)
Last spring, as part of a NASA-funded study, a crew of three men and three women with “astronaut-like” characteristics spent four months in a geodesic dome in an abandoned quarry on the northern slope of Hawaii’s Mauna Loa volcano. For those four months, they lived and ate as though they were on Mars, venturing outside only into the surrounding Mars-like volcanic terrain, in simulated space suits. The Hawaii Space Exploration Analog and Simulation (HI-SEAS) is a four-year project: a series of missions meant to simulate and study the challenges of long-term space travel, in anticipation of mankind’s eventual trip to Mars. This first mission’s focus was food.
Getting to Mars will take roughly six to nine months each way, depending on trajectory; the mission itself will likely span years. So the question becomes: How do you feed astronauts for so long? On “Mars,” the HI-SEAS crew alternated between two days of pre-prepared meals and two days of dome-cooked meals of shelf-stable ingredients. Researchers were interested in a number of behavioral questions: among them, the well-documented phenomenon of menu fatigue (when International Space Station astronauts grow weary of their packeted meals, they tend to lose weight). They wanted to see what patterns would evolve over time if a crew’s members were allowed dietary autonomy and given the opportunity to cook for themselves (“an alternative approach to feeding crews of long term planetary outposts,” read the open call).
Everything was hyper-documented. Everything eaten was logged in painstaking detail: weighed, filmed, and evaluated. The crew filled in surveys before and after meals: queries into how hungry they were, their first impressions, their moods, how the food smelled, what its texture was, how it tasted. They documented their time spent cooking; their water usage; the quantity of leftovers, if any. The goal was to measure the effect of what they ate on their health and morale, along with other basic questions concerning resource use. How much water will it take to cook on Mars? How much water will it take to wash dishes? How much time is required; how much energy? How will everybody feel about it all? I followed news of the mission devoutly. (...)
Their crew of six was selected from a pool of 700 candidates. Kate is a science writer, open-water swimmer, and volleyball player. When I asked her what “astronaut-like” means and why she was picked, she said it’s some “combination of education, experience, and attitude”: a science background, leadership experience, an adventurous attitude. An interest in cooking was not among the requirements. The cooking duties were divided from the get-go; in the kitchen, crew members worked in pairs. On non-creative days they’d eat just-add-water, camping-type meals: pre-prepared lasagna, which surprised Kate by being not terrible; a thing called “kung fu chicken” that Angelo described as “slimy” and less tolerable; a raspberry crumble dessert that’s a favorite among backpackers (“That was really delicious,” Kate said, “but still you felt weird about thinking it was too delicious”). The crew didn’t eat much real astronaut food—astronaut ice cream, for example—because real astronaut food is expensive. (...)
When I look back on my meals from the past year, the food log does the job I intended more or less effectively. I can remember, with some clarity, the particulars of given days: who I was with, how I was feeling, the subjects discussed. There was the night in October I stress-scarfed a head of romaine and peanut butter packed onto old, hard bread; the somehow not-sobering bratwurst and fries I ate on day two of a two-day hangover, while trying to keep things light with somebody to whom, the two nights before, I had aired more than I meant to. There was the night in January I cooked “rice, chicken stirfry with bell pepper and mushrooms, tomato-y Chinese broccoli, 1 bottle IPA” with my oldest, best friend, and we ate the stirfry and drank our beers slowly while commiserating about the most recent conversations we’d had with our mothers.
Reading the entries from 2008, that first year, does something else to me: it suffuses me with the same mortification as if I’d written down my most private thoughts (that reaction is what keeps me from maintaining a more conventional journal). There’s nothing particularly incriminating about my diet, except maybe that I ate tortilla chips with unusual frequency, but the fact that it’s just food doesn’t spare me from the horror and head-shaking that comes with reading old diaries. Mentions of certain meals conjure specific memories, but mostly what I’m left with are the general feelings from that year. They weren’t happy ones. I was living in San Francisco at the time. A relationship was dissolving.
It seems to me that the success of a relationship depends on a shared trove of memories. Or not shared, necessarily, but not incompatible. That’s the trouble, I think, with parents and children: parents retain memories of their children that the children themselves don’t share. My father’s favorite meal is breakfast and his favorite breakfast restaurant is McDonald’s, and I remember—having just read Michael Pollan or watched Super Size Me—self-righteously not ordering my regular Egg McMuffin one morning, and how that actually hurt him.
When a relationship goes south, it’s hard to pinpoint just where or how—especially after a prolonged period of it heading that direction. I was at a loss with this one. Going forward, I didn’t want not to be able to account for myself. If I could remember everything, I thought, I’d be better equipped; I’d be better able to make proper, comprehensive assessments—informed decisions. But my memory had proved itself unreliable, and I needed something better. Writing down food was a way to turn my life into facts: if I had all the facts, I could keep them straight. So the next time this happened I’d know exactly why—I’d have all the data at hand.
by Rachel Khong, Lucky Peach | Read more:
Image: Jason Polan
New Professor
[ed. Professor of Physics! Congratulations, Nate. I'm so proud of you for achieving your goal.
Love, Dad.]
Image via:
[See also: Maybe the hardest nut for a new scientist to crack: finding a job.]
Monday, March 2, 2015
The Troubled History Of The Foreskin
Circumcision has been practised for millennia. Right now, in America, it is so common that foreskins are somewhat rare, and may become more so. A few weeks before the protests, the Centers for Disease Control and Prevention (CDC) had suggested that healthcare professionals talk to men and parents about the benefits of the procedure, which include protection from some sexually transmitted diseases, and the risks, which the CDC describes as low. But as the protesters wanted drivers to know, there is no medical consensus on this issue. Circumcision isn't advised for health reasons in Europe, for instance, because the benefits remain unclear. Meanwhile, Western organisations are paying for the circumcision of millions of African men in an attempt to rein in HIV – a campaign that critics say is also based on questionable evidence.
Men have been circumcised for thousands of years, yet our thinking about the foreskin seems as muddled as ever. And a close examination of this muddle raises disturbing questions. Is this American exceptionalism justified? Should we really be funding mass circumcision in Africa? Or by removing the foreskins of men, boys and newborns, are we actually committing a violation of human rights?
The tomb of Ankhmahor, a high-ranking official in ancient Egypt, is situated in a vast burial ground just outside Cairo. A picture of a man standing upright is carved into one of the walls. His hands are restrained, and another figure kneels in front of him, holding a tool to his penis. Though there is no definitive explanation of why circumcision began, many historians believe this relief, carved more than four thousand years ago, is the oldest known record of the procedure.
The best-known circumcision ritual, the Jewish ceremony of brit milah, is also thousands of years old. It survives to this day, as do others practised by Muslims and some African tribes. But American attitudes to circumcision have a much more recent origin. As medical historian David Gollaher recounts in his book Circumcision: A History of the World's Most Controversial Surgery, early Christian leaders abandoned the practice, realising perhaps that their religion would be more attractive to converts if surgery wasn't required. Circumcision disappeared from Christianity, and the secular Western cultures that descended from it, for almost two thousand years.
Then came the Victorians. One day in 1870, a New York orthopaedic surgeon named Lewis Sayre was asked to examine a five-year-old boy suffering from paralysis of both legs. Sayre was the picture of a Victorian gentleman: three-piece suit, bow tie, mutton chops. He was also highly respected, a renowned physician at Bellevue Hospital, New York's oldest public hospital, and an early member of the American Medical Association.
After the boy's sore genitals were pointed out by his nanny, Sayre removed the foreskin. The boy recovered. Believing he was on to something big, Sayre conducted more procedures. His reputation was such that when he praised the benefits of circumcision – which he did in the Transactions of the American Medical Association and elsewhere until he died in 1900 – surgeons elsewhere followed suit. Among other ailments, Sayre discussed patients whose foreskins were tightened and could not retract, a condition known as phimosis. Sayre declared that the condition caused a general state of nervous irritation, and that circumcision was the cure.
His ideas found a receptive audience. To Victorian minds many mental health issues originated with the sexual organs and masturbation. The connection had its roots in a widely read 18th-century treatise entitled Onania, or the Heinous Sin of Self-Pollution, and All Its Frightful Consequences, in Both Sexes, Considered. With Spiritual and Physical Advice to Those Who Have Already Injur'd Themselves By This Abominable Practice. The anonymous author warned that masturbation could cause epilepsy, infertility, "a wounded conscience" and other problems. By 1765 the book was in its 80th printing.
Later puritans took a similar view. Sylvester Graham associated any pleasure with immorality. He was a preacher, health reformer and creator of the graham cracker. Masturbation turned one into "a confirmed and degraded idiot", he declared in 1834. Men and women suffering from otherwise unlabelled psychiatric issues were diagnosed with masturbatory insanity; treatments included clitoridectomies for women, circumcision for men.
Graham's views were later taken up by another eccentric but prominent thinker on health matters: John Harvey Kellogg, who promoted abstinence and advocated foreskin removal as a cure. (He also worked with his brother to invent the cornflake.) "The operation should be performed by a surgeon without administering anesthetic," instructed Kellogg, "as the brief pain attending the operation will have a salutary effect upon the mind, especially if it be connected with the idea of punishment."
Counter-examples to Sayre's supposed breakthrough could be found in operating theatres across America. Attempts to cure children of paralysis failed. Men, one can assume, continued to masturbate. It mattered not. The circumcised penis came to be seen as more hygienic, and cleanliness was a sign of moral standards. An 1890 journal identified smegma as "infectious material". A few years later, a book for mothers – Confidential Talks on Home and Child Life, by a member of the National Temperance Society – described the foreskin as a "mark of Satan". Another author described parents who did not circumcise their sons at an early age as "almost criminally negligent".
by Jessica Wapner, io9 | Read more:
Image: Flickr user Mararie
Medicating Women’s Feelings
Women are moody. By evolutionary design, we are hard-wired to be sensitive to our environments, empathic to our children’s needs and intuitive of our partners’ intentions. This is basic to our survival and that of our offspring. Some research suggests that women are often better at articulating their feelings than men because as the female brain develops, more capacity is reserved for language, memory, hearing and observing emotions in others.
These are observations rooted in biology, not intended to mesh with any kind of pro- or anti-feminist ideology. But they do have social implications. Women’s emotionality is a sign of health, not disease; it is a source of power. But we are under constant pressure to restrain our emotional lives. We have been taught to apologize for our tears, to suppress our anger and to fear being called hysterical.
The pharmaceutical industry plays on that fear, targeting women in a barrage of advertising on daytime talk shows and in magazines. More Americans are on psychiatric medications than ever before, and in my experience they are staying on them far longer than was ever intended. Sales of antidepressants and antianxiety meds have been booming in the past two decades, and they’ve recently been outpaced by an antipsychotic, Abilify, that is the No. 1 seller among all drugs in the United States, not just psychiatric ones.
As a psychiatrist practicing for 20 years, I must tell you, this is insane.
At least one in four women in America now takes a psychiatric medication, compared with one in seven men. Women are nearly twice as likely to receive a diagnosis of depression or anxiety disorder as men are. For many women, these drugs greatly improve their lives. But for others they aren’t necessary. The increase in prescriptions for psychiatric medications, often by doctors in other specialties, is creating a new normal, encouraging more women to seek chemical assistance. Whether a woman needs these drugs should be a medical decision, not a response to peer pressure and consumerism.
The new, medicated normal is at odds with women’s dynamic biology; brain and body chemicals are meant to be in flux. To simplify things, think of serotonin as the “it’s all good” brain chemical. Too high and you don’t care much about anything; too low and everything seems like a problem to be fixed. (...)
The most common antidepressants, which are also used to treat anxiety, are selective serotonin reuptake inhibitors (S.S.R.I.s) that enhance serotonin transmission. S.S.R.I.s keep things “all good.” But too good is no good. More serotonin might lengthen your short fuse and quell your fears, but it also helps to numb you, physically and emotionally. These medicines frequently leave women less interested in sex. S.S.R.I.s tend to blunt negative feelings more than they boost positive ones. On S.S.R.I.s, you probably won’t be skipping around with a grin; it’s just that you stay more rational and less emotional. Some people on S.S.R.I.s have also reported less of many other human traits: empathy, irritation, sadness, erotic dreaming, creativity, anger, expression of their feelings, mourning and worry.
by Julie Holland, NY Times | Read more:
Image: Christelle Enault
American Democracy is Doomed
America's constitutional democracy is going to collapse.
Some day — not tomorrow, not next year, but probably sometime before runaway climate change forces us to seek a new life in outer-space colonies — there is going to be a collapse of the legal and political order and its replacement by something else. If we're lucky, it won't be violent. If we're very lucky, it will lead us to tackle the underlying problems and result in a better, more robust, political system. If we're less lucky, well, then, something worse will happen.
Very few people agree with me about this, of course. When I say it, people generally think that I'm kidding. America is the richest, most successful country on earth. The basic structure of its government has survived contested elections and Great Depressions and civil rights movements and world wars and terrorist attacks and global pandemics. People figure that whatever political problems it might have will prove transient — just as happened before. (...)
The breakdown of American constitutional democracy is a contrarian view. But it's nothing more than the view that rather than everyone being wrong about the state of American politics, maybe everyone is right. Maybe Bush and Obama are dangerously exceeding norms of executive authority. Maybe legislative compromise really has broken down in an alarming way. And maybe the reason these complaints persist across different administrations and congresses led by members of different parties is that American politics is breaking down.
The perils of presidential democracy
To understand the looming crisis in American politics, it's useful to think about Germany, Japan, Italy, and Austria. These are countries that were defeated by American military forces during the Second World War and given constitutions written by local leaders operating in close collaboration with occupation authorities. It's striking that even though the US Constitution is treated as a sacred text in America's political culture, we did not push any of these countries to adopt our basic framework of government.
This wasn't an oversight.
In a 1990 essay, the late Yale political scientist Juan Linz observed that "aside from the United States, only Chile has managed a century and a half of relatively undisturbed constitutional continuity under presidential government — but Chilean democracy broke down in the 1970s."
The exact reasons why are disputed among scholars — in part because you can't just randomly assign different governments to people. One issue here is that American-style systems are much more common in the Western Hemisphere and parliamentary ones are more common elsewhere. Latin American countries have experienced many episodes of democratic breakdown, so distinguishing Latin American cultural attributes from institutional characteristics is difficult.
Still, Linz offered several reasons why presidential systems are so prone to crisis. One particularly important one is the nature of the checks and balances system. Since both the president and the Congress are directly elected by the people, they can both claim to speak for the people. When they have a serious disagreement, according to Linz, "there is no democratic principle on the basis of which it can be resolved." The constitution offers no help in these cases, he wrote: "the mechanisms the constitution might provide are likely to prove too complicated and aridly legalistic to be of much force in the eyes of the electorate."
In a parliamentary system, deadlocks get resolved. A prime minister who lacks the backing of a parliamentary majority is replaced by a new one who has it. If no such majority can be found, a new election is held and the new parliament picks a leader. It can get a little messy for a period of weeks, but there's simply no possibility of a years-long spell in which the legislative and executive branches glare at each other unproductively.
But within a presidential system, gridlock leads to a constitutional trainwreck with no resolution. The United States's recent government shutdowns and executive action on immigration are small examples of the kind of dynamic that's led to coups and putsches abroad.
There was, of course, the American exception to the problems of the checks-and-balances system. Linz observed on this score: "The uniquely diffuse character of American political parties — which, ironically, exasperates many American political scientists and leads them to call for responsible, ideologically disciplined parties — has something to do with it."
For much of American history, in other words, US political parties have been relatively un-ideological and un-disciplined. They are named after vague ideas rather than specific ideologies, and neither presidents nor legislative leaders can compel back-bench members to vote with them. This has often been bemoaned (famously, a 1950 report by the American Political Science Association called for a more rigorous party system) as the source of problems. It's also, according to Linz, helped avert the kind of zero-sum conflicts that have torn other structurally similar democracies apart. But that diffuse party structure is also a thing of the past.
Gerrymandering Explained
[ed. See also: this report that Republican lawmakers from Arizona have filed an appeal with the Supreme Court (which will be heard today) challenging the state's voter-approved independent redistricting commission and its authority to draw the districts of U.S. House members. A decision striking down the commission probably would doom a similar system in neighboring California, and could affect districting commissions in 11 other states.]
Gerrymandering -- drawing political boundaries to give your party a numeric advantage over an opposing party -- is a difficult process to explain. If you find the notion confusing, check out the chart above -- adapted from one posted to Reddit this weekend -- and wonder no more.
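[ed. A minimal numerical sketch of the idea behind the chart, using made-up numbers rather than anything from the article: the same 60/40 electorate can produce three very different seat counts depending purely on where the district lines fall. "Cracking" spreads the minority party's voters thinly across districts so they win nowhere; "packing" concentrates the majority party's voters into a few districts so their extra votes are wasted.]

from collections import Counter

# Hypothetical electorate: 50 voters, 30 Blue ("B") and 20 Red ("R"),
# drawn into five 10-voter districts in three different ways.

def seats(districts):
    """Return how many districts each party wins (plurality within each district)."""
    wins = Counter()
    for votes in districts:
        wins[Counter(votes).most_common(1)[0][0]] += 1
    return dict(wins)

# Map 1 -- roughly proportional: Blue wins 3 seats, Red 2.
proportional = [["B"] * 10 for _ in range(3)] + [["R"] * 10 for _ in range(2)]

# Map 2 -- Red voters "cracked" thinly across every district: Blue sweeps 5-0.
cracked = [["B"] * 6 + ["R"] * 4 for _ in range(5)]

# Map 3 -- Blue voters "packed" into two districts: Red wins 3-2,
# despite losing the popular vote 20 to 30.
packed = [["B"] * 10, ["B"] * 10,
          ["B"] * 4 + ["R"] * 6,
          ["B"] * 3 + ["R"] * 7,
          ["B"] * 3 + ["R"] * 7]

for name, districts in [("proportional", proportional), ("cracked", cracked), ("packed", packed)]:
    print(name, seats(districts))
# proportional {'B': 3, 'R': 2}
# cracked {'B': 5}
# packed {'B': 2, 'R': 3}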
by Christopher Ingraham, Washington Post | Read more:
Image: N8theGr8
Sunday, March 1, 2015
Back Dorm Boys
[ed. I've been digging around in the Music archives and found this Back Dorm Boys video that I hadn't seen in a long time. They even have their own Wikipedia entry now. Still makes me smile. A few other videos after this.]
[ed. Repost]
[ed. Repost]
When Your Punctuation Says It All (!)
I went out with a guy based on his use of dashes once. Within moments of our first interaction — over text message — I was basically in love.
He didn’t just use the lazy singular dash (“-”) as a pause between his thoughts, or even the more time-consuming double-dash (“--”). Nope. This man used a proper em dash. That is, the kind that required him to hold down the dash button on his iPhone for that extra second, until the “—” appeared, then choose it from among three options. I don’t remember what his messages actually said. But he obviously really liked me.
I’m a writer; it’s natural I’d have a thing for grammar. But these days, it’s as if our punctuation is on steroids.
It’s not just that each of us is more delicately choosing our characters, knowing that an exclamation point or a colon carries more weight in our 140-character world. Or even that our punctuation suddenly feels like hyperbole (right?!?!?!?!?!) because we’ve lost all audible tone.
Those things are true. But it’s also as if a kind of micro-punctuation has emerged: tiny marks in the smallest of spaces that suddenly tell us more about the person on the other end than the words themselves (or, at least, we think they do).
Take the question mark. Recently, a friend I had dinner plans with sent a text to ask “what time” we were meeting. We’d been organizing this meal for weeks; a half-dozen emails back and forth. And yet the question — sans the mark — felt indifferent, almost cold. Couldn’t she at least bother to insert the necessary character?
Of course, had she inserted too many marks, that may have been a problem, too, as there is suddenly a very fine line between appearing overeager (too much punctuation) and dismissive (not enough).
Even the period, once the most benign mark on the punctuation spectrum, now feels aggressive. And the exclamation point is so ubiquitous that “when my girlfriends don’t use an exclamation point, I’m like ‘What’s wrong, you O.K.?’ ” said Jordana Narin, a 19-year-old student in New York.
“Girlfriends” may be a key word there, as women are more likely to use emotive punctuation than men are. Yet lately I’ve tried to rein my own effusiveness in, going as far as to insert additional punctuation into existing punctuation in an effort to soften the marks themselves.
So instead of responding to a text with “Can’t wait!!” I’ll insert a space or two before the mark — “Can’t wait !!” — for that extra little pause. Sometimes I’ll make the exclamation point a parenthetical, as a kind of afterthought (“Can’t wait (!)”). A friend inserts an ellipsis — “Can’t wait … !!” — so, as she puts it, “it’s less intense.”
“At this point, I’ve basically suspended judgment,” said Ben Crair, an editor at the New Republic who recently wrote a column about the new aggression of the period. “You could drive yourself insane trying to decode the hidden messages in other people’s punctuation.”
by Jessica Bennett, NY Times | Read more:
Image: Ron Barrett
Saturday, February 28, 2015
You Are Listening To...
...a soothing mix of police radio chatter and ambient music. Choose from Los Angeles, New York, San Francisco, Chicago, or my personal recommendation, Montréal. French police chat really blends into the music nicely. You may need to adjust the balance of each stream a bit to find the right mix.