Wednesday, July 13, 2011

The Ivy Delusion

Image credit: Yuta Onoda

[ed.  Interesting companion piece to the article following this one.]

by Caitlin Flanagan

Right now, in admissions offices in Cambridge and New Haven and Palo Alto, the teenage children of some of America’s most thoughtful and devoted mothers are coming in for exceptionally close scrutiny—as is, so these women feel, the parenting they have offered their youngsters for the past 18 years. This is the tail end of reading season, when our august universities must turn to what their relentless high-school visiting and U.S. News & World Report boosterism have wrought: a staggering number of requests for an absurdly small number of spots at their schools. Harvard recently announced that this year it is considering an astronomical 35,000 applications for only about 1,640 spaces in the freshman class. The great hope of today’s professional-class parents—whose offspring still make up the majority at elite colleges, no matter how much progress the institutions have made in widening the socioeconomic range of their student bodies—was that the ebbing of the so-called echo boom would save their children from the heartless winnowing. The late 1990s and the 2000s saw an uptick in the number of teenagers in America, and there was a belief, in many quarters, that the increasingly competitive nature of elite-college admissions was a by-product of that demographic fluke. But now, although the number of teens has receded, the percentage of those kids who nurture the dream of attending a selective college continues to skyrocket. And so, for this year’s most accomplished and talented high-school seniors, the reckoning is at hand.

But we were talking about the mothers—the good mothers. The good mothers went to Brown, and they read The Drama of the Gifted Child, and they feel things very deeply, and they love their children in a way that is both complicated and primal, and they will make any sacrifice for them. They know that it takes a lot of time to nurture and guide a child—and also that time is fleeting, and that the bliss of having your kids at home is painfully short-lived—and so most of them have cut back on their professional aspirations in significant ways. The good mothers have certain ideas about how success in life is achieved, and these ideas have been sizzled into their brains by popularizers such as Joseph Campbell and Oprah Winfrey, and they boil down to this: everyone has at least one natural talent (the good mothers call it a “passion”), and creativity, effortless success, and beaucoup dinero flow not from banging your head against the closed door of, say, organic chemistry if you’re not that excited by it, but from dwelling deeply and ecstatically inside the thing that gives you the most pleasure. But you shouldn’t necessarily—or under any circumstances, actually—follow your bliss in a way that keeps you out of Yale. Because Yale is important, too! So important. The good mothers believe that their children should be able to follow their passions all the way to New Haven, Connecticut, and this obdurate belief of theirs is the reason so many of them (Obama voters, Rosa Parks diorama co-creators, gay-rights supporters, champions, in every conceivable way, of racial diversity and tolerance) are suddenly ready to demand restoration of the Chinese Exclusion Act. Because Amy Chua has revealed, in so many blunt and horrifying words, why the good mothers are getting spanked, and why it’s only going to get worse.

You should know that the good mothers have been mad—and getting madder—for quite a while now. The good mothers believe that something is really wrong with the hypercompetitive world of professional-class child rearing, whose practices they have at once co-created and haplessly inherited. The good mothers e-blast each other New York Times articles about overscheduled kids and the importance of restructuring the AP curriculum so that it encourages more creative thinking. They think that the college-admissions process is “soul crushing.” One thing the good mothers love to do—something they undertake with the same “fierce urgency of now” with which my mom used to protest the Vietnam War—is organize viewings of a documentary called Race to Nowhere. Although the movie spends some time exploring the problems of lower-income students, it is most lovingly devoted to a group of neurasthenic, overworked, cracking-at-the-seams kids from a wealthy suburb in Northern California, whom we see mooning around the enormous kitchens of their McMansions and groaning about sleeplessness and stress. It posits that too much homework can give your child stomach pains, chronic anxiety, anhedonia.

Read more:

Harvard and Class

by Misha Glouberman (as told to Sheila Heti)

I grew up in Montreal and went to an upper-middle-class Jewish day school where kids had parents who maybe owned a carpet store or maybe were dentists. And then I went to Harvard for college. And it was pretty weird.

When I applied, I thought it would be great because I would get to meet lots of smart people. Those were the kinds of people I liked to be friends with, and I thought there would be more of them there. That was the main reason I thought it would be a fun place to be. I don’t think I was super ambitious or professional minded or even a very good student.

The thing I figured out soon after I applied was that, on Gilligan’s Island, it wasn’t the Professor who went to Harvard, it was Mr. Howell, the rich man. That was something of a revelation.

It’s funny, because what a lot of people talk about when they talk about going to Harvard is being really intimidated by the place when they arrive. I wasn’t at all intimidated by the place when I arrived—but I was really intimidated after graduating.

I arrived at Harvard from Montreal, which is a pretty fucking hip place to be an eighteen-year-old. I’d been going to bars for a while, and I was in a political theater company that did shows in lofts with homeless people and South American activists. And we went to pubs and got old gay men to buy us drinks. It was a pretty cool, fun, and exciting life for a kid in Montreal. It was a very vibrant place, and young people were really part of the life of the city.

Then when I went to Harvard, the place was full of these nominally smart, interesting people, all of whom at the age of eighteen seemed perfectly happy to live in dormitories and be on a meal plan and live a fully institutional life. And that was completely maddening! This was the opposite of everything I’d hoped for from the environment I’d be in.

By design, the university wants to be an enclosed institution, so you’re required to live on campus, which means that you’re not living in the city. You don’t have a landlord or neighbors or those kinds of things. You’re pretty much required to sign up for the meal plan, which means you don’t interact with people in restaurants or grocery stores or any of that kind of stuff. The drinking age is twenty-one, and it’s strictly enforced in the city but mostly unenforced on campus, which means if you want to drink or go to a party, you can only do that on campus, but if you want to go see a band at a club, you can’t do that.

I spent my first year trying to figure out how to participate in the life of the city in some way, but by the end of my first year I think I gave up because the pull of the university community was so strong and the boundaries were so hard to overcome.

By the end of university, I ended up living somewhere that was considered off campus—a place called the Dudley Co-op. The Dudley Co-op was located in a building that was owned by Harvard. About thirty or forty Harvard students lived there. We did our own cooking and cleaning, but we were on the university phone system and the university did the building maintenance. That’s how fully institutionalized life at Harvard was: even Dudley House, which was the organization that looked after off-campus living, provided university-owned accommodation for people who wanted to live off campus.

There actually was a small percentage of students who genuinely did live off campus—like 1 percent—but you had to get university permission. I think the explanation the university would give is that going to Harvard isn’t just a set of courses, it’s an experience and a community, and they’re interested in people being part of that community, which means living there and participating in what they call the “house system,” the different dorms students live in.

But the end result is that it makes the university into an ivory tower—I mean, incredibly so. It would be one thing if you were out in the woods, but this is Boston. In four years of living in that city I pretty much didn’t come to know anybody who wasn’t affiliated with Harvard. And I’m someone who’s interested in cities and who’s interested in meeting different kinds of people. The university is a completely isolated environment, and the fact that you’re inside a city somehow makes that more insidious and terrible.

All the parties were on campus. So when you went to a party—and that’s what you would do Friday and Saturday night, you would go to a party—the party would be on campus, which means, sort of implicitly, that if you’re a student at the university, you’re welcome, and if you’re not, you’re trespassing. So even at parties—and I went to parties for four years—the average number of people at a given party who weren’t Harvard students was zero. All of this serves to create a very weird, very contained environment.

When I was at university, it shocked me how focused so many people were on their careers, in ways that often seemed pretty narrow. I guess I knew that Harvard attracts very ambitious young people, but I was still surprised. In Montreal I knew a lot of really interesting people doing interesting things, and there was a lot less of that at Harvard than I would have expected. In retrospect it’s not surprising. At a certain level, an institution like that is going to attract people who are very good at playing by the rules.

Read more:

The 17th-Century Breastoration: A Time Before Bras

by Lilli Loofbourow

If you've ever been to a Renaissance Faire (I have), you know that the concept is less Queen Elizabeth and more Don Key-Ho-Tee's Medieval Potlucke WITH BREASTS. Or at least it was 10 years ago when a Ren-friend and I ate shepherd's pie, looked at chain-mail, and — once we'd soaked in enough of the Worlde and its high freckled bosoms — tried some boob-hoisting ourselves.

Putting a corset on is tough, and the instructions I received at the Faire went as follows: Lean down, shove your boobs into it, straighten up, then pop them up so they'll show through the dress. It may or may not surprise you that a) these instructions came from the amiable sales-fellow, and b) I walked around the booth with a nipple on display until my friend came out of her dressing room.

If this story has a moral, it's that cleavage-wrangling is complex. My God! I thought. How did the ladies of yore do it?

I'm finally in a position to find out. Picture it: It's the seventeenth century. Bras don't exist yet. As a typical woman, what do you do?

Option 1: Consult a Reference Work! You might turn to the Ladies' Dictionary, published in 1694. Here's its entry marked “Breasts”:

...how to make them Plump and Round: Breasts that hang loose, and are of an extraordinary largeness, lose their charms, and have their Beauty buried in the grave of uncomeliness, whilst those that are small, plump and round, like two ivory globes, or little worlds of beauty, whereon Love has founded his Empire, command an awful homage from his vassals, captivate the wondering gazer's eyes, and dart warm desires into his Soul, that make him languish and melt before the soft Temptation.

That there's your goal. Now, what to do if you're saddled with large breasts whose beauty is buried in the grave of uncomeliness? The Ladies' Dictionary is here to help.

Therefore to reduce those Breasts that hang flagging out of all comely shape and form, that they may be plump, round and smaller, bind them up close to you with caps or bags that will just fit them, and so let them continue for some nights. Then take carrot-seed, plantain-seeds, aniseeds, fennel-seeds, cumin-seeds, of each two ounces, virgin's honey an ounce, the juice of plantain and vinegar two ounces each. Bruise and mingle them well together. Then, unbinding your breast, spread the composition plaster-wise and lay it on your breasts, binding them up close as before. After two days and two nights, take off the plasters and wash your breasts with white wine and rose-water.

Got that? Basically, fashion a fitted homemade bra out of caps or bags, stuff it with seeds and honey, and marinate in the fruit of your loom for two days.

Then what?

In so doing for twelve or fourteen days together, you will find them reduced to a curious plumpness and charming roundness. Wash them then with water of Benjamin, and it will not only whiten them, but make their azure veins appear in all their intricate meanders, till the Lover in tracing them loses himself.

Whew. Right? Plastic surgeons, take note: This is what those “After” photos you wallpaper restrooms with should convey. (Minus the veins, which you will no doubt recommend cauterizing because circulation is so 1694. Oh, and get better fonts.)

Now that you have your breasts tight, round, and full of blue veins, you might notice that there's a tad too much pink in your skin tone. Tricky, this. If your complexion happens to be on the ruddy side, the Ladies' Dictionary must regrettably advise against the widespread practice of exposing your face and naked breast to the moon at night, “as if the Moon (because pale herself) would make them so, or by spitting in their Faces, scour off the Crimson dye.” This is a silly thing to do, ladies. Dew is moon-spit. It won't wash your color off, and moontanning isn't a thing. So ease up on those half-naked midnight strolls.

An Eye-Opening Adventure in Socialized Medicine

by Steve Silberman

I woke up in a rented room in London in the middle of the night, feeling like my eyes had been packed with hot sand and the lids were somehow glued together. When I pried them apart, the whites of my eyes were an angry crimson.

Maybe it was nothing. I’d been told that the pollen counts in the UK this summer are sky high. A raging heat wave in a city that doesn’t really do air-conditioning (like my gloriously fogbound town of San Francisco) didn’t seem to be helping. But when I squinted in the bathroom mirror, I saw a greenish-white discharge collecting around my tear ducts. This looked like more than a bad case of hay fever.

Then I remembered that one of the cognitive psychologists I’d come to London to interview mentioned that she’d recently had a bad eye infection. I Googled “conjunctivitis.” It dawned on me that the bottle of water I drank in her office may have been a mixed blessing.

But what to do? I was far from home with lots of work to do and no idea how to see a doctor locally. Thankfully, I didn’t have any appointments for a couple of days, and I have health insurance from Kaiser Permanente through my spouse’s employer. But I knew that getting reimbursed for treatment by a doctor outside the Kaiser network could be complex; what about an out-of-country doctor?

When I dialed the 800 number on my Kaiser card to find out what to do, an automated voice from AT&T informed me that I would be billed at the standard international calling rate of $1 a minute. After navigating a maze of call-center prompts, I sat on hold for 15 minutes.

The first Kaiser rep who took my call fired off a barrage of questions. Was I experiencing “blind spots, double vision, floaters, hallucinations, or any other problems” with my vision? Yes — the goopy discharge from my tear ducts was making it hard to see, and I said so. But that turned out to be the wrong answer. The Kaiser rep simply repeated her question in a more brittle tone of voice and added, “Just answer yes or no.”

Yes, I was having problems with my vision, but not “double vision, floaters, or hallucinations.” Judging by the structure of the question, I suspected that it was designed to fish for a different sort of problem than the one I had, such as evidence of entoptic phenomena that might indicate something awry inside the eyeball, or even in the brain. I didn’t want to end up shunted onto the wrong track in the voicemail maze. “Floaters, hallucinations, and double vision, no,” I explained, “but problems with my vision, yes, because the discharge from my tear ducts…”

“Sir,” she cut me off sternly. “These are yes or no questions. Answer either yes or no or I will not be able to help you.” I furiously tried to calculate which falsely binary oversimplifications were the right ones.

Then back to limbo at $1 a minute. Finally an advice nurse picked up. She ran me through a nearly identical gantlet of questions — hadn’t my previous answers been logged into the database? — but unlike the previous insurance rep, the advice nurse could handle nuance. Given the severity of my symptoms, she told me, I should certainly see a doctor right away — as soon as I had secured permission for an out-of-network exam with someone at the member-services line on the other side of my Kaiser card.

It was 2 in the morning in a strange country and my eyes were oozing green goo, but at least I was getting somewhere. I called the other number, navigated another maze of prompts, and waited. Tick, tick, tick.

Thankfully, the member-services rep was both efficient and sympathetic. Of course, she said, it must be upsetting to be having eye problems far from home. I should definitely go to a local clinic. But before she could give me permission to do that, she would have to talk to her supervisor, because she’d never dealt with someone having a medical problem outside the country before. Several minutes passed.

Then, good news from the supervisor — with one caveat. Yes, I should go see a doctor at a local clinic. But because this was all happening out-of-network, I would have to pay out of pocket. As long as I made sure to obtain all the necessary receipts and forms, however, I could submit them when I got home, and Kaiser would “open a case file” on me so I could be reimbursed.

I wondered how much the visit would cost me up front — $200, $500, $1000? The unfavorable exchange rate had already vacuumed out my wallet, just picking up Chunky Hummus Salad wraps and “flat white” coffees at Pret A Manger. But it didn’t matter. My eyes needed help now, and I was almost certainly highly contagious; I didn’t want to pass this mess on to anyone else.

The member-services rep then explained that a Kaiser doctor would be calling me within the next four hours to give me additional information. I asked her gently if the doctor could possibly call in the morning London time, because I was already sleep-deprived and had a lot of work to do the following day. Sorry, she replied, that was just not possible. The doctor would have to call within the four-hour window allotted for my case — even if that meant the phone ringing at 5 in the morning.

Still, I was grateful to finally have permission to seek the care that I desperately needed. I called a number I found on the Web for urgent care in Marylebone, the central London neighborhood where I’d found a semi-affordable place to stay for three weeks. Amazingly, a human being picked up the phone right away — an affable guy with a disarmingly chummy accent and an empathic manner. Yes, yes, of course I should see a doctor right away. Where should they send him?

What? This guy was offering to dispatch someone to examine my eyes immediately in my apartment in the middle of the night?

I couldn’t even remember the last time I’d gotten a house call from a doctor — was it when I had chicken pox in 3rd grade? I expressed my astonishment. The chap on the other end of the line just laughed: I assure you, it’s no problem.

Read more:

Tuesday, July 12, 2011

How Music is Born


Gorillaz


Tangerine Dream


[ed. I prefer this version, but for some reason it's only available for viewing on YouTube.]

The Eight Truths About Weddings (That No One Ever Tells You)

by Melissa Lafsky

Once you decide to have a wedding, there are many, many things to read: etiquette guides, Dos and Don'ts, planning checklists, vendor guides, “inspiration boards,” disaster stories, angry bridesmaid rants ("bitch made me wear PURPLE SHOES!"), even socio-political screeds about the cultural irrelevance of the whole thing. All of these are nice, and all of them are utterly useless.

If you're the one getting married—which I am, in three months, while also attending eight other weddings in as many months due to a hyper-marital zeitgeist (that, as of July 24th, includes New York gays!! Welcome to the madness!!)—a mysterious stupor befalls you. The tales of "bridal nervous breakdowns" have become ingrained in pop culture, "ingrained" meaning "anything that gets its own reality show." Such breakdowns do happen, and they're hardly gender-specific, but these displays of emotional gangrene fail to get at the heart of the nuptial plight.

So where does one go to find a guide to the true sources of wedding-angst? One resource is the wedding industry, that fondant tower of chintzy madness that exists to suck your wallet and self-esteem out through multiple orifices. The industry gets plenty of flak, mostly for its organza-wrapped obfuscation of anything important. But all this hating is silly. Yes, the wedding industry will crack open your skull and pour in gallons of raspberry-hazelnut ganache, and then send you a bill for $15,000. But that's its job. It's absurd to expect people in the industry to tell you the truth about weddings. They're there for one purpose: to sell you shit. Calling them manipulative capitalist assholes (ahem, Rebecca Mead) isn’t solving the problem; it’s simply blaming the addiction on the dealer.

The truth about weddings was once something we all figured out for ourselves as we made our way across the glurpy morass of the engagement tar fields. Until now! Here is your look into the things no one ever tells you about weddings (but are nonetheless true).

1. WEDDINGS ARE EMOTIONAL RECKONINGS.

Have you dealt with your issues? I’m not talking about a few months in therapy and the occasional Xanax on a bad day—I’m talking about really digging in, sitting under the Bodhi tree, and dealing with all the nasty icky hurts and fears and angers that have burned your face and clamped your guts since you were five. If you have never once taken a hard look at what really triggers you emotionally, and figured out a way to release that trigger, you're in for a shock. Because ALL of your submerged emotions will rear their Gorgon heads during the process of planning a wedding. Prepare to be confronted.

First, there’s your family. Ahh, family. The one group with perma-instant access to every emotional trigger in your psyche ("Of course your mother knows how to push all your buttons!" a matriarch once told me. "She created them!!"). Do you still resent your mom for that "Honey, your thighs don’t need that ice cream!" comment in 8th grade? Clinging to the last vestiges of anger at your dad for never kissing you goodnight or reading your term papers? Secretly seethe at your brother for moving far away and leaving you to deal with the full brunt of your parents' needs? Lucky you! You're going to experience all of it again, since each of these people will be intimately involved in your Big Megaspecial Day (whether you invite them or not). If you do not give up any and all familial anger, it will seize you in its talons and tear out your liver at least once a day, Prometheus style. You will find yourself shrieking over the fact that your mom disapproves of your choice of chair covers ("You never liked my clothes in junior high!!! Wail Sob!") or that your dad suggested "Psycho Killer" as a father-daughter dance ("You spent my childhood in the office and now this!!"). Any unresolved issue, annoyance or pin in the side that you’ve had since, well, birth will now be a part of your daily life. And we haven’t even gotten to the fact that you may be asking them for money!

Tennis Ball Pajamas

[ed.  Really?  Two years?]

by Anahad O'Connor

For some people who snore, a slight tweak in sleeping position — lying on one side instead of the back — can lead to a better night’s rest. Yet staying put in that position, while wrapped in slumber, is not always an easy feat.

One of the oldest and simplest solutions involves a tennis ball, which is taped or sewn into the back of the pajamas to prevent a snorer from rolling onto his or her back at night. The technique is widely recommended by sleep experts, but studies have found it may not work for many chronic snorers.

In 2009 a team of researchers studied whether this trick could reduce snoring in 67 people with obstructive sleep apnea, which causes snoring and breathing interruptions throughout the night. The patients had an average of 30 breathing pauses per hour of sleep, which climbed to over 50 interruptions when they were on their backs but fell to roughly 14 when they slept on their sides. They were taught to use the tennis ball technique, then followed for an average of over two years.

At the end of the study, which was published in The Journal of Clinical Sleep Medicine, the researchers found that most patients gave it up. Less than 10 percent still used the technique. Those who stopped said that it was ineffective or caused backaches, or that the ball moved around too much, among other problems.

For those in need of a more promising strategy, devices that provide continuous positive airway pressure, or C.P.A.P., help keep the airways open and are extremely effective. Some doctors also offer noninvasive treatments that tighten the throat tissue and improve breathing, taking the roar out of your snore.

THE BOTTOM LINE

Research shows that for many people, the tennis ball trick is not a very effective anti-snoring technique.


War Without Humans

by Barbara Ehrenreich

For a book about the all-too-human “passions of war,” my 1997 work Blood Rites ended on a strangely inhuman note: I suggested that, whatever distinctly human qualities war calls upon -- honor, courage, solidarity, cruelty, and so forth -- it might be useful to stop thinking of war in exclusively human terms. After all, certain species of ants wage war and computers can simulate “wars” that play themselves out on-screen without any human involvement.

More generally, then, we should define war as a self-replicating pattern of activity that may or may not require human participation. In the human case, we know it is capable of spreading geographically and evolving rapidly over time -- qualities that, as I suggested somewhat fancifully, make war a metaphorical successor to the predatory animals that shaped humans into fighters in the first place.

A decade and a half later, these musings do not seem quite so airy and abstract anymore. The trend, at the close of the twentieth century, still seemed to be one of ever more massive human involvement in war -- from armies containing tens of thousands in the sixteenth century, to hundreds of thousands in the nineteenth, and eventually millions in the twentieth century world wars.

It was the ascending scale of war that originally called forth the existence of the nation-state as an administrative unit capable of maintaining mass armies and the infrastructure -- for taxation, weapons manufacture, transport, etc. -- that they require. War has been, and we still expect it to be, the most massive collective project human beings undertake. But it has been evolving quickly in a very different direction, one in which human beings have a much smaller role to play.

One factor driving this change has been the emergence of a new kind of enemy, so-called “non-state actors,” meaning popular insurgencies and loose transnational networks of fighters, none of which are likely to field large numbers of troops or maintain expensive arsenals of their own. In the face of these new enemies, typified by al-Qaeda, the mass armies of nation-states are highly ineffective, cumbersome to deploy, difficult to maneuver, and from a domestic point of view, overly dependent on a citizenry that is both willing and able to fight, or at least to have their children fight for them.

Yet just as U.S. military cadets continue, in defiance of military reality, to sport swords on their dress uniforms, our leaders, both military and political, tend to cling to an idea of war as a vast, labor-intensive effort on the order of World War II. Only slowly, and with a reluctance bordering on the phobic, have the leaders of major states begun to grasp the fact that this approach to warfare may soon be obsolete.

Consider the most recent U.S. war with Iraq. According to then-president George W. Bush, the casus belli was the 9/11 terror attacks. The causal link between that event and our chosen enemy, Iraq, was, however, imperceptible to all but the most dedicated inside-the-Beltway intellectuals. Nineteen men had hijacked airplanes and flown them into the Pentagon and the World Trade Center -- 15 of them Saudi Arabians, none of them Iraqis -- and we went to war against… Iraq?

Military history offers no ready precedents for such wildly misaimed retaliation. The closest analogies come from anthropology, which provides plenty of cases of small-scale societies in which the death of any member, for any reason, needs to be “avenged” by an attack on a more or less randomly chosen other tribe or hamlet.

Why Iraq? Neoconservative imperial ambitions have been invoked in explanation, as well as the American thirst for oil, or even an Oedipal contest between George W. Bush and his father. There is no doubt some truth to all of these explanations, but the targeting of Iraq also represented a desperate and irrational response to what was, for Washington, an utterly confounding military situation.

We faced a state-less enemy -- geographically diffuse, lacking uniforms and flags, invulnerable to invading infantries and saturation bombing, and apparently capable of regenerating itself at minimal expense. From the perspective of Secretary of Defense Donald Rumsfeld and his White House cronies, this would not do.

Since the U.S. was accustomed to fighting other nation-states -- geopolitical entities containing such identifiable targets as capital cities, airports, military bases, and munitions plants -- we would have to find a nation-state to fight, or as Rumsfeld put it, a “target-rich environment.” Iraq, pumped up by alleged stockpiles of “weapons of mass destruction,” became the designated surrogate for an enemy that refused to play our game.

The effects of this atavistic war are still being tallied: in Iraq, we would have to include civilian deaths estimated at possibly hundreds of thousands, the destruction of civilian infrastructure, and devastating outbreaks of sectarian violence of a kind that, as we should have learned from the dissolution of Yugoslavia, can readily follow the death or removal of a nationalist dictator.

But the effects of war on the U.S. and its allies may end up being almost as tragic. Instead of punishing the terrorists who had attacked the U.S., the war seems to have succeeded in recruiting more such irregular fighters, young men (and sometimes women) willing to die and ready to commit further acts of terror or revenge. By insisting on fighting a more or less randomly selected nation-state, the U.S. may only have multiplied the non-state threats it faces.

Unwieldy Armies

Whatever they may think of what the U.S. and its allies did in Iraq, many national leaders are beginning to acknowledge that conventional militaries are becoming, in a strictly military sense, almost ludicrously anachronistic. Not only are they unsuited to crushing insurgencies and small bands of terrorists or irregular fighters, but mass armies are simply too cumbersome to deploy on short notice.

In military lingo, they are weighed down by their “tooth to tail” ratio -- a measure of the number of actual fighters in comparison to the support personnel and equipment the fighters require. Both hawks and liberal interventionists may hanker to airlift tens of thousands of soldiers to distant places virtually overnight, but those soldiers will need to be preceded or accompanied by tents, canteens, trucks, medical equipment, and so forth. “Flyover” rights will have to be granted by neighboring countries; air strips and eventually bases will have to be constructed; supply lines will have to be created and defended -- all of which can take months to accomplish.

Read more:


Monday, July 11, 2011


The Last Guy Standing

by Jonathan Goldstein

On Saturday afternoon at 2 o'clock, the staff of Forest Trace, a retirement community just outside Fort Lauderdale, Fla., clears aside the tables and chairs in the foyer of the main building to create a circle for the women and men to dance, though when I say the women and men, I mainly mean the women and Hy Kaplan.

When I walk into the lobby at 2:30, Kaplan, 93, is twirling Thelma Kahn in the middle of a circle of two dozen watchful women in wicker chairs. It is a scene of ethereal beauty. The couple dance among tall white pillars, and sunlight streams in through the skylight high above, giving Kahn's puffy white hair a halo. After a few more twirls, Kaplan returns Kahn to her chair and approaches the next lucky lady.

This is how it works: Kaplan escorts each partner to the center of the circle, where, depending on the song and Kaplan's mood, they will fox trot, waltz, tango or even merengue. As the afternoon wears on, Kaplan's white-leather loafers bounce gracefully about the carpet, and no matter how good or bad his partner, he always demonstrates a little showmanship, throwing in a Westchester step here and there, doing other little fancy things with his feet that I don't know the name for. When the song is over, he extends his arms to the next woman down the line, who always accepts them. There's something in Kaplan's manner that makes it seem as if he has a job to do, as if he's unloading a truck of women in boxes whom he must dance with one at a time before the quitting whistle blows.

A man named Big Nick wears a white suit and plays a white baby grand piano. He belts out tunes like "Hava Nagila" and "It Had to Be You," and when he hits the opening notes of "Bye Bye Blackbird," a hushed chorus of women's voices chimes in.

I ask the woman seated in front of me if Kaplan is the best dancer here.

"Well," she says, "he doesn't have much competition."

There are more than twice as many women as men at Forest Trace. All a man has to do is stay alive, and he's guaranteed a full dance card. A couple of these men sit with the women, watching with a sort of aristocratic indifference as Kaplan dances. Simply because they are men, they have their choice of women, but even the casual observer can see that they are a bunch of sleepy yellow-panted Potsies and practical-walking-shoed Ralphs, while around here, Hy Kaplan is the Fonz.

Forest Trace, home to more than 400 seniors, is all about leisure, and as such, a kind of courtlike behavior has emerged here, full of intrigues and legends and gossip. It's the kind of thing you think you're only going to live through once, in high school.

"It's like Peyton Place here," says Bea Utal, who is sitting in the foyer. "There are so many affairs." Utal tells me the story of how a Forest Trace couple in their 90's were found naked in bed together. It seems that one of them, during the throes of passion, accidentally pulled the emergency cord above the bed.

Read more:


Two Decades of the Web: A Utopia No Longer

by Evgeny Morozov

The internet is a child with many fathers. It is an extremely complex multi-module technology and each module—from communication protocols to browsers—has a convoluted history. The internet’s earliest roots lie in the rise of cybernetics during the 1950s. Later breakthroughs included the invention of packet switching in the 1960s, a novel way for transmitting data by breaking it into chunks. Various university and government networks began to appear in the early 1970s, and were interlinked in the 1980s. The first browsers came on line in the early 1990s—20 years ago this August.
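
[ed. For readers who have never seen the idea in action, here is a minimal sketch of packet switching in Python. It is purely illustrative, with invented helper names, and is not any real network protocol: a message is broken into numbered chunks that can travel and arrive in any order and still be reassembled.]

import random

def to_packets(message, size=8):
    # Split a message into (sequence_number, chunk) pairs.
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    # Sorting by sequence number restores the original text,
    # no matter what order the packets arrived in.
    return "".join(chunk for _, chunk in sorted(packets))

packets = to_packets("Packets may take different routes and still arrive intact.")
random.shuffle(packets)  # simulate out-of-order delivery
print(reassemble(packets))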

Many seemingly unrelated developments in the computer industry played an important role. The idea of personalised, decentralised and playful computing was being advanced by the likes of Apple and Microsoft in the 1970s. In contrast, IBM’s idea of computing was of an expensive, centralised and institutional activity. If this latter view had prevailed, the internet might have never developed beyond email, which would probably have been limited to academics and investment bankers. That your mobile phone moonlights as a computer is not the result of inevitable technological trends, but the outcome of a deeply ideological and now almost forgotten struggle between two different visions of computing.

Much of the credit for the technical advances of the internet goes to individuals such as Vint Cerf, creator of the first inter-network protocol, which helped to unify the numerous pre-internet networks; David D Clark, who helped to theorise the “end-to-end” principle, the precursor to the modern concept of “net neutrality”; and Tim Berners-Lee, who invented the world wide web.

But studying the history of the internet is impossible without studying the ideas, biases, and desires of its early cheerleaders, a group distinct from the engineers. This group included Stewart Brand, Kevin Kelly, John Perry Barlow, and the crowd that coalesced around Wired magazine after its launch in 1993. They were male, California-based, and had fond memories of the tumultuous hedonism of the 1960s.

These men emphasised the importance of community and shared experiences; they viewed humans as essentially good, influenced by rational deliberation, and tending towards co-operation. Anti-Hobbesian at heart, they viewed the state and its institutions as an obstacle to be overcome—and what better way to transcend them than via cyberspace? Their values had profound effects on the mechanics of the internet, not all of them positive. The proliferation of spam and cybercrime is, in part, the consequence of their failure to predict what might happen as a result of the internet’s open infrastructure. The first spam message dates back to 1978; now, 85 per cent of all email traffic in the world is spam.

Perhaps the cheerleaders’ greatest achievement was in wresting dominance of the internet from the founding engineers, whose mentality was that of the Cold War. These researchers greatly depended on the largesse of the US department of defence and its nervous anticipation of a nuclear exchange with the Soviet Union. The idea of the “virtual community”—the antithesis of Cold War paranoia—was popularised by the writer and thinker Howard Rheingold. The term arose from his experiences with Well.com, an early precursor to Facebook.

But this cyber-boosterism was not without a serious side. Figures such as Nicholas Negroponte, co-founder of the MIT Media Laboratory and the spiritual leader of the “One Laptop per Child” movement, Bill Gates of Microsoft, and Esther Dyson, the commentator and entrepreneur, helped to assure the public that the internet was not just a hangout for Bay Area hippies—it was also a serious place for doing business. And as the cyber-pundits kept promising, it was also a place for “getting empowered,” an attitude that made it a good fit for the broader neoliberal agenda of the 1990s.

Read more:
Helmut Newton