Saturday, August 31, 2013

Jackson Browne/David Lindley


Academy Fight Song

This essay starts with utopia—the utopia known as the American university. It is the finest educational institution in the world, everyone tells us. Indeed, to judge by the praise that is heaped upon it, the American university may be our best institution, period. With its peaceful quadrangles and prosperity-bringing innovation, the university is more spiritually satisfying than the church, more nurturing than the family, more productive than any industry.

The university deals in dreams. Like other utopias—like Walt Disney World, like the ambrosial lands shown in perfume advertisements, like the competitive Valhalla of the Olympics—the university is a place of wish fulfillment and infinite possibility. It is the four-year luxury cruise that will transport us gently across the gulf of class. It is the wrought-iron gateway to the land of lifelong affluence.

It is not the university itself that tells us these things; everyone does. It is the president of the United States. It is our most respected political commentators and economists. It is our business heroes and our sports heroes. It is our favorite teacher and our guidance counselor and maybe even our own Tiger Mom. They’ve been to the university, after all. They know.

When we reach the end of high school, we approach the next life, the university life, in the manner of children writing letters to Santa. Oh, we promise to be so very good. We open our hearts to the beloved institution. We get good grades. We do our best on standardized tests. We earnestly list our first, second, third choices. We tell them what we want to be when we grow up. We confide our wishes. We stare at the stock photos of smiling students, we visit the campus, and we find, always, that it is so very beautiful.

And when that fat acceptance letter comes—oh, it is the greatest moment of personal vindication most of us have experienced. Our hard work has paid off. We have been chosen.

Then several years pass, and one day we wake up to discover there is no Santa Claus. Somehow, we have been had. We are a hundred thousand dollars in debt, and there is no clear way to escape it. We have no prospects to speak of. And if those damned dreams of ours happened to have taken a particularly fantastic turn and urged us to get a PhD, then the learning really begins.

College and Mammon Both

Go back to the beginning, back to the days when people first understood a character-building college diploma to be the ticket to middle-class success. We would forge a model republic of citizen-students, who would redeem the merit badges of academic achievement for spots in the upper reaches of corporate capitalism. The totems of the modern American striver were to be the University Credential and the Corner Office, and prosperity would reward the ablest.

And so the story remains today, despite everything that has happened in the realms of the corporation and the university. We might worry from time to time about the liberal professors who infest the academy, but school is still where you go to “write your destiny,” to use President Obama’s 2010 description of education generally. Go to college, or else your destiny will be written by someone else. The bachelor’s degree that universities issue is a “credential” that’s “a prerequisite for 21st century jobs,” says the White House website. Obama himself equates education with upward mobility—more schooling equals more success—as well as with national greatness. “The kinds of opportunities that are open to you will be determined by how far you go in school,” he declared a few years ago.
"In other words, the farther you go in school, the farther you'll go in life. And at a time when other countries are competing with us like never before, when students around the world are working harder than ever, and doing better than ever, your success in school will also help determine America's success in the twenty-first century."
This is commonplace and unremarkable to the point of being utterly hackneyed. Everyone says this. It is obvious. Thomas Friedman, the New York Times foreign affairs columnist who has refashioned himself into the Lord Protector of Learning in recent years, says the same thing, constantly: you’d better have the schooling and the skills that the entrepreneurial class demands if you want to make even a minimal living. The higher education mantra is possibly the greatest cliché in American public life.

And so the dreams proliferate. Education is the competitive advantage that might save our skins as we compete more and more directly with China and Vietnam and the Philippines, the journalists say. Education is what explains income inequality, chime the economists, and more education is what will roll it back. In fact, education is just about the only way we can justify being paid for our work at all; it is the only quantifiable input that makes us valuable or gives us “skills.”

Quantifiable, yes, but only vaguely. No one really knows the particular contents of the education that is supposed to save us. It is, again, a dream, a secret formula, a black box into which we pour money and out of which comes uplift or enrichment or wish-fulfillment. How a college education manages to do these marvelous things—Is it calculus? Is it classics?—is a subject of hot controversy. All we know for sure is that people who go to college are affluent; it follows naturally that if you send more people to college, you will have yourself a more affluent country.

by Thomas Frank, The Baffler |  Read more:
Image:Spencer Walts

Pros and Cons


The handball courts loom high over the dirt running path in Forest Park. The last time anyone painted the twenty-foot-high wooden boards they were a soft canary yellow; today they're a sun-bleached gray. In lush green surroundings, the handball courts are a stern sight, rising from the lawn like industrial ruins.

The players, when they start to arrive around noon by foot or bicycle, show up with a racquetball and a pair of gloves. Rarely do they schedule games ahead of time.

"Somebody turned over a rock," a man calls out by way of greeting as the courts fill up on a recent Monday afternoon.

The casual, pick-up-game style of play doesn't tend to work for Jerry Raymond Jones, better known as Junior. He paces impatiently on the baseline, a cell phone to his ear, trying to get a friend to the courts for a game of singles.

The sun is high, and there's a group of players sitting in the shade; none is interested in going toe-to-toe with the sinewy Jones.

"I don't fuck with Junior," one chuckles.

"I ain't gonna play you no singles," another snorts. "You think I'm crazy?"

Jones, a fair, boyish-looking 29-year-old with a boxer's nose, grins immodestly. "I'm a flat-wall player, for real," he says. "But I'm good at this, too."

Jones learned flat-wall (single-wall handball) in a place where flat walls are abundant: prison. Handball is one of the few sports allowed, and even encouraged, in many federal and state penitentiaries, where bats and racquets are out of the question.

According to Forest Park Handball Club president R.P. Murphy, Jones is one of the best in the core group of about 100 who come out regularly.

"He's a fantastic young player," says Murphy. "When they are locked up like that, they really get a chance to really develop their game."

Regulars here estimate that about half of the players come to Forest Park after learning the game in prison. Though the handballers often don't know one another's last names (when first names double up, a racial prefix distinguishes them: "White Don" and "Black Don," for example), the jailhouse stories tend to trickle out during sideline conversations.

"I thought it was just a prison game," says Ram Burrows, who was released in 2009 for a drug sentence. "Then I came out and met these guys who've been playing since the '60s."

As Jones demonstrates his serve against the court's high wall, it's not difficult to picture him behind the concrete barriers at Greenville Federal Correctional Institution in Illinois.

"This really changed my life, for real," he says.

The fact that Illinois and Missouri's prisons have become a farm team of sorts for Forest Park explains only part of why the crowd here — sometimes 30 or 40 deep, drinking beers, smoking, shelling peanuts between games — stands out compared to the preppy joggers trotting past. Beyond the former inmates, the courts have always attracted an eclectic mix: restaurateurs, doctors, lawyers, Imo's delivery drivers, construction workers, entrepreneurs, prison guards and the unemployed. Forest Park even (very occasionally) lures the man some consider the greatest handballer to ever live, St. Louis' own David Chapman. And no matter what their background, handballers universally describe the game the exact same way:

"It's an addiction," says Terry Huelsman, the owner of the Break Billiards in Cahokia, Illinois. "It's a poor man's country club."

But among so much openness and camaraderie, the players also keep secrets. This is a place where the men (there is currently only one regular female player) go to lose themselves. The things they keep private vary from tales of failed business ventures to chemical dependencies to violent crime.

Three decades ago the handball community in Forest Park was forever changed when one of its own was gunned down as he left the courts. Today the man's killer is a frequent visitor to the Forest Park courts, though he hides his identity from the handball players who continue to tell the story of the 1979 murder in almost mythic terms. But more on that later.

"Most of the people there are looking for an escape. There's a lot of damaged people out there," confirms Rick Nelson, a retired insurance broker and Forest Park regular of eight years. "It's really hard to think about something else when you're whacking a blue ball against the wall."

by Jessica Lussenhop, Riverfront Times |  Read more:
Image:Jennifer Silverberg

Kangaroos propel themselves with powerful hind legs. Australia, December 1926. Photograph by Wide World Photos Inc.
via:

Kishin Shinoyama - Sans Titre, 1968.
via:

Why Pianos and Monkeys Can Never Really Play the Blues

One of the last things you’d expect to see at a physics conference is a physicist on stage, in a dapper hat, pounding out a few riffs of the blues on a keyboard. But that’s exactly what University of Illinois professor J. Murray Gibson did at the recent March meeting of the American Physical Society in Baltimore. Gibson has been doing these wildly popular demonstrations for years to illustrate the intimate connection between music, math, and physics.

While there is a long tradition of research on the science of acoustics and a noted connection between music and math in the brain, science and math have also influenced the evolution of musical styles themselves. For thousands of years, Western music was dominated by the diatonic Pythagorean scale, which is built on an interval (the difference in pitch between two notes) known as a perfect fifth, in which the higher note vibrates at a frequency exactly 50 percent higher than the lower note's. Anyone who's seen The Sound of Music probably gets the idea of the perfect fifth, and can likely sing along with Julie Andrews: "Do, a deer, a female deer…." If you start on one note and keep going up by perfect fifths, you trace out a musical scale, the alphabet for the language of music. While a musical scale built like that includes a lot of ratios of whole numbers (like 3:2, the perfect fifth itself), it has a fatal flaw: it can't duplicate another keystone of music, the octave, in which one note is exactly double the frequency of the lower note. Contrary to Andrews' lyrics, the scale doesn't really bring us back to "Do."

To bring the fifth and the octave together in the diatonic Pythagorean scale, various versions of the same interval were forced to be different sizes in different parts of the scale; one was so badly out of tune it was called the "wolf fifth," and composers avoided it entirely. This meant that a piece of music composed in the key of E sounded fine on a harpsichord tuned to the key of E but dreadful on one tuned to D. It also made it difficult to change keys within a single composition; you can't really re-tune a piano mid-performance. Johann Sebastian Bach, among others, chafed at such constraints.
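The arithmetic behind that mismatch is easy to check: twelve perfect fifths (ratio 3:2) should ideally land exactly seven octaves up, but they overshoot by a small factor known as the Pythagorean comma, which tuners historically had to dump somewhere in the scale. A quick sketch in Python:

```python
# Stack twelve perfect fifths (3:2) and compare against seven octaves (2:1).
# The leftover factor is the Pythagorean comma; absorbing it in a single
# interval is what produced the out-of-tune "wolf fifth."
twelve_fifths = (3 / 2) ** 12      # ~129.746
seven_octaves = 2 ** 7             # 128 exactly
comma = twelve_fifths / seven_octaves

print(f"Twelve fifths:     {twelve_fifths:.3f}")
print(f"Seven octaves:     {seven_octaves}")
print(f"Pythagorean comma: {comma:.4f}")   # ~1.0136, roughly a quarter of a semitone sharp
```

Because 3 and 2 share no common factors, no number of fifths ever lands exactly on an octave, so the comma can be shuffled around but never eliminated.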

Thus were born the "well-tempered" scales, in which each appearance of an interval was tweaked so that it was never far off from the ideal size or from other versions of the same interval, letting composers and performers switch easily between keys. Bach used such a scale to compose some of the most beautiful fugues and cantatas in Western music. This approach eventually led to the equal temperament scale, the one widely used today, in which every interval but the octave is slightly off from a perfect whole-number ratio, but intervals are entirely consistent and each step in the scale is exactly the same size.

In the 20th century, musicians like Jelly Roll Morton and ragtime composer Scott Joplin wanted to incorporate certain African influences into their music—namely, the so-called “blue notes.” But no such keys existed on the piano; when in the key of C, one major blue note falls somewhere between E-flat and E. So blues pianists started crushing the two notes together at the same time. It’s an example of “art building on artifacts,” according to Gibson. That distinctive bluesy sound is the result of trying to “recreate missing notes on the modern equal temperament scale”: In more traditional scales, the interval called a third represents a frequency ratio of 5/4; and indeed in the key of C, a true third lies between E-flat and E.
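The claim that a true third falls in the crack between two piano keys can be checked numerically. In equal temperament every semitone multiplies frequency by 2^(1/12); a short sketch, taking the key of C with C normalized to 1:

```python
# In equal temperament each semitone is a factor of 2**(1/12).
# Compare the just-intonation major third (5/4) against the equal-tempered
# E-flat (3 semitones) and E (4 semitones) above C.
SEMITONE = 2 ** (1 / 12)

e_flat = SEMITONE ** 3   # ~1.1892
e_nat = SEMITONE ** 4    # ~1.2599
just_third = 5 / 4       # 1.25 exactly

print(f"E-flat: {e_flat:.4f}, just third: {just_third}, E: {e_nat:.4f}")
# The 5/4 third lands between the two keys: the note a blues pianist can
# only approximate by crushing E-flat and E together.
assert e_flat < just_third < e_nat
```

The gap is small, about 14 cents below the equal-tempered E, but it is audible, which is why the crushed-note workaround became a stylistic signature rather than a mere compromise.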

by Jennifer Ouellette, Nautilus |  Read more:
Image via:

Paul Klee, Tomcat’s Turf, 1919.
via:

Of Monsters and Men


Friday, August 30, 2013

The Burdened Walk


I remembered that, once, he had looked as though he walked on air. He had looked as though his feet never touched the ground. He had looked as though his club managed to strike the ball perfectly within a private reality. Even his divots had looked cleanly cut as they sailed through the clear air. You could have used one of them for a welcome mat. Once upon a time, I remembered, Tiger Woods had looked as though he played golf in a self-contained universe that he carried around with him. I remembered all this as I crouched behind the green on the 13th hole of the Oak Hill Country Club in Rochester, New York, on Sunday afternoon, and watched Tiger Woods, who was standing in the shade a little ways down the fairway and rotating his upper body to the left and to the right, stretching his back muscles.

Jesus, I thought to myself, that's something I do.

In fact, I always do it before I swing a club. I don't know that it does me any good. Very often, I do it as a distraction and, perhaps, as a kind of preemptive alibi; that way, when the ball goes where it's not supposed to go, which is very often, I have established that I have termites eating my spine or something. Now there was Tiger Woods, who used to look as though he were made of electrified wire, cranking up the sacroiliac the way that I do. And, yes, he'll be 38 this December, but there was a time in which he was so young that he looked ageless, a time in which the future blended so seamlessly with the present that the future looked as inevitable and predictable as the past. I met him then, and the aftermath was somewhat unusual, and this was the first time we'd been in the same area code since the afternoon we had spent together, and he'd had his picture taken, and he'd told some jokes, and had wondered whether or not women followed basketball players because they thought black men had larger penises, and now he was down under a tree, doing the same back exercises that I do. He knocked it a little ways past the hole, drained a putt coming back, and ground out another par.

"Having a chance on the back nine on Sunday, I can live with that," he said later, after flogging an even-par 70 out of the course and finishing even for the weekend, another major championship slipping away. And Jack Nicklaus's record of 18 major championships, which once seemed so easily within his grasp, now slips a little deeper into the mists of an uncertain future. "It's always frustrating going out there, and I'm 3 over today, got to 7 [over], and I'm grinding my tail off coming in just to shoot even par for the day. And I'm nowhere in it."

It was Woods himself who made the pursuit of Nicklaus's 18 majors the Mount Everest of his career, so it's hard to muster up much sympathy for him if he's getting a bit winded in the middle of the North Col. He has won five times on the Tour this year, most recently burying the field just a week earlier. He is the no. 1 player in the world. None of those things matters because the PGA Championship is a major, and he did not come close to winning it, and that is going to be the way his career will be defined no matter how many times he tears it up in Southern California or rural Ohio.

(The cynics in the audience wondered why Tiger couldn't just join the rest of the golfing world and pretend that the PGA Championship isn't really a major but, rather, a John Deere Classic jumped up with historical resonance. Walter Hagen, as the story goes, once left the Wanamaker Trophy in a taxicab and it went missing for years. That was the last remarkable thing that happened at a PGA Championship.)

He walks a burdened walk now, even when it is going well. He walks the same way, above par or below, birdie or bogey. He birdies and he tips his cap, but his head is down. He talks to the ball more when it is in flight — "Get right!" "Down, DOWN!" — than he once did. By contrast, on Saturday, Phil Mickelson had a round so bad he should have been escorted off Oak Hill by the EPA, and he looked like he was having more fun than most of the gallery was. His steps were light and his smile was easy. He did not walk the burdened walk. But he did blow town quickly.

Of course, Mickelson won the British Open a couple weeks ago, and he's not the guy who defined his career success by how many majors he won. Woods was never effervescent, even in the glorious heart of his young career, but he didn't look the way he does now, coming up the fairway toward the green like an aging farmer coming to work in fields he knows are burnt and fallow but remembers with fondness and with pain the verdancy they once had.

by Charles P. Pierce, Grantland |  Read more:
Image: Charlie Niebergall, AP/Photo

Internment Camp

For a very brief period not too long ago I was the “chief of research” at a glossy yet rugged men’s lifestyle magazine. An industry darling, this “practical guide to the sensory thrills and psychological rewards of an active physical life” (as its 1995 National Magazine Award write-up swooned), was one of the most celebrated and award-laden start-ups in recent memory. As they say in the industry, Men’s Journal was “a very hot book.”

Not coincidentally, it was also an advertiser’s wet dream—a place where we took press releases at their word, where we re-shot photos for “personal grooming” stories because the toothbrushes didn’t look “exciting” enough, and where being a “complete guide for high-performance living” (we used this phrase seriously) meant giving lavish coverage to every sexy consumer product we could get our comp-crazy hands on. In the pages of this morally bankrupt advertorial, this himbo of a magazine, you could, any given month, learn that speed-skiing was not only fun but fulfilling (“Courage wasn’t what would propel me down Willamette. Innocence. I would become innocent.”); read about the religious significance of mountain-biking equipment (“There’s a Zen-like mystery about Giro’s new Helios helmet.”); be the first to know that this particular style of Nikes was much better than the one we said was the best ever a month ago (this one uses aircraft tubing!); and discover all the reasons why Howie Long is a really good actor.

But do not be impressed by the lofty title I held there. "Research chief" was pure euphemism for "the-fact-checker-whose-head-will-roll-if-anything-goes-wrong." In charge of the "legal invulnerability and factual accuracy" of the magazine, I spent the bulk of my days determining whether octopi have pancreases (they don't), what the hell "aircraft tubing" actually is (nobody knows), and whether ex-Oakland Raider wide receiver Warren Wells would sue us for calling him "compulsively felonious" (playing it safe, we ultimately cut the "compulsively" and never heard from him).

I was also partly in charge of finding interns to send our faxes, answer our phones, and, among other sundry responsibilities, go shopping for the products in photo shoots that we couldn’t get comped. Compared to fact-checking, hiring interns was difficult stuff. Not because no one was willing, mind you. On the contrary, I was spoiled for choice. The applicants would walk in, these college kids, recent graduates, and grad students, always punctual and always white, sheepish but confident, polite, and well-fragranced. They would hand me clips from their school newspapers while I looked over their résumés, which always went something like this:

Interview Magazine
May ’95 to Sept ’95
Summer Intern

CBS News
Oct ’94 to May ’95
Fall Intern

The Village Voice
May ’94 to August ’94
Summer Intern

“Very impressive,” I would say. By my quick calculations they had contributed, conservatively, five or six thousand dollars worth of uncompensated work to various media conglomerates. I would tell them that they surely have all the “experience” they would ever get by following this strategy, and that while I had positions open (who doesn’t have unpaid positions open?), I was reluctant to fill them with people who were already competent cub writers, reporters, editors, and fact-checkers. They should have been demanding jobs a long time ago. They would try not to look too crestfallen at this news. They would explain to me that they were indeed the perfect person to work for me for free. Hell, they sometimes said, they had been doing it so long that they were good at it by now.

by Jim Frederick, The Baffler |  Read more:
Image via:

Concussions Lawsuit Settlement Lets NFL off the Hook

It's a testament to the NFL's massive financial success that the league can claim victory while still agreeing to hand over $765 million. The truth is that if the massive proposed settlement, to be paid out to former NFL players and their families, holds up, it will be a huge win for the NFL and commissioner Roger Goodell. As evidence grows that NFL players face serious, life-altering health risks from concussions and other injuries, this settlement, which undoubtedly will help the players involved and their families, effectively ends the first major threat to the NFL's current existence without forcing the league to make meaningful changes.

At first glance it might seem like the NFL has lost big time here, having been forced to pay out $765 million to over 4,500 former players, before even factoring in lawyer fees, "to fund medical exams, concussion-related compensation, and a program of medical research". While this would come out to about $170,000 per player if handed out equally, it's been reported that the actual payouts would be tied to each individual's specific medical conditions. That estimate also ignores the roughly $75 million of the settlement that would go toward medical tests and the roughly $10 million left over for further scientific research.

So, make no mistake, this is a significant amount of money; don't expect Roger Goodell to pull a Randy Moss and joke about paying it out in "straight cash, homey." However, even a quick look at the NFL's finances makes it very clear that this will not be a crippling blow. The NFL made $9.5 billion last year alone. In relative terms this is a small price to pay to avoid confronting the fact that they have literally been killing their employees. On top of this, around half of this money will be doled out over the course of the next 17 years, severely lessening the immediate financial consequences for the league. (...)

As Grantland's Bill Barnwell notes, this settlement allows the NFL to pay off the plaintiffs without acknowledging any liability, pretty much the best-case scenario imaginable. Not admitting fault gives the NFL a much better chance at defeating future lawsuits, which are nearly inevitable.

The timing is perfect as well. By resolving this lawsuit before the start of the NFL regular season, Goodell ensures that at least this particular story won't distract fans from the on-field product, which is as popular and profitable as it has ever been, once the real games begin.

by Hunter Felt, The Guardian |  Read more:
Image via:

Googling Yourself Takes on a Whole New Meaning

Here’s what you see if you look at my face: a skinny titanium headband stretched across my forehead. It looks like a futuristic pair of sunglasses, minus the lenses. On my right-hand side there’s a computer, a metal frame with a small, clear cube of plastic perched just over my eye. When I tilt my head upward a bit, or run my finger along the side of the frame, the cube lights up. What I see, floating six inches in front of me, is a pinkish, translucent computer screen. It gives me access to a few simple apps: Google search, text messaging, Twitter, a to-do list, some hourly news headlines from CNN (“See a Truck Go Airborne, Fly Over Median,” “Dolphin Deaths Alarm Biologists”). Beside the screen is a teensy camera built into the frame of the glasses, ready to record anything I’m looking at.

Google Glass is the company’s attempt to mainstream what the tech industry calls wearable computing, to take the computer off your desk or out of your pocket and keep it in your field of view. In a world where we’re already peering at screens all day long, pecked at by alerts, the prospect of an eyeball computer can provoke a shudder. But over several weeks of using the device myself, I began to experience some of the intriguing — and occasionally delightful — aspects of this new machine. I got used to glancing up to start texting and e-mailing by addressing its surprisingly accurate voice-transcription capabilities. (I admit I once texted my wife while riding my bicycle.) I set up calendar reminders that dinged in my ear. I used an app that guided me back to my car in a parking lot. I sent pictures of magazine articles to Evernote, so I would have reminders of what I’d read. I had tweets from friends float across my gaze.

Despite my quick adoption, however, only rarely did I accomplish something with Glass that I couldn’t already do with, say, my mobile phone. When I first heard about the device, I envisioned using it as a next-level brain supplement, accessing brilliant trivia during conversations, making myself seem omniscient (or insufferable, or both). This happened only occasionally: I startled a friend with information about the author of a rare sci-fi book, for example. But generally I found that Googling was pretty hard; you mostly control Glass with voice commands, and speaking queries aloud in front of others was awkward.

The one thing I used regularly was its camera. I enjoyed taking hands-free shots while playing with my kids and street scenes for which I would probably not have bothered to pull out my phone. I streamed live point-of-view video with friends and family. But it also became clear that the camera is a social bomb. One friend I ran into on the street could focus only on the lens pointing at her. “Can it see into my soul?” she asked. Later, she wrote me an e-mail: “Nice to see you. Or spy you spying, I guess.”  (...)

The earliest prototypes of Glass were made by taking the components from phones running Android — Google’s mobile operating system — and gluing them to a pair of safety goggles, with a huge L.C.D. in front of one eye. Heft was a hurdle: the prototypes were more than five and a half ounces, creating an untenable amount of “nose-borne weight,” to use an industry term. “If it doesn’t meet a minimum bar for comfort and style, it just doesn’t matter what it will do,” Lee said. Nobody would wear it all day long.

To shrink the device and make it more attractive, Lee hired Isabelle Olsson, a Swedish industrial designer known for her elegant, stripped-down aesthetic. She wasn’t told what she was being hired for. On her first day at work, Olsson was shown the safety-goggle prototype. When she pulled it out of a box and put it on to show me, she looked like a mad scientist.

“My heart skipped a beat,” she said with a laugh. “As a very nontacky person, this idea overwhelmed me a little bit. I’m going to wear a computer on my face? I really felt like we need to simplify this to the extreme. Whatever we can remove, we will remove.” (...)

Google started selling Glass this spring. Two thousand went to software developers; 8,000 went to people who submitted to Google short descriptions of what they’d do with Glass; those selected paid $1,500 for it. (I received mine this way and paid full price.) Once users began wandering into public life a few months ago, gazing into their glowing eye-screens, it became possible to begin answering the question: how would people use wearable computers in their everyday lives?

by Clive Thompson, NY Times |  Read more:
Image: Grant Cornett for The New York Times

Facebook to Update Privacy Policy, but Adjusting Settings Is No Easier


[ed. Why anyone would continue to use a service so obviously manipulative is beyond me.]

Facebook announced Thursday that it planned to enact changes to its privacy policies on Sept. 5.

But the social network’s famously difficult privacy controls will not become any easier to navigate.

Mostly, the new data use policy and statement of rights and responsibilities lay out more clearly the things that Facebook already does with your personal information, Ed Palmieri, the company’s associate general counsel for privacy, said in an interview. “The updates that we are showing in the red lines are our way to better explain the products that exist today,” he said.

In some ways, the company is making it more clear that it uses a wide variety of personal data about its 1.2 billion users to deliver advertising, including things they share and do, what they like, how they interact with ads and applications, and even demographic data inferred from everything else.

Facebook also said that it might use its customers’ profile photos to help their friends tag them in photos. Those photos are already public, but Facebook does not currently use them to help recognize faces when photos are uploaded to the service. “This will make the product better for people,” Mr. Palmieri said. “You can still opt out of it.”

But the company is also deliberately deleting information about specific privacy controls. Instead, Mr. Palmieri said, Facebook decided it was better to send users to various other pages, such as one on advertising, to learn more about privacy issues and how to adjust the controls.

For example, the data use policy will no longer offer a direct path to the control for opting out of your name and activities on the site being used as endorsements on ads sent to your friends.

by Vindu Goel, NY Times |  Read more:
Image: Dado Ruvic/Reuters

Thursday, August 29, 2013

Art in Science

Joni Mitchell


[ed. Seriously great... from the Shadows and Light tour.] 

Georgia O’Keeffe, Taos, New Mexico, 1931
via:

Blueberry Corn Salad


[ed. I watched a cooking show last night and they served this dish along with a whole pan-fried rockfish. It looked delicious, even though the tv version was simpler -- just fresh corn, blueberries and arrugula. I would never have thought of this combination.]

This salad is light and refreshing. I love the pop of color and juiciness the blueberries add to the corn salad. The cucumbers also add a nice crunch. The salad is full of flavor thanks to the cilantro, jalapeño, red onion, and honey lime dressing.

Celebrate summer by making this Blueberry Corn Salad. It is simple to make and can be made in advance, which makes it perfect for summer BBQs, picnics, and pool parties.
Yield: Serves 6-8

Simple summer salad with blueberries, sweet corn, cucumbers, cilantro, jalapeño, red onion, and a honey lime dressing.

Ingredients:

6 ears fresh sweet corn, husked
1 cup fresh blueberries
1 cucumber, sliced
1/4 cup finely chopped red onion
1/4 cup chopped fresh cilantro
1 jalapeño pepper, seeded and finely chopped
2 tablespoons lime juice
2 tablespoons olive oil
1 tablespoon honey
1/2 teaspoon ground cumin
1/2 teaspoon salt
1/4 teaspoon black pepper

Directions:

1. In a large pot, bring water to a boil. Add corn. Cook for 5 minutes, or until tender. When cool enough to handle, cut corn from the cobs. Discard cobs.

2. In a large serving bowl, combine corn, blueberries, cucumber, red onion, cilantro, and jalapeño. To make the dressing, whisk together lime juice, oil, honey, cumin, salt, and pepper. Pour dressing over salad and stir until combined. Cover and refrigerate until ready to serve.

by Maria and Josh, TPTP |  Read more:
Image: TPTP

"Disruptive" Is the Most Pernicious Cliché of Our Time

Sometimes buzzwords become so pervasive they’re almost inaudible, which is when we need to start listening to them. Disruptive is like that. It floats in the ether at ideas festivals and TED talks; it vanishes into the jargon cluttering the pages of Forbes and Harvard Business Review. There’s a quarterly called Disruptive Science and Technology; a Disruptive Health Technology Institute opened this summer. Disruptive doesn’t mean what it used to, of course. It’s no longer the adjective you hope not to hear in parent-teacher conferences. It’s what you want investors to say about your new social-media app. If it’s disruptive, it’s also innovative and transformational.

We can’t often name the person who released a cliché into the linguistic ecosystem, but in this case we can, and we also know why he did it. He’s Clayton Christensen, a Harvard Business School professor, and he wanted to explain why upstart enterprises drive better-established companies out of business. In his 1997 book, The Innovator’s Dilemma, Christensen launched the phrase that has transmogrified the English language: “disruptive innovation.”

Christensen’s theory goes like this. When a company succeeds at making and selling a gizmo, it commits itself to developing ever better gizmos, because their higher price yields larger profits. But that leaves a hole in the market quickly exploited by newcomers. They make stripped-down gizmos and sell them to consumers who hadn’t been able to afford them before. The scrappy company, having found new people to market to, grows; the senior company, having narrowed its appeal, shrinks; the challenger overtakes the incumbent; and the cycle starts anew. An old example of disruptive innovation is the disk-drive market of the 1980s. As disk drives shrank, the bigger-disk makers went out of business, even though the smaller disks were arguably inferior: They held less data and cost more per byte. A newer example is the tablet, which may be relegating personal computers to history.

Christensen’s theory still has a powerful appeal, because it explains something we’ve all seen happen, even marked off our own decades by: the churning of businesses from start-ups to powerhouses to irrelevance or near-irrelevance. Me, I equate my youth with Microsoft’s apparent lock on the future of computing; we now know how fleeting that moment was. Christensen also sidestepped the obsession with leadership that bedevils management theory, stressing the tragic inevitability of market forces over the comic mishaps of shortsighted executives. It’s not that CEOs are too stupid to see disruption coming; it’s that their companies aren’t set up to make, or make money from, the new gizmos.

At least at first, Christensen deployed disruption theory to help managers cope with the revolutionary ferment from below that Joseph Schumpeter called “creative destruction.” But disruptive is now slapped onto every act of cultural defiance or technical derring-do, whether it has to do with business or not, and Christensen has not tried to rein in the word’s inflation. On the contrary, he has been out-punditing the pundits, publishing book after book—each with many co-authors—in which disruption theory is brought to bear first on this sector, then on that one. In the past five years, he has homed in on the social institutions—schools, public-health organizations, and the halls of government itself—he deems ripe for disruption.

You can’t blame Christensen and his co-writers for all the dumb things said and done in the name of disruption. But you can spot some unsavory habits of mind in their prescriptions. For one thing, they possess an almost utopian faith in technology: online or “blended” learning; massive open online courses, or MOOCs; cool health apps; and so on. Their convictions seem sincere, but they also coincide nicely with the interests of the Silicon Valley venture-capital crowd. If you use technology to disrupt the delivery of public services, you open up new markets; you also replace human labor with the virtual kind, a happy thought for an investor, since labor is the most expensive line item in all service-industry budgets.

by Judith Shulevitz, TNR |  Read more:
Image via:

How Economics Can Save the Whales

A study of 11,135 fisheries showed that introducing catch shares roughly halved the chance of collapse. The system caught on in the 1980s and 1990s after decades of other well-intentioned efforts failed. Economist H. Scott Gordon is usually credited with laying out the problem and the solution in 1954.

Modern environmental economists accuse their predecessors of forgetting about incentives. Catch-share schemes issue permits to individuals and groups to fish some portion of the grounds or keep some fraction of the total catch. If fishermen exceed their share, they can buy extra rights from others, pay a hefty fine or even lose their fishing rights, depending on the particular arrangement. The system works because it aligns the interests of individual fishermen with the sustainability of the entire fishery. Everybody rises and falls with the fate of the total catch, eliminating destructive rivalries among fishermen.
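The incentive mechanism can be sketched as a toy model. All names and numbers below are hypothetical illustrations, not figures from the article: each fisherman receives a share of a total allowable catch, and anyone who wants to exceed that share must buy unused quota from the others, so the fleet as a whole can never overshoot the cap.

```python
# Toy sketch of a catch-share scheme (hypothetical numbers; not from
# the article). Each fisherman gets a share of a total allowable
# catch; anyone wanting more must buy unused quota from the others.

def allocate_shares(total_allowable_catch, historical_catch):
    """Split the cap among fishermen in proportion to historical
    catch, a common way to make the initial allocation."""
    total = sum(historical_catch)
    return [total_allowable_catch * h / total for h in historical_catch]

def trade_and_fish(shares, desired):
    """Fishermen who want less than their share sell the surplus;
    those who want more buy it, up to whatever surplus exists.
    The fleet's total catch therefore never exceeds the cap."""
    surplus = sum(max(s - d, 0) for s, d in zip(shares, desired))
    catches = []
    for s, d in zip(shares, desired):
        if d <= s:
            catches.append(d)             # under quota; sells the rest
        else:
            bought = min(d - s, surplus)  # buys quota while any remains
            surplus -= bought
            catches.append(s + bought)
    return catches

shares = allocate_shares(100, [3, 1, 1])     # proportional: [60, 20, 20]
print(trade_and_fish(shares, [50, 40, 10]))  # trades clear; total is 100
print(trade_and_fish(shares, [70, 40, 30]))  # excess demand; capped at 100
```

In the variants the article mentions, a fisherman exceeding the available quota would pay a fine rather than simply being capped; the hard cap here just keeps the sketch minimal.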

Environmental economists have lately turned their attention to Atlantic bluefin tuna and whales. The National Marine Fisheries Service has just proposed new regulations that would for the first time establish a catch-share program for the endangered and lucrative bluefin. And a group of economists is pushing for a new international agreement on whaling. (...)

In both cases the problem is overfishing. The bluefin tuna population has dropped by a third in the Atlantic Ocean and by an incredible 96 percent in the Pacific. And whaling, which is supposedly subject to strict international rules that ban commercial whaling and regulate scientific work, is making a sad comeback. The total worldwide annual catch has risen more than fivefold over the last 20 years.

Ben Minteer, Leah Gerber, Christopher Costello and Steven Gaines have called for a new and properly regulated market in whales. Set a sustainable worldwide quota, they say, and allow fishermen, scientists and conservationists alike to bid for catch rights. Then watch the system that saved other fish species set whaling right.

The idea outrages many environmentalists. Putting a price on whales, they argue, moves even further away from conservationist principles than the current ban, however ineffective. They’re wrong. “The arguments that whales should not be hunted, whatever their merits, have not been winning where it counts -- that is, as measured by the size of the whale population,” says economist Timothy Taylor, editor of the Journal of Economic Perspectives.

by Evan Soltas, Bloomberg | Read more:
Image: Luis Robayo/AFP via Getty Images

The Pretenders



[ed. Chrissie Hynde love... Nate, remember Saratoga Springs?]

The Key to a Truly Great Chicken Wing


Americans are a wing-loving people. The Buffalo variety, by most accounts “invented” at the Anchor Bar in, yes, Buffalo, is the official food of our most sacred event of the year: the Super Bowl.

Wings have a higher ratio of skin to meat than almost any other cut of chicken, which is what makes them so appealing. In order to crisp the skin, you need to render out most of the fat that comes with it; otherwise you’ll get chewy wings instead of crunchy ones. A grill with one side that’s hot and one side that’s cool — one side with no or very little fire underneath it — is what you need: put the wings on the cool side, cover the grill and let the ovenlike heat melt the fat away through the grates without any fear of an intense flame burning the skin from below.

Because you’re not relying on this part for any browning, it’s O.K. to crowd the wings, even stacking them slightly if need be. The time it takes to render the fat and cook the wings through is more than enough to whip up one of the sauces here (including, you’ll be relieved to know, Buffalo), few of which require cooking. Make the sauce in a bowl large enough to accommodate the wings so you can toss them in from the first round on the grill.

by Mark Bittman, NY Times |  Read more:
Image: Marcus Nilsson for The New York Times. Food stylist: Chris Lanier. Prop stylist: Angharad Bailey.

This Week's Recipes
Teriyaki Chicken Wings
Miso Chicken Wings
Barbecue Chicken Wings
Curry-Yogurt Chicken Wings
Chipotle-Lime Chicken Wings
Lemon-Garlic-Pepper Chicken Wings
Thai-Peanut Chicken Wings
Fish-Sauce-and-Black-Pepper Chicken Wings
Jerk Chicken Wings
Buffalo Chicken Wings
Korean-Style Chicken Wings
Garam Masala Chicken Wings

Wednesday, August 28, 2013


Derek Gores, Women's World
via:

Terry Rodgers, Immaculate Reflection 2006.
via:

Withdrawal Symptoms


When you’re booked into the Los Angeles County Jail, they put you in a cage with a wire gate, and you have to wait while they type up a whole bunch of stuff. You lie there and sit there, and then, when enough people are ready, the guards call out the names and you walk to another section, where they take your fingerprints. They do each finger and your whole hand, and they take your picture. Then you wait again, and there’s no place to sit. You lie on the cement floor, and people get sick—they’re vomiting. I was sick before I got busted—I was sick before I went and hocked my horn—so I was deathly ill by the time I was waiting. And it took thirty-six hours to be booked in.

The agony of kicking is beyond words. It’s nothing like the movies, The Man with the Golden Arm, or things you read: how they scream and bat their heads against the wall, and they’d give up their mother, and they want to cut their throats. That’s ridiculous. It’s awful but it’s quiet. You just lie there and suffer. You have chills and your bones hurt; your veins hurt; and you ache. When water touches you it feels as if it’s burning you, and there’s a horrible taste in your mouth, and every smell is awful and becomes magnified a thousandfold. You can smell people, people with BO, their feet, and filth and dirt. But you don’t scream and all that: “Kill my mother, my father, just get me a fix and I’ll do anything you want!” That’s outrageous.

The depression you feel is indescribable, and you don’t sleep. Depending on how hooked you are, you might go three weeks or a month without ever sleeping except for momentary spells when you just pass out. You’ll be shaking and wiggling your legs to try to stop the pain in the joints, and all of a sudden you’ll black out and you’ll have a dream that you’re somewhere trying to score. You’ll get the shit and the outfit, and you’ll stick it in your vein, and then the outfit will clog, or the stuff will shoot out the rubber part of the dropper, or somebody’ll get in the way—somebody stops you and you never get it into your arm. I used to dream that my grandmother was holding me and I was hitting her in the face, smashing her in the mouth—blood came out of her face—and I could never get the dope in. You’d have terrible dreams: you’d flash to a woman, your old lady; she’d become a dog and she’d have a peepee like a dog instead of a cunt like a woman; and all of a sudden you’d come and immediately you’d wake up, and you’d be sticky and dirty and wet.

The first time I went to the county jail I went seventeen days and nights without sleeping at all, I was so sick. I kept vomiting and couldn’t eat. Seventeen days and nights, and all they gave you was aspirin. You could get three of them at night when they had sick call come around. And at night they had salts and soda. You could get either one. Salts to make you go to the bathroom or soda to settle your stomach.

In the county jail for a while they had a kick tank. They’d lock you up in a solid cell all alone. I knew a young Chicano cat who got put in the kick tank, and he started vomiting. He vomited and vomited, and he called for the guards, but they ignored him. He kept vomiting and he ruptured a blood vessel in his stomach and bled to death, choked in his own blood. That’s the treatment that the dope fiend got.

I was once in jail with a Chinaman. He had been shooting “black” (opium) for years and years. Chinese didn’t get busted for a long time because the Chinese as a whole are much stronger than the whites and the blacks. But then some of the young Chinese got out and started shooting regular heroin, hanging out with the other dope fiends, and they got Americanized. And so, when they got busted they ratted on their elders. This Chinaman was an older guy; he looked like a skeleton, and he was really strung out. He was shaking so much he could hardly walk. They assigned him to a cell but he said, “I can’t bear the cell. Just put me on the freeway.” The freeway is the walkway that goes by the cells. They put him out there, and for two weeks he did nothing but sit in one position. He didn’t eat one bit of food. Every now and then he’d drink a little something, take some broth out of the stew. For two weeks he sat with his feet on the floor and his arms around his knees in a corner on the freeway not saying a word to anybody, sweat pouring off his face. When he got a little better I talked to him, and he said that he was trying to put himself into a trance, to leave his body, to get over the misery. I’ve seen guys put their pant legs into their socks and tie strings around them so no wind could get to their bodies. Then they would walk up and down the freeway for days, walk all night long, and they wouldn’t sleep for weeks except for these horrible moments.

So kicking is the most insidious thing. It’s a million times worse than they portray it. It’s not an outward, noisy anguish. It’s an inner suffering that only you, and, if there’s any such thing as God, like, maybe you and He know it.

by Art Pepper, Lapham's Quarterly |  Read more:
Excerpt from the autobiography Straight Life

The Mourning Forest, Naomi Kawase (2007)
via:

Gait 101: Learning to Run More Naturally

Many beginning runners remark about how much they enjoy the new experience. They care little about the nuances regarding form, technique, or proper gait. As long as they are moving, accumulating mileage over a sustained period of time, they feel content and satisfied. But at the advanced and elite level of running, the concept of gait takes on an entirely new dimension of complexity, constant questioning, and evaluation by a coach or oneself.

But what exactly is meant by the term “gait”? In running, gait is typically defined as moving posture: the whole body’s forward progress, from foot strike and pelvic position to arm swing and head and knee movement. It’s not unusual for coaches, kinesiologists and other biomechanics experts, and elite runners to dissect each component of one’s gait. From this assessment, each element of the gait that’s viewed as “flawed” is “corrected”—the runner is told to lift the knee to this position, swing the arms that way, or hold the elbows this way.

Yet nothing is more natural than the biomechanics of human running. Or should be. With every step a runner takes, the limbs, trunk, head and spine participate in various combinations of movement, ranging from flexion, extension, and rotation, to abduction and adduction, along with the feet, which pronate, supinate, invert and evert. Only by understanding the normal ranges of motion can one detect “abnormal” movements so as to help assess an injury or observe for the potential of future injury.

More importantly, there’s no ideal running form. While all humans have the same basic running pattern—just like other animals—your gait is yours alone. In fact, it’s easy to recognize your training partner from a distance, even before the face comes into focus, because you know his or her unique running fingerprint. Even looking at the best athletes in professional sports, there’s one common feature—everyone’s movements are slightly different. Each golfer follows the basic swing, while at the same time each has a swing all his or her own; the same for every high-jumper, baseball pitcher, tennis player, or marathoner.

That is, unless something interferes with movement.

When something causes the gait to go astray, two things happen. First, there is the risk of getting injured, because something has gone wrong, and it will be reflected in running form in a subtle—or sometimes more obvious—way. There might be irregular movement in the hip joint causing the pelvis to tilt more to one side than the other, more flexion of one knee than the other stressing the hamstring muscles, too much rotation of the leg causing the foot to flare outward excessively, and erratic arm movements. The most common reason for this is muscle imbalance, and it forces the body to compensate by contracting certain muscles to keep the imbalance from worsening.

The second problem is that the body’s energy is being used inefficiently. A flawed running form will raise the heart rate more than usual, making one fatigue quicker, and resulting in a slower pace. Stretching can disturb the gait too—by making a muscle longer with a loss of power. By stretching muscles before running, it’s very possible to cause muscle imbalance.

Physical interference is most often the result of bad shoes or muscle imbalance, sometimes both.

Another factor affecting gait is poor postural habit. We sit in chairs too long or slump at our desks. We stand with poor posture and even walk with an irregular gait—all because somewhere along the way we allowed our bodies to get lazy. For many, these bad habits carry over to running.

by Dr. Phil Maffetone, Natural Running Center |  Read more:
Image: Uncredited

Why Etsy's Brave New Economy is Crumbling


One of the biggest consumer markets in the world resides in a river valley about 200 miles southwest of Shanghai. You can find almost anything you want in Commodity City's half-million stalls: baby bibs, knitting supplies, sundry types of underwear, T-shirts, glitzy jewelry.

The market is the literal and figurative heart of Yiwu, a metropolis of 1.3 million people that bustles under the sticky, industrial haze of Zhejiang province. Journalist Tim Phillips has called Yiwu the "Wall Street" of China's counterfeit goods industry—an accurate, if somewhat narrow, label. The city's factories flood the shelves of Commodity City and the rest of the world with a lot more than just knock-off iPhones and pirated Hollywood DVDs. They relentlessly pump out the type of cheap consumer goods you're used to seeing at shopping mall kiosks and street-side trinket stands.

It's in part because of factory cities like Yiwu that Etsy, an online marketplace that specializes in handcrafted goods, has become so successful. Since launching in 2005, Etsy has ridden an Internet-powered counter-industrial revolution, adding a personal and DIY touch to goods—everything from sweaters to Christmas ornaments. Craftsmakers have fled to Etsy to escape the price-cutting effects of big-box stores and their walls of factory-made products. The crafts marketplace has formed an alternate commercial universe with 30 million members and nearly 1 million stores, all ostensibly running on human labor and handmade goods.

It goes without saying that if Yiwu-type products began infiltrating Etsy, they'd upend the market—the same way Walmart destroyed your local department store or Barnes & Noble ended the era of the corner book shop.

Take a look at the "Infinity Ring," a delicate brass loop coated with a silver sheen and topped with rhinestones and crystal. In pictures of the factory where it's made, you can see rows of workers in surgical masks bent over dusty tables, not far from bulky industrial machines. From ports in Ningbo and Shanghai, the Yiwu Daihe Jewelry Corp. exports the ring to anywhere in the world at 50 cents apiece.

You can buy it on Etsy's most popular jewelry store for $15.

How? To most Etsy users, the obvious answer is that Laonato, the store, is buying the rings wholesale from the factory, then pawning them off as handmade goods, reaping a monstrous 2,900 percent profit. That practice is known as "reselling," and it's a subject of intense controversy on the site. But as with a lot of things on Etsy—where the entire economy operates behind the shroud of the Internet—easily drawn assumptions and reality rarely align as neatly as you'd expect.

As its leaders struggle to redefine the company's corporate identity, the blurry, ungraspable truth about the Infinity Ring hints of a chronic ailment at Etsy's core—one that could undermine the company’s ambitious plans and the marketplace as a whole.

by Kevin Morris, Daily Dot |  Read more:
Image: Jason Reed

Yume32ki, Summer School
via:

R. Crumb, Can't You See I'm Reading? 1984.
via:

Junku Nishimura
via:

Brain to Brain Interface

University of Washington researchers have performed what they believe is the first noninvasive human-to-human brain interface, with one researcher able to send a brain signal via the Internet to control the hand motions of a fellow researcher.

Using electrical brain recordings and a form of magnetic stimulation, Rajesh Rao sent a brain signal to Andrea Stocco on the other side of the UW campus, causing Stocco’s finger to move on a keyboard.

While researchers at Duke University have demonstrated brain-to-brain communication between two rats, and Harvard researchers have demonstrated it between a human and a rat, Rao and Stocco believe this is the first demonstration of human-to-human brain interfacing.

“The Internet was a way to connect computers, and now it can be a way to connect brains,” Stocco said. “We want to take the knowledge of a brain and transmit it directly from brain to brain.” (...)

Rao, a UW professor of computer science and engineering, has been working on brain-computer interfacing in his lab for more than 10 years and just published a textbook on the subject. In 2011, spurred by the rapid advances in technology, he believed he could demonstrate the concept of human brain-to-brain interfacing. So he partnered with Stocco, a UW research assistant professor in psychology at the UW’s Institute for Learning & Brain Sciences.

On Aug. 12, Rao sat in his lab wearing a cap with electrodes hooked up to an electroencephalography machine, which reads electrical activity in the brain. Stocco was in his lab across campus wearing a purple swim cap marked with the stimulation site for the transcranial magnetic stimulation coil that was placed directly over his left motor cortex, which controls hand movement.

The team had a Skype connection set up so the two labs could coordinate, though neither Rao nor Stocco could see the Skype screens.

Rao looked at a computer screen and played a simple video game with his mind. When he was supposed to fire a cannon at a target, he imagined moving his right hand (being careful not to actually move his hand), causing a cursor to hit the “fire” button. Almost instantaneously, Stocco, who wore noise-canceling earbuds and wasn’t looking at a computer screen, involuntarily moved his right index finger to push the space bar on the keyboard in front of him, as if firing the cannon. Stocco compared the feeling of his hand moving involuntarily to that of a nervous tic.

by Doree Armstrong and Michelle Ma, University of Washington | Read more:
Image: University of Washington

The Bee Hummingbird is the smallest bird on Earth.
via:

Tuesday, August 27, 2013


“Teuthology of the Deep” by Clara Jauquet
via:

The Failed Grand Strategy in the Middle East

In the beginning, the Hebrew Bible tells us, the universe was all "tohu wabohu," chaos and tumult. This month the Middle East seems to be reverting to that primeval state: Iraq continues to unravel, the Syrian War grinds on with violence spreading to Lebanon and allegations of chemical attacks this week, and Egypt stands on the brink of civil war with the generals crushing the Muslim Brotherhood and street mobs torching churches. Turkey's prime minister, once widely hailed as President Obama's best friend in the region, blames Egypt's violence on the Jews; pretty much everyone else blames it on the U.S.

The Obama administration had a grand strategy in the Middle East. It was well intentioned, carefully crafted and consistently pursued.

Unfortunately, it failed.

The plan was simple but elegant: The U.S. would work with moderate Islamist groups like Turkey's AK Party and Egypt's Muslim Brotherhood to make the Middle East more democratic. This would kill three birds with one stone. First, by aligning itself with these parties, the Obama administration would narrow the gap between the 'moderate middle' of the Muslim world and the U.S. Second, by showing Muslims that peaceful, moderate parties could achieve beneficial results, it would isolate the terrorists and radicals, further marginalizing them in the Islamic world. Finally, these groups with American support could bring democracy to more Middle Eastern countries, leading to improved economic and social conditions, gradually eradicating the ills and grievances that drove some people to fanatical and terroristic groups.

President Obama (whom I voted for in 2008) and his team hoped that the success of the new grand strategy would demonstrate once and for all that liberal Democrats were capable stewards of American foreign policy. The bad memories of the Lyndon Johnson and Jimmy Carter presidencies would at last be laid to rest; with the public still unhappy with George W. Bush's foreign policy troubles, Democrats would enjoy a long-term advantage as the party most trusted by voters to steer the country through stormy times.

It is much too early to anticipate history's verdict on the Obama administration's foreign policy; the president has 41 months left in his term, and that is more than enough for the picture in the Middle East to change drastically once again. Nevertheless, to get a better outcome, the president will have to change his approach.

With the advantages of hindsight, it appears that the White House made five big miscalculations about the Middle East. It misread the political maturity and capability of the Islamist groups it supported; it misread the political situation in Egypt; it misread the impact of its strategy on relations with America's two most important regional allies (Israel and Saudi Arabia); it failed to grasp the new dynamics of terrorist movements in the region; and it underestimated the costs of inaction in Syria. (...)

This is dangerous. Just as Nikita Khrushchev concluded that President Kennedy was weak and incompetent after the Bay of Pigs failure and the botched Vienna summit, and then proceeded to test the American president from Cuba to Berlin, so President Vladimir Putin and Supreme Leader Ayatollah Ali Khamenei now believe they are dealing with a dithering and indecisive American leader, and are calibrating their policies accordingly. Khrushchev was wrong about Kennedy, and President Obama's enemies are also underestimating him, but those underestimates can create dangerous crises before they are corrected.

by Walter Russell Mead, WSJ |  Read more:
Image: AP