Monday, December 24, 2012

Cold Pastoral (Fiction)

When Marina Keegan died, tragically, at the age of twenty-two, in a car accident in May, she had just graduated from Yale University and was about to start a job on the editorial staff of The New Yorker. 

We were in the stage where we couldn’t make serious eye contact for fear of implying we were too invested. We used euphemisms like “I miss you” and “I like you” and smiled every time our noses got too close. I was staying over at his place two or three nights a week and met his parents at an awkward brunch in Burlington. A lot of time was spent being consciously romantic: making sushi, walking places, waiting too long before responding to texts. I fluctuated between adding songs to his playlist and wondering if I should stop hooking up with people I was eighty per cent into and finally spend some time alone. (Read the books I was embarrassed I hadn’t read.) (Call my mother.) The thing is, I like being liked, and a lot of my friends had graduated and moved to cities. I’d thought about ending things but my roommate Charlotte advised me against it. Brian was handsome and smoked the same amount as me, and sometimes in the morning, I’d wake up and smile first thing because he made me feel safe.

In March, he died. I was microwaving instant Thai soup when I got a call from his best friend, asking if I knew which hospital he was at.

“Who?” I said.

“Brian,” he said. “You haven’t heard?”

I was in a seminar my senior year where we read poems by John Keats. He has this famous one called “Ode on a Grecian Urn,” where these two lovers are almost kissing, frozen with their faces cocked beneath a tree. The tragedy, the professor said, is in eternal stasis. She never fades, they never kiss; but I remember finding the whole thing vaguely romantic. My ideal, after all, was always before we walked home—and ironically, I had that now.

* * *

I watched as the microwave droned in lopsided circles, but I never took the soup out. Someone else must have. Charlotte, perhaps, or one of my friends that came over in groups, offering foods in imitations of an adult response and trying to decipher my commitment. I was trying, too. I’d made out with a guy named Otto when I was back in Austin over Christmas, and Brian and I had never quite stopped playing games. We were involved, of course, but not associated.

“What’s the deal?” people would shout over the music when he’d gone to get a drink, and I’d explain that there was no deal to explain.

“We’re hanging out,” I’d say, smiling. “We like hanging out.”

I think we took a certain pride in our ambiguity. As if the tribulations of it all were somehow beneath us. Secretly, of course, the pauses in our correspondence were as calculated as our casualness—and we’d wait for those drunken moments when we might admit a “hey,” pause, “I like you.”

“Are you O.K.?” they asked now. Whispering almost, as if I were fragile. We sat around that first night sipping singular drinks, a friend turning on a song and then stopping it. I wish I could say I was shocked into a state of inarticulate confusion, but I found myself remarkably capable of answering questions.

“They weren’t dating,” Sarah whispered to Sam, and I gave a soft smile so they knew it was O.K. that I’d heard.

But it became clear very quickly that I’d underestimated how much I liked him. Not him, perhaps, but the fact that I had someone on the other end of an invisible line. Someone to update and get updates from, to inform of a comic discovery, to imagine while dancing in a lonely basement, and to return to, finally, when the music stopped. Brian’s death was the clearest and most horrifying example of my terrific obsession with the unattainable. Alive, his biggest flaw was most likely that he liked me. Dead, his perfections were clearer.

But I’m not being fair. The fact of the matter is I felt a strange but recognizable hole that grew just behind my lungs. There was a person whose eyes and neck and penis I had kissed the night before, and this person no longer existed. The second cliché was that I couldn’t quite encompass it. Regardless, I surprised myself that night by crying alone once my friends had left, my face pressed hard against my pillow.

by Marina Keegan, The New Yorker |  Read more:
Illustration: Hannah K. Lee

Sunday, December 23, 2012

Dave Matthews, Tim Reynolds

[ed. This has become sort of a Christmas tradition. Dave Matthews' Christmas Song.]


Christmas Song

She was his girl, he was her boyfriend
Soon to be his wife, make him her husband
A surprise on the way, any day, any day
One healthy little giggling, dribbling baby boy
The Wise Men came, three made their way
To shower him with love
While he lay in the hay
Shower him with love, love, love
Love, love, love
Love, love was all around

Not very much of his childhood was known
Kept his mother Mary worried
Always out on his own
He met another Mary who for a reasonable fee
Less than reputable was known to be
His heart was full of love, love, love
Love, love, love
Love, love was all around

When Jesus Christ was nailed to his tree
Said "Oh, Daddy-o, I can see how it all soon will be
I came to shed a little light on this darkening scene
Instead I fear I've spilled the blood of our children all around."
The blood of our children all around
The blood of our children all around

So I'm told, so the story goes
The people he knew were
Less than golden-hearted
Gamblers and robbers
Drinkers and jokers
All soul searchers
Like you and me
Like you and me

Rumors insisted that he soon would be
For his deviations taken into custody
By the authorities, less informed than he.
Drinkers and jokers, all soul searchers
Searching for love, love, love
Love, love, love
Love, love was all around

Preparations were made
For his celebration day
He said, "Eat this bread, and think of it as me
Drink this wine and dream it will be
The blood of our children all around,
The blood of our children all around.
The blood of our children all around."

Father up above,
Why in all this anger do you fill me up with love, love, love?
Love, love, love
Love, love was all around

Father up above,
Why in all this hatred do you fill me up with love?
Fill me with love, love, yeah
Love, love, love
Love, love, and the blood of our children all around

The Coming Drone Attack on America


People often ask me, in terms of my argument about "ten steps" that mark the descent to a police state or closed society, at what stage we are. I am sorry to say that with the importation of what will be tens of thousands of drones, by both US military and by commercial interests, into US airspace, with a specific mandate to engage in surveillance and with the capacity for weaponization – which is due to begin in earnest at the start of the new year – it means that the police state is now officially here.

In February of this year, Congress passed the FAA Reauthorization Act, with its provision to deploy fleets of drones domestically. Jennifer Lynch, an attorney at the Electronic Frontier Foundation, notes that this followed a major lobbying effort, "a huge push by […] the defense sector" to promote the use of drones in American skies: 30,000 of them are expected to be in use by 2020, some as small as hummingbirds – meaning that you won't necessarily see them, tracking your meeting with your fellow-activists, with your accountant or your congressman, or filming your cruising the bars or your assignation with your lover, as its video-gathering whirs.

Others will be as big as passenger planes. Business-friendly media stress their planned abundant use by corporations: police in Seattle have already deployed them.

An unclassified US air force document reported by CBS News (pdf) expands on this unprecedented and unconstitutional step – one that formally brings the military into the role of controlling domestic populations on US soil, which is the bright line that separates a democracy from a military oligarchy. (The US constitution allows for the deployment of National Guard units by governors, who are answerable to the people; but this system is intended, as is posse comitatus, to prevent the military from taking action aimed at US citizens domestically.)

The air force document explains that the air force will be overseeing the deployment of its own military surveillance drones within the borders of the US; that it may keep video and other data it collects with these drones for 90 days without a warrant – and will then, retroactively, determine if the material can be retained – which does away for good with the fourth amendment in these cases. While the drones are not supposed to specifically "conduct non-consensual surveillance on specifically identified US persons", according to the document, the wording allows for domestic military surveillance of non-"specifically identified" people (that is, a group of activists or protesters) and it comes with the important caveat, also seemingly wholly unconstitutional, that it may not target individuals "unless expressly approved by the secretary of Defense".

In other words, the Pentagon can now send a domestic drone to hover outside your apartment window, collecting footage of you and your family, if the secretary of Defense approves it. Or it may track you and your friends and pick up audio of your conversations, on your way, say, to protest or vote or talk to your representative, if you are not "specifically identified", a determination that is so vague as to be meaningless.

What happens to those images, that audio? "Distribution of domestic imagery" can go to various other government agencies without your consent; it may also include your most private moments and most personal activities. The authorized "collected information may incidentally include US persons or private property without consent". Jennifer Lynch of the Electronic Frontier Foundation told CBS:
"In some records that were released by the air force recently … under their rules, they are allowed to fly drones in public areas and record information on domestic situations."
This document accompanies a major federal push for drone deployment this year in the United States, accompanied by federal policies to encourage law enforcement agencies to obtain and use them locally, as well as by federal support for their commercial deployment. That is to say: now HSBC, Chase, Halliburton etc can have their very own fleets of domestic surveillance drones. The FAA recently established a more efficient process for local police departments to get permits for their own squadrons of drones.

Given the Department of Homeland Security militarization of police departments, once the circle is completed with San Francisco or New York or Chicago local cops having their own drone fleet – and with Chase, HSBC and other banks having hired local police, as I reported here last week – the meshing of military, domestic law enforcement, and commercial interests is absolute. You don't need a messy, distressing declaration of martial law.

by Naomi Wolf, The Guardian |  Read more:
Photograph: US navy/Reuters

From Us to You

[ed. I don't know...does anybody ever get one of these anymore? Maybe it's just me. Or maybe it's Facebook's fault (isn't everything?) -- the Christmas letter now gets dribbled out all year long.]

"I think we ought to write a Christmas letter this year," my wife said at the breakfast table the other morning.

"A Christmas letter. You know, like the kind the Huggins send out to all their friends every year."

I recalled the Huggins' Christmas letters: five-page mimeographed reports on family activities for the preceding year, with the simple greetings of the season all but buried.

I hurried off to work before my wife could pursue the subject any further, but that evening she presented me with a packet of letters including not only the recent efforts of the Huggins but Christmas letters other families had sent us as well.

"Now you read these and see if you don't think it would be a good idea for us to do this instead of sending cards this Christmas," she said.

One would have been enough, for the letters were indistinguishable in style and content. Posing innocently as Christmas greetings, they were actually unabashed family sagas. The writers touched lightly on the misfortunes which their families suffered during the year, dwelt gladly on happy events, and missed no opportunity for self-congratulation.

I haven't the slightest intention of writing a Christmas letter myself, but once I'd put a red or green ribbon in my typewriter, I'm sure I could turn one out in no time at all.

"OUR HOUSE TO YOURS!" is the standard beginning. Centered at the top of an 8 x 11" sheet of paper, it spares the writer the nuisance of penning salutations on the hundred or more copies he will doubtless send out. The exclamation mark is the first of dozens that will be used. No Christmas letter averages fewer than eighteen "!'s," "!!'s," or "(!)'s" a page.

The opening sentence always starts with the word "Well." "Well, here it is Christmas again!" is a favorite; or, "Well, hard as it is to realize, Christmas has rolled round once more!" A somewhat more expansive opening is "Well, Christmas finds us all one year older, but young as ever in the spirit of the Season!" Actually what is said is unimportant as long as the sentence starts with "Well," and ends, of course, with an exclamation mark.

Having taken due note of the season, the Christmas letter writer works immediately into his first main topic—the accidents which befell him and his family and the diseases they suffered during the year. He writes with cheerful fortitude. Broken arms and legs call forth the reminiscent chuckle, and childhood diseases open the way for humor of a sort. "As it must to all children," the Huggins wrote last Christmas, "the mumps came to Albert Jr. and to Susie. Fortunately they were taken sick during the spring vacation and didn't miss any school. We don't think they'd agree with our use of the word 'fortunately.' (Haha!)" The parenthetical "Ha ha!" or simply "Ha!" appears at least once in each paragraph of a Christmas letter.

by William J. Copithorne, The Atlantic |  Read more:
Image via

Dallas, Part 1: From Afar

[ed. The second installment of this fascinating article can be found here: Dallas, Part 2: Up Close]

Between 318 and 271 million years ago, the ancient continental core of North America butted against what would become South America. Land folded and faulted; mountains were born. Then what would become the Gulf of Mexico opened, and inland seas washed the peaks away. It pays to remember there are mountains beneath Dallas. The tops may have eroded, but the roots remain buried deep.

Some 165 million years later—in 1841—John Neely Bryan built a shelter on a bluff and called the area Dallas.

One hundred and twenty-two years later—in 1963—John F. Kennedy was shot on that bluff, now named Dealey Plaza.

Seventeen years later—in 1980—J. R. Ewing was shot on TV.

Dallas exists outside of prehistory. Unlike surrounding areas, it was not a camp for Native Americans or prehistoric men. Dig and you find few artifacts. The Trinity River formed a boundary for ancient tribes: farmers to the east and hunters to the west. The Trinity is a true Texan; it begins and ends within the state. Its 710-mile path slices through what is now downtown Dallas, making Dallas a city on the cusp, on the boundary, in between. It wavers between being and not being. Dallas wasn’t there until—suddenly—it was, called forth in the minds of white men.

John Neely Bryan, the founder of Dallas, was born in Tennessee in 1810. In 1839, he arrived at the three forks of the Trinity River with a Cherokee he called Ned and a dog he called Tubby. He was twenty-nine. He wrote his name on a piece of buckskin, affixed it to a stake, drove it into the soft ground of an eighteen-foot bluff, and went back to Arkansas. Two years later, he returned to his bluff and built a lean-to. In another two years, he was married, a union which brought five children. Dallas—as he called his claim—was on its way. (...)

In 1855, some 200 French, Swiss, and Belgians—some on horses, some on foot, some in wooden shoes—made the 250-mile trek from Houston, settling just west of Dallas in a utopian community they called “La Reunion.” The settlement was a cooperative founded on the social ideals of the French philosopher François Marie Charles Fourier. Women were equal to men and could vote. The usual entropy—combined with substandard housing, a severe winter, a spring drought, summer grasshoppers, and a crop of wheat grown without consideration that there was no one there to buy it—undid the community’s best-laid plans. By 1860, 160 members of the colony had defected to Dallas. Thus the first piano entered the city, which also gained from a pool of pastry chefs, brewers, dancing masters, artisans, jewelers, tailors, physicians, naturalists, and the like. The seeds of Dallas as a cultural hub were planted. La Reunion ended without formal dissolution—it simply disappeared, save for a small cemetery, once called Fishtrap, now Crown Hill Memorial Park, which some seventy years later would become the final resting place of outlaw Bonnie Parker. Today the cemetery remains fairly overlooked, overtaken by a different kind of entropy, abutted by drive-thru liquor stores, video chains, thrift shops, gas stations, and check-cashing places. Reunion Tower—named in honor of the colony—rises roughly three miles to the east, standing tall like a late 1970s microphone.

Dallas is: Big Tex, the Cotton Bowl, the Dr. Pepper clock, Baby Doe’s Matchless Mine (as seen from I-35), El Fenix (the downtown original), Reunion Tower, Fair Park (home of the state fair as well as the world’s largest collection of Art Deco buildings), Texas Stadium (R.I.P.), Old Red Courthouse, Highland Park Village (the country’s first planned shopping center), NorthPark Mall (with its modern art and holiday penguins), Love Field, the Cathedral de Guadalupe, Neiman Marcus, Bank of America Plaza (aka the “Green Building,” built in 1985 and outlined with two miles of green argon lights), the aluminum-clad Republic National Bank building (its rocket-like spire stretching to the sky), First Interstate Tower (its giant slanting curtain walls once home to Ewing Oil), Texas Commerce Tower (its curved glass top pierced by a seven-story hole, a passage made narrower than originally planned for fear that some Dallas daredevil would try to fly a plane through it), the beaux-arts Magnolia Building (boasting, in 1934, the largest rotating sign in the world, a 6,000-pound, neon red Pegasus, the logo of Mobil Oil, now ExxonMobil, headquartered thirteen miles down the road), among others. Some of these skyscrapers were to be built in pairs, but even in Dallas developers can’t forestall a crash in real estate or a bust in oil. So the buildings stand, twinless ghosts on the skyline.

John Steinbeck on Texas: “Few people dare to inspect it for fear of losing their bearings in mystery or paradox.” Dallas, in particular, is a city that resists narrative.

by Edward McPherson, Paris Review |  Read more:
Photo: Uncredited

The Perfect Guide to Holiday Etiquette

Saturday, December 22, 2012

Collective Intelligence

Pretty much everything I'm doing now falls under the broad umbrella that I'd call collective intelligence. What does collective intelligence mean? It's important to realize that intelligence is not just something that happens inside individual brains. It also arises with groups of individuals. In fact, I'd define collective intelligence as groups of individuals acting collectively in ways that seem intelligent. By that definition, of course, collective intelligence has been around for a very long time. Families, companies, countries, and armies: those are all examples of groups of people working together in ways that at least sometimes seem intelligent.

It's also possible for groups of people to work together in ways that seem pretty stupid, and I think collective stupidity is just as possible as collective intelligence. Part of what I want to understand and part of what the people I'm working with want to understand is what are the conditions that lead to collective intelligence rather than collective stupidity. But in whatever form, either intelligence or stupidity, this collective behavior has existed for a long time.

What's new, though, is a new kind of collective intelligence enabled by the Internet. Think of Google, for instance, where millions of people all over the world create web pages, and link those web pages to each other. Then all that knowledge is harvested by the Google technology so that when you type a question in the Google search bar the answers you get often seem amazingly intelligent, at least by some definition of the word "intelligence."

Or think of Wikipedia, where thousands of people all over the world have collectively created a very large and amazingly high quality intellectual product with almost no centralized control. And by the way, without even being paid. I think these examples of things like Google and Wikipedia are not the end of the story. I think they're just barely the beginning of the story. We're likely to see lots more examples of Internet-enabled collective intelligence—and other kinds of collective intelligence as well—over the coming decades.

If we want to predict what's going to happen, especially if we want to be able to take advantage of what's going to happen, we need to understand those possibilities at a much deeper level than we do so far. That's really our goal in the MIT Center for Collective Intelligence, which I direct. In fact, one way we frame our core research question there is: How can people and computers be connected so that—collectively—they act more intelligently than any person, group or computer has ever done before? If you take that question seriously, the answers you get are often very different from the kinds of organizations and groups we know today.

We do take the question seriously, and we are doing a bunch of things related to that question. The first is just trying to map the different kinds of collective intelligence, the new kinds of collective intelligence that are happening all around us in the world today. One of the projects we’ve done is what we call “mapping the genomes of collective intelligence.” We’ve collected over 200 examples of interesting cases of collective intelligence … things like Google, Wikipedia, InnoCentive, the community that developed the Linux open source operating system, et cetera.

Then we looked for the design patterns that come up over and over in those different examples. Using the biological analogy, we call these design patterns “genes,” but if you don’t like the analogy or the metaphor, you can just use the word “design patterns.” We’ve identified so far about 19 of these design patterns—or genes—that occur over and over in these different examples.

by Thomas W. Malone, Edge |  Read more:
Photo: uncredited

What Does a Conductor Do?

I'm standing on a podium, with an enameled wand cocked between my fingers and sweat dampening the small of my back. Ranks of young musicians eye me skeptically. They know I don’t belong here, but they’re waiting for me to pretend I do. I raise my arm in the oppressive silence and let it drop. Miraculously, Mozart’s overture to Don Giovanni explodes in front of me, ragged but recognizable, violently thrilling. This feels like an anxiety dream, but it’s actually an attempt to answer a question that the great conductor Riccardo Muti asked on receiving an award last year: “What is it, really, I do?”

I have been wondering what, exactly, a conductor does since around 1980, when I led a JVC boom box in a phenomenal performance of Beethoven’s Seventh Symphony in my bedroom. I was bewitched by the music—the poignant plod of the second movement, the crazed gallop of the fourth—and fascinated by the sorcery. In college, I took a conducting course, presided over a few performances of my own compositions, and led the pit orchestra for a modern-dance program. Those crumbs of experience left me in awe of the constellation of skills and talents required of a conductor—and also made me somewhat skeptical that waving a stick creates a coherent interpretation.

Ever since big ensembles became the basis of orchestral music, about 200 years ago, doubt has dogged the guy on the podium. Audiences wonder whether he (or, increasingly, she) has any effect; players are sure they could do better; and even conductors occasionally feel superfluous. “I’m in a bastard profession, a dishonest profession,” agonized Dimitri Mitropoulos, who led the New York Philharmonic in the fifties. “The others make all the music, and I get the salary and the credit.” Call it the Maestro Paradox: The person responsible for the totality of sound produces none.

by Justin Davidson, New York Magazine | Read more:
Photo: Gjon Mili, Time Life Pictures/Getty Images

Friday, December 21, 2012

NRA Offensive Exposes Deep U.S. Divisions on Guns

[ed. Yes, let's turn our schools into prisons and have our children cower in fear of being murdered every day. I have a feeling this is the end of the NRA. I find it repugnant that the title of this article even uses the word "offensive", as if this were some kind of game or war maneuver. More like a terrorist group holding our culture hostage.]

Any chance for national unity on U.S. gun violence appeared to wane a week after the Connecticut school massacre, as the powerful NRA gun rights lobby called on Friday for armed guards in every school and gun-control advocates vehemently rejected the proposal.

The solution offered by the National Rifle Association defied a push by President Barack Obama for new gun laws, such as bans on high-capacity magazines and certain semiautomatic rifles.

At a hotel near the White House, NRA Chief Executive Wayne LaPierre said a debate among lawmakers would be long and ineffective, and that school children were better served by immediate action to send officers with firearms into schools.

LaPierre delivered an impassioned defense of the firearms that millions of Americans own, in a rare NRA news briefing after the Newtown, Connecticut, shooting in which a gunman killed his mother, and then 20 children and six adults at an elementary school.

"Why is the idea of a gun good when it's used to protect our president or our country or our police, but bad when it's used to protect our children in their schools?" LaPierre asked in comments twice interrupted by anti-NRA protesters whom guards forced from the room.

Speaking to about 200 reporters and editors but taking no questions, LaPierre dared politicians to oppose armed guards.

"Is the press and political class here in Washington so consumed by fear and hatred of the NRA and America's gun owners," he asked, "that you're willing to accept a world where real resistance to evil monsters is a lone, unarmed school principal?"

by David Ingram and Patricia Zengerle, Yahoo News | Read more:
Photo: REUTERS/Joshua Roberts

His Story: A Writer of Words and Music

When the poet, novelist, piano player and spoken-word recording artist Gil Scott-Heron died unexpectedly last May, at 62, he left behind a prickly and galvanizing body of work. His best songs — “The Revolution Will Not Be Televised,” “Whitey on the Moon,” “We Almost Lost Detroit” — are rarely heard on classic-rock radio; they’re too eccentric and polemical and might kill a workingman’s lunchtime buzz. But they’ll still stop you in your tracks.

Leave it to Scott-Heron to save some of his best for last. This posthumously published memoir, “The Last Holiday,” is an elegiac culmination to his musical and literary career. He’s a real writer, a word man, and it is as wriggling and vital in its way as Bob Dylan’s “Chronicles: Volume One.”

The Dylan comparison is worth picking up on for a moment. The critic Greil Marcus coined the phrase “the old, weird America” to refer to the influences that Mr. Dylan and the Band raked into their music on “The Basement Tapes.” In “The Last Holiday” Scott-Heron taps into the far side of that older and weirder America — that is, the fully African-American side. This memoir reads a bit like Langston Hughes filtered through the scratchy and electrified sensibilities of John Lee Hooker, Dick Gregory and Spike Lee.

For a relatively slim book, this one gets a lot of things said, not just about Scott-Heron’s own life but also about America in the second half of the 20th century. It encompasses Chicago, where he was born in 1949. There are sections in rural Tennessee, where he went to live with his grandmother after his Jamaican father abandoned the family to play professional soccer in Scotland. A few of these Tennessee passages are nearly as lovely as anything in James Agee’s prose poem “Knoxville: Summer of 1915.” Later Scott-Heron’s mother uprooted him to the Bronx.

This book is a warm memorial to the strong women in his life. One was his grandmother, who instilled in him a love of learning. The other was his mother, who came back into Scott-Heron’s life after his grandmother’s death. Both are electric presences in these pages.

In department stores in the 1950s, his grandmother refused to give up her place in line to whites. His mother fought for her son when he got into trouble for playing boogie-woogie music on a school Steinway, and when he was accidentally relegated to vocational classes. Administrators learned to fear and respect her. One said to the author: “Heron, your mother is a very impressive lady.”

Scott-Heron’s account of his school years evokes the entire arc of the African-American educational experience during the past century. He attended segregated schools in Tennessee before, bravely, in 1962, becoming one of the first blacks to desegregate a junior high school. Later, while living in the projects in the Chelsea section of Manhattan, he began attending, while in 10th grade, the prestigious Fieldston School in the Bronx on a full scholarship.

It was not an overwhelmingly positive experience. “I can never accuse the people of Fieldston, neither the students nor the faculty, of being racist,” he writes. “I can accuse the students of knowing each other for years and preferring to hang out with each other instead of some guy who just got there. I can accuse the teachers of having taught my classmates for 10 years and me for 10 minutes.”

This book is finally a testament to his unfettered drive as an artist. He left the historically black Lincoln College in Pennsylvania in 1968, after his freshman year, to write his first novel. That novel, a murder mystery called “The Vulture,” and a book of poems, “Small Talk at 125th and Lenox,” were quickly published. He began to record his songs soon after and his first album, also titled “Small Talk at 125th and Lenox,” was released in 1970.

In 1971 he drove down to Johns Hopkins University — he describes himself at the time as “about 6-foot-2 plus three inches of Afro” — and talked his way into the creative writing program. He got a master’s degree from Johns Hopkins in 1972 and taught writing until his musical career took off.

by Dwight Garner, NY Times |  Read more:
Photo: Mischa Richter


Only Rock and Roll

The minstrel, in his black-and-white domain, has had a poor time of it lately. The Black and White Minstrel Show, a stalwart of the BBC’s Saturday night schedule for twenty years, is talked of now almost exclusively in terms of moral wrongness, as if white men blacking up were merely an exercise in racial mockery and not the remnant of a cross-cultural theatrical form with intertwining roots in mid-nineteenth-century Southern society. Nevertheless, by the time of its final broadcast in 1978, the Black and White Minstrel Show was ready to fade away, not only on account of what seemed increasingly dubious representations in a country with a growing black population (though few among it came from the United States, where the minstrel show had flourished) but also because its burlesque manner had been superseded by an updated minstrelsy, which is more popular today than ever.

Rock and roll performers, like certain jazz crooners before them, dispensed with the stark device of blackface and depended instead on black voice. The American originator, obviously, was Elvis Presley, whose first single, “That’s All Right”, released in 1954, was a cover version of a song by the black singer Arthur “Big Boy” Crudup (he also recorded Crudup’s “My Baby Left Me” and “So Glad You’re Mine”). In Britain, lagging behind the US here as in most areas of innovation, the vocal twists and shouts associated with blues and gospel, the guitar pulses, the performers’ beseeching gestures, were most successfully absorbed by the young Rolling Stones. They would eventually steer their music into a style that can be called indigenous, with the African American just one among several ancestors; but in common with other groups that formed at about the same time and thrived in the light shed by the Stones’ success – the Yardbirds, the Pretty Things, the Animals, Manfred Mann, John Mayall’s Bluesbreakers, Peter Green’s Fleetwood Mac – the Stones started out trying to sound black, vocally and instrumentally. None of them had visited the US at the time of the group’s first releases, never mind the still-segregated South, and it seems probable they had met few black Americans. Besotted by both the blues and its frisky nephew, rhythm and blues, British groups isolated the music from its cultural and geographical territories, which oddly enough permitted it to prosper, freed from historic guilt and inverted deference. Mick Jagger, a distinctive singer with a number of transgressive extras in his gimmick bag – epicene ugly-beautiful looks for one thing, appealing delinquent attitude for another – became the Al Jolson of the teenybop era.  (...)

The sudden easy availability of long-playing records by the likes of Robert Johnson and Lightnin’ Hopkins opened a door into broader areas of black culture. A visit to the local library yielded a copy of Paul Oliver’s Blues Fell This Morning. From there, it was a short move to James Baldwin, whose crossover work The Fire Next Time was being read throughout the US just as “Johnny B. Goode” was appearing on Ready Steady Go!. The reader of Baldwin was unlikely to remain ignorant of Martin Luther King. It became possible, as it had not been before, for R&B enthusiasts from Dartford to Dundee to understand that when Muddy Waters sighed “Oh yeah-ah-ah-ah . . . . Ever’thing’s gonna be alright this morning”, he was not putting on the agony but making a statement about his life. The words of the Stones’ first No 1 hit, “It’s All Over Now” (1964), a country-and-western song by the black singer Bobby Womack, are bluntly literal and not susceptible to double meaning; but it was indeed all over for a certain way of attending to pop music, and, with that, of seeing the world. And the first pressing of the first Rolling Stones 45 rpm disc had much to do with it.

With Brian Jones in control of the balance, the early Stones were a cohesive R&B band with forward thrust and an attractive all-over rudeness in delivery (mistaken, at the time, for being “anti-establishment”). Keith Richards’s attack on “Carol” isn’t Chuck Berry’s, but it’s not bad for someone still in his teens. No matter how good they would become, though, they could never be the real thing. And no matter who they first thought they were singing to, they found their act overtaken by listeners for whom looks and image were as important as rhythm. Whereas the old-timers were on first-name terms with the blues (“Come on in now, Muddy, my husband’s just now left”; “Hey, Bo Diddley . . .” etc), the Stones – Jagger in particular – reserved their possessive feelings for the camera.

Both the Beatles and the Stones had a behind-the-scenes member in the form of the manager: Brian Epstein coddled the Beatles, while Andrew Loog Oldham shaped the anti-Beatles: scruffy cockneys (or mockneys, as Philip Norman says), where the Merseysiders were spruce; scowling, where they smiled; uncouth, ununiformed, apparently untameable. One group played before the Queen; the other appeared in front of the judges (Mick, Keith and Brian were all busted for drugs in 1967, and Brian again a year later; he was fired from the Stones some months before his death in July 1969). In The John Lennon Letters, Hunter Davies writes that John was forbidden to disclose in correspondence with fans that he was married to Cynthia Powell. The Stones, meanwhile, provided the rib for a gorgeous new being to mate with their own unearthly selves: the rock chick – Marianne Faithfull, Anita Pallenberg, Marsha Hunt. While Paul McCartney yearned for trouble-free yesterdays, Mick drawled suggestively through “Brown Sugar”, a song supposedly about his love for Hunt, though it could as easily have come from his liking for Blind Boy Fuller: “I got me a brownskin woman; / Tell me, Momma, where you get your sugar from” (“My Brownskin Sugar Plum”, 1935).

by James Campbell, TLS |  Read more:

“Brightening with each swipe of a workman’s cloth, stained glass in the Christian Science Mapparium in Boston, Massachusetts, shows political boundaries and coastlines charted after millennia of mapmaking.”

From “Revolution in Mapping,” February 1998, National Geographic magazine

New Facebook Policy: Dec. 20, 2012

[ed. So, if I understand this right, the filters FB uses to prevent spam and unwanted messages will work only if the sender doesn't bribe FB with a payment?]

If you see a message from someone you don't want to hear from in your Inbox, you can always select “Move to Other” or “Report Spam” from the Actions menu. You can also block people that you don’t want to hear from on Facebook.

Inbox delivery test

Facebook Messages is designed to get the most relevant messages into your Inbox and put less relevant messages into your Other folder. We rely on signals about the message to achieve this goal.

Some of these signals are social – we use social signals such as friend connections to determine whether a message is likely to be one you want to see in your Inbox.

Some of these signals are algorithmic – we use algorithms to identify spam and use broader signals from the social graph, such as friend of friend connections or people you may know, to help determine relevance.

Today we’re starting a small experiment to test the usefulness of economic signals to determine relevance. This test will give a small number of people the option to pay to have a message routed to the Inbox rather than the Other folder of a recipient that they are not connected with.

Several commentators and researchers have noted that imposing a financial cost on the sender may be the most effective way to discourage unwanted messages and facilitate delivery of messages that are relevant and useful.

This test is designed to address situations where neither social nor algorithmic signals are sufficient. For example, if you want to send a message to someone you heard speak at an event but are not friends with, or if you want to message someone about a job opportunity, you can use this feature to reach their Inbox. For the receiver, this test allows them to hear from people who have an important message to send them.

This message routing feature is only for personal messages between individuals in the U.S. In this test, the number of messages a person can have routed from their Other folder to their Inbox will be limited to a maximum of one per week.
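The routing logic described above — algorithmic spam detection, social connections, and the paid economic signal capped at one message per recipient per week — could be sketched roughly as follows. This is a minimal toy illustration, not Facebook's actual implementation; every name, threshold, and data structure here is invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Recipient:
    friends: set = field(default_factory=set)
    paid_deliveries_this_week: int = 0  # the test caps paid routing at one per week

def route_message(sender: str, recipient: Recipient,
                  spam_score: float, sender_paid: bool = False) -> str:
    """Return 'inbox' or 'other', combining the three signal types
    described in the announcement (all thresholds are hypothetical)."""
    # Algorithmic signal: likely spam never reaches the Inbox, paid or not.
    if spam_score > 0.9:
        return "other"
    # Social signal: messages from connected senders go to the Inbox.
    if sender in recipient.friends:
        return "inbox"
    # Economic signal: a non-connected sender may pay for Inbox delivery,
    # limited to one such routed message per recipient per week.
    if sender_paid and recipient.paid_deliveries_this_week < 1:
        recipient.paid_deliveries_this_week += 1
        return "inbox"
    return "other"
```

Note how the economic signal only matters when the social and algorithmic signals are inconclusive — exactly the gap the test is said to address.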

We’ll continue to iterate and evolve Facebook Messages over the coming months.

via Facebook |  Read more:

A Social Offender for Our Times

"What are you working on?" is academe's standard conversation starter, and for the past five years Geoffrey Nunberg has had a nonstandard response: "a book on assholes."

"You get giggles," says the linguist at the University of California at Berkeley, "or you get, 'You must have a lot of time on your hands'—the idea being that a word that vulgar and simple can't possibly be worth writing about." Scholars have tended to regard the book as a jeu d'esprit, not a serious undertaking. Their reaction intrigued Nunberg: "When people say a word is beneath consideration, it's a sign that there's a lot going on."

Aaron James can relate. He is a professor of philosophy at the University of California at Irvine and the author of Assholes: A Theory (Doubleday), which was published in late October, a few months after Nunberg's book, Ascent of the A-Word: Assholism, the First Sixty Years (PublicAffairs). James took up the project with some trepidation. "I felt a real sense of risk about writing something that might not appeal to my intellectual friends." (...)

"A Cornell man, a Deke, a perfect asshole." Thus did Norman Mailer introduce Lieutenant Dove, literature's first asshole, in the 1948 novel The Naked and the Dead. From the start, Nunberg writes, the word carried overtones of class. "Asshole" launched its "attack from the ground level, in the name of ordinary Joes, people whose moral authority derives not from their rank or breeding but their authenticity, which is exactly the thing that the asshole lacks."

So what is an asshole, exactly? How is he (and assholes are almost always men) distinct from other types of social malefactors? Are assholes born that way, or is their boorishness culturally conditioned? What explains the spike in the asshole population?

James was at the beach when he began mulling those questions. "I was watching one of the usual miscreants surf by on a wave and thought, Gosh, he's an asshole." Not an intellectual breakthrough, he concedes, but his reaction had what he calls "cognitive content." In other words, his statement was more than a mere expression of feeling. He started sketching a theory of assholes, refining his thinking at the Center for Advanced Study in the Behavioral Sciences at Stanford, where he spent a year as a fellow in 2009.

He consulted Rousseau (who, James notes, was something of an asshole himself on account of his shabby parenting skills), Hobbes (especially his views on the "Foole" who breaks the social contract), Kant (his notion of self-conceit in particular), and more-recent scholarship on psychopaths. He spoke with psychologists, lawyers, and anthropologists, all of whom suggested asshole reading lists. "There are a lot of similar characters studied in other disciplines, like the free rider or the amoralist or the cheater," James says, calling his time at Stanford an "interdisciplinary education in asshole theory."

James argues for a three-part definition of assholes that boils down to this: Assholes act out of a deep-rooted sense of entitlement, a habitual and persistent belief that they deserve special treatment. (Nunberg points out that use of the phrase "sense of entitlement" tracks the spread of "asshole"—both have spiked since the 1970s.) How to distinguish an asshole from a scumbag, a jerk, a prick, or a schmuck? Assholes are systematic. We all do assholeish things, but only an asshole feels fully justified in always acting like an asshole. As James puts it, "If one is special on one's birthday, the asshole's birthday comes every day."

by Evan R. Goldstein, Chronicle of Higher Education | Read more:
Photo: iStock

Intelligence Without Design

Most people have a hard time believing that existence appeared out of nowhere. So they turn to available worldviews that incorporate purpose and meaning, which are generally religious. This has resulted in a deep split between a strict materialist worldview and other worldviews that see some intelligence within existence. They justify their respective positions by reducing the issues to a polarity between religious spirituality and secular materialism. This conflict cuts to the foundation of how and why existence exists. At one extreme are "arguments from design"; at the other are materialist, evolutionary perspectives. (In the seeming middle are those who say God designed evolution, which is really just a more sophisticated argument from design.)

The vital questions are “purpose versus purposelessness” or “meaning versus meaninglessness.” Purposelessness is abhorrent to intelligent design believers; purpose is anathema to most of traditional science. The materialist scientists have no truck with what they consider the anthropomorphic indulgences that the “designers” exhibit. They focus on the absurdity of a transcendent God—an easy target. Intelligent “intelligent designers” rail against science’s narrow vision, its refusal to give any real explanation for the extraordinary confluence of statistically improbable events and finely tuned, coordinated configurations of exacting precision down to mathematically unimaginably small sub-atomic levels that allow this universe to be at all.

Design advocates point out that science can only prove design does not fit a very narrow scientific paradigm. Materialist scientists in turn say that without science as a baseline for truth and objectivity, any flight of fancy is possible. Obviously these two positions operate out of separate worldviews with different conceptions of “proof.” Scientific worldviews only address what is falsifiable and provable by the scientific method. But the scientific assumption that no intelligence is involved in the construction of the universe is likewise not provable by science.

Our approach to evolution cannot ultimately be proven in the ordinary sense of proof. This is because “proof” is always embedded in a worldview, and thus is always circular. So we speak to reasonableness and “most-likelihood.” After all, these are all one has to go on when the moorings of science prove insufficient to deal with life’s important questions and issues.

How this topic is viewed has vital consequences. Let’s first agree that evolution of some sort operates within the play of existence. The view we are putting forth features an intelligence without a designer or a specific design. We find that this perspective gives a better explanation of the evolutionary process, including where humanity finds itself at this historic and dangerous evolutionary moment. It is also a source of hope, offering a realistic possibility that we are facing an inevitable evolutionary challenge that can be met.

Does greater complexity in evolution entail improvement? Of course, the easy way out is to refuse to link evolutionary change with improvement, claiming that values are simply manufactured by human needs—evolution’s fancy way of giving humans a survival edge because it allows us the illusions of purpose and meaning that give impetus to social change when necessary.

Scientific reductionists believe that all explanations could be reduced to the laws of physics—if we knew enough. Both scientific reductionists and emergentists argue that because on a purely physical level the cosmos constructs increasing complexities that both replicate and evolve, life and then consciousness would necessarily likewise appear, given a proper environment to do so. In other words, because of the way evolution works, statistically speaking, given the right combination of chemicals in the right environment, life is bound to appear. And then, because consciousness has some significant evolutionary survival advantages, it, too, would come forth at some point. Seeing that life and consciousness have evolved, this argument in retrospect seeks an essentially mechanical explanation free from any hint of intention or purpose—because where would the purpose come from? Purpose would introduce a mystery that science assumes is unwarranted.

However, science typically does not inquire into where this vector toward complexity comes from and why different qualities emerge out of more complex configurations. Instead of addressing why a particular arrangement of chemicals in a particular “soup” brings forth life, science just observes and states that it does. The same is true for consciousness: it does seem to emerge from life at some point, although just why and even where it emerges are murky questions. Are amoebas conscious, or plants, or ants, or snakes, or apes? And then there is the self-reflecting consciousness of humans, which seems paired with an evolutionarily new linguistic ability. Did this, too, necessarily emerge simply because language gives social animals enormous facilities in cooperation, which in turn enables humans to out-compete other species?

The question of the emergence of something seemingly new out of something old lies at the heart of whether something besides purposeless and totally indifferent mechanisms are going on within the makeup of existence. In other words, is some kind of intelligence embedded in the very structure of existence itself that moves within the vectors of evolution to construct complexities, and even more, to bring about emergent qualities—including life and consciousness?

by Diana Alstad and Joel Kramer, Guernica |  Read more:
Photo: Hubble Space Telescope courtesy of NASA, ESA, STScI, J. Hester and P. Scowen (Arizona State University)