Saturday, May 12, 2012
Delta Dawn
How Sears, Roebuck & Co. midwifed the birth of the blues
Delta blues is as much legend as it is music. In the popular telling, blues articulated the hopelessness and poverty of an isolated, oppressed people through music that was disconnected from popular trends and technological advances. Delta blues giants like Robert Johnson were victims, buffeted by the winds of racism, singing out mostly for personal solace. The story is undoubtedly romantic, but it just isn’t true. “It angers me how scholars associate the blues strictly with tragedy,” B.B. King complained in his 1999 autobiography Blues All Around Me. “As a little kid, blues meant hope, excitement, pure emotion.”
The tragic image of the blues that originated in the Mississippi Delta ignores the competitive and entrepreneurial spirit of the bluesman himself. While it is certainly true that the music was forged in part by the legacy of slavery and the insults of Jim Crow, the iconic image of the lone bluesman traveling the road with a guitar strapped to his back is also a story about innovators seizing on expanded opportunities brought about by the commercial and technological advances of the early 1900s. There was no Delta blues before there were cheap, readily available steel-string guitars. And those guitars, which transformed American culture, were brought to the boondocks by Sears, Roebuck & Co.
Music has always been an instrument of upward mobility in the black community. During slavery, performers were afforded higher status than field workers. As the entertainment for plantation soirees, musicians were expected to be well versed in the social dance styles demanded by white audiences. But when performing in slave quarters, they played roughly the same repertoire. Former slaves’ narratives reveal that the slave musical ensemble closely resembled later minstrel-show string bands: fiddles and banjos, accompanied by various percussion instruments, usually the tambourine and two bones being struck together as claves. While the image of slaves dancing waltzes seems odd now, it was common in rural black communities well into the 20th century.
At the conclusion of the Civil War, freed black men were suddenly looking for employment. Musicianers, as they were called, could earn more money than the typical day laborer. With newfound freedom of movement, and cultural norms that had established entertainment as one of the few widely accepted jobs for blacks, Reconstruction became a time of great opportunity for black musicians. In an 1882 article in The Century Magazine, a white onlooker at a 19th-century Georgia corn shucking described the elite status of the musicianer like this: “The fiddler is the man of most importance. He always comes late, must have an extra share of whiskey, is the best-dressed man in the crowd, and unless every honor is shown him he will not play.”
The music played by these 19th-century musicians was not blues, and their plucked string instrument was not the guitar; it was the banjo. In 1781 Thomas Jefferson wrote about the instrument slaves played at his plantation, the banjar, “which they brought with them from the hinterlands of Africa.” These simple instruments usually had four strings and no frets.
It may seem odd that an instrument with African roots, originally played by plantation slaves, would become popular among the white masses, but the banjo was portable, melodic, and relatively easy to play. Banjo proselytizers, seeking to overcome anxieties about embracing a product of slave culture, would go so far in trying to whitewash the instrument’s ancestry as to claim that it had “reached its apogee through the contribution of whites” who had added frets and a fifth string to the original banjar.
A few early “classic blues” recordings featured the banjo, often fit with a guitar neck to provide a wider range. But these vaudevillian sides, cut by people like “Papa” Charlie Jackson, sound only distantly related to the Delta blues of Tommy Johnson or Skip James. The sound of the Delta is the sound of the steel-string guitar. The guitars of the 19th century used gut strings and were expensive and difficult to play. So despite having superior range and flexibility compared to banjos, guitars were still a rare sight in the black community. That all began to change in the 20th century.
by Chris Kjorness, Reason.com | Read more:
The Dinner Party (fiction)
On occasion, the two women went to lunch and she came home offended by some pettiness. And he would say, “Why do this to yourself?” He wanted to keep her from being hurt. He also wanted his wife and her friend to drift apart so that he never had to sit through another dinner party with the friend and her husband. But after a few months the rift would inevitably heal and the friendship return to good standing. He couldn’t blame her. They went back a long way and you get only so many old friends.
He leaped four hours ahead of himself. He ruminated on the evening in future retrospect and recalled every gesture, every word. He walked back to the kitchen and stood with a new drink in front of the fridge, out of the way. “I can’t do it,” he said.
“Can’t do what?”
The balls were up in the air: water slowly coming to a boil on the stove, meat seasoned on a plate sitting on the butcher block. She stood beside the sink dicing an onion. Other vegetables waited their turn on the counter, bright and doomed. She stopped cutting long enough to lift her arm to her eyes in a tragic pose. Then she resumed, more tearfully. She wasn’t drinking much of her wine.
“I can predict everything that will happen from the moment they arrive to the little kiss on the cheek goodbye and I just can’t goddam do it.”
“You could stick your tongue down her throat instead of the kiss goodbye,” she offered casually as she continued to dice. She was game, his wife. She spoke to him in bad taste freely and he considered it one of her best qualities. “But then that would surprise her, I guess, not you.”
“They come in,” he said, “we take their coats. Everyone talks in a big hurry as if we didn’t have four long hours ahead of us. We self-medicate with alcohol. A lot of things are discussed, different issues. Everyone laughs a lot, but later no one can say what exactly was so witty. Compliments on the food. A couple of monologues. Then they start to yawn, we start to yawn. They say, ‘We should think about leaving, huh?,’ and we politely look away, like they’ve just decided to take a crap on the dinner table. Everyone stands, one of us gets their coats, peppy goodbyes. We all say what a lovely evening, do it again soon, blah-blah-blah. And then they leave and we talk about them and they hit the streets and talk about us.”
“What would make you happy?” she asked.
“A blow job.”
“Let’s wait until they get here for that,” she said.
She slid her finger along the blade to free the clinging onion. He handed her her glass. “Drink your wine,” he said. She took a sip. He left the kitchen.
by Joshua Ferris, The New Yorker (August 2008) | Read more:
Photograph: Gilbert & George, "The Shadow of the Glass" (1972) Courtesy Lehmann Maupin Gallery and Sonnabend Gallery
What Your I.Q. Means
116+
17 percent of the world population; superior I.Q.; appropriate average for individuals in professional occupations
121+
10 percent; potentially gifted; average for college graduates
132+
2 percent; borderline genius; average I.Q. of most Ph.D. recipients
143+
1 percent; genius level; about average for Ph.D.'s in physics
158+
1 in 10,000; Nobel Prize winners
164+
1 in 30,000; Wolfgang Amadeus Mozart and the chess champion Bobby Fischer
via: Can You Make Yourself Smarter? (NY Times, April 2012)
Can You Call a 9-Year-Old a Psychopath?
By the time he turned 5, Michael had developed an uncanny ability to switch from full-blown anger to moments of pure rationality or calculated charm — a facility that Anne describes as deeply unsettling. “You never know when you’re going to see a proper emotion,” she said. She recalled one argument, over a homework assignment, when Michael shrieked and wept as she tried to reason with him. “I said: ‘Michael, remember the brainstorming we did yesterday? All you have to do is take your thoughts from that and turn them into sentences, and you’re done!’ He’s still screaming bloody murder, so I say, ‘Michael, I thought we brainstormed so we could avoid all this drama today.’ He stopped dead, in the middle of the screaming, turned to me and said in this flat, adult voice, ‘Well, you didn’t think that through very clearly then, did you?’ ” (...)
Over the last six years, Michael’s parents have taken him to eight different therapists and received a proliferating number of diagnoses. “We’ve had so many people tell us so many different things,” Anne said. “Oh, it’s A.D.D. — oh, it’s not. It’s depression — or it’s not. You could open the DSM and point to a random thing, and chances are he has elements of it. He’s got characteristics of O.C.D. He’s got characteristics of sensory-integration disorder. Nobody knows what the predominant feature is, in terms of treating him. Which is the frustrating part.”
Then last spring, the psychologist treating Michael referred his parents to Dan Waschbusch, a researcher at Florida International University. Following a battery of evaluations, Anne and Miguel were presented with another possible diagnosis: their son Michael might be a psychopath.
For the past 10 years, Waschbusch has been studying “callous-unemotional” children — those who exhibit a distinctive lack of affect, remorse or empathy — and who are considered at risk of becoming psychopaths as adults. To evaluate Michael, Waschbusch used a combination of psychological exams and teacher- and family-rating scales, including the Inventory of Callous-Unemotional Traits, the Child Psychopathy Scale and a modified version of the Antisocial Process Screening Device — all tools designed to measure the cold, predatory conduct most closely associated with adult psychopathy. (The terms “sociopath” and “psychopath” are essentially identical.) A research assistant interviewed Michael’s parents and teachers about his behavior at home and in school. When all the exams and reports were tabulated, Michael was almost two standard deviations outside the normal range for callous-unemotional behavior, which placed him on the severe end of the spectrum.
Currently, there is no standard test for psychopathy in children, but a growing number of psychologists believe that psychopathy, like autism, is a distinct neurological condition — one that can be identified in children as young as 5. Crucial to this diagnosis are callous-unemotional traits, which most researchers now believe distinguish “fledgling psychopaths” from children with ordinary conduct disorder, who are also impulsive and hard to control and exhibit hostile or violent behavior. According to some studies, roughly one-third of children with severe behavioral problems — like the aggressive disobedience that Michael displays — also test above normal on callous-unemotional traits. (Narcissism and impulsivity, which are part of the adult diagnostic criteria, are difficult to apply to children, who are narcissistic and impulsive by nature.) (...)
The idea that a young child could have psychopathic tendencies remains controversial among psychologists. Laurence Steinberg, a psychologist at Temple University, has argued that psychopathy, like other personality disorders, is almost impossible to diagnose accurately in children, or even in teenagers — both because their brains are still developing and because normal behavior at these ages can be misinterpreted as psychopathic. Others fear that even if such a diagnosis can be made accurately, the social cost of branding a young child a psychopath is simply too high. (The disorder has historically been considered untreatable.) John Edens, a clinical psychologist at Texas A&M University, has cautioned against spending money on research to identify children at risk of psychopathy. “This isn’t like autism, where the child and parents will find support,” Edens observes. “Even if accurate, it’s a ruinous diagnosis. No one is sympathetic to the mother of a psychopath.”
by Jennifer Kahn, NY Times | Read more:
Photo: Elinor Carucci/Redux
The Inquisition of Mr. Marvel
On the (surprisingly complicated) legacy of Stan Lee
Q: People ask, "Is Stan Lee still with Marvel Comics." Are you still with us?
STAN LEE: Sure! Especially on pay day!
— Marvel Age magazine interview, 1983
Almost all the main characters in Avengers — including Thor, the Hulk, superspy Nick Fury, and the movie's primary villain, the trickster-god Loki — were introduced between 1961 and 1964, in comics written and drawn by Lee and Kirby. During that same period — a generative streak basically unparalleled in American comics history before or since — they also introduced the X-Men and the Fantastic Four.
Officially, Lee wrote the books and Kirby drew them. Officially, Stan supplied the realism — his heroes had flaws, they argued among themselves, they were prone to colds and bouts of self-loathing, and sometimes they'd forget to pay the rent and face eviction from their futuristic high-rise HQs, which were in New York, not a made-up metropolis — while Kirby supplied the propulsion, filling the pages with visions of eternity and calamity, along with action sequences that basically invented the visual grammar of modern superhero comics. (...)
Over the years, Marvel changed hands, went bankrupt, reemerged, restructured. Stan stayed in the picture. Each time he renegotiated his deal with the company, he did so from a unique position — half elder god, half mascot. Administration after administration recognized that it was in their best interests PR-wise to keep him on the payroll. For years, he received 10 percent of all revenue generated by the exploitation of his characters on TV and in movies, along with a six-figure salary. This came out in 2002, when Lee sued Marvel, claiming they'd failed to pay him his percentage of the profits from the first Spider-Man movie, a development the Comics Journal compared to Colonel Sanders suing Kentucky Fried Chicken.
It's unclear if Stan still co-owns any of Marvel's characters, but the company continues to take care of him. When Disney (which, full disclosure, is also the parent company of ESPN, which owns the website you're now reading) bought Marvel for $4 billion in 2009, part of the deal involved a Disney subsidiary buying a small piece of POW! Entertainment, a content-farm company Stan co-founded; another Disney-affiliated company currently pays POW! $1.25 million a year to loan out Stan as a consultant "on the exploitation of the assets of Marvel Entertainment."
Jack Kirby, on the other hand, was a contractor. You could sink a continent in the amount of ink that's been spilled on the question of whether it was Stan's voice or Jack's visuals that ultimately made Marvel what it was, but it's hard to argue that any of this would have happened had Kirby been hit by a bus in 1960. Yet like most comics creators back then, he was paid by the page and retained no rights to any of the work he did for the company or the characters he helped create; by cashing his paychecks, he signed those rights over to the company. It took him decades just to persuade Marvel to give him back some of his original art, much of which was lost or given away or stolen in the meantime; there are horror stories about original Kirby pages being gifted to the water-delivery guy.
Kirby never sued Marvel, over the art or anything else. But as the years wore on he blasted the company in interviews. He blasted Lee, its avatar. Compared him to Sammy Glick. Referred to him as a mere "office worker" who'd grabbed credit from true idea men. "It wasn't possible for a man like Stan Lee to come up with new things — or old things, for that matter," Kirby told the Comics Journal in an infamous 1990 interview. "Stan Lee wasn't a guy that read or that told stories. Stan Lee was a guy that knew where the papers were or who was coming to visit that day."
And all this happened back when the comics industry only manufactured and sold comic books. Back when even the medium's most vocal champions wouldn't have dreamed of Marvel (which filed for Chapter 11 bankruptcy in 1996) being worth $4 billion to anybody.
by Alex Pappademas, Grantland | Read more:
Photo: Jerod Harris/WireImage
Friday, May 11, 2012
Friday Book Club - The Remains of the Day
[ed. I don't know why it's taken me so long to come to this, it's a masterpiece. I love Ishiguro's writing style - spare and elegant - never a misplaced word or phrase anywhere (see also, Never Let Me Go). An engaging story that transports the reader deep into British culture.]
Kazuo Ishiguro's third novel, "The Remains of the Day," is a dream of a book: a beguiling comedy of manners that evolves almost magically into a profound and heart-rending study of personality, class and culture. At the beginning, though, its narrator, an elderly English butler named Stevens, seems the least forthcoming (let alone enchanting) of companions. Cartoonishly punctilious and reserved, he edges slowly into an account of a brief motoring holiday from Oxfordshire to the West Country that he is taking alone at the insistence of his new employer, a genial American, Mr. Farraday.
The time is July 1956. Farraday has recently bought Darlington Hall near Oxford from the descendants of the last noble-born owner and has asked Stevens - a fixture there for nearly four decades - to relax a bit before implementing a much-reduced staff plan for the running of the house. Tense about his little holiday, Stevens hopes secretly to use it for professional advantage: to recruit the former housekeeper, the admirable Miss Kenton, who had years ago left service to marry, but who is now estranged from her husband and seems nostalgic for her old position.
In the early part of his story, the strait-laced Stevens plays perfectly the role of model butler as obliging narrator. Attentive to detail, solicitous of others, eager to serve, he primly sketches the history and current state of affairs at the great house and points out the agreeable features of the landscape as he moves slowly from Salisbury to Taunton, Tavistock and Little Compton in Cornwall. Much of this is dryly, deliciously funny, not so much because Stevens is witty or notably perceptive (he is neither) but because in his impassive formality he is so breathtakingly true to type, so very much the familiar product of the suppressive and now anachronistic social system that has produced him and to which he is so intensely loyal.
At different points in his subdued musings on the past, Stevens offers formulations of immemorial English attitudes that are likely to strike many contemporary readers as at once laughably parochial and quaintly endearing. Obsessed with notions of greatness, he proclaims that the English landscape is the most deeply satisfying in the world because of "the very lack of obvious drama or spectacle." As he puts it, "The sorts of sights offered in such places as Africa and America, though undoubtedly very exciting, would, I am sure, strike the objective viewer as inferior on account of their unseemly demonstrativeness."
Similarly, Stevens provides a long, solemn, yet unwittingly brilliant disquisition on the question of what makes a great butler, a topic that has provoked "much debate in our profession over the years" and continues to obsess him throughout his narrative. The key, he confidently insists, is dignity, which has to do with a butler's ability to "inhabit" his role "to the utmost."
"Lesser butlers," Stevens muses, "will abandon their professional being for the private one at the least provocation. For such persons, being a butler is like playing some pantomime role; a small push, a slight stumble, and the facade will drop off to reveal the actor underneath. The great butlers are great by virtue of their ability to inhabit their professional role and inhabit it to the utmost; they will not be shaken out by external events, however surprising, alarming or vexing. They wear their professionalism as a decent gentleman will wear his suit: he will not let ruffians or circumstance tear it off him in the public gaze; he will discard it when, and only when, he wills to do so, and this will invariably be when he is entirely alone. It is, as I say, a matter of 'dignity.'"
Mr. Ishiguro's command of Stevens' corseted idiom is masterly, and nowhere more tellingly so than in the way he controls the progressive revelation of unintended ironic meaning. Underneath what Stevens says, something else is being said, and the something else eventually turns out to be a moving series of chilly revelations of the butler's buried life - and, by implication, a powerful critique of the social machine in which he is a cog. As we move westward with Stevens in Farraday's vintage Ford, we learn more and more about the price he has paid in striving for his lofty ideal of professional greatness.

Among the problems Nabokov’s Lolita poses for the book designer, probably the thorniest is the popular misconception of the title character. She’s chronically miscast as a teenage sexpot—just witness the dozens of soft-core covers over the years. “We are talking about a novel which has child rape at its core,” says John Bertram, an architect and blogger who, three years ago, sponsored a Lolita cover competition asking designers to do better.
How Wall Street Killed Financial Reform
Two years ago, when he signed the Dodd-Frank Wall Street Reform and Consumer Protection Act, President Barack Obama bragged that he'd dealt a crushing blow to the extravagant financial corruption that had caused the global economic crash in 2008. "These reforms represent the strongest consumer financial protections in history," the president told an adoring crowd in downtown D.C. on July 21st, 2010. "In history."
This was supposed to be the big one. At 2,300 pages, the new law ostensibly rewrote the rules for Wall Street. It was going to put an end to predatory lending in the mortgage markets, crack down on hidden fees and penalties in credit contracts, and create a powerful new Consumer Financial Protection Bureau to safeguard ordinary consumers. Big banks would be banned from gambling with taxpayer money, and a new set of rules would limit speculators from making the kind of crazy-ass bets that cause wild spikes in the price of food and energy. There would be no more AIGs, and the world would never again face a financial apocalypse when a bank like Lehman Brothers went bankrupt.
Most importantly, even if any of that fiendish crap ever did happen again, Dodd-Frank guaranteed we wouldn't be expected to pay for it. "The American people will never again be asked to foot the bill for Wall Street's mistakes," Obama promised. "There will be no more taxpayer-funded bailouts. Period."
Two years later, Dodd-Frank is groaning on its deathbed. The giant reform bill turned out to be like the fish reeled in by Hemingway's Old Man – no sooner caught than set upon by sharks that strip it to nothing long before it ever reaches the shore. In a furious below-the-radar effort at gutting the law – roundly despised by Washington's Wall Street paymasters – a troop of water-carrying Eric Cantor Republicans are speeding nine separate bills through the House, all designed to roll back the few genuinely toothy portions left in Dodd-Frank. With the Quislingian covert assistance of Democrats, both in Congress and in the White House, those bills could pass through the House and the Senate with little or no debate, with simple floor votes – by a process usually reserved for things like the renaming of post offices or a nonbinding resolution celebrating Amelia Earhart's birthday.
The fate of Dodd-Frank over the past two years is an object lesson in the government's inability to institute even the simplest and most obvious reforms, especially if those reforms happen to clash with powerful financial interests. From the moment it was signed into law, lobbyists and lawyers have fought regulators over every line in the rulemaking process. Congressmen and presidents may be able to get a law passed once in a while – but they can no longer make sure it stays passed. You win the modern financial-regulation game by filing the most motions, attending the most hearings, giving the most money to the most politicians and, above all, by keeping at it, day after day, year after fiscal year, until stealing is legal again. "It's like a scorched-earth policy," says Michael Greenberger, a former regulator who was heavily involved with the drafting of Dodd-Frank. "It requires constant combat. And it never, ever ends."
That the banks have just about succeeded in strangling Dodd-Frank is probably not news to most Americans – it's how they succeeded that's the scary part. The banks followed a five-point strategy that offers a dependable blueprint for defeating any regulation – and for guaranteeing that when it comes to the economy, might will always equal right.
by Matt Taibbi, Rolling Stone | Read more:
Photo: Rod Lamkey Jr./AFP Getty Images
The Secret's In The Pronouns
"Function words are essentially the filler words," Pennebaker says. "These are the words that we don't pay attention to, and they're the ones that are so interesting."
According to the way that Pennebaker organizes language, the words that we more often focus on in conversation are content words, words like "school," "family," "live," "friends" — words that conjure a specific image and relay more of the substance of what is being discussed.
"I speak bad Spanish," Pennebaker explains, "and if I'm in a conversation where I'm listening to the other person speak, I am just trying to find out what they are talking about. I am listening to 'what, where, when' — those big content heavy words. All those little words in between, I don't listen to those because they're too complex to listen to."
In fact, says Pennebaker, even in our native language, these function words are basically invisible to us.
"You can't hear them," Pennebaker says. "Humans just aren't able to do it."
But computers can, which is why two decades ago Pennebaker and his graduate students sat down to build themselves a computer program.
The Linguistic Inquiry and Word Count program that Pennebaker and his students built in the early 1990s has — like any computer program — an ability to peer into massive data sets and discern patterns that no human could ever hope to match.
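[ed. LIWC itself is proprietary software, but the basic mechanic described here, tallying how often function words occur in a text, is easy to sketch. The short Python illustration below is hypothetical: the word list is a tiny subset invented for demonstration, not LIWC's actual dictionary. Comparing such profiles between two speakers is, in spirit, what the analyses described below do.]

from collections import Counter
import re

# A tiny, illustrative subset of function words (pronouns, articles,
# prepositions, conjunctions). LIWC's real dictionaries are far larger.
FUNCTION_WORDS = {
    "i", "you", "we", "he", "she", "they", "it",
    "a", "an", "the",
    "in", "on", "of", "to", "with", "about",
    "and", "but", "or",
}

def function_word_profile(text):
    """Return each function word's share of all words in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w in FUNCTION_WORDS)
    total = len(words)
    return {w: counts[w] / total for w in sorted(counts)} if total else {}

print(function_word_profile("I thought we could talk about it over dinner."))
# {'about': 0.111..., 'i': 0.111..., 'it': 0.111..., 'we': 0.111...}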
And so after Pennebaker and his crew built the program, they used it to ask all kinds of questions that had previously been too complicated or difficult for humans to ask.
Some of those questions included:
- Could you tell if someone was lying by carefully analyzing the way they used function words?
- Looking only at a transcript, could you tell from function words whether someone was male or female, rich or poor?
- What could you tell about relationships by looking at the way two people spoke to each other?
See, one of the things that Pennebaker did was record and transcribe conversations that took place between people on speed dates. He fed these conversations into his program along with information about how the people themselves were perceiving the dates. What he found surprised him.
"We can predict by analyzing their language, who will go on a date — who will match — at rates better than the people themselves," he says.
by Alix Spiegel, NPR | Read more:
Illustration: iStockphoto.com
A Fish Story
How an angler and two government bureaucrats may have saved the Atlantic Ocean
Price is a lifelong striped bass fisherman with no formal training as a scientist. Yet he has spent the last four decades cutting open bass stomachs in a kind of renegade ecological study, charting the precipitous decline of the lowly menhaden. Price’s interest in the species is indirect; menhaden aren’t prized by anglers. But they are prized by striped bass. The little fish has historically been the striper’s most significant source of protein and calories. In fact, menhaden are a staple in the diets of dozens of marine predators in the Atlantic and its estuaries, from osprey to bluefish to dolphin to blue crab. In a host of undersea food chains, menhaden—also known as pogy and bunker—are a common denominator. They have been called the most important fish in the sea.
Price began his study years ago when it became increasingly evident to him that the striped bass in the Chesapeake were quite literally starving. And so, at least once a week he dissects bass to see whether the fish ate recently before they died. He squeezes spleens to determine if the fish had mycobacteriosis, a serious infection related to malnutrition that affects more than 60 percent of the striped bass in the Chesapeake Bay. He relays his findings in a numerical code of his own devising. “Body fat is a ten, ovaries a two, spleen is okay, empty stomach,” he says gruffly, while his wife, Henrietta, dutifully transcribes his thoughts into a ledger. Four times out of fifty, he pulls a whole menhaden from a bass belly, weighing each one with a small scale.
Local sport fishermen are happy to help Price by leaving him the bones and innards of their catch, because his work confirms what anglers up and down the Atlantic coast know from direct experience: the menhaden are disappearing.
Like any good mystery, this one has a prime suspect. Across the Chesapeake and about sixty miles to the south of where Price stands, a seaside factory hums and buzzes, filling the small town of Reedville, Virginia, with the putrid smell of menhaden chum. The looming smokestacks, warehouses, and pretty much everything else on Reedville’s Menhaden Road are owned by Omega Protein, a publicly traded company headquartered in Houston with a long and storied history of industrial fishing in Atlantic waters.
The operation is high-tech. Spotter planes take off from Reedville’s tiny airstrip to circle swathes of ocean, looking for the telltale shadow of menhaden moving by the million just below the surface. Pilots radio Omega Protein’s fleet of nine refurbished World War II transport ships, one of which dispatches two smaller boats that surround the school with a giant net called a purse seine, drawing the fish tightly together using the mechanics of a drawstring sack, until all the members of the school can be sucked out of the ocean with a vacuum pump. The boats can “set” the net twelve to fifteen times a day; a vessel will return to port with millions of menhaden aboard.
Harvested by the billions and then processed into various industrial products, menhaden are extruded into feed pellets that make up the staple food product for a booming global aquaculture market, diluted into oil for omega-3 health supplements, and sold in various meals and liquids to companies that make pet food, livestock feed, fertilizer, and cosmetics. We have all consumed menhaden one way or another. Pound for pound, more menhaden are pulled from the sea than any other fish species in the continental United States, and 80 percent of the menhaden netted from the Atlantic are the property of a single company. (...)
Menhaden were once so plentiful in the Atlantic that early pioneers described them as swimming in schools twenty-five miles long or more, packing themselves into bays and estuaries where they came to feed on dense schools of phytoplankton (algae and vegetable matter). Rutgers professor H. Bruce Franklin uncovered a trove of early accounts of menhaden for his book, The Most Important Fish in the Sea, like one from John Smith, who in 1608 encountered menhaden in the Chesapeake “lying so thick with their heads above the water, as for want of nets we attempted to catch them with a frying pan.”
by Alison Fairbrother, Washington Monthly | Read more:
Thursday, May 10, 2012
How to cook the perfect spaghetti carbonara
Garlic or onion, pecorino or parmesan, bacon or ham, cream or butter – how do you like your carbonara, and what's the secret to getting that perfect consistency?
A dish whose principal ingredients are eggs and bacon was always going to be a shoo-in for the British palate: certainly spaghetti carbonara was a regular in my dad's repertoire when pesto was only a glint in a supermarket buyer's eye. As with so many Italian foodstuffs, it has a disputed history, although most people accept that carbonara probably originated in, or near, Rome.
It's apparently named after the carbonai, or charcoal burners, allegedly because it was a favourite of these grimy men who spent months deep in the Apennines, relying on foodstuffs that could be easily transported, stored and then prepared over a fire. Sophia Loren claims to have happened upon a group of these lucky fellows while filming Two Women in the mountains in the late fifties – who obligingly cooked her a slap-up carbonara lunch.
Loth as I'd be to contradict the legendary Loren, there are people who believe that the whole carbonaio thing is simply a romantic legend, suggesting instead that the dish was created by local cooks for American GIs who took their rations of bacon and eggs to them to prepare over streetside charcoal braziers. More mature Romans dispute this, however, claiming they remember enjoying carbonara while said GIs were still eating milk and cookies at their mothers' knees.
Most plausible of all, in my opinion, is the theory that the name simply refers to the copious amounts of black pepper customarily added to the dish: so much, in fact, that it's almost as if it's been seasoned with charcoal. It's one of those things which people will no doubt still be squabbling over as the earth implodes: far more important, in my opinion, is working out how to make a really good one. Which is where I come in.
by Felicity Cloake, The Guardian | Read more:
Photograph: Felicity Cloake