Sunday, May 13, 2012

Paralysis of the Heart

[ed. Happy Mother's Day to all the mothers everywhere.]

I was driving my 11-year-old son, Joe, to school. It had been one of those mornings. He was singing opera and doing hip-hop moves when I needed him to put on his shoes.

As we pulled up in front of school just in time, I snapped: “I can’t start our day this way. This kind of stress is going to make me sick.”

He burst into tears. “Don’t say that!” he yelled. “Promise to never say that again!” He raced out of the car, wiping his eyes with the back of his hand.

On more than a few occasions, he has expressed his fear that something might happen to me. As the child of a single mother, he clearly has been pondering the same questions I do: Who will take care of him if I die? Who will love him as much as I do?

Joe’s fear of my mortality jarred me into reality, and I called my doctor. There actually had been a reason for my harsh statement. My face and arm had been numb for months. I had shrugged it off as stress but then started to get chronic headaches, too.

My doctor agreed to see me right away. After examining me, she said, “If I can’t get you in for an M.R.I. at the imaging center, I’ll need to send you to the hospital in an ambulance.” She explained that stress doesn’t create the symptoms I was having. It could be an aneurysm, a tumor or early signs of multiple sclerosis.

Someone else might have panicked, but this kind of situation makes me practical. She got me an appointment for an hour later. In that time, I did what any sensible person who has been ordered to get an emergency M.R.I. does: I got the car washed. I wasn’t in denial; there’s just so much time to get stuff done, and worrying wasn’t on my checklist.

by Michelle Fiordaliso, NY Times |  Read more:
Illustration: Brian Rea

The Confidence Game

Ida M. Tarbell, a writer for McClure’s Magazine, a general-interest monthly, was chatting with her good friend and editor, John S. Phillips, in the magazine’s offices near New York’s Madison Square Park, trying to decide what she should take on next.

Tarbell, then forty-three years old, was already one of the most prominent journalists in America, having written popular multipart historical sketches of Napoleon, Lincoln, and a French revolutionary figure known as Madame Roland, a moderate republican guillotined during the Terror. Thanks in part to her work, McClure’s circulation had jumped to about 400,000, making it one of the most popular, and profitable, publications in the country.

Phillips, a founder of the magazine, was its backbone. Presiding over an office of bohemians and intellectuals, this father of five was as calm and deliberative as the magazine’s namesake, S. S. McClure, was manic and extravagant. Considered by many to be a genius, McClure was also just an impossible boss—forever steaming in from Europe, throwing the office into turmoil with new schemes, ideas, and editorial changes. “I can’t sit still,” he once told Lincoln Steffens. “That’s your job and I don’t see how you can do it!”

At McClure’s, there was always, as Tarbell would later put it, much “fingering” of a subject before the magazine decided to launch on a story, and in this case there was more than usual. The subject being kicked around was nothing less than the great industrial monopolies, known as “trusts,” that had come to dominate the American economy and political life. It was the summer of 1901.  (...)

Ah, old media. Good times. Savin’ the worl’. Remember when a single investigative reporter with the temerity to demand a decent living (McClure’s paid more than $1 million for the stories in today’s dollars) could pull the curtain back on one of the most powerful and secretive organizations on the face of the earth, a great lawbreaker as well as a value-creator? Tarbell is credited with triggering the great antitrust case that finally broke up the “octopus” in 1911. But her true greatness lies in how, using a mountain of facts carefully gathered and presented, she could explain to a bewildered and anxious middle class the great economic question of her age.

McClure’s had planned a three-part series, but, as copies flew off the newsstands, it soon became seven parts, then twelve, then a national sensation. New installments became news events in themselves, covered by other papers, including the fledgling Wall Street Journal. “The History of the Standard Oil Company” ended up as a nineteen-part series, quickly turned into a two-volume book. A cartoon in Puck would depict a pantheon of muckrakers with Tarbell as a Joan of Arc figure on horseback. Another contemporary magazine pronounced her “the most popular woman in America.”

No one reading this magazine needs to be told that we have crossed over into a new era. Industrial-age journalism has failed, we are told, and even if it hasn’t failed, it is over. Newspaper company stocks are trading for less than $1 a share. Great newsrooms have been cut down like so many sheaves of wheat. Where quasi-monopolies once reigned over whole metropolitan areas, we have conversation and communities, but also chaos and confusion.

A vanguard of journalism thinkers steps forward to explain things, and we should be grateful that they are here. If they weren’t, we’d have to invent them. Someone has to help us figure this out. Most prominent are Jeff Jarvis, Clay Shirky, and Jay Rosen, whose ideas we’ll focus on here, along with Dan Gillmor, John Paton, and others. Together their ideas form what I will call the future-of-news (FON) consensus.

According to this consensus, the future points toward a network-driven system of journalism in which news organizations will play a decreasingly important role. News won’t be collected and delivered in the traditional sense. It will be assembled, shared, and to an increasing degree, even gathered, by a sophisticated readership, one that is so active that the word “readership” will no longer apply. Let’s call it a user-ship or, better, a community. This is an interconnected world in which boundaries between storyteller and audience dissolve into a conversation between equal parties, the implication being that the conversation between reporter and reader was a hierarchical relationship, as opposed to, say, a simple division of labor.

At its heart, the FON consensus is anti-institutional. It believes that old institutions must wither to make way for the networked future. “The hallmark of revolution is that the goals of the revolutionaries cannot be contained by the institutional structure of the existing society,” Shirky wrote in Here Comes Everybody, his 2008 popularization of network theory. “As a result, either the revolutionaries are put down, or some of those institutions are altered, replaced or destroyed.” If this vision of the future does not square with your particular news preferences, well, as they might say on Twitter, #youmaybeSOL.

by Dean Starkman, Columbia Journalism Review |  Read more:
Photo: Wikipedia

The New Yorker Cover Department's Greatest Rejects


Françoise Mouly, The New Yorker’s art editor since 1993, doesn’t have normal relationships with the artists who draw the magazine's covers. “Think of me as your priest,” she told one of them. Mouly, who cofounded the avant-garde comics anthology RAW with her husband, Art Spiegelman, asks the artists she works with—Barry Blitt, Christoph Niemann, Ana Juan, R. Crumb—not to hold back anything in their cover sketches. If that means the occasional pedophilia gag or Holocaust joke finds its way to her desk, she's fine with that. Tasteless humor and failed setups are an essential part of the process. “Sometimes something is too provocative or too sexist or too racist,” Mouly says, “but it will inspire a line of thinking that will help develop an image that is publishable.”

Until recently, you would have had to visit Mouly's office on the 20th floor of the Condé Nast building to see the rejected covers she keeps pinned to a wall. Now, some of those uninhibited outtakes have been collected in a new book, Blown Covers: New Yorker Covers You Were Never Meant to See ($24.95, 128 pages), out today from Abrams. I talked to Mouly about the most incendiary sketches, the difficulty of publishing serious covers over Christmas, and why she heartily recommends listening to Rush Limbaugh.

What’s the process of deciding on a cover every week?

I’ve been the art editor for about 19 years, so I’ve been responsible for about 950 different published covers, and the process has been different for each one. But the general outline is that I set up a lineup every season of evergreen covers. So right now I’m talking to artists, soliciting ideas for Mother’s Day or spring or wedding or graduation.

And then there are timely political images or things that seem like the right idea at the right time—it can be a tsunami in Japan, but it can also simply be something that defines a time. Right now, one of the things I’m talking to artists about is the Republicans’ war on women. There’s not a specific moment for this, but it’s a subtext that’s in the air. Recently we did an image around the Republican primaries that involved a dog on top of a car, and that certainly was timely.

When we have something like that, then we are poised to upset the apple cart, and that can be turned around in as little as 24 hours. I’m in a constant conversation since I’m not commissioning or assigning any specific ideas. I’m not calling up artists and saying, “We need you to illustrate the war on women,” or whatever. We seldom have illustrations of cover stories on our covers. So we are dependent. What I’m really looking for are ideas that come from the artists on topics that will give us a sign of the era that we live in and, as a collection of images, will collect a picture of our time.

by Michael Silverberg, Imprint |  Read more:
Illustration: Art Spiegelman — The New Yorker — May 10, 1993

Saturday, May 12, 2012


Kelly Reemtsen, short leash.  laartdiary.com
via:

Empire of the Bun

Here’s the story Adam Fleischman likes to tell about the genesis of his Umami restaurant empire: Hunched over a ketchup-red plastic cafeteria tray at the Culver City In-N-Out Burger, Fleischman, a 35-year-old wine entrepreneur, peers into a cardboard box flecked with french fry grease. He ponders the questions that bedevil future restaurant moguls: Why do Americans hunger for pizza and hamburgers more than any other dishes? And why, exactly, is the In-N-Out Double-Double he’s devouring his most beloved indulgence, not to mention one of Southern California’s premier sources of bragging rights?

Somewhere between bites of the dripping cheeseburger, a word comes to mind that afternoon in 2005. It’s one Fleischman has been encountering often, on select food blogs and in books by the pioneering British chef Heston Blumenthal. That word is umami. The Japanese chemist Kikunae Ikeda came up with it in 1908 to describe a flavor that’s at the root of Japanese cooking, present in staples like fermented soy, seaweed, and the funky dried-fish broth dashi. Americans experience umami in different ways. It’s one reason they crave bacon. It’s why Italian grandmothers sneak anchovies into everything, and why something that smells like an old gym sock can taste like heaven. The professional food world has embraced the umami flavor as a unique fifth taste distinct from the sensations of sweet, sour, salt, and bitter.

Fleischman concludes that what he loves about an In-N-Out Double-Double isn’t the fresh ingredients or the to-order preparation. A hamburger, he realizes, is America’s preferred umami intake device. It is also consumed by the billions each year. “That was the aha moment for me,” says Fleischman. “I saw Umami’s financial potential right away.”  (...)

After his umami revelation, Fleischman didn’t go on a fact-finding mission to Japan. (He still hasn’t, although he considers his dining experiences at Beverly Hills’ $325-a-head sushi restaurant Urasawa close enough.) Instead he dissolved his BottleRock partnership the following year and worked his way around town, consulting on a variety of other businesses. When similar dustups arose with partners at a second wine bar, Vinoteque, he’d had it. “I said, ‘OK, the next time there won’t be any partners. I’m going to do this all by myself.’  ”

In 2009, with $40,000 in his pocket from selling his stake in BottleRock, Fleischman decided to open a restaurant centered on the umami flavor. He knew that an umami-focused menu would attract a burgeoning breed of foodies who had been weaned on the Food Network and had developed a sort of teenybopper crush on the heady flavors of pork, organ meats, West Coast IPAs, and superripe cheeses. What his place would serve remained up in the air. As it happened, he settled on burgers.

Fleischman didn’t have a business degree or much experience in the food industry aside from helping his mother with her catering business as a kid. He certainly didn’t have any professional chef’s training, and his familiarity with hamburgers was limited to flipping a few in his backyard. But he did have a devout faith in his palate and a mean perfectionist streak that borders on the tyrannical.

On a late summer day he stepped into his kitchen armed with a bundle of Japanese ingredients he’d scooped up at Mitsuwa Marketplace in West L.A. He began to experiment with recipes, incorporating dashi, miso, fish sauce, and soy. He ground up fish heads and sprinkled them on top of ground beef and pork. He tried making Parmesan fondue and melting it over the patty. “It was a mess,” says Fleischman. Regardless, as a passionate, intellectually minded greenhorn, Fleischman—so he claims—created his masterpiece in a single day.

With that first burger-shaped umami bomb, Fleischman launched a brand that has not only changed the culinary landscape of L.A. but has turned its founder into a food industry powerhouse arguably as influential as Nancy Silverton or Wolfgang Puck. Since its debut in a former Korean taco stand on La Brea Avenue, Umami Burger has expanded into a multimillion-dollar restaurant group with financial backing from hospitality giant SBE. At present there are seven Umamis across L.A., one in San Francisco, and at least a dozen more in development nationwide. The Umami Group’s Neapolitan pizza place, 800 Degrees, recently opened in Westwood Village and continues to draw lines out the door. The newest addition is downtown’s 8,000-square-foot UmamIcatessen, which houses five food and beverage concepts. In the works are a scaled-down fast-food burger chain called Umami Ko and a line of Umami-brand condiments. The company also retains a controlling share of chef Jordan Kahn’s upscale Beverly Hills Vietnamese restaurant, Red Medicine. Umami Burger, however, remains the foundation of Fleischman’s realm.

The signature Umami burger isn’t some towering, sloppy menace that’s as impossible to grasp as it is to bite. It’s compact, almost cute, with a reasonable six-ounce patty served on an eggy, Portuguese-style bun that Fleischman sources from a top-secret local bakery. “The burger-to-bun ratio is key,” he says, “but it’s amazing that nobody ever gets that right.” Once cooked to the lowest, pinkest edge of medium rare, the meat is seasoned with the now-patented Umami Sauce and Umami Dust. “We don’t use MSG,” says Fleischman, despite many accusations to the contrary. The full recipe is classified, but he will allow that the sauce contains some soy sauce and the dust, some ground-up dried porcini mushrooms and dried fish heads, among other umami enhancers. Toppings include known umami heavy hitters such as oven-roasted tomatoes, shiitake mushrooms, caramelized onions, and a crisp Parmesan wafer. “Parmesan,” says Fleischman, “has the second-highest umami levels of any ingredient, and it has the most of any cheese.”

by Lesley Bargar Suter, Los Angeles Magazine |  Read more:
Photograph by Misha Gravenor

Joni Mitchell


[ed. Joni Mitchell would be a national treasure in the U.S. if she weren't from Canada. Instead, she's an international treasure. With a voice as accomplished as any jazz singer's, she not only sings but composes, arranges and plays guitar (in a variety of complex tunings). A true genius talent. Here she is with a jazz supergroup composed of Pat Metheny, Lyle Mays, Jaco Pastorius, Randy Brecker and Don Alias.]

Lyrics

Bonus: Harry's House

Delta Dawn

How Sears, Roebuck & Co. midwifed the birth of the blues

Delta blues is as much legend as it is music. In the popular telling, blues articulated the hopelessness and poverty of an isolated, oppressed people through music that was disconnected from popular trends and technological advances. Delta blues giants like Robert Johnson were victims, buffeted by the winds of racism, singing out mostly for personal solace. The story is undoubtedly romantic, but it just isn’t true. “It angers me how scholars associate the blues strictly with tragedy,” B.B. King complained in his 1999 autobiography Blues All Around Me. “As a little kid, blues meant hope, excitement, pure emotion.” 

The tragic image of the blues that originated in the Mississippi Delta ignores the competitive and entrepreneurial spirit of the bluesman himself. While it is certainly true that the music was forged in part by the legacy of slavery and the insults of Jim Crow, the iconic image of the lone bluesman traveling the road with a guitar strapped to his back is also a story about innovators seizing on expanded opportunities brought about by the commercial and technological advances of the early 1900s. There was no Delta blues before there were cheap, readily available steel-string guitars. And those guitars, which transformed American culture, were brought to the boondocks by Sears, Roebuck & Co.

Music has always been an instrument of upward mobility in the black community. During slavery, performers were afforded higher status than field workers. As the entertainment for plantation soirees, musicians were expected to be well versed in the social dance styles demanded by white audiences. But when performing in slave quarters, they played roughly the same repertoire. Former slaves’ narratives reveal that the slave musical ensemble closely resembled later minstrel-show string bands: fiddles and banjos, accompanied by various percussion instruments, usually the tambourine and two bones being struck together as claves. While the image of slaves dancing waltzes seems odd now, it was common in rural black communities well into the 20th century.

At the conclusion of the Civil War, freed black men were suddenly looking for employment. Musicianers, as they were called, could earn more money than the typical day laborer. With newfound freedom of movement, and cultural norms that had established entertainment as one of the few widely accepted jobs for blacks, Reconstruction became a time of great opportunity for black musicians. In an 1882 article in The Century Magazine a white onlooker at a 19th-century Georgia corn shucking described the elite status of the musicianer like this: “The fiddler is the man of most importance. He always comes late, must have an extra share of whiskey, is the best-dressed man in the crowd, and unless every honor is shown him he will not play.”

The music played by these 19th-century musicians was not blues, and their plucked string instrument was not the guitar; it was the banjo. In 1781 Thomas Jefferson wrote about the instrument slaves played at his plantation, the banjar, “which they brought with them from the hinterlands of Africa.” These simple instruments usually had four strings and no frets.

It may seem odd that an instrument with African roots, originally played by plantation slaves, would become popular among the white masses, but the banjo was portable, melodic, and relatively easy to play. Banjo proselytizers, seeking to overcome anxieties about embracing a product of slave culture, would go so far in trying to whitewash the instrument’s ancestry as to claim that it had “reached its apogee through the contribution of whites” who had added frets and a fifth string to the original banjar.

A few early “classic blues” recordings featured the banjo, often fit with a guitar neck to provide a wider range. But these vaudevillian sides, cut by people like “Papa” Charlie Jackson, sound only distantly related to the Delta blues of Tommy Johnson or Skip James. The sound of the Delta is the sound of the steel-string guitar. The guitars of the 19th century used gut strings and were expensive and difficult to play. So despite having superior range and flexibility compared to banjos, guitars were still a rare sight in the black community. That all began to change in the 20th century.

by Chris Kjorness, Reason.com |  Read more:

The Dinner Party (fiction)

On occasion, the two women went to lunch and she came home offended by some pettiness. And he would say, “Why do this to yourself?” He wanted to keep her from being hurt. He also wanted his wife and her friend to drift apart so that he never had to sit through another dinner party with the friend and her husband. But after a few months the rift would inevitably heal and the friendship return to good standing. He couldn’t blame her. They went back a long way and you get only so many old friends.

He leaped four hours ahead of himself. He ruminated on the evening in future retrospect and recalled every gesture, every word. He walked back to the kitchen and stood with a new drink in front of the fridge, out of the way. “I can’t do it,” he said.

“Can’t do what?”

The balls were up in the air: water slowly coming to a boil on the stove, meat seasoned on a plate sitting on the butcher block. She stood beside the sink dicing an onion. Other vegetables waited their turn on the counter, bright and doomed. She stopped cutting long enough to lift her arm to her eyes in a tragic pose. Then she resumed, more tearfully. She wasn’t drinking much of her wine.

“I can predict everything that will happen from the moment they arrive to the little kiss on the cheek goodbye and I just can’t goddam do it.”

“You could stick your tongue down her throat instead of the kiss goodbye,” she offered casually as she continued to dice. She was game, his wife. She spoke to him in bad taste freely and he considered it one of her best qualities. “But then that would surprise her, I guess, not you.”

“They come in,” he said, “we take their coats. Everyone talks in a big hurry as if we didn’t have four long hours ahead of us. We self-medicate with alcohol. A lot of things are discussed, different issues. Everyone laughs a lot, but later no one can say what exactly was so witty. Compliments on the food. A couple of monologues. Then they start to yawn, we start to yawn. They say, ‘We should think about leaving, huh?,’ and we politely look away, like they’ve just decided to take a crap on the dinner table. Everyone stands, one of us gets their coats, peppy goodbyes. We all say what a lovely evening, do it again soon, blah-blah-blah. And then they leave and we talk about them and they hit the streets and talk about us.”

“What would make you happy?” she asked.

“A blow job.”

“Let’s wait until they get here for that,” she said.

She slid her finger along the blade to free the clinging onion. He handed her her glass. “Drink your wine,” he said. She took a sip. He left the kitchen.

by Joshua Ferris, The New Yorker (August, 2008) |  Read more:
Photograph: Gilbert & George, "The Shadow of the Glass" (1972) Courtesy Lehmann Maupin Gallery and Sonnabend Gallery

What Your I.Q. Means

116+
17 percent of the world population; superior I.Q.; appropriate average for individuals in professional occupations.

121+
10 percent; potentially gifted; average for college graduates.

132+
2 percent; borderline genius; average I.Q. of most Ph.D. recipients.

143+
1 percent; genius level; about average for Ph.D.'s in physics.

158+
1 in 10,000; Nobel Prize winners.

164+
1 in 30,000; Wolfgang Amadeus Mozart and the chess champion Bobby Fischer.

via: Can You Make Yourself Smarter? (NY Times, April 2012)

Can You Call a 9-Year-Old a Psychopath?


By the time he turned 5, Michael had developed an uncanny ability to switch from full-blown anger to moments of pure rationality or calculated charm — a facility that Anne describes as deeply unsettling. “You never know when you’re going to see a proper emotion,” she said. She recalled one argument, over a homework assignment, when Michael shrieked and wept as she tried to reason with him. “I said: ‘Michael, remember the brainstorming we did yesterday? All you have to do is take your thoughts from that and turn them into sentences, and you’re done!’ He’s still screaming bloody murder, so I say, ‘Michael, I thought we brainstormed so we could avoid all this drama today.’ He stopped dead, in the middle of the screaming, turned to me and said in this flat, adult voice, ‘Well, you didn’t think that through very clearly then, did you?’ ”   (...)

Over the last six years, Michael’s parents have taken him to eight different therapists and received a proliferating number of diagnoses. “We’ve had so many people tell us so many different things,” Anne said. “Oh, it’s A.D.D. — oh, it’s not. It’s depression — or it’s not. You could open the DSM and point to a random thing, and chances are he has elements of it. He’s got characteristics of O.C.D. He’s got characteristics of sensory-integration disorder. Nobody knows what the predominant feature is, in terms of treating him. Which is the frustrating part.”

Then last spring, the psychologist treating Michael referred his parents to Dan Waschbusch, a researcher at Florida International University. Following a battery of evaluations, Anne and Miguel were presented with another possible diagnosis: their son Michael might be a psychopath.

For the past 10 years, Waschbusch has been studying “callous-unemotional” children — those who exhibit a distinctive lack of affect, remorse or empathy — and who are considered at risk of becoming psychopaths as adults. To evaluate Michael, Waschbusch used a combination of psychological exams and teacher- and family-rating scales, including the Inventory of Callous-Unemotional Traits, the Child Psychopathy Scale and a modified version of the Antisocial Process Screening Device — all tools designed to measure the cold, predatory conduct most closely associated with adult psychopathy. (The terms “sociopath” and “psychopath” are essentially identical.) A research assistant interviewed Michael’s parents and teachers about his behavior at home and in school. When all the exams and reports were tabulated, Michael was almost two standard deviations outside the normal range for callous-unemotional behavior, which placed him on the severe end of the spectrum.

Currently, there is no standard test for psychopathy in children, but a growing number of psychologists believe that psychopathy, like autism, is a distinct neurological condition — one that can be identified in children as young as 5. Crucial to this diagnosis are callous-unemotional traits, which most researchers now believe distinguish “fledgling psychopaths” from children with ordinary conduct disorder, who are also impulsive and hard to control and exhibit hostile or violent behavior. According to some studies, roughly one-third of children with severe behavioral problems — like the aggressive disobedience that Michael displays — also test above normal on callous-unemotional traits. (Narcissism and impulsivity, which are part of the adult diagnostic criteria, are difficult to apply to children, who are narcissistic and impulsive by nature.) (...)

The idea that a young child could have psychopathic tendencies remains controversial among psychologists. Laurence Steinberg, a psychologist at Temple University, has argued that psychopathy, like other personality disorders, is almost impossible to diagnose accurately in children, or even in teenagers — both because their brains are still developing and because normal behavior at these ages can be misinterpreted as psychopathic. Others fear that even if such a diagnosis can be made accurately, the social cost of branding a young child a psychopath is simply too high. (The disorder has historically been considered untreatable.) John Edens, a clinical psychologist at Texas A&M University, has cautioned against spending money on research to identify children at risk of psychopathy. “This isn’t like autism, where the child and parents will find support,” Edens observes. “Even if accurate, it’s a ruinous diagnosis. No one is sympathetic to the mother of a psychopath.”

by Jennifer Kahn, NY Times |  Read more:
Photo: Elinor Carucci/Redux

The Inquisition of Mr. Marvel


On the (surprisingly complicated) legacy of Stan Lee

Q: People ask, "Is Stan Lee still with Marvel Comics." Are you still with us?

STAN LEE: Sure! Especially on pay day!
Marvel Age magazine interview, 1983

Almost all the main characters in Avengers — including Thor, the Hulk, superspy Nick Fury, and the movie's primary villain, the trickster-god Loki — were introduced between 1961 and 1964, in comics written and drawn by Lee and Kirby. During that same period — a generative streak basically unparalleled in American comics history before or since — they also introduced the X-Men and the Fantastic Four.

Officially, Lee wrote the books and Kirby drew them. Officially, Stan supplied the realism — his heroes had flaws, they argued among themselves, they were prone to colds and bouts of self-loathing, and sometimes they'd forget to pay the rent and face eviction from their futuristic high-rise HQs, which were in New York, not a made-up metropolis — while Kirby supplied the propulsion, filling the pages with visions of eternity and calamity, along with action sequences that basically invented the visual grammar of modern superhero comics. (...)

Over the years, Marvel changed hands, went bankrupt, reemerged, restructured. Stan stayed in the picture. Each time he renegotiated his deal with the company, he did so from a unique position — half elder god, half mascot. Administration after administration recognized that it was in their best interests PR-wise to keep him on the payroll. For years, he received 10 percent of all revenue generated by the exploitation of his characters on TV and in movies, along with a six-figure salary. This came out in 2002, when Lee sued Marvel, claiming they'd failed to pay him his percentage of the profits from the first Spider-Man movie, a development the Comics Journal compared to Colonel Sanders suing Kentucky Fried Chicken.

It's unclear if Stan still co-owns any of Marvel's characters, but the company continues to take care of him. When Disney (which, full disclosure, is also the parent company of ESPN, which owns the website you're now reading) bought Marvel for $4 billion in 2009, part of the deal involved a Disney subsidiary buying a small piece of POW! Entertainment, a content-farm company Stan co-founded; another Disney-affiliated company currently pays POW! $1.25 million a year to loan out Stan as a consultant "on the exploitation of the assets of Marvel Entertainment."

Jack Kirby, on the other hand, was a contractor. You could sink a continent in the amount of ink that's been spilled on the question of whether it was Stan's voice or Jack's visuals that ultimately made Marvel what it was, but it's hard to argue that any of this would have happened had Kirby been hit by a bus in 1960. Yet like most comics creators back then, he was paid by the page and retained no rights to any of the work he did for the company or the characters he helped create; by cashing his paychecks, he signed those rights over to the company. It took him decades just to persuade Marvel to give him back some of his original art, much of which was lost or given away or stolen in the meantime; there are horror stories about original Kirby pages being gifted to the water-delivery guy.

Kirby never sued Marvel, over the art or anything else. But as the years wore on he blasted the company in interviews. He blasted Lee, its avatar. Compared him to Sammy Glick. Referred to him as a mere "office worker" who'd grabbed credit from true idea men. "It wasn't possible for a man like Stan Lee to come up with new things — or old things, for that matter," Kirby told the Comics Journal in an infamous 1990 interview. "Stan Lee wasn't a guy that read or that told stories. Stan Lee was a guy that knew where the papers were or who was coming to visit that day."

And all this happened back when the comics industry only manufactured and sold comic books. Back when even the medium's most vocal champions wouldn't have dreamed of Marvel (which filed for Chapter 11 bankruptcy in 1996) being worth $4 billion to anybody.

by Alex Pappademas, Grantland |  Read more:
Photo: Jerod Harris/WireImage

Friday, May 11, 2012

Friday Book Club - The Remains of the Day

[ed. I don't know why it's taken me so long to come to this; it's a masterpiece. I love Ishiguro's writing style, spare and elegant, with never a misplaced word or phrase anywhere (see also Never Let Me Go). An engaging story that transports the reader deep into British culture.]

Kazuo Ishiguro's third novel, "The Remains of the Day," is a dream of a book: a beguiling comedy of manners that evolves almost magically into a profound and heart-rending study of personality, class and culture. At the beginning, though, its narrator, an elderly English butler named Stevens, seems the least forthcoming (let alone enchanting) of companions. Cartoonishly punctilious and reserved, he edges slowly into an account of a brief motoring holiday from Oxfordshire to the West Country that he is taking alone at the insistence of his new employer, a genial American, Mr. Farraday.

The time is July 1956. Farraday has recently bought Darlington Hall near Oxford from the descendants of the last noble-born owner and has asked Stevens - a fixture there for nearly four decades - to relax a bit before implementing a much-reduced staff plan for the running of the house. Tense about his little holiday, Stevens hopes secretly to use it for professional advantage: to recruit the former housekeeper, the admirable Miss Kenton, who had years ago left service to marry, but who is now estranged from her husband and seems nostalgic for her old position.

In the early part of his story, the strait-laced Stevens plays perfectly the role of model butler as obliging narrator. Attentive to detail, solicitous of others, eager to serve, he primly sketches the history and current state of affairs at the great house and points out the agreeable features of the landscape as he moves slowly from Salisbury to Taunton, Tavistock and Little Compton in Cornwall. Much of this is dryly, deliciously funny, not so much because Stevens is witty or notably perceptive (he is neither) but because in his impassive formality he is so breathtakingly true to type, so very much the familiar product of the suppressive and now anachronistic social system that has produced him and to which he is so intensely loyal.

At different points in his subdued musings on the past, Stevens offers formulations of immemorial English attitudes that are likely to strike many contemporary readers as at once laughably parochial and quaintly endearing. Obsessed with notions of greatness, he proclaims that the English landscape is the most deeply satisfying in the world because of "the very lack of obvious drama or spectacle." As he puts it, "The sorts of sights offered in such places as Africa and America, though undoubtedly very exciting, would, I am sure, strike the objective viewer as inferior on account of their unseemly demonstrativeness."

Similarly, Stevens provides a long, solemn, yet unwittingly brilliant disquisition on the question of what makes a great butler, a topic that has provoked "much debate in our profession over the years" and continues to obsess him throughout his narrative. The key, he confidently insists, is dignity, which has to do with a butler's ability to "inhabit" his role "to the utmost."

"Lesser butlers," Stevens muses, "will abandon their professional being for the private one at the least provocation. For such persons, being a butler is like playing some pantomime role; a small push, a slight stumble, and the facade will drop off to reveal the actor underneath. The great butlers are great by virtue of their ability to inhabit their professional role and inhabit it to the utmost; they will not be shaken out by external events, however surprising, alarming or vexing. They wear their professionalism as a decent gentleman will wear his suit: he will not let ruffians or circumstance tear it off him in the public gaze; he will discard it when, and only when, he wills to do so, and this will invariably be when he is entirely alone. It is, as I say, a matter of 'dignity.'"

Mr. Ishiguro's command of Stevens' corseted idiom is masterly, and nowhere more tellingly so than in the way he controls the progressive revelation of unintended ironic meaning. Underneath what Stevens says, something else is being said, and the something else eventually turns out to be a moving series of chilly revelations of the butler's buried life - and, by implication, a powerful critique of the social machine in which he is a cog. As we move westward with Stevens in Farraday's vintage Ford, we learn more and more about the price he has paid in striving for his lofty ideal of professional greatness.

P.M. Dawn

Among the problems Nabokov’s Lolita poses for the book designer, probably the thorniest is the popular misconception of the title character. She’s chronically miscast as a teenage sexpot—just witness the dozens of soft-core covers over the years. “We are talking about a novel which has child rape at its core,” says John Bertram, an architect and blogger who, three years ago, sponsored a Lolita cover competition asking designers to do better.


How Wall Street Killed Financial Reform

Two years ago, when he signed the Dodd-Frank Wall Street Reform and Consumer Protection Act, President Barack Obama bragged that he'd dealt a crushing blow to the extravagant financial corruption that had caused the global economic crash in 2008. "These reforms represent the strongest consumer financial protections in history," the president told an adoring crowd in downtown D.C. on July 21st, 2010. "In history."

This was supposed to be the big one. At 2,300 pages, the new law ostensibly rewrote the rules for Wall Street. It was going to put an end to predatory lending in the mortgage markets, crack down on hidden fees and penalties in credit contracts, and create a powerful new Consumer Financial Protection Bureau to safeguard ordinary consumers. Big banks would be banned from gambling with taxpayer money, and a new set of rules would limit speculators from making the kind of crazy-ass bets that cause wild spikes in the price of food and energy. There would be no more AIGs, and the world would never again face a financial apocalypse when a bank like Lehman Brothers went bankrupt.

Most importantly, even if any of that fiendish crap ever did happen again, Dodd-Frank guaranteed we wouldn't be expected to pay for it. "The American people will never again be asked to foot the bill for Wall Street's mistakes," Obama promised. "There will be no more taxpayer-funded bailouts. Period."

Two years later, Dodd-Frank is groaning on its deathbed. The giant reform bill turned out to be like the fish reeled in by Hemingway's Old Man – no sooner caught than set upon by sharks that strip it to nothing long before it ever reaches the shore. In a furious below-the-radar effort at gutting the law – roundly despised by Washington's Wall Street paymasters – a troop of water-carrying Eric Cantor Republicans are speeding nine separate bills through the House, all designed to roll back the few genuinely toothy portions left in Dodd-Frank. With the Quislingian covert assistance of Democrats, both in Congress and in the White House, those bills could pass through the House and the Senate with little or no debate, with simple floor votes – by a process usually reserved for things like the renaming of post offices or a nonbinding resolution celebrating Amelia Earhart's birthday.

The fate of Dodd-Frank over the past two years is an object lesson in the government's inability to institute even the simplest and most obvious reforms, especially if those reforms happen to clash with powerful financial interests. From the moment it was signed into law, lobbyists and lawyers have fought regulators over every line in the rulemaking process. Congressmen and presidents may be able to get a law passed once in a while – but they can no longer make sure it stays passed. You win the modern financial-regulation game by filing the most motions, attending the most hearings, giving the most money to the most politicians and, above all, by keeping at it, day after day, year after fiscal year, until stealing is legal again. "It's like a scorched-earth policy," says Michael Greenberger, a former regulator who was heavily involved with the drafting of Dodd-Frank. "It requires constant combat. And it never, ever ends."

That the banks have just about succeeded in strangling Dodd-Frank is probably not news to most Americans – it's how they succeeded that's the scary part. The banks followed a five-point strategy that offers a dependable blueprint for defeating any regulation – and for guaranteeing that when it comes to the economy, might will always equal right.

by Matt Taibbi, Rolling Stone |  Read more:
Photo: Rod Lamkey Jr./AFP Getty Images