Sunday, May 26, 2013
Cabbage Butterflies (Fiction)
The guy looked disappointed when he saw me. My one sales point is that I’m young, but my eyelids are so puffy they look like caterpillars, and my lips are pale and kind of caterpillary too, and so are my fingers and toes, so I’m pretty much caterpillars all over, and the problem was that the guy was fairly young himself. Not as young as me – I’m only 20 – but late twenties at most, which is about the same or a little older than my friend’s brother, who drives a Soarer and got arrested for possession of crank and who I had sex with a couple of times, but it was on bare tatami mats and the mats made a creepy-looking imprint on my ass.
‘Good evening, I’m from the Snake Pit,’ I said, and the guy laughed and said, ‘“Snake Pit” – sounds like a gym for pro wrestlers.’ He had a very gentlemanly-sounding sort of laugh, which put me at ease in a way, but then again it’s always the ‘sophisticated’ ones who want you to stick your tongue in their ass or smooth out the wrinkles in their ball sack and lick it, stuff like that, which I hate, so I acted very shy when I walked in, like it was my first time to do this, but I couldn’t tell if the guy bought it or not. It seemed like he was experienced at this kind of thing, though, and it was one of the best hotels in the city, and the room had this giant bed, and I was thinking, Where does a young fucker like you get off, staying in such an expensive place?
While I was calling my office to tell them I’d arrived at the appointment, the guy took a bottle of wine from this ice bucket that looked like a robot’s head and popped the cork and poured himself a glass, and he seemed right at home, like he did this sort of stuff all the time. I had some wine too, and I was thinking it wasn’t as good as the gin and tonic LUI and me always drink together, but I stopped thinking about LUI when the guy said, ‘Show me your ass.’ I asked if I could take a shower first, and he said, ‘No need. I’ll hardly even touch your body. Just lift up your skirt and stick out your ass.’ I kind of stalled and kept fidgeting, and finally he said, ‘Never mind,’ and pulled a ten thousand yen bill from his wallet and held it out to me and said, ‘Take this and leave.’ I realise now that I should have just taken it and left, but at the time I thought he must be incredibly rich and maybe I could get a lot more money out of him, so I went ahead and showed him my ass. He poked at a cyst on my ass cheek with the tip of a ballpoint pen and said, ‘What the hell is this? It’s disgusting!’ I said it was a cyst, and he said, ‘You must be undernourished. That’s what happens when you eat nothing but noodles, like the Filipinas do – they’re all covered with cysts and boils.’ I knew some Filipinas from the last place I’d worked, and it made me so mad that he’d say that, it brought tears to my eyes.
‘You crying?’ He slapped my ass. ‘What are you, a moron?’ he said, and grabbed hold of my ass cheek and started touching my pussy with his other hand. He was so good with his fingers that even as I stood there crying, I started to get wet, and all I could think was, Shit, I really am a moron. Then he let go of me and said, ‘Cover your ass back up, it looks like hell,’ and took twenty thousand yen from his wallet and told me to leave again.
‘Please,’ I said, ‘I’ll do anything you ask.’
I don’t know why I said that. I think it must have been because my grandfather used to bawl me out about giving up on things. He used to tell me it’s important never to quit, that you need to finish things you’ve started, and I’ve always remembered that. But why would I think of my grandfather, who I loved so much, at a time like this? It made me sad, and I started crying again.
As I cried I knelt on the floor and tried to undo the zipper on the guy’s velveteen pants so I could blow him, but he grabbed my hand and told me to stop. LUI always forgives me if I give him a good long blow job, but I guess this guy was a different type. ‘Look, I’m offering to pay you,’ he said, ‘so just go,’ but I said, ‘They’ll yell at me at the office if I go back early.’ He stared at me for a minute, and then he said, ‘Ah, what the hell,’ and asked me if I was hungry. I nodded, and he took me down to a bar on the basement floor.
The bar was all gold and black, and the shelves were lined with bottles of liquor I’d never even seen before, and the waiters were tall, and it made you feel special to be in there. The guy ordered a steak sandwich and put it in front of me and said to the bartender, ‘This one’s got a cyst on her ass,’ and the bartender, who was mixing up a drink in a shaker, got a big laugh out of that. I was kind of shocked, but I figured maybe that sort of talk was normal in fancy bars like this, and I just quietly ate the steak sandwich. It was really delicious. I told the guy this, but he didn’t even look at me but kept talking to the bartender, saying stuff like, ‘Remember that Vietnamese girl? Half Chinese, half French. If she wasn’t strung out on smack I’d consider making her my main squeeze,’ and the bartender kept working the shaker and said a girl named Natsuki, from a bar in Ginza called Madonna or La Donna or something, had shown up and asked a waiter for the guy’s room number, but the waiter wouldn’t tell her, so she sat drinking whiskey and went through half a bottle before she got up and left. When the guy asked if she’d been wearing a kimono, the bartender nodded, and the guy said, ‘Phew, I really dodged a bullet there,’ and took a slice of pickle from my plate and stuck it in his mouth but didn’t chew it, only let it hang out over his lower lip, and said, ‘But that woman’s a genius at giving head. Let her have her way, she’ll suck on it all night long. You’ll wake up in the morning and find her slurping away,’ and this time both he and the bartender laughed.
It wasn’t until I was finishing my steak sandwich that I realised that I’d missed the last train, and then I remembered my grandfather again. My grandfather always said you should never let down your guard, not when you’re taking a dump, not when you’re sick – never – and here I was mooning over the brown juice oozing out from between the slices of bread and forgetting all about the time. To be honest, though, maybe the sandwich wasn’t the only reason. I’ve missed the last train before, and when I asked the customer I was with if I could spend the night, they were always happy to let me. I didn’t think this guy would let me but went ahead and asked him, and he said he was going to have a woman over. I asked if I could sleep in his room anyway, and he thought for a minute and said, ‘Well, that might be interesting. All right, stay and watch.’
by Ryu Murakami, Cabbage Butterflies | Read more:
Image: uncredited
Jerry Brown's Political Reboot
“Okay, you’re here!” the man on the other end of the call said, cheerily. I’d been trying to arrange a visit to his office for quite a while, and just the previous evening he’d let me know that if I got there in a hurry, he’d have time to talk the next day, as well as over the weekend. As I walked through the airport, he began reeling off turn-by-turn instructions for reaching his office in Oakland in my rental car. “You’ll take the Bay Bridge to the exit for the 580 East and the 24. But don’t go all the way to the 24! That would send you out to Concord. Take the 980 West until the exit for 27th Street, and then …”
It was like a moment from a Saturday Night Live sketch of “The Californians”—which seemed appropriate, since the man I was talking with was the Californian, Jerry Brown. Brown began his first two terms as governor in 1974, at age 36, following one Republican former actor, Ronald Reagan. He returned to the office at age 72, following another, Arnold Schwarzenegger. In between he ran for president three times and the U.S. Senate once, all of course unsuccessfully; served eight years as Oakland’s mayor and four as California’s attorney general; and lived in both Japan (studying Zen meditation) and India (volunteering for Mother Teresa). He celebrated his 75th birthday the weekend I was in Oakland, which means that if he runs for reelection next year and if he wins, both of which are considered likely—his approval rating this year has been the envy of other politicians in the state—he could still be governor at age 80. “This is certainly a new identity for Brown, so flighty in his first ‘Governor Moonbeam’ period as governor,” Bruce Cain, a political scientist and an expert on California politics at Stanford’s Bill Lane Center for the American West, told me. “Now he is the most trusted, stable, and reliable leader around.” I asked Kevin Starr, of the University of Southern California and the author of the acclaimed Americans and the California Dream series of books, how Brown was seen in his return to office. “He is now liked,” Starr said. “Eccentric, but liked.”
Life and health are provisional, and within the past two years, Brown has undergone radiation treatment for early-stage prostate cancer (while maintaining his normal work schedule) and had a cancerous growth removed from his nose. But he moves, talks, reacts, and laughs like someone who is in no mood, and feels no need, to slow down. He is nearly a decade older than Bill Clinton but comes across as younger and bouncier.
“I love what I am doing,” he told me once I got to his Oakland office. “I love it much more than the first time. Back then I got bored because we didn’t have big problems. Now I am very enthusiastic. Everything’s interesting, and it’s complicated. There is a zest!” He likes to pound the desk or table as he talks, and this passage was punctuated: love (bang) … love (bang) … zest! (bang bang bang!). Anne Gust Brown, a former Gap executive in her mid‑50s, who became his wife eight years ago and is widely regarded as his most influential and practical-minded adviser, arched an eyebrow from the other side of the room, where she was half-listening while working at a computer. “Ed-mund!” she said smilingly, but being sure to get his attention. (His official name is Edmund Gerald Brown Jr., after his father, Edmund G. “Pat” Brown, who was governor for eight years before he lost to Ronald Reagan in 1966.) “Don’t get yourself too worked up!” As a note on nomenclature: apart from his wife’s occasional joking use of Edmund and my own antiquated sense that I should address him as Governor, every other person I heard speak about—or with—him called him Jerry.
by James Fallows, The Atlantic | Read more:
Photo: Chris McPherson
The Gift of Siblings
Given what a mouthy thing I grew up to be, it’s shocking to me that I began talking later than most children do. But I didn’t need words. I had my older brother, Mark.
The way my mother always recounted it, I’d squirm, pout, mewl, bawl or indicate my displeasure in some comparably articulate way, and before she could press me on what I wanted and perhaps coax actual language from me, Mark would rush in to solve the riddle.
“His blanket,” he’d say, and he’d be right.
“Another cookie,” he’d say, and he’d be even righter.
From the tenor of my sob or the twitch of one of my fat little fingers, Mark knew which chair I had designs on, which toy I was ogling. He decoded the signs and procured the goods. Only 17 months older, he was my psychic and my spokesman, my shaman and my Sherpa. With Mark around, I was safe.
This weekend he’s turning 50 — it’s horrifying, trust me — and we’ll all be together, as we were at his 40th and my 40th and seemingly every big milestone: he and I and our younger brother, Harry, and our sister, Adelle, the last one to come along. We marched (or, rather, crawled and toddled) into this crazy world together, and though we had no say in that, it’s by our own volition and determination that we march together still. Among my many blessings, this is the one I’d put at the top.
Two weeks ago, the calendar decreed that we Americans pause to celebrate mothers, as it does every year. Three weeks hence, fathers get their due. But as I await the arrival of my brothers, my sister and their spouses in Manhattan, which is where we’ll sing an off-key “Happy Birthday” to Mark and drink too much, my thoughts turn to siblings, who don’t have a special day but arguably have an even more special meaning to, and influence on, those of us privileged to have them.
“Siblings are the only relatives, and perhaps the only people you’ll ever know, who are with you through the entire arc of your life,” the writer Jeffrey Kluger observed to Salon in 2011, the year his book “The Sibling Effect” was published. “Your parents leave you too soon and your kids and spouse come along late, but your siblings know you when you are in your most inchoate form.”
Of course the “entire arc” part of Kluger’s comments assumes that untimely death doesn’t enter the picture, and that acrimony, geography or mundane laziness doesn’t pull brothers and sisters apart, to a point where they’re no longer primary witnesses to one another’s lives, no longer fellow passengers, just onetime housemates with common heritages.
That happens all too easily, and whenever I ponder why it didn’t happen with Mark, Harry, Adelle and me — each of us so different from the others — I’m convinced that family closeness isn’t a happy accident, a fortuitously smooth blend of personalities.
by Frank Bruni, NY Times | Read more:
Image: Futurity
Saturday, May 25, 2013
The Suicide Epidemic
When Thomas Joiner was 25 years old, his father—whose name was also Thomas Joiner and who could do anything—disappeared from the family’s home. At the time, Joiner was a graduate student at the University of Texas, studying clinical psychology. His focus was depression, and it was obvious to him that his father was depressed. Six weeks earlier, on a family trip to the Georgia coast, the gregarious 56-year-old—the kind of guy who was forever talking and laughing and bending people his way—was sullen and withdrawn, spending days in bed, not sick or hungover, not really sleeping.
Joiner knew enough not to worry. He knew that the desire for death—the easy way out, the only relief—was a symptom of depression, and although at least 2 percent of those diagnosed make suicide their final chart line, his father didn’t match the suicidal types he had learned about in school. He wasn’t weak or impulsive. He wasn’t a brittle person with bad genes and big problems. Suicide was understood to be for losers, basically, the exact opposite of men like Thomas Joiner Sr.—a successful businessman, a former Marine, tough even by Southern standards.
But Dad had left an unmade bed in a spare room, and an empty spot where his van usually went. By nightfall he hadn’t been heard from, and the following morning Joiner’s mother called him at school. The police had found the van. It was parked in an office lot about a mile from the house, the engine cold. Inside, in the back, the police found Joiner’s father dead, covered in blood. He had been stabbed through the heart.
The investigators found slash marks on his father’s wrists and a note on a yellow sticky pad by the driver’s seat. “Is this the answer?” it read, in his father’s shaky scrawl. They ruled it a suicide, death by “puncture wound,” an impossibly grisly way to go, which made it all the more difficult for Joiner to understand. This didn’t seem like the easy way out.
Back home for the funeral, Joiner’s pain and confusion were compounded by ancient taboos. For centuries suicide was considered an act against God, a violation of law, and a stain on the community. He overheard one relative advise another to call it a heart attack. His girlfriend fretted about his tainted DNA. Even some of his peers and professors—highly trained, doctoral-level clinicians—failed to offer a simple “my condolences.” It was as though the Joiner family had failed dear old Dad, killed him somehow, just as surely as if they had stabbed him themselves. To Joiner, however, the only real failing was from his field, which clearly had a shaky understanding of suicide.
Survivors of a suicide are haunted by the same whys and hows, the what-ifs that can never be answered. Joiner was no different. He wanted to know why people die at their own hands: What makes them desire death in the first place? When exactly do they decide to end their lives? How do they build up the nerve to do it? But unlike most other survivors of suicide, for the last two decades he has been developing answers.
Joiner is 47 now, and a chaired professor at Florida State University, in Tallahassee. Physically, he is an imposing figure, 6-foot-3 with a lantern jaw and a head shaved clean with a razor. He wears an off-and-on beard, which grows in as heavy as iron filings. The look fits his work, which is dedicated to interrogating suicide as hard as anyone ever has, to finally understand it as a matter of public good and personal duty. He hopes to honor his father, by combating what killed him and by making his death a stepping stone to better treatment. “Because,” as he says, “no one should have to die alone in a mess in a hotel bathroom, in the back of a van, or on a park bench, thinking incorrectly that the world will be better off.”
He is the author of the first comprehensive theory of suicide, an explanation, as he told me, “for all suicides at all times in all cultures across all conditions.” He also has much more than a theory: he has a moment. This spring, suicide news paraded down America’s front pages and social-media feeds, led by a report from the Centers for Disease Control and Prevention, which called self-harm “an increasing public health concern.” Although the CDC revealed grabby figures—like the fact that there are more deaths by suicide than by road accident—the effort prompted only a tired spasm of talk about aging baby boomers and life in a recession. The CDC itself, in an editorial note, suggested that the party would rock on once the economy rebounded and our Dennis Hopper–cohort rode its hog into the sunset.
But suicide is not an economic problem or a generational tic. It’s not a secondary concern, a sideline that will solve itself with new jobs, less access to guns, or a more tolerant society, although all would be welcome. It’s a problem with a broad base and terrible momentum, a result of seismic changes in the way we live and a corresponding shift in the way we die—not only in America but around the world.
We know, thanks to a growing body of research on suicide and the conditions that accompany it, that more and more of us are living through a time of seamless black: a period of mounting clinical depression, blossoming thoughts of oblivion and an abiding wish to get there by the nonscenic route. Every year since 1999, more Americans have killed themselves than the year before, making suicide the nation’s greatest untamed cause of death. In much of the world, it’s among the only major threats to get significantly worse in this century than in the last.
The result is an accelerating paradox. Over the last five decades, millions of lives have been remade for the better. Yet within this brighter tomorrow, we suffer unprecedented despair. In a time defined by ever more social progress and astounding innovations, we have never been more burdened by sadness or more consumed by self-harm. And this may be only the beginning. If Joiner and others are right—and a landmark collection of studies suggests they are—we’ve reached the end of one order of human history and are at the beginning of a new order entirely, one beset by a whole lot of self-inflicted bloodshed, and a whole lot more to come.
by Tony Dokoupil, TDB/Newsweek | Read more:
Images: Vincent van Gogh, Wikipedia; Virginia Woolf, Bettman/Corbis
20 Great Essays by David Foster Wallace
[ed. The Electric Typewriter has 20 Great Essays by David Foster Wallace. All for free. Here's an excerpt from his masterpiece Infinite Jest, describing why video-phones never really took off.]
'VIDEOPHONY' SUDDENLY COLLAPSED LIKE A KICKED TENT, SO THAT, BY THE YEAR OF THE DEPEND ADULT UNDERGARMENT, FEWER THAN 10% OF ALL PRIVATE TELEPHONE COMMUNICATIONS UTILIZED ANY VIDEO-IMAGE-FIBER DATA-TRANSFERS OR COINCIDENT PRODUCTS AND SERVICES, THE AVERAGE U.S. PHONE-USER DECIDING THAT S/HE ACTUALLY PREFERRED THE RETROGRADE OLD LOW-TECH BELL-ERA VOICE-ONLY TELEPHONIC INTERFACE AFTER ALL, A PREFERENTIAL ABOUT-FACE THAT COST A GOOD MANY PRECIPITANT VIDEO-TELEPHONY-RELATED ENTREPRENEURS THEIR SHIRTS, PLUS DESTABILIZING TWO HIGHLY RESPECTED MUTUAL FUNDS THAT HAD GROUND-FLOORED HEAVILY IN VIDEO-PHONE TECHNOLOGY, AND VERY NEARLY WIPING OUT THE MARYLAND STATE EMPLOYEES' RETIREMENT SYSTEM'S FREDDIE-MAC FUND, A FUND WHOSE ADMINISTRATOR'S MISTRESS'S BROTHER HAD BEEN AN ALMOST MANICALLY PRECIPITANT VIDEO-PHONE-TECHNOLOGY ENTREPRENEUR . . . AND BUT SO WHY THE ABRUPT CONSUMER RETREAT BACK TO GOOD OLD VOICE-ONLY TELEPHONING?

The answer, in a kind of trivalent nutshell, is: (1) emotional stress, (2) physical vanity, (3) a certain queer kind of self-obliterating logic in the microeconomics of consumer high-tech.
It turned out that there was something terribly stressful about visual telephone interfaces that hadn't been stressful at all about voice-only interfaces. Videophone consumers seemed suddenly to realize that they'd been subject to an insidious but wholly marvelous delusion about conventional voice-only telephony. They'd never noticed it before, the delusion — it's like it was so emotionally complex that it could be countenanced only in the context of its loss. Good old traditional audio-only phone conversations allowed you to presume that the person on the other end was paying complete attention to you while also permitting you not to have to pay anything even close to complete attention to her. A traditional aural-only conversation — utilizing a hand-held phone whose earpiece contained only 6 little pinholes but whose mouthpiece (rather significantly, it later seemed) contained (6²) or 36 little pinholes — let you enter a kind of highway-hypnotic semi-attentive fugue: while conversing, you could look around the room, doodle, fine-groom, peel tiny bits of dead skin away from your cuticles, compose phone-pad haiku, stir things on the stove; you could even carry on a whole separate additional sign-language-and-exaggerated-facial-expression type of conversation with people right there in the room with you, all while seeming to be right there attending closely to the voice on the phone. And yet — and this was the retrospectively marvelous part — even as you were dividing your attention between the phone call and all sorts of other idle little fuguelike activities, you were somehow never haunted by the suspicion that the person on the other end's attention might be similarly divided. During a traditional call, e.g., as you let's say performed a close tactile blemish-scan of your chin, you were in no way oppressed by the thought that your phonemate was perhaps also devoting a good percentage of her attention to a close tactile blemish-scan.
It was an illusion and the illusion was aural and aurally supported: the phone-line's other end's voice was dense, tightly compressed, and vectored right into your ear, enabling you to imagine that the voice's owner's attention was similarly compressed and focused . . . even though your own attention was not, was the thing. This bilateral illusion of unilateral attention was almost infantilely gratifying from an emotional standpoint: you got to believe you were receiving somebody's complete attention without having to return it. Regarded with the objectivity of hindsight, the illusion appears arational, almost literally fantastic: it would be like being able both to lie and to trust other people at the same time.
Video telephony rendered the fantasy insupportable. Callers now found they had to compose the same sort of earnest, slightly overintense listener's expression they had to compose for in-person exchanges. Those callers who out of unconscious habit succumbed to fuguelike doodling or pants-crease-adjustment now came off looking rude, absentminded, or childishly self-absorbed. Callers who even more unconsciously blemish-scanned or nostril-explored looked up to find horrified expressions on the video-faces at the other end. All of which resulted in videophonic stress.
Even worse, of course, was the traumatic expulsion-from-Eden feeling of looking up from tracing your thumb's outline on the Reminder Pad or adjusting the old Unit's angle of repose in your shorts and actually seeing your videophonic interfacee idly strip a shoelace of its gumlet as she talked to you, and suddenly realizing your whole infantile fantasy of commanding your partner's attention while you yourself got to fugue-doodle and make little genital-adjustments was deluded and insupportable and that you were actually commanding not one bit more attention than you were paying, here. The whole attention business was monstrously stressful, video callers found.
(2) And the videophonic stress was even worse if you were at all vain. I.e. if you worried at all about how you looked. As in to other people. Which all kidding aside who doesn't. Good old aural telephone calls could be fielded without makeup, toupee, surgical prostheses, etc. Even without clothes, if that sort of thing rattled your saber. But for the image-conscious, there was of course no such answer-as-you-are informality about visual-video telephone calls, which consumers began to see were less like having the good old phone ring than having the doorbell ring and having to throw on clothes and attach prostheses and do hair-checks in the foyer mirror before answering the door.
But the real coffin-nail for videophony involved the way callers' faces looked on their TP screen, during calls. Not their callers' faces, but their own, when they saw them on video. It was a three-button affair, after all, to use the TP's cartridge-card's Video-Record option to record both pulses in a two-way visual call and play the call back and see how your face had actually looked to the other person during the call. This sort of appearance-check was no more resistible than a mirror. But the experience proved almost universally horrifying. People were horrified at how their own faces appeared on a TP screen. It wasn't just 'Anchorman's Bloat,' that well-known impression of extra weight that video inflicts on the face. It was worse. Even with high-end TPs' high-def viewer-screens, consumers perceived something essentially blurred and moist-looking about their phone-faces, a shiny pallid indefiniteness that struck them as not just unflattering but somehow evasive, furtive, untrustworthy, unlikable. In an early and ominous InterLace/G.T.E. focus-group survey that was all but ignored in a storm of entrepreneurial sci-fi-tech enthusiasm, almost 60% of respondents who received visual access to their own faces during videophonic calls specifically used the terms untrustworthy, unlikable, or hard to like in describing their own visage's appearance, with a phenomenally ominous 71% of senior-citizen respondents specifically comparing their video-faces to that of Richard Nixon during the Nixon-Kennedy debates of B.S. 1960.
The proposed solution to what the telecommunications industry's psychological consultants termed Video-Physiognomic Dysphoria (or VPD) was, of course, the advent of High-Definition Masking; and in fact it was those entrepreneurs who gravitated toward the production of high-definition videophonic imaging and then outright masks who got in and out of the short-lived videophonic era with their shirts plus solid additional nets.
Mask-wise, the initial option of High-Definition Photographic Imaging — i.e. taking the most flattering elements of a variety of flattering multi-angle photos of a given phone-consumer and — thanks to existing image-configuration equipment already pioneered by the cosmetics and law-enforcement industries — combining them into a wildly attractive high-def broadcastable composite of a face wearing an earnest, slightly overintense expression of complete attention — was quickly supplanted by the more inexpensive and byte-economical option of (using the exact same cosmetic-and-FBI software) actually casting the enhanced facial image in a form-fitting polybutylene-resin mask, and consumers soon found that the high up-front cost of a permanent wearable mask was more than worth it, considering the stress- and VPD-reduction benefits, and the convenient Velcro straps for the back of the mask and caller's head cost peanuts; and for a couple fiscal quarters phone/cable companies were able to rally VPD-afflicted consumers' confidence by working out a horizontally integrated deal where free composite-and-masking services came with a videophone hookup. The high-def masks, when not in use, simply hung on a small hook on the side of a TP's phone-console, admittedly looking maybe a bit surreal and discomfiting when detached and hanging there empty and wrinkled, and sometimes there were potentially awkward mistaken-identity snafus involving multi-user family or company phones and the hurried selection and attachment of the wrong mask taken from some long row of empty hanging masks — but all in all the masks seemed initially like a viable industry response to the vanity-, stress-, and Nixonian-facial-image problem.
by David Foster Wallace, Excerpt from Infinite Jest, The Electric Typewriter | Read more:
Image: uncredited
Friday, May 24, 2013
The Rise and Fall of Charm in American Men
One can say many things about the talents of Vaughn, and were Universal embarking on a bit of polyester parody—remaking, say, Tony Rome, among the least of the neo-noirs—Vaughn’s gift for sending up low pop would be just so. But to aim low in this case is to miss the deceptive grace that Garner brought to the original, and prompts a bigger question: Whatever happened to male charm—not just our appreciation of it, or our idea of it, but the thing itself?
Yes, yes, George Clooney—let’s get him out of the way. For nearly 20 years, any effort to link men and charm has inevitably led to Clooney. Ask women or men to name a living, publicly recognized charming man, and 10 out of 10 will say Clooney. That there exists only one choice—and an aging one—proves that we live in a culture all but devoid of male charm.
Mention Clooney, and the subject turns next to whether (or to what extent) he’s the modern version of that touchstone of male charm, Cary Grant. Significantly, Grant came to his charm only when he came, rather late, to his adulthood. An abandoned child and a teenage acrobat, he spent his first six years in Hollywood playing pomaded pretty boys. In nearly 30 stilted movies—close to half of all the pictures he would ever make—his acting was tentative, his personality unformed, his smile weak, his manner ingratiating, and his delivery creaky. See how woodenly he responds to Mae West’s most famous (and most misquoted) line, in She Done Him Wrong: “Why don’t you come up sometime and see me?” But in 1937 he made the screwball comedy The Awful Truth, and all at once the persona of Cary Grant gloriously burgeoned. Out of nowhere he had assimilated his offhand wit, his playful knowingness, and, in a neat trick that allowed him to be simultaneously cool and warm, his arch mindfulness of the audience he was letting in on the joke.
Grant had developed a new way to interact with a woman onscreen: he treated his leading lady as both a sexually attractive female and an idiosyncratic personality, an approach that often required little more than just listening to her—a tactic that had previously been as ignored in the pictures as it remains, among men, in real life. His knowing but inconspicuously generous style let the actress’s performance flourish, making his co-star simultaneously regal and hilarious.
In short, Grant suddenly and fully developed charm, a quality that is tantalizing because it simultaneously demands detachment and engagement. Only the self-aware can have charm: It’s bound up with a sensibility that at best approaches wisdom, or at least worldliness, and at worst goes well beyond cynicism. It can’t exist in the undeveloped personality. It’s an attribute foreign to many men because most are, for better and for worse, childlike. These days, it’s far more common among men over 70—probably owing to the era in which they reached maturity rather than to the mere fact of their advanced years. What used to be called good breeding is necessary (but not sufficient) for charm: no one can be charming who doesn’t draw out the overlooked, who doesn’t shift the spotlight onto others—who doesn’t, that is, possess those long-forgotten qualities of politesse and civilité. A great hostess perforce has charm (while legendary hostesses are legion—Elizabeth Montagu, Madame Geoffrin, Viscountess Melbourne, Countess Greffulhe—I can’t think of a single legendary host), but today this social virtue goes increasingly unrecognized. Still, charm is hardly selfless. All of these acts can be performed only by one at ease with himself yet also intensely conscious of himself and of his effect on others. And although it’s bound up with considerateness, it really has nothing to do with, and is in fact in some essential ways opposed to, goodness. Another word for the lightness of touch that charm requires in humor, conversation, and all other aspects of social relations is subtlety, which carries both admirable and dangerous connotations. Charm’s requisite sense of irony is also the requisite for social cruelty (...)
by Benjamin Schwarz, The Atlantic | Read more:
Illustration: Thomas Allen
Glaeser on Cities
Edward Glaeser of Harvard University and author of The Triumph of Cities talks with EconTalk host Russ Roberts about American cities. The conversation begins with a discussion of the history of Detroit over the last century and its current plight. What might be done to improve Detroit's situation? Why are other cities experiencing similar challenges to those facing Detroit? Why are some cities thriving and growing? What policies might help ailing cities and what policies have helped those cities that succeed? The conversation concludes with a discussion of why cities have such potential for growth. (ed. Podcast)
Intro. [Recording date: April 15, 2013.] Russ: Topic is cities; start with recent post you had at the New York Times's blog, Economix, on Detroit. Give us a brief history of that city. It's not doing well right now, but it wasn't always that way, was it?
Guest: No. If you look back 120 years ago or so, Detroit looked like one of the most entrepreneurial places on the planet. It seemed as if there was an automotive genius on every street corner. If you look back 60 years ago, Detroit was among the most productive places on the planet, with the companies that were formed by those automotive geniuses coming to fruition and producing cars that were the technological wonder of the world. So, Detroit's decline is of more recent heritage, of the past 50 years. And it's an incredible story, an incredible tragedy. And it tells us a great deal about the way that cities work and the way that local economies function.
Russ: So, what went wrong?
Guest: If we go back to those small-scale entrepreneurs of 120 years ago--it's not just Henry Ford; it's the Dodge brothers, the Fisher brothers, David Dunbar Buick, Billy Durant in nearby Flint--all of these men were trying to figure out how to solve this technological problem, making the automobile cost-effective, producing cheap, solid cars for ordinary people to run in the world. They managed to do that, Ford above all, by taking advantage of each other's ideas, each other's supplies, financing that was collaboratively arranged. And together they were able to achieve this remarkable technological feat. The problem was the big idea was a vast, vertically integrated factory. And that's a great recipe for short run productivity, but a really bad recipe for long run reinvention. And a bad recipe for urban areas more generally, because once you've got a River Rouge plant, once you've got this mass vertically integrated factory, it doesn't need the city; it doesn't give to the city. It's very, very productive but you could move it outside the city, as indeed Ford did when he moved his plant from the central city of Detroit to River Rouge. And then of course once you are at this stage of the technology of an industry, you can move those plants to wherever it is that cost minimization dictates you should go. And that's of course exactly what happens. Jobs first suburbanized, then moved to lower cost areas. The work of Tom Holmes at the U. of Minnesota shows how remarkable the difference is in state policies towards unions, labor, how powerful those policies were in explaining industrial growth after 1947. And of course it globalizes. It leaves cities altogether. And that's exactly what happened in automobiles. In some sense--and what was left was relatively little, because it's a sort of inversion[?] 
of the natural resource curse, because it was precisely because Detroit had these incredibly productive machines that they squeezed out all other sources of invention--rather than having lots of small entrepreneurs you had middle managers for General Motors (GM) and Ford. And those guys were not going to be particularly adept at figuring out some new industry and new activity when the automobile production moved elsewhere or declined. And that's at least how I think about this--that successful cities today are marked by small firms, smart people, and connections to the outside world. And that was what Detroit was about in 1890 but it's not what Detroit was about in 1970. And I think that sowed the seeds of decline.
4:25 Russ: So, one way to describe what you are saying is in the early part of the 20th century, Detroit was something like Silicon Valley, a hub of creative talent, a lot of complementarity between the ideas and the supply chain and interactions between those people that all came together. Lots of competition, which encouraged people to try harder and innovate, or do the best they could. Are you suggesting then that Silicon Valley is prone to this kind of change at some point? If the computer were to become less important somewhere down the road or produced in a different way?
Guest: The question is to what extent do the Silicon Valley firms become dominated by very strong returns to scale, a few dominant firms capitalize on it. I think it's built into the genes of every industry that they will eventually decline. The question is whether or not the region then reinvents itself. And there are two things that enable particular regions to reinvent themselves. One is skills, measured education, human capital. The share or fraction of the metropolitan area with a college degree as of 1940 or 1960 or 1970 has been a very good predictor of whether metropolitan areas, particularly northeastern or northwestern ones, have been able to turn themselves around. And a particular form of human capital, entrepreneurial human capital, also seems to be critical, despite the fact that our proxies for entrepreneurial talent are relatively weak. We typically use things like the number of establishments per worker in a given area, or the share of employment in startups from some initial time period. Those weak proxies are still very, very strong predictors of urban regeneration: places that have lots of little firms have managed to do much better than places that were dominated by a few large firms, particularly if they are in a single industry. So, let's think for a second about Silicon Valley. Silicon Valley has lots of skilled workers. That's good. But what I don't know is whether Silicon Valley is going to look like it's dominated by a few large firms, Google playing the role of General Motors. Or whether or not it will continue to have lots of little startups. There's nothing wrong with big firms in terms of productivity. But they tend to train middle managers, not entrepreneurs. So that's, I think, the other thing to look for. And one of the things that we have seen historically is that those little entrepreneurs are pretty good at switching industries when they need to. 
Think about New York, where the dominant industry was garment manufacturing. It was a larger industrial cluster in the 1950s than automobile production was. But those small-scale people who led those garment firms were pretty adept at doing something else when the industry jettisoned hundreds of thousands of jobs in the 1960s, in a way that the middle managers for U.S. Steel or General Motors were not.
by Edward Glaeser, Hosted by Russ Roberts, Library of Economics and Liberty | Read more:
Photo: Julian Dufort, Money Magazine