Friday, August 22, 2014
What a Difference a Day Makes
Before a romantic Caribbean weekend with her new boyfriend, Amanda Sanders decided she needed a little lift. So she called her doctor, Dr. Norman M. Rowe, to help out.
Dr. Rowe, a plastic surgeon in Manhattan, offers a quick fix — temporary breast enlargement. Instead of surgery, he injects a saline solution into the breasts, which briefly expands them.
The procedure began as a way for women seeking breast enhancement to determine how they might look if they chose surgery. “We can take pictures and put them on computers, but those are sometimes unrealistic and can lead to false expectations,” Dr. Rowe said (giving new meaning, perhaps, to the term “falsies”). “So we said, if patients are unsure if they want implants, let’s put saline in the breast and let them live with it for 24 hours to see how they like it.”
It may not surprise that the injections were soon being requested as pick-me-ups for parties, weddings, bar mitzvahs, red-carpet events or, as with Ms. Sanders, a tropical vacation.
Ms. Sanders, 41, an image consultant in New York and a mother of two, had been toying with the idea of a breast lift to enhance her “very shallow C cup,” but she was a little reluctant. When she heard of the temporary saline option (cost: $3,500), she leapt at the chance. Twice.
“It was worth it,” she said. “I could wear halter tops and a string bikini and feel really sexy. I’m in the business of vanity. As an image consultant, I have to look the part and be the part.”
While “lunchtime lifts” using injectable fillers similar to Restylane or Juvéderm are available in Europe, they are not F.D.A.-approved in the United States. Macrolane, another filler, was banned in Britain as a breast injectable because it was thought to cloud mammogram readings, among other complications. Saline is essentially saltwater that is absorbed into the bloodstream in about 24 hours.
Breast enhancement surgeries are decidedly popular in the United States. According to the American Society for Aesthetic Plastic Surgery, 313,327 breast augmentations and 137,233 breast lifts were performed in 2013. A noninvasive procedure like a saline injection would seem to be just what the doctor ordered.
by Abby Ellin, NY Times | Read more:
Image: Caryn Posnansky
The Art of Sprezzatura
Sprezzatura is a word often bandied around within the world of fashion and style with reckless abandon. Much like other trademark words we use such as ‘essential’, ‘classic’ and ‘steez’ (which I hate, by the way), the definition of sprezzatura has come to mean many different and varying things, thanks to the general overuse and the lack of knowledge about the subject in general. Luckily, this is where FashionBeans comes in.
Regardless of whether you’re a beginner to the world of menswear or a seasoned pro, we can all benefit from a quick reminder of what sprezzatura really means and how it can be utilised in your own style. It’s an Italian word that first shows up in The Book of the Courtier by Baldassare Castiglione, where it is defined as: ‘a certain nonchalance, so as to conceal all art and make whatever one does or says appear to be without effort and almost without any thought about it.’ This essentially boils down to making difficult actions look easy while concealing the conscious effort that went into them. Or as Yeezy would say – ‘what? This old thing?’
So it boils down to making it seem like you don’t care then? Well, sort of. The easiest translation of sprezzatura is ‘artful dishevelment’ and there is a fine line between achieving it and simply being sloppy.
by Matt Allinson, FashionBeans | Read more:
Image: uncredited
Wednesday, August 20, 2014
America in Decay: The Sources of Political Dysfunction
The story of the U.S. Forest Service is not an isolated case but representative of a broader trend of political decay; public administration specialists have documented a steady deterioration in the overall quality of American government for more than a generation. In many ways, the U.S. bureaucracy has moved away from the Weberian ideal of an energetic and efficient organization staffed by people chosen for their ability and technical knowledge. The system as a whole is less merit-based: rather than coming from top schools, 45 percent of recent new hires to the federal service are veterans, as mandated by Congress. And a number of surveys of the federal work force paint a depressing picture. According to the scholar Paul Light, “Federal employees appear to be more motivated by compensation than mission, ensnared in careers that cannot compete with business and nonprofits, troubled by the lack of resources to do their jobs, dissatisfied with the rewards for a job well done and the lack of consequences for a job done poorly, and unwilling to trust their own organizations.”
WHY INSTITUTIONS DECAY
In his classic work Political Order in Changing Societies, the political scientist Samuel Huntington used the term “political decay” to explain political instability in many newly independent countries after World War II. Huntington argued that socioeconomic modernization caused problems for traditional political orders, leading to the mobilization of new social groups whose participation could not be accommodated by existing political institutions. Political decay was caused by the inability of institutions to adapt to changing circumstances. Decay was thus in many ways a condition of political development: the old had to break down in order to make way for the new. But the transitions could be extremely chaotic and violent, and there was no guarantee that the old political institutions would continuously and peacefully adapt to new conditions. (...)
The very stability of institutions, however, is also the source of political decay. Institutions are created to meet the demands of specific circumstances, but then circumstances change and institutions fail to adapt. One reason is cognitive: people develop mental models of how the world works and tend to stick to them, even in the face of contradictory evidence. Another reason is group interest: institutions create favored classes of insiders who develop a stake in the status quo and resist pressures to reform. (...)
Political decay thus occurs when institutions fail to adapt to changing external circumstances, either out of intellectual rigidities or because of the power of incumbent elites to protect their positions and block change. Decay can afflict any type of political system, authoritarian or democratic. And while democratic political systems theoretically have self-correcting mechanisms that allow them to reform, they also open themselves up to decay by legitimating the activities of powerful interest groups that can block needed change.
This is precisely what has been happening in the United States in recent decades, as many of its political institutions have become increasingly dysfunctional. A combination of intellectual rigidity and the power of entrenched political actors is preventing those institutions from being reformed. And there is no guarantee that the situation will change much without a major shock to the political order.
by Francis Fukuyama, Foreign Affairs | Read more:
Image: Max Whittaker/Reuters
Tuesday, August 19, 2014
The Teaching Class
When Mary Margaret Vojtko died last September—penniless and virtually homeless and eighty-three years old, having been referred to Adult Protective Services because the effects of living in poverty made it seem to some that she was incapable of caring for herself—it made the news because she was a professor. That a French professor of twenty-five years would be let go from her job without retirement benefits, without even severance, sounded like some tragic mistake. In the Pittsburgh Post-Gazette op-ed that broke the story, Vojtko’s friend and attorney Daniel Kovalik describes an exchange he had with a caseworker from Adult Protective Services: “The caseworker paused and asked with incredulity, ‘She was a professor?’ I said yes. The caseworker was shocked; this was not the usual type of person for whom she was called in to help.” A professor belongs to the professional class, a professor earns a salary and owns a home, probably with a leafy yard, and has good health insurance and a retirement account. In the American imagination, a professor is perhaps disheveled, but as a product of brainy eccentricity, not of penury. In the American university, this is not the case.
Most university-level instructors are, like Vojtko, contingent employees, working on a contract basis year to year or semester to semester. Some of these contingent employees are full-time lecturers, and many are adjunct instructors: part-time employees, paid per class, often without health insurance or retirement benefits. This is a relatively new phenomenon: in 1969, 78 percent of professors held tenure-track positions. By 2009 that share had shrunk to 33.5 percent. The rest of the professors holding jobs—whether part time or full time—do so without any job security. These are the conditions that left Vojtko in such a vulnerable position after twenty-five years at Duquesne University. Vojtko was earning between $3,000 and $3,500 per three-credit course. During years when she taught three courses per semester, and an additional two over the summer, she made less than $25,000, and received no health benefits through her employer. Though many universities limit the number of hours that adjunct professors can work each semester, keeping them nominally “part-time” employees, teaching three three-credit courses is certainly a full-time job. These circumstances are now the norm for university instructors, as the number of tenured and tenure-track positions shrinks and the ranks of contingent laborers swell.
A moment of full disclosure: I am an adjunct. I taught freshman composition at Columbia University for two years as a graduate student, then for a few semesters more as an adjunct after I finished my degree. I now tutor in a writing center in the City University of New York system. Many of my friends do this same kind of work at colleges around New York City, commuting from campus to campus, cobbling together more-than-full-time work out of multiple part-time jobs. We talk a lot about how to make adjuncting livable, comparing pay rates at different writing centers and English departments. We crowdsource answers to questions about how to go to the dentist, for example, since none of us has dental insurance—wait for a Groupon for a cleaning, or go to the student dentists at NYU for anything urgent. I do have health insurance at my current job, though I get an email a few times per year informing me that it may expire soon because negotiations between the union and the university over adjunct health insurance have stalled. This is mostly fine—my coverage has never actually been interrupted—but it is hard to swallow the notion that the university that employs me is constantly trying to get out of providing health insurance to teachers, particularly when it announces that it is giving our new chancellor an $18,000/month apartment for free.
So I have closely followed the news and op-ed coverage of the adjunct bubble that followed Vojtko’s death. And while I have been glad to see more attention being paid to the working conditions in higher education, I’ve been surprised that the issue is consistently framed as purely a workers’ rights problem. It is this, of course. But it is not only this.
by Rachel Riederer, Guernica | Read more:
Image: Zeke Berman
Everybody Smiley Poops
We were at a concert when Maggie saw the woman her husband had cheated on her with. She’d planned her response in advance: a simple introduction, then she’d just stand there and watch the other woman react to her last name with (ideally) horrified guilt. But in the moment, Maggie couldn’t speak. Her face flushed. Her hands shook. We left immediately.
On the walk home she finally spoke, the words coming so fast she almost choked. I rubbed her back. This is when face-to-face communication is crucial—nothing I could have said would have helped, but gestures (hugs, back rubs, concerned face) made it clear I’d heard her and understood. According to the psychiatric journal Activitas Nervosa Superior, “Emotion arises from sensory stimulation and is typically accompanied by physiological and behavioral changes in the body.” We express our emotions most clearly through physical reactions—Maggie’s flushed cheeks and shaking hands—and receive comfort most effectively this way too. In highly emotional situations, gestures and expressions can usually communicate better than words.
But gestures weren’t available the next day, so I did what any busy, young, too-broke-to-send-flowers New Yorker checking on a friend from work would do: I sent a few emojis. But what was the right combination of tiny pictures to say, “I’m sorry you’re going through this and I know it sucks hard now and I want you to know it won’t suck forever because you’re great”?
Turns out, for Maggie at least, the right emoji sequence to symbolize sympathizing with heartbreak is a series of smiley poops. Two days later we had dinner in person. “I’m feeling better,” she said. “I had a session with my therapist… And your text the other day made me laugh—that helped.”
Tiny cartoon poops helped.
It sounds stupid. Their cuteness makes any serious conversation about emojis difficult, like talking to a baby in a grown-up voice. It feels embarrassing to posit these little cartoons as a vehicle of emotion or even a global language, yet they really are an almost universally understood form of communication. Most studies by social scientists, linguists, and psychologists skew pro-emoji (“A smiling emoticon activates the same brain areas as the image of a person smiling and can therefore be considered as a representation of that emotion” reads one study), but that doesn’t stop them from seeming a little foolish: eensy cartoon images, designed to be typed alongside or in lieu of text, uncool in their enthusiasm and earnestness.
Though the most tweeted emoji is a heart, most other top emojis, lending weight to the emoticon/emotion connection, are faces:
…and the list goes on. Smiley poop is #90 out of 845, not bad as far as poop goes, but definitely not top 10. Even so, it’s been used more than eight million times since July of last year, on Twitter alone, according to the mesmerizing, potentially seizure-inducing real-time emojitracker.com.
Why can’t we stop using emojis? Are they ridiculous, turning us into the equivalent of button-pressing Neanderthals? Or are they brilliant, providing a global medium to express our emotions and creativity? “Emojis mean everything and they mean nothing at the same time,” designer Liza Nelson wrote on her emoji art tumblr, Emoji IRL. LOL. “They’re really quite stupid. And they’re the best thing that ever happened to our generation.”
by Mary Mann, Medium | Read more:
Images: Liza Nelson
Monday, August 18, 2014
The Breast of Times: 10 Years of Irrational Nipple Controversy
[ed. If it weren't for nipples, there'd be no point to tits.]
Janet Jackson at Super Bowl XXXVIII
The nipple-baring that started the national conversation about wardrobe malfunctions took place at the 2004 Super Bowl XXXVIII halftime show. When Justin Timberlake dance-ripped Jackson’s top, viewers caught a glimpse of Jackson’s nipple for 9/16th of a second. 1.4 million people went on to complain to the Federal Communications Commission about the supposedly indecent exposure. (Yes, Americans complained more about 9/16th of a second of nipple on CBS than about the 10+ years of Two and a Half Men they’ve been airing.) CBS was fined $550,000 by the FCC, and Jackson’s career arguably suffered after the fact. Timberlake is doing just fine for himself.
Nancy Grace on Dancing with the Stars
In 2011, justice-seeker Nancy Grace’s Dancing with the Stars routine ended up exposing one of her nipples briefly — as one can do when partaking in a vigorous physical activity while wearing a deep-cut dress. Grace vehemently denies it to this day. Meanwhile, contestants are getting entirely naked on Dancing with the Stars’ equivalent in Argentina.
The New Yorker on Facebook
When Mike Stevens posted a cartoon (below) on The New Yorker’s Facebook page in 2012, the magazine was temporarily banned from the site for violating its terms of service.
As Bob Mankoff explained:
“Some sleuthing showed that the offense was actually caused by the inclusion of these two dots in the cartoon,
which, by the way, also contained these two non-offending dots.”
Now, we all know that The New Yorker is a filthy rag staffed by smut peddlers, but Facebook seemingly overreacted with this one.
by Gabriella Paiella, The Hairpin | Read more:
Image: New Yorker
Crab, Spinach & Coconut Soup
Prep Time: 15 minutes
Cook Time: 20 minutes
Total Time: 35 minutes
Yield: 4 to 6 servings
Ingredients:
4 slices bacon, chopped
1 medium yellow onion, chopped
2 cloves garlic, minced
1 medium russet potato, peeled and chopped
1 teaspoon salt
1/2 teaspoon black pepper
1/2 teaspoon red pepper flakes (more or less to taste)
1 teaspoon fresh thyme leaves, minced
2 cups chopped Roma tomatoes
4 cups chicken stock
1 can coconut milk
8 ounces lump crabmeat
5 cups fresh spinach
Sliced scallions, for garnish
Reserved bacon, for garnish
Fresh lime wedges, for garnish
Instructions:
1. Brown bacon in a large Dutch oven or stock pot. Remove bacon and set aside.
2. Add onion to the pot and sauté in the bacon fat for 3 to 4 minutes. Add garlic and continue sautéing for 1 minute. Add potatoes, salt, pepper, red pepper flakes, and thyme. Stir to combine.
3. Add tomatoes and chicken stock and bring to a boil. Reduce heat, cover and simmer for 10 minutes or until potatoes are tender.
4. Add coconut milk, stirring to combine. Add spinach and continue cooking until wilted. Immediately before serving, add crabmeat and stir gently so you don’t break up the lumps. Serve immediately with scallions, reserved bacon, and lime wedges.
The Lethality of Loneliness
Sometime in the late ’50s, Frieda Fromm-Reichmann sat down to write an essay about a subject that had been mostly overlooked by other psychoanalysts up to that point. Even Freud had only touched on it in passing. She was not sure, she wrote, “what inner forces” made her struggle with the problem of loneliness, though she had a notion. It might have been the young female catatonic patient who began to communicate only when Fromm-Reichmann asked her how lonely she was. “She raised her hand with her thumb lifted, the other four fingers bent toward her palm,” Fromm-Reichmann wrote. The thumb stood alone, “isolated from the four hidden fingers.” Fromm-Reichmann responded gently, “That lonely?” And at that, the woman’s “facial expression loosened up as though in great relief and gratitude, and her fingers opened.”
Fromm-Reichmann would later become world-famous as the dumpy little therapist mistaken for a housekeeper by a new patient, a severely disturbed schizophrenic girl named Joanne Greenberg. Fromm-Reichmann cured Greenberg, who had been deemed incurable. Greenberg left the hospital, went to college, became a writer, and immortalized her beloved analyst as “Dr. Fried” in the best-selling autobiographical novel I Never Promised You a Rose Garden (later also a movie and a pop song). Among analysts, Fromm-Reichmann, who had come to the United States from Germany to escape Hitler, was known for insisting that no patient was too sick to be healed through trust and intimacy. She figured that loneliness lay at the heart of nearly all mental illness and that the lonely person was just about the most terrifying spectacle in the world. She once chastised her fellow therapists for withdrawing from emotionally unreachable patients rather than risk being contaminated by them. The uncanny specter of loneliness “touches on our own possibility of loneliness,” she said. “We evade it and feel guilty.”
Her 1959 essay, “On Loneliness,” is considered a founding document in a fast-growing area of scientific research you might call loneliness studies. Over the past half-century, academic psychologists have largely abandoned psychoanalysis and made themselves over as biologists. And as they delve deeper into the workings of cells and nerves, they are confirming that loneliness is as monstrous as Fromm-Reichmann said it was. It has now been linked with a wide array of bodily ailments as well as the old mental ones.
In a way, these discoveries are as consequential as the germ theory of disease. Just as we once knew that infectious diseases killed, but didn’t know that germs spread them, we’ve known intuitively that loneliness hastens death, but haven’t been able to explain how. Psychobiologists can now show that loneliness sends misleading hormonal signals, rejiggers the molecules on genes that govern behavior, and wrenches a slew of other systems out of whack. They have proved that long-lasting loneliness not only makes you sick; it can kill you. Emotional isolation is ranked as high a risk factor for mortality as smoking. A partial list of the physical diseases thought to be caused or exacerbated by loneliness would include Alzheimer’s, obesity, diabetes, high blood pressure, heart disease, neurodegenerative diseases, and even cancer—tumors can metastasize faster in lonely people.
The psychological definition of loneliness hasn’t changed much since Fromm-Reichmann laid it out. “Real loneliness,” as she called it, is not what the philosopher Søren Kierkegaard characterized as the “shut-upness” and solitariness of the civilized. Nor is “real loneliness” the happy solitude of the productive artist or the passing irritation of being cooped up with the flu while all your friends go off on some adventure. It’s not being dissatisfied with your companion of the moment—your friend or lover or even spouse—unless you chronically find yourself in that situation, in which case you may in fact be a lonely person. Fromm-Reichmann even distinguished “real loneliness” from mourning, since the well-adjusted eventually get over that, and from depression, which may be a symptom of loneliness but is rarely the cause. Loneliness, she said—and this will surprise no one—is the want of intimacy.
Today’s psychologists accept Fromm-Reichmann’s inventory of all the things that loneliness isn’t and add a wrinkle she would surely have approved of. They insist that loneliness must be seen as an interior, subjective experience, not an external, objective condition. Loneliness “is not synonymous with being alone, nor does being with others guarantee protection from feelings of loneliness,” writes John Cacioppo, the leading psychologist on the subject. Cacioppo privileges the emotion over the social fact because—remarkably—he’s sure that it’s the feeling that wreaks havoc on the body and brain.
by Judith Shulevitz, TNR | Read more:
by Judith Shulevitz, TNR | Read more:
Image: Ariel Lee
The Mystery of Murakami
[ed. Something I've wondered about, too. Having read half a dozen Murakami books over the years, including what are widely considered his masterpieces (1Q84, The Wind-Up Bird Chronicle, Norwegian Wood), I find the plotting is always familiar, the sentences are indeed sometimes awful, and the story line inevitably peters out somewhere deep in the woods. But somehow they remain strangely entertaining.]
Murakami, who learned to speak English by reading American crime novels, begins with an opening paragraph that would make David Goodis proud. Tsukuru Tazaki, recently turned 20, is planning his suicide: “From July of his sophomore year in college until the following January, all Tsukuru Tazaki could think about was dying.” But where Goodis would write something like “All right, he told himself firmly, let’s do it and get it over with,” Murakami is balletic, evoking metaphysical realms and a fine sense of the grotesque. “Crossing that threshold between life and death,” he writes, “would have been easier than swallowing down a slick, raw egg.” It is one of the key aspects of his style, this seamless transition from noirish dread to mystical rumination; the most perfect Murakami title, which really could have been used for any of the 13 novels he has written since 1979, remains Hard-Boiled Wonderland and the End of the World. In Murakamiland, death means merely traveling across a “threshold” between reality and some other world. It is not necessarily the end. In fact, as we soon learn, Tsukuru’s obsession with death is only the beginning. (...)
And page after page, we are confronted with the riddle that is Murakami's prose. No great writer writes as many bad sentences as Murakami does. His crimes include awkward construction ("Just as he appreciated Sara’s appearance, he also enjoyed the way she dressed”); cliché addiction (from a single, paragraph-long character description: “He really hustled on the field … He wasn’t good at buckling down … He always looked people straight in the eye, spoke in a clear, strong voice, and had an amazing appetite … He was a good listener and a born leader”); and lazy repetition (“Sara gazed at his face for some time before speaking,” followed shortly by “Sara gazed at Tsukuru for a time before she spoke”). The dialogue is often robotic, if charmingly so. (...)
How is the author of these lines capable of an atrocity like “Her smile had ratcheted up a notch”? The most charitable explanation is that in Murakami’s fiction, his ugly sentences, though often distracting, serve a strategic purpose. Like the hokey vernacular and use of brand names in Stephen King’s fiction, Murakami’s impoverished language situates us in a realm of utter banality, a simplified black-and-white world in which everything is as it appears. When, inevitably, we pass through a wormhole into an uncanny dimension of fantasy and chaos, the contrast is unnerving.
by Nathaniel Rich, The Atlantic | Read more:
Image: Richie Pope

[ed. Got my motorcycle license yesterday. Yay! Unbeknownst to me, someone took a picture after my last riding test. Too bad about that instructor...]
via:
Sunday, August 17, 2014
Honda’s Global Strategy? Go Local.
When financial journalist Jeffrey Rothfeder set out to understand why globalization has failed, he got pulled into the story of Honda, a company that has thrived as a multinational. In more than 60 years in business, Honda has never lost money. Its profit margins are the highest in the industry and its factories among the most productive. Rothfeder talked with The Washington Post about “Driving Honda,” in which he explores the enduring culture established by company founder Soichiro Honda, a perfectionist who embraced mistakes as a way to learn and improve. He also goes inside Honda’s plant in Lincoln, Ala., a model of flexible manufacturing. The following was edited for length and clarity.
How did this book come about?
I didn’t think I’d be writing about Honda or even a specific company. What interested me more was the issue of why globalization is failing, because for two decades now it’s been the guiding principle that runs U.S. economic policy — that there is going to be free trade and we’ll lose the borders and there’s no difference between General Electric here and General Electric in China. And essentially globalization was going to lift all boats economically.
But it isn’t working out the way people had hoped. Most multinational companies do not make money in their globalized operations. Classically, General Electric will say that they make more than 50 percent of their revenue outside the U.S., but they are losing money in many parts of the world.
So I was wondering, what would it take to make a successful multinational? I was also interested in the auto industry because it is such a global industry.
Why Honda?
Because it is one of the few multinational companies that has succeeded at globalization. Their profit margins are high in the auto industry. Almost everywhere they go — over 5 percent profit margins. In most markets, they consistently are in the top 10 of specific models that sell. They’ve never lost money. They’ve been profitable every year. And they’ve been around since 1949, 1950. And it’s a company that really does see the world as its market and thinks very hard about what it takes to be successful at that.
Everything it does — from corporate culture to its operational principles to the way it globalizes — was different from any other company I’ve ever looked at.
by Kelly Johnson, Washington Post | Read more:
Image: Blake J. Discher, AP
Reading Upward
How many times have we heard this opinion expressed? On this occasion the speaker was a literary critic on Canadian radio with whom I was discussing my recent blog post “Reading: The Struggle.” Needless to say the sentiment comes along with the regret that people are reading less and less these days and the notion of a hierarchy of writing with the likes of Joyce and Nabokov at the top and Fifty Shades of Grey at the bottom. Between the two it is assumed that there is a kind of neo-Platonic stairway, such that from the bottom one can pass by stages to the top, a sort of optimistic inversion of the lament that soft porn will lead you to hard and anyone smoking marijuana is irredeemably destined to descend through coke and crack to heroin. The user, that is, is always drawn to a more intense form of the same species of experience.
Of course, while the fear that one will descend from soft to hard drugs tends to be treated as a near certainty, the hope that one might ascend from Hermione Granger to Clarissa Dalloway is usually expressed as a tentative wish. Nevertheless, it serves to justify the intellectual’s saying, “Frankly, I don’t mind what they’re reading, etc.” (as if this were some kind of concession), and underwrites our cautious optimism when we see an adolescent son or daughter immersed in George R.R. Martin. It’s not Dostoevsky, but one day it might be, and in any event it’s better than a computer game or TV since these are not part of the reading stairway.
Is any of this borne out by reality? Do people really pass from Fifty Shades of Grey to Alice Munro? (Through how many intermediate steps? Never to return?) And if it is not true, why does a certain kind of intellectual continue to express this hope? To what end?
In 1948 W.H. Auden published an essay, “The Guilty Vicarage,” on what he calls his “addiction” to detective novels. The point he makes is that these schematic narratives serve the escapist needs of readers who share his particular psychological make-up. These people will not, as a rule, Auden claims, with some elaborate argument, be the same readers as readers of light romances or thrillers, or fantasy fiction. Each genre has its pull on different types of minds. In any event, if he, Auden, is to get any serious work done, he has to make sure that there are no detective novels around, since if there are he can’t resist opening them, and if he opens them he won’t close them till he’s reached the end. Or rather, no new detective novels; for Auden notes this difference between the stuff of his addiction and literature: that the detective novel is no sooner read than forgotten and never invites a second reading, as literature often does.
The implications are clear enough. Auden denies any continuity between literary novels and genre novels, or indeed between the different genres. One does not pass from lower to higher. On the contrary one might perfectly well fall from the higher to the lower, or simply read both, as many people eat both good food and junk food, the only problem being that the latter can be addictive; by constantly repeating the same gratifying formula (the litmus test of genre fiction) it stimulates and satisfies a craving for endless sameness, to the point that the reader can well end up spending all the time he has available for reading with exactly the same fare. (My one powerful experience of this was a spell reading Simenon’s Maigret novels; after five or six it gets harder and harder to distinguish one from another, and yet one goes on.)
Auden, it should be noted, does not propose to stop reading detective novels—he continues to enjoy them—and expresses no regret that people read detective novels rather than, say, Faulkner or Charlotte Brontë, nor any wish that they use detective novels as a stepping stone to “higher things.” He simply notes that he has to struggle to control his addiction, presumably because he doesn’t want to remain trapped in a repetitive pattern of experience that allows no growth and takes him nowhere. His essay, in fact, reads like the reasoning of someone determined to explain to himself why he must not waste too much time with detective novels, and at the same time to forgive himself for the time he does spend with them. If anything, genre fiction prevents engagement with literary fiction, rather than vice versa, partly because of the time it occupies, but more subtly because while the latter is of its nature exploratory and potentially unsettling the former encourages the reader to stay in a comfort zone.
by Tim Parks, NYR | Read more:
Image: Arnold Eagle: Boys Climbing the Fire Escape of a Deserted Building, 1935