Friday, January 11, 2013


Bruno Liljefors The Cat Keppe 1889

Ask Polly: Will I Be Alone Forever?

Dear Polly,

This may be one of those "What’s my problem? IS this a problem?" problems.

I’m 28 and I’ve been single for six years. Very single. As in, years going by where I didn’t have much sex and had little to no romance at all. I would tell you about my last relationship but it’s not that interesting and to be honest I don’t think about it much. My parents fought a lot when I was little and drama ensued for many years, but we all love each other and are mostly nice to each other. Shit’s pretty healthy and I’ve worked hard to make it that way. I exercise and paint, I have a good job in public relations, I love my friends, and I think I’m basically pretty and nice and interesting enough. Someone described me as popular the other day (yes, apparently adults do use this word) and it made me feel really weird and kind of guilty, but I had to admit it’s true. Just not with men.

And the thing is, I love men. Being in a committed relationship, having a life partner and some babies with that partner, is important to me. Fun is also important to me, and meeting new people and being open. So in the abstract, either being in a serious relationship or dating sounds good, but I just find the whole thing so, so stressful. I never even get close. I think it’s because as a kid I felt alone, so I thought a lot about finding the person who was my love and best friend forever but now it’s like I can’t get involved with someone without it feeling like a big deal. And by involved I mean, go on more than two dates. For six years.

I don’t want to be this way. I believe that even if some man broke my heart I would be fine. Look, I just typed it! My problem is I can’t stop freaking out. I love how you so often tell people to just chill out. I totally think you’re right, we tend to make problems for ourselves and overdo it on the being-in-control thing. But how do I not do that?

I made a list of the ways I’m weird which are preventing me from finding someone who really likes and understands me, then deleted the list because I realized I sound just like everyone else (probably there are men out there who like tequila and art museums and hockey). I know this is probably the most banal asking-for-advice letter ever, but being single is this big thing making me unhappy and I blame myself. Some things are just better with a boyfriend, like adopting a puppy and traveling to Asia and having sex and becoming a famous painter.

How do people meet each other and balance a reluctance to get involved with an inflated conception of love? How can I get myself to date someone who doesn’t immediately seem perfect? How can I practice relaxing so that my life is more interesting and I can get closer to having love like I always wanted? Do you think I am crazier than I’m admitting?

Very truly yours,
Locked Up Abroad

by Polly Esther, The Awl |  Read more:

Jade Maki | Irina Rumyantseva

A Masterpiece You Might Not Want to See


Michael Haneke’s Amour is the ultimate horror film. With its portrayal of the shocks, the cruelties and indignities to which old age and disease subject a happily married Parisian couple, it’s far scarier and more disturbing than Hitchcock’s Psycho, Kubrick’s The Shining, or Polanski’s Rosemary’s Baby, and like those films, it stays with you long after you might have chosen to forget it. Like all of Haneke’s work, Amour raises interesting and perhaps unanswerable questions: Can a film be a masterpiece and still make you want to warn people not to see it? Can a movie make you think that an artist has done something extraordinary, original, extremely difficult—and yet you cannot imagine yourself uttering the words, “You’ve got to go see Amour”?

It’s hard not to appreciate the film’s extraordinary qualities. Jean-Louis Trintignant, Emmanuelle Riva, and Isabelle Huppert deliver performances so convincing and delicately nuanced that we forget they are actors, let alone French movie stars; their ability to make us think we are watching real people is partly why the film is at once so impressive and so distressing. Every camera angle seems meticulously chosen, every scene artfully composed; every detail of costume and setting—a worn bathrobe, a pair of slippers, dresses hanging in a closet, the precise way that the piano is positioned in the living room, the “good” furniture so familiar that it has become almost invisible to its owners—is appropriate to the two elderly musicians whose travails we are watching; every exchange appears to have been written and shot with perfect confidence about precisely how much to conceal or reveal.

But Amour can be excruciating to watch. The film’s narrative arc is more or less clear from the opening scene, in which we see firemen bashing their way into a handsome, high-ceilinged, old-fashioned Paris apartment. The workers are visibly disturbed by a smell that turns out to come from the corpse of an old woman: nicely dressed, comfortably positioned in bed, her hair fixed, decked with flowers, all of it conveying the odd jauntiness one sometimes observes in Sicilian catacombs. Then, the action shifts back in time. An elderly couple attends the recital of a young pianist, the wife’s former student. Returning home, Georges and Anne notice that someone has crudely and unsuccessfully tried to force open the lock on their front door. Haneke has long been fascinated by the subject of home invasion; in his most notorious work, the 1997 Funny Games (remade a decade later in Hollywood), two vicious sadists force their way in and proceed to torture an upper-middle-class couple and their son. My favorite of his films, Caché (2005), considers the possibility that a mysterious, extended, and possibly malevolent home surveillance may be a more subtly destructive form of invasion: a highly effective psychological weapon. But Amour reminds us that the most dangerous intruders are the more common—indeed the unavoidable—ones: time and death.

One morning, over breakfast, Anne glazes over, impervious to Georges’s attempts to rouse her. She comes to, but without enough motor control to pour a cup of tea. In a characteristically masterly scene, Haneke persuades us—precisely rendering Georges’s initial terror, his relief, and then the alarming sight of the teapot shaking in Anne’s hand—that this is exactly how it would be if a formerly healthy loved one suddenly and helplessly sloshed tea all over the kitchen table.

An unsuccessful operation leaves Anne paralyzed on one side. The couple copes, with some difficulty, but with dignity and grace. Then Anne sustains another stroke, and the question arises of how much of her remains in the bed, drinking from a sippy cup and howling the same words over and over. As her mental and physical condition degenerates, we watch as the strain begins to unhinge Georges. The only break in his dreadful routine is provided by unsatisfying visits from the couple’s daughter, played by Isabelle Huppert. A musician, like her parents, the high-strung and harried Eva understandably prefers to focus on her own familiar discontents—an unfaithful husband, a troubled marriage—rather than face what has become reality for her parents.

Alerted by the firemen’s break-in, we wait for the violence that we intuit will occur, though we don’t know when. And when at last it comes, it is a relief, both for the characters and the audience. That this violence is an act of love (among other emotions) may be part of what has inspired critics to talk about Amour as evidence that its director—known for his provocative, confrontational use of cinematic mayhem—has at last discovered the power of our more tender feelings. Amour, many critics have argued, is Haneke with a heart.

by Francine Prose, NYRB |  Read more:
Photo: Sony Pictures Classics

Hayashi Takahiko. 1990.  Back bone of the wind-17P

The Dunbar Number, From the Guru of Social Networks

A little more than 10 years ago, the evolutionary psychologist Robin Dunbar began a study of the Christmas-card-sending habits of the English. This was in the days before online social networks made friends and “likes” as countable as miles on an odometer, and Dunbar wanted a proxy for meaningful social connection. He was curious to see not only how many people a person knew, but also how many people he or she cared about. The best way to find those connections, he decided, was to follow holiday cards. After all, sending them is an investment: You either have to know the address or get it; you have to buy the card or have it made from exactly the right collage of adorable family photos; you have to write something, buy a stamp, and put the envelope in the mail. These are not huge costs, but most people won’t incur them for just anybody.

Working with the anthropologist Russell Hill, Dunbar pieced together the average English household’s network of yuletide cheer. The researchers were able to report, for example, that about a quarter of cards went to relatives, nearly two-thirds to friends, and 8 percent to colleagues. The primary finding of the study, however, was a single number: the total population of the households each set of cards went out to. That number was 153.5, or roughly 150.

This was exactly the number that Dunbar expected. Over the past two decades, he and other like-minded researchers have discovered groupings of 150 nearly everywhere they looked. Anthropologists studying the world’s remaining hunter-gatherer societies have found that clans tend to have 150 members. Throughout Western military history, the size of the company—the smallest autonomous military unit—has hovered around 150. The self-governing communes of the Hutterites, an Anabaptist sect similar to the Amish and the Mennonites, always split when they grow larger than 150. So do the offices of W.L. Gore & Associates, the materials firm famous for innovative products such as Gore-Tex and for its radically nonhierarchical management structure. When a branch exceeds 150 employees, the company breaks it in two and builds a new office.

For Dunbar, there’s a simple explanation for this: In the same way that human beings can’t breathe underwater or run the 100-meter dash in 2.5 seconds or see microwaves with the naked eye, most cannot maintain many more than 150 meaningful relationships. Cognitively, we’re just not built for it. As with any human trait, there are outliers in either direction—shut-ins on the one hand, Bill Clinton on the other. But in general, once a group grows larger than 150, its members begin to lose their sense of connection. We live on an increasingly urban, crowded planet, but we have Stone Age social capabilities. “The figure of 150 seems to represent the maximum number of individuals with whom we can have a genuinely social relationship, the kind of relationship that goes with knowing who they are and how they relate to us,” Dunbar has written. “Putting it another way, it’s the number of people you would not feel embarrassed about joining uninvited for a drink if you happened to bump into them in a bar.”

While Dunbar has long been an influential scholar, today he is enjoying newfound popularity with a particular crowd: the Silicon Valley programmers who build online social networks. At Facebook (FB) and at startups such as Asana and Path, Dunbar’s ideas are regularly invoked in the attempt to replicate and enhance the social dynamics of the face-to-face world. Software engineers and designers are basing their thinking on what has come to be called Dunbar’s Number. Path, a mobile photo-sharing and messaging service founded in 2010, is built explicitly on the theory—it limits its users to 150 friends.

“What Dunbar’s research represents is that no matter how the march of technology goes on, fundamentally we’re all human, and being human has limits,” says Dave Morin, one of Path’s co-founders. To developers such as Morin, Dunbar’s insistence that the human capacity for connection has boundaries is a challenge to the ethos of Facebook, where one can stockpile friends by the thousands. Dunbar’s work has helped to crystallize a debate among social media architects over whether even the most cleverly designed technologies can expand the dimensions of a person’s social world. As he puts it, “The question is, ‘Does digital technology in general allow you to retain the old friends as well as the new ones and therefore increase the size of your social circle?’ The answer seems to be a resounding no, at least for the moment.”

by Drake Bennett, Bloomberg Businessweek |  Read more:
Photograph by Finn Taylor

There's More to Life Than Being Happy


[ed. This also brings to mind this previous post on Metamotivation.]

In September 1942, Viktor Frankl, a prominent Jewish psychiatrist and neurologist in Vienna, was arrested and transported to a Nazi concentration camp with his wife and parents. Three years later, when his camp was liberated, most of his family, including his pregnant wife, had perished -- but he, prisoner number 119104, had lived. In his bestselling 1946 book, Man's Search for Meaning, which he wrote in nine days about his experiences in the camps, Frankl concluded that the difference between those who had lived and those who had died came down to one thing: Meaning, an insight he came to early in life. When he was a high school student, one of his science teachers declared to the class, "Life is nothing more than a combustion process, a process of oxidation." Frankl jumped out of his chair and responded, "Sir, if this is so, then what can be the meaning of life?"

As he saw in the camps, those who found meaning even in the most horrendous circumstances were far more resilient to suffering than those who did not. "Everything can be taken from a man but one thing," Frankl wrote in Man's Search for Meaning, "the last of the human freedoms -- to choose one's attitude in any given set of circumstances, to choose one's own way."

Frankl worked as a therapist in the camps, and in his book, he gives the example of two suicidal inmates he encountered there. Like many others in the camps, these two men were hopeless and thought that there was nothing more to expect from life, nothing to live for. "In both cases," Frankl writes, "it was a question of getting them to realize that life was still expecting something from them; something in the future was expected of them." For one man, it was his young child, who was then living in a foreign country. For the other, a scientist, it was a series of books that he needed to finish. Frankl writes:
This uniqueness and singleness which distinguishes each individual and gives a meaning to his existence has a bearing on creative work as much as it does on human love. When the impossibility of replacing a person is realized, it allows the responsibility which a man has for his existence and its continuance to appear in all its magnitude. A man who becomes conscious of the responsibility he bears toward a human being who affectionately waits for him, or to an unfinished work, will never be able to throw away his life. He knows the "why" for his existence, and will be able to bear almost any "how."
In 1991, the Library of Congress and Book-of-the-Month Club listed Man's Search for Meaning as one of the 10 most influential books in the United States. It has sold millions of copies worldwide. Now, over twenty years later, the book's ethos -- its emphasis on meaning, the value of suffering, and responsibility to something greater than the self -- seems to be at odds with our culture, which is more interested in the pursuit of individual happiness than in the search for meaning. "To the European," Frankl wrote, "it is a characteristic of the American culture that, again and again, one is commanded and ordered to 'be happy.' But happiness cannot be pursued; it must ensue. One must have a reason to 'be happy.'"

According to Gallup, the happiness levels of Americans are at a four-year high -- as is, it seems, the number of best-selling books with the word "happiness" in their titles. At this writing, Gallup also reports that nearly 60 percent of all Americans today feel happy without a lot of stress or worry. On the other hand, according to the Centers for Disease Control, about 4 out of 10 Americans have not discovered a satisfying life purpose. Forty percent either do not think their lives have a clear sense of purpose or are neutral about whether their lives have purpose. Nearly a quarter of Americans feel neutral or do not have a strong sense of what makes their lives meaningful. Research has shown that having purpose and meaning in life increases overall well-being and life satisfaction, improves mental and physical health, enhances resiliency, enhances self-esteem, and decreases the chances of depression. On top of that, the single-minded pursuit of happiness is ironically leaving people less happy, according to recent research. "It is the very pursuit of happiness," Frankl knew, "that thwarts happiness."

by Emily Esfahani Smith, The Atlantic |  Read more:
Kacper Pempel/Reuters

Thursday, January 10, 2013

The Wailin' Jennys


The Parting Glass

Oh all the money that e'er I spent
I spent it in good company
And all the harm that e'er I've done
Alas, it was to none but me
And all I've done for want of wit
To memory now I can't recall
So fill to me the parting glass
Good night and joy be with you all

Oh all the comrades that e'er I've had
Are sorry for my going away
And all the sweethearts that e'er I've had
Would wish me one more day to stay
But since it falls unto my lot
That I should rise and you should not
I'll gently rise and I'll softly call
Good night and joy be with you all
Good night and joy be with you all

Photo: markk

My Name is Mark

Fonts of Inspiration


[ed. I would love to use a Palatino font and have even researched it, but the technicals of getting it done with Blogger are frankly beyond me.]

When did we all become amateur typography experts? Perhaps we should credit Steve Jobs, a calligraphy buff who built a bunch of cool typeface options into early Macs. By the time I got to college, any sophomore worth her salt had firm feelings about whether Palatino or Garamond looked better on her Classic II. And any professor worth her salt knew that a term paper printed in 12-point Courier was a desperate attempt to stretch eight thin pages to the required 10.

By 2007, some of us were actually watching a feature-length documentary about a font. We grew adept at spotting Helvetica, the ubiquitous "typeface of capitalism," on storefronts and billboards. We even took online quizzes that tested our capacity to distinguish its flat-topped t from Arial's slope-roofed impostor.

In 2008, a typeface won a presidential election. At least, that's the impression you may have gotten if you read one of the countless stories extolling the virtues of Gotham. Originally commissioned for a GQ redesign, Gotham came to define the Obama campaign's clean visual signature. The website of Hoefler & Frere-Jones, the foundry that invented Gotham, went momentarily viral after a catty blog post ridiculed the fonts used by rival campaigns. Though, to be fair, I find it hard to deny that the McCain logo seemed better suited to a downscale drugstore cologne.

Nowadays we raise a ruckus when Ikea abruptly switches its corporate identity from a customized version of Futura to Verdana.* We sign petitions proposing an outright ban on Comic Sans. We chuckle at cruel, font-based humor: "Comic Sans walks into a bar and the bartender says, 'We don't serve your type here.' " We leap to correct those who naively say "font" when the correct term is "typeface." (No doubt I've already done it in this essay, and will do so again. Many apologies.)

If you merely wish to be annoying at cocktail parties, Simon Garfield's 2011 book Just My Type covers the Ikea incident, the Comic Sans saga, and lots of other fun waypoints in the history of typography. If, however, your aim—like mine—is to blow past jovial dorkery, level up, and ascend to a realm reserved for the truly insufferable pedant ... may I recommend a new coffee table hardback from Stephen Coles? The Anatomy of Type offers granularity that would glaze the eyes of a normal, well-adjusted human. I couldn’t get enough of it.

by Seth Stevenson, Slate |  Read more:
Illustration from The Anatomy of Type

Me, Myself and I


The bluest period I ever spent was in Manhattan’s East Village, not so long back. I lived on East 2nd Street, in an unreconstructed tenement building, and each morning I walked across Tompkins Square Park to get my coffee. When I arrived the trees were bare, and I dedicated those walks to checking the progress of the blossoms. There are many community gardens in that part of town, and so I could examine irises and tulips, forsythia, cherry trees and a great weeping willow that seemed to drop its streamers overnight, like a ship about to lift anchor and sail away.

I wasn’t supposed to be in New York, or not like this, anyway. I’d met someone in America and then lost them almost instantly, but the future we’d dreamed up together retained its magnetism, and so I moved alone to the city I’d expected to become my home. I had friends there, but none of the ordinary duties and habits that comprise a life. I’d severed all those small, sustaining cords, and, as such, it wasn’t surprising that I experienced a loneliness more paralysing than anything I’d encountered in more than a decade of living alone.

What did it feel like? It felt like being hungry, I suppose, in a place where being hungry is shameful, and where one has no money and everyone else is full. It felt, at least sometimes, difficult and embarrassing and important to conceal. Being foreign didn’t help. I kept botching the ballgame of language: fumbling my catches, bungling my throws. Most days, I went for coffee in the same place, a glass-fronted café full of tiny tables, populated almost exclusively by people gazing into the glowing clamshells of their laptops. Each time, the same thing happened. I ordered the nearest thing to filter on the menu: a medium urn brew, which was written in large chalk letters on the board. Each time, without fail, the barista looked blankly up and asked me to repeat myself. I might have found it funny in England, or irritating, or I might not have noticed it at all, but that spring it worked under my skin, depositing little grains of anxiety and shame.

Something funny happens to people who are lonely. The lonelier they get, the less adept they become at navigating social currents. Loneliness grows around them, like mould or fur, a prophylactic that inhibits contact, no matter how badly contact is desired. Loneliness is accretive, extending and perpetuating itself. Once it becomes impacted, it isn’t easy to dislodge. When I think of its advance, an anchoress’s cell comes to mind, as does the exoskeleton of a gastropod.

This sounds like paranoia, but in fact loneliness’s odd mode of increase has been mapped by medical researchers. It seems that the initial sensation triggers what psychologists call hypervigilance for social threat. In this state, which is entered into unknowingly, one tends to experience the world in negative terms, and to both expect and remember negative encounters — instances of rudeness, rejection or abrasion, like my urn brew episodes in the café. This creates, of course, a vicious circle, in which the lonely person grows increasingly more isolated, suspicious and withdrawn.

At the same time, the brain’s state of red alert brings about a series of physiological changes. Lonely people are restless sleepers. Loneliness drives up blood pressure, accelerates ageing, and acts as a precursor to cognitive decline. According to a 2010 study I came across in the Annals of Behavioral Medicine entitled ‘Loneliness Matters: A Theoretical and Empirical Review of Consequences and Mechanisms’, loneliness predicts increased morbidity and mortality, which is an elegant way of saying that loneliness can prove fatal.

by Olivia Laing, Aeon |  Read more:
Illustration: The Hotel Room (1931) by Edward Hopper. Photo by Francis G. Mayer/Corbis

The Post-Productive Economy

Take a look at these farm houses which I saw under construction in remote areas of Yunnan province China. They were not unusual; farmsteads this size were everywhere in rural China. Note the scale of these massive buildings. Each support post is cut from a single huge tree. The massive earth walls are three stories high and taper toward the top. They are homes for a single extended family built in the traditional Tibetan farmhouse style. They are larger than most middle-class American homes. The extensive wood carvings inside and outside will be painted in garish colors, like this family room shown in a finished home. This area of Yunnan is considered one of the poorer areas in China, and the standard of living of the inhabitants here would be classified as "poor."

Part of the reason is that these homes have no running water, no grid electricity, and no toilets. They don't even have outhouses.

But the farmers and their children who live in these homes all have cell phones, and they have accounts on the Chinese versions of Twitter and Facebook, and recharge via solar panels.

This is important because a recent thought-provoking article by a renowned economist argues that the US economy has not been growing during the internet boom and probably will not grow any more than it has already because computers and the internet are not as productive as the last two industrial revolutions.



You can read the article here: Is U.S. Economic Growth Over? (PDF) by Robert Gordon.

Gordon answers his own question with: Yes, US economic growth is over for a while. I think Robert Gordon is wrong about his conclusion, but I wanted to start with one of the bits of evidence he offers for his view. He is trying to argue that the consequences of the 2nd Industrial Revolution, which brought common people electricity and plumbing, were far more important than the computers and internet which the 3rd Industrial Revolution has brought us. (Gordon's 1st Industrial Revolution was steam and railroads.) As evidence of this claim he offers this hypothetical choice between option A and option B.
With option A you are allowed to keep 2002 electronic technology, including your Windows 98 laptop accessing Amazon, and you can keep running water and indoor toilets; but you can’t use anything invented since 2002. Option B is that you get everything invented in the past decade right up to Facebook, Twitter, and the iPad, but you have to give up running water and indoor toilets. You have to haul the water into your dwelling and carry out the waste. Even at 3am on a rainy night, your only toilet option is a wet and perhaps muddy walk to the outhouse. Which option do you choose?
Gordon then goes on to say:
I have posed this imaginary choice to several audiences in speeches, and the usual reaction is a guffaw, a chuckle, because the preference for Option A is so obvious.
But as I just recounted, Option A is not obvious at all.

The farmers in rural China have chosen cell phones and Twitter over toilets and running water. To them, this is not a hypothetical choice at all, but a real one, and they have made their decision in massive numbers. Tens of millions, maybe hundreds of millions, if not billions of people in the rest of Asia, Africa and South America have chosen Option B. You can go to almost any African village to see this. And it is not because they are too poor to afford a toilet. As you can see from these farmers' homes in Yunnan, they definitely could have at least built an outhouse if they found it valuable. (I know they don't have a toilet because I've stayed in many of their homes.) But instead they found the intangible benefits of connection to be greater than the physical comforts of running water.

Most of the poor of the world don't have such access to resources as these Yunnan farmers, but even in their poorer environment they still choose to use their meager cash to purchase the benefits of the 3rd revolution over the benefits of the 2nd revolution. Connection before plumbing. It is an almost universal choice.

This choice may seem difficult for someone who has little experience in the developing world, but in the places where most of the world lives we can plainly see that the fruits of the 3rd generation of automation are at least as, and perhaps more, valuable than some fruits of the 2nd wave of industrialization.

So if people value the benefits of computers and the internet so much, why don't we see this value reflected in the growth of the US economy? According to Gordon, growth has stalled in the internet age. This question was first asked by Robert Solow in 1987, and Gordon's answer is that there are six "headwinds," negative or contrary forces that deduct from the growth due to technology in the US (Gordon reiterates that he is speaking only of the US). The six headwinds slowing down growth are the aging of the US population, stagnant levels of education, rising inequality, outsourcing and globalization, environmental constraints, and household and government debt. I agree with Gordon about these headwinds, particularly the first one, which he also sees as the most important.

Where Gordon goes wrong is in misunderstanding and underestimating the power of technological growth before it meets these headwinds.

by Kevin Kelly, The Technium |  Read more:

Viviane Sassen. Parasomnia. Testament.

Wednesday, January 9, 2013

The Power of Positive Publishing

“How-to writers are to other writers as frogs are to mammals,” wrote the critic Dwight MacDonald in a 1954 survey of “Howtoism.” “Their books are not born, they are spawned.”

MacDonald began his story by citing a list of 3,500 instructional books. Today, there are at least 45,000 specimens in print of the optimize-everything cult we now call “self-help,” but few of them look anything like those classic step-by-step “howtos,” which MacDonald and his Establishment brethren handled only with bemused disdain. These days, self-help is unembarrassed, out of the bedside drawer and up on the coffee table, wholly transformed from a disreputable publishing category to a category killer, having remade most of nonfiction in its own inspirational image along the way.

Many of the books on Amazon’s current list of “Best Sellers in Self-Help” would have been unrecognizable to MacDonald: Times business reporter Charles Duhigg’s The Power of Habit, a tour of the latest behavioral science; Paulo Coelho’s novel The Alchemist, a fable about an Andalusian shepherd seeking treasure in Egypt; Susan Cain’s Quiet: The Power of Introverts in a World That Can’t Stop Talking, a journalistic paean to reticence; publisher Will Schwalbe’s memoir The End of Your Life Book Club, about reading with his dying mother; and A Child Called “It,” David Pelzer’s recollections of harrowing and vicious child abuse. And these are just the books publishers identify as self-help; other hits are simply labeled “business” or “psychology” or “religion.” “There isn’t even a category officially called ‘self-help,’ ” says William Shinker, publisher of Gotham Books. Shinker discovered Men Are From Mars, Women Are From Venus and now publishes books on “willpower” and “vulnerability”—“self-help masquerading as ‘big-idea’ books.”

Twenty years ago, when Chicken Soup for the Soul was published, everyone knew where to find it and what it was for. Whatever you thought of self-help—godsend, guilty pleasure, snake oil—the genre was safely contained on one eclectic bookstore shelf. Today, every section of the store (or web page) overflows with instructions, anecdotes, and homilies. History books teach us how to lead, neuroscience how to use our amygdalas, and memoirs how to eat, pray, and love. The former CEO of CNN writes the biography of an ornery tech visionary and it becomes a best seller on the strength of its leadership lessons. The Nobel-laureate psychologist Daniel Kahneman writes a subtle analysis of our decision-making process and soon finds his best seller digested and summarized in M.B.A. seminars across the country. Philosophical essayist Alain de Botton launches a series of self-help books called “The School of Life,” whose titles will all begin with “how to.” Even before books are written, their advances are often predicated on strong “takeaways” targeted to proven demographics. More like a virus than MacDonald’s frogs, self-help has infiltrated and commandeered other fields in its drive to reproduce. This plague of usefulness has burrowed its way into the types of books that were traditionally meant to enlighten, or entertain, or influence policy, but not exactly to build better selves. It’s generally led to better self-help, more grounded in the facts and narratives that drive the other genres, but also to a nonfiction landscape in which every goal is subjugated to the self-improvement imperative.

This new kind of self-help could never thrive in a vacuum. Or rather, it thrives in a particular vacuum—the one left behind by the disappearance of certain public values that once fulfilled our lives. Strains of self-help culture—entrepreneurship, pragmatism, fierce self-reliance, gauzy spirituality—have been embedded in the national DNA since Poor Richard’s Almanack. But in the past there was always a countervailing force, an American stew of shame and pride and citizenship that kept these impulses walled off, sublimating private anxiety to the demands of an optimistic meritocracy. That force has gradually been weakened by the erosion of all sorts of structures, from the corporate career track to the extended family and the social safety net. Instead of regulation, we have that new buzzword, self-regulation; instead of an ambivalence over “selling out,” we have the millennial drive to “monetize”; and instead of seeking to build better institutions, we mine them in order to build better selves. Universities now devote faculty to fields (positive psychology, motivation science) that function as research arms of the self-help industry, while journalists schooled in a sense of public mission turn their skills to fulfilling our emotional needs. But since self-help trails with it that old shameful stigma, the smartest writers and publishers shun the obvious terminology. And the savviest readers enjoy the masquerade, knowing full well what’s behind the costume: self-help with none of the baggage.

It was in the seventies that we began to shed that baggage, starting with the outer layer of self-help: common sense. Children of the postwar middle class were weaned on the mass paperbacks of Dr. Spock, and their parents learned how to win friends and think positively from Dale Carnegie and Norman Vincent Peale. But in the late sixties, that gray-flannel-suit howtoism gave way to the reemergence of an older, more mystical strain, part bootstrapping and part magical thinking. The New Age was really a revival of what had once been called New Thought: a religious movement spawned in the primordial soup of Ralph Waldo Emerson, Sigmund Freud, and William James that preached the flip side of the Protestant work ethic: faith above works and a belief in one’s unlimited capacities on Earth. The new New Thought was the perfect religion for the Me Decade, a reality-show version of spirituality in which the meaning of life is to unleash the inner superstar.

by Boris Kachka, New York Magazine |  Read more:
Photo: Paul Ruscha/© Ed Ruscha/Courtesy of Ed Ruscha and Gagosian Gallery (“Me”, 2001)

Semi-Charmed Life


Recently, many books have been written about the state of people in their twenties, and the question that tends to crop up in them, explicitly or not, is: Well, whose twenties? Few decades of experience command such dazzled interest (the teen-age years are usually written up in a spirit of damage control; the literature of fiftysomethings is a grim conspectus of temperate gatherings and winded adultery), and yet few comprise such varied kinds of life. Twentysomethings spend their days rearing children, living hand to mouth in Asia, and working sixty-hour weeks on Wall Street. They are moved by dreams of adult happiness, but the form of those dreams is as serendipitous as ripples in a dune of sand. Maybe your life gained its focus in college. Maybe a Wisconsin factory is where the route took shape. Or maybe your idea of adulthood got its polish on a feckless trip to Iceland. Where you start out—rich or poor, rustic or urbane—won’t determine where you end up, perhaps, but it will determine how you get there. The twenties are when we turn what Frank O’Hara called “sharp corners.”

It’s worth noting, then, that much of what we know about the twentysomething years comes down to selective, basically narrow frames of reference. Able-bodied middle-class Americans in their twenties—the real subject of these books—are impressionable; they’re fickle, too. Confusion triumphs. Is it smart to spend this crucial period building up a stable life: a promising job, a reliable partner, and an admirable assortment of kitchenware? Or is the time best spent sowing one’s wild oats? Can people even have wild oats while carrying smartphones? One morning, you open the newspaper and read that today’s young people are an assiduous, Web-savvy master race trying to steal your job and drive up the price of your housing stock. The next day, they’re reported to be living in your basement, eating all your shredded wheat, and failing to be even marginally employed at Wendy’s. For young people with the luxury of time and choice, these ambiguities give rise to a particular style of panic. (...)

The fullest guide through this territory, as it happens, avoids pointedly prescriptive claims. In “Twentysomething: Why Do Young Adults Seem Stuck?” (Hudson Street), Robin Marantz Henig and Samantha Henig provide a densely researched report on the state of middle-class young people today, drawn from several data sources and filtered through a comparative lens. Robin Marantz Henig is a baby boomer and a veteran magazine journalist focussing on science. Samantha Henig, her daughter, is in her late twenties, with a twenty-first-century version of the same career. (She has worked as a Web editor and writer at several publications, including this one, and is now the online editor of the New York Times Magazine.) Together, trading the writing in tag-team fashion, they assess the key departments of twentysomething life—school, careers, dating, family-making, and so forth—and try to discern how much has actually changed. They are interested not so much in the Mark Zuckerbergs of the demographic as in the parental-basement dwellers; they believe that people in their twenties have been getting a bad rap and want to know whether concern is justified.

Their answer, which should not come as a surprise, is: it depends. “Twentysomething” has its origins in a much discussed Times Magazine article that Robin Marantz Henig published, in 2010, called “What Is It About 20-Somethings?” That piece had a narrow and provocative frame—the psychologist Jeffrey Jensen Arnett’s idea that the twenties make up a distinct life stage, a kind of second adolescence—which the book broadens in subject and style. From Samantha Henig, we get chatty, slangy, personal writing, often trimmed, in the manner of the genre, with quirky specifics. (“We painted and decorated, and bought a sectional couch on Craigslist, each piece light enough that we could transport it entirely on our own in Katie’s Honda CRV. Katie called it the No-Boyfriend Couch. We were single grown-up ladies, doin’ it on our own.”) From her mother, we get intergenerational reality checks, which help us to weigh each topic according to two standards: “Now Is New” and “Same as It Ever Was.”

Among the alleged crimes of twentysomethings these days is hiding out in school (or in various far-flung places, like Iceland), thus deferring adult life, or being fickle in the job market once they get there. Yet the Henigs dismiss the idea that insane tuition costs and rival opportunities have made education a bad investment—if nothing else, median salaries rise with every new degree. And they wonder whether the Wanderjahr truly offers much escapism. “Doors do eventually close—sometimes because of things you did, sometimes because of things you didn’t do,” Robin Marantz Henig notes.

As for professional fickleness: there seems to be a bad kind and a good kind. The bad kind is when you change professions entirely, several times—financial consultant, graphic designer, dog walker, academic. Two-thirds of career wage growth (and, presumably, the responsibilities that go with it) happens in the first ten years, so repeatedly resetting the counter makes it likely you’ll end up uncomfortably behind your cohort. The good kind, Henig tells us, has to do with how you use that ten-year span. Fifty years ago, one might have planned to join a large, stable company at twenty-three and to rise through the ranks until retirement. Try that now, though, and there’s a good chance you’ll fall behind your more restless peers, who get a salary and a status bump with every sideways leap—an entrepreneurial style for which the build and bail cycles of Silicon Valley are an influential template. Flightiness is the new aggression.

by Nathan Heller, New Yorker |  Read more:
Photos: Flickr/Getty
