Tuesday, October 6, 2015
The Price Is Right
What advertising does to TV.
Ever since the finale of “Mad Men,” I’ve been meditating on its audacious last image. Don Draper, sitting cross-legged and purring “Ommmm,” is achieving inner peace at an Esalen-like retreat. He’s as handsome as ever, in khakis and a crisp white shirt. A bell rings, and a grin widens across his face. Then, as if cutting to a sponsor, we move to the iconic Coke ad from 1971—a green hillside covered with a racially diverse chorus of young people, trilling, in harmony, “I’d like to teach the world to sing.” Don Draper, recently suicidal, has invented the world’s greatest ad. He’s back, baby.
The scene triggered a debate online. From one perspective, the image looked cynical: the viewer is tricked into thinking that Draper has achieved Nirvana, only to be slapped with the source of his smile. It’s the grin of an adman who has figured out how to use enlightenment to peddle sugar water, co-opting the counterculture as a brand. Yet, from another angle, the scene looked idealistic. Draper has indeed had a spiritual revelation, one that he’s expressing in a beautiful way—through advertising, his great gift. The night the episode aired, it struck me as a dark joke. But, at a discussion a couple of days later, at the New York Public Library, Matthew Weiner, the show’s creator, told the novelist A. M. Homes that viewers should see the hilltop ad as “very pure,” the product of “an enlightened state.” To regard it otherwise, he warned, was itself the symptom of a poisonous mind-set.
The question of how television fits together with advertising—and whether we should resist that relationship or embrace it—has haunted the medium since its origins. Advertising is TV’s original sin. When people called TV shows garbage, which they did all the time, until recently, commercialism was at the heart of the complaint. Even great TV could never be good art, because it was tainted by definition. It was there to sell.
That was the argument made by George W. S. Trow in this magazine, in a feverish manifesto called “In the Context of No Context.” That essay, which ran in 1980, became a sensation, as coruscating denunciations of modernity so often do. In television, “the trivial is raised up to power,” Trow wrote. “The powerful is lowered toward the trivial.” Driven by “demography”—that is, by the corrupting force of money and ratings—television treats those who consume it like sales targets, encouraging them to view themselves that way. In one of several sections titled “Celebrities,” he writes, “The most successful celebrities are products. Consider the real role in American life of Coca-Cola. Is any man as well-loved as this soft drink is?”
Much of Trow’s essay, which runs to more than a hundred pages, makes little sense. It is written in the style of oracular poetry, full of elegant repetitions, elegant repetitions that induce a hypnotic effect, elegant repetitions that suggest authority through their wonderful numbing rhythms, but which contain few facts. It’s élitism in the guise of hipness. It is more nostalgic than “Mad Men” ever was for the era when Wasp men in hats ran New York. It’s a screed against TV written at the medium’s low point—after the energy of the sitcoms of the seventies had faded but before the innovations of the nineties—and it paints TV fans as brainwashed dummies.
And yet there’s something in Trow’s manifesto that I find myself craving these days: that rude resistance to being sold to, the insistence that there is, after all, such a thing as selling out. Those of us who love TV have won the war. The best scripted shows are regarded as significant art—debated, revered, denounced. TV showrunners are embraced as heroes and role models, even philosophers. At the same time, television’s business model is in chaos, splintered and re-forming itself, struggling with its own history. Making television has always meant bending to the money—and TV history has taught us to be cool with any compromise. But sometimes we’re knowing about things that we don’t know much about at all.
by Emily Nussbaum, New Yorker | Read more:
Image: Michael Kirkham
Sex and Suffering: The Tragic Life of the Courtesan in Japan's Floating World
It’s difficult to get a window into the world of Edo-Period Japanese prostitutes without the gauzy romantic filter of the male gaze. The artworks in the new San Francisco Asian Art Museum exhibition, “Seduction: Japan’s Floating World,” were made by men for men, the patrons of the Yoshiwara pleasure district outside of Edo, which is now known as Tokyo. Every little detail of Yoshiwara—from the décor and fashion, to the delicacies served at teahouses, to the talents of courtesans, both sexual and intellectual—was engineered to sate a warlord’s every whim.
We’re left with the client-commissioned pretty-girl scroll paintings by masters like Hishikawa Moronobu and Katsukawa Shunshō, as well as woodblock prints and guidebooks by commercial artists meant to lure repeat visitors through the red-light district gates. These often lush and colorful artworks are rife with romantic longing, from the images of interchangeable beauties with inscrutable expressions, to the layers of richly patterned textiles they wore, and the highly symbolic haiku poetry written about them. The showstopper of the exhibition is Moronobu’s nearly 58-foot-long handscroll painting “A Visit to the Yoshiwara,” which takes viewers on a tour of the pleasure district from the street vendors and the food being prepared to the high-ranking courtesans on parade and a couple cuddling under the covers in a teahouse.
The Yoshiwara pleasure district was just part of what the Japanese referred to as “ukiyo” or “the floating world,” which also included the Kabuki theaters of Edo. Originally, the Buddhist term “ukiyo” referred to the sorrow and grief caused by desire, which was seen as an impediment to enlightenment.
“In the Buddhist context, ‘ukiyo’ was written with characters that meant ‘suffering world,’ which is the concept that desire leads to suffering and that’s the root of all the problems in the world,” explains Laura W. Allen, the curator of Japanese art at the Asian Art Museum who originated “Seduction.” “In the 17th century, that term was turned on its head and the term ‘ukiyo’ was written with new characters to mean ‘floating world.’ The concept of the floating world was ignoring the problems that might have existed in a very strictly regulated society and abandoning yourself, bobbing along on the current of pleasure. Then it became associated with two particular sites in Edo, one of which was the Kabuki theater district, the other the Yoshiwara pleasure quarter. The art of the floating worlds ‘ukiyo-e,’ which means ‘floating world pictures,’ usually depicts those two subjects.”
But, of course, by and large, this free-floating sensation belonged to men. Allen suggests that we, as viewers, resist indulging in the fantasies of Yoshiwara prostitutes presented in the artworks, and instead, consider the real lives of the women portrayed. Unfortunately, no true records of the Edo-Period prostitutes’ personal thoughts and experiences exist—and with good reason. Publicizing the dark side of the pleasure district would have been bad for business.
“Don’t take these paintings at face value,” Allen says. “It’s easy to say, ‘Oh, yes, it’s a picture of a beautiful woman, wearing beautiful clothing.’ But it’s not a photograph. It’s some artist’s rendition, made to promote this particular world, which was driven by economics. The profiteers urged the production of more paintings, which continued to feed the frenzy for the Yoshiwara.
“The artwork is very much glamorized and idealized,” she continues. “I haven’t been to 17th-century Japan so I don’t know what it was actually like, and the women didn’t write about it, so we don’t have their firsthand accounts. To imagine it from a woman’s perspective, it must have been a very harsh reality. There’s been some modern scholarship that promotes the idea that the women working as prostitutes had an economic power that they might not have otherwise had. But I think the day-to-day reality of living in the Yoshiwara could not have been pleasant.”
For one thing, most of the women involved didn’t have a choice about their occupation. Born into impoverished farming or fishing villages, they were sold to brothels by desperate parents around the ages of 7 or 8. This tradition was rationalized by Confucian ideals that allowed the children to work out of a duty to their parents, who usually brokered 10-year contracts with the brothel owners that their girls would have to work off. The little girls would do daily chores at the brothels and tend to their “sister” courtesans, cleaning and delivering messages. In those early years, they’d learn the tricks of the trade: how to speak using manipulative language, to write “love letters,” and to fake tears with a bit of alum hidden in their collars.
If a child attendant proved she was gifted by age 11 or 12, she would be chosen for elite courtesan training, where she would learn etiquette and refined arts from masters, including how to play flute or a three-stringed instrument called a samisen, to sing, to paint, to write haiku, to write in calligraphy, to dance, to perform a tea ceremony, and how to play games like go, backgammon, and kickball. She would be well-read and literate in order to engage in stimulating conversation. While these are pleasurable activities and such talents would be a source of pride, these women weren’t encouraged to pursue them for their own fulfillment, but to make themselves more attractive to men.
“They would be trained in the very polite, cultural accomplishments of the type that aristocratic women would have,” Allen says. “The idea was that they were comparable to the wife of a daimyo [feudal lord] or a high-ranking samurai [warrior] in terms of their level of accomplishment. The elite courtesans were supposed to know all of the lady-like skills, and their skill level was keyed to how much space they would have in a brothel and how lavish their clothing was. It was a very carefully calibrated hierarchy.”
by Lisa Hix, Collector's Weekly | Read more:
Image: Katsukawa Shunshō, Secret Games in the Spring Palace
Monday, October 5, 2015
The Reign of Recycling
If you live in the United States, you probably do some form of recycling. It’s likely that you separate paper from plastic and glass and metal. You rinse the bottles and cans, and you might put food scraps in a container destined for a composting facility. As you sort everything into the right bins, you probably assume that recycling is helping your community and protecting the environment. But is it? Are you in fact wasting your time?
In 1996, I wrote a long article for The New York Times Magazine arguing that the recycling process as we carried it out was wasteful. I presented plenty of evidence that recycling was costly and ineffectual, but its defenders said that it was unfair to rush to judgment. Noting that the modern recycling movement had really just begun a few years earlier, they predicted it would flourish as the industry matured and the public learned how to recycle properly.
So, what’s happened since then? While it’s true that the recycling message has reached more people than ever, when it comes to the bottom line, both economically and environmentally, not much has changed at all.
Despite decades of exhortations and mandates, it’s still typically more expensive for municipalities to recycle household waste than to send it to a landfill. Prices for recyclable materials have plummeted because of lower oil prices and reduced demand for them overseas. The slump has forced some recycling companies to shut plants and cancel plans for new technologies. The mood is so gloomy that one industry veteran tried to cheer up her colleagues this summer with an article in a trade journal titled “Recycling Is Not Dead!”
While politicians set higher and higher goals, the national rate of recycling has stagnated in recent years. Yes, it’s popular in affluent neighborhoods like Park Slope in Brooklyn and in cities like San Francisco, but residents of the Bronx and Houston don’t have the same fervor for sorting garbage in their spare time.
The future for recycling looks even worse. As cities move beyond recycling paper and metals, and into glass, food scraps and assorted plastics, the costs rise sharply while the environmental benefits decline and sometimes vanish. “If you believe recycling is good for the planet and that we need to do more of it, then there’s a crisis to confront,” says David P. Steiner, the chief executive officer of Waste Management, the largest recycler of household trash in the United States. “Trying to turn garbage into gold costs a lot more than expected. We need to ask ourselves: What is the goal here?”
Recycling has been relentlessly promoted as a goal in and of itself: an unalloyed public good and private virtue that is indoctrinated in students from kindergarten through college. As a result, otherwise well-informed and educated people have no idea of the relative costs and benefits.

by John Tierney, NY Times | Read more:
Image: Santtu Mustonen
A Country Is Not a Company
College students who plan to go into business often major in economics, but few believe that they will end up using what they hear in the lecture hall. Those students understand a fundamental truth: What they learn in economics courses won’t help them run a business.
The converse is also true: What people learn from running a business won’t help them formulate economic policy. A country is not a big corporation. The habits of mind that make a great business leader are not, in general, those that make a great economic analyst; an executive who has made $1 billion is rarely the right person to turn to for advice about a $6 trillion economy.
Why should that be pointed out? After all, neither businesspeople nor economists are usually very good poets, but so what? Yet many people (not least successful business executives themselves) believe that someone who has made a personal fortune will know how to make an entire nation more prosperous. In fact, his or her advice is often disastrously misguided.
I am not claiming that businesspeople are stupid or that economists are particularly smart. On the contrary, if the 100 top U.S. business executives got together with the 100 leading economists, the least impressive of the former group would probably outshine the most impressive of the latter. My point is that the style of thinking necessary for economic analysis is very different from that which leads to success in business. By understanding that difference, we can begin to understand what it means to do good economic analysis and perhaps even help some businesspeople become the great economists they surely have the intellect to be.
Let me begin with two examples of economic issues that I have found business executives generally do not understand: first, the relationship between exports and job creation, and, second, the relationship between foreign investment and trade balances. Both issues involve international trade, partly because it is the area I know best but also because it is an area in which businesspeople seem particularly inclined to make false analogies between countries and corporations.
Exports and Jobs
Business executives consistently misunderstand two things about the relationship between international trade and domestic job creation. First, since most U.S. businesspeople support free trade, they generally agree that expanded world trade is good for world employment. Specifically, they believe that free trade agreements such as the recently concluded General Agreement on Tariffs and Trade are good largely because they mean more jobs around the world. Second, businesspeople tend to believe that countries compete for those jobs. The more the United States exports, the thinking goes, the more people we will employ, and the more we import, the fewer jobs will be available. According to that view, the United States must not only have free trade but also be sufficiently competitive to get a large proportion of the jobs that free trade creates.
Do those propositions sound reasonable? Of course they do. This sort of rhetoric dominated the last U.S. presidential election and will likely be heard again in the upcoming race. However, economists in general do not believe that free trade creates more jobs worldwide (or that its benefits should be measured in terms of job creation) or that countries that are highly successful exporters will have lower unemployment than those that run trade deficits.
Why don’t economists subscribe to what sounds like common sense to businesspeople? The idea that free trade means more global jobs seems obvious: More trade means more exports and therefore more export-related jobs. But there is a problem with that argument. Because one country’s exports are another country’s imports, every dollar of export sales is, as a matter of sheer mathematical necessity, matched by a dollar of spending shifted from some country’s domestic goods to imports. Unless there is some reason to think that free trade will increase total world spending—which is not a necessary outcome—overall world demand will not change.
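[ed. The “sheer mathematical necessity” here is just the world-trade accounting identity; a minimal sketch, in notation of our own rather than the article’s:

$$\sum_i X_i \;\equiv\; \sum_i M_i \quad\Longrightarrow\quad \Delta\Big(\sum_i X_i\Big) - \Delta\Big(\sum_i M_i\Big) = 0,$$

where $X_i$ and $M_i$ are country $i$’s exports and imports. A dollar of new export sales in one country is, by this identity, a dollar of spending shifted toward imports in another, so total world demand is unchanged unless total world spending itself rises.]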
Moreover, beyond this indisputable point of arithmetic lies the question of what limits the overall number of jobs available. Is it simply a matter of insufficient demand for goods? Surely not, except in the very short run. It is, after all, easy to increase demand. The Federal Reserve can print as much money as it likes, and it has repeatedly demonstrated its ability to create an economic boom when it wants to. Why, then, doesn’t the Fed try to keep the economy booming all the time? Because it believes, with good reason, that if it were to do so—if it were to create too many jobs—the result would be unacceptable and accelerating inflation. In other words, the constraint on the number of jobs in the United States is not the U.S. economy’s ability to generate demand, from exports or any other source, but the level of unemployment that the Fed thinks the economy needs in order to keep inflation under control.
That is not an abstract point. During 1994, the Fed raised interest rates seven times and made no secret of the fact that it was doing so to cool off an economic boom that it feared would create too many jobs, overheat the economy, and lead to inflation. Consider what that implies for the effect of trade on employment. Suppose that the U.S. economy were to experience an export surge. Suppose, for example, that the United States agreed to drop its objections to slave labor if China agreed to buy $200 billion worth of U.S. goods. What would the Fed do? It would offset the expansionary effect of the exports by raising interest rates; thus any increase in export-related jobs would be more or less matched by a loss of jobs in interest-rate-sensitive sectors of the economy, such as construction. Conversely, the Fed would surely respond to an import surge by lowering interest rates, so the direct loss of jobs to import competition would be roughly matched by an increased number of jobs elsewhere.
Even if we ignore the point that free trade always increases world imports by exactly as much as it increases world exports, there is still no reason to expect free trade to increase U.S. employment, nor should we expect any other trade policy, such as export promotion, to increase the total number of jobs in our economy. When the U.S. secretary of commerce returns from a trip abroad with billions of dollars in new orders for U.S. companies, he may or may not be instrumental in creating thousands of export-related jobs. If he is, he is also instrumental in destroying a roughly equal number of jobs elsewhere in the economy. The ability of the U.S. economy to increase exports or roll back imports has essentially nothing to do with its success in creating jobs.
Needless to say, this argument does not sit well with business audiences. (When I argued on one business panel that the North American Free Trade Agreement would have no effect, positive or negative, on the total number of jobs in the United States, one of my fellow panelists—a NAFTA supporter—reacted with rage: “It’s comments like that that explain why people hate economists!”) The job gains from increased exports or losses from import competition are tangible: You can actually see the people making the goods that foreigners buy, the workers whose factories were closed in the face of import competition. The other effects that economists talk about seem abstract. And yet if you accept the idea that the Fed has both a jobs target and the means to achieve it, you must conclude that changes in exports and imports have little effect on overall employment.

by Paul Krugman, Harvard Business Review | Read more:
Image: Carlo Giambarresi
Prospects Are Dim for America’s Great Outdoors
[ed. The LWCF has been fundamental to conservation management for decades. What's gained by letting it expire?]
On September 30, Congress allowed a relatively little-known but very important conservation provision to expire: the Land and Water Conservation Fund. While the average outdoor lover might not be familiar with this program, chances are good that they’ve enjoyed one of the places it’s helped protect.
Over the years, it has contributed tens of millions of dollars to protect lands in all 50 states. Thanks to the LWCF, visitors can enjoy areas in Mount Rainier, Redwood, and Acadia National Parks; George Washington’s birthplace; Brown v. Board of Education historic sites; Cape Hatteras in North Carolina and other national seashores; and countless wildlife refuges, management areas, and access points. Closer to home, the fund has supported more than 40,000 state and local projects—ball fields, trails, parks, and community open spaces. Almost every county in the nation has a park project covered by the fund.
The fund uses royalty revenue from something dirty (offshore drilling in public waters) to fund something clean—namely new conservation efforts. The idea is to bring balance to the use of our public resources. The monies are often used to match grants for state and local parks and recreation projects. They’re also used for voluntary buy-outs of private inholdings in national parks and wildlife areas that would otherwise be developed. It’s an idea that has been tremendously successful and widely supported. After all, who wants a beautiful overlook of a subdivision?
The fund even has strong bipartisan support in Congress. Yet, thanks to a handful of ideologues, it has expired. Put simply, the loss of the fund jeopardizes the continued conservation of our outdoors. Congress’ past refusal to fully fund the program has created a backlog of billions of dollars in needs for land acquisition, state and local park maintenance, and public access improvements. With the total loss of the program, even more projects will go unrealized.
This means that the pristine natural environment of Hawaii Volcanoes National Park’s Pohue Bay could become a resort, that plans to secure permanent public access to more than 7,000 acres of forest in the Northern Rockies could fall by the wayside, and that parts of Gettysburg National Military Park, including an Underground Railroad site, could be developed. But it’s not just national parks and historic sites at risk. Also affected are local projects like plans to relocate parts of California’s Pacific Crest Trail, the Appalachian Trail in Tennessee, and the New England National Scenic Trail in Massachusetts.
by Dan Chu, Outside | Read more:
Image: Giant Ginkgo, flickr
Sunday, October 4, 2015
Romeo Santos - Bachata
[ed. Formerly of Aventura, one of the groups popularizing Bachata. What is Bachata? Check out Daniel and Desiree.]
Do Millennials Really Deserve Their Bratty Reputation?

Is this group caricature anywhere close to fair, or a more virulent strain of traditional intergenerational bigotry? “I see something nasty in the getoffmylawnism that we get today that I don’t really remember previously,” the blogger Duncan Black noted at Eschaton. “I see a lot of hatred of the youngs. It’s troubling and weird.” Washington Post Wonkblog contributor Christopher Ingraham also sees a whole lot of hatin’ goin’ on, but believes it’s for the wrong reasons. Forget “the derisive talk of selfies and selfishness and Snapchat,” Ingraham wrote. “If you do want to hate on millennials, at least do them the credit of hating them for the right reasons,” he advises, helpfully coming up with five biggies, based upon recent polling. (1) Millennials are the most unpatriotic generation, a disgrace to everything John Wayne growled for. (The upside to this, though neocons will not see it as such, is that Millies “are also far less supportive of the use of military force and may have internalized a permanent case of ‘Iraq Aversion,’ ” according to a Cato Institute white paper called “Millennials and U.S. Foreign Policy.”) (2) For all their multi-culti airs, Millies are as racist in their attitudes as older coots. (3) They are the most clueless, duh generation when it comes to the news. (4) They’re the leading vaccine skeptics, “seven times as likely as seniors to believe in the unequivocally discredited link between vaccines and autism.” (5) They are queasy about free speech and expression, though I don’t consider the survey Ingraham cites on publishing Muhammad cartoons a convincing example. A better citation might have been the wave of “trigger warnings,” safe places, “micro-aggressions,” and virtuoso claims of victim status that are turning so many universities into high-rent nurseries. It is such coddling and cocooning of educated Millennials within a comfort zone patrolled by helicopter parents and their proxies that provoked the novelist and screenwriter Bret Easton Ellis to diaper-pin them as the hypersensitive “Generation Wuss.” The little wussies are fickle, too.
As a veteran contributor to Vanity Fair, I am unfazed by such talk of divas. Pull up to the campfire some night and I will relate thrilling tales of Divas I Have Known, or at least heard about over lunch. I suspect Millennials are minor-leaguers by comparison, but I’m spared the friction of finding out firsthand. In fact I feel I bring a cool impartiality to the topic, since I am not a marketer, manager, teacher, or, sigh of relief, parent; I don’t have to put up with Millennials, nor they with me, on a routine, close-quarter basis; and, as a good liberal of the Larry David school, I strive to avoid facile generalizing, following a policy of judging people not based on their birth cohort but strictly as individual interfaces, each with something unique and/or annoying to offer. If I am favorably inclined toward the Millies, it’s because the shining young exemplars I have come into contact with tend to be recent college graduates interested in arts journalism and criticism—smart, avid, outgoing, energetic, smooth, and almost opaline they are, displaying far better manners than many of the crocks I run into at the ballet or theater. With their dynamically designed résumés, business cards, and follow-up notes, these cadets are far more entrepreneurial and savvy than I was at their age. They have to be—they’re facing far greater odds, far fewer entry points to advance beyond magazine and cable-news internship, if they’re lucky enough to land even that. The multi-tiered print world that racketed and teemed when I arrived has been deforested and arts journalism largely vaporized; the post-dot-com-boom avalanche of Internet riches hasn’t flowed to creatives, but to Atlas Shrugged Silicon Valley app innovators, content funnels, and platform owners—“into the pockets of Digital Monopolists and Digital Thieves,” as Jonathan Taplin, the director of the U.S.C. Annenberg Innovation Lab, put it in an open letter to Millennials titled “Sleeping Through a Revolution,” which appeared online at Medium. Over the last two decades artists, writers, musicians, filmmakers, critics, and performers have seen their livelihoods devastated by the piracy and streaming-content penury of the Internet, but it’s not as if the carnage were limited to the infotainment sphere. Taplin: “My feeling is that media is just the canary in the coal mine, and that in the next 20 years, millions of the jobs you [Millennials] are training for might be automated.”
by James Wolcott, Vanity Fair | Read more:
Image: Darrow
How the Superwealthy Plan to Make Sure Their Kids Stay Superwealthy
The first clue that this is no ordinary crowd of sulky teenagers comes when the instructor asks those who’ve invested in the market to raise their hands. Most hands go up. As a financial planner explains the benefits of investing, one boy interrupts. “What do you suggest investing in right now?” asks Liam Whitfield, 18, a senior at a private Seattle high school, with swooping bangs and a shaggy sweater. The speaker, from a local investment firm, suggests a standard mix of 60 percent stocks and 40 percent bonds. Whitfield looks disappointed. He already owns shares of Apple, Facebook, and Starbucks. “I was kind of looking for an actual stock tip,” he says.
It’s a Saturday morning in March, and Whitfield is sitting with two dozen teens in an antiseptic meeting room for a lesson on money management arranged by their well-to-do parents. The lecturers have broken the ice with a Saturday Night Live ad for a book of financial advice called Don’t Buy Stuff You Cannot Afford. (It’s one page long.) They show photos of cars that go from humble to glamorous and ask the kids to pick one—but only after calculating how long it would take to afford by saving $2,000 a year. An instructor praises a girl who chooses a Volkswagen Jetta over a $90,000 Range Rover. “You followed all the rules—it’s exciting, guys, right?” says John Gage, a 6-foot-9-inch recent Stanford graduate who roams the front of the room. Gage works for Cornerstone Advisors, a wealth management firm in Bellevue, Wash., that’s hosting the class for children of clients and prospects. During an exercise in monthly budgeting drawn from real-life salaries, someone notes how difficult it can be. “Especially if you’re a teacher,” one kid cracks.
This is the most gilded age since the Gilded Age, with 5 percent of American households controlling 63 percent of the country’s wealth. Decades of stagnant income growth for the middle class contrast with family dynasties such as the Waltons of Wal-Mart, wealthier than the poorest 40 percent of households combined. Some $59 trillion—the largest intergenerational transfer of wealth in U.S. history—will flow down from estates through 2061, according to Boston College’s Center on Wealth and Philanthropy.
None of that’s made the rich any less anxious, at least when it comes to keeping their money. The number of family offices for the ultrawealthy has doubled since 1998, branching into areas far beyond portfolio and tax planning. The advisory firms reach deep into their clients’ family lives, aiming to prevent squabbles among heirs and head off early signs of wastrelism. Some teach classes like this one near Seattle or organize family retreats. Others use board games and flashcards to drill sound money concepts into children as young as 5. One firm, Ascent Private Capital Management, employs an historian and two psychologists to help clients put their fortunes and family dynamics into perspective. “We didn’t just want to help clients manage wealth, we wanted to help clients manage the impact of wealth,” says Michael Cole, the firm’s president.
Like others in the business, he brings up an adage—shirtsleeves to shirtsleeves in three generations—and says, “It’s real.” Thought to be a variation on a saying from Lancashire, England, about families going from clogs to clogs, the idea resonates in many cultures. Japan’s version is rice bowl to rice bowl. In Italy, from stars to stall. Or, as the striving executive Jack Donaghy put it on 30 Rock: “The first generation works their fingers to the bone making things; the next generation goes to college and innovates new ideas; the third generation snowboards and takes improv classes.”
Adviser Roy Williams says he was recently approached by a representative for wealthy Asian families in the Pacific Northwest, each with more than $200 million. “They said, ‘The kids are consuming our wealth, buying Lamborghinis and Bentleys, and we don’t know how to change the pattern,’ ” he recalls.
by Peter Robison, Bloomberg | Read more:
Image: Getty
Saturday, October 3, 2015
A Permanent State of Sneaker-ness: Inside the Battle Between Nike and Adidas
Everybody wants the Yeezys. It's a frigid February night during New York Fashion Week, and Kanye West has just spent the afternoon at a runway event in SoHo unveiling his first fashion collection for Adidas—a collection anchored by the futuristic Yeezy Boost 750s, a.k.a. the Yeezys, a.k.a. suede high-top sneakers that look straight out of the Star Wars props department, complete with side zips and patented springy soles made from spaceship-grade foam. And now here comes Kanye, clambering onto a purpose-built stage at the intersection of Broadway and Fifth Avenue, in the shadow of the Flatiron Building, at an event that's been billed as a concert but feels closer to a product launch. Ten thousand people have shown up tonight, many claiming their tickets with an Adidas app and the rest waiting untold hours in temperatures that barely top 15 degrees, the cold compounded by gut-punches of snowy wind barreling off the East River.
“We ain't even gonna mention that other company no more, right?” Kanye asks the crowd. “We ain't wearing that other company no more, right?”
That other company, of course, is Nike—not only the most popular sneaker manufacturer but the single most valuable apparel brand in the world. Nike has 57,000 employees and a market cap north of $86 billion. And in these halcyon days of sneaker culture—the once humble sneaker having become the focal point of personal style—Nike has a heritage that consumers respect and that its competitors can't buy.
In fact, until relatively recently, if you happened to be a big-name rapper or marquee athlete, you didn't really think twice about signing with Nike. Where else would you go? Kanye himself parked his Air Yeezy line at Nike for four years.
Then, in 2013, in a deal worth a reported $10 million, Kanye abruptly announced he was leaving Nike and going to Adidas, the German rival that keeps its North American headquarters in Portland, Oregon, just up the road from Nike HQ in suburban Beaverton. Nike was shackling his creative freedom, he said. Not paying him enough. Not respecting him as a designer. “They weren't giving me the opportunity to grow,” he alleged. “They were working off an old business model.” (...)
The experts who estimate the size of the global sneaker business put the number around $55 billion, greater than the entire GDP of Ethiopia. No one buys more sneakers than Americans, and we're buying more than ever. According to the premier analytics firm NPD Group, American consumers spent $28 billion on sneakers last year alone, an almost 50 percent bump from just five years previous. Matt Powell, a self-described “sneakerologist” with NPD, believes the growth will continue for the foreseeable future. We are entering, he says, a “permanent state of sneaker-ness.”
Subscribers to this magazine (or anyone who spends any reasonable time out of doors) will understand how Powell can be so confident. A decade back, sneakers were, for the majority of adults, casual footwear, designated for specific occasions: the gym, an athletic event, mowing the lawn. Today we wear sneakers everywhere—to work, to dinner, to church, to weddings—and spend as much on them as we do on dress shoes.
Controlling 62 percent of the market (compared with Adidas's 5 percent), Nike is the primary beneficiary of our addiction, and the reasons for its supremacy are myriad. It is big. It is smart. Its endorsement roster is a portfolio of human blue-chip stocks. It caters to traditionalists with old-school Blazers, Jordans, and Dunks—some of the coolest and most coveted sneakers ever made—while testing the bounds of how futuristic a shoe can look and feel. (See, most recently, the Flyknit.) It employs more designers than any other shoe manufacturer (650 compared with Adidas's 200) and gives them unparalleled resources. Nike will take expensive risks, and when it whiffs, as it recently did with an ill-fated and quickly canceled snowboarding line, it acknowledges the error and moves on.
For years, Adidas appeared destined to fall further behind Nike in the States. Yes, Adidas had its deep roots in soccer culture (it still outfits clubs including Manchester United, A.C. Milan, and Real Madrid), and yes, it remained a top sneaker retailer in Western Europe. But although it kept offices in Portland, most of its design staff and senior brass were stuck in Adidas's global headquarters, in the German factory town of Herzogenaurach. Unsurprisingly, Adidas products often appeared out of touch with the average U.S. customer and tone-deaf about the American holy trinity of football, baseball, and basketball.
That began to change last year, with the installation of a new Adidas Group North America president, Mark King, who has mounted an unprecedented challenge to Nike—of which the Kanye shoe is only a small part. Under King, Adidas has poured money into advertising and gobbled up new endorsees. His biggest coup came this summer, when he outbid Nike to snatch away the NBA's bearded wonder, James Harden, in a deal reportedly worth $200 million over 13 years. In fact, Adidas is in the midst of the most aggressive marketing campaign in company history, showcasing music-industry talent like Pharrell, who has designed his own polka-dot Adidas sneakers and lime green track jackets. Last year, Adidas also sold out of its $800-a-pair sneaker collaboration with goth designer Rick Owens, the dark lord of haute menswear, who stitched his freaky sneaks with goat leather. The low-top Yeezy Boost 350, with a Primeknit mesh upper and rope laces, dropped in June, selling out within an hour.
Adidas has unveiled a key innovation in its Boost line, which utilizes that springy, patented foam in the sole. It has also positioned classic Adidas Originals sneakers like the Stan Smith and the Superstar—recently relaunched for its forty-fifth anniversary—less as athletic footwear and more as straight fashion. And it has moved Adidas creative director Paul Gaudio from Herzogenaurach to Portland, along with a small army of top designers who have been tasked with ripping the American market away from Nike.
Young tastemakers are taking note. In August, Adidas announced the signing of the dapper, baby-Afro-wearing NBA rookie Justise Winslow, a national champion this year with Duke, whose statement about Adidas after signing was telling: “What they've been doing with Kanye and Originals is changing the game.”
Adidas may never be able to approach the reported $3 billion Nike spends every year on marketing, but it's trying everything it can to out-cool Nike—to win the battle of taste first, ultimately building enough street cred to win the long-term financial contest.
by Matthew Shaer, GQ | Read more:
Image: Getty

Alcohol as Escape From Perfectionism
Racing in from a long day at the office, an evening of cooking and homework ahead, my first instinct is to go to the fridge or the cupboard and pop a cork. It soothes the transition from day to night. Chopping, dicing, sipping wine: It’s a common modern ritual.
For years it was me at the cutting board, a glass of chilled white at my side. And for years this habit was harmless—or it seemed that way. My house wine was Santa Margherita, a pale straw-blond Italian pinot grigio. There was always a bottle in my fridge, and I’d often pour a second glass before dinner, with seeming impunity.
In the years when this was my routine, I rarely thought to put the kettle on instead. These days, my go-to drink is Celestial Seasonings Bengal Spice tea: a rich mix of cardamom, cloves, chicory, cinnamon, pepper, and ginger. But back then, as I burst through the front door, laden with groceries, wound up from the day, my first instinct was to shed some stress as quickly as I shed my coat. Once, after an unusually difficult day, my fiancé, Jake, pointed out that the fridge was open before my coat was off. It pained me to hear this, but I knew it was true.
Within a few minutes, I would be standing at the cutting board, phone cradled on my shoulder while I sipped and chopped and chatted, often to my friend Judith or my sister, Cate. Nicholas, my son, would be upstairs, doing homework, and dinner would be in process. Sip, chop, sip, chat, exhale, relax. Breathe. With two parents who had their own serious troubles with alcohol, alarm bells should have been ringing.
But my habit seemed relatively harmless. Common, even. A glass or two seemed innocent enough.
And truth was, believe it or not, I got a lot done when I was drinking. In my alpha dog years—when I was holding down a senior job at a magazine, raising an artistic, athletic young man, giving speeches on the circuit—life was more than full. Alcohol smoothed the switch from one role to the other. It seemed to make life purr. I could juggle a lot. Until, of course, I couldn't.
That’s the thing about a drinking problem: It’s progressive. But for a long, long time, alcohol can step in as your able partner, providing welcome support—before you want to boot it out.
On a recent November evening, I took a stroll through the elegant streets of London’s Chelsea district around that witching hour—an hour when many had yet to pull the shades for the evening. Heading up from the Thames River, north on Tite Street, I passed more than one window with a woman standing at her kitchen counter, a half-drunk glass at her side while she worked on the evening meal. I passed a dad unloading children from a shiny BMW, children lugging heavy knapsacks, calling out to younger siblings waving in an upper window.
It was a cozy scene, and I found myself thinking wistfully of those rituals of younger years, when my son was under my roof—not far away in California, doing a master’s degree in fine art. Time was he would saunter into the kitchen, hungry and tall, and dance me around the room while dinner cooked—a boisterous little tango that left me flushed and laughing. More often he would serenade me with his guitar.
Those years were loud and rambunctious and incredibly busy, crammed with duties and chores. Once dinner was over, he’d do homework and I’d make lunches and then noodle with a little more work before bed. He was a rower and morning came early: I’d rise in the dark and ferry him down to the waterfront, standing with the other parents as the boys headed out on the water.
Those years were full of stress and laughter, in equal doses. Often, Nicholas and I would find ourselves up at night, talking in the kitchen: I would make popcorn and we would stand side by side, filling in the blanks for each other. We were a pack of two: our conversations were deep and rewarding, and we read each other easily. And when those precious years were over, when he went off to university, the house became very quiet. Too quiet: like a stage set after the actors exited. That’s when I wrote a column in the magazine, called “Mother Interrupted.” And that’s when I began to think that a third drink might make sense. And once it was three, I was in trouble.
Flying over to Britain, to do research for my writing, I splurged with my airline points and booked myself a first-class ticket. Flight attendant to me, after dinner: “Would you care for some port with your cheese, madam?” “No, thank you, I have to work.” She frowns. “Lots of people drink port while they work.” And indeed, she pours some for the neighboring woman, who is laboring over a spreadsheet with a glass of wine. All I can think is: “That used to be me.” Six years ago, that would have been me, and my exit from the plane would have been a little fuzzy.
In a recent poll done by Netmums in Britain, 81 percent of those who drank above the safe drinking guidelines said they did so “to wind down from a stressful day.” And 86 percent said they felt they should drink less. Jungian analyst Jan Bauer, author of Alcoholism and Women: The Background and the Psychology, believes women are looking for what she calls “oblivion drinking.” “Alcohol offers a time out from doing it all—‘Take me out of my perfectionism.’ Superwoman is a cliché now, but it is extremely dangerous. I've seen such a perversion of feminism, where everything becomes work: raising children, reading all the books, not listening to their instincts. The main question is: What self are they trying to turn off? These women have climbed so high that when they fall, they crash—and alcohol’s a perfect way to crash.”
I ask Leslie Buckley, the psychiatrist who heads the women’s addiction program at Toronto’s University Health Network, if she sees a pattern in the professional women who come to see her. She doesn't skip a beat: “Perfectionism.”
Such an unforgiving word, such an unforgiving way of being—echoed by yet another doctor, who speaks of patients who look like they stepped out of Vogue: perfect-looking women with perfect children at the right schools, living in perfect houses, aiming for a perfect performance at work, with eating disorders and serious substance abuse issues.
The tyrannical myth of perfection: it seizes the psyche and doesn't let go. My mother was in its grip, and she paid a serious price for it. This was in the 1960s, when men came home from work and expected dinner and a stiff drink—except my father was usually traveling. For years my mother held down the fort. She wrote perfect thank-you notes, she cooked perfect meals. As a new bride, she ironed bed sheets and pillowcases; as a new mother, she starched our smocked dresses. My sister and I wore white gloves when we traveled, velvet hairbands in our hair, and wrote perfect thank-you notes, too. And then my mother was the one with the stiff drink, and it all crashed—but not before I had it imprinted on me: Perfect was the way to be.
by Ann Dowsett Johnston, The Atlantic | Read more:
Image: via
Friday, October 2, 2015
Big Talk, Small Talk
My friend Anne sent me a lightning bolt. She also sent me three flexed biceps and a dripping faucet. Also a rainbow, a volcano, and a crying-with-joy face. No smiling pile of poop yet, and no frowning devil or smirking cat. She has nothing against those. The right occasions just haven’t arisen.
Anne loves emojis, the goofy digital pictograms that have become the latest bones of contention in our culture’s never-ending deathmatch of old codger versus eternal youth, and she chides me for my skepticism. The thing is, Anne isn’t a fourteen-year-old girl, a gadget-fetishist, or a trend-hound. She’s a witty, serious, and cultivated writer in her fifties—an award-winning novelist whose elegant and precise prose lingers and haunts, and epitomizes the splendor and necessity of nuanced language.
On emojis, she’s unequivocal. “They’re fun!” she cries. “Silly, sure, but that’s the point. It’s not a reason to reject them.”
She shows me her iPhone. The dripping faucet came from a text exchange with her college-age daughter about a running bathtub: affable shorthand for, “I know, I’m not an idiot.” The string of flexed biceps went to her sick personal trainer, a woman half Anne’s age: a perky “get well” card that strengthened an intergenerational bond. A gift-wrapped candy heart helped patch up an argument with her husband.
“Sometimes language can get in the way,” she explains. “An act is sometimes better than a word. Emojis are like tiny presents. There’s no need to attack them with your intellect.” I’m a theater critic and professor in my fifties who has impugned them as ridiculous and childish. She sends me screenshots so I can mull over her examples, a slow student receiving extra help from Teacher.
Some version of this argument has played out over the past few years between countless literate people. Ever since Apple and Google made emojis standard on iOS and Android smartphone keyboards in 2011 and 2013, they have proliferated not only in texts and emails but also in social media, the art world, literature, politics, advertising, music videos, and fashion.
Most people still use them the same way the Japanese teenagers who first drove their development did—as social lubrication in electronic messages. They’re a cute, shorthand way of clarifying emotional intention and smoothing the rough edges of quickie notes that are easily misunderstood without crucial facial cues. Women use them much more than men, researchers say, and their sincerity has powered a welcome pushback against the bullying brutality on social media.
At the same time, their downside is pretty obvious, at least to educated grownups. Emojis are an infantilization of language in the name of amusement. A New York magazine cover story last year compared them admiringly to ancient hieroglyphs without mentioning that civilization bounded forward after advancing from pictographs to symbolic language.
by Jonathan Kalb, Brooklyn Rail | Read more:
Image: uncredited

I’m a Pedophile, But Not a Monster
[ed. See also this follow-up post: My week inside the vile right-wing hate machine.]
I was born without my right hand. As a child, this deformity quickly set me apart from my peers. In public I wore a prosthesis, an intimidating object to other youngsters because of its resemblance to a pirate’s hook. Even so, I wore it every day; I felt inadequate without it. I was shy, uncoordinated and terrible at sports, all of which put me on the outs with other boys my age. But I was good at drawing and making up stories for my own entertainment, and I spent more and more time in my own head, being a space adventurer or monster wrangler or whatever character I could think up. These would ultimately prove to be useful skills, but for now they only served to further alienate me from other kids. On top of it all, I still struggled with bladder control—likely due to my heaping pile of insecurities, to which this problem only added more—well into my elementary school years.
But none of this would compare to the final insult the universe would deal me. I’ve been stuck with the most unfortunate of sexual orientations, a preference for a group of people who are legally, morally and psychologically unable to reciprocate my feelings and desires. It’s a curse of the first order, a completely unworkable sexuality, and it’s mine. Who am I? Nice to meet you. My name is Todd Nickerson, and I’m a pedophile. Does that surprise you? Yeah, not many of us are willing to share our story, for good reason. To confess a sexual attraction to children is to lay claim to the most reviled status on the planet, one that effectively ends any chance you have of living a normal life. Yet, I’m not the monster you think me to be. I’ve never touched a child sexually in my life and never will, nor do I use child pornography.
But isn’t that the definition of a pedophile, you may ask, someone who molests kids? Not really. Although “pedophile” and “child molester” have often been used interchangeably in the media, and there is some overlap, at base, a pedophile is someone who’s sexually attracted to children. That’s it. There’s no inherent reason he must act on those desires with real children. Some pedophiles certainly do, but many of us don’t. Because the powerful taboo keeps us in hiding, it’s impossible to know how many non-offending pedophiles are out there, but signs indicate there are a lot of us, and too often we suffer in silence. That’s why I decided to speak up. (...)
Ultimate Causes:
It’s easy to assume that pedophilia is always the result of some early sexualization or abuse, and certainly there seems to be a connection in some cases. However, evidence suggests there’s no magic bullet that pedophilia can be traced back to. For every pedophile who was sexually abused as a child there’s another who wasn’t. Likewise, most abuse victims never manifest pedophilic desires. Some researchers surmise that pedophilia can be traced back to genetics. Others believe the cause is congenital, and still others that it’s environmental. Personally, I think the ultimate cause is likely some combination of those, and that it varies from person to person.
Another issue is the role feelings of inadequacy play in forming our sexuality. Pedophilia may not arise from such fears (otherwise there’d be a lot more pedophiles), but those fears can certainly reinforce it. I think it’s safe to say that many pedophiles have deep-seated feelings of inferiority in one way or another, or at least we did when our sexuality was forming, and this becomes a downward spiral during puberty and beyond. Anything can be the trigger of this: disabilities, weight issues, or just general feelings of unattractiveness to peers. These feelings can be influential on one’s developing sexuality, such that even the severe cultural taboo is not enough to override it. Indeed, the taboo itself can negatively influence these vulnerable children.
I recall an event from when I was 11, sitting in the family jeep with my dad and his friend Andy when a news piece on the radio reported the sexual abuse of a girl, to which my dad said to his friend something like, “They should take people like that and place weights on top of their genitals until they smash.” Pretty horrific imagery for an 11-year-old to process, and I couldn’t help but sympathize with the abuser. After all, I could recall my own molestation perfectly, and I hardly felt it warranted that kind of response.
The bile has only multiplied since then, and I believe all that hatred just serves to reinforce pedophilia in youngsters predisposed to it. It’s a form of cognitive bias called the Backfire Effect or polarization. Everyone does this to some extent. When challenged on deeply held beliefs, no matter how uncertain or incorrect they may be, we tend to dig in our heels. With sexuality, that effect is likely magnified because there’s a physiological component, a drive every bit as powerful as belief. In essence, your brain knows what it likes and isn’t going to take no for an answer. For that reason, the nature or nurture question with respect to sexual preference is ultimately irrelevant—it becomes all but hardwired soon enough, until it’s all you know. And it’s self-reinforcing, no matter how much you wish to dig it out. Eventually it all tangles together with the rest of who you are.
Getting Schooled:
Things went along OK until I was two years away from graduating college. I began to smoke pot, a drug I’d experimented with after high school but didn’t much care for then. I didn’t like it the second time around either; it made me anxious more often than not. But I did it anyway, largely because many people I respected smoked it, and I wanted to be more like them. I was trying desperately to reshape my identity before I was thrown out into the real world. I’d even begun working out, lifting weights and exercising to get in better shape. On the outside I might’ve seemed pretty normal, but on the inside I was screaming in terror at the prospect of having to “grow up” and be “normal”—which to me meant getting a real job, finding a girlfriend, eventually getting married and raising a family. Oh, I wanted to be normal, believe me, yet I knew myself well enough to know I wouldn’t be able to carry that charade off for long, and every fiber of my being resisted the forced transformation.
After graduation I fell into the deepest pit of despair imaginable, one that lasted several years, and I’ve only just begun to pull myself out of it. You can’t experience that much blind terror and pain for that long without being seriously impacted by it. I still worked out every other day, so I was hurting constantly, since depression saps your brain of the feel-good chemicals that help to counteract pain; but I felt something, and that was better than the emotional numbness that had overtaken me. Thus, my project to remake myself into a regular person a complete failure, I retreated inward like a kicked dog, often spending days on end in my bedroom. At the nadir of my depression I was contemplating suicide daily; some days I could think of little else. I found some relief in opiates, which I had to obtain illegally because doctors won’t prescribe them for depression and anxiety. The occasional hydrocodone gave me a moment of respite from the agony I was going through. I’d tried antidepressants, but they were a joke.
In the midst of that dark era in my life, I discovered an unhealthy pedophile forum. Nothing illegal was happening there, but many of its most influential members were pro-contacters, meaning they believed that sex with children was theoretically OK and supported the elimination of age of consent laws. That forum still exists and I won’t name it here, but suffice it to say, I found myself taking up the same pro-contacter chants, if only to feel like I belonged somewhere. At the time it was all that was available in terms of an actual pedophile community, and I had nothing left to lose by joining the cause, misguided though it was, and even decided to out myself on that forum. Over the ensuing years, though, I was often at odds with the pro-contacters and flitted in and out of their clique; I wanted desperately to be friends with people who shared my sexual orientation, even if they held crazy beliefs, but I could never quite reconcile with their viewpoint.
Not long after I self-outed, a group of web vigilantes called Perverted Justice showed up. You’ve probably heard of them; they’re the people behind the now-defunct TV show “To Catch a Predator.” I was no predator, but that mattered not one iota to these guys; they lumped me together with the child rapists and internet creeps just the same. As I was already out of the closet as a pedo, I was an easy target, becoming one of the first people they profiled on their Wikisposure page, a site devoted to outing online pedos whether they’d broken any laws or not. It has since changed hands but still exists online, buried in a dark corner of the internet, and yep, I’m still on it. Not that I much care anymore. Perverted Justice had their day, but they eventually burned their own house down. Back when they were in full effect, however, they managed to make my already miserable existence that much more miserable. After their exposé came out, I was fired from my job at Lowe’s.
But things are getting better. Slowly. These days I struggle with bitterness and apathy; it’s a constant uphill battle, and there are days I just don’t feel like making that climb. I eke out a living (barely) on a freelance graphic design business, in a small town where too many people know who and what I am. Now I have a bachelor’s degree in journalism that I’ve never used and I’m living well below the poverty line, existing on food stamps and the couple hundred dollars I manage to scrape together every month, sometimes augmented with financial help from my parents if the bills get too high. I tried filing for disability over my arm and my emotional issues, but that was a no-go in my conservative Southern state. This is what a law-abiding pedophile has been reduced to in this society. At times I’ve wondered why I’ve even bothered to stay legal. Maybe prison would be better, even at the risk of getting shanked as a Short Eyes. At least then it would all be over with. But alas, I could never hurt a child. No matter what, some small part of me still holds out hope that things will go back to normal, or as close to normal as a celibate pedophile with little prospect of a future can get. Besides, like I said earlier, I just couldn’t allow myself to foist this abomination onto another human being. So I simply endured. Until …
by Todd Nickerson, Salon | Read more:
Image: Mors via Shutterstock