Monday, December 23, 2013
Surviving Anxiety
[ed. See also: Enough Anxiety to Fill Two Books]

If the usual pattern has held, as I stand up here talking to you now, I’ve got some Xanax in one pocket (in case I felt the need to pop another one before being introduced) and a minibar-size bottle or two of vodka in the other. I have been known to take a discreet last-second swig while walking onstage—because even as I’m still experiencing the anxiety that makes me want to drink more, my inhibition has been lowered, and my judgment impaired, by the liquor and benzodiazepines I’ve already consumed. If I’ve managed to hit the sweet spot—that perfect combination of timing and dosage whereby the cognitive and psychomotor sedating effect of the drugs and alcohol balances out the physiological hyperarousal of the anxiety—then I’m probably doing okay up here: nervous but not miserable; a little fuzzy but still able to speak clearly; the anxiogenic effects of the situation (me, speaking in front of people) counteracted by the anxiolytic effects of what I’ve consumed. But if I’ve overshot on the medication—too much Xanax or liquor—I may seem to be loopy or slurring or otherwise impaired. And if I didn’t self-medicate enough? Well, then, either I’m sweating profusely, with my voice quavering weakly and my attention folding in upon itself, or, more likely, I ran offstage before I got this far. I mean that literally: I’ve frozen, mortifyingly, onstage at public lectures and presentations before, and on several occasions I have been compelled to bolt from the stage.
Yes, I know. My method of dealing with my public-speaking anxiety is not healthy. It’s dangerous. But it works. Only when I am sedated to near-stupefaction by a combination of benzodiazepines and alcohol do I feel (relatively) confident in my ability to speak in public effectively and without torment. As long as I know that I’ll have access to my Xanax and liquor, I’ll suffer only moderate anxiety for days before a speech, rather than sleepless dread for months. (...)
My assortment of neuroses may be idiosyncratic, but my general condition is hardly unique. Anxiety and its associated disorders represent the most common form of officially classified mental illness in the United States today, more common even than depression and other mood disorders. According to the National Institute of Mental Health, some 40 million American adults, about one in six, are suffering from some kind of anxiety disorder at any given time; based on the most recent data from the Department of Health and Human Services, their treatment accounts for more than a quarter of all spending on mental-health care. Recent epidemiological data suggest that one in four of us can expect to be stricken by debilitating anxiety at some point in our lifetime. And it is debilitating: studies have compared the psychic and physical impairment tied to living with an anxiety disorder with the impairment tied to living with diabetes—both conditions are usually manageable, sometimes fatal, and always a pain to deal with. In 2012, Americans filled nearly 50 million prescriptions for just one antianxiety drug: alprazolam, the generic name for Xanax. (...)
Stigma still attaches to mental illness. Anxiety is seen as weakness. In presenting my anxiety to the world by writing publicly about it, I’ve been told, I will be, in effect, “coming out.” The implication is that this will be liberating. We’ll see about that. But my hope is that readers who share this affliction, to whatever extent, will find some value in this account—not a cure for their anxiety, but perhaps some sense of the redemptive value of an often wretched condition, as well as evidence that they can cope and even thrive in spite of it. Most of all, I hope they—and by “they” I mean “many of you”—will find some solace in learning that they are not alone.
by Scott Stossel, The Atlantic | Read more:
Image: Jamie Chung
The Weight of the Past
This column is about Edward Snowden. And the National Football League. And, I suspect, most of the rest of us.
Although it’s about Snowden, it’s not about the National Security Agency, surveillance or privacy. And although it’s about the N.F.L., it’s not about concussions. Instead, it’s about the unbalanced trajectory of human life.
Snowden’s actions, regardless of whether one supports them or not, have had a prodigious impact on the debate about privacy in the United States and will likely continue to do so. They have had roughly the impact that Snowden wanted them to have. That is, they have altered how many of us think about our relation to the government and to our own technology, and because of this, they infuse this period of his life with a luminescence that will always be with him. He will not forget it, nor will others.
There is an assumption I would like to make here, one that I can’t verify but I think is uncontroversial. It is very unlikely that Edward Snowden will ever do anything nearly as significant again. Nothing he does for the remainder of his life will have the resonance that his recent actions have had. The powers that be will ensure it. And undoubtedly he knows this. His life will go on, and it may not be as tortured as some people think. But in an important sense his life will have peaked at age 29 or 30.
This is not to say that Snowden’s days will not have their pleasures or their meaningfulness. Rather, those pleasures and that meaningfulness will likely always have lurking in the background the momentous period in the spring of 2013.
Players in the N.F.L. have an average career of six years. For many of them, those years — typically along with their college years — are the most exciting of their lives. They represent the cities they play for, enjoy the adulation of fans and receive higher salaries than they are ever likely to again. Many develop deep bonds with their teammates. They get to experience in reality what many male children dream about. And then it is over. If concussions don’t take their toll, they can expect to live another 45 or 50 years while not playing football.
For many people — not just activists like Snowden or professional athletes — life crests early. But it doesn’t end there. It goes on, burdened by a summit that can never be reached again, which one can gaze upon only by turning back. This is not to say that good and worthwhile things will not happen to them, and for a fortunate few there will be other, higher summits. For many, however, those earlier moments will be a quiet haunting, a reminder of what has been and cannot be again.
We might think of these kinds of lives, lives whose trajectories have early peaks and then fall off, as exceptional. In a way they are. But in another way they are not. There is something precisely in the extremity of these lives that brings out a phenomenon that appears more subtly for the rest of us. It appears in different times and different places and under different guises, but it is there for us nevertheless. (...)
Many will balk, reasonably so, at the characterization I have just given. After all, who is to say that a life has crested? How do we know, except perhaps in unique cases like Snowden or many N.F.L. players, that there aren’t higher peaks in our future? Who is able to confidently say they have already lived the best year of their life?
That consideration, I think, only adds to the difficulty. We don’t know. We cannot know whether the future will bring new experience that will light the fire again or will instead be a slowly dying ember. And the puzzle then becomes: how do we respond to this ignorance? Do we seek to introduce more peaks, watching in distress if they do not arise? And how would we introduce those peaks? After all, the arc of our lives is determined not simply by us but also by our circumstances. Alternatively, do we go on about our days hoping for the best? Or do we instead, as many people do, lead what Thoreau called lives of quiet desperation?
This is not to say that nostalgia is our inescapable fate. The lesson I am trying to draw from reflecting on the examples of Snowden and the N.F.L. is not that the thrill ends early. Rather, in their extremity these examples bring out something else. For most of us, as our lives unfold we simply do not, we cannot, know whether we have peaked in an area of our lives — or in our lives themselves — in ways that are most important to us. The past weighs upon us, not because it must cancel the future, but because it is of uncertain heft.
by Todd May, NY Times | Read more:
Sunday, December 22, 2013
Voice Hero: The Inventor of Karaoke Speaks
It’s one a.m. The bar is closing but the night isn’t over yet. While milling about on the sidewalk, a friend suggests, ‘Karaoke?’ And suddenly the night gets a lot brighter—and a little more embarrassing.
It’s safe to say that at no point in human history have there been as many people singing the songs of themselves, uncaring that their song was first sung by Gloria Gaynor, Frank Sinatra, or Bruce Springsteen. Karaoke has become inescapable, taking over bars from Manila to Manchester. Passions run high. In the Philippines, anger over off-key renditions of ‘My Way’ has left at least six dead. That statistic hides, however, the countless renditions of the Sinatra anthem that leave people smiling—or at least just wincing. The sing-along music machine terrifies the truly introverted, but it is a hero to countless closet extroverts, letting them reveal their private musical joy. Literally, karaoke is the combination of two Japanese words, ‘empty’ and ‘orchestra’—but we might also lovingly translate it as ‘awkward delight.’
Yet for all karaoke’s fame, the name of its Dr. Frankenstein is less known, perhaps because he never took a patent out on the device and only copyrighted its name in the U.S. in 2009. His name is Daisuke Inoue, a Japanese businessman and inventor born in Osaka in 1940. In 2004 he was honored with an Ig Nobel Prize, given for unusual inventions or research.
In 2005, he shared the story of his life leading up to the Ig Nobel in an interview with Robert Scott Field for Topic Magazine. No longer in print, Topic was one of The Appendix’s inspirations (along with StoryCorps) for its celebration of the everyday and undersung heroes of our world. As a history of another sort of invention, Mr. Inoue’s interview was particularly memorable and deserves to be more widely available. With the permission of both Topic and Mr. Inoue, we are pleased to re-present his delightfully inspiring account of his life and work.
We hope you sing along.
***
Last year I received a fax from Harvard University. I don’t really speak English, but lucky for me, my wife does. She figured out the letter was about the Ig Nobel Prizes, awards that Harvard presents for inventions that make people laugh—and then make them think. I was nominated for an Ig Nobel Peace Prize as the inventor of karaoke, which teaches people to bear the awful singing of ordinary citizens, and enjoy it anyway. That is “genuine peace,” they told me.
Before I tell you about my hilarious adventures at the prize ceremony, though, you need to know how I came to invent the first karaoke machine. I was born in May 1940, in a small town called Juso, in Osaka, Japan. My father owned a small pool hall. When I was three and a half years old, I fell from the second floor and hit my head. I was unconscious for two weeks. The doctors told my parents that if I lived, I would probably have brain damage. A Buddhist priest visited me, blessed me and replaced my birth name, Yusuke, with a new name: Daisuke, which means, in the written characters of kanji, “Big Help.” I needed it. Later I learned that the same Buddhist priest had commented that the name would also lead me to help others.
by Daisuke Inoue and Robert Scott Field, The Appendix | Read more:
Image: courtesy Daisuke Inoue

A Robot Walks Into a Bar...
[ed. See also: Google's Robot Army.]

The audience was understanding, even brightened by his stiffness. People laughed, if not at wit or silliness then at least at the cheap gag of a robot seeking affirmation in such a human way: telling jokes to strangers. Between a microphone and a brick wall, novelty may be no less effective than toilet humor. (...)
His latest turn, as a standup, is an attempt to make him more human, Jackson said, by “building people’s empathy with the machine. People anthropomorphize machines. We’re interested in pushing the boundaries of that, and seeing how much people buy a lump of metal as a personality.” They started the project last year, when a graduate student named Kleomenis Katevas arrived from Queen Mary University of London. Katevas’s doctoral thesis addresses the mathematical attributes of comedy; after RoboThespian’s standup début, he said, “It’s clear already that even relatively small changes in the timing of delivery make a big difference to audience response.”
Katevas developed an algorithm for comic timing: tell a joke, wait two seconds to measure audio feedback from the crowd, and pause for laughter, holding for no more than five seconds. If the audience responds positively, encourage them; if not, RoboThespian might say “Hmm” or “Take your time.” (...)
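The passage above amounts to a small algorithm, concrete enough to sketch. Below is a minimal illustration in Python, assuming hypothetical say() and crowd_volume() interfaces and a made-up laughter threshold; the article specifies only the two-second measurement window, the five-second cap on the pause, and the fallback lines.

```python
import time
import random

# A sketch of the comic-timing loop described above. The interfaces and the
# threshold are assumptions for illustration, not RoboThespian's actual code.
MEASURE_WINDOW = 2.0   # seconds of audio feedback sampled after the punch line
MAX_PAUSE = 5.0        # hold for laughter no longer than this
LAUGH_THRESHOLD = 0.5  # assumed: normalized room volume that counts as laughter

def tell_joke(joke, say, crowd_volume):
    """Deliver one joke, then react to the room."""
    say(joke)
    time.sleep(MEASURE_WINDOW)          # wait two seconds to measure feedback
    if crowd_volume() > LAUGH_THRESHOLD:
        # Positive response: pause for laughter, capped at five seconds.
        deadline = time.time() + MAX_PAUSE
        while crowd_volume() > LAUGH_THRESHOLD and time.time() < deadline:
            time.sleep(0.1)
        say("You're too kind.")         # encourage a responsive crowd (hypothetical line)
    else:
        say("Hmm. Take your time.")     # the fallback lines quoted above

if __name__ == "__main__":
    # Stub the interfaces so the sketch runs without a robot or a microphone.
    tell_joke(
        "A robot walks into a bar...",
        say=print,
        crowd_volume=lambda: random.random(),
    )
```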
Comedy is an art of precision. “The difference between an amateur and a professional is that it feels off the cuff, but it’s something I’ve worked very hard on,” the comedian Rob Delaney, the author of an eponymous new book, told me. “I have a narrative arc that I want to adhere to. Sure, I’ll make changes, but it’ll be eighty-per-cent similar.” He added, “I do a thing that a robot could do, which is: I listen to the room. That, I think, could be learned.”
by Betsy Morais, New Yorker | Read more:
Image: Kiyoshi Ota/Bloomberg/Getty
How Sexual Perversion Became the New Norm
In 2007, in front of a small group of invited guests and a camera crew, a wedding took place on the left bank of the Seine in Paris. The bride was a 37-year-old American former soldier called Erika and the groom was a French feat of engineering called the Eiffel Tower. The marriage was consummated after the ceremony when the bride lifted her trench coat and straddled one of the groom’s steel girders. Erika was the more sexually experienced of the pair, having previously been in a relationship with San Francisco’s Golden Gate Bridge. Her first love affair had been with Lance, her archery bow; she has never been sexually attracted to a human being.
Erika La Tour Eiffel, as she now calls herself, is one of the world’s 40 recognised “objectophiles”. In the American science writer Jesse Bering’s new book Perv – the British edition of which comes out in February next year – her condition is described as being akin to fetishism, in so far as an object has been invested with erotic appeal. But while the fetishist finds a shoe or a lock of hair arousing because they stand in for a human being, the objectophile is drawn to the object as an erotic target in itself. In addition, objectophiles, many of whom are autistic, believe that their love is reciprocated. “What does your beloved object find most attractive about you?” a researcher asked a number of objectophiles. “Well,” replied one woman, who is in a relationship with a flag called Libby, “Libby is always telling me she thinks I am funny. We make each other laugh so hard!”
I don’t wave at flags, despite their fun-loving side, but I’d be lying if I said I couldn’t see the appeal of the Eiffel Tower. Erika’s husband ticks all the boxes: tall, stable, glamorous, evidently not going anywhere in a hurry. As far as Erika is concerned, the tower is unlikely to let her down. Eija-Riitta Eklöf, on the other hand, a Swedish objectophile who married the Berlin Wall, now considers herself a widow, as does the poor woman who tied the knot with the Twin Towers.
If there were a party game where we could all hook up with an architectural structure, I would certainly tip my bonnet in the direction of the Eiffel Tower. Except – and this is where it gets trippy – Erika doesn’t see the Eiffel Tower as a man at all; she thinks of the 324m erection as female and considers herself in a lesbian relationship. Now that really is perverse.
There are, Bering says, 500 identified “paraphilias” and all of us, whether we like it or not, fit into the spectrum at some point. A paraphilia is defined as “a way of seeing the world through a singular sexual lens”, which cannot be repaired or, in the absence of a lobotomy, easily removed. It’s a genetic and not a moral failing. The cheery chap who does your dry-cleaning might be a plushophile who lusts after stuffed animal toys and spends his weekends looking for sex at “ConFurences” while dressed as a Disney creature. Or he could be a formicophile, who gets his pleasure from the feeling of ants and snails crawling over his erotic zones. But so long as he’s not harming anyone, and does your dry-cleaning on time, why does it matter how he reaches his peak?

by Frances Wilson, The New Statesman | Read more:
Image: illustration for Alfred de Musset’s erotic novel Gamiani (1833)
The Late, Great American WASP
The U.S. once had an unofficial but nonetheless genuine ruling class, drawn from what came to be known as the WASP establishment. Members of this establishment dominated politics, economics and education, but they do so no longer. The WASPocracy, as I think of it, lost its confidence and, with it, the power and interest to lead. We are now without a ruling class, unless one includes the entity that has come to be known as the meritocracy—presumably an aristocracy of sheer intelligence, men and women trained in the nation's most prestigious schools.
The acronym WASP derives, of course, from White Anglo-Saxon Protestant, but as acronyms go, this one is more deficient than most. Lots of people, including powerful figures and some presidents, have been white, Anglo-Saxon and Protestant but were far from being WASPs. Neither Jimmy Carter nor Bill Clinton qualified.
WASPs were a caste, closed off to all not born within it, with the possible exception of those who crashed the barriers by marrying in. WASP credentials came with lineage, and lineage—that is, proper birth—automatically brought connections to the right institutions. Yale, Princeton and Harvard were the great WASP universities, backed up by Choate, Groton, Andover, Exeter and other prep schools. WASPs tended to live in exclusive neighborhoods: on upper Park and Fifth Avenues in New York, on the Main Line in Philadelphia, the Back Bay in Boston, Lake Forest and Winnetka in Chicago. (...)
The State Department was once dominated by WASPs, and so, too, was the Supreme Court, with one seat traditionally left unoccupied for a Jewish jurist of proper mien. The House of Representatives was never preponderantly WASP, though a number of prominent senators— Henry Cabot Lodge and Leverett A. Saltonstall, both of Massachusetts, come to mind—have been WASPs. Looking down on the crudities of quotidian American politics, Henry Adams, a WASP to the highest power, called the dealings of Congress, the horse-trading and corruption and the rest of it, "the dance of democracy." In one of his short stories, Henry James has characters modeled on Adams and his wife Clover, planning a social evening, say, "Let us be vulgar and have some fun—let us invite the President."
So dominant was WASP culture that some wealthy families who didn't qualify by lineage attempted to imitate and live the WASP life. The Catholic Kennedys were the most notable example. The Kennedy compound at Hyannis Port—the sailing, the clothes, the touch football played on expansive green lawns—was pure WASP mimicry, all of it, except that true WASPs were too upstanding to go in for the unscrupulous business dealings of Joseph P. Kennedy Sr. or the feckless philanderings of him and some of his sons.
That the Kennedys did their best to imitate WASP life is perhaps not surprising, for in their exclusion, the Irish may have felt the sting of envy for WASPocracy more than any others. The main literary chroniclers of WASP culture— F. Scott Fitzgerald, say, or John O'Hara—were Irish. (Both Fitzgerald and O'Hara tried to live their lives on the WASP model.) But the pangs weren't limited to the Irish alone. To this day, the designer Ralph Lauren (né Lifshitz) turns out clothes inspired by his notion of the WASP high life, lived on the gracious margins of expensive leisure.
by Joseph Epstein, WSJ | Read more:

Image: Thomas Fuchs
Saturday, December 21, 2013
Endless Summer: The Next Big Thing in Surfing
Bruce McFarland’s San Diego office is just a skateboard ride from some of California’s prime surf spots. And right now, McFarland is gazing at the perfect wave—a glassy, barreling wall of water. But it’s breaking inside his building, and McFarland, an engineer and surfer, is controlling the wave with an iPad.
Sure, the wave is only three inches tall and is contained in a pint-sized pool built by McFarland’s company, American Wave Machines. But two surf parks deploying the company’s PerfectSwell technology are set to open in Russia and New Jersey, generating four- to six-foot (1.2 to 1.8 meter) waves at the push of a button. “We want to create waves so that anyone, anywhere can surf,” says McFarland.
Bringing surfing to the landlocked masses could be the biggest change to hit the sport since Hawaii’s Duke Kahanamoku taught Californians how to ride the waves a century ago. American Wave Machines is just one of half a dozen companies developing artificial wave technology, including a Los Angeles startup founded by 11-time surfing world champion Kelly Slater.
With a mix of hope and hype, the $7 billion surf industry is embracing wave parks as a way to grow a flat-lining business. Kids in Kansas and Qatar could become real surfers, not just boardshorts-wearing wannabes. Pro surfing executives, meanwhile, are pushing surf parks as predictable, television-friendly venues to stage competitions as they lobby to make surfing an Olympic sport. “Surf parks will create an entire new generation of aspirational surfers,” says Jess Ponting, director of the Center for Surf Research at San Diego State University. “These new surfers will not just buy for fashion but for equipment as well, and not just in the US but in Russia, China and Europe.”
Surfing has always been as much a way of life as a sport, the exclusive domain of a coastal wave tribe with its own rites and rituals. (Disclosure: I’m one of them.) Now with dozens of surf parks under development worldwide, surfing is about to get Disneyfied—buy a ticket, stand in line, and go for a ride.

by Todd Woody, Quartz | Read more:
Image: Wavegarden
NSA Surveillance: 'It's Going to Get Worse'
[ed. See also: The 9 Most Important Recommendations From the President's NSA Surveillance Panel]
Most people would object to the government searching their homes without a warrant. If you were told that while you are at work, the government is coming into your home every day and searching it without cause, you might be unsettled. You might even think it a violation of your rights specifically, and the Bill of Rights generally.
But what if the government, in its defence, said: "First of all, we're searching everyone's home, so you're not being singled out. Second, we don't connect your address to your name, so don't worry about it. All we're doing is searching every home in the United States, every day, without exception, and if we find something noteworthy, we'll let you know."
This is the essence of the NSA's domestic spying programme. They are collecting records of every call made in the US, and every call made from the US to recipients abroad. Any number of government agencies can access this data – about who you have called any day, any week, any year. And this information is being kept indefinitely.
This is as clear a violation of the fourth amendment as could be conjured. That amendment protects us against unreasonable search and seizure, and yet the NSA is subjecting all American citizens to both. By collecting records of who we call, the NSA is searching through our private affairs without individualised warrants, and without suspecting the vast majority of citizens of any crime. That is illegal search. And storage of this information constitutes illegal seizure.
A series of revelations about the activities of the NSA has alarmed civil liberties advocates and fans of the constitution, as well as those who value privacy. But until more recently, with the ever-more-astounding revelations made by Edward Snowden, most of the US citizenry has been sanguine. Poll numbers indicate that about 50% of Americans think the NSA's surveillance is just fine, presumably taking comfort in two things: first, in the agency's assertions that it's only the metadata that they're collecting – not the content of the calls; that is, they only know who we have called but not what we've said. Second, General Keith Alexander, the director of the NSA, has said that through this sort of data mining, they have prevented "over 50" terrorist attacks.
The problem here is that two things cannot be proven: we can't prove the assertion that 50 – or any – terrorist attacks have been prevented; and more pressingly, we can't prove that the NSA isn't doing more than collecting this metadata – or won't do more unless its powers are checked.

by Dave Eggers, The Guardian | Read more:
Image: Woody Allen, The Front
Friday, December 20, 2013
The Urbanization of the Eastern Gray Squirrel in the United States
The dominant image of the eastern gray squirrel in early nineteenth-century American culture was as a shy woodland creature that supplied meat for frontiersmen and Indians and game for the recreational hunter but could also become a pest in agricultural areas. Although some other members of the squirrel family, such as the American red squirrel (Tamiasciurus hudsonicus, also known as the pine squirrel), were present in eighteenth- and nineteenth-century American cities, and although small numbers of gray squirrels could be found in woodlands on urban fringes, the gray squirrel was effectively absent from densely settled areas. Sometimes called the “migratory” squirrel, the species was known for unpredictable mass movements by the thousands or even millions across the rural landscape. In The Winning of the West Theodore Roosevelt wrote of the eighteenth-century American backwoodsman's fight against “black and gray squirrels [that] swarmed, devastating the cornfields, and at times gathering in immense companies and migrating across mountain and river.” Crop depredation by gray squirrels—Roosevelt's “black squirrels” were merely a color variant of the species—led residents to set bounties and carry out large-scale squirrel hunts well into the nineteenth century.
The only gray squirrels found in urban areas during this period were pets, such as Mungo, memorialized by Benjamin Franklin in a 1772 epitaph, who escaped from captivity and was killed by a dog after surviving a transatlantic journey to England. In most cases such pets had been taken from nests while young, and many were probably abandoned, killed, or had managed to escape after they matured. Nonetheless, they provided opportunities for urban Americans to form opinions about the habits and character of squirrels that complemented and sometimes contradicted those opinions formed in the context of hunting and farming. Pet squirrels, for example, which were widely available from live-animal dealers, were not shy like the wild squirrels in areas where hunting was common, and they often became importunate in their search for food in pockets and pantries. Familiar within the home, these pets appeared exotic and out of place when they escaped into the urban environment. In 1856 the New-York Daily Times reported that the appearance of an “unusual visitor” in a tree in the park near city hall had attracted a crowd of hundreds; until they were scattered by a policeman, the onlookers cheered the efforts to recapture the pet squirrel.
The first introductions of free-living squirrels to urban centers took place in cities along the Eastern Seaboard between the 1840s and the 1860s. Philadelphia seems to have been the pioneering city, with Boston and New Haven, Connecticut, following soon after. In 1847 three squirrels were released in Philadelphia’s Franklin Square and were provided with food and boxes for nesting. Additional squirrels were introduced in the following years, and by 1853 gray squirrels were reported to be present in Independence, Walnut Street, and Logan Squares, where the city supplied nest boxes and food, and where visiting children often provided supplementary nuts and cakes. In 1857 a recent visitor to Philadelphia noted that the city’s squirrels were “so tame that they will come and take nuts out of one’s hand” and added so much to the liveliness of the parks that “it was a wonder that they are not in the public parks of all great cities.” Boston followed Philadelphia’s example by introducing a handful of gray squirrels to Boston Common in 1855, and New Haven had a population of squirrels on its town green by the early 1860s.
The people who introduced squirrels and other animals to public squares and commons in Philadelphia, Boston, and New Haven sought to beautify and enliven the urban landscape at a time when American cities were growing in geographic extent, population density, and cultural diversity. A typical expression of the motivation behind this effort can be found in an 1853 article in the Philadelphia press describing the introduction of squirrels, deer, and peacocks as steps toward making public squares into “truly delightful resorts, affording the means of increasing enjoyment to the increasing multitudes that throng this metropolis.” In Boston the release of squirrels on the Common was the project of Jerome V. C. Smith, a physician, natural historian, member of the short-lived Native American party, and Boston's mayor from 1854 to 1856. Smith's decision to have Vermont squirrels released on Boston Common was interpreted even by his critics as an attempt to “augment the attractions” of an increasingly leisure-oriented public space. For George Perkins Marsh, the author of Man and Nature, the tameness of the squirrels of the Common was a foretaste of the rewards to be expected when man moderated his destructive behavior toward nature. Like the planting of elms and other shade trees in cities and towns across the United States, the conversion of town commons and greens from pastures and spaces of labor into leisure grounds, and the creation of quasi-rural retreats such as Mount Auburn Cemetery (established outside Boston in the 1830s), the fostering of semitame squirrels in urban spaces aimed to create oases of restful nature in the industrializing city.
by Etienne Benson, Journal of American History | Read more:
Image: via: