Sunday, July 1, 2012
The Medication Generation
When I was a college freshman in the late 1990s, antidepressants were everywhere. Prozac was appearing on magazine covers, and I'd just seen my first commercial for Paxil on TV. Halfway through the semester, I was laid out by a prolonged anxiety attack and found myself in the school's campus health center, tearfully telling a newly minted psychiatry resident about my feelings of panic and despair. Given the spirit of the times, it wasn't a complete surprise when she sent me away a few minutes later with a prescription and a generous supply of small cardboard boxes full of beautiful blue pills, free samples dropped off on campus by a company rep.
The school psychiatrist didn't suggest talk therapy. She simply asked that I return for a "med check" every few weeks to make sure that the pills were working.
Work they did. My dread burned off like valley fog in the sun, and my tears dried up as decisively as if someone had turned off a spigot. Soon I felt less anxious and more sociable than I could ever remember being.
When I started using antidepressants, I didn't know anyone else my age who was taking them. Within a few years, I felt hard-pressed at times to find someone who wasn't. Antidepressants and other psychiatric medications went mainstream in the 1990s and 2000s, and my generation became the first to use these drugs in significant numbers as adolescents and young adults.
Young people are medicated even more aggressively now, and intervention often starts younger. In children, as in adults, antidepressants and medications for attention-deficit hyperactivity disorder are often used continuously for years. These trends have produced a novel but fast-growing group—young people who have known themselves longer on medication than off it. (...)
Like me, most young adults who take antidepressants have felt relief from symptoms. But there are several aspects of the experience of growing up on antidepressants that should give us pause.
First, using antidepressants when you're young raises tough questions of personal identity. Adults who take these drugs often report that the pills turn them back into the people they were before depression obscured their true selves. But for adolescents whose identity is still under construction, the picture is more complex. Lacking a reliable conception of what it is to feel "like themselves," young people have no way to gauge the effects of the drugs on their developing personalities.
Emily, 28, grew up in the Midwest and began taking Prozac for the depression and anxiety that began to overwhelm her at age 14. (Like all the young people I interviewed, she agreed to talk on the condition of being identified by a pseudonym.) She has used it nearly continuously since. Emily is confident that Prozac helps her, even crediting it with allowing her to work. Even so, she describes a painful and persistent desire to know what she would be like without medication.
"I think Prozac has helped me a lot," she said. "But I wonder, if I'd never gotten antidepressants, who would I be? What would I be like?"
by Katherine Sharpe, WSJ | Read more:
Photo Illustration by Stephen Webster
Getting Away with It
In the spring of 2012 the Obama campaign decided to go after Mitt Romney’s record at Bain Capital, a private-equity firm that had specialized in taking over companies and extracting money for its investors—sometimes by promoting growth, but often at workers’ expense instead. Indeed, there were several cases in which Bain managed to profit even as it drove its takeover targets into bankruptcy.
So there was plenty of justification for an attack on Romney’s Bain record, and there were also clear political reasons to make that attack. For one thing, it had worked for Ted Kennedy, who used tales of workers injured by Bain to good effect against Romney in the 1994 Massachusetts Senate race. Also, to the extent that Romney had any real campaign theme to offer, it was his claim that as a successful businessman he could fix the economy where Obama had not. Pointing out both the many shadows in that business record and the extent to which what was good for Bain was definitely not good for America therefore made sense.
Yet as we were writing this review, two prominent Democratic politicians stepped up to undercut Obama’s message. First, Cory Booker, the mayor of Newark, described the attacks on private equity as “nauseating.” Then none other than Bill Clinton piped up to describe Romney’s record as “sterling,” adding, “I don’t think we ought to get into the position where we say ‘This is bad work. This is good work.’” (He later appeared with Obama and said that a Romney presidency would be “calamitous.”)
What was going on? The answer gets to the heart of the disappointments—political and economic—of the Obama years.
When Obama was elected in 2008, many progressives looked forward to a replay of the New Deal. The economic situation was, after all, strikingly similar. As in the 1930s, a runaway financial system had led first to excessive private debt, then financial crisis; the slump that followed (and that persists to this day), while not as severe as the Great Depression, bears an obvious family resemblance. So why shouldn’t policy and politics follow a similar script?
But while the economy now may bear a strong resemblance to that of the 1930s, the political scene does not, because neither the Democrats nor the Republicans are what once they were. Coming into the Obama presidency, much of the Democratic Party was close to, one might almost say captured by, the very financial interests that brought on the crisis; and as the Booker and Clinton incidents showed, some of the party still is. Meanwhile, Republicans have become extremists in a way they weren’t three generations ago; contrast the total opposition Obama has faced on economic issues with the fact that most Republicans in Congress voted for, not against, FDR’s crowning achievement, the Social Security Act of 1935.
by Paul Krugman and Robin Wells, NY Review of Books | Read more:
Photo: Pete Souza/White House
The Four-Stringed Wonder
Until a few years ago, most Americans thought of the ukulele—if they thought of it at all—as a fake instrument. It was just a toy, something your grandpa might've played in the living room during the family cocktail hour, or a prop for vaudeville routines. The uke had a few high-profile partisans over the years—including George Harrison, who reportedly brought them to friends' houses as gifts—but as far as the rest of the world was concerned, the ukulele stopped with "Raindrops Keep Fallin' on My Head" and Tiny Tim.
Ten or 15 years ago, things started to change. (...)
"It's like a little chord machine," Beloff said. "There are all kinds of musically sophisticated things you can wring out of those four strings." He pointed to old players like Lyle Ritz—of the elite session band the Wrecking Crew, who played for Phil Spector, the Beach Boys, the Byrds, etc.—who made extraordinary ukulele jazz records. Beloff and his wife are about to release a songbook with ukulele arrangements for works by Vivaldi, Bach, and other baroque composers. (...)
Eddie Vedder, who has composed on the ukulele for years, found his first serious one on a surfing trip in a remote Hawaiian town. He went to the liquor store for some cases of beer and was sitting on them, waiting for a friend who'd gone to the grocery store. "I turned around and there was this ukulele hanging on the wall, right above my shoulder," he said, "just like a parrot on a pirate." He bought it and started fooling around in the sun. He'd left the case open on the sidewalk and people started throwing money into it. A new relationship was born.
"Instruments can be friends, and there's a big transition playing an instrument when it becomes your friend," he said. "You remember the day when it isn't a guest/host relationship. Most instruments take a while before they let you play them. The ukulele is different—it's a really gregarious little friend. And for its size, it's really forthright and giving. It doesn't have a Napoleon complex."
A good ukulele sounds gregarious. Vedder told one story about a night playing casually with a fellow musician. The friend was in the corner, trying to write something dark and evil-sounding on the ukulele, like it was a challenge. But he couldn't do it. They stayed up all night trying—and partying. "In the fog of the morning, he was vomiting over the balcony," Vedder said. "The uke had won!"
by Brenden Kiley, The Stranger | Read more:
Photo: Collings Guitars
Redefining Success and Celebrating the Ordinary
I've been thinking a lot about the ordinary and extraordinary lately. All year, my sons’ school newsletters were filled with stories about students winning prizes for university-level scientific research, stellar musical accomplishments and statewide athletic laurels.
I wonder if there is any room for the ordinary any more, for the child or teenager — or adult — who enjoys a pickup basketball game but is far from Olympic material, who will be a good citizen but won’t set the world on fire.
We hold so dearly onto the idea that we should all aspire to being remarkable that when David McCullough Jr., an English teacher, told graduating seniors at Wellesley High School in Massachusetts recently, “You are not special. You are not exceptional,” the speech went viral.
“In our unspoken but not so subtle Darwinian competition with one another — which springs, I think, from our fear of our own insignificance, a subset of our dread of mortality — we have of late, we Americans, to our detriment, come to love accolades more than genuine achievement,” he told the students and parents. “We have come to see them as the point — and we’re happy to compromise standards, or ignore reality, if we suspect that’s the quickest way, or only way, to have something to put on the mantelpiece, something to pose with, crow about, something with which to leverage ourselves into a better spot on the social totem pole.”
I understand that Mr. McCullough, son of the Pulitzer Prize-winning historian, is telling these high school seniors that the world might not embrace them as unconditionally as their parents have. That just because they’ve been told they’re amazing doesn’t mean that they are. That they have to do something to prove themselves, not just accept compliments and trophies.
So where did this intense need to be exceptional come from?
Madeline Levine, a psychologist, said that for baby boomers, “the notion of being special is in our blood.” She added: “How could our children be anything but? And future generations kept building on that.”
by Alena Tugend, NY Times | Read more:
Photo: Charlie Riedel/Associated Press
The Perfect Listen: Fiona Apple As A Lesson In Irrational Music Rituals
On June 19, a week and a half ago, Fiona Apple released a brand new album, her first in seven years. The entire album had been available for streaming by NPR Music for a week and a half by then. Three days later, my copy arrived in the mail. It hasn't left my desk since.
I still haven't listened to it.
Mind you, I've been looking forward to The Idler Wheel... more than maybe any other album this year. Her stunning Boston show in March floored me; it was unquestionably the best concert I've seen in five years, and it took me half a day to recover to a point where I could even listen to other music. Sure, the album's reviews have been breathless and hagiographic, but the prospect of it falling short of expectations – which is always a possibility, though similar reports about her recent performances turned out to be right on target – isn't the issue.
What has kept me from just putting the damn thing in my CD player and pressing "play" is a bit of what I fully accept is compulsive irrationality: I want to hear it so much that I want to make sure that conditions are exactly right the very first time I listen to it, and conditions have not been exactly right. And that is, in a word, stupid.
And I know stupid, because I have my own first-listen music-listening rituals. The first time I play an album, I have to listen to it straight through, with no interruptions, no pausing, no "I'll get to the rest of it later"; if it's 60 minutes long, then I'd better be sure I can carve out an hour for it. If there are lyrics in the liner notes, I'll read along as it plays. What I want, really, is to be able to give it my full, undivided attention.
But for all the romanticizing of the first time we hear an album or a song, that's almost never the moment of its crucial impact. That's not really how music works, not if it can actually hold up beyond that first listen. Unlike books, movies or plays (and television, to a lesser extent), recorded music is consumed repetitively. It's usually anywhere between the second and fifth listen that fragments that maybe weren't evident at first suddenly come at you, or your brain makes a connection that could only have been made indirectly. That's when a song starts to mean something to you.
by Marc Hirsh, NPR | Read more:
Photo: Fiona Apple, by Jack Plunkett/AP
Saturday, June 30, 2012
—Iggy Pop explains his current relationship with his penis.
h/t The Awl
California Takes Foie Gras Off the Menu
At Mélisse in Santa Monica, diners were preparing Saturday for "one last huzzah" in honour of a controversial delicacy that will soon become contraband across California.
Awaiting them at the upmarket French bistro is a feast of foie gras, a seven-course special celebrating the foodstuff that makes animal rights campaigners gag, but leaves aficionados wanting more.
Those who make it through to the final dish – a strawberry shortcake stuffed with foie gras mousse and accompanied by foie gras ice cream – will be battling time, as well as their belts.
For at midnight California will enact a law it promised eight years ago, making the fattened livers of force-fed ducks and geese illegal.
Foie gras has long been a target for those calling for the ethical treatment of livestock. Translated into English as "fatty liver", foie gras is produced by a process known as gavage, in which the birds are force-fed corn through a tube.
It is designed to enlarge the birds' livers before being slaughtered, after which the organs are harvested and served up as a rich – and to fans a mouth-watering – delicacy.
The process dates back centuries. But in late 2004, then California governor Arnold Schwarzenegger signed a bill banning the sale of foie gras.
Diners and chefs were given a suitably long grace period to find an alternative method to gavage or wean themselves off the stuff it produces.
But despite a concerted effort by some to get the proposed ban overturned, seven and a half years down the line the law is now to be enacted.
From July 1, any restaurant serving foie gras will be fined up to $1,000 under the statute. As the deadline has neared, restaurants have seen an increase in patrons ordering foie gras.
by Matt Williams, The Guardian | Read more:
Photograph: Dimitar Dilkoff/AFP/Getty Images
Our Robot Future
It was chaos over Zuccotti Park on the early morning of Nov. 15. New York City policemen surrounded the park in Lower Manhattan where hundreds of activists had been living as part of the nationwide Occupy movement. The 1:00 AM raid followed a court order allowing the city to prohibit camping gear in the privately-owned park.
Many protestors resisted and nearly 200 were arrested. Journalists hurrying towards the park reported being illegally barred by police. The crews of two news-choppers–one each from CBS and NBC–claimed they were ordered out of the airspace over Zuccotti Park by the NYPD. Later, NBC claimed its crew misunderstood directions from the control tower. "NYPD cannot, and did not, close air space. Only FAA can do that," a police spokesperson told Columbia Journalism Review. The FAA said it issued no flight ban.
Regardless, the confusion resulted in a de facto media blackout for big media. Just one reporter had the unconstrained ability to get a bird's-eye view on police action during the height of the Occupy protests. Tim Pool, a 26-year-old independent video journalist, in early December began sending a customized two-foot-wide robot–made by French company Parrot–whirring over the police's and protestors' heads. The camera-equipped 'bot streamed live video to Pool's smartphone, which relayed the footage to a public Internet stream.
If the police ever noticed the diminutive, all-seeing automaton–and there's no evidence they did–they never did anything to stop it. Unlike CBS and NBC, the boyish Pool, forever recognizable in his signature black knit cap, understood the law. He knew his pioneering drone flights were legal–just barely.
Pool’s robot coup was a preview of the future, as rapid advances in cheap drone technology dovetail with a loosening legal regime that, combined, could allow pretty much anybody to deploy their own flying robot–and all within the next three years. The spread of do-it-yourself robotics could radically change the news, the police, business and politics. And it could spark a sort of drone arms race as competing robot users seek to balance out their rivals.
Imagine police drones patrolling at treetop level down city streets, their cameras scanning crowds for weapons or suspicious activity. “Newsbots” might follow in their wake, streaming live video of the goings-on. Drones belonging to protest groups hover over both, watching the watchers. In nearby zip codes, drones belonging to real estate agents scope out hot properties. Robots deliver pizzas by following the signal from customers’ cell phones. Meanwhile, anti-drone “freedom fighters,” alarmed by the spread of cheap, easy overhead surveillance, take potshots at the robots with rifles and shotguns.
These aren’t just fantasies. All of these things are happening today, although infrequently and sometimes illegally. The only thing holding back the robots is government regulations that have failed to keep up with technology. The regs are due for an overhaul in 2015. That’s the year drones could make their major debut. “Everyone’s ready to do this,” Pool tells ANIMAL. “It’s only going to get crazier.”
by David Axe, AnimalNewYork | Read more:
Amber Waves of Green
The gap between the richest and the poorest among us is now wider than it has been since we all nose-dived into the Great Depression. So GQ sent Jon Ronson on a journey into the secret financial lives of six different people on the ladder, from a guy washing dishes for 200 bucks a week in Miami to a self-storage gazillionaire. What he found are some surprising truths about class, money, and making it in America.
As I drive along the Pacific Coast Highway into Malibu, I catch glimpses of incredible cliff-top mansions discreetly obscured from the road, which is littered with abandoned gas stations and run-down mini-marts. The office building I pull up to is quite drab and utilitarian. There are no ornaments on the conference-room shelves—just a bottle of hand sanitizer. An elderly, broad-shouldered man greets me. He's wearing jogging pants. They don't look expensive. His name is B. Wayne Hughes.
You almost definitely won't have heard of him. He hardly ever gives interviews. He only agreed to this one because—as his people explained to me—income disparity is a hugely important topic for him. They didn't explain how it was important, so I assumed he thought it was bad.
I approached Wayne, as he's known, for wholly mathematical reasons. I'd worked out that there are six degrees of economic separation between a guy making ten bucks an hour and a Forbes billionaire, if you multiply each person's income by five. So I decided to journey across America to meet one representative of each multiple. By connecting these income brackets to actual people, I hoped to understand how money shapes their lives—and the life of the country—at a moment when the gap between rich and poor is such a combustible issue. Everyone in this story, then, makes roughly five times more than the last person makes. There's a dishwasher in Miami with an unbelievably stressful life, some nice middle-class Iowans with quite difficult lives, me with a perfectly fine if frequently anxiety-inducing life, a millionaire with an annoyingly happy life, a multimillionaire with a stunningly amazing life, and then, finally, at the summit, this great American eagle, Wayne, who tells me he's "pissed off" right now.
"I live my life paying my taxes and taking care of my responsibilities, and I'm a little surprised to find out that I'm an enemy of the state at this time in my life," he says. (...)
In 2006, Wayne was America's sixty-first-richest man, according to Forbes, with $4.1 billion. Today he's the 242nd richest (and the 683rd richest in the world), with $1.9 billion. He's among the least famous people on the list. In fact, he once asked the magazine to remove his name. "I said, 'It's an imposition. Forbes should not be doing that. It's the wrong thing to do. It puts my children and my grandchildren at risk.' "
"And what did they say?" I ask.
"They said when Trump called up, he said the number next to his name was too small."
When Wayne is in Malibu, he stays in his daughter's spare room. His home is a three-bedroom farmhouse on a working stud farm in Lexington, Kentucky.
"I have no fancy living at all," he says. "Well, I have a house in Sun Valley. Five acres in the woods. I guess that's fancy."
I like Wayne very much. He's avuncular and salt of the earth. I admire how far he has risen from the Grapes of Wrath circumstances into which he was born; he's the very embodiment of the American Dream. I'm surprised, though, and a little taken aback, by his anger. I'll return to Wayne—and the curiously aggrieved way he views his place in the world—a bit later.
But first let's plummet all the way down to the very, very bottom, as if we're falling down a well, to a concrete slab of a house in a downtrodden Miami neighborhood called Little Haiti.
by Jon Ronson, GQ | Read more:
Friday, June 29, 2012
Why We Cheat
Behavioral economist Dan Ariely, who teaches at Duke University, is known as one of the most original designers of experiments in social science. Not surprisingly, the best-selling author's creativity is evident throughout his latest book, The (Honest) Truth About Dishonesty. A lively tour through the impulses that cause many of us to cheat, the book offers especially keen insights into the ways in which we cut corners while still thinking of ourselves as moral people. Here, in Ariely's own words, are seven lessons you didn't learn in school about dishonesty. (Interview edited and condensed by Gary Belsky.)
1. Most of us are 98-percenters.
“A student told me a story about a locksmith he met when he locked himself out of the house. This student was amazed at how easily the locksmith picked his lock, but the locksmith explained that locks were really there to keep honest people from stealing. His view was that 1% of people would never steal, another 1% would always try to steal, and the rest of us are honest as long as we’re not easily tempted. Locks remove temptation for most people. And that’s good, because in our research over many years, we’ve found that everybody has the capacity to be dishonest and almost everybody is at some point or another.”
2. We’ll happily cheat … until it hurts.
“The Simple Model of Rational Crime suggests that the greater the reward, the greater the likelihood that people will cheat. But we’ve found that for most of us, the biggest driver of dishonesty is the ability to rationalize our actions so that we don’t lose the sense of ourselves as good people. In one of our matrix experiments [a puzzle-solving exercise Ariely uses in his work to measure dishonesty], the level of cheating didn’t change as the reward for cheating rose. In fact, the highest payout resulted in a little less cheating, probably because the amount of money got to be big enough that people couldn’t rationalize their cheating as harmless. Most people are able to cheat a little because they can maintain the sense of themselves as basically honest people. They won’t commit major fraud on their tax returns or insurance claims or expense reports, but they’ll cut corners or exaggerate here or there because they don’t feel that bad about it.”
3. It’s no wonder people steal from work.
“In one matrix experiment, we added a condition where some participants were paid in tokens, which they knew they could quickly exchange for real money. But just having that one step of separation resulted in a significant increase in cheating. Another time, we surveyed golfers and asked which act of moving a ball illegally would make other golfers most uncomfortable: using a club, their foot or their hand. More than twice as many said it would be less of a problem — for other golfers, of course — to use their club than to pick the ball up. Our willingness to cheat increases as we gain psychological distance from the action. So as we gain distance from money, it becomes easier to see ourselves as doing something other than stealing. That’s why many of us have no problem taking pencils or a stapler home from work when we’d never take the equivalent amount of money from petty cash. And that’s why I’m a little concerned about the direction we’re taking toward becoming a cashless society. Virtual payments are a great convenience, but our research suggests we should worry that the farther people get from using actual money, the easier it becomes to steal.”
by Gary Belsky, Time | Read more:
Photo: Getty Images
54 Smart Thinkers Everyone Should Follow On Twitter
Today everyone is getting their news and information from Twitter. At Business Insider, it's how we get a lot of our story ideas.
But figuring out exactly who to follow is a tough task. So we've put together a guide of some of the most influential thought leaders in the world who tweet.
Our criterion was simple: that these people are respected voices in their fields — whether it be neuroscience, economics, business or journalism — and that they have developed a following for their insightful commentary on Twitter.
by Aimee Groth, Danielle Schlanger and Kim Bhasin, Business Insider | Read more:
Cities Grow More than Suburbs, First Time in 100 Years
For the first time in a century, most of America's largest cities are growing at a faster rate than their surrounding suburbs as young adults seeking a foothold in the weak job market shun home-buying and stay put in bustling urban centers.
New 2011 census estimates released Thursday highlight the dramatic switch.
Driving the resurgence are young adults, who are delaying careers, marriage and having children amid persistently high unemployment. Burdened with college debt or toiling in temporary, lower-wage positions, they are spurning homeownership in the suburbs for shorter-term, no-strings-attached apartment living, public transit and proximity to potential jobs in larger cities.
While economists tend to believe the city boom is temporary, that is not stopping many city planning agencies and apartment developers from seeking to boost their appeal to the sizable demographic of 18-to-29-year olds. They make up roughly 1 in 6 Americans, and some sociologists are calling them "generation rent." The planners and developers are betting on young Americans' continued interest in urban living, sensing that some longer-term changes such as decreased reliance on cars may be afoot.
The last time growth in big cities surpassed that in outlying areas occurred prior to 1920, before the rise of mass-produced automobiles spurred expansion beyond city cores. (...)
"The recession hit suburban markets hard. What we're seeing now is young adults moving out from their parents' homes and starting to find jobs," Shepard said. "There's a bigger focus on building residences near transportation hubs, such as a train or subway station, because fewer people want to travel by car for an hour and a half for work anymore."
Katherine Newman, a sociologist and dean of arts and sciences at Johns Hopkins University who chronicled the financial struggles of young adults in a recent book, said they are emerging as a new generation of renters due to stricter mortgage requirements and mounting college debt. From 2009 to 2011, just 9 percent of 29- to 34-year-olds were approved for a first-time mortgage.
"Young adults simply can't amass the down payments needed and don't have the earnings," she said. "They will be renting for a very long time."
by Hope Yen and Kristen Wyatt, MSNBC | Read more:
Photo: Kristen Wyatt