Monday, July 2, 2012

A Huge Break in the LIBOR Banking Investigation

[ed. Libor is the inter-bank lending rate that affects a host of financial instruments (e.g. futures, variable-rate mortgages, currencies, etc.). It was announced today that the chairman of Barclays has resigned.]

This is a huge story:
On Wednesday, Barclays won the race to reach a deal with U.S. and British regulators, beating UBS, which was reportedly the first bank to begin cooperating with international antitrust authorities. Barclays agreed to pay at least $450 million to resolve government investigations of manipulation of Libor and the Euro interbank offered rate (or Euribor): $200 million to the U.S. Commodity Futures Trading Commission, $160 million to the criminal division of the U.S. Department of Justice and $92.8 million to Britain's Financial Services Authority.
I wrote about the Libor investigation in the current issue of Rolling Stone, in "The Scam Wall Street Learned From the Mafia," about muni bond bid-rigging. Throughout this spring, while the Carollo bid-rigging case played out in a Manhattan courtroom, negotiations between banks and regulators were going on in this far larger cartel-corruption case. It’s been clear for some time now that a number of players had begun cooperating, and the only question was which bank was going to settle first.

Despite widespread expectation that it would be UBS, it turned out to be Barclays. You know how in Law and Order Jack McCoy always puts the two murder accomplices in separate rooms and tells them both that whoever talks first wins? Something like that happened here. In any case, the Department of Justice filing on the settlement contained excerpts of emails and other evidence that recall the taped phone conversations in the Carollo case: once again, we have seemingly incontrovertible evidence of wide-scale market manipulation. From Alison Frankel at Reuters:
Barclays employees agreed to manipulate the rates they submitted to the banking authority that oversees the daily Libor report for seemingly anyone who asked them to monkey with it: senior Barclays officials concerned that the bank would look weak if it reported too high a borrowing rate; interest rate swap traders trying to improve Barclays' derivatives trading position; even former Barclays traders begging for favors. We're talking naked, blatant manipulation. Here's one exchange cited in the DOJ filing:

Trader: "Can you pls continue to go in for 3m Libor at 5.365 or lower, we are all very long cash here in ny."

Libor rate submitter: "How long?"

Trader: "Until the effective date goes over year end (i.e. turn drops out) if possible."

Submitter: "Will do my best sir."
This is unbelievable, shocking stuff. A sizable chunk of the world’s adjustable-rate investment vehicles are pegged to Libor, and here we have evidence that banks were tweaking the rate downward to massage their own derivatives positions. The consequences for this boggle the mind. For instance, almost every city and town in America has investment holdings tied to Libor. If banks were artificially lowering the rates to beef up their trading profiles, that means communities all over the world were cheated out of ungodly amounts of money.

by Matt Taibbi, Rolling Stone |  Read more:

How William Faulkner Tackled Race — and Freed the South From Itself


A poll of well over a hundred writers and critics, taken a few years back by Oxford American magazine, named William Faulkner’s “Absalom, Absalom!” the “greatest Southern novel ever written,” by a decisive margin — and the poll was conducted while looking back on a century in which a disproportionate number of the best American books were Southern — so to say that this novel requires no introduction is just to speak plainly.

Of course, it’s the kind of book a person would put first in a poll like that. You can feel reasonably confident, in voting for it, that nobody quite fathoms it enough to question its achievement. Self-consciously ambitious and structurally complex (unintelligible, a subset of not unsophisticated readers has always maintained), “Absalom, Absalom!” partakes of what the critic Irving Howe called “a fearful impressiveness,” the sort that “comes when a writer has driven his vision to an extreme.” It may represent the closest American literature came to producing an analog for “Ulysses,” which influenced it deeply — each in its way is a provincial Modernist novel about a young man trying to awaken from history — and like “Ulysses,” it lives as a book more praised than read, or more esteemed than enjoyed.

But good writers don’t look for impressedness in their readers — it’s at best another layer of distortion — and “greatness” can leave a book isolated in much the way it can a human being. (Surely a reason so many have turned away from “Ulysses” over the last near-hundred years is that they can’t read it without a suffocating sense of each word’s cultural importance and their duty to respond, a shame in that case, given how often Joyce was trying to be amusing.) A good writer wants from us — or has no right to ask more than — intelligence, good faith and time. A legitimate question to ask is, What happens with “Absalom, Absalom!” if we set aside its laurels and apply those things instead? What has Faulkner left us?

A prose of exceptional vividness, for one thing. The same few passages, in the very first pages, remind me of this — they’re markings on an entryway — sudden bursts of bristly adjective clusters. The September afternoon on which the book opens in a “dim hot airless” room is described as “long still hot weary dead.” If you’ve ever taken a creative-writing workshop, you’ve been warned never to do this, pile up adjectives, interpose descriptive terms between the reader’s imagination and the scene. But here something’s different. Faulkner’s choices are so precise, and his juxtaposition of the words so careful in conditioning our sense reception, that he doesn’t so much solve as overpower the problem. The sparrows flying into the window trellis beat their wings with a sound that’s “dry vivid dusty,” each syllable a note in a chord he’s forming. The Civil War ghosts that haunt the room are “garrulous outraged baffled.”

The rules Faulkner doesn’t ignore in this novel he tends to obliterate. The plot, for instance. There is none. He tells us on the third page (in italics) pretty much everything that will happen in the book, actionwise. If you ever feel lost, you can refer back to it, a little not-even-paragraph that begins, “It seems that this demon — his name was Sutpen”

A fundamental law of storytelling is: withhold information. As the writer Paul Metcalf put it, “The only real work in creative endeavor is keeping things from falling together too soon.” What we discover, though, on advancing into the novel’s maze, is that Faulkner has given nothing away, not of the things he most values. He’s not concerned with holding us in suspense over the unearthing of events but in keeping us transfixed, as he goes about excavating the soil beneath them, and tracing their post-mortem effects (embodied, perhaps, by the worm that comes to light in a shovelful of dirt, “doubtless alive when the clod was thrown up though by afternoon it was frozen again”). The nightmare of the Southern past exists — an accomplished thing. To delve into the nature of the tragedy is the novel’s drama.

by John Jeremiah Sullivan, NY Times |  Read more:

Sunday, July 1, 2012


Karen Hollingsworth, Beach Read
Oil on canvas

Thomas Saliot, Walking Kubrick.

The Medication Generation

When I was a college freshman in the late 1990s, antidepressants were everywhere. Prozac was appearing on magazine covers, and I'd just seen my first commercial for Paxil on TV. Halfway through the semester, I was laid out by a prolonged anxiety attack and found myself in the school's campus health center, tearfully telling a newly minted psychiatry resident about my feelings of panic and despair. Given the spirit of the times, it wasn't a complete surprise when she sent me away a few minutes later with a prescription and a generous supply of small cardboard boxes full of beautiful blue pills, free samples dropped off on campus by a company rep.

The school psychiatrist didn't suggest talk therapy. She simply asked that I return for a "med check" every few weeks to make sure that the pills were working.

Work they did. My dread burned off like valley fog in the sun, and my tears dried up as decisively as if someone had turned off a spigot. Soon I felt less anxious and more sociable than I could ever remember being.

When I started using antidepressants, I didn't know anyone else my age who was taking them. Within a few years, I felt hard-pressed at times to find someone who wasn't. Antidepressants and other psychiatric medications went mainstream in the 1990s and 2000s, and my generation became the first to use these drugs in significant numbers as adolescents and young adults.

Young people are medicated even more aggressively now, and intervention often starts younger. In children, as in adults, antidepressants and medications for attention-deficit hyperactivity disorder are often used continuously for years. These trends have produced a novel but fast-growing group—young people who have known themselves longer on medication than off it. (...)

Like me, most young adults who take antidepressants have felt relief from symptoms. But there are several aspects of the experience of growing up on antidepressants that should give us pause.

First, using antidepressants when you're young raises tough questions of personal identity. Adults who take these drugs often report that the pills turn them back into the people they were before depression obscured their true selves. But for adolescents whose identity is still under construction, the picture is more complex. Lacking a reliable conception of what it is to feel "like themselves," young people have no way to gauge the effects of the drugs on their developing personalities.

Emily, 28, grew up in the Midwest and began taking Prozac for the depression and anxiety that began to overwhelm her at age 14. (Like all the young people I interviewed, she agreed to talk on the condition of being identified by a pseudonym.) She has used it nearly continuously since. Emily is confident that Prozac helps her, even crediting it with allowing her to work. Even so, she describes a painful and persistent desire to know what she would be like without medication.

"I think Prozac has helped me a lot," she said. "But I wonder, if I'd never gotten antidepressants, who would I be? What would I be like?"

by Katherine Sharpe, WSJ |  Read more:
Photo Illustration by Stephen Webster

Albert Marquet, Bay of Naples, c. 1908 or 1930s

Getting Away with It

In the spring of 2012 the Obama campaign decided to go after Mitt Romney’s record at Bain Capital, a private-equity firm that had specialized in taking over companies and extracting money for its investors—sometimes by promoting growth, but often at workers’ expense instead. Indeed, there were several cases in which Bain managed to profit even as it drove its takeover targets into bankruptcy.

So there was plenty of justification for an attack on Romney’s Bain record, and there were also clear political reasons to make that attack. For one thing, it had worked for Ted Kennedy, who used tales of workers injured by Bain to good effect against Romney in the 1994 Massachusetts Senate race. Also, to the extent that Romney had any real campaign theme to offer, it was his claim that as a successful businessman he could fix the economy where Obama had not. Pointing out both the many shadows in that business record and the extent to which what was good for Bain was definitely not good for America therefore made sense.

Yet as we were writing this review, two prominent Democratic politicians stepped up to undercut Obama’s message. First, Cory Booker, the mayor of Newark, described the attacks on private equity as “nauseating.” Then none other than Bill Clinton piped up to describe Romney’s record as “sterling,” adding, “I don’t think we ought to get into the position where we say ‘This is bad work. This is good work.’” (He later appeared with Obama and said that a Romney presidency would be “calamitous.”)

What was going on? The answer gets to the heart of the disappointments—political and economic—of the Obama years.

When Obama was elected in 2008, many progressives looked forward to a replay of the New Deal. The economic situation was, after all, strikingly similar. As in the 1930s, a runaway financial system had led first to excessive private debt, then financial crisis; the slump that followed (and that persists to this day), while not as severe as the Great Depression, bears an obvious family resemblance. So why shouldn’t policy and politics follow a similar script?

But while the economy now may bear a strong resemblance to that of the 1930s, the political scene does not, because neither the Democrats nor the Republicans are what once they were. Coming into the Obama presidency, much of the Democratic Party was close to, one might almost say captured by, the very financial interests that brought on the crisis; and as the Booker and Clinton incidents showed, some of the party still is. Meanwhile, Republicans have become extremists in a way they weren’t three generations ago; contrast the total opposition Obama has faced on economic issues with the fact that most Republicans in Congress voted for, not against, FDR’s crowning achievement, the Social Security Act of 1935.

by Paul Krugman and Robin Wells, NY Review of Books |  Read more:
Photo: Pete Souza/White House

The Four-Stringed Wonder

Until a few years ago, most Americans thought of the ukulele—if they thought of it at all—as a fake instrument. It was just a toy, something your grandpa might've played in the living room during the family cocktail hour, or a prop for vaudeville routines. The uke had a few high-profile partisans over the years—including George Harrison, who reportedly brought them to friends' houses as gifts—but as far as the rest of the world was concerned, the ukulele stopped with "Raindrops Keep Fallin' on My Head" and Tiny Tim.

Ten or 15 years ago, things started to change. (...)

"It's like a little chord machine," Beloff said. "There are all kinds of musically sophisticated things you can wring out of those four strings." He pointed to old players like Lyle Ritz—of the elite session band the Wrecking Crew, who played for Phil Spector, the Beach Boys, the Byrds, etc.—who made extraordinary ukulele jazz records. Beloff and his wife are about to release a songbook with ukulele arrangements for works by Vivaldi, Bach, and other baroque composers. (...)

Eddie Vedder, who has composed on the ukulele for years, found his first serious one on a surfing trip in a remote Hawaiian town. He went to the liquor store for some cases of beer and was sitting on them, waiting for a friend who'd gone to the grocery store. "I turned around and there was this ukulele hanging on the wall, right above my shoulder," he said, "just like a parrot on a pirate." He bought it and started fooling around in the sun. He'd left the case open on the sidewalk and people started throwing money into it. A new relationship was born.

"Instruments can be friends, and there's a big transition playing an instrument when it becomes your friend," he said. "You remember the day when it isn't a guest/host relationship. Most instruments take a while before they let you play them. The ukulele is different—it's a really gregarious little friend. And for its size, it's really forthright and giving. It doesn't have a Napoleon complex."

A good ukulele sounds gregarious. Vedder told one story about a night playing casually with a fellow musician. The friend was in the corner, trying to write something dark and evil-sounding on the ukulele, like it was a challenge. But he couldn't do it. They stayed up all night trying—and partying. "In the fog of the morning, he was vomiting over the balcony," Vedder said. "The uke had won!"

by Brenden Kiley, The Stranger |  Read more:
Photo:  Collings Guitars

Redefining Success and Celebrating the Ordinary


I've been thinking a lot about the ordinary and extraordinary lately. All year, my sons’ school newsletters were filled with stories about students winning prizes for university-level scientific research, stellar musical accomplishments and statewide athletic laurels.

I wonder if there is any room for the ordinary any more, for the child or teenager — or adult — who enjoys a pickup basketball game but is far from Olympic material, who will be a good citizen but won’t set the world on fire.

We hold so dearly onto the idea that we should all aspire to being remarkable that when David McCullough Jr., an English teacher, told graduating seniors at Wellesley High School in Massachusetts recently, “You are not special. You are not exceptional,” the speech went viral.

“In our unspoken but not so subtle Darwinian competition with one another — which springs, I think, from our fear of our own insignificance, a subset of our dread of mortality — we have of late, we Americans, to our detriment, come to love accolades more than genuine achievement,” he told the students and parents. “We have come to see them as the point — and we’re happy to compromise standards, or ignore reality, if we suspect that’s the quickest way, or only way, to have something to put on the mantelpiece, something to pose with, crow about, something with which to leverage ourselves into a better spot on the social totem pole.”

I understand that Mr. McCullough, son of the Pulitzer Prize-winning historian, is telling these high school seniors that the world might not embrace them as unconditionally as their parents have. That just because they’ve been told they’re amazing doesn’t mean that they are. That they have to do something to prove themselves, not just accept compliments and trophies.

So where did this intense need to be exceptional come from?

Madeline Levine, a psychologist, said that for baby boomers, “the notion of being special is in our blood.” She added: “How could our children be anything but? And future generations kept building on that.”

by Alina Tugend, NY Times |  Read more:
Photo: Charlie Riedel/Associated Press

The Perfect Listen: Fiona Apple As A Lesson In Irrational Music Rituals


On June 19, a week and a half ago, Fiona Apple released a brand new album, her first in seven years. The entire album had been available for streaming by NPR Music for a week and a half by then. Three days later, my copy arrived in the mail. It hasn't left my desk since.

I still haven't listened to it.

Mind you, I've been looking forward to The Idler Wheel... more than maybe any other album this year. Her stunning Boston show in March floored me; it was unquestionably the best concert I've seen in five years, and it took me half a day to recover to a point where I could even listen to other music. Sure, the album's reviews have been breathless and hagiographic, but the prospect of it falling short of expectations – which is always a possibility, though similar reports about her recent performances turned out to be right on target – isn't the issue.

What has kept me from just putting the damn thing in my CD player and pressing "play" is a bit of what I fully accept is compulsive irrationality: I want to hear it so much that I want to make sure that conditions are exactly right the very first time I listen to it, and conditions have not been exactly right. And that is, in a word, stupid.

And I know stupid, because I have my own first-listen music-listening rituals. The first time I play an album, I have to listen to it straight through, with no interruptions, no pausing, no "I'll get to the rest of it later"; if it's 60 minutes long, then I'd better be sure I can carve out an hour for it. If there are lyrics in the liner notes, I'll read along as it plays. What I want, really, is to be able to give it my full, undivided attention.

But for all the romanticizing of the first time we hear an album or a song, that's almost never the moment of its crucial impact. That's not really how music works, not if it can actually hold up beyond that first listen. Unlike books, movies or plays (and television, to a lesser extent), recorded music is consumed repetitively. It's usually anywhere between the second and fifth listen that fragments that maybe weren't evident at first glance suddenly come at you or your brain makes a connection that could only have been made indirectly. That's when a song starts to mean something to you.

by Marc Hirsh, NPR |  Read more:
Photo: Fiona Apple, by Jack Plunkett/AP

Saturday, June 30, 2012


René Magritte, Time Transfixed, 1938, oil on canvas (via The Art Institute of Chicago)
"I would characterize it sort of like a powerful interest group within a political party at this point. It used to be the entire political party."
—Iggy Pop explains his current relationship with his penis.

h/t The Awl 

California Takes Foie Gras Off the Menu

At Mélisse in Santa Monica, diners were preparing Saturday for "one last huzzah" in honour of a controversial delicacy that will soon become contraband across California.

Awaiting them at the upmarket French bistro is a feast of foie gras, a seven-course special celebrating the foodstuff that makes animal rights campaigners gag, but leaves aficionados wanting more.

Those who make it through to the final dish – a strawberry shortcake stuffed with foie gras mousse and accompanied by foie gras ice cream – will be battling time, as well as their belts.

For at midnight California will enact a law it promised eight years ago, making the fattened livers of force-fed ducks and geese illegal.

Foie gras has long been a target for those calling for the ethical treatment of livestock. Translated to English as "fatty liver", foie gras is produced by a process known as gavage, in which the birds are force-fed corn through a tube.

It is designed to enlarge the birds' livers before being slaughtered, after which the organs are harvested and served up as a rich – and to fans a mouth-watering – delicacy.

The process dates back centuries. But in late 2004, then California governor Arnold Schwarzenegger signed a bill banning the sale of foie gras.

Diners and chefs were given a suitably long grace period to find an alternative method to gavage or wean themselves off the stuff it produces.

But despite a concerted effort by some to get the proposed ban overturned, seven and a half years down the line the law is now to be enacted.

From July 1, any restaurant serving foie gras will be fined up to $1,000 according to the statute. As the deadline has neared, restaurants have seen a growth in patrons wanting foie gras.

by Matt Williams, The Guardian |  Read more:
Photograph: Dimitar Dilkoff/AFP/Getty Images

Our Robot Future


It was chaos over Zuccotti Park on the early morning of Nov. 15. New York City policemen surrounded the park in Lower Manhattan where hundreds of activists had been living as part of the nationwide Occupy movement. The 1:00 AM raid followed a court order allowing the city to prohibit camping gear in the privately-owned park.

Many protestors resisted and nearly 200 were arrested. Journalists hurrying towards the park reported being illegally barred by police. The crews of two news choppers – one each from CBS and NBC – claimed they were ordered out of the airspace over Zuccotti Park by the NYPD. Later, NBC claimed its crew misunderstood directions from the control tower. “NYPD cannot, and did not, close air space. Only FAA can do that,” a police spokesperson told Columbia Journalism Review. The FAA said it issued no flight ban.

Regardless, the confusion resulted in a de facto media blackout for big media. Just one reporter had the unconstrained ability to get a bird’s-eye view on police action during the height of the Occupy protests. Tim Pool, a 26-year-old independent video journalist, in early December began sending a customized two-foot-wide robot – made by French company Parrot – whirring over the police’s and protestors’ heads. The camera-equipped ’bot streamed live video to Pool’s smartphone, which relayed the footage to a public Internet stream.

If the police ever noticed the diminutive, all-seeing automaton – and there’s no evidence they did – they never did anything to stop it. Unlike CBS and NBC, the boyish Pool, forever recognizable in his signature black knit cap, understood the law. He knew his pioneering drone flights were legal – just barely.

Pool’s robot coup was a preview of the future, as rapid advances in cheap drone technology dovetail with a loosening legal regime that, combined, could allow pretty much anybody to deploy their own flying robot–and all within the next three years. The spread of do-it-yourself robotics could radically change the news, the police, business and politics. And it could spark a sort of drone arms race as competing robot users seek to balance out their rivals.

Imagine police drones patrolling at treetop level down city streets, their cameras scanning crowds for weapons or suspicious activity. “Newsbots” might follow in their wake, streaming live video of the goings-on. Drones belonging to protest groups hover over both, watching the watchers. In nearby zip codes, drones belonging to real estate agents scope out hot properties. Robots deliver pizzas by following the signal from customers’ cell phones. Meanwhile, anti-drone “freedom fighters,” alarmed by the spread of cheap, easy overhead surveillance, take potshots at the robots with rifles and shotguns.

These aren’t just fantasies. All of these things are happening today, although infrequently and sometimes illegally. The only thing holding back the robots is government regulations that have failed to keep up with technology. The regs are due for an overhaul in 2015. That’s the year drones could make their major debut. “Everyone’s ready to do this,” Pool tells ANIMAL. “It’s only going to get crazier.”

by David Axe, AnimalNewYork |  Read more:

Amber Waves of Green

The gap between the richest and the poorest among us is now wider than it has been since we all nose-dived into the Great Depression. So GQ sent Jon Ronson on a journey into the secret financial lives of six different people on the ladder, from a guy washing dishes for 200 bucks a week in Miami to a self-storage gazillionaire. What he found are some surprising truths about class, money, and making it in America.

As I drive along the Pacific Coast Highway into Malibu, I catch glimpses of incredible cliff-top mansions discreetly obscured from the road, which is littered with abandoned gas stations and run-down mini-marts. The office building I pull up to is quite drab and utilitarian. There are no ornaments on the conference-room shelves—just a bottle of hand sanitizer. An elderly, broad-shouldered man greets me. He's wearing jogging pants. They don't look expensive. His name is B. Wayne Hughes.

You almost definitely won't have heard of him. He hardly ever gives interviews. He only agreed to this one because—as his people explained to me—income disparity is a hugely important topic for him. They didn't explain how it was important, so I assumed he thought it was bad.

I approached Wayne, as he's known, for wholly mathematical reasons. I'd worked out that there are six degrees of economic separation between a guy making ten bucks an hour and a Forbes billionaire, if you multiply each person's income by five. So I decided to journey across America to meet one representative of each multiple. By connecting these income brackets to actual people, I hoped to understand how money shapes their lives—and the life of the country—at a moment when the gap between rich and poor is such a combustible issue. Everyone in this story, then, makes roughly five times more than the last person makes. There's a dishwasher in Miami with an unbelievably stressful life, some nice middle-class Iowans with quite difficult lives, me with a perfectly fine if frequently anxiety-inducing life, a millionaire with an annoyingly happy life, a multimillionaire with a stunningly amazing life, and then, finally, at the summit, this great American eagle, Wayne, who tells me he's "pissed off" right now.
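Ronson's ladder is simple arithmetic, and it's easy to sketch. The $200-a-week starting point and the times-five multiplier come from the article; the 52-week annualization and the resulting rung values are my own back-of-the-envelope assumptions:

```python
# Sketch of Ronson's "six degrees of economic separation":
# each person on the ladder makes roughly five times the previous one.
# Starting pay ($200/week, the Miami dishwasher) is from the article;
# annualizing over 52 weeks is my own assumption.

weekly_start = 200
annual = weekly_start * 52          # ~$10,400 a year

# Six rungs, each five times the last
ladder = [annual * 5**i for i in range(6)]

for rung, income in enumerate(ladder, start=1):
    print(f"Person {rung}: ~${income:,}/year")
```

Note the rungs track annual income, not net worth, so the top of this sketch sits well below a billion dollars even though the summit of Ronson's ladder is a billionaire by wealth.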

"I live my life paying my taxes and taking care of my responsibilities, and I'm a little surprised to find out that I'm an enemy of the state at this time in my life," he says. (...)

In 2006, Wayne was America's sixty-first-richest man, according to Forbes, with $4.1 billion. Today he's the 242nd richest (and the 683rd richest in the world), with $1.9 billion. He's among the least famous people on the list. In fact, he once asked the magazine to remove his name. "I said, 'It's an imposition. Forbes should not be doing that. It's the wrong thing to do. It puts my children and my grandchildren at risk.' "

"And what did they say?" I ask.

"They said when Trump called up, he said the number next to his name was too small."

When Wayne is in Malibu, he stays in his daughter's spare room. His home is a three-bedroom farmhouse on a working stud farm in Lexington, Kentucky.

"I have no fancy living at all," he says. "Well, I have a house in Sun Valley. Five acres in the woods. I guess that's fancy."

I like Wayne very much. He's avuncular and salt of the earth. I admire how far he has risen from the Grapes of Wrath circumstances into which he was born; he's the very embodiment of the American Dream. I'm surprised, though, and a little taken aback, by his anger. I'll return to Wayne—and the curiously aggrieved way he views his place in the world—a bit later.

But first let's plummet all the way down to the very, very bottom, as if we're falling down a well, to a concrete slab of a house in a downtrodden Miami neighborhood called Little Haiti.

by Jon Ronson, GQ |  Read more:

Friday, June 29, 2012


Sergei Ivanov: Firing Squad (1905)

Why We Cheat

Behavioral economist Dan Ariely, who teaches at Duke University, is known as one of the most original designers of experiments in social science. Not surprisingly, the best-selling author’s creativity is evident throughout his latest book, The (Honest) Truth About Dishonesty. A lively tour through the impulses that cause many of us to cheat, the book offers especially keen insights into the ways in which we cut corners while still thinking of ourselves as moral people. Here, in Ariely’s own words, are seven lessons you didn’t learn in school about dishonesty. (Interview edited and condensed by Gary Belsky.)

1. Most of us are 98-percenters.

“A student told me a story about a locksmith he met when he locked himself out of the house. This student was amazed at how easily the locksmith picked his lock, but the locksmith explained that locks were really there to keep honest people from stealing. His view was that 1% of people would never steal, another 1% would always try to steal, and the rest of us are honest as long as we’re not easily tempted. Locks remove temptation for most people. And that’s good, because in our research over many years, we’ve found that everybody has the capacity to be dishonest and almost everybody is at some point or another.”

2. We’ll happily cheat … until it hurts.

“The Simple Model of Rational Crime suggests that the greater the reward, the greater the likelihood that people will cheat. But we’ve found that for most of us, the biggest driver of dishonesty is the ability to rationalize our actions so that we don’t lose the sense of ourselves as good people. In one of our matrix experiments [a puzzle-solving exercise Ariely uses in his work to measure dishonesty], the level of cheating didn’t change as the reward for cheating rose. In fact, the highest payout resulted in a little less cheating, probably because the amount of money got to be big enough that people couldn’t rationalize their cheating as harmless. Most people are able to cheat a little because they can maintain the sense of themselves as basically honest people. They won’t commit major fraud on their tax returns or insurance claims or expense reports, but they’ll cut corners or exaggerate here or there because they don’t feel that bad about it.”

3. It’s no wonder people steal from work.

“In one matrix experiment, we added a condition where some participants were paid in tokens, which they knew they could quickly exchange for real money. But just having that one step of separation resulted in a significant increase in cheating. Another time, we surveyed golfers and asked which act of moving a ball illegally would make other golfers most uncomfortable: using a club, their foot or their hand. More than twice as many said it would be less of a problem — for other golfers, of course — to use their club than to pick the ball up. Our willingness to cheat increases as we gain psychological distance from the action. So as we gain distance from money, it becomes easier to see ourselves as doing something other than stealing. That’s why many of us have no problem taking pencils or a stapler home from work when we’d never take the equivalent amount of money from petty cash. And that’s why I’m a little concerned about the direction we’re taking toward becoming a cashless society. Virtual payments are a great convenience, but our research suggests we should worry that the farther people get from using actual money, the easier it becomes to steal.”

by Gary Belsky, Time |  Read more:
Photo: Getty Images

Tokyo