Tuesday, June 5, 2012
"Don't Eat Fortune's Cookie"
My case illustrates how success is always rationalized. People really don’t like to hear success explained away as luck — especially successful people. As they age, and succeed, people feel their success was somehow inevitable. They don't want to acknowledge the role played by accident in their lives. There is a reason for this: the world does not want to acknowledge it either.
I wrote a book about this, called "Moneyball." It was ostensibly about baseball but was in fact about something else. There are poor teams and rich teams in professional baseball, and they spend radically different sums of money on their players. When I wrote my book the richest team in professional baseball, the New York Yankees, was then spending about $120 million on its 25 players. The poorest team, the Oakland A's, was spending about $30 million. And yet the Oakland team was winning as many games as the Yankees — and more than all the other richer teams.
This isn't supposed to happen. In theory, the rich teams should buy the best players and win all the time. But the Oakland team had figured something out: the rich teams didn't really understand who the best baseball players were. The players were misvalued. And the biggest single reason they were misvalued was that the experts did not pay sufficient attention to the role of luck in baseball success. Players were given credit for things they did that depended on the performance of others: pitchers got paid for winning games, hitters got paid for knocking in runners on base. Players got blamed and credited for events beyond their control: where batted balls happened to land on the field, for example.
Forget baseball, forget sports. Here you had these corporate employees, paid millions of dollars a year. They were doing exactly the same job that people in their business had been doing forever. In front of millions of people, who evaluate their every move. They had statistics attached to everything they did. And yet they were misvalued — because the wider world was blind to their luck.
This had been going on for a century. Right under all of our noses. And no one noticed — until it paid a poor team so well to notice that they could not afford not to notice. And you have to ask: if a professional athlete paid millions of dollars can be misvalued, who can't be? If the supposedly pure meritocracy of professional sports can't distinguish between lucky and good, who can?
The "Moneyball" story has practical implications. If you use better data, you can find better values; there are always market inefficiencies to exploit, and so on. But it has a broader and less practical message: don't be deceived by life's outcomes. Life's outcomes, while not entirely random, have a huge amount of luck baked into them. Above all, recognize that if you have had success, you have also had luck — and with luck comes obligation. You owe a debt, and not just to your Gods. You owe a debt to the unlucky.
I make this point because — along with this speech — it is something that will be easy for you to forget.
I now live in Berkeley, California. A few years ago, just a few blocks from my home, a pair of researchers in the Cal psychology department staged an experiment. They began by grabbing students, as lab rats. Then they broke the students into teams, segregated by sex. Three men, or three women, per team. Then they put these teams of three into a room, and arbitrarily assigned one of the three to act as leader. Then they gave them some complicated moral problem to solve: say what should be done about academic cheating, or how to regulate drinking on campus.
Exactly 30 minutes into the problem-solving the researchers interrupted each group. They entered the room bearing a plate of cookies. Four cookies. The team consisted of three people, but there were these four cookies. Every team member obviously got one cookie, but that left a fourth cookie, just sitting there. It should have been awkward. But it wasn't. With incredible consistency the person arbitrarily appointed leader of the group grabbed the fourth cookie, and ate it. Not only ate it, but ate it with gusto: lips smacking, mouth open, drool at the corners of his mouth. In the end all that was left of the extra cookie were crumbs on the leader's shirt.
This leader had performed no special task. He had no special virtue. He'd been chosen at random, 30 minutes earlier. His status was nothing but luck. But it still left him with the sense that the cookie should be his.
This experiment helps to explain Wall Street bonuses and CEO pay, and I'm sure lots of other human behavior. But it also is relevant to new graduates of Princeton University. In a general sort of way you have been appointed the leader of the group. Your appointment may not be entirely arbitrary. But you must sense its arbitrary aspect: you are the lucky few. Lucky in your parents, lucky in your country, lucky that a place like Princeton exists that can take in lucky people, introduce them to other lucky people, and increase their chances of becoming even luckier. Lucky that you live in the richest society the world has ever seen, in a time when no one actually expects you to sacrifice your interests to anything.
All of you have been faced with the extra cookie. All of you will be faced with many more of them. In time you will find it easy to assume that you deserve the extra cookie. For all I know, you may. But you'll be happier, and the world will be better off, if you at least pretend that you don't.
Never forget: In the nation's service. In the service of all nations.
by Michael Lewis, 2012 Baccalaureate Remarks, Princeton University
What Cool Things Can I Do with All This Free Cloud Storage Space?
Dear Lifehacker,
Anytime I see an offer for free cloud storage, I'm all over it. I have over 8GB of Dropbox space, 5GB on Google Drive, 20GB on Amazon Cloud Drive, 50GB on Box, and 7GB on Microsoft's SkyDrive—and I want to take advantage of all of it. Any suggestions?
Thanks,
Drowning in Free Space
Dear Drowning,
We hear you! With all the cloud services handing out free space like it's candy, it's easy to end up with a lot of unused space just waiting to be filled. Unfortunately, there's no way to consolidate all that storage space spread out across your accounts (though you can use services like Otixo and Primadesk to see all your online drives at once). One way to make use of all of these services without too much confusion is to separate the types of files you store across services, and in fact, you can do so in a way that takes advantage of the strengths of each.
For example, you can dedicate Dropbox to your active projects, because it's the syncing service where you have the most storage space. Use other services for backing up your photos, music, and other data.
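If you like to automate this kind of sorting, here's a minimal Python sketch of the split-by-file-type idea: it moves files into local sync folders based on their extension. The folder paths and the extension-to-service mapping are assumptions for illustration; you'd adjust them to match wherever your own desktop sync clients actually keep their folders.

```python
import shutil
from pathlib import Path
from typing import Optional

# Hypothetical local sync folders -- adjust these to wherever your
# Dropbox, Google Drive, and Amazon Cloud Drive clients actually sync.
SYNC_FOLDERS = {
    ".mp3":  Path.home() / "Amazon Cloud Drive",      # music
    ".jpg":  Path.home() / "Google Drive" / "Photos",  # photos
    ".png":  Path.home() / "Google Drive" / "Photos",
    ".doc":  Path.home() / "Dropbox" / "Projects",     # active work
    ".docx": Path.home() / "Dropbox" / "Projects",
}

def route_file(path: Path) -> Optional[Path]:
    """Move a file into the sync folder mapped to its extension.

    Returns the destination path, or None if no service is assigned
    to this file type (the file is left where it is).
    """
    dest_dir = SYNC_FOLDERS.get(path.suffix.lower())
    if dest_dir is None:
        return None
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / path.name
    shutil.move(str(path), dest)
    return dest
```

Run it over a downloads folder with `for f in Path("~/Downloads").expanduser().iterdir(): route_file(f)` and each file lands in the folder its service syncs, so the split happens without any manual dragging.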
These services all have unique strengths that can help you decide what to use them for. You don't need to use every single one of these services, but if you want to mix and match, here's an overview of what they're best for:
Best Uses for Different Cloud Services
Sync Your Music with Amazon Cloud Drive or Google Play Music
Neither Amazon Cloud Drive nor Google Play Music syncs your files, so they're not useful for storing stuff that needs to always be up-to-date. They are, however, ideal for your music files.
If you buy your MP3s from Amazon, they're automatically stored to your Amazon Cloud Drive and don't count against your storage space. Even better, if you're on a paid plan (starting at $20/year for 20GB), you get unlimited storage space for all music, regardless of where you bought it. Amazon can stream your music on the web and on Android and iOS devices.
Google Play Music now incorporates the former Google Music service into Google's Play marketplace to store your songs—and books—online and stream them on the web and your Android phone. Play's limit for music is 20,000 songs, rather than a set amount of space in gigabytes. (You get unlimited space for ebooks, and you can use Play to rent movies but not store them in the cloud.) Plus, Adam Pash's Music Plus Chrome extension makes Play Music even more awesome.
Learn more about the differences between Google Play Music and Amazon Cloud Drive in our cloud music comparison, which also includes iCloud. It's also worth noting that SugarSync can stream a folder of music to iOS and Android, and gives you 5GB of free space.
by Melanie Pinola, Lifehacker
Failure and Rescue
In commencement addresses like this, people admonish us: take risks; be willing to fail. But this has always puzzled me. Do you want a surgeon whose motto is “I like taking risks”? We do in fact want people to take risks, to strive for difficult goals even when the possibility of failure looms. Progress cannot happen otherwise. But how they do it is what seems to matter. The key to reducing death after surgery was the introduction of ways to reduce the risk of things going wrong—through specialization, better planning, and technology. They have produced a remarkable transformation in the field. Not that long ago, surgery was so inherently dangerous that you would only consider it as a last resort. Large numbers of patients developed serious infections afterward, bleeding, and other deadly problems we euphemistically called “complications.” Now surgery has become so safe and routine that most is day surgery—you go home right afterward.
But there continue to be huge differences between hospitals in the outcomes of their care. Some places still have far higher death rates than others. And an interesting line of research has opened up asking why.
Researchers at the University of Michigan discovered the answer recently, and it has a twist I didn’t expect. I thought that the best places simply did a better job at controlling and minimizing risks—that they did a better job of preventing things from going wrong. But, to my surprise, they didn’t. Their complication rates after surgery were almost the same as others. Instead, what they proved to be really great at was rescuing people when they had a complication, preventing failures from becoming a catastrophe.
Scientists have given a new name to the deaths that occur in surgery after something goes wrong—whether it is an infection or some bizarre twist of the stomach. They call them a “failure to rescue.” More than anything, this is what distinguished the great from the mediocre. They didn’t fail less. They rescued more.
This may in fact be the real story of human and societal improvement. We talk a lot about “risk management”—a nice hygienic phrase. But in the end, risk is necessary. Things can and will go wrong. Yet some have a better capacity to prepare for the possibility, to limit the damage, and to sometimes even retrieve success from failure.
When things go wrong, there seem to be three main pitfalls to avoid, three ways to fail to rescue. You could choose a wrong plan, an inadequate plan, or no plan at all. Say you’re cooking and you inadvertently set a grease pan on fire. Throwing gasoline on the fire would be a completely wrong plan. Trying to blow the fire out would be inadequate. And ignoring it—“Fire? What fire?”—would be no plan at all. (...)
There was, as I said, every type of error. But the key one was the delay in accepting that something serious was wrong. We see this in national policy, too. All policies court failure—our war in Iraq, for instance, or the effort to stimulate our struggling economy. But when you refuse to even acknowledge that things aren’t going as expected, failure can become a humanitarian disaster. The sooner you’re able to see clearly that your best hopes and intentions have gone awry, the better. You have more room to pivot and adjust. You have more of a chance to rescue.
But recognizing that your expectations are proving wrong—accepting that you need a new plan—is commonly the hardest thing to do. We have this problem called confidence. To take a risk, you must have confidence in yourself. In surgery, you learn early how essential that is. You are imperfect. Your knowledge is never complete. The science is never certain. Your skills are never infallible. Yet you must act. You cannot let yourself become paralyzed by fear.
by Atul Gawande, The New Yorker
Photograph courtesy Hulton Archive/Getty.
Can Music Save Your Life?
Who hasn't at least once had the feeling of being remade through music? Who is there who doesn't date a new phase in life to hearing this or that symphony or song? I heard it—we say—and everything changed. I heard it, and a gate flew open and I walked through. But does music constantly provide revelation—or does it have some other effects, maybe less desirable?
For those of us who teach, the question is especially pressing. Our students tend to spend hours a day plugged into their tunes. Yet, at least in my experience, they are reluctant to talk about music. They'll talk about sex, they'll talk about drugs—but rock 'n' roll, or whatever else they may be listening to, is off-limits. What's going on there?
When I first heard Bob Dylan's "Like a Rolling Stone" in 1965, not long after it came out, I was amazed. At the time, I liked to listen to pop on the radio—the Beatles were fine, the Stones were better. But nothing I'd heard until then prepared me for Dylan's song. It had all the fluent joy of a pop number, but something else was going on too. This song was about lyrics: language. Dylan wasn't chanting some truism about being in love or wanting to get free or wasted for the weekend. He had something to say. He was exasperated. He was pissed off. He'd clearly been betrayed by somebody, or a whole nest of somebodies, and he was letting them have it. His words were exuberantly weird and sometimes almost embarrassingly inventive—and I didn't know what they all meant. "You used to ride on the chrome horse with your diplomat / Who carried on his shoulder a Siamese cat." Chrome horse? Diplomat? What?
I sensed Dylan's disdain and his fury, but the song suggested way more than it declared. This was a sidewinder of a song—intense and angry, but indirect and riddling too. I tried to hear every line—Dylan's voice seemed garbled, and our phonograph wasn't new. I can still see myself with my head cocked to the spindle, eyes clenched, trying to shut out the room around me as I strained to grab the words from the harsh melodious wind of the song. "Ain't it hard when you discovered that / He really wasn't where it's at / After he took from you everything he could steal."
I couldn't listen to that song enough. I'd liked music before that. I'd liked stuff I'd heard on the radio; I'd even liked the Beethoven and the Mahler that my father played at top volume on Sunday mornings, though I never would have admitted as much to him. But Dylan was different. Other music made me temporarily happy, or tranquil, or energized. But this music made me puzzled. There was something in the grooves that I wasn't getting. There was something in the mix of the easy, available pop hook and the grating voice and elliptical words that signaled in the direction of experiences I hadn't had yet, and maybe never would. The song made me feel that life was larger than I had thought and made me want to find out what I was missing.
That song kicked open a door in my mind—to borrow a phrase Bruce Springsteen used to describe his own experience with it. But to be honest, in time that door may have gotten a little rusty from lack of use. Because really, after I heard "Like a Rolling Stone" on the radio and bought the single and listened to it 50 or so times, I put it away. I never went out to cop a Dylan album. I never even thought much about the guy for the next five years.
by Mark Edmundson, The Chronicle of Higher Education
Photo: Tim MacPherson, Cultura, Aurora Photos
There She Blew
The history of American whaling.
If, under the spell of “Moby-Dick,” you decided to run away to the modern equivalent of whaling, where would you go? Because petroleum displaced whale oil as a source of light and lubrication more than a century ago, it might seem logical to join workers in Arabian oil fields or on drilling platforms at sea. On the other hand, firemen, like whalers, are united by their care for one another and for the vehicle that bears them, and the fireman’s alacrity with ladders and hoses resembles the whaler’s with masts and ropes. Then, there are the armed forces, which, like a nineteenth-century whaleship, can take you around the world in the company of people from ethnic and social backgrounds unfamiliar to you. All these lines of work are dangerous but indispensable, as whaling once was, but none seem perfectly analogous. Ultimately, there is nothing like rowing a little boat up to a sixty-ton mammal that swims, stabbing it, and hoping that it dies a relatively well-mannered death.
Nor is there anything like skinning the whale’s penis, “longer than a Kentuckian is tall,” and wearing it as a tunic while you slice up the fat harvested from the rest of its body. Melville’s narrator, Ishmael, claims that the mincer of blubber usually wore such a tunic, in a clerical cut that made him look like “a candidate for an archbishoprick.” For “Moby-Dick,” Melville drew on scientific, historical, and journalistic accounts of whales, but he had a reputation for blurring the line between fact and fiction, and scholars have noted that for this chapter “none of Melville’s fish documents was particularly helpful.” In other words, he may have made the tunic up, for the sake of the archiepiscopal pun and perhaps, too, as a symbol. In another chapter long suspected of symbolism, Ishmael falls into ecstasy while squeezing the lumps out of spermaceti freshly bailed from the head of a sperm whale: “I found myself unwittingly squeezing my co-laborers’ hands in it, mistaking their hands for the gentle globules.” But, fanciful as it sounds, sperm-squeezing is attested to by another source. In an 1874 memoir, the whaler William M. Davis recalled how “luxurious” it was to wade into pots of sperm and “squeeze and strain out the fibres,” which would darken the oil unless they were removed, and added that, in the rich bath, “I almost fell in love with the touch of my own poor legs.”
It is difficult to follow in Melville’s footsteps if you can’t tell when he’s fibbing, but there is no shortage of whaling histories for a Melville aficionado to turn to. (“Though of real knowledge there be little,” Melville wrote, “yet of books there are a plenty.”) In the latest, “Leviathan” (Norton; $27.95), Eric Jay Dolin offers a pleasantly anecdotal history of American whaling so comprehensive that he seems to have harpooned at least one fact from every cetacean text ever printed. “Leviathan” is a gentle book about a brutal industry. By ending his story when America stopped whaling, Dolin omits the most gruesome years of international whaling history, when new technology increased killing capacity approximately tenfold. He presents whaling in a more innocent age, when it was the fifth-largest industry in America and a source of national pride—in the time before ecology, as well as before steamships, as it were.

If, under the spell of “Moby-Dick,” you decided to run away to the modern equivalent of whaling, where would you go? Because petroleum displaced whale oil as a source of light and lubrication more than a century ago, it might seem logical to join workers in Arabian oil fields or on drilling platforms at sea. On the other hand, firemen, like whalers, are united by their care for one another and for the vehicle that bears them, and the fireman’s alacrity with ladders and hoses resembles the whaler’s with masts and ropes. Then, there are the armed forces, which, like a nineteenth-century whaleship, can take you around the world in the company of people from ethnic and social backgrounds unfamiliar to you. All these lines of work are dangerous but indispensable, as whaling once was, but none seem perfectly analogous. Ultimately, there is nothing like rowing a little boat up to a sixty-ton mammal that swims, stabbing it, and hoping that it dies a relatively well-mannered death.
Nor is there anything like skinning the whale’s penis, “longer than a Kentuckian is tall,” and wearing it as a tunic while you slice up the fat harvested from the rest of its body. Melville’s narrator, Ishmael, claims that the mincer of blubber usually wore such a tunic, in a clerical cut that made him look like “a candidate for an archbishoprick.” For “Moby-Dick,” Melville drew on scientific, historical, and journalistic accounts of whales, but he had a reputation for blurring the line between fact and fiction, and scholars have noted that for this chapter “none of Melville’s fish documents was particularly helpful.” In other words, he may have made the tunic up, for the sake of the archiepiscopal pun and perhaps, too, as a symbol. In another chapter long suspected of symbolism, Ishmael falls into ecstasy while squeezing the lumps out of spermaceti freshly bailed from the head of a sperm whale: “I found myself unwittingly squeezing my co-laborers’ hands in it, mistaking their hands for the gentle globules.” But, fanciful as it sounds, sperm-squeezing is attested to by another source. In an 1874 memoir, the whaler William M. Davis recalled how “luxurious” it was to wade into pots of sperm and “squeeze and strain out the fibres,” which would darken the oil unless they were removed, and added that, in the rich bath, “I almost fell in love with the touch of my own poor legs.”
It is difficult to follow in Melville’s footsteps if you can’t tell when he’s fibbing, but there is no shortage of whaling histories for a Melville aficionado to turn to. (“Though of real knowledge there be little,” Melville wrote, “yet of books there are a plenty.”) In the latest, “Leviathan” (Norton; $27.95), Eric Jay Dolin offers a pleasantly anecdotal history of American whaling so comprehensive that he seems to have harpooned at least one fact from every cetacean text ever printed. “Leviathan” is a gentle book about a brutal industry. By ending his story when America stopped whaling, Dolin omits the most gruesome years of international whaling history, when new technology increased killing capacity approximately tenfold. He presents whaling in a more innocent age, when it was the fifth-largest industry in America and a source of national pride—in the time before ecology, as well as before steamships, as it were.
It’s hard to say who qualifies as the first American whaler. The Inuit hunted whales in the Canadian Arctic a thousand years ago, but Dolin isn’t convinced that anyone in what is now the United States did so before Europeans arrived. Basque whale hunters reached Labrador in the sixteenth century, and in 1614 John Smith, unable to return to his beloved Virginia, tried to catch whales near what is now Maine. (They got away.) The day after the Pilgrims signed their 1620 compact, whales surrounded the Mayflower, but the Pilgrims lacked whale-catching equipment, and Dolin suspects that they lit their lamps with oil from dead whales they found on the beach. The first to hunt whales and actually catch them in waters that today belong to the United States were the Dutch, off the coast of Delaware, in the sixteen-thirties. The vagueness of this prehistory says a lot about Colonial America, which had few clear political borders, but even more about whaling, which throughout its history has tended to defy them. “In whaling the natural resource (the stock of whales) was owned by no one,” the economists Lance E. Davis, Robert E. Gallman, and Karin Gleiter noted in a definitive 1997 analysis of the industry. One theme to emerge from Dolin’s book is the oblique angle the history of whaling forms with political history. To whalers, nation-states were usually irrelevant and sometimes a hazard.
Once a whale washed ashore, of course, it was bound to end up as someone’s property, and whales entered early American law through the question of who owned them when they did. On Long Island, a town’s householders divvied up the oil among themselves, after paying a few shillings to the finder and something to the butcher, and sometimes surrendering the fins and flukes to local Indians for ceremonial use. In Massachusetts, Plymouth Colony taxed towns by taking a barrel of oil from every drift whale. In the sixteen-forties and fifties, colonists began to sail a few miles to kill whales spotted from shore, and, not long afterward, Colonial governments were demanding a share of the profits from these whales, too.
Serious money was at stake. When two shallops of Rhode Islanders towed home a right whale in 1662, a contemporary commented that “they had earned more than a whole farm would bring us in an entire year.” Besides oil, right whales contained baleen, a fibrous and feathery tissue in their mouths, which is probably responsible for the “strange, grassy, cutting sound” that Ishmael hears as he watches them feed. Flexible when heated, baleen, also known as whalebone, kept whatever shape it was cooled into, like plastic. It was used primarily in corsets, fashionable from the sixteenth century to the dawn of the twentieth, but it could be molded into items as various as umbrella ribs, fishing rods, and shoehorns.
by Caleb Crain, The New Yorker | Read more:
Illustration: Jacques De Loustal
Are We Losing San Francisco?
[ed. See also: Can Mom and Pop Shops Survive Extreme Gentrification?]
When Twitter moves into its new headquarters in downtown San Francisco this month, it will occupy three floors of an 11-story 1937 Art Deco building that has sat shuttered for five years. Outside, its blue bird logo will replace the former main tenant’s sign, whose analog clocks remain frozen at 9:18, 4:33 and other times past.
Far from Silicon Valley’s self-enclosed campuses, Twitter and other tech start-ups are gravitating to an urban core here that has defied development for decades. Its soon-to-be neighbors include liquor stores, check-cashing stores and discount hotels.
At the Ironwok Japanese and Chinese restaurant, whose half-torn storefront banner flapped in the wind on a recent afternoon, the owners were waiting for Twitter with the same mixture of expectation and trepidation shared by much of the city toward the second tech boom in a little over a decade.
“Of course, Twitter is good for the city, but how about me?” said the owner, Jenny Liu, 41, explaining that her landlord was raising her monthly rent to $12,000 from $8,000.
Even more than the dot-com bubble of the 1990s, this boom could transform the fabric of the city.
This time, Twitter, Zynga, Yelp and other social network companies favored by venture capitalists have made San Francisco their home, creating jobs and raising commercial rents. At the same time, a growing number of young Silicon Valley workers, drawn by San Francisco’s urban charms, are also moving into the city as commuters and further raising rents.
In a city often regarded as unfriendly to business, Mayor Edwin M. Lee, elected last year with the tech industry’s strong backing, has aggressively courted start-ups.
But this boom has also raised fears about the tech industry’s growing political clout and its spillover economic effects. Apartment rents have soared to record highs as affordable housing advocates warn that a new wave of gentrification will price middle-class residents out of the city. At risk, many say, are the very qualities that have drawn generations of outsiders here, like the city’s diversity and creativity. Families, black residents, artists and others will increasingly be forced across the bridge to Oakland, they warn.
“Is Oakland Cooler Than San Francisco?” The San Francisco Bay Guardian captured the prevailing angst on a recent cover.
Kenneth Rosen, an economist and expert on real estate at the University of California, Berkeley, said that the boom was starting to hurt the “poor and middle class” but that it would benefit the “upper middle class.” Its full impact will not be felt for another couple of years, he said, adding, “We are early on in this boom.”
by Norimitsu Onishi, NY Times | Read more:
Photo: Jim Wilson/The New York Times
Monday, June 4, 2012
The Little 747 Who Dared to Dream
Do they still make children's books with sad endings? Like The Velveteen Rabbit? Because I think I've got a doozy here.
It's all about a 747 who loves to fly. It's what she was built to do and it's what she does best. For years, she soars through the skies, ferrying cargo and, possibly, some nondescript men in nice suits. (Or maybe not. Depends on when she went into service.) But through it all, the little 747 just wants to spend as much time as she can aloft, among the clouds, where she belongs.
But then, one day, the nondescript men in nice suits tell her that it's time she retired. They take her to a place in the desert and leave her there, with lots of other retired planes who've given up and are slowly falling apart. Other men come and they take her engines. Then they take all the beautiful buttons and switches from her cockpit. The other planes tell her that, soon, men will come with saws to cut away parts of her fuselage. But the little 747 never breaks. They can take her apart, bit by bit, but they can't take away her dreams. And still, sometimes, in the boneyard, she tries to take to the skies just one last time.
Seriously. Somebody call the Newbery committee.
And bring me a hanky.
by Maggie Koerth-Baker, Boing Boing
Real Cool
One of the more memorable encounters in the history of modern art occurred late in 1961 when the period’s preeminent avant-garde dealer, Leo Castelli, paid a call at the Upper East Side Manhattan townhouse-cum-studio of Andy Warhol, whose pioneering Pop paintings based on cartoon characters including Dick Tracy, the Little King, Nancy, Popeye, and Superman had caught the eye of Castelli’s gallery director, Ivan Karp, who in turn urged his boss to go have a look for himself. Warhol, eager to make the difficult leap from commercial artist to “serious” painter, decades later recalled his crushing disappointment when Castelli coolly told him, “Well, it’s unfortunate, the timing, because I just took on Roy Lichtenstein, and the two of you in the same gallery would collide.”
Although Lichtenstein, then a thirty-eight-year-old assistant art professor at Rutgers University’s Douglass College in New Jersey, was also making pictures based on comic-book prototypes—an example of wholly independent multiple discovery not unlike such scientific findings as calculus, oxygen, photography, and evolution—he and Warhol were in fact doing quite different things with similar source material, as the divergent tangents of their later careers would amply demonstrate. By 1964, Castelli recognized his mistake and added the thwarted aspirant to his gallery roster, though not before Warhol forswore cartoon imagery, fearful of seeming to imitate Lichtenstein, of whom he always remained somewhat in awe.
In fact, what Lichtenstein and his five-years-younger contemporary Warhol had most in common was being the foremost exemplars of Cool among their generation of American visual artists. The first half of the 1960s was the apogee of what might be termed the Age of Cool—as defined by that quality of being simultaneously with-it and disengaged, in control but nonchalant, knowing but ironically self-aware, and above all inscrutably undemonstrative.
Coolness (which was largely but not exclusively a male attribute) suffused American culture back then, from our supremely compartmentalized commander in chief, John Kennedy, to the action-movie star Steve McQueen, nicknamed “The King of Cool,” and from the middle-class cool of the TV talk-show host Johnny Carson to the far-out jazz trumpeter Miles Davis, whose LP album Birth of the Cool could serve as the soundtrack for that brief interlude before things suddenly turned hot toward the end of the Sixties. Coolness even had its own philosopher-theoretician, Marshall McLuhan, whose influential treatise Understanding Media (1964) codified comic books and television as “cool” means of communication.
Today, a quarter-century after Warhol’s death and fifteen years after Lichtenstein’s (in a hideous coincidence, both unexpectedly succumbed after what had been deemed routine hospital procedures), they remain the two Pop artists best known to the general public, if only in the most simplistic terms, with Warhol as the Campbell’s Soup guy and Lichtenstein as the cartoon guy. A pair of exhibitions that nearly overlapped this spring—a major one on Lichtenstein now at the Art Institute of Chicago before it travels internationally and a numerically comparable but physically more compact one on Warhol at the McNay Museum in San Antonio that was seen only there—offer telling contrasts between these two consummately cool customers.
by Martin Filler, NY Review of Books | Read more:
Illustration: Roy Lichtenstein: Brushstroke with Spatter, 1966. Art Institute of Chicago
WTF in China
I expected China to be different; exotic, challenging, overwhelming in its otherness. But, in many ways, it was depressingly familiar; the mall next to my apartment building had a Gap, an H&M, a Subway and a Baskin Robbins. The New York Pizza restaurant was always at least as busy as the excellent Dim Sum restaurant a few doors down from it. Beijing and Shanghai each have a 5th Avenue equivalent sporting a Louis Vuitton, an enormous Cartier, an equally huge Tiffanys, gigantic Apple stores and all the brands that you'd expect to accompany these. I saw a few Aston Martin and Porsche dealerships and it seemed like every other person was driving an Audi.
My tour guide at the Great Wall of China, Leo, looked at my iPhone and asked, "4S?" I replied yes and he bemoaned the fact that his was only an iPhone 4. By the way, you can get great 3G phone reception at the Great Wall. The Pudong area in Shanghai, which was all farmland 20 years ago, is now adding fantastical skyscrapers so quickly that, when I left for a week to go to Beijing, I thought buildings would pop up while I was away.
There is restricted access to the Internet in China, but it wasn't as bad as I thought it would be and clearly the barriers are pretty easy to work around. Leo asked if I'd like to be his Facebook friend and told me he'd friend me when he got home and could get on the VPN that went around the country's firewall.
But in ways that I wasn't expecting, China was as foreign and incomprehensible as anywhere I've ever been in my life. In the roughly 5 weeks (on and off) that I was there, I had more truly inexplicable encounters and conversations than in the rest of my life put together. My colleague Diana and I coined a phrase, WTF in China (WTFIC). We'd say this to each other every time there was really nothing else to say because words failed us.
One day in Beijing, we were sitting in a taxi in heavy traffic. We noticed a few vendors going between the cars selling mobile phone car chargers. This seemed like a clever idea. Then Diana noticed that each vendor had chargers in one hand and a live turtle in the other. What was the deal with the turtles? Were they selling them? Were they a marketing gimmick? We emailed Leo, who had offered to help us post-tour with any questions. Before I got his reply back, I said to Diana, "you know, even once he answers us, we're not going to be any more illuminated. I just know it's going to be a WTF in China issue." And indeed, this was Leo's answer, "For turtles, they are the symbol of longevity and fortune, so people may buy when they get bored in traffic!" Clearly, this answer made perfect logical sense to Leo. And to all the people sitting in rush hour traffic jams making a spur of the moment purchase of an animal that would probably outlive them.
And talking of driving in China…sometimes driving down the road, it was very hard to tell the difference between "something major has happened" and just the normal everyday chaos. Every day I felt like I took my life in my hands just getting a taxi to the office. Every taxi driver drives insanely fast and wildly, honking his horn even if there are no other cars on the road. It turns out that most of the Shanghai taxi drivers are living and working there illegally from other provinces and none of them seem to know how to get to almost anywhere in Shanghai. I became conversant in enough Mandarin to communicate and direct them to my office and back to my apartment building because this seemed to be a survival tactic.
When I was able to move beyond my fear for my life on these car rides and ones in Beijing, I noticed that cars often stopped, seemingly in the middle of the road or lined up on the hard shoulder, particularly at the weekend. This never helped the terrible traffic congestion. We asked Leo what that was all about. He told us that driving is a pretty new phenomenon for most people in Shanghai and Beijing and they see it as a social activity. When they go out for the day, they want to drive along with their friends. So they'll park in the median or by the side of the road to wait for them to catch up. Given that they all have cell phones, I'm not sure why they can't just track each other via GPS or phone and say, "where are you?" But again, this seemed a perfectly reasonable activity to Leo.
by Sarah Firisen, 3 Quarks Daily | Read more:
Expert Issues a Cyberwar Warning
[ed. See also: Why Antivirus Companies Like Mine Failed to Catch Flame and Stuxnet]
When Eugene Kaspersky, the founder of Europe’s largest antivirus company, discovered the Flame virus that is afflicting computers in Iran and the Middle East, he recognized it as a technologically sophisticated virus that only a government could create.
He also recognized that the virus, which he compares to the Stuxnet virus built by programmers employed by the United States and Israel, adds weight to his warnings of the grave dangers posed by governments that manufacture and release viruses on the Internet.
“Cyberweapons are the most dangerous innovation of this century,” he told a gathering of technology company executives, called the CeBIT conference, last month in Sydney, Australia. While the United States and Israel are using the weapons to slow the nuclear bomb-making abilities of Iran, they could also be used to disrupt power grids and financial systems or even wreak havoc with military defenses.
Computer security companies have for years used their discovery of a new virus or worm to call attention to themselves and win more business from companies seeking computer protection. Mr. Kaspersky, a Russian computer security expert, and his company, Kaspersky Lab, are no different in that regard. But he is also using his company’s integral role in exposing or decrypting three computer viruses apparently intended to slow or halt Iran’s nuclear program to argue for an international treaty banning computer warfare.
A growing array of nations and other entities are using online weapons, he says, because they are “thousands of times cheaper” than conventional armaments.
While antivirus companies might catch some, he says, only an international treaty that would ban militaries and spy agencies from making viruses will truly solve the problem.
The wide disclosure of the details of the Flame virus by Kaspersky Lab also seems intended to promote the Russian call for a ban on cyberweapons, akin to the international agreements that barred poison gas and expanding bullets from the armies of major nations.
by Andrew E. Kramer and Nicole Perlroth, NY Times | Read more:
Photo: Alexey Sazonov/Agence France-Presse - Getty Images