Wednesday, August 3, 2011

Books Without Borders

My Life at the World's Dumbest Bookstore Chain.
by Paul Constant

It's embarrassing now, but on the day that I was hired to work at Boston's flagship Borders store in 1996, I was so happy that I danced around my apartment. After dropping out of college, I had worked a succession of crappy jobs: mall Easter Bunny, stock boy at Sears and Kmart and Walmart, a brief and nearly fatal stint as a landscaper. A job at Borders seemed to be a step, at long last, toward my ultimate goal of writing for a living. At least I would be working with books. And the scruffy Borders employees, in their jeans and band T-shirts, felt a lot closer to my ideal urban intellectuals than the stuffy Barnes & Noble employees with their oppressive dress code and lame vests.

The fact that Borders offered me a full-time job, which allowed me to quit two part-time jobs (at a Staples and a Stop & Shop) and offered health insurance (that promised to help pay for my impending wisdom tooth extraction), was a pretty big deal, too.

For better and for worse, Borders was my college experience. I behaved badly—fucked, drank, and did drugs with everyone I could. My fellow employees snuck me into bars when I was underage, and then cheered when, during my 21st birthday party, I wound up facedown in the gutter sobbing about how my heart had been ripped in two by an ex-fiancĂ©e. I was not alone in my bad behavior: Every week, different employees were hooking up, having affairs, breaking up, recoupling, playing drinking games that involved comically large hunting knives, getting in fights, getting pregnant, and showing up drunk for work.

In the beginning, the store felt like a tight-knit family. As time went on, we became a confederation of hedonists with little regard for one another's feelings. At one Christmas party that I didn't attend, a new female employee reportedly gave blowjobs to anybody who wanted one. (Later, at least a couple of men who stood in line for the newbie's ministrations complained about picking up an STD.) Suddenly, the parties weren't as fun anymore. One employee hanged himself. Another died of a heart attack in the DVD section on the overnight replenishment shift and wasn't discovered until the store opened for business the next morning.

But it wasn't all an endless cycle of party and hangover. The 20 percent discount—plus an employee credit account that went up to $300, with the store paying off $20 of that debt a month—allowed me to explore books I'd never heard of. It's hard to remember now, but when Borders began proliferating in suburban parking lots around the country, they had a truly excellent selection curated, at least in part, by each store's employees. I bought my first title from countercultural Washington press Feral House—Apocalypse Culture—at the brand-new Borders at the Maine Mall when I was a teenager, and it still ranks as one of my most mind-blowing reading experiences. I read my first David Foster Wallace and Matt Ruff books while working at Borders; I explored the lesser-known works of Twain and Melville and Dickens and St. Vincent Millay. I learned who Edward Abbey and Noam Chomsky and Kathy Acker were. I discovered young writers like Banana Yoshimoto and Colson Whitehead and Chuck Palahniuk and Haruki Murakami. Thanks to my coworkers in the music department, which was just as far-reaching as the book department, I learned to love Miles Davis and Glenn Gould and an obscure punk band from way out west called Sleater-Kinney.

At the time, independent bookstores were blaming Borders for a spate of mom-and-pop bookstore closures around the country. I'll never forget the employee at Bookland in Maine who coldly accused me of single-handedly destroying her small chain when I admitted who my employer was, even as I was buying $50 worth of books from her. Of course, the accusations had truth to them—small bookstores simply couldn't compete with the deep discounts the chains offered—but for what it's worth, every employee who worked at Borders, at least when I first joined the company, adored literature. We were not automatons out to assassinate local business. We wanted to work with the cultural artifacts that were the most important things in our lives, the things that made us who we were. Not all of us could find work at independent bookstores, so we did the next best thing: We went to work for a company that seemingly cared about quality literature and regional reading tastes, and gave its employees a small-but-fair wage for full-time bookselling careers, with excellent benefits. It sure didn't feel like selling out.

Until suddenly, one day, it did feel like selling out. Because it was. Our displays were bought and paid for by publishers; where we used to present books that we loved and wanted to champion, now mediocre crap was piled on every flat surface. The front of the store, with all the kitchen magnets and board games and junk you don't need, took over large chunks of the expansive magazine and local-interest sections. Orders came from the corporate headquarters in Ann Arbor every Sunday to change out the displays. One time I had to take down some of the store's most exciting up-and-coming fiction titles (including a newly published book that was gathering word-of-mouth buzz, thanks to our booksellers, called Harry Potter and the Sorcerer's Stone) to put up a wall of Clash CDs. One month, for some reason, the cafe sold Ernest Hemingway–branded chai.

Read more:
RLyonsArt

Shark Week: Remembering Bruce

by Nicholas Jackson

There are only a few dozen shark attacks on humans every year. It has been widely reported that you are 30 times more likely to die from a lightning strike than you are from an attack. In 2003, Reuters ran a story claiming that more people are killed by vending machines each year than are killed by sharks. And yet, I would bet that just about anybody who has spent time at the beach has thought about the possibility of an attack. I know I certainly have. Before dipping so much as a toe into the ocean, I scan the horizon for a dark, approaching shadow from the deep. And I thank Steven Spielberg for that.

In 1975, Spielberg released the first of what would become a franchise. Jaws was a landmark horror-thriller, recognized by everyone from Empire magazine (fifth greatest film ever made) to the New York Times (one of the 1,000 best movies ever) to the American Film Institute (number 48 on the "100 Years... 100 Movies" list). It won three Academy Awards and was even nominated for Best Picture. (It lost to One Flew Over the Cuckoo's Nest.) Perhaps more importantly, the movie created the wide-release summer blockbuster, a tradition of providing big-budget thrills in every major theater across America during the hottest months of the year that continues to this day. Jaws brought in more money than any other film and held that title until George Lucas released Star Wars two years later.

An instant classic, Jaws received rave reviews. Roger Ebert called it "a sensationally effective action picture, a scary thriller that works all the better because it's populated with characters that have been developed into human beings we get to know and care about." There's Roy Scheider as Brody, the police chief we can all identify with, who doesn't like to swim, who is genuinely terrified of the water. There's Robert Shaw as Quint, "a caricature of the crusty old seafaring salt," as Ebert put it in that 1975 write-up. There's Hooper, the rich-kid-turned-oceanographer played by Richard Dreyfuss, just off a string of successes as the nice kid in American Graffiti and the title character in the Canadian hit The Apprenticeship of Duddy Kravitz. But the most important character -- and, in many ways, one of the most human -- is the shark itself.

Everyone knows the story by now: The shark is a great white that terrorizes a small resort town during the Fourth of July weekend, a weekend critical to the economy of this little village. In an effort to track down and kill the shark, these three men leave their families behind (where applicable) and set out on a rickety boat. It's leaky. It's too small. It's old. This boat, we know from the outset, just isn't cut out for shark hunting. At least not hunting sharks of the size we suspect this great white to be.

"There are no doubt supposed to be all sorts of levels of meanings in such an archetypal story," Ebert notes. But he doesn't bother writing about them or trying to figure them out. And neither does Spielberg. "This is an action film content to stay entirely within the perimeters of its story, and none of the characters has to wade through speeches expounding on the significance of it all." And what an action film it is. This isn't just about the dark shadow from the deep -- though it is that, too. Before the story comes to an end, many individuals both on and off the island have been killed in a series of terrifying scenes that allow you to get up close and personal with the shark.

The only reason this works -- the only reason that theatergoers in the 1970s left their seats terrified of these macropredatory beasts and that modern viewers can't turn off the lights when screening the film in their own living rooms -- is the craftsmanship and technology that went into creating the main character: Jaws.

By early May of 1974, the rights to Peter Benchley's book of the same name had been acquired, Spielberg had signed his contracts, and principal photography had begun on Martha's Vineyard. It could have failed. By all accounts, it probably should have failed. Spielberg, not yet 30, was largely untested as a director of big-budget productions, and nothing was in place. "We started the film without a script, without a cast and without a shark," Richard Dreyfuss would tell James Lipton during a taping of Inside the Actors Studio years later. But the cast would come together. And the shark was already in the works.

Read more:

Plan B




Jay Rosen on Journalism in the Internet Age

by Sophie Roell

In a break from our usual practice of focusing on books, we asked the journalism analyst and veteran blogger to recommend five articles illustrating the upheavals of the news business.

I know that as journalists we have to adapt rapidly to new ways of doing things, but you've really thrown me in at the deep end – you’ve chosen five online articles instead of five books, and we’re doing the interview on Google chat rather than by telephone.

I like to do things differently. For example, using PressThink for longform blogging, which wasn't the normal thing at the time, in 2003.

Will you give me an overall sense of what you are saying about changes in journalism with the articles you've chosen?

Well, first there's been a shift in power. The users have more than they did because they can publish and connect to one another, not just to the media. Second, the people formerly known as the audience are configured differently. They are connected horizontally as well as vertically, which is why today we speak of social media. This is what I sometimes call “audience atomisation overcome”. Third, the media still have power and journalism still matters. In some ways the essence of it has not changed. But a lot of what journalists did became bound up with particular forms of production and distribution. Since the web has radically altered those forms, it has radically changed journalistic work, even though the value of good journalism remains the same – timely, accurate, useful information that tells us what's happening in our world over the horizon of our personal experience.
Corrado Vanelli

Enter the Cyber-dragon

by Michael Joseph Gross


Hackers have attacked America’s defense establishment, as well as companies from Google to Morgan Stanley to security giant RSA, and fingers point to China as the culprit. The author gets an exclusive look at the raging cyber-war—Operation Aurora! Operation Shady rat!—and learns why Washington has been slow to fight back.


Lying there in the junk-mail folder, in the spammy mess of mortgage offers and erectile-dysfunction drug ads, an e-mail from an associate with a subject line that looked legitimate caught the man’s eye. The subject line said “2011 Recruitment Plan.” It was late winter of 2011. The man clicked on the message, downloaded the attached Excel spreadsheet file, and unwittingly set in motion a chain of events allowing hackers to raid the computer networks of his employer, RSA. RSA is the security division of the high-tech company EMC. Its products protect computer networks at the White House, the Central Intelligence Agency, the National Security Agency, the Pentagon, the Department of Homeland Security, most top defense contractors, and a majority of Fortune 500 corporations.

The parent company disclosed the breach on March 17 in a filing with the Securities and Exchange Commission. The hack gravely undermined the reputation of RSA’s popular SecurID security service. As spring gave way to summer, bloggers and computer-security experts found evidence that the attack on RSA had come from China. They also linked the RSA attack to the penetration of computer networks at some of RSA’s most powerful defense-contractor clients—among them, Lockheed Martin, Northrop Grumman, and L-3 Communications. Few details of these episodes have been made public.

The RSA and defense-contractor hacks are among the latest battles in a decade-long spy war. Hackers from many countries have been exfiltrating—that is, stealing—intellectual property from American corporations and the U.S. government on a massive scale, and Chinese hackers are among the main culprits. Because virtual attacks can be routed through computer servers anywhere in the world, it is almost impossible to attribute any hack with total certainty. Dozens of nations have highly developed industrial cyber-espionage programs, including American allies such as France and Israel. And because the People’s Republic of China is such a massive entity, it is impossible to know how much Chinese hacking is done on explicit orders from the government. In some cases, the evidence suggests that government and military groups are executing the attacks themselves. In others, Chinese authorities are merely turning a blind eye to illegal activities that are good for China’s economy and bad for America’s. Last year Google became the first major company to blow the whistle on Chinese hacking when it admitted to a penetration known as Operation Aurora, which also hit Intel, Morgan Stanley, and several dozen other corporations. (The attack was given that name because the word “aurora” appears in the malware that victims downloaded.) Earlier this year, details concerning the most sweeping intrusion since Operation Aurora were discovered by the cyber-security firm McAfee. Dubbed “Operation Shady rat,” the attacks (of which more later) are being reported here for the first time. Most companies have preferred not to talk about or even acknowledge violations of their computer systems, for fear of panicking shareholders and exposing themselves to lawsuits—or for fear of offending the Chinese and jeopardizing their share of that country’s exploding markets. The U.S. government, for its part, has been fecklessly circumspect in calling out the Chinese.

A scattered alliance of government insiders and cyber-security experts are working to bring attention to the threat, but because of the topic’s extreme sensitivity, much of their consciousness-raising activity must be covert. The result in at least one case, according to documents obtained by Vanity Fair, has been a surreal new creation of American bureaucracy: government-directed “hacktivism,” in which an intelligence agency secretly provides information to a group of private-sector hackers so that truths too sensitive for the government to tell will nevertheless come out.

This unusual project began in March, when National Security Agency officials asked a private defense contractor to organize a cadre of elite non-government experts to study the RSA cyber-attacks. The experts constituted a SEAL Team Six of cyber-security and referred to their work as Operation Starlight. “This is the N.S.A. outsourcing the finger-pointing to the private sector,” says one person who was invited to join the group and has been privy to its e-mail logs. The N.S.A. provided Operation Starlight with the data it needed for its forensic analysis.

Longing

by Traer Scott

"This is from my recent trip to Thailand where I was commissioned to photograph Asian Elephants, centering around those at Boon Lott's Elephant Sanctuary in Baan Tuek. This beautiful elephant looks on as her mahout cuddles baby Noah. She had recently lost her calf who was struck by lightening."


Tuesday, August 2, 2011

 
Artwork by Will Varner

Dave Matthews, Tim Reynolds


"Nine planets around the sun
Only one does the sun embrace
Upon this watered one
So much we take for granted..."

(lyrics)

Cut Off From The Herd

by S.L. Price, Sports Illustrated
August 25, 1997

[ed.  Interesting read after having the benefit of surveying Randy Moss's career, 14 years down the road.]

Everybody's watching him. Randy Moss can feel the eyes of the lunchtime crowd at the Bob Evans restaurant, the double takes and furtive glances from the men in short sleeves and wide ties. He's got his act down: gray hood over his head, butt slumped in the booth, eyes as lifeless as buttons. Moss is a wide receiver at Marshall University, in Huntington, W.Va., and he figures to be rich before long. He jabs at his toast with a plastic straw.

"If I didn't have this hood on, and they saw us sitting here, people would say an agent picked up Randy Moss and took him to Bob Evans," he says. "That's why I got this hood on. Some people are looking, and some are not. Some know I'm here and you're here, they see a bill and they'll say, 'The agent paid for his food.' Anything can happen."

He shrugs. Moss says he doesn't care about the world's judgments anymore, and it's easy to believe he means it. Certainly no player in college football bears more stains on his name. Two and a half years ago, as a high school senior, Moss stomped a kid in a fight, pleaded guilty to two counts of battery and was sentenced to 30 days in jail and a year's probation. That cost him a scholarship to Notre Dame. He enrolled at Florida State. The following spring he broke probation by smoking marijuana, was kicked out of Florida State and served two more months in prison. Then last fall, as Moss was on his way to shattering various NCAA and Marshall records with 28 touchdowns and 1,709 receiving yards as a freshman, he was charged with domestic battery against the mother of his baby daughter.

Yet Moss is not much interested in image-mending. His first words this morning were that he slept through his communications class. His hair is braided in long rows against his skull, a style he knows will give the wrong impression. "People perceive: Only black thug guys have braids," he says, his voice carrying to a dozen tables. "If I want to grow hair, I'll grow it. If I want to wear lipstick and makeup, I'll do that. God didn't put makeup on this world just for women. They perceive me as a thug? I'm not. I'm a gentleman. I know what I am, my mom knows what I am, most people know what I am. Don't judge me until you know me."

Notre Dame did just that, and Moss will never forgive the school for it. "They didn't take me, because they see me as a thug," he says. "Then Florida State...I don't know. You win some, you lose some. That's a loss." Moss pauses, laughs a humorless laugh. "But in the long run I'm going to have the victory. In the long run...victorious."

Moss is sure of this because he has sports' trump card: talent. Better, Moss has the kind of breathtaking athletic gifts seen once in a generation. At 6'5", with a 39-inch vertical leap and 4.25 speed in the 40, he established himself as West Virginia's greatest high school athlete since Jerry West. Irish coach Lou Holtz declared him one of the best high school football players he'd ever seen. Moss was twice named West Virginia's Player of the Year—in basketball. "He does things you've never seen anyone else do," says Jim Fout, Moss's basketball coach at DuPont High in the town of Belle. Moss also ran track for a while. As a sophomore he was the state champ in the 100 and 200 meters.

Felix Vallotton, Road at St. Paul

As Atheists Know, You Can be Good Without God

by Jerry A. Coyne

One cold Chicago day last February, I watched a Federal Express delivery man carry an armful of boxes to his truck. In the middle of the icy street, he slipped, scattering the boxes and exposing himself to traffic. Without thinking, I ran into the street, stopped cars, hoisted the man up and helped him recover his load. Pondering this afterward, I realized that my tiny act of altruism had been completely instinctive; there was no time for calculation.

We see the instinctive nature of moral acts and judgments in many ways: in the automatic repugnance we feel when someone such as Bernie Madoff bilks the gullible and trusting, in our disapproval of the person who steals food from the office refrigerator, in our admiration for someone who risks his life to save a drowning child. And although some morality comes from reason and persuasion — we must learn, for example, to share our toys — much of it seems intuitive and inborn.

Many Americans, including Francis Collins, director of the National Institutes of Health and an evangelical Christian, see instinctive morality as both a gift from God and strong evidence for His existence.

As a biologist, I see belief in God-given morality as America's biggest impediment to accepting the fact of evolution. "Evolution," many argue, "could never have given us feelings of kindness, altruism and morality. For if we were merely evolved beasts, we would act like beasts. Surely our good behavior, and the moral sentiments that promote it, reflect impulses that God instilled in our soul."

So while morality supposedly comes from God, immorality is laid at the door of Charles Darwin, who has been blamed for everything from Nazism to the shootings in Columbine.

Why it couldn't be God

But though both moral and immoral behaviors can be promoted by religions, morality itself — either in individual behavior or social codes — simply cannot come from the will or commands of a God. This has been recognized by philosophers since the time of Plato.

Religious people can appreciate this by considering Plato's question: Do actions become moral simply because they're dictated by God, or are they dictated by God because they are moral? It doesn't take much thought to see that the right answer is the second one. Why? Because if God commanded us to do something obviously immoral, such as kill our children or steal, it wouldn't automatically become OK. Of course, you can argue that God would never sanction something like that because he's a completely moral being, but then you're still using some idea of morality that is independent of God. Either way, it's clear that even for the faithful, God cannot be the source of morality but at best a transmitter of some human-generated morality.

A Brief History of the Corporation: 1600 to 2100

by Venkat

On 8 June, a Scottish banker named Alexander Fordyce shorted the collapsing Company’s shares in the London markets. But a momentary bounce-back in the stock ruined his plans, and he skipped town leaving £550,000 in debt. Much of this was owed to the Ayr Bank, which imploded. In less than three weeks, another 30 banks collapsed across Europe, bringing trade to a standstill. On July 15, the directors of the Company applied to the Bank of England for a £400,000 loan. Two weeks later, they wanted another £300,000. By August, the directors wanted a £1 million bailout. The news began leaking out and seemingly contrite executives, running from angry shareholders, faced furious Parliament members. By January, the terms of a comprehensive bailout were worked out, and the British government inserted its czars into the Company’s management to ensure compliance with its terms.

If this sounds eerily familiar, it shouldn’t. The year was 1772, exactly 239 years ago today, the apogee of power for the corporation as a business construct. The company was the British East India Company (EIC). The bubble that burst was the East India Bubble. Between the founding of the EIC in 1600 and the post-subprime world of 2011, the idea of the corporation was born, matured, over-extended, reined-in, refined, patched, updated, over-extended again, propped-up and finally widely declared to be obsolete. Between 2011 and 2100, it will decline — hopefully gracefully — into a well-behaved retiree on the economic scene.

In its 400+ year history, the corporation has achieved extraordinary things, cutting around-the-world travel time from years to less than a day, putting a computer on every desk, a toilet in every home (nearly) and a cellphone within reach of every human. It even put a man on the Moon and kinda-sorta cured AIDS.

So it is a sort of grim privilege for the generations living today to watch the slow demise of such a spectacularly effective intellectual construct. The Age of Corporations is coming to an end. The traditional corporation won’t vanish, but it will cease to be the center of gravity of economic life in another generation or two. It will live on as religious institutions do today, a weakened ghost of a more vital institution from centuries ago.

It is not yet time for the obituary (and that time may never come), but the sun is certainly setting on the Golden Age of corporations. It is time to review the memoirs of the corporation as an idea, and contemplate a post-corporate future framed by its gradual withdrawal from the center stage of the world’s economic affairs.

Monday, August 1, 2011

Crashing Down


by Brad Melekian

I'm sitting at one end of a 15-foot-long conference table inside Billabong’s U.S. headquarters—a glass-and-steel building in a nondescript office park in Irvine, California, off Interstate 5. It’s late June, and I’ve been summoned here by the surf manufacturer’s CEO, Paul Naude, and his VP of marketing, Graham Stapelberg, both of them South Africans. They have brought highlighted printouts of a story I wrote for Outside’s January issue, “Last Drop,” about the death of Andy Irons, a three-time world surfing champion and Billabong’s top sponsored athlete. It’s clear they mean for me to speak first, to explain myself.

Things are a little tense because, in late November, only weeks after his November 2 death in a Dallas airport hotel room, I wrote about Irons’s history of drug and alcohol abuse, which nearly killed him on at least one occasion. At the time, the family was standing by its initial press release that Irons had “reportedly been battling” the tropical disease dengue fever when he died, and neither they nor Billabong were talking—though one Billabong rep sent an e-mail saying he couldn’t comment but that we could “count on” Irons having died of dengue.

For writing that story, and especially for recounting that 1999 near-death binge-drinking episode in Indonesia, I was threatened by numerous people within the surf industry and accused of spitting on Irons’s grave. Then on June 10, a week prior to my sit-down at Billabong, after multiple legal challenges from Irons’s family, a Texas medical examiner had finally released a toxicology report detailing what killed Irons.

The report should have cleared up any lingering mystery, but that’s not what happened. Tarrant County medical examiner Nizam Peerwani wrote that he’d found evidence of cocaine, methamphetamine, methadone, a generic form of the anti-anxiety drug Xanax, and marijuana in Irons’s system, and the original police report noted that a bottle of sleeping pills was on a table in the hotel room. But he also concluded that Irons had a severely clogged artery and ruled that “the primary and the underlying cause of death is ischemic heart disease.”

What about all those pharmaceuticals? “Drugs,” the report continued, “particularly methadone and cocaine, are other significant factors contributing to death.”

It was the kind of wording you could interpret to suit your biases or needs, which some have done. Members of Irons’s family, surf journalists, and the Association of Surfing Professionals (ASP)—who presumably didn’t want the public to believe that Irons died of a drug overdose—viewed the report as vindication. A statement released by the Irons family in June read, “Traveling while sick and suffering from an undiagnosed heart condition was more than even Andy could overcome.” Bruce Irons, Andy’s brother and also a pro surfer, recently told ESPN, “When we got the results that it was the artery I went and did a test, and my arteries are fine. Now I know and understand deep down inside that it was brother’s time to go.” Editors at the website Surfline tweeted, “Andy Irons died of sudden cardiac arrest due to a blocked artery. His heart was full of passion for life & surfing.” After the results came out, ASP officials agreed to an interview but later backed out, and PR director Dave Prodan sent me this e-mail: “The ASP has no further comment at this time, aside from: The loss of Andy Irons from the sporting world has been devastating, but we feel fortunate enough to have witnessed his incredible accomplishments and unbridled passion for the sport of surfing.”

Read more:
Jerry LoFaro

Heavy Bags

by Larry Dorman

Stepping inside the ropes at a PGA Tour event is not for the faint of heart. It requires physical strength, mental toughness, resiliency, good golf course management, accuracy, a lot of nerve and a thick skin.

And that is just for the caddies. For the players, the ability to drive it long, hit precise iron shots and stroke putts like a metronome is also required.

Lately, the caddie profession has been in the spotlight because of Tiger Woods and his recent public split from his longtime caddie Steve Williams. The Internet has been abuzz on the topic, with a flood of serious — and whimsical — missives expressing a desire to replace Williams in the job, which paid about $1 million a year.

To which the appropriate response is, dream on. First, Woods has filled the job — for how long is anyone’s guess — with Bryon Bell, his childhood friend and the current president of Woods’s golf course design company.

So put Woods aside for a minute and consider what it takes to be a tour caddie. The basic requirements, a strong back, strong legs and strong golf knowledge, are a must. Playing experience at a high level is also a big plus. Must be reliable, flexible, able to travel extensively and make quick decisions under pressure. Salary is negotiable; employment status is subject to change without notice.

That is only the beginning. Then there are the unquantifiable necessities, which include the ability to keep an anxious player calm, get a bored player interested, say the right things at the right times and crack a joke right when your player needs to hear one.

Some or all of those abilities are what separate the top caddies, like Williams, Jim Mackay, Billy Foster, Joe LaCava, Paul Tesori, Ricci Roberts, Tony Navarro, Brett Waldman, Bobby Brown, Mike Cowan, John Wood, Fanny Sunesson, Damon Green, Brennan Little.

Most or all of these have been mentioned as possible long-term candidates for the Woods job. But even the caddies who have not been mentioned are well-schooled in the other skills.

They are traffic cops, psychiatrists and meteorologists. They are chauffeurs, butlers, bodyguards, buddies, sidekicks and frequent dinner companions. When things get really tough, they are guard dogs, attack dogs — or the dog that gets scolded when the man of the house is unhappy.

And, as Lee Westwood put it the other day at the Irish Open when shooting down speculation that his caddie, Foster, was going to replace Williams on Woods’s bag, “Good caddies are worth their weight in gold sometimes.”

How much gold is up to the player and the caddie to work out. The standard formula is pretty much a $1,000-a-week base salary and a 5 percent cut of earnings for a finish outside the top 10, a 7 percent cut for a top-10 finish and a 10 percent cut of a winner’s check. If a player misses the cut and makes nothing, so does the caddie.
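That arithmetic is simple enough to sketch. The Python snippet below is only an illustration of the rough formula described above; the base salary and percentage tiers are the article's round numbers, the prize checks in the usage lines are made up for the sake of the example, and the line about missed cuts is taken literally (no pay at all that week).

def caddie_weekly_pay(check, finish, base=1000):
    """Rough weekly caddie pay under the formula sketched above:
    a $1,000 base plus a percentage of the player's check that
    depends on the finish -- 10% of a winner's check, 7% for a
    top-10 finish, 5% otherwise. `check` is the player's prize
    money for the week.
    """
    if check <= 0:
        # "If a player misses the cut and makes nothing, so does the caddie."
        return 0.0
    if finish == 1:
        cut = 0.10
    elif finish <= 10:
        cut = 0.07
    else:
        cut = 0.05
    return base + cut * check

# Illustrative, made-up checks: a $1.2 million winner's check versus
# a $25,000 check for a mid-pack finish.
print(caddie_weekly_pay(1_200_000, finish=1))   # 121000.0
print(caddie_weekly_pay(25_000, finish=40))     # 2250.0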

Read more:

Sunday, July 31, 2011

Minority Rules

Scientists at Rensselaer Polytechnic Institute have found that when just 10 percent of the population holds an unshakable belief, their belief will always be adopted by the majority of the society. The scientists, who are members of the Social Cognitive Networks Academic Research Center (SCNARC) at Rensselaer, used computational and analytical methods to discover the tipping point where a minority belief becomes the majority opinion. The finding has implications for the study and influence of societal interactions ranging from the spread of innovations to the movement of political ideals.

“When the number of committed opinion holders is below 10 percent, there is no visible progress in the spread of ideas. It would literally take the amount of time comparable to the age of the universe for this size group to reach the majority,” said SCNARC Director Boleslaw Szymanski, the Claire and Roland Schmitt Distinguished Professor at Rensselaer. “Once that number grows above 10 percent, the idea spreads like flame.”

As an example, the ongoing events in Tunisia and Egypt appear to exhibit a similar process, according to Szymanski. “In those countries, dictators who were in power for decades were suddenly overthrown in just a few weeks.”

The findings were published in the July 22, 2011, early online edition of the journal Physical Review E in an article titled “Social consensus through the influence of committed minorities.”

An important aspect of the finding is that the percent of committed opinion holders required to shift majority opinion does not change significantly regardless of the type of network in which the opinion holders are working. In other words, the percentage of committed opinion holders required to influence a society remains at approximately 10 percent, regardless of how or where that opinion starts and spreads in the society.

To reach their conclusion, the scientists developed computer models of various types of social networks. One of the networks had each person connect to every other person in the network. The second model included certain individuals who were connected to a large number of people, making them opinion hubs or leaders. The final model gave every person in the model roughly the same number of connections. The initial state of each of the models was a sea of traditional-view holders. Each of these individuals held a view, but were also, importantly, open minded to other views.

Once the networks were built, the scientists then “sprinkled” in some true believers throughout each of the networks. These people were completely set in their views and unwavering in those beliefs. As those true believers began to converse with those who held the traditional belief system, the tides gradually and then very abruptly began to shift.
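For readers who want to poke at the idea, here is a minimal sketch in Python of one model in that family: a binary-agreement ("two-word naming game") dynamic with committed agents interacting on a complete graph. It is an illustrative toy, not the SCNARC code; the population size, step count, committed fractions and seed below are arbitrary assumptions, and the paper itself examines several network topologies.

import random

def committed_minority_sim(n=500, committed_frac=0.10, steps=300_000, seed=0):
    """Binary agreement model with a committed minority on a complete graph.

    Each agent holds an opinion set: {'A'}, {'B'}, or {'A', 'B'}.
    Agents with index < n_committed are the committed minority: they
    hold {'A'} and never change. Everyone else starts with {'B'}.
    Returns the final fraction of agents holding only 'A'.
    """
    rng = random.Random(seed)
    n_committed = int(n * committed_frac)
    opinions = [{'A'} for _ in range(n_committed)] + \
               [{'B'} for _ in range(n - n_committed)]

    for _ in range(steps):
        speaker, listener = rng.sample(range(n), 2)
        word = rng.choice(sorted(opinions[speaker]))
        if word in opinions[listener]:
            # Agreement: both collapse to the shared word,
            # except committed agents, who never budge.
            if speaker >= n_committed:
                opinions[speaker] = {word}
            if listener >= n_committed:
                opinions[listener] = {word}
        elif listener >= n_committed:
            # Disagreement: the (uncommitted) listener adds the new word.
            opinions[listener].add(word)

    return sum(op == {'A'} for op in opinions) / n

if __name__ == "__main__":
    # Well below the roughly 10 percent threshold, the 'A' camp stays pinned
    # near the committed share; well above it, the population tips toward
    # consensus on 'A'. Exact values depend on run length and seed.
    for frac in (0.04, 0.08, 0.12, 0.16):
        print(f"{frac:.2f} committed -> {committed_minority_sim(committed_frac=frac):.2f} hold A")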

Read more:

Obama: His Words and His Deeds

by David Bromwich


In early June, a constitutional crisis faced Barack Obama over his defiance of the War Powers Act of 1973. The law requires the President to seek approval by Congress within sixty days of committing American forces to an armed conflict anywhere in the world. Two resolutions emerged and were debated in Congress to force compliance from Obama. One, drafted by the Speaker of the House, John Boehner, called for the President to give a justification of US actions in Libya. On June 3, the Boehner resolution passed by a vote of 268–145. An alternative resolution, drafted by Dennis Kucinich, the best-known anti-interventionist among Democrats, would have called for US withdrawal from Libya within fifteen days. The Kucinich resolution was defeated 148–265.

The debate and the two votes were the first major signs of congressional resistance to the aggrandizement of executive power begun by George W. Bush in Afghanistan and Iraq and continued by Obama in Afghanistan and Libya. The reasons the President had cited in a letter to Congress for his circumvention of congressional approval of his actions in Libya betrayed a curious mixture of arrogance and disregard for the War Powers Act. The US military role in Libya, Obama said, was subordinate, and, since NATO was now in command, the Libya war hardly qualified as a war. Congress was free to discuss the matter if it liked, and he would welcome its approval, but in his view he acted within his legal powers in giving the orders without approval.

Few members of Congress as yet hold a fully articulated objection to America’s wars in Asia and North Africa. But other causes in play may trouble the President’s determination to show his sympathy with the Arab Spring by military action in Libya. Obama has an unfortunate propensity to be specific when it would serve him well to avoid particulars, and to become vague at times when dates, names, numbers, or “a line in the sand” is what is needed to clarify a policy. On Libya, he was specific. He said the American commitment would last “days, not weeks.” It has now lasted a little under three months. Reliable reporters such as Anthony Shadid of The New York Times and Patrick Cockburn of The Independent have suggested that an end to the conflict is nowhere in sight.

The narrow aim of enforcing a “no-fly zone” to protect civilians, asserted by Susan Rice and Hillary Clinton as the limit of American aims, turns out to have been a wedge for an air war against Qaddafi, a war, in fact, as thorough as is compatible with avoidance of harm to civilians. The surest thing one can say about the end of this engagement is that the US—along with France, Great Britain, and perhaps also Italy, which arranged the intervention—will at some point install a client state and fit out a friendly government with a democratic constitution. Nothing about the war affords any insight into the intermediate calculations of Obama and his collaborators, Nicolas Sarkozy and David Cameron.

Obama was in BrasĂ­lia on March 19 when he announced his authorization of “limited military action” in Libya. For that matter, he has been away from Washington for a large part of his two and a half years as president. This fact may be dwelt on excessively by his detractors, especially at Fox News, but its importance is scarcely acknowledged by his allies. (According to figures compiled at the end of 2010 by the CBS reporter Mark Knoller, Obama’s first twenty-three months in office saw seventy days on foreign trips and fifty-eight days on vacation trips.) He has gambled that it pays to present himself as a statesman above the scramble of something disagreeable called Washington.

Here he follows a path trodden by almost all his predecessors. Carter, Reagan, Clinton, and George W. Bush all affected the stance of outsider; only Bush Senior scorned to adopt the tactic (and could not have gotten away with it if he tried). Nor does taking such a position confer an automatic advantage. It worked well for Reagan until the Iran-contra scandal in 1986. Clinton was helped and hurt in about equal parts by the outsider pretense. For Carter and the younger Bush, it seems to have added to the impression of incompetence or disengagement. People came to think that there were things these men could have learned from Washington.

The anti-Washington tactic, and the extensive travel it licenses, have not worked well for Obama. He retains the wish to be seen as a man above party; and a more general distaste for politics is also involved. But what is Barack Obama if not a politician? By his tones of voice and selection of venues he has implied several possibilities: organizer, pastor, school principal, counselor on duties and values. Most prominently, over the past six months he seems to have improvised the role (from materials left behind by Reagan) of a kind of national host or “moderator” of the concerns of Americans. From mid-2009 through most of 2010, Obama embarked on solo missions to shape public opinion at town hall meetings and talk show bookings, but the preferred format now appears to be the craftily timed and planned and much-heralded ecumenical address. Obama’s televised speech on January 12 at the memorial service after the Tucson shooting was his first major venture on those lines. His speech on May 19 at the State Department was the second; and its announced subject was even more ambitious: the entire domain of US policy in the Middle East.

Being president of the world has sometimes seemed a job more agreeable to Barack Obama than being president of the United States. This goes with another predilection. Obama has always preferred the symbolic authority of the grand utterance to the actual authority of a directed policy: a policy fought for in particulars, carefully sustained, and traceable to his own intentions. The danger of the built-up speech venues—the Nobel Prize speech of December 2009 was another example—is that they cast Obama as the most famous holder-forth in the world, and yet it is never clear what follows for him from the fact that the world is listening. These settings make a president who is now more famous than popular seem not popular but galactic.

Read more: