Thursday, August 4, 2011

Growing Old or Living Long

by Ann Conkle

Aging. To many people it’s wrinkles, retirement communities, and a steady decline in the ability to remember things. But before you reach for the Botox or buy a sports car, you might be interested in research by APS Fellow and Charter Member Laura Carstensen of Stanford University. In her recent lecture, “Growing Old or Living Long: Take Your Pick,” this year’s Henry and Bryna David Lecture at the National Academy of Sciences, Carstensen presented evidence that the future is not so grim.

Aging is an undeniable issue in today’s developed nations. According to Carstensen, “More years were added to average life expectancy in the 20th century than all increases in all other millennia combined.” For most of human existence, life expectancy hovered around 27. It increased during the 18th and 19th centuries, hitting about 47 at the turn of the 20th century. During the 20th century, life expectancy almost doubled, reaching 77 at the century’s close. This shift has created an entirely new life stage, one that essentially did not exist 100 years ago. Today, the health of older adults affects almost all aspects of society, from family life to finances and politics. It’s time to use what Carstensen describes as our “breathtaking” scientific capacity to create a world in which this new set of older people can thrive.

Most research on aging focuses on and affirms a steady decline in cognitive ability as we get older. (Keep in mind that this decline is a lifelong process; as Carstensen likes to remind her undergraduate students, the slope is just as steep from 20 to 40 as it is from 60 to 80.) Studies have shown that working memory, perceptual speed, comprehension of text and language, and word-finding ability do decline with age. But it is important to remember that while these abilities to process new information may be degraded, they are not eliminated. People continue to learn, increasing in expertise and knowledge as they age. It is not just a simple story of decline.

Carstensen’s aging research has focused on motivation. Her Socioemotional Selectivity Theory describes how goals change as we age, based on two key concepts. First, humans are the only species whose members have a sense of where they are in the life cycle, an awareness that operates both consciously and subconsciously throughout their lives. Second, we always consider the temporal context when setting goals. If you ran into a friend unexpectedly and only had a few minutes to chat, for example, you would have a different conversation than you would if you were sitting down to an hour-long lunch. Therefore, because we are aware of our position in the life cycle and because our goals are affected by temporal context, our goals will change throughout our lifetime as our temporal context changes.

For young people, the future seems expansive. They tend to seek out new things and new people. Everything is interesting because it could be useful in some unforeseen future situation. For older people, however, the future is more limited. They tend to turn their attention to the present, focusing on relationships with important people already in their lives rather than on new things; they are motivated to pursue emotionally meaningful goals. This difference is supported by experiments in which people are asked to identify their goals or pick from a set of goals. When the experiment constrains the future (e.g., imagine you are about to move across the country alone; now choose your goals), both the young and the old pick more emotional goals. When the future is expanded (e.g., your doctor just called about a new treatment that will add 20 years to your life), both groups pick more informational goals.

Read more:

image credit:

Audience Atomization Overcome: Why the Internet Weakens the Authority of the Press

[ed.  It looks like this will be Jay Rosen week.  Following a link in his recent interview with Sophie Roell (below) I found this post from 2009.  I don't know how I missed it.  It's a fascinating distillation of media behavior and its effect on political dialogue.  Well worth checking out, including the responses he received following its publication.]


"In the age of mass media, the press was able to define the sphere of legitimate debate with relative ease because the people on the receiving end were atomized-- connected "up" to Big Media but not across to each other. And now that authority is eroding. I will try to explain why."


It’s easily the most useful diagram I’ve found for understanding the practice of journalism in the United States, and the hidden politics of that practice. You can draw it by hand right now. Take a sheet of paper and make a big circle in the middle. In the center of that circle draw a smaller one to create a doughnut shape. Label the doughnut hole “sphere of consensus.” Call the middle region “sphere of legitimate debate,” and the outer region “sphere of deviance.”

Daniel C. Hallin's Spheres of Consensus, Controversy and Deviance

That’s the entire model. Now you have a way to understand why it’s so unproductive to argue with journalists about the deep politics of their work. They don’t know about this freakin’ diagram! Here it is in its original form, from the 1986 book The Uncensored War by press scholar Daniel C. Hallin. Hallin felt he needed something more supple—and truthful—than calcified notions like objectivity and “opinions are confined to the editorial page.” So he came up with this diagram.

Let’s look more carefully at his three regions.

1.) The sphere of legitimate debate is the one journalists recognize as real, normal, everyday terrain. They think of their work as taking place almost exclusively within this space. (It doesn’t, but they think so.) Hallin: “This is the region of electoral contests and legislative debates, of issues recognized as such by the major established actors of the American political process.”

Here the two-party system reigns, and the news agenda is what the people in power are likely to have on their agenda. Perhaps the purest expression of this sphere is Washington Week on PBS, where journalists discuss what the two-party system defines as “the issues.” Objectivity and balance are “the supreme journalistic virtues” for the panelists on Washington Week because when there is legitimate debate it’s hard to know where the truth lies. There are risks in saying that truth lies with one faction in the debate, as against another—even when it does. He said, she said journalism is like the bad seed of this sphere, but also a logical outcome of it.

2.) The sphere of consensus is the “motherhood and apple pie” of politics, the things on which everyone is thought to agree. Propositions that are seen as uncontroversial to the point of boring, true to the point of self-evident, or so widely held that they’re almost universal lie within this sphere. Here, Hallin writes, “journalists do not feel compelled either to present opposing views or to remain disinterested observers.” (Which means that anyone whose basic views lie outside the sphere of consensus will experience the press not just as biased but savagely so.)

Consensus in American politics begins, of course, with the United States Constitution, but it includes other propositions too, like “Lincoln was a great president,” and “it doesn’t matter where you come from, you can succeed in America.” Whereas journalists equate ideology with the clash of programs and parties in the debate sphere, academics know that the consensus or background sphere is almost pure ideology: the American creed.

3.) In the sphere of deviance we find “political actors and views which journalists and the political mainstream of society reject as unworthy of being heard.” As in the sphere of consensus, neutrality isn’t the watchword here; journalists maintain order by either keeping the deviant out of the news entirely or identifying it within the news frame as unacceptable, radical, or just plain impossible. The press “plays the role of exposing, condemning, or excluding from the public agenda” the deviant view, says Hallin. It “marks out and defends the limits of acceptable political conduct.”

Anyone whose views lie within the sphere of deviance—as defined by journalists—will experience the press as an opponent in the struggle for recognition. If you don’t think separation of church and state is such a good idea; if you do think a single payer system is the way to go; if you dissent from the “lockstep behavior of both major American political parties when it comes to Israel” (Glenn Greenwald) chances are you will never find your views reflected in the news. It’s not that there’s a one-sided debate; there’s no debate.

Wednesday, August 3, 2011

Sueellen Ross - Ennui
via:

Books Without Borders

My Life at the World's Dumbest Bookstore Chain.
by Paul Constant

It's embarrassing now, but on the day that I was hired to work at Boston's flagship Borders store in 1996, I was so happy that I danced around my apartment. After dropping out of college, I had worked a succession of crappy jobs: mall Easter Bunny, stock boy at Sears and Kmart and Walmart, a brief and nearly fatal stint as a landscaper. A job at Borders seemed to be a step, at long last, toward my ultimate goal of writing for a living. At least I would be working with books. And the scruffy Borders employees, in their jeans and band T-shirts, felt a lot closer to my ideal urban intellectuals than the stuffy Barnes & Noble employees with their oppressive dress code and lame vests.

The fact that Borders offered me a full-time job, which allowed me to quit two part-time jobs (at a Staples and a Stop & Shop) and offered health insurance (that promised to help pay for my impending wisdom tooth extraction), was a pretty big deal, too.

For better and for worse, Borders was my college experience. I behaved badly—fucked, drank, and did drugs with everyone I could. My fellow employees snuck me into bars when I was underage, and then cheered when, during my 21st birthday party, I wound up facedown in the gutter sobbing about how my heart had been ripped in two by an ex-fiancée. I was not alone in my bad behavior: Every week, different employees were hooking up, having affairs, breaking up, recoupling, playing drinking games that involved comically large hunting knives, getting in fights, getting pregnant, and showing up drunk for work.

In the beginning, the store felt like a tight-knit family. As time went on, we became a confederation of hedonists with little regard for one another's feelings. At one Christmas party that I didn't attend, a new female employee reportedly gave blowjobs to anybody who wanted one. (Later, at least a couple of men who stood in line for the newbie's ministrations complained about picking up an STD.) Suddenly, the parties weren't as fun anymore. One employee hanged himself. Another died of a heart attack in the DVD section on the overnight replenishment shift and wasn't discovered until the store opened for business the next morning.

But it wasn't all an endless cycle of party and hangover. The 20 percent discount—plus an employee credit account that went up to $300, with the store paying off $20 of that debt a month—allowed me to explore books I'd never heard of. It's hard to remember now, but when Borders began proliferating in suburban parking lots around the country, they had a truly excellent selection curated, at least in part, by each store's employees. I bought my first title from countercultural Washington press Feral House—Apocalypse Culture—at the brand-new Borders at the Maine Mall when I was a teenager, and it still ranks as one of my most mind-blowing reading experiences. I read my first David Foster Wallace and Matt Ruff books while working at Borders; I explored the lesser-known works of Twain and Melville and Dickens and St. Vincent Millay. I learned who Edward Abbey and Noam Chomsky and Kathy Acker were. I discovered young writers like Banana Yoshimoto and Colson Whitehead and Chuck Palahniuk and Haruki Murakami. Thanks to my coworkers in the music department, which was just as far-reaching as the book department, I learned to love Miles Davis and Glenn Gould and an obscure punk band from way out west called Sleater-Kinney.

At the time, independent bookstores were blaming Borders for a spate of mom-and-pop bookstore closures around the country. I'll never forget the employee at Bookland in Maine who coldly accused me of single-handedly destroying her small chain when I admitted who my employer was, even as I was buying $50 worth of books from her. Of course, the accusations had truth to them—small bookstores simply couldn't compete with the deep discounts the chains offered—but for what it's worth, every employee who worked at Borders, at least when I first joined the company, adored literature. We were not automatons out to assassinate local business. We wanted to work with the cultural artifacts that were the most important things in our lives, the things that made us who we were. Not all of us could find work at independent bookstores, so we did the next best thing: We went to work for a company that seemingly cared about quality literature and regional reading tastes, and gave its employees a small-but-fair wage for full-time bookselling careers, with excellent benefits. It sure didn't feel like selling out.

Until suddenly, one day, it did feel like selling out. Because it was. Our displays were bought and paid for by publishers; where we used to present books that we loved and wanted to champion, now mediocre crap was piled on every flat surface. The front of the store, with all the kitchen magnets and board games and junk you don't need, took over large chunks of the expansive magazine and local-interest sections. Orders came from the corporate headquarters in Ann Arbor every Sunday to change out the displays. One time I had to take down some of the store's most exciting up-and-coming fiction titles (including a newly published book that was gathering word-of-mouth buzz, thanks to our booksellers, called Harry Potter and the Sorcerer's Stone) to put up a wall of Clash CDs. One month, for some reason, the cafe sold Ernest Hemingway–branded chai.

Read more:
RLyonsArt
via:

Shark Week: Remembering Bruce

by Nicholas Jackson

There are only a few dozen shark attacks on humans every year. It has been widely reported that you are 30 times more likely to die from a lightning strike than you are from an attack. In 2003, Reuters ran a story claiming that more people are killed by vending machines each year than are killed by sharks. And yet, I would bet that just about anybody who has spent time at the beach has thought about the possibility of an attack. I know I certainly have. Before dipping so much as a toe into the ocean, I scan the horizon for a dark, approaching shadow from the deep. And I thank Steven Spielberg for that.

In 1975, Spielberg released the first of what would become a franchise. Jaws was a landmark horror-thriller, recognized by everyone from Empire magazine (fifth greatest film ever made) to the New York Times (one of the 1,000 best movies ever) to the American Film Institute (number 48 on the "100 Years... 100 Movies" list). It won three Academy Awards and was even nominated for Best Picture. (It lost to One Flew Over the Cuckoo's Nest.) Perhaps more importantly, the movie created the wide-release summer blockbuster, a tradition of providing big-budget thrills in every major theater across America during the hottest months of the year that continues to this day. Jaws brought in more money than any other film and held that title until George Lucas released Star Wars two years later.

An instant classic, Jaws received rave reviews. Roger Ebert called it "a sensationally effective action picture, a scary thriller that works all the better because it's populated with characters that have been developed into human beings we get to know and care about." There's Roy Scheider as Brody, the police chief who we can all identify with, who doesn't like to swim, who is genuinely terrified of the water. There's Robert Shaw as Quint, "a caricature of the crusty old seafaring salt," as Ebert put it in that 1975 write-up. There's Hooper, the rich-kid-turned-oceanographer played by Richard Dreyfuss, just off a string of successes as the nice kid in American Graffiti and the title character in the Canadian hit The Apprenticeship of Duddy Kravitz. But the most important character -- and, in many ways, one of the most human -- is the shark itself.

Everyone knows the story by now: The shark is a great white that terrorizes a small resort town during the Fourth of July weekend, a weekend critical to the economy of this little village. In an effort to track down and kill the shark, these three men leave their families behind (where applicable) and set out on a rickety boat. It's leaky. It's too small. It's old. This boat, we know from the outset, just isn't cut out for shark hunting. At least not hunting sharks of the size we suspect this great white to be.

"There are no doubt supposed to be all sorts of levels of meanings in such an archetypal story," Ebert notes. But he doesn't bother writing about them or trying to figure them out. And neither does Spielberg. "This is an action film content to stay entirely within the perimeters of its story, and none of the characters has to wade through speeches expounding on the significance of it all." And what an action film it is. This isn't just about the dark shadow from the deep -- though it is that, too. Before the story comes to an end, many individuals both on and off the island have been killed in a series of terrifying scenes that allow you to get up close and personal with the shark.

The only reason this works -- the only reason that theatergoers in the 1970s left their seats terrified of these macropredatory beasts and that modern viewers can't turn off the lights when screening the film in their own living rooms -- is the craftsmanship and technology that went into creating the main character: Jaws.

In early May of 1974, the rights to Peter Benchley's book of the same name had been acquired, the contracts had been signed by Spielberg, and principal photography began on Martha's Vineyard. It could have failed. By all accounts, it probably should have failed. Spielberg, not yet 30, was largely untested as a director of big-budget productions, and nothing was in place. "We started the film without a script, without a cast and without a shark," Richard Dreyfuss would tell James Lipton during a taping of Inside the Actors Studio years later. But the cast would come together. And the shark was already in the works.

Read more:

Plan B




Jay Rosen on Journalism in the Internet Age

by Sophie Roell

In a break from our usual practice of focusing on books, we asked the journalism analyst and veteran blogger to recommend five articles illustrating the upheavals of the news business

I know that as journalists we have to adapt rapidly to new ways of doing things, but you've really thrown me in at the deep end – you’ve chosen five online articles instead of five books, and we’re doing the interview on Google chat rather than by telephone.

I like to do things differently. For example, using PressThink for longform blogging, which wasn't the normal thing at the time, in 2003.

Will you give me an overall sense of what you are saying about changes in journalism with the articles you've chosen?

Well, first there's been a shift in power. The users have more than they did because they can publish and connect to one another, not just to the media. Second, the people formerly known as the audience are configured differently. They are connected horizontally as well as vertically, which is why today we speak of social media. This is what I sometimes call “audience atomisation overcome”. Third, the media still have power and journalism still matters. In some ways the essence of it has not changed. But a lot of what journalists did became bound up with particular forms of production and distribution. Since the web has radically altered those forms, it has radically changed journalistic work, even though the value of good journalism remains the same – timely, accurate, useful information that tells us what's happening in our world over the horizon of our personal experience.
Corrado Vanelli
via:

Enter the Cyber-dragon

by Michael Joseph Gross


Hackers have attacked America’s defense establishment, as well as companies from Google to Morgan Stanley to security giant RSA, and fingers point to China as the culprit. The author gets an exclusive look at the raging cyber-war—Operation Aurora! Operation Shady rat!—and learns why Washington has been slow to fight back.


Lying there in the junk-mail folder, in the spammy mess of mortgage offers and erectile-dysfunction drug ads, an e-mail from an associate with a subject line that looked legitimate caught the man’s eye. The subject line said “2011 Recruitment Plan.” It was late winter of 2011. The man clicked on the message, downloaded the attached Excel spreadsheet file, and unwittingly set in motion a chain of events allowing hackers to raid the computer networks of his employer, RSA. RSA is the security division of the high-tech company EMC. Its products protect computer networks at the White House, the Central Intelligence Agency, the National Security Agency, the Pentagon, the Department of Homeland Security, most top defense contractors, and a majority of Fortune 500 corporations.

The parent company disclosed the breach on March 17 in a filing with the Securities and Exchange Commission. The hack gravely undermined the reputation of RSA’s popular SecurID security service. As spring gave way to summer, bloggers and computer-security experts found evidence that the attack on RSA had come from China. They also linked the RSA attack to the penetration of computer networks at some of RSA’s most powerful defense-contractor clients—among them, Lockheed Martin, Northrop Grumman, and L-3 Communications. Few details of these episodes have been made public.

The RSA and defense-contractor hacks are among the latest battles in a decade-long spy war. Hackers from many countries have been exfiltrating—that is, stealing—intellectual property from American corporations and the U.S. government on a massive scale, and Chinese hackers are among the main culprits. Because virtual attacks can be routed through computer servers anywhere in the world, it is almost impossible to attribute any hack with total certainty. Dozens of nations have highly developed industrial cyber-espionage programs, including American allies such as France and Israel. And because the People’s Republic of China is such a massive entity, it is impossible to know how much Chinese hacking is done on explicit orders from the government. In some cases, the evidence suggests that government and military groups are executing the attacks themselves. In others, Chinese authorities are merely turning a blind eye to illegal activities that are good for China’s economy and bad for America’s. Last year Google became the first major company to blow the whistle on Chinese hacking when it admitted to a penetration known as Operation Aurora, which also hit Intel, Morgan Stanley, and several dozen other corporations. (The attack was given that name because the word “aurora” appears in the malware that victims downloaded.) Earlier this year, details concerning the most sweeping intrusion since Operation Aurora were discovered by the cyber-security firm McAfee. Dubbed “Operation Shady rat,” the attacks (of which more later) are being reported here for the first time. Most companies have preferred not to talk about or even acknowledge violations of their computer systems, for fear of panicking shareholders and exposing themselves to lawsuits—or for fear of offending the Chinese and jeopardizing their share of that country’s exploding markets. The U.S. government, for its part, has been fecklessly circumspect in calling out the Chinese.

A scattered alliance of government insiders and cyber-security experts are working to bring attention to the threat, but because of the topic’s extreme sensitivity, much of their consciousness-raising activity must be covert. The result in at least one case, according to documents obtained by Vanity Fair, has been a surreal new creation of American bureaucracy: government-directed “hacktivism,” in which an intelligence agency secretly provides information to a group of private-sector hackers so that truths too sensitive for the government to tell will nevertheless come out.

This unusual project began in March, when National Security Agency officials asked a private defense contractor to organize a cadre of elite non-government experts to study the RSA cyber-attacks. The experts constituted a SEAL Team Six of cyber-security and referred to their work as Operation Starlight. “This is the N.S.A. outsourcing the finger-pointing to the private sector,” says one person who was invited to join the group and has been privy to its e-mail logs. The N.S.A. provided Operation Starlight with the data it needed for its forensic analysis.

Longing

by Traer Scott

"This is from my recent trip to Thailand where I was commissioned to photograph Asian Elephants, centering around those at Boon Lott's Elephant Sanctuary in Baan Tuek. This beautiful elephant looks on as her mahout cuddles baby Noah. She had recently lost her calf who was struck by lightening."

via:

Tuesday, August 2, 2011

 
Artwork by Will Varner
via:

Dave Matthews, Tim Reynolds


"Nine planets around the sun
Only one does the sun embrace
Upon this watered one
So much we take for granted..."

(lyrics)

Cut Off From The Herd

by S.L. Price, Sports Illustrated
August 25, 1997

[ed.  Interesting read after having the benefit of surveying Randy Moss's career, 14 years down the road.]

Everybody's watching him. Randy Moss can feel the eyes of the lunchtime crowd at the Bob Evans restaurant, the double takes and furtive glances from the men in short sleeves and wide ties. He's got his act down: gray hood over his head, butt slumped in the booth, eyes as lifeless as buttons. Moss is a wide receiver at Marshall University, in Huntington, W.Va., and he figures to be rich before long. He jabs at his toast with a plastic straw.

"If I didn't have this hood on, and they saw us sitting here, people would say an agent picked up Randy Moss and took him to Bob Evans," he says. "That's why I got this hood on. Some people are looking, and some are not. Some know I'm here and you're here, they see a bill and they'll say, 'The agent paid for his food.' Anything can happen."

He shrugs. Moss says he doesn't care about the world's judgments anymore, and it's easy to believe he means it. Certainly no player in college football bears more stains on his name. Two and a half years ago, as a high school senior, Moss stomped a kid in a fight, pleaded guilty to two counts of battery and was sentenced to 30 days in jail and a year's probation. That cost him a scholarship to Notre Dame. He enrolled at Florida State. The following spring he broke probation by smoking marijuana, was kicked out of Florida State and served two more months in prison. Then last fall, as Moss was on his way to shattering various NCAA and Marshall records with 28 touchdowns and 1,709 receiving yards as a freshman, he was charged with domestic battery against the mother of his baby daughter.

Yet Moss is not much interested in image-mending. His first words this morning were that he slept through his communications class. His hair is braided in long rows against his skull, a style he knows will give the wrong impression. "People perceive: Only black thug guys have braids," he says, his voice carrying to a dozen tables. "If I want to grow hair, I'll grow it. If I want to wear lipstick and makeup, I'll do that. God didn't put makeup on this world just for women. They perceive me as a thug? I'm not. I'm a gentleman. I know what I am, my mom knows what I am, most people know what I am. Don't judge me until you know me."

Notre Dame did just that, and Moss will never forgive the school for it. "They didn't take me, because they see me as a thug," he says. "Then Florida State...I don't know. You win some, you lose some. That's a loss." Moss pauses, laughs a humorless laugh. "But in the long run I'm going to have the victory. In the long run...victorious."

Moss is sure of this because he has sports' trump card: talent. Better, Moss has the kind of breathtaking athletic gifts seen once in a generation. At 6'5", with a 39-inch vertical leap and 4.25 speed in the 40, he established himself as West Virginia's greatest high school athlete since Jerry West. Irish coach Lou Holtz declared him one of the best high school football players he'd ever seen. Moss was twice named West Virginia's Player of the Year—in basketball. "He does things you've never seen anyone else do," says Jim Fout, Moss's basketball coach at DuPont High in the town of Belle. Moss also ran track for a while. As a sophomore he was the state champ in the 100 and 200 meters.

Felix Vallotton, Road at St. Paul
via:

As Atheists Know, You Can be Good Without God

by Jerry A. Coyne

One cold Chicago day last February, I watched a Federal Express delivery man carry an armful of boxes to his truck. In the middle of the icy street, he slipped, scattering the boxes and exposing himself to traffic. Without thinking, I ran into the street, stopped cars, hoisted the man up and helped him recover his load. Pondering this afterward, I realized that my tiny act of altruism had been completely instinctive; there was no time for calculation.

We see the instinctive nature of moral acts and judgments in many ways: in the automatic repugnance we feel when someone such as Bernie Madoff bilks the gullible and trusting, in our disapproval of the person who steals food from the office refrigerator, in our admiration for someone who risks his life to save a drowning child. And although some morality comes from reason and persuasion — we must learn, for example, to share our toys — much of it seems intuitive and inborn.

Many Americans, including Francis Collins, director of the National Institutes of Health and an evangelical Christian, see instinctive morality as both a gift from God and strong evidence for His existence.

As a biologist, I see belief in God-given morality as America's biggest impediment to accepting the fact of evolution. "Evolution," many argue, "could never have given us feelings of kindness, altruism and morality. For if we were merely evolved beasts, we would act like beasts. Surely our good behavior, and the moral sentiments that promote it, reflect impulses that God instilled in our soul."

So while morality supposedly comes from God, immorality is laid at the door of Charles Darwin, who has been blamed for everything from Nazism to the shootings in Columbine.

Why it couldn't be God

But though both moral and immoral behaviors can be promoted by religions, morality itself — either in individual behavior or social codes — simply cannot come from the will or commands of a God. This has been recognized by philosophers since the time of Plato.

Religious people can appreciate this by considering Plato's question: Do actions become moral simply because they're dictated by God, or are they dictated by God because they are moral? It doesn't take much thought to see that the right answer is the second one. Why? Because if God commanded us to do something obviously immoral, such as kill our children or steal, it wouldn't automatically become OK. Of course, you can argue that God would never sanction something like that because he's a completely moral being, but then you're still using some idea of morality that is independent of God. Either way, it's clear that even for the faithful, God cannot be the source of morality but at best a transmitter of some human-generated morality.

A Brief History of the Corporation: 1600 to 2100

by Venkat

On 8 June, a Scottish banker named Alexander Fordyce shorted the collapsing Company’s shares in the London markets. But a momentary bounce-back in the stock ruined his plans, and he skipped town leaving £550,000 in debt. Much of this was owed to the Ayr Bank, which imploded. In less than three weeks, another 30 banks collapsed across Europe, bringing trade to a standstill. On July 15, the directors of the Company applied to the Bank of England for a £400,000 loan. Two weeks later, they wanted another £300,000. By August, the directors wanted a £1 million bailout. The news began leaking out and seemingly contrite executives, running from angry shareholders, faced furious Parliament members. By January, the terms of a comprehensive bailout were worked out, and the British government inserted its czars into the Company’s management to ensure compliance with its terms.

If this sounds eerily familiar, it shouldn’t. The year was 1772, exactly 239 years ago today, the apogee of power for the corporation as a business construct. The company was the British East India Company (EIC). The bubble that burst was the East India Bubble. Between the founding of the EIC in 1600 and the post-subprime world of 2011, the idea of the corporation was born, matured, over-extended, reined-in, refined, patched, updated, over-extended again, propped-up and finally widely declared to be obsolete. Between 2011 and 2100, it will decline — hopefully gracefully — into a well-behaved retiree on the economic scene.

In its 400+ year history, the corporation has achieved extraordinary things, cutting around-the-world travel time from years to less than a day, putting a computer on every desk, a toilet in every home (nearly) and a cellphone within reach of every human. It even put a man on the Moon and kinda-sorta cured AIDS.

So it is a sort of grim privilege for the generations living today to watch the slow demise of such a spectacularly effective intellectual construct. The Age of Corporations is coming to an end. The traditional corporation won’t vanish, but it will cease to be the center of gravity of economic life in another generation or two. They will live on as religious institutions do today, as weakened ghosts of more vital institutions from centuries ago.

It is not yet time for the obituary (and that time may never come), but the sun is certainly setting on the Golden Age of corporations. It is time to review the memoirs of the corporation as an idea, and contemplate a post-corporate future framed by its gradual withdrawal from the center stage of the world’s economic affairs.

Monday, August 1, 2011

Crashing Down


by Brad Melekian

I'm sitting at one end of a 15-foot-long conference table inside Billabong’s U.S. headquarters—a glass-and-steel building in a nondescript office park in Irvine, California, off Interstate 5. It’s late June, and I’ve been summoned here by the surf manufacturer’s CEO, Paul Naude, and his VP of marketing, Graham Stapelberg, both of them South Africans. They have brought highlighted printouts of a story I wrote for Outside’s January issue, “Last Drop,” about the death of Andy Irons, a three-time world surfing champion and Billabong’s top sponsored athlete. It’s clear they mean for me to speak first, to explain myself.

Things are a little tense because, in late November, only weeks after his November 2 death in a Dallas airport hotel room, I wrote about Irons’s history of drug and alcohol abuse, which nearly killed him on at least one occasion. At the time, the family was standing by its initial press release that Irons had “reportedly been battling” the tropical disease dengue fever when he died, and neither they nor Billabong were talking—though one Billabong rep sent an e-mail saying he couldn’t comment but that we could “count on” Irons having died of dengue.

For writing that story, and especially for recounting that 1999 near-death binge-drinking episode in Indonesia, I was threatened by numerous people within the surf industry and accused of spitting on Irons’s grave. Then on June 10, a week prior to my sit-down at Billabong, after multiple legal challenges from Irons’s family, a Texas medical examiner had finally released a toxicology report detailing what killed Irons.

The report should have cleared up any lingering mystery, but that’s not what happened. Tarrant County medical examiner Nizam Peerwani wrote that he’d found evidence of cocaine, methamphetamine, methadone, a generic form of the anti-anxiety drug Xanax, and marijuana in Irons’s system, and the original police report noted that a bottle of sleeping pills was on a table in the hotel room. But he also concluded that Irons had a severely clogged artery and ruled that “the primary and the underlying cause of death is ischemic heart disease.”

What about all those pharmaceuticals? “Drugs,” the report continued, “particularly methadone and cocaine, are other significant factors contributing to death.”

It was the kind of wording you could interpret to suit your biases or needs, which some have done. Members of Irons’s family, surf journalists, and the Association of Surfing Professionals (ASP)—who presumably didn’t want the public to believe that Irons died of a drug overdose—viewed the report as vindication. A statement released by the Irons family in June read, “Traveling while sick and suffering from an undiagnosed heart condition was more than even Andy could overcome.” Bruce Irons, Andy’s brother and also a pro surfer, recently told ESPN, “When we got the results that it was the artery I went and did a test, and my arteries are fine. Now I know and understand deep down inside that it was brother’s time to go.” Editors at the website Surfline tweeted, “Andy Irons died of sudden cardiac arrest due to a blocked artery. His heart was full of passion for life & surfing.” After the results came out, ASP officials agreed to an interview but later backed out, and PR director Dave Prodan sent me this e-mail: “The ASP has no further comment at this time, aside from: The loss of Andy Irons from the sporting world has been devastating, but we feel fortunate enough to have witnessed his incredible accomplishments and unbridled passion for the sport of surfing.”

Read more:
Jerry LoFaro
via: