Friday, August 5, 2011
Centenarians Have Bad Habits Too
by Anahad O'Connor
For those of us relying on healthy habits to get us to age 100, the findings from a new study of centenarians may come as a bit of a blow.
The centenarians in the study indulged in smoking and drinking just as much as their shorter-lived counterparts. They did not appear to follow healthier or more stringent diets than others in the general population. They were also just as likely to be overweight, and may even have exercised less. So what contributed to their unusually long lives?
Scientists have long debated the roles of nature and nurture in longevity. Centenarians are, for example, far more likely than the average person to have long-lived relatives, suggesting that long life may be largely inherited. And yet studies have shown that identical twins separated at birth and reared apart can have vastly different life spans — with one living exceptionally long, and the other dying long before — indicating that genes have only so much influence.
The new findings, part of an ongoing look into longevity by researchers at the Albert Einstein College of Medicine, focused on Ashkenazi Jews, a group that is more genetically homogeneous than other populations, making it easier to identify genetic differences that contribute to life span. In the study, the researchers followed 477 Ashkenazi Jews who were 95 or older and living independently. They asked them about their habits and the ways they lived when they were younger. Using data collected in the 1970s, the researchers compared the long-lived group with another group of 3,000 people in the general population who were born around the same time but who generally did not make it to age 95.
They found that the people who lived to 95 and beyond did not seem to exhibit healthier lifestyles than those who died younger. Forty-three percent of the male centenarians reported exercising regularly at moderate intensity, compared with 57 percent of men in the other group. About 24 percent of the men in the older group drank alcohol daily, compared with 22 percent in the other group. Among women, they found that the same percentage in both groups reported following low-calorie diets.
Almost 30 percent of the women who lived exceptionally long were smokers, slightly more than the 26 percent of women in the comparison population who smoked. About 60 percent of the long-lived men had smoked, compared with 74 percent of their shorter-lived counterparts.
The long-lived men and women were also just as likely to be overweight as people in the general population. The one difference in that area was that centenarians were less likely to be obese: only 4.5 percent of men in the long-lived group were obese, compared with 12 percent of the other male subjects. A similar pattern was found among women.
So did all that hard living just make them happier, contributing to their extended life spans? Much has been made over the years about optimism and other social factors that may contribute to longevity. But in the latest study, only 19 percent of the people who lived past 95 said they believed a “positive attitude” played a role in their longevity, while just 6 percent credited their religious faith or spirituality.
Read more:
China's Black Market in Babies
[ed. How terribly sad.]
by Sharon Lafraniere
LONGHUI COUNTY, China — Many parents and grandparents in this mountainous region of terraced rice and sweet potato fields have long known to grab their babies and find the nearest hiding place whenever family planning officials show up. Too many infants, they say, have been snatched by officials, never to be seen again.
But Yuan Xinquan was caught by surprise one December morning in 2005. Then a new father at the age of 19, Mr. Yuan was holding his 52-day-old daughter at a bus stop when a half-dozen men sprang from a white government van and demanded his marriage certificate.
He did not have one. Both he and his daughter’s mother were below the legal age for marriage.
Nor did he have 6,000 renminbi, then about $745, to pay the fine he said they demanded if he wanted to keep his child. He was left with a plastic bag holding her baby clothes and some powdered formula.
“They are pirates,” he said last month in an interview at his home, a half-hour trek up a narrow mountain path between terraced rice paddies.
Nearly six years later, he said, he still hopes to relay a message to his daughter: “Please come home as soon as possible.”
Mr. Yuan’s daughter was among at least 16 children who were seized by family planning officials between 1999 and late 2006 in Longhui County, an impoverished rural area in Hunan, a southern Chinese province, parents, grandparents and other residents said in interviews last month.
The abduction of children is a continuing problem in China, where a lingering preference for boys coupled with strict controls on the number of births has helped create a lucrative black market in children. Just last week, the police announced that they had rescued 89 babies from child traffickers, and the deputy director of the Public Security Ministry assailed what he called the practice of “buying and selling children in this country.”
But parents in Longhui say that in their case, it was local government officials who treated babies as a source of revenue, routinely imposing fines of $1,000 or more — five times as much as an average local family’s yearly income. If parents could not pay the fines, the babies were illegally taken from their families and often put up for adoption by foreigners, another big source of revenue.
Read more:
Thursday, August 4, 2011
The Truth Wears Off
by Jonah Lehrer
On September 18, 2007, a few dozen neuroscientists, psychiatrists, and drug-company executives gathered in a hotel conference room in Brussels to hear some startling news. It had to do with a class of drugs known as atypical or second-generation antipsychotics, which came on the market in the early nineties. The drugs, sold under brand names such as Abilify, Seroquel, and Zyprexa, had been tested on schizophrenics in several large clinical trials, all of which had demonstrated a dramatic decrease in the subjects’ psychiatric symptoms. As a result, second-generation antipsychotics had become one of the fastest-growing and most profitable pharmaceutical classes. By 2001, Eli Lilly’s Zyprexa was generating more revenue than Prozac. It remains the company’s top-selling drug.
But the data presented at the Brussels meeting made it clear that something strange was happening: the therapeutic power of the drugs appeared to be steadily waning. A recent study showed an effect that was less than half of that documented in the first trials, in the early nineteen-nineties. Many researchers began to argue that the expensive pharmaceuticals weren’t any better than first-generation antipsychotics, which have been in use since the fifties. “In fact, sometimes they now look even worse,” John Davis, a professor of psychiatry at the University of Illinois at Chicago, told me.
Before the effectiveness of a drug can be confirmed, it must be tested and tested again. Different scientists in different labs need to repeat the protocols and publish their results. The test of replicability, as it’s known, is the foundation of modern research. Replicability is how the community enforces itself. It’s a safeguard against the creep of subjectivity. Most of the time, scientists know what results they want, and that can influence the results they get. The premise of replicability is that the scientific community can correct for these flaws.
But now all sorts of well-established, multiply confirmed findings have started to look increasingly uncertain. It’s as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable. This phenomenon doesn’t yet have an official name, but it’s occurring across a wide range of fields, from psychology to ecology. In the field of medicine, the phenomenon seems extremely widespread, affecting not only antipsychotics but also therapies ranging from cardiac stents to Vitamin E and antidepressants: Davis has a forthcoming analysis demonstrating that the efficacy of antidepressants has gone down as much as threefold in recent decades.
For many scientists, the effect is especially troubling because of what it exposes about the scientific process. If replication is what separates the rigor of science from the squishiness of pseudoscience, where do we put all these rigorously validated findings that can no longer be proved? Which results should we believe? Francis Bacon, the early-modern philosopher and pioneer of the scientific method, once declared that experiments were essential, because they allowed us to “put nature to the question.” But it appears that nature often gives us different answers.
Muhammad Ali vs. Cleveland Williams, Houston Astrodome (80 ft above the ring), 1966
Photo: Neil Leifer
This is often regarded as one of the greatest sports photographs of the 20th century and is Leifer’s favourite photograph of his 40-year career.
via:
Growing Old or Living Long
by Ann Conkle
Aging. To many people it’s wrinkles, retirement communities, and a steady decline in the ability to remember things. But before you reach for the Botox or buy a sports car, you might be interested in research by APS Fellow and Charter Member Laura Carstensen of Stanford University. In her recent lecture, “Growing Old or Living Long: Take Your Pick,” this year’s Henry and Bryna David Lecture at the National Academies of Sciences, Carstensen presented evidence that the future is not so grim.
Aging is an undeniable issue in today’s developed nations. According to Carstensen, “More years were added to average life expectancy in the 20th century than all increases in all other millennia combined.” For most of human existence, life expectancy hovered around 27. It increased during the 18th and 19th centuries, hitting about 47 at the turn of the 20th century. During the 20th century, life expectancy almost doubled, reaching 77 at the century’s close. This shift has created an entirely new life stage which essentially did not exist 100 years ago. Today, the health of older adults affects almost all aspects of society, from family life to finances and politics. It’s time to use what Carstensen describes as our “breathtaking” scientific capacity to create a world in which this new set of older people can thrive.
Most research on aging focuses on and affirms a steady decline in cognitive ability as we get older. (Keep in mind that this decline is a lifelong process; as Carstensen likes to remind her undergraduate students, the slope is just as steep from 20 to 40 as it is from 60 to 80.) Studies have shown that working memory, perceptual speed, comprehension of text and language, and word-finding ability do decline with age. But it is important to remember that while these abilities to process new information may be degraded, they are not eliminated. People continue to learn, increasing in expertise and knowledge as they age. It is not just a simple story of decline.
Carstensen’s aging research has focused on motivation. Her Socioemotional Selectivity Theory describes how goals change as we age based on two key concepts. First, humans are the only species whose members have a sense of where they are in the life cycle, which they are consciously and subconsciously aware of throughout their lives. Second, we always consider the temporal context when setting goals. If you ran into a friend unexpectedly and only had a few minutes to chat, for example, you would have a different conversation than you would if you were sitting down to an hour-long lunch. Therefore, because we are aware of our position in the life cycle and because our goals are affected by temporal context, our goals will change throughout our lifetime as our temporal context changes.
For young people, the future seems expansive. They tend to seek out new things and new people. Everything is interesting because it could be useful in some unforeseen future situation. For older people, however, the future is more limited. They tend to turn their attention to the present, focusing on relationships with important people already in their lives, rather than new things; they are motivated to pursue emotionally meaningful goals. This difference is supported by experiments in which people are asked to identify their goals or pick from a set of goals. When the experiment constrains the future (e.g., imagine you are about to move across the country alone, then choose your goals), both the young and the old pick more emotional goals. When the future is expanded (your doctor just called about a new treatment that will add 20 years to your life), both groups pick more informational goals.
Read more:
image credit:
Audience Atomization Overcome: Why the Internet Weakens the Authority of the Press
[ed. It looks like this will be Jay Rosen week. Following a link in his recent interview with Sophie Roell (below) I found this post from 2009. I don't know how I missed it. It's a fascinating distillation of media behavior and its effect on political dialogue. Well worth checking out, including the responses he received following its publication.]
"In the age of mass media, the press was able to define the sphere of legitimate debate with relative ease because the people on the receiving end were atomized-- connected "up" to Big Media but not across to each other. And now that authority is eroding. I will try to explain why."
It’s easily the most useful diagram I’ve found for understanding the practice of journalism in the United States, and the hidden politics of that practice. You can draw it by hand right now. Take a sheet of paper and make a big circle in the middle. In the center of that circle draw a smaller one to create a doughnut shape. Label the doughnut hole “sphere of consensus.” Call the middle region “sphere of legitimate debate,” and the outer region “sphere of deviance.”
That’s the entire model. Now you have a way to understand why it’s so unproductive to argue with journalists about the deep politics of their work. They don’t know about this freakin’ diagram! Here it is in its original form, from the 1986 book The Uncensored War by press scholar Daniel C. Hallin. Hallin felt he needed something more supple—and truthful—than calcified notions like objectivity and “opinions are confined to the editorial page.” So he came up with this diagram.
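[ed. Hallin's original figure doesn't reproduce here, so below is a rough sketch of the doughnut in Python. A minimal illustration only, assuming matplotlib is installed; the proportions, colors, and label placement are mine, not Hallin's.]

import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(6, 6))

# Outer circle: the sphere of deviance.
ax.add_patch(plt.Circle((0, 0), 1.00, facecolor="#e0e0e0", edgecolor="black"))
# Middle ring: the sphere of legitimate debate.
ax.add_patch(plt.Circle((0, 0), 0.65, facecolor="white", edgecolor="black"))
# Doughnut hole: the sphere of consensus.
ax.add_patch(plt.Circle((0, 0), 0.30, facecolor="#e0e0e0", edgecolor="black"))

# Label each of the three regions.
ax.text(0, 0.00, "sphere of\nconsensus", ha="center", va="center")
ax.text(0, 0.47, "sphere of legitimate debate", ha="center", va="center")
ax.text(0, 0.83, "sphere of deviance", ha="center", va="center")

ax.set_xlim(-1.1, 1.1)
ax.set_ylim(-1.1, 1.1)
ax.set_aspect("equal")
ax.axis("off")
plt.show()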
Let’s look more carefully at his three regions.
1.) The sphere of legitimate debate is the one journalists recognize as real, normal, everyday terrain. They think of their work as taking place almost exclusively within this space. (It doesn’t, but they think so.) Hallin: “This is the region of electoral contests and legislative debates, of issues recognized as such by the major established actors of the American political process.”
Here the two-party system reigns, and the news agenda is what the people in power are likely to have on their agenda. Perhaps the purest expression of this sphere is Washington Week on PBS, where journalists discuss what the two-party system defines as “the issues.” Objectivity and balance are “the supreme journalistic virtues” for the panelists on Washington Week because when there is legitimate debate it’s hard to know where the truth lies. There are risks in saying that truth lies with one faction in the debate, as against another—even when it does. He said, she said journalism is like the bad seed of this sphere, but also a logical outcome of it.
2.) The sphere of consensus is the “motherhood and apple pie” of politics, the things on which everyone is thought to agree. Propositions that are seen as uncontroversial to the point of boring, true to the point of self-evident, or so widely held that they’re almost universal lie within this sphere. Here, Hallin writes, “journalists do not feel compelled either to present opposing views or to remain disinterested observers.” (Which means that anyone whose basic views lie outside the sphere of consensus will experience the press not just as biased but savagely so.)
Consensus in American politics begins, of course, with the United States Constitution, but it includes other propositions too, like “Lincoln was a great president,” and “it doesn’t matter where you come from, you can succeed in America.” Whereas journalists equate ideology with the clash of programs and parties in the debate sphere, academics know that the consensus or background sphere is almost pure ideology: the American creed.
3.) In the sphere of deviance we find “political actors and views which journalists and the political mainstream of society reject as unworthy of being heard.” As in the sphere of consensus, neutrality isn’t the watchword here; journalists maintain order by either keeping the deviant out of the news entirely or identifying it within the news frame as unacceptable, radical, or just plain impossible. The press “plays the role of exposing, condemning, or excluding from the public agenda” the deviant view, says Hallin. It “marks out and defends the limits of acceptable political conduct.”
Anyone whose views lie within the sphere of deviance—as defined by journalists—will experience the press as an opponent in the struggle for recognition. If you don’t think separation of church and state is such a good idea; if you do think a single payer system is the way to go; if you dissent from the “lockstep behavior of both major American political parties when it comes to Israel” (Glenn Greenwald) chances are you will never find your views reflected in the news. It’s not that there’s a one-sided debate; there’s no debate.
Wednesday, August 3, 2011
Books Without Borders
My Life at the World's Dumbest Bookstore Chain.
by Paul Constant
It's embarrassing now, but on the day that I was hired to work at Boston's flagship Borders store in 1996, I was so happy that I danced around my apartment. After dropping out of college, I had worked a succession of crappy jobs: mall Easter Bunny, stock boy at Sears and Kmart and Walmart, a brief and nearly fatal stint as a landscaper. A job at Borders seemed to be a step, at long last, toward my ultimate goal of writing for a living. At least I would be working with books. And the scruffy Borders employees, in their jeans and band T-shirts, felt a lot closer to my ideal urban intellectuals than the stuffy Barnes & Noble employees with their oppressive dress code and lame vests.
The fact that Borders offered me a full-time job, which allowed me to quit two part-time jobs (at a Staples and a Stop & Shop) and offered health insurance (that promised to help pay for my impending wisdom tooth extraction), was a pretty big deal, too.
For better and for worse, Borders was my college experience. I behaved badly—fucked, drank, and did drugs with everyone I could. My fellow employees snuck me into bars when I was underage, and then cheered when, during my 21st birthday party, I wound up facedown in the gutter sobbing about how my heart had been ripped in two by an ex-fiancée. I was not alone in my bad behavior: Every week, different employees were hooking up, having affairs, breaking up, recoupling, playing drinking games that involved comically large hunting knives, getting in fights, getting pregnant, and showing up drunk for work.
In the beginning, the store felt like a tight-knit family. As time went on, we became a confederation of hedonists with little regard for one another's feelings. At one Christmas party that I didn't attend, a new female employee reportedly gave blowjobs to anybody who wanted one. (Later, at least a couple of men who stood in line for the newbie's ministrations complained about picking up an STD.) Suddenly, the parties weren't as fun anymore. One employee hanged himself. Another died of a heart attack in the DVD section on the overnight replenishment shift and wasn't discovered until the store opened for business the next morning.
But it wasn't all an endless cycle of party and hangover. The 20 percent discount—plus an employee credit account that went up to $300, with the store paying off $20 of that debt a month—allowed me to explore books I'd never heard of. It's hard to remember now, but when Borders began proliferating in suburban parking lots around the country, they had a truly excellent selection curated, at least in part, by each store's employees. I bought my first title from countercultural Washington press Feral House—Apocalypse Culture—at the brand-new Borders at the Maine Mall when I was a teenager, and it still ranks as one of my most mind-blowing reading experiences. I read my first David Foster Wallace and Matt Ruff books while working at Borders; I explored the lesser-known works of Twain and Melville and Dickens and St. Vincent Millay. I learned who Edward Abbey and Noam Chomsky and Kathy Acker were. I discovered young writers like Banana Yoshimoto and Colson Whitehead and Chuck Palahniuk and Haruki Murakami. Thanks to my coworkers in the music department, which was just as far-reaching as the book department, I learned to love Miles Davis and Glenn Gould and an obscure punk band from way out west called Sleater-Kinney.
At the time, independent bookstores were blaming Borders for a spate of mom-and-pop bookstore closures around the country. I'll never forget the employee at Bookland in Maine who coldly accused me of single-handedly destroying her small chain when I admitted who my employer was, even as I was buying $50 worth of books from her. Of course, the accusations had truth to them—small bookstores simply couldn't compete with the deep discounts the chains offered—but for what it's worth, every employee who worked at Borders, at least when I first joined the company, adored literature. We were not automatons out to assassinate local business. We wanted to work with the cultural artifacts that were the most important things in our lives, the things that made us who we were. Not all of us could find work at independent bookstores, so we did the next best thing: We went to work for a company that seemingly cared about quality literature and regional reading tastes, and gave its employees a small-but-fair wage for full-time bookselling careers, with excellent benefits. It sure didn't feel like selling out.
Until suddenly, one day, it did feel like selling out. Because it was. Our displays were bought and paid for by publishers; where we used to present books that we loved and wanted to champion, now mediocre crap was piled on every flat surface. The front of the store, with all the kitchen magnets and board games and junk you don't need, took over large chunks of the expansive magazine and local-interest sections. Orders came from the corporate headquarters in Ann Arbor every Sunday to change out the displays. One time I had to take down some of the store's most exciting up-and-coming fiction titles (including a newly published book that was gathering word-of-mouth buzz, thanks to our booksellers, called Harry Potter and the Sorcerer's Stone) to put up a wall of Clash CDs. One month, for some reason, the cafe sold Ernest Hemingway–branded chai.
Read more:
Shark Week: Remembering Bruce
by Nicholas Jackson
There are only a few dozen shark attacks on humans every year. It has been widely reported that you are 30 times more likely to die from a lightning strike than you are from an attack. In 2003, Reuters ran a story claiming that more people are killed by vending machines each year than are killed by sharks. And yet, I would bet that just about anybody who has spent time at the beach has thought about the possibility of an attack. I know I certainly have. Before dipping so much as a toe into the ocean, I scan the horizon for a dark, approaching shadow from the deep. And I thank Steven Spielberg for that.
In 1975, Spielberg released the first of what would become a franchise. Jaws was a landmark horror-thriller, recognized by everyone from Empire magazine (fifth greatest film ever made) to the New York Times (one of the 1,000 best movies ever) to the American Film Institute (number 48 on the "100 Years... 100 Movies" list). It won three Academy Awards and was even nominated for Best Picture. (It lost to One Flew Over the Cuckoo's Nest.) Perhaps more importantly, the movie created the wide-release summer blockbuster, a tradition of providing big-budget thrills in every major theater across America during the hottest months of the year that continues to this day. Jaws brought in more money than any other film and held that title until George Lucas released Star Wars two years later.
An instant classic, Jaws received rave reviews. Roger Ebert called it "a sensationally effective action picture, a scary thriller that works all the better because it's populated with characters that have been developed into human beings we get to know and care about." There's Roy Scheider as Brody, the police chief who we can all identify with, who doesn't like to swim, who is genuinely terrified of the water. There's Robert Shaw as Quint, "a caricature of the crusty old seafaring salt," as Ebert put it in that 1975 write-up. There's Hooper, the rich-kid-turned-oceanographer played by Richard Dreyfuss, just off a string of successes as the nice kid in American Graffiti and the title character in the Canadian hit The Apprenticeship of Duddy Kravitz. But the most important character -- and, in many ways, one of the most human -- is the shark itself.
Everyone knows the story by now: The shark is a great white that terrorizes a small resort town during the Fourth of July weekend, a weekend critical to the economy of this little village. In an effort to track down and kill the shark, these three men leave their families behind (where applicable) and set out on a rickety boat. It's leaky. It's too small. It's old. This boat, we know from the outset, just isn't cut out for shark hunting. At least not hunting sharks of the size we suspect this great white to be.
"There are no doubt supposed to be all sorts of levels of meanings in such an archetypal story," Ebert notes. But he doesn't bother writing about them or trying to figure them out. And neither does Spielberg. "This is an action film content to stay entirely within the perimeters of its story, and none of the characters has to wade through speeches expounding on the significance of it all." And what an action film it is. This isn't just about the dark shadow from the deep -- though it is that, too. Before the story comes to an end, many individuals both on and off the island have been killed in a series of terrifying scenes that allow you to get up close and personal with the shark.
The only reason this works -- the only reason that theatergoers in the 1970s left their seats terrified of these macropredatory beasts and that modern viewers can't turn off the lights when screening the film in their own living rooms -- is the craftsmanship and technology that went into creating the main character: Jaws.
Read more:
Jay Rosen on Journalism in the Internet Age
by Sophie Roell
In a break from our usual practice of focusing on books, we asked the journalism analyst and veteran blogger to recommend five articles illustrating the upheavals of the news business.
I know that as journalists we have to adapt rapidly to new ways of doing things, but you've really thrown me in at the deep end – you’ve chosen five online articles instead of five books, and we’re doing the interview on Google chat rather than by telephone.
I like to do things differently. For example, using PressThink for longform blogging – which wasn't the normal thing at the time, in 2003.
Will you give me an overall sense of what you are saying about changes in journalism with the articles you've chosen?
Well, first there's been a shift in power. The users have more than they did because they can publish and connect to one another, not just to the media. Second, the people formerly known as the audience are configured differently. They are connected horizontally as well as vertically, which is why today we speak of social media. This is what I sometimes call “audience atomisation overcome”. Third, the media still have power and journalism still matters. In some ways the essence of it has not changed. But a lot of what journalists did became bound up with particular forms of production and distribution. Since the web has radically altered those forms, it has radically changed journalistic work, even though the value of good journalism remains the same – timely, accurate, useful information that tells us what's happening in our world over the horizon of our personal experience.
Enter the Cyber-dragon
by Michael Joseph Gross
Hackers have attacked America’s defense establishment, as well as companies from Google to Morgan Stanley to security giant RSA, and fingers point to China as the culprit. The author gets an exclusive look at the raging cyber-war—Operation Aurora! Operation Shady rat!—and learns why Washington has been slow to fight back.
Lying there in the junk-mail folder, in the spammy mess of mortgage offers and erectile-dysfunction drug ads, an e-mail from an associate with a subject line that looked legitimate caught the man’s eye. The subject line said “2011 Recruitment Plan.” It was late winter of 2011. The man clicked on the message, downloaded the attached Excel spreadsheet file, and unwittingly set in motion a chain of events allowing hackers to raid the computer networks of his employer, RSA. RSA is the security division of the high-tech company EMC. Its products protect computer networks at the White House, the Central Intelligence Agency, the National Security Agency, the Pentagon, the Department of Homeland Security, most top defense contractors, and a majority of Fortune 500 corporations.
The parent company disclosed the breach on March 17 in a filing with the Securities and Exchange Commission. The hack gravely undermined the reputation of RSA’s popular SecurID security service. As spring gave way to summer, bloggers and computer-security experts found evidence that the attack on RSA had come from China. They also linked the RSA attack to the penetration of computer networks at some of RSA’s most powerful defense-contractor clients—among them, Lockheed Martin, Northrop Grumman, and L-3 Communications. Few details of these episodes have been made public.
The RSA and defense-contractor hacks are among the latest battles in a decade-long spy war. Hackers from many countries have been exfiltrating—that is, stealing—intellectual property from American corporations and the U.S. government on a massive scale, and Chinese hackers are among the main culprits. Because virtual attacks can be routed through computer servers anywhere in the world, it is almost impossible to attribute any hack with total certainty. Dozens of nations have highly developed industrial cyber-espionage programs, including American allies such as France and Israel. And because the People’s Republic of China is such a massive entity, it is impossible to know how much Chinese hacking is done on explicit orders from the government. In some cases, the evidence suggests that government and military groups are executing the attacks themselves. In others, Chinese authorities are merely turning a blind eye to illegal activities that are good for China’s economy and bad for America’s. Last year Google became the first major company to blow the whistle on Chinese hacking when it admitted to a penetration known as Operation Aurora, which also hit Intel, Morgan Stanley, and several dozen other corporations. (The attack was given that name because the word “aurora” appears in the malware that victims downloaded.) Earlier this year, details concerning the most sweeping intrusion since Operation Aurora were discovered by the cyber-security firm McAfee. Dubbed “Operation Shady rat,” the attacks (of which more later) are being reported here for the first time. Most companies have preferred not to talk about or even acknowledge violations of their computer systems, for fear of panicking shareholders and exposing themselves to lawsuits—or for fear of offending the Chinese and jeopardizing their share of that country’s exploding markets. The U.S. government, for its part, has been fecklessly circumspect in calling out the Chinese.
A scattered alliance of government insiders and cyber-security experts are working to bring attention to the threat, but because of the topic’s extreme sensitivity, much of their consciousness-raising activity must be covert. The result in at least one case, according to documents obtained by Vanity Fair, has been a surreal new creation of American bureaucracy: government-directed “hacktivism,” in which an intelligence agency secretly provides information to a group of private-sector hackers so that truths too sensitive for the government to tell will nevertheless come out.
This unusual project began in March, when National Security Agency officials asked a private defense contractor to organize a cadre of elite non-government experts to study the RSA cyber-attacks. The experts constituted a SEAL Team Six of cyber-security and referred to their work as Operation Starlight. “This is the N.S.A. outsourcing the finger-pointing to the private sector,” says one person who was invited to join the group and has been privy to its e-mail logs. The N.S.A. provided Operation Starlight with the data it needed for its forensic analysis.