Friday, March 9, 2012
Keynes, Hayek and Orange Peppers
So the other day I go out to buy some peppers. Like all middle-class twerps in north London, I am now incapable of cooking anything that does not come with the imprimatur of Yotam Ottolenghi—master of the delicious but finicky fusion dish, ambassador for sumac and precise counter of curry leaves. The dish I’m cooking wants three red peppers, so I go to Sainsbury’s and find three peppers. In a pack: one red, one green, one orange.

No more waiting six months for BT to put in a telephone line and having three shades of brown Bakelite handset to choose from. No more nasty, gherkiny bit in your Whopper: you could “Have It Your Way.” The world would be a giant retail pic ‘n’ mix. Our lives would be bespoke. And yet here—ostensibly for no more profound reason than that it looks cutely like a traffic light—comes their refutation. You cannot acquire a red pepper without also acquiring a green one (which the recipe doesn’t require) and an orange one (which very few, if any, recipes require). Take it or leave it.
I stand fuming in the aisle of my local supermarket, while sullen teenagers with earphones going tss-bomp tss-bomp tss-bomp tss-bomp shoulder past me towards the pouches of glutinous pre-fabricated stir-fry sauce, while mothers wheeling gargantuan three-wheeled toddler-wagons bark my shins and trolley-pushers back up and reverse into three-for-two sliced meats to let them past, everyone doing the “scoose, sorry, scoose, sorry” dance and wishing—on balance—that the earth would cease upon the stroke of midnight with no pain. And I look at this packet of peppers, and I think: is this what the long history of human ingenuity has come to?
by Sam Leith, Prospect | Read more:
The Gray Box: An Investigative Look at Solitary Confinement
Among the misperceptions about solitary confinement is that it’s used only on the most violent inmates, and only for a few weeks or months. In fact, an estimated 80,000 Americans — many with no record of violence either inside or outside prison — are living in seclusion. They stay there for years, even decades. What this means, generally, is 23 hours a day in a cell the size of two queen-sized mattresses, with a single hour in an exercise cage, also alone. Some prisoners aren’t allowed visits or phone calls. Some have no TV or radio. Some never lay eyes on each other. And some go years without fresh air or sunlight.
Solitary is a place where the slightest details can mean the world. Things like whether you can see a patch of grass or only sky outside your window – if you’re lucky enough to have a window. Or whether the guy who occupies the cell before you in the rotation has a habit of smearing feces on the wall. Are the lights on 24/7? Is there a clock or calendar to mark time? If you scream, could anyone hear you? (...)
Plenty of corrections officers might tell you that offenders doing time in solitary don’t deserve the roofs over their heads or the meals shoved through their food slots. To be sure, many of these prisoners have done heinous, unforgivable things for which we lock them up tightly. Just how tightly is no small question. Yet, as a matter of public policy, the question hardly comes up. Compared to how much we as a nation have debated capital punishment, a sentence served by a small fraction of the incarcerated, we barely discuss how severely we’re willing to punish nearly everyone else.
“When the door is locked against the prisoner, we do not think about what is behind it,” Supreme Court Associate Justice Anthony Kennedy once said.
Solitary confinement started in the U.S. as a morally progressive social experiment in the 1820s by Quakers, who wanted lawmakers to replace mutilations, amputations and the death penalty with rehabilitation. The hope was that long periods of introspection would help criminals repent.
After touring a Pennsylvania prison in the 1840s, Charles Dickens described prolonged isolation as a “slow and daily tampering with the mysteries of the brain immeasurably worse than any torture of the body.” He also wrote, “There is a depth of terrible endurance in it which none but the sufferers themselves can fathom.”
Some of his contemporaries shared that view. “It devours the victims incessantly and unmercifully,” Alexis de Tocqueville reported from a prison in New York in the 1820s. “It does not reform, it kills.” (...)
Most prisons suspended the practice in the mid- to late-1800s once it became clear the theory didn’t work. The U.S. Supreme Court punctuated that point in 1890 when it freed a Colorado man who had been sentenced to death for killing his wife, recognizing the psychological harm isolation had caused him.
“This matter of solitary confinement is not … a mere unimportant regulation as to the safe-keeping of the prisoner,” the court ruled in the case of James Medley. “A considerable number of the prisoners fell, after even a short confinement, into a semi-fatuous condition, from which it was next to impossible to arouse them, while those who stood the ordeal better were not generally reformed, and in most cases did not recover sufficient mental activity to be of any subsequent service to the community.”
Solitary confinement was largely unused for about a century until October 1983 when, in separate incidents, inmates killed two guards in one day at the U.S. Penitentiary in Marion, Ill., which had replaced Alcatraz as home to the most dangerous federal convicts. The prison went into lockdown for the next 23 years, setting the model for dozens of state and federal supermaxes – prisons designed specifically for mass isolation — that since have been built in the name of officer safety. “Never again,” promised Reagan-era shock doctrinarians who set out at great cost to crack down on prison violence.
“Whole prisons have been built, people have gotten funding for supermax facilities based on the act of a single (inmate),” says Michael Randle, former director of the Illinois Department of Corrections.
by Susan Greene, Dart Society | Read more:
Image via: Quote/Counterquote

IMAX Fighter Pilot
This IMAX film is about fighter pilots and fighter planes, and the video quality is awesome, so make sure you watch it in full screen on the 720p HD setting. The sound quality is fantastic too.
h/t: GS
Thursday, March 8, 2012
Smooth Moves
[ed. Also: this story by Forbes.]
One sticky morning last summer, Sara Blakely, the inventor of Spanx, which over the past decade has become to women’s foundation garments what Scotch is to cellophane tape, was sitting in the Park Avenue offices of her husband, Jesse Itzler, confronting a new challenge: the male anatomy. Red boxes of stretchy Spanx undershirts for men were strewn across a table before the couple. “Sara sent my dad, who is going to be eighty-two years old, a tank top,” Itzler said.
Blakely smiled. “He said it took him half a day to get into it, and half a day to get out,” she said. “My mom said she’d never laughed harder,” Itzler added.
There was also a prototype of black cotton briefs with a sturdy “3D” pouch over the groin, devised by Spanx’s product-development team after several male testers complained to Blakely that they needed more support. “They were, like, ‘It just kind of hangs,’ ” she said.
“And is the hole big enough? To get through?” Blakely went on, fingering the pouch. “It seems like you’d have to really . . . bring it out. Look, I’m in foreign territory here.”
“Yes, yes,” Itzler said, rolling his eyes.
Blakely, who recently turned forty and is a size 6 (“the largest I’ve ever been,” she said), with long blond hair and bright-white teeth, believes that there is no figure problem—saddlebags, upper-arm jiggle, stomach rolls—that can’t be solved with a little judiciously placed Lycra. “Where I get my energy is: ‘How can I make it better?’ ” Blakely said. “I’ll ask my brother, ‘If you could wave your wand and make your boxer shorts better, what would you do?’ ” Her first big idea, in 1998, was to chop the feet off a pair of control-top panty hose so that she could get a svelte, seamless look under white slacks without stockings poking out of her sandals. The resulting product, Footless Pantyhose, has sold nine million pairs since October of 2000, when Blakely, who was then a fax-machine saleswoman and a part-time standup comic, started Spanx, with five thousand dollars of savings. Another Spanx product, a lightweight girdle called Power Panties that retails for around thirty dollars, has sold six million units since it was introduced, in 2002. (...)
Spanx’s popularity repudiates the late-twentieth-century belief, perpetuated by Jane Fonda and Nike, that a firm body can be achieved only through sweaty resolve. “There’s a whole subset of women who don’t relate to that idea,” Blakely said. She is overseeing a new line of activewear, called In It to Slim It, but there is a desultory feel to the enterprise. “I started thinking about joy,” Blakely said. “Everything in our society is so purposeful. Let’s bring joy back to the experience—have fun when you’re doing it,” meaning exercise. She has already expanded into legwear (Tight-End Tights); lacy lingerie (Haute Contour); casual separates (Bod a Bing!); and retro, ruffled swimwear. Spanx now offers more than two hundred different products, and executives at the company, which is privately held, and reported three hundred and fifty million dollars in global retail sales in 2008, worry that customers are having trouble distinguishing among them. Part of the line is manufactured by Acme-McCrary, a century-old firm in the hosiery mecca of Asheboro, North Carolina (the rest is outsourced to other countries). Larry Small, who until recently was Acme-McCrary’s C.E.O., told me that Spanx represents close to a third of his business, and he called Blakely a “rock star” in an industry of good ol’ boys. “I’ve always wondered how the heck men are supposed to sell hosiery,” he said.
Blakely chose the brand’s name partly for what she calls its “virgin-whore tension,” and partly for its “k” sound, which has a good track record in both business and comedy. “I used to hold my breath every time I said it out loud,” she told me. “People were so offended they’d hang up on me.” When the Spanx Web site first went live, Blakely’s mother accidentally directed a tableful of luncheon guests to spanks.com, a porn site. (...)
Blakely has several phobias, but her greatest fear is heights. “When we first got this apartment, I thought I might have to sell it as soon as we moved in,” she said. “And my husband was, like, ‘Why are you telling me this now?’ ” In addition to running his marketing company, Itzler is the co-founder of Marquis Jets, which leases private planes; the couple met at a poker tournament in Las Vegas (Bill Gates and Warren Buffett were among the guests), after Blakely had become a loyal Marquis customer, figuring that if she panicked during a flight she could order the pilot to land. She travels constantly, to give speeches—she is also afraid of public speaking—and to tend to the charitable foundation that she started, with seven hundred and fifty thousand dollars in prize money that she won after appearing on Branson’s show “The Rebel Billionaire”; she has so far donated around ten million dollars to women’s causes, a million to Oprah Winfrey’s Leadership Academy alone. “I took a Fear of Flying class, and I always missed the class, because I was always flying,” she said.
The couple hired a former Navy SEAL to devise emergency escape methods from the New York apartment, which is decorated in a modern rococo style, with ottomans covered in zebra print, ankle-deep rugs, maroon tiled ceilings, sequinned pillows, feminist art, and a silver chandelier in the dining room. Hidden behind the bar are jet packs and an inflatable motorboat. “We can jump out the window if we have to,” she said. Among her tasks as a contestant on “Rebel Billionaire” were nosedives in a 747 (“I also have a fear of puking,” she said) and a long climb up the side of a hot-air balloon on a rope ladder, to have tea with Branson at eight thousand feet. Blakely regarded these not as acts of masochism but as spiritual challenges.
by Alexandra Jacobs, The New Yorker | Read more:
Photograph by Josef Astor
Nimbus
These stunning photos of indoor clouds might look like digital creations, but they're actually of real scenes created by Dutch artist Berndnaut Smilde.
The clouds are generated using a smoke machine, but Smilde must carefully monitor a room's humidity and atmosphere in order to get the smoke to hang so elegantly, and with such life-like form. Backlighting is used to bring out shadows from within the cloud, to give it that look of a looming and ominous rain cloud.
"I wanted to make the image of a typical Dutch raincloud inside a space," Smilde told Gizmag. "I'm interested in the ephemeral aspect of the work. It's there for a brief moment and then the cloud falls apart. The work only exists as a photograph."
by Bryan Nelson, MNN | Read more:
Two burger-obsessed Japanese writers tell us their likes, dislikes - and why General MacArthur would be the ideal dining companion.
Metropolis: What makes a good hamburger?
Yoshihide Matsubara: First, a good burger is one that has all the parts assembled with perfect balance and harmony. Many burgers in Japan are served with all the flavorings already included, so before adding ketchup and mustard, try it as it’s served to you first. Secondly, a good hamburger has a dignified appearance and is built up beautifully. In some traditional American diner-style shops, they serve the ingredients side-by-side on a flat plate. But I’m of the opinion that places which serve the hamburger in its complete form think precisely about the proper way to pile ingredients—the size of the patty, bun and vegetables and their order—and this really reflects the sensitive and delicate technique of the Japanese. Especially in downtown Tokyo, hamburgers have crossed over from being “American-style” to being simply a “delicious meal.” Their originality is evolving every day. Lastly, a good hamburger needs to be dynamic and hearty. After all, hamburgers are entertainment!
Metropolis: What makes a good hamburger?
Ken Saito: Of course, the actual taste of the hamburger is crucial, but what happens before you bite into the hamburger is equally important. Mainly, the appearance and the aroma are extremely important factors. In smaller restaurants, you can hear and smell the hamburger patty being cooked. A perfectly assembled hamburger is a work of art. It’s exciting to imagine the taste of the hamburger before you actually take a bite. When I pick up the hamburger and smell the charcoal aroma, I can sense that I’m in for a helluva ride.
Pinterest and the Acquisitive Gaze
But recently Pinterest has entered the mainstream, as a para-retailing apparatus presumed to appeal mainly to women. The site’s supposed femaleness has occasioned a lot of theorizing, some of which Nathan Jurgenson details in this post, as has its anodyne commerciality. Bonnie Stewart argues that Pinterest, since it discourages self-promotion and relies entirely on the appropriation of someone else’s creative expression, turns curation into passive consumerism; it allows for the construction and circulation of a bland, sanitized “Stepford” identity. In other words, it becomes another tool for enhancing our digital brands at the expense of the possibility of an uncommodified self.
Given that emphasis on passive consumption, it’s not surprising that Pinterest has come to be associated with shopping fantasies. Pinterest’s great technological advance seems to be that it lets users shop for images over the sprawl of the internet, turning it into an endless visual shopping mall in which one never runs out of money. Chris Tackett suggests that sites like Pinterest are actually “anti-consumerist” because they allow people the instant gratification of choosing things without actually having to buy them. “Virtual consumerism means a real world reduction in wasteful consumption,” he writes, and that’s all well and good, though I’m not sure that making window shopping more convenient is in any way “anticonsumerist.” If anything that seems to reinforce the consumerist mentality while overcoming one of its main obstacles — people’s financial inability to perpetually shop. With Pinterest, they can at least simulate that experience, acquiring the images of things and associating them with themselves, appropriating the qualities the goods/images are thought to signify at that given moment. Pinterest allows for the purest expression of the Baudrillardian “passion for the code” (“It is not the passion (whether of objects or subjects) for substances that speaks in fetishism, it is the passion for the code, which, by governing both objects and subjects, and by subordinating them to itself, delivers them up to abstract manipulation,” Baudrillard wrote in “Fetishism and Ideology”) that we’ve yet seen. We accumulate and sort images, trying to extract their assimilable essences, and in the process reduce ourselves to a similar image, a similar agglomeration of putative qualities that can be read out of a surface.
by Rob Horning, The New Inquiry | Read more:
The Rosetta Stone is an ancient Egyptian granodiorite stele inscribed with a decree issued at Memphis in 196 BC on behalf of King Ptolemy V. The decree appears in three scripts: the upper text is Ancient Egyptian hieroglyphs, the middle portion Demotic script, and the lowest Ancient Greek. Because it presents essentially the same text in all three scripts (with some minor differences between them), it provided the key to the modern understanding of Egyptian hieroglyphs.
via: Wikipedia | Read more:
Shark Cartilage May Contain Toxin
Shark cartilage, which has been hyped as a cancer preventive and joint-health supplement, may contain a neurotoxin that has been linked with Alzheimer’s and Lou Gehrig’s disease.
Scientists at the University of Miami analyzed cartilage samples collected from seven species of sharks off the coast of Florida. The specimens all contained high levels of a compound called beta-methylamino-L-alanine, or BMAA, which has been linked to the development of neurodegenerative diseases. Sharks accumulate the compound because of their status at the top of the oceanic food chain, consuming fish and other sea creatures that feed on BMAA-containing algae. The small tissue samples were obtained from sharks that were caught, tagged and released for tracking research, and no sharks were harmed for the study.
The findings are important because of the growing popularity of supplements that contain cartilage from shark fins. The products are widely sold and remain popular with consumers who view them as cancer fighters or as a remedy for joint and bone problems. The notion that shark cartilage can prevent cancer grew largely from the popularity of the 1992 book “Sharks Don’t Get Cancer.”
Although a number of studies have discredited shark cartilage as a cancer fighter, supplement makers have nonetheless made bold claims. In 2000, two supplement makers settled a federal suit as a result of hyping shark cartilage and paid restitution to customers.
Although the Miami scientists didn’t examine shark cartilage supplements directly, their findings add further cause for concern about the popularity of shark fin supplements. In the study, published in the journal Marine Drugs, the researchers found levels of BMAA ranging from 144 to 1,836 nanograms per milligram of cartilage in seven shark species, including hammerhead, blacknose, nurse and bull sharks. The toxin initially is produced by bacteria in large algae blooms that are brought on by agricultural runoff and sewage pollution.
Earlier studies have suggested that BMAA may be common in the brains of people with degenerative diseases. One in 2009, for example, found that brain samples from people who died of Alzheimer’s or Lou Gehrig’s disease had BMAA levels as high as 256 ng/mg. The brains of control subjects who died of other causes had only trace amounts or none at all. While BMAA has never been definitively cited as a cause of degenerative diseases in humans, some scientists hypothesize that it may be a contributing factor.

by Anahad O'Connor, NY Times | Read more:
Barbara Walton/European Pressphoto Agency
Wednesday, March 7, 2012
We're Underestimating the Risk of Human Extinction
Bostrom, who directs Oxford's Future of Humanity Institute, has argued over the course of several papers that human extinction risks are poorly understood and, worse still, severely underestimated by society. Some of these existential risks are fairly well known, especially the natural ones. But others are obscure or even exotic. Most worrying to Bostrom is the subset of existential risks that arise from human technology, a subset that he expects to grow in number and potency over the next century.
Despite his concerns about the risks posed to humans by technological progress, Bostrom is no luddite. In fact, he is a longtime advocate of transhumanism—the effort to improve the human condition, and even human nature itself, through technological means. In the long run he sees technology as a bridge, a bridge we humans must cross with great care, in order to reach new and better modes of being. In his work, Bostrom uses the tools of philosophy and mathematics, in particular probability theory, to try and determine how we as a species might achieve this safe passage. What follows is my conversation with Bostrom about some of the most interesting and worrying existential risks that humanity might encounter in the decades and centuries to come, and about what we can do to make sure we outlast them.
Some have argued that we ought to be directing our resources toward humanity's existing problems, rather than future existential risks, because many of the latter are highly improbable. You have responded by suggesting that existential risk mitigation may in fact be a dominant moral priority over the alleviation of present suffering. Can you explain why?
Bostrom: Well, suppose you have a moral view that counts future people as being worth as much as present people. You might say that fundamentally it doesn't matter whether someone exists at the current time or at some future time, just as many people think that from a fundamental moral point of view, it doesn't matter where somebody is spatially—somebody isn't automatically worth less because you move them to the moon or to Africa or something. A human life is a human life. If you have that moral point of view that future generations matter in proportion to their population numbers, then you get this very stark implication that existential risk mitigation has a much higher utility than pretty much anything else that you could do. There are so many people that could come into existence in the future if humanity survives this critical period of time—we might live for billions of years, our descendants might colonize billions of solar systems, and there could be billions and billions times more people than exist currently. Therefore, even a very small reduction in the probability of realizing this enormous good will tend to outweigh even immense benefits like eliminating poverty or curing malaria, which would be tremendous under ordinary standards.
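[ed. To make the arithmetic behind that last claim concrete, here is a rough back-of-the-envelope illustration; the figures are assumptions chosen purely for the example, not Bostrom's own estimates.]

\[
\underbrace{10^{16}}_{\text{possible future lives}} \times \underbrace{10^{-6}}_{\text{reduction in extinction risk}} = 10^{10} \ \text{expected lives preserved,}
\]

which, on the view that future lives count equally, outweighs even a present-day intervention that saves a hundred million (10^8) people with certainty.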
In the short term you don't seem especially worried about existential risks that originate in nature like asteroid strikes, supervolcanoes and so forth. Instead you have argued that the majority of future existential risks to humanity are anthropogenic, meaning that they arise from human activity. Nuclear war springs to mind as an obvious example of this kind of risk, but that's been with us for some time now. What are some of the more futuristic or counterintuitive ways that we might bring about our own extinction?
Bostrom: I think the biggest existential risks relate to certain future technological capabilities that we might develop, perhaps later this century. For example, machine intelligence or advanced molecular nanotechnology could lead to the development of certain kinds of weapons systems. You could also have risks associated with certain advancements in synthetic biology.
Of course there are also existential risks that are not extinction risks. The concept of an existential risk certainly includes extinction, but it also includes risks that could permanently destroy our potential for desirable human development. One could imagine certain scenarios where there might be a permanent global totalitarian dystopia. Once again that's related to the possibility of the development of technologies that could make it a lot easier for oppressive regimes to weed out dissidents or to perform surveillance on their populations, so that you could have a permanently stable tyranny, rather than the ones we have seen throughout history, which have eventually been overthrown.
by Ross Andersen, The Atlantic | Read more:
The Era of Small and Many
Or another way to say it: can you figure out which way history wants to head (since no politician can really fight the current) and suggest how we might surf that wave?
Here’s my answer: we’re moving, if we’re lucky, from the world of few and big to the world of small and many. We’ll either head there purposefully or we’ll be dragged kicking, but we’ve reached one of those moments when tides reverse. (...)
Many of us get a preview of life in the age of small and many when we sit down at our computers each day. Fifteen years ago we still depended on a handful of TV networks and newspaper conglomerates to define our world for us; now we have a farmers’ market in ideas. We all add to the flow with each Facebook post, and we can find almost infinite sources of information. It’s reshaping the way we see the world—not, of course, without some trauma (from the hours wasted answering e-mail to the death of too much good, old-school journalism). All these transitions will be traumatic to one extent or another, since they are so very big. We’re reversing the trend of generations.
But the general direction seems to me increasingly clear. Health care? In place of a few huge, high-tech hospitals dispensing the most expensive care possible, all the data suggest we’d be healthier with lots of primary and preventive care from physicians’ assistants and nurse practitioners in our neighborhoods. Banking? Instead of putting more than half our assets in half a dozen money-center banks that devote themselves to baroque financial instruments, we need capital closer to home, where loan officers have some sense for gauging risk and need.
by Bill McKibben, Orion Magazine | Read more:
Painting: Suzanne Stryk
The Mouse Trap
Mark Mattson knows a lot about mice and rats. He's fed them; he's bred them; he's cut their heads open with a scalpel. Over a brilliant 25-year career in neuroscience—one that's made him a Laboratory Chief at the National Institute on Aging, a professor of neuroscience at Johns Hopkins, a consultant to Alzheimer's nonprofits, and a leading scholar of degenerative brain conditions—Mattson has completed more than 500 original, peer-reviewed studies, using something on the order of 20,000 laboratory rodents. He's investigated the progression and prevention of age-related diseases in rats and mice of every kind: black ones and brown ones; agoutis and albinos; juveniles and adults; males and females. Still, he never quite noticed how fat they were—how bloated and sedentary and sickly—until a Tuesday afternoon in February 2007. That's the day it occurred to him, while giving a lecture at Emory University in Atlanta, that his animals were nothing less (and nothing more) than lazy little butterballs. His animals and everyone else's, too.
Mattson was lecturing on a research program that he'd been conducting since 1995, on whether a strict diet can help ward off brain damage and disease. He'd generated some dramatic data to back up the theory: If you put a rat on a limited feeding schedule—depriving it of food every other day—and then blocked off one of its cerebral arteries to induce a stroke, its brain damage would be greatly reduced. The same held for mice that had been engineered to develop something like Parkinson's disease: Take away their food, and their brains stayed healthier.
How would these findings apply to humans, asked someone in the audience. Should people skip meals, too? At 5-foot-7 and 125 pounds, Mattson looks like a meal-skipper, and he is one. Instead of having breakfast or lunch, he takes all his food over a period of a few hours each evening—a bowl of steamed cabbage, a bit of salmon, maybe some yogurt. It's not unlike the regime that appears to protect his lab animals from cancer, stroke, and neurodegenerative disease. "Why do we eat three meals a day?" he asks me over the phone, not waiting for an answer. "From my research, it's more like a social thing than something with a basis in our biology."
But Mattson wasn't so quick to prescribe his stern feeding schedule to the crowd in Atlanta. He had faith in his research on diet and the brain but was beginning to realize that it suffered from a major complication. It might well be the case that a mouse can be starved into good health—that a deprived and skinny brain is more robust than one that's well-fed. But there was another way to look at the data. Maybe it's not that limiting a mouse's food intake makes it healthy, he thought; it could be that not limiting a mouse's food makes it sick. Mattson's control animals—the rodents that were supposed to yield a normal response to stroke and Parkinson's—might have been overweight, and that would mean his baseline data were skewed.
"I began to realize that the ‘control’ animals used for research studies throughout the world are couch potatoes," he tells me. It's been shown that mice living under standard laboratory conditions eat more and grow bigger than their country cousins. At the National Institute on Aging, as at every major research center, the animals are grouped in plastic cages the size of large shoeboxes, topped with a wire lid and a food hopper that's never empty of pellets. This form of husbandry, known as ad libitum feeding, is cheap and convenient since animal technicians need only check the hoppers from time to time to make sure they haven’t run dry. Without toys or exercise wheels to distract them, the mice are left with nothing to do but eat and sleep—and then eat some more.
That such a lifestyle would make rodents unhealthy, and thus of limited use for research, may seem obvious, but the problem appears to be so flagrant and widespread that few scientists bother to consider it. Ad libitum feeding and lack of exercise are industry-standard for the massive rodent-breeding factories that ship out millions of lab mice and rats every year and fuel a $1.1-billion global business in living reagents for medical research. When Mattson made that point in Atlanta, and suggested that the control animals used in labs were sedentary and overweight as a rule, several in the audience gasped. His implication was clear: The basic tool of biomedicine—and its workhorse in the production of new drugs and other treatments—had been transformed into a shoddy, industrial product. Researchers in the United States and abroad were drawing the bulk of their conclusions about the nature of human disease—and about Nature itself—from an organism that's as divorced from its natural state as feedlot cattle or oven-stuffer chickens.
Mattson isn't much of a doomsayer in conversation. "I realized that this information should be communicated more widely," he says without inflection, of that tumultuous afternoon in Atlanta. In 2010, he co-authored a more extensive, but still measured, analysis of the problem for the Proceedings of the National Academy of Sciences. The paper, titled " 'Control' laboratory rodents are metabolically morbid: Why it matters," laid out the case for how a rodent obesity epidemic might be affecting human health.
Standard lab rats and lab mice are insulin-resistant, hypertensive, and short-lived, he and his co-authors explained. Having unlimited access to food makes the animals prone to cancer, type-2 diabetes, and renal failure; it alters their gene expression in substantial ways; and it leads to cognitive decline. And there's reason to believe that ragged and rundown rodents will respond differently—abnormally, even—to experimental drugs.
by Daniel Engber, Slate | Read more:
Illustration by Rob Donnelly.
Friend Me, or Else
If you think privacy settings on your Facebook and Twitter accounts guarantee future employers or schools can't see your private posts, guess again.
Employers and colleges find the treasure-trove of personal information hiding behind password-protected accounts and privacy walls just too tempting, and some are demanding full access from job applicants and student athletes.
In Maryland, job seekers applying to the state's Department of Corrections have been asked during interviews to log into their accounts and let an interviewer watch while the potential employee clicks through wall posts, friends, photos and anything else that might be found behind the privacy wall.
Previously, applicants were asked to surrender their user name and password, but a complaint from the ACLU stopped that practice last year. While submitting to a Facebook review is voluntary, virtually all applicants agree to it out of a desire to score well in the interview, according to Maryland ACLU legislative director Melissa Coretz Goemann.
Student-athletes in colleges around the country also are finding out they can no longer maintain privacy in Facebook communications because schools are requiring them to "friend" a coach or compliance officer, giving that person access to their “friends-only” posts. Schools are also turning to social media monitoring companies with names like UDilligence and Varsity Monitor for software packages that automate the task. The programs offer a "reputation scoreboard" to coaches and send "threat level" warnings about individual athletes to compliance officers.
A recent revision in the handbook at the University of North Carolina is typical:
"Each team must identify at least one coach or administrator who is responsible for having access to and regularly monitoring the content of team members’ social networking sites and postings,” it reads. "The athletics department also reserves the right to have other staff members monitor athletes’ posts."
All this scrutiny is too much for Bradley Shear, a Washington D.C.-lawyer who says both schools and employers are violating the First Amendment with demands for access to otherwise private social media content.
"I can't believe some people think it's OK to do this,” he said. “Maybe it's OK if you live in a totalitarian regime, but we still have a Constitution to protect us. It's not a far leap from reading people's Facebook posts to reading their email. ... As a society, where are we going to draw the line?"
by Bob Sullivan, MSNBC | Read more:
Image via: Mashable