Sunday, July 17, 2011
The Trillions of Microbes That Call Us Home
by Michael Tennesen
In the intensive care nursery at Duke University Medical Center, doctors and nurses attend to premature infants in rows of incubators surrounded by ventilators and monitors. As new parents holding packages of breast milk watch their tiny babies, neonatologist Susan LaTuga makes her rounds, checking vital signs and evaluating how the infants tolerate feeding. She consults with nurses, dietitians, and pharmacists about the course of the day’s treatment for the babies, some of whom weigh as little as one pound and were born as much as 17 weeks early.
At the end of her shift, LaTuga stops at a freezer and inspects stool samples from some of the infants that are at the center of a remarkable new study. Across the Duke campus, technicians are waiting to analyze them with a powerful gene sequencer capable of penetrating the hidden world of the billions of microorganisms growing inside each infant.
LaTuga is one of several medical researchers at Duke working with microbial ecologists to study the development of the human microbiome—the enormous population of microbes, including bacteria, fungi, and viruses, that live in the human body, predominantly in the gut. There are 20 times as many of these microbes as there are cells in the body, up to 200 trillion in an adult, and each of us hosts at least 1,000 different species. Seen through the prism of the microbiome, a person is not so much an individual human body as a superorganism made up of diverse ecosystems, each teeming with microscopic creatures that are essential to our well-being. “Our hope is that if we can understand the normal microbial communities of healthy babies, then we can manipulate unhealthy ones,” LaTuga says.
The Duke study is just one of many projects begun in the past five years that use genetic sequencing to explore how the diversity of the microbiome impacts our health. Two of the largest efforts are the Human Microbiome Project, funded by the National Institutes of Health, and the European Union’s Metagenomics of the Human Intestinal Tract. Although these groups have only just begun to publish their findings, it is already clear that the microbiome is much more complex and very likely more critical to human health than anyone suspected. Understanding and controlling the diversity of our germs, as opposed to assaulting them with antibiotics, could be the key to a range of future medical treatments.
In-depth analysis of the human body’s microflora has been possible only in the past few years—a by-product of the same new gene sequencing techniques that have allowed scientists to cheaply and accurately identify the DNA of the human genome. “Gene sequencing has opened a huge door to how complex these communities are,” says Patrick Seed, a Duke pediatrician specializing in infectious disease, who with biologist Rob Jackson is a lead investigator of the premature infant study.
Before sequencing was available at a reasonable price, microbes were identified by growing them in a petri dish. But “not all microbes will grow in culture,” LaTuga says. “It identifies only about 20 percent of the microbes in the gut.”
Like a lush rain forest, a healthy microbiome in the human gut is a diverse ecosystem that thrives only when all the interdependent species are healthy too. “In an ecological sense, more diverse communities are healthy on land and in the seas,” Jackson says. “No one species is dominant, and the ecosystem is more productive and resistant to major changes.” The comparison is more than just a convenient analogy. Jackson was studying microbial communities around the world, including in the Amazon, when he realized that the ecological balance in those environments was not so different from the balance present in a healthy human gut. (One of his more counterintuitive findings is that microbial communities are more biodiverse in the American Plains than in the Amazon rain forest.)
Jackson’s work on microbial diversity caught the attention of Seed, who was already interested in the microbiome in the guts of preterm infants but who did not have a background in ecology. He sought out Jackson, and the two decided to collaborate on what they call the Preemie Microbiome Project. The Duke medical researchers and ecologists who have joined that project hope to identify which species flourish in early stages of the human microbiome, how they are influenced by the consumption of breast milk, and what role they play in critical diseases affecting infants as well as in chronic diseases that occur later in life.
“The classical view of infectious disease is that a single organism invades and produces an infection,” Seed says. “But then we found that certain diseases, like irritable bowel syndrome, seem to be caused by imbalances in the organisms that communicate with the host. So then people asked, ‘Why is this not the case for many other states of human health?’ ” Preliminary work by other groups, similarly made up of both biomedical researchers and microbial ecologists, suggests that imbalances in the microbiome might also be linked to allergies, diabetes, and obesity.
Read more:
Cinemagraphs
by Elizabeth Flock
It’s somewhere between a photo and a video, a piece of artwork that seeks to perfectly capture a fleeting moment in time.
New York City-based photographer Jamie Beck and Web designer Kevin Burg “hand-stitch” together her photos and his Web design to make animated gifs they now call “cinemagraphs.”
Beck and Burg first started making cinemagraphs at Fashion Week in New York earlier this year, spending a day or two to capture Vogue magazine’s Anna Wintour examining the catwalk, or the fluttering of a fashion model’s hair. People loved their work so much that Beck and Burg soon expanded to longer narratives, about how Dogfish Head Brewery makes its beer (below), or profiles that captured a moment in the life of a couple in love in Brooklyn (above). Companies even asked for their help in making food come alive.
“It’s taken over our whole lives,” Beck says.
Read more:
Digital Meltdown
by Tracy Mayor
I am not cool. My husband is not cool. But like a pair of nags that has somehow managed to produce thoroughbreds, we have cool children. So cool, in fact, that the older one managed to secure for himself an invitation to Google+ -- Google's new social networking space and would-be Facebook killer -- on the first day it launched.
Because we have taught him to be compassionate and take pity on the uncool, he shared a Google+ invitation with me. The moment was the digital equivalent of his preschool days, when he'd arrive home to proudly gift me with a handmade object of unknown utility. "This is lovely," I'd say, my heart swelling as I considered the lump carefully, trying to figure if it looked more like a candy dish or a paper clip holder. "What's it for?"
When I ask the 17-year-old version of that boy what Google+ is for, he says -- texts, actually -- "its pretty sick, there're a lot of cool features thatll be awesome once more people get on. like better chatting and you can really control who sees what."
Alrighty then. Feeling positively hip, I head over, activate my invitation, upload a good-hair-day picture and type in a few simple words for my profile that seem to fit well with the spare, airy Google interface: "Writer, editor, public school advocate, parent, lover, friend, walker of dog." So far, so good.
I click "Circles" and a lovely row of them appears for me to populate -- Friends, Family, Acquaintances, Following and one helpfully left blank for me to label (Frenemies? Mean Girls? Former Crushes?) -- along with a phrase in red awaiting my click: "Find and Invite (560)."
Whoa, Google+ wants to find and invite 560 of my contacts? Hold up. Even though my son would tell me, with eyes rolling, that only losers click on "find all" menu options like that, it's a potent reminder that I'm starting down the slippery slope of adding yet another social medium into my already overwhelmed digital life.
Read more:
We’re Spent
by David Leonhardt
There is no shortage of explanations for the economy’s maddening inability to leave behind the Great Recession and start adding large numbers of jobs: The deficit is too big. The stimulus was flawed. China is overtaking us. Businesses are overregulated. Wall Street is underregulated.
But the real culprit — or at least the main one — has been hiding in plain sight. We are living through a tremendous bust. It isn’t simply a housing bust. It’s a fizzling of the great consumer bubble that was decades in the making.
The auto industry is on pace to sell 28 percent fewer new vehicles this year than it did 10 years ago — and 10 years ago was 2001, when the country was in recession. Sales of ovens and stoves are on pace to be at their lowest level since 1992. Home sales over the past year have fallen back to their lowest point since the crisis began. And big-ticket items are hardly the only problem.
The Federal Reserve Bank of New York recently published a jarring report on what it calls discretionary service spending, a category that excludes housing, food and health care and includes restaurant meals, entertainment, education and even insurance. Going back decades, such spending had never fallen more than 3 percent per capita in a recession. In this slump, it is down almost 7 percent, and still has not really begun to recover.
The past week brought more bad news. Retail sales in June were weaker than expected, and consumer confidence fell, causing economists to downgrade their estimates for economic growth yet again. It’s a familiar routine by now. Forecasters in Washington and on Wall Street keep saying the recovery’s problems are temporary — and then they redefine temporary.
If you’re looking for one overarching explanation for the still-terrible job market, it is this great consumer bust. Business executives are only rational to hold back on hiring if they do not know when their customers will fully return. Consumers, for their part, are coping with a sharp loss of wealth and an uncertain future (and many have discovered that they don’t need to buy a new car or stove every few years). Both consumers and executives are easily frightened by the latest economic problem, be it rising gas prices or the debt-ceiling impasse.
Earlier this year, Charles M. Holley Jr., the chief financial officer of Wal-Mart, said that his company had noticed consumers were often buying smaller packages toward the end of the month, just before many households receive their next paychecks. “You see customers that are running out of money at the end of the month,” Mr. Holley said.
In past years, many of those customers could have relied on debt, often a home-equity line of credit or a credit card, to tide them over. Debt soared in the late 1980s, 1990s and the last decade, which allowed spending to grow faster than incomes and helped cushion every recession in that period.
Now, the economic version of the law of gravity is reasserting itself. We are feeling the deferred pain from 25 years of excess, as people try to rebuild their depleted savings. This pattern is a classic one. The definitive book about financial crises has become “This Time Is Different: Eight Centuries of Financial Folly,” published in 2009 with exquisite timing, by Carmen M. Reinhart, now of the Peterson Institute for International Economics, and Kenneth S. Rogoff, of Harvard.
Read more:
Saturday, July 16, 2011
The Beekeeper's Lament
by Maggie Koerth-Baker
What's killing the bees? After reading The Beekeeper's Lament—Hannah Nordhaus' lyrical, haunting book about the complicated lives and deaths of America's honeybees—my question has shifted more towards, "Good lord, what doesn't kill bees?"
Domesticated bees turn out to be some amazingly fragile creatures. In fact, Nordhaus writes, bees were delicate even before the modern age of industrial farming. It wasn't until the second half of the 19th century that humans were able to reliably domesticate bees. Even then, beekeeping was anything but a stable business to be in. But in the last decade, the job has gotten harder, and the bee deaths have piled up faster. Bees are killed by moths and mites, bacteria and viruses, heat and cold. They're killed by the pesticides used on the plants they pollinate, and by the other pesticides used to protect them from murderous insects. And they're killed by the almond crop, which draws millions of bees from all over the nation to one small region of California, where they join in an orgy of pollination and another of disease sharing.
None of this negates the seriousness of Colony Collapse Disorder, that still-mysterious ailment that reduced more than 1/3 of America's healthy beehives to empty boxes in 2007. But what Nordhaus does (and does well) is put those famous losses into a broader context. Colony Collapse Disorder is a problem. But it isn't the problem. Instead, it's just a great big insult piled on top of an already rising injury rate. Saving the honeybee isn't just about figuring out CCD. Bees were already in trouble before that came along. In the years since 2007, Nordhaus writes, bees have died at a rate higher than the expected and "acceptable" 15% annual loss, but those deaths weren't always caused by CCD.
The picture of bee maladies that Nordhaus paints isn't a pretty one. The bees continue to be extremely important to our national food system, and they continue to die in numbers that are far more vast than the normally high death rates beekeepers have always dealt with. Worse, there's no easy answer. At least not one that scientific evidence has been able to pin down yet. If you're looking for a simple solution—if you want somebody to justify your pet explanation, whether pesticides, or GMOs, or totally natural causes that have nothing to do with modern farming practices—then you probably won't like what Nordhaus has to say.
But if you're interested in the real complexity behind the headlines, you're in luck. There's so much going on in this book, details that are vitally important to understanding how modern beekeeping works and what happens when it fails, and which almost never make it into the short articles and TV segments. Nordhaus doesn't even really start talking about Colony Collapse Disorder in an in-depth way until chapter 6. And that's a good thing. By the time you get to that chapter, it's clear that she couldn't have written about it any sooner. There's too much context that you need to understand before you can really make sense of that hot-button issue.
Better yet, Nordhaus manages to wrap all that nuance up in some of the best narrative and storytelling I've had the pleasure of reading since Rebecca Skloot's The Immortal Life of Henrietta Lacks. Like Skloot, Nordhaus owes some of the credit to the fact that her primary source is a fabulous character to hang a story on. John Miller, the professional beekeeper whose work and adventures set the stage for Nordhaus' reporting, is curmudgeonly and charming, hard-headed and hilarious. He's a conservative farmer who likes fast cars, loves his bees, and writes Nordhaus emails that read like Zen koans. Even when it's clear that some of the practices that keep people like Miller in business are also hurting the bee populations, it's hard not to root for him, as a person.
Nordhaus puts the bee panic into perspective, and Miller puts a human face on the complexities and contradictions behind it. Before you build a beehive, before you post another Internet forum message about what absolutely just has to be killing the bees, you must read this book.
The Beekeeper's Lament: How One Man and Half a Billion Honey Bees Help Feed America by Hannah Nordhaus
Image: Return of the Bee, a Creative Commons Attribution Non-Commercial (2.0) image from mightyboybrian's photostream
Pat Metheny
From his new acoustic album, What's It All About.
Ali Now
by Cal Fussman
MUHAMMAD ALI came through the double doors into the living room of his hotel suite on slow, tender steps. I held out my hand. He opened his arms. Ali lowered himself into a wide, soft chair, and I sat on an adjacent sofa.
"I've come," I said, "to ask about the wisdom you've taken from all you've been through." Ali seemed preoccupied with his right hand, which was trembling over his right thigh, and he did not speak.
"George Foreman told me that you were the most important man in the world. When I asked him why, he said that when you walked into a room, it didn't matter who was there--presidents, prime ministers, CEOs, movie stars--everybody turned toward you. The most famous person in that room was wondering, Should I go to meet him? Or stay here? He said you were the most important man in the world because you made everybody else's heart beat faster."
The shaking in Ali's right hand seemed to creep above his elbow. Both of his arms were quivering now, and his breaths were short and quick.
I leaned in awkwardly, not knowing quite what to do. Half a minute passed in silence. I wondered if I should call for his wife.
Ali stooped over, and now his whole body was trembling and his breaths were almost gasps.
"Champ! You okay? You okay?"
Ali's head lifted and slowly turned to me with the smile of an eight-year-old.
"Scared ya, huh?" he said.
Ali was in Dublin for the opening ceremonies of the Special Olympics. A van pulled up outside the hotel, and it was with much effort that he slowly lifted himself into the front seat. And yet as soon as the driver pulled out onto the road, the left-hand side of the road, Ali was waving his arms in a childish portent of doom and gasping, "Head-on collision! Head-on collision!"
There were four of us in the back: Ali's wife, Lonnie, who is fourteen years younger than Ali and who grew up across the street from his childhood home in Louisville; his best friend, Howard Bingham, a photographer he met in 1962 and who's snapped more pictures of Ali than anyone has ever taken of anybody; and a businessman named Harlan Werner, who's organized public appearances for Ali during the last sixteen years.
Jet lag had come to dinner with all of us as we took our table near a flower-filled courtyard at Ernie's Restaurant. Ali asked for a felt-tipped pen, and while the rest of us talked he pushed his plate and silverware out of the way, spread out his cloth napkin, and began to draw on it. First he sketched a boxing ring, then two stick figures, one of which he labeled "Ali" in a cartoon bubble and the other "Frazier." Then he began to set in the crowd around the ring in the form of dots. Tens of dots, then hundreds of dots, then thousands, his right hand driving the pen down again and again as if to say: And he saw it! And he saw it! And she saw it! And he saw it!
Occasionally, Ali would stop, examine his canvas for empty space, and then go on, jackhammering in more dots. It must have taken him more than twenty minutes to squeeze all of humanity onto the napkin. Then he signed his name inside a huge heart and handed it to me.
"Thank you," I said.
"That'll be ten dollars, please," he whispered.
And that was the way it was with Ali. You just couldn't help laughing, even when you knew he'd play that same line off the next hundred people. No matter how many times he told a fresh face, "You ain't as dumb as you look," it worked.
But watching him eat was awkward. His right hand picked up a piece of lamb and trembled as he tried to bring it to his lips. He did not eat very much. His sciatic nerve was paining him, he was tired, and it was best to go back to the hotel.
On the way to the door, the diners at the other tables--every one of them--stood and applauded. Ali moved gingerly, almost painfully, and then, all of a sudden, he bit his bottom lip, took a mock swing at one of the applauding men, and the restaurant roared with laughter.
Read more:
That Stalling Feeling
by Nouriel Roubini
NEW YORK – Despite the series of low-probability, high-impact events that have hit the global economy in 2011, financial markets continued to rise happily until a month or so ago. The year began with rising food, oil, and commodity prices, giving rise to the specter of high inflation. Then massive turmoil erupted in the Middle East, further ratcheting up oil prices. Then came Japan’s terrible earthquake, which severely damaged both its economy and global supply chains. And then Greece, Ireland, and Portugal lost access to credit markets, requiring bailout packages from the International Monetary Fund and the European Union.
But that was not the end of it. Although Greece was bailed out a year ago, Plan A has now clearly failed. Greece will require another official bailout – or a bail-in of private creditors, an option that is fueling heated disagreement among European policymakers.
Lately, concerns about America’s unsustainable fiscal deficits have, likewise, resulted in ugly political infighting, almost leading to a government shutdown. A similar battle is now brewing about America’s “debt ceiling,” which, if unresolved, introduces the risk of a “technical” default on US public debt.
Until recently, markets seemed to discount these shocks; apart from a few days when panic about Japan or the Middle East caused a correction, they continued their upward march. But, since the end of April, a more persistent correction in global equity markets has set in, driven by worries that economic growth in the United States and worldwide may be slowing sharply.
Data from the US, the United Kingdom, the periphery of the eurozone, Japan, and even emerging-market economies is signaling that part of the global economy – especially advanced economies – may be stalling, if not dropping into a double-dip recession. Global risk-aversion has also increased, as the option of further “extend and pretend” or “delay and pray” on Greece is becoming less desirable, and the specter of a disorderly workout is becoming more likely.
Optimists argue that the global economy has merely hit a “soft patch.” Firms and consumers reacted to this year’s shocks by “temporarily” slowing consumption, capital spending, and job creation. As long as the shocks don’t worsen (and as some become less acute), confidence and growth will recover in the second half of the year, and stock markets will rally again.
But there are good reasons to believe that we are experiencing a more persistent slump. First, the problems of the eurozone periphery are in some cases problems of actual insolvency, not illiquidity: large and rising public and private deficits and debt; damaged financial systems that need to be cleaned up and recapitalized; massive loss of competitiveness; lack of economic growth; and rising unemployment. It is no longer possible to deny that public and/or private debts in Greece, Ireland, and Portugal will need to be restructured.
Read more:
Friday, July 15, 2011
Friday Book Club - The Liars' Club
Imagine you are a child of 7 and this is your sharpest memory: "Our family doctor knelt before me where I sat on a mattress on the bare floor. . . . He was pulling at the hem of my favorite nightgown. . . . 'Show me the marks,' he said. 'Come on, now. I won't hurt you.' "
Thus opens "The Liars' Club," Mary Karr's haunting memoir of growing up in East Texas in the early 1960's, virtually motherless, and fiercely seeking to understand her parents, their lives and their relationship to her sister and herself.
Daddy drank every day, but "he never missed a day of work in 42 years at the plant; never cried -- on the morning after -- that he felt some ax wedged in his forehead; never drew his belt from his pant loops to strap on us or got weepy over cowboy songs the way some guys down at the Legion did." Mother was a different story. "Looking back from this distance, I can also see Mother trapped in some way, stranded in her own silence. How small she seems in her silk dress, drinking stale coffee."
A reader could conclude that no one speaks in this memoir except the narrator, and that would be almost true. But even mute, this mother is the story; give or take a few exceptions, she's the whole story. Charlie Marie Moore Karr, a k a Mother, is a huge enigma that by her very presence, her silent, raging sadness and fierce passions dominates the family. She is an enigma not only to her daughters and husband, but to the set of children whom she abandoned years before giving birth to Mary and her older sister, Lecia, and whose existence she has held as a corrosive secret. And she has remained an enigma to everyone, including the six men she has married and divorced, even Daddy, J. P. Karr, whom she married twice.
The Liars' Club turns out to be just a place where the men meet on their days off to play dominoes and drink in the back room of the bait shop. Mary Karr's father is mainly just a regular guy. It is her mother who takes on enormous, suffocating dimension.
As Mother rarely speaks, it is left to the imagination of the daughters to attempt to translate her silences. While Daddy, who works in the oilfields of Leechfield, where Agent Orange is manufactured, has a sweet steady Texas grit, Mother has what her daughter calls East Coast longings. She is too refined for Texas, and is "adjudged more or less permanently Nervous." Born in West Texas, she had gone to New York, where she spent her youth and first marriages and went to the opera and to museums. Back in East Texas, she reads Camus and Sartre and tries to throw herself out of speeding cars while drunk.
In Mary's eyes, the most admirable thing about Charlie is that she's a painter. Daddy and his card-playing buddies in the Liars' Club build her a studio in the back of their house, and the first thing she paints on her visits home from caring for her own mother is "a portrait of Grandma . . . from a Polaroid taken just before Grandma lost the leg."
Shortly before the major catastrophe that's about to happen to these girls, Ms. Karr notes, "I see Mother's face wearing that thousand-yard stare. . . . The back door she's staring through opens on a wet black night." Charlie is immeasurably, palpably sad. Her art, in the end, is not enough to hold her -- nor is any art. She just reads Tolstoy, plays old Bessie Smith records and cries.
Read more:
Everyone's So Smart Now!
by Catherine Rampell
We’ve written before about some of the work of Stuart Rojstaczer and Christopher Healy, grade inflation chroniclers extraordinaire. They have put together a new, comprehensive study of college grading over the decades, and let me tell you, it is a doozy.
The researchers collected historical data on letter grades awarded by more than 200 four-year colleges and universities. Their analysis (published in the Teachers College Record) confirms that the share of A grades awarded has skyrocketed over the years. Take a look at the red line in the chart below, which refers to the share of grades given that are A’s:
Chart: Stuart Rojstaczer and Christopher Healy. Note: 1940 and 1950 (nonconnected data points in figure) represent averages from 1935 to 1944 and 1945 to 1954, respectively. Data from 1960 onward represent annual averages in their database, smoothed with a three-year centered moving average.
Most recently, about 43 percent of all letter grades given were A’s, an increase of 28 percentage points since 1960 and 12 percentage points since 1988. The distribution of B’s has stayed relatively constant; the growing share of A’s instead comes at the expense of a shrinking share of C’s, D’s and F’s. In fact, only about 10 percent of grades awarded are D’s and F’s.
As we have written before, private colleges and universities are by far the biggest offenders on grade inflation, even when you compare private schools to equally selective public schools. Here’s another chart showing the grading curves for public versus private schools in the years 1960, 1980 and 2007:
Chart: Stuart Rojstaczer and Christopher Healy. Note: 1960 and 1980 data represent averages from 1959–1961 and 1979–1981, respectively.
As you can see, public and private school grading curves started out as relatively similar, and gradually pulled further apart. Both types of institutions made their curves easier over time, but private schools made their grades much easier.
Read more:
ps. Thanks to Hairpin for the great title.
Thanks for Sharing
by Felix Salmon
Is there a company in the world which isn’t trying to “harness and leverage the power of social media to amplify our brand” or somesuch? I’m a pretty small fish in the Twitter pond, and I get asked on a very regular basis to talk to various marketing types about how they should be using Twitter. A smart organization with a big Twitter presence, then, will naturally start trying to leverage its ability to leverage Twitter by putting together sophisticated presentations full of “insights to help marketers align their content-sharing strategies” and the like. Which is exactly what the New York Times has just done.
The slideshow can be found here, and it’s worth downloading just to see how many photos the NYT art department could find of good-looking young people looking happy in minimalist houses. But it actually includes some interesting insights, too, which were spelled out at a conference yesterday by Brian Brett of the NYT Customer Research Group.
The survey claims to be the first of its kind on why people share content, which is a very good question. A large part of how people enjoy themselves online these days is by creating and sharing content, which is both exciting and a little bit scary for anybody in a media organization. And the NYT methodology was fun, too: aside from the standard surveys and interviews, they asked a bunch of people who don’t normally share much to spend a week sharing a lot; and they also asked a lot of heavy sharers to spend a week sharing nothing. (“It was like quitting smoking,” said one, “only harder”.)
The first striking insight is about the degree to which the act of sharing deepens understanding. It’s not at all surprising to learn that 85% of people say that they use other people’s responses to help them understand and process information — in fact 100% of people do that, and they’ve been doing it for centuries. We always react to news and information in large part by looking at how other people react to it.
But more interesting is the fact that 73% of people say that the simple act of sharing a piece of information with others makes them likely to process that information more deeply and thoughtfully. It’s like writing things down to remember them: the more you engage with something, the more important and salient it becomes to you.
Don't Be Evil
By Evgeny Morozov
July 13, 2011
In the Plex: How Google Thinks, Works, and Shapes Our Lives
By Steven Levy
The Googlization of Everything (And Why We Should Worry)
By Siva Vaidhyanathan
I.
For cyber-optimists and cyber-pessimists alike, the advent of Google marks off two very distinct periods in Internet history. The optimists remember the age before Google as chaotic, inefficient, and disorganized. Most search engines at the time had poor ethics (some made money by misrepresenting ads as search results) and terrible algorithms (some could not even find their parent companies online). All of that changed when two Stanford graduate students invented an ingenious way to rank Web pages based on how many other pages link to them. Other innovations spurred by Google—especially its novel platform for selling highly targeted ads—have created a new “ecosystem” (the optimists’ favorite buzzword) for producing and disseminating information. Thanks to Google, publishers of all stripes—from novice bloggers in New Delhi to media mandarins in New York—could cash in on their online popularity.
Cyber-pessimists see things quite differently. They wax nostalgic for the early days of the Web when discovery was random, and even fun. They complain that Google has destroyed the joy of serendipitous Web surfing, while its much-celebrated ecosystem is just a toxic wasteland of info-junk. Worse, it’s being constantly polluted by a contingent of “content farms” that produce trivial tidbits of information in order to receive a hefty advertising paycheck from the Googleplex. The skeptics charge that the company treats information as a commodity, trivializing the written word and seeking to turn access to knowledge into a dubious profit-center. Worst of all, Google’s sprawling technology may have created a digital panopticon, making privacy obsolete.
Both camps like to stress that Google is a unique enterprise that stands apart from the rest of Silicon Valley. The optimists do this to convince the public that the company’s motives are benign. If only we could bring ourselves to trust Google, their logic goes, its bright young engineers would deliver us the revolutionary services that we could never expect from our governments. The pessimists make a more intriguing case: for them, the company is so new, sly, and fluid, and the threats that it poses to society are so invisible, insidious, and monumental, that regulators may not yet have the proper analytical models to understand its true market and cultural power. That our anachronistic laws may be incapable of treating such a complex entity should not deter us from thwarting its ambitions.
These are not mutually exclusive positions. History is rife with examples of how benign and humanistic ideals can yield rather insidious outcomes—especially when backed by unchecked power and messianic rhetoric. The real question, then, is whether there is anything truly exceptional about Google’s principles, goals, and methods that would help it avoid this fate.
IS GOOGLE’S EXCEPTIONALISM genuine? On the surface, the answer seems self-evident. The company’s collegial working environment, its idealistic belief that corporations can make money without dirtying their hands, its quixotic quest to organize all of the world’s information, its founders’ contempt for marketing and conventional advertising—everything about the company screams, “We are special!” What normal company warns investors—on the very day of its initial public offering!—that it is willing to “forgo some short-term gains” in order to do “good things for the world”?
As Google’s ambitions multiply, however, its exceptionalism can no longer be taken for granted. Two new books shed light on this issue. Steven Levy had unrivaled access to Google’s executives, and In the Plex is a colorful journalistic account of the company’s history. Levy’s basic premise is that Google is both special and crucial, while the battle for its future is also a battle for the future of the Internet. As Levy puts it, “To understand this pioneering company and its people is to grasp our technological destiny.” What the German poet Friedrich Hebbel said of nineteenth-century Austria—that it is “a little world where the big one holds its tryouts”—also applies to Google. Siva Vaidhyanathan’s book is a far more intellectually ambitious project that seeks to document the company’s ecological footprint on the public sphere. Unlike Levy, Vaidhyanathan seeks to place Google’s meteoric rise and exceptionalism in the proper historical, cultural, and regulatory contexts, and suggests public alternatives to some of Google’s ambitious projects.
Even though both writers share the initial premise that, to quote Vaidhyanathan, Google is “nothing like anything we have seen before,” they provide different explanations of Google’s uniqueness. Levy opts for a “great man of history” approach and emphasizes the idealism and the quirkiness of its two founders. The obvious limitation of Levy’s method is that he pays very little attention to the broader intellectual context—the ongoing scholarly debates about the best approaches to information retrieval and the utility (and feasibility) of artificial intelligence—that must have shaped Google’s founders far more than the Montessori schooling system that so excites him.
Vaidhyanathan, while arguing that Google is “such a new phenomenon that old metaphors and precedents don’t fit the challenges the company presents to competitors and users,” posits that its power is mostly a function of recent developments in the information industry as well as of various market and public failures that occurred in the last few decades. Quoting the Marxist theorist David Harvey, Vaidhyanathan argues that the fall of communism in Eastern Europe and the resulting euphoria over “the end of history” and the triumph of neoliberalism has made the “notion of gentle, creative state involvement to guide processes toward the public good ... impossible to imagine, let alone propose.” Moreover, the growing penetration of market solutions into sectors that were traditionally managed by public institutions—from fighting wars to managing prisons and from schooling to health care—has made Google’s forays into digitizing books appear quite normal, set against the dismal state of public libraries and the continued sell-out of higher education to the highest corporate bidder. Thus Vaidhyanathan arrives at a rather odd and untenable conclusion: that Google is indeed exceptional—but its exceptionalism has little to do with Google.
Google’s two founders appear to firmly believe in their own exceptionalism. They are bold enough to think that the laws of sociology and organizational theory—for example, that most institutions, no matter how creative, are likely to end up in the “iron cage” of highly rationalized bureaucracy—do not apply to Google. This belief runs so deep that for a while they tried to run the company without middle managers—with disastrous results. Google’s embarrassing bouts of corporate autism—those increasingly frequent moments when the company is revealed to be out of touch with the outside world—stem precisely from this odd refusal to acknowledge its own normality. Time and again, its engineers fail to anticipate the loud public outcry over the privacy flaws in its products, not because they lack the technical knowledge to patch the related problems but because they have a hard time imagining an outside world where Google is seen as just another greedy corporation that might have incentives to behave unethically.
Read more:
July 13, 2011
In the Plex: How Google Thinks, Works, and Shapes Our Lives
By Steven Levy
The Googlization of Everything (And Why We Should Worry)
By Siva Vaidhyanathan
I.
For cyber-optimists and cyber-pessimists alike, the advent of Google marks off two very distinct periods in Internet history. The optimists remember the age before Google as chaotic, inefficient, and disorganized. Most search engines at the time had poor ethics (some made money by misrepresenting ads as search results) and terrible algorithms (some could not even find their parent companies online). All of that changed when two Stanford graduate students invented an ingenious way to rank Web pages based on how many other pages link to them. Other innovations spurred by Google—especially its novel platform for selling highly targeted ads—have created a new “ecosystem” (the optimists’ favorite buzzword) for producing and disseminating information. Thanks to Google, publishers of all stripes—from novice bloggers in New Delhi to media mandarins in New York—could cash in on their online popularity.
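As a rough illustration of the link-based idea described above (a page ranks higher when many pages link to it, and links from highly ranked pages count for more), here is a minimal PageRank-style sketch in Python. It is not code from either book, nor from Google itself; the tiny example web, the damping factor, and the iteration count are hypothetical choices made only for illustration.

    # Minimal PageRank-style sketch: rank pages by who links to them,
    # weighting links from highly ranked pages more heavily.
    def pagerank(links, damping=0.85, iterations=50):
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}  # start with equal rank
        for _ in range(iterations):
            # every page keeps a small baseline share of rank
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, outgoing in links.items():
                if not outgoing:
                    # a page with no outgoing links spreads its rank evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / len(pages)
                else:
                    # otherwise it passes its rank along its outgoing links
                    for target in outgoing:
                        new_rank[target] += damping * rank[page] / len(outgoing)
            rank = new_rank
        return rank

    # Hypothetical four-page web, purely for illustration.
    web = {
        "news.example": ["blog.example", "shop.example"],
        "blog.example": ["news.example"],
        "shop.example": ["news.example", "blog.example"],
        "home.example": [],
    }
    for page, score in sorted(pagerank(web).items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))

Run as-is, the most heavily linked page (news.example) comes out on top, which is the intuition the optimists credit to Google’s founders.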
Cyber-pessimists see things quite differently. They wax nostalgic for the early days of the Web when discovery was random, and even fun. They complain that Google has destroyed the joy of serendipitous Web surfing, while its much-celebrated ecosystem is just a toxic wasteland of info-junk. Worse, it’s being constantly polluted by a contingent of “content farms” that produce trivial tidbits of information in order to receive a hefty advertising paycheck from the Googleplex. The skeptics charge that the company treats information as a commodity, trivializing the written word and seeking to turn access to knowledge into a dubious profit-center. Worst of all, Google’s sprawling technology may have created a digital panopticon, making privacy obsolete.
Both camps like to stress that Google is a unique enterprise that stands apart from the rest of Silicon Valley. The optimists do this to convince the public that the company’s motives are benign. If only we could bring ourselves to trust Google, their logic goes, its bright young engineers would deliver us the revolutionary services that we could never expect from our governments. The pessimists make a more intriguing case: for them, the company is so new, sly, and fluid, and the threats that it poses to society are so invisible, insidious, and monumental, that regulators may not yet have the proper analytical models to understand its true market and cultural power. That our anachronistic laws may be incapable of treating such a complex entity should not deter us from thwarting its ambitions.
These are not mutually exclusive positions. History is rife with examples of how benign and humanistic ideals can yield rather insidious outcomes—especially when backed by unchecked power and messianic rhetoric. The real question, then, is whether there is anything truly exceptional about Google’s principles, goals, and methods that would help it avoid this fate.
II.
IS GOOGLE’S EXCEPTIONALISM genuine? On the surface, the answer seems self-evident. The company’s collegial working environment, its idealistic belief that corporations can make money without dirtying their hands, its quixotic quest to organize all of the world’s information, its founders’ contempt for marketing and conventional advertising—everything about the company screams, “We are special!” What normal company warns investors—on the very day of its initial public offering!—that it is willing to “forgo some short-term gains” in order to do “good things for the world”?
As Google’s ambitions multiply, however, its exceptionalism can no longer be taken for granted. Two new books shed light on this issue. Steven Levy had unrivaled access to Google’s executives, and In the Plex is a colorful journalistic account of the company’s history. Levy’s basic premise is that Google is both special and crucial, while the battle for its future is also a battle for the future of the Internet. As Levy puts it, “To understand this pioneering company and its people is to grasp our technological destiny.” What the German poet Friedrich Hebbel said of nineteenth-century Austria—that it is “a little world where the big one holds its tryouts”—also applies to Google. Siva Vaidhyanathan’s book is a far more intellectually ambitious project that seeks to document the company’s ecological footprint on the public sphere. Unlike Levy, Vaidhyanathan seeks to place Google’s meteoric rise and exceptionalism in the proper historical, cultural, and regulatory contexts, and suggests public alternatives to some of Google’s ambitious projects.
Even though both writers share the initial premise that, to quote Vaidhyanathan, Google is “nothing like anything we have seen before,” they provide different explanations of Google’s uniqueness. Levy opts for a “great man of history” approach and emphasizes the idealism and the quirkiness of its two founders. The obvious limitation of Levy’s method is that he pays very little attention to the broader intellectual context—the ongoing scholarly debates about the best approaches to information retrieval and the utility (and feasibility) of artificial intelligence—that must have shaped Google’s founders far more than the Montessori schooling system that so excites him.
Vaidhyanathan, while arguing that Google is “such a new phenomenon that old metaphors and precedents don’t fit the challenges the company presents to competitors and users,” posits that its power is mostly a function of recent developments in the information industry as well as of various market and public failures that occurred in the last few decades. Quoting the Marxist theorist David Harvey, Vaidhyanathan argues that the fall of communism in Eastern Europe and the resulting euphoria over “the end of history” and the triumph of neoliberalism have made the “notion of gentle, creative state involvement to guide processes toward the public good ... impossible to imagine, let alone propose.” Moreover, the growing penetration of market solutions into sectors that were traditionally managed by public institutions—from fighting wars to managing prisons and from schooling to health care—has made Google’s forays into digitizing books appear quite normal, set against the dismal state of public libraries and the continued sell-out of higher education to the highest corporate bidder. Thus Vaidhyanathan arrives at a rather odd and untenable conclusion: that Google is indeed exceptional—but its exceptionalism has little to do with Google.
Google’s two founders appear to firmly believe in their own exceptionalism. They are bold enough to think that the laws of sociology and organizational theory—for example, that most institutions, no matter how creative, are likely to end up in the “iron cage” of highly rationalized bureaucracy—do not apply to Google. This belief runs so deep that for a while they tried to run the company without middle managers—with disastrous results. Google’s embarrassing bouts of corporate autism—those increasingly frequent moments when the company is revealed to be out of touch with the outside world—stem precisely from this odd refusal to acknowledge its own normality. Time and again, its engineers fail to anticipate the loud public outcry over the privacy flaws in its products, not because they lack the technical knowledge to patch the related problems but because they have a hard time imagining an outside world where Google is seen as just another greedy corporation that might have incentives to behave unethically.