Monday, November 5, 2012
Looking Into the Future
“Can AIDS be cured?” That was the question being whispered in the back rooms and satellite meetings of the 19th International AIDS Conference, held in Washington, DC, this week. The conference’s formal business was to keep up the momentum behind the most successful public-health campaign of the past 30 years: the taming, at the cost of a few pills a day, of an infection that was once an inevitable killer. It still kills. About 1.7m people succumbed last year. But that figure is down from 2.3m in 2005 (see chart 1), and is expected to continue falling. Now, therefore, some people are starting to look beyond the antiretroviral (ARV) drugs which have brought this success. They are asking if something else could do even better.
The drugs work, and are getting cheaper by the year: a report released during the conference by the Clinton Foundation, an American global-health charity, put the annual cost of treatment at $200; it used to be $10,000. But once on them, you are on them for life. Stop, and the virus crawls out of cellular hidey-holes that ARVs cannot reach and rapidly reinfects you. This has implications both for patients, whose lives are constrained by the need for constant medication, and taxpayers, who bear most of the cost of this indefinite treatment.
Many of those taxpayers do not live in the rich world but in the worst-afflicted countries. A new estimate by UNAIDS, the United Nations agency charged with combating the disease, suggests that more than half of the cost of treating and preventing AIDS is now borne by these countries, rather than paid for by international agencies (see chart 2). As many of these countries have high economic growth rates, that is only right and proper. But it does mean that they, too, have a strong interest in a cure. And researchers would like to provide them with one.
The road to Berlin
A race is therefore on to work out how to flush the virus from its hiding places and get rid of it completely. Several clues suggest a cure may be possible. But no one knows which route will lead to it.
One of those routes passes through Timothy Brown. Mr Brown, pictured above, is known as the Berlin patient. He was living in that city in 2007 when he underwent radical treatment for leukaemia. This required the destruction of his immune system—the source of the cancer—and its replacement using stem cells transplanted from the bone marrow of a donor, which allowed him to grow a new (but alien) immune system.
Mr Brown did not just have leukaemia. He was also infected with HIV. So his doctor, with his permission, tried an experiment. The doctor searched for and found a donor who had a rare genetic mutation which confers immunity to HIV infection by disabling a protein on cell surfaces to which the virus attaches itself in order to gain entry to a cell.
After the transplant, the virus seemed to disappear from Mr Brown’s body. Traces of viral genes were found recently, but these may have been contamination, and in any case they did not amount to entire, working viruses. There is no disputing, however, that Mr Brown no longer needs drugs to stay healthy, and has not needed them for five years.
No one is suggesting immune-system transplants as a treatment for AIDS. They are far too dangerous and costly. The intriguing point about Mr Brown’s procedure is that it would have been expected to destroy directly only one of the hiding places of the virus: immune-system cells squirrelled away in a quiescent state as the system’s memory. (These allow it to recognise and respond to infections experienced in the past.) Other reservoirs, particularly certain brain cells, would not have been affected directly—and in Mr Brown’s case checking his brain to find out what is going on would be grossly unethical.
Clearly, it is dangerous to draw conclusions from a single example. But if quiescent memory cells are the main source of viral rebound, that would simplify the task of finding a cure. And many groups of researchers are trying to do just that, by waking up the memory cells so that ARVs can get at the virus within them.
by The Economist | Read more:
Photo: Eyevine
The Visitor
It has always been a key step on the ‘way’ or ‘path’ in Taoist philosophy (‘way’ being the literal translation of Tao) to go into the wilderness and lay oneself bare to whatever one finds there, whether that be the agonies of St Anthony, or the detachment of the Taoist masters. Alone in the wild, we shed the conventions that keep society ticking over — freedom from the clock, in particular, is a hugely important factor. We are opened up to other, less conventional, customs: in the wild, animals may talk to us, birds will sometimes guide us to water or light, the wind may become a second skin. In the wild, we may even find our true bodies, creaturely and vivid and indivisible from the rest of creation — but this comes only when we break free, not just from the constraints of clock and calendar and social convention, but also from the sometimes-clandestine hopes, expectations and fears with which we arrived.
For many of us, solitude is tempting because it is ‘the place of purification’, as the Israeli philosopher Martin Buber called it. Our aspiration for travelling to that place might be the simple pleasure of being away, unburdened by the pettiness and corruption of the day-to-day round. For me, being alone is about staying sane in a noisy and cluttered world — I have what the Canadian pianist Glenn Gould called a ‘high solitude quotient’ — but it is also a way of opening out a creative space, to give myself a chance to be quiet enough to see or hear what happens next.
There are those who are inclined to be purely temporary dwellers in the wilderness, who don’t stay long. As soon as they are renewed by a spell of lonely contemplation, they are eager to return to the everyday fray. Meanwhile, the committed wilderness dwellers are after something more. Yet, even if contemplative solitude gives them a glimpse of the sublime (or, if they are so disposed, the divine), questions arise immediately afterwards. What now? What is the purpose of this solitude? Whom does it serve?
To take oneself out into the wilderness as part of a spiritual quest is one thing, but to remain there in a kind of barren ecstasy is another. The Anglo-American mystic Thomas Merton argues that ‘there is no greater disaster in the spiritual life than to be immersed in unreality, for life is maintained and nourished in us by our vital relation with realities outside and above us. When our life feeds on unreality, it must starve.’ If practised as part of a living spiritual path, he says, and not simply as an escape from corruption or as an expression of misanthropy, ‘your solitude will bear immense fruit in the souls of men you will never see on earth’. It is a point Ralph Waldo Emerson, Thoreau’s friend and teacher, also makes. Solitude is essential to the spiritual path, he argues, but ‘we require such solitude as shall hold us to its revelations when we are in the streets and in palaces … it is not the circumstances of seeing more or fewer people but the readiness of sympathy that imports’.
by John Burnside, Aeon | Read more:
Illustration: Sarah Maycock
The Permanent Militarization of America
In 1961, President Dwight D. Eisenhower left office warning of the growing power of the military-industrial complex in American life. Most people know the term the president popularized, but few remember his argument.
In his farewell address, Eisenhower called for a better equilibrium between military and domestic affairs in our economy, politics and culture. He worried that the defense industry’s search for profits would warp foreign policy and, conversely, that too much state control of the private sector would cause economic stagnation. He warned that unending preparations for war were incongruous with the nation’s history. He cautioned that war and warmaking took up too large a proportion of national life, with grave ramifications for our spiritual health.
The military-industrial complex has not emerged in quite the way Eisenhower envisioned. The United States spends an enormous sum on defense — over $700 billion last year, about half of all military spending in the world — but in terms of our total economy, it has steadily declined to less than 5 percent of gross domestic product from 14 percent in 1953. Defense-related research has not produced an ossified garrison state; in fact, it has yielded a host of beneficial technologies, from the Internet to civilian nuclear power to GPS navigation. The United States has an enormous armaments industry, but it has not hampered employment and economic growth. In fact, Congress’s favorite argument against reducing defense spending is the job loss such cuts would entail.
Nor has the private sector infected foreign policy in the way that Eisenhower warned. Foreign policy has become increasingly reliant on military solutions since World War II, but we are a long way from the Marines’ repeated occupations of Haiti, Nicaragua and the Dominican Republic in the early 20th century, when commercial interests influenced military action. Of all the criticisms of the 2003 Iraq war, the idea that it was done to somehow magically decrease the cost of oil is the least credible. Though it’s true that mercenaries and contractors have exploited the wars of the past decade, hard decisions about the use of military force are made today much as they were in Eisenhower’s day: by the president, advised by the Joint Chiefs of Staff and the National Security Council, and then more or less rubber-stamped by Congress. Corporations do not get a vote, at least not yet.
But Eisenhower’s least heeded warning — concerning the spiritual effects of permanent preparations for war — is more important now than ever. Our culture has militarized considerably since Eisenhower’s era, and civilians, not the armed services, have been the principal cause. From lawmakers’ constant use of “support our troops” to justify defense spending, to TV programs and video games like “NCIS,” “Homeland” and “Call of Duty,” to NBC’s shameful and unreal reality show “Stars Earn Stripes,” Americans are subjected to a daily diet of stories that valorize the military while the storytellers pursue their own opportunistic political and commercial agendas. Of course, veterans should be thanked for serving their country, as should police officers, emergency workers and teachers. But no institution — particularly one financed by the taxpayers — should be immune from thoughtful criticism.
Like all institutions, the military works to enhance its public image, but this is just one element of militarization. Most of the political discourse on military matters comes from civilians, who are more vocal about “supporting our troops” than the troops themselves. It doesn’t help that there are fewer veterans in Congress today than at any previous point since World War II. Those who have served are less likely to offer unvarnished praise for the military, for it, like all institutions, has its own frustrations and failings. But for non-veterans — including about four-fifths of all members of Congress — there is only unequivocal, unhesitating adulation. The political costs of anything else are just too high.
by Aaron B. O'Connell, NY Times | Read more:
Photo: Wikipedia
Sunday, November 4, 2012
Buzz Off
[ed. As if real mosquitoes aren't irritating enough...]
Is this a mosquito?
No. It’s an insect spy drone for urban areas, already in production, funded by the US Government. It can be remotely controlled and is equipped with a camera and a microphone. It can land on you, and it may have the potential to take a DNA sample or leave RFID tracking nanotechnology on your skin. It can fly through an open window, or it can attach to your clothing until you take it in your home.
via:
Tracking the Trackers
The life of a politician in campaign mode is brutal. Go here, say this, go there, say that, smile, smile, smile, smile, shake hands, remember policy positions, learn new policy positions, learn talking points, learn names, attend the next rally, the next 7 a.m. breakfast, the next evening debate, the next lunchtime forum, keep your bladder in check, keep your libido in check, kiss ass, kiss babies, kiss spouse who is perfect and without whom etc., fundraise, fundraise, fundraise, and through all of it, never make a mistake, ever.
Not easy. But now consider the job of the person who has to constantly follow this politician around. Not this politician's pen- and Purell-carrying body man. Not the spokesperson who keeps the media at bay. Someone else. Someone from the opposing party, someone whose job is literally just to follow this politician everywhere and record everything that happens. The tracker.
If it takes a certain kind of fanatical drive to be a politician running for high office—and it does—then it takes a slightly different but equally fanatical drive to be the person who watches that politician, day in and day out, for an entire campaign season. It takes a guy like, say, Keith Schipper.
Schipper is 25 years old, he's a Republican, and on this day in March he's trying to talk his way into an event being put on by the Democratic candidate for governor, Jay Inslee, in an office park in Kent. Schipper's small Canon HD video camera is stashed in the pocket of his coat, ready to be pulled out in an instant. His rap about the people's right to know is cued up.
No dice. Inslee's people made Schipper the second he walked in the door. They've researched him, and they've researched their rights. This green-vehicle-manufacturing company is unquestionably private property, and Schipper's not welcome.
He gets the boot and gamely heads back to his messy green Nissan Pathfinder. No big loss. There will be a public Inslee event soon, no doubt, and Schipper will be there, by rights un-ejectable. I follow him out into the parking lot because I'm curious, and as Schipper drives off, I notice a University of Washington sticker on his back window.
Schipper studied political science and philosophy at the UW. I know what he studied because I decided to track Schipper a bit after that first encounter. Researched his history. Watched him at political events. Noted the tin of chew he keeps in the right pocket of his pants. Followed his Twitter feed, where he talks of "pounding Monsters on a long drive home from Spokane" and boasts that "sicking the police on a bunch of #UW students may very well end up being my most favorite thing I did in this election cycle."
I didn't just track him surreptitiously. I tried to get an interview with Schipper through his bosses at the state Republican Party but was ignored. I also tried to message him through Facebook. No answer. But that was fine. As Schipper knows, a core truth of tracker life is that the person you're following will show up in public eventually.
It's odd, though, the coyness of trackers. They're supposedly devoted to the idea that nothing should be hidden from the voters anymore, but they're not exactly eager to have themselves described to voters. Maybe it's because they don't want to become the story and distract from whatever campaign narrative they're trying to push. Maybe they know that tracking comes off as unseemly to a lot of people. Maybe they want to try to avoid having "Shame on you!" shouted at them at events, as happened to a Democratic tracker in Florida recently (video seemed to show her leaving the event, a memorial for Vietnam veterans, crying). Or perhaps it's just that trackers are so intimately familiar with how quickly one captured moment can come to define a person—like the moment that solidified the current obsession with tracking candidates, Republican Senate candidate George Allen's "Macaca Moment" on the campaign trail in Virginia several elections ago.
On that day in August 2006, at a campaign stop, Allen pointed at a Democratic tracker who had been following him everywhere and who happened to be Indian American. He said, "This fellow here over here with the yellow shirt, Macaca or whatever his name is, he's with my opponent, he's following us around everywhere." Video of Allen losing his cool went viral, he lost the election, and the rest is tracker history.
It's the kind of moment all trackers now hope to capture, a moment not unlike the one that a certain still-anonymous individual captured earlier this year at a private Romney fundraiser in Florida at which the candidate talked about 47 percent of Americans acting like "victims" who can't be bothered to "take personal responsibility and care for their lives." And just like the person who captured that "47 percent" remark, most trackers (and their handlers) remain reluctant to take a bow in public. When I called the state Democratic Party and asked them to put me in touch with their gubernatorial tracker, Zach Wurtz—aka "Zach the Track"—no one was very excited about the idea. But I kept shaking the tree, and one day earlier this month, I got a text from Wurtz telling me that he would be at an upcoming forum featuring Inslee and the Republican candidate for governor, Rob McKenna. I made it my business to be there.
Your Employee Is an Online Celebrity. Now What Do You Do?
Meet your newest management headache: the co-branded employee.

A growing number of professionals are using social media to build a personal, public identity—a brand of their own—based on their work. Think of an accountant who writes a widely read blog about auditing, or a sales associate who has attracted a big following online by tweeting out his store's latest deals.
Co-branded employees may exist largely below the radar now, but that's changing fast, and employers need to start preparing for the ever-greater challenges they pose for managers, co-workers and companies. Their activities can either complement a company's own brand image or clash with it. Companies that fail to make room for co-branded employees—or worse yet, embrace them without thinking through the implications—risk alienating or losing their best employees, or confusing or even burning their corporate brand.
Part of this change is generational. Younger employees show up on the job with an existing social-media presence, which they aren't about to abandon—especially since they see their personal brands lasting longer than any single job or career.
Social-media services like LinkedIn and Facebook also encourage users to build networks and share their professional as well as personal expertise. And increasingly, companies are recognizing that these activities have a business value. When a management consultant leads a large LinkedIn group, he builds a valuable source of referrals and recruitment prospects; when a lawyer tweets the latest legal news, she positions her firm as the go-to experts in that field. How can an employer resist?
And yet, there is a downside: Co-branded employees can raise tough questions about how to contain their online activities—and how to compensate them. It also isn't easy for managers to balance responsibilities among the bloggers and nonbloggers within a team. And it takes an effort to make sure employees' brands align with the company's.
To ensure that co-branded employees benefit a company, rather than undermine it, managers need to consider these questions:
America Gone Wild
This year, Princeton, N.J., has hired sharpshooters to cull 250 deer from the town's herd of 550 over the winter. The cost: $58,700. Columbia, S.C., is spending $1 million to rid its drainage systems of beavers and their dams. The 2009 "miracle on the Hudson," when US Airways flight 1549 had to make an emergency landing after its engines ingested Canada geese, saved 155 passengers and crew, but the $60 million A320 Airbus was a complete loss. In the U.S., the total cost of wildlife damage to crops, landscaping and infrastructure now exceeds $28 billion a year ($1.5 billion from deer-vehicle crashes alone), according to Michael Conover of Utah State University, who monitors conflicts between people and wildlife.
The resurgence of wildlife in the U.S. has led to an increase in conflict between wildlife and people.
Those conflicts often pit neighbor against neighbor. After a small dog in Wheaton, Ill., was mauled by a coyote and had to be euthanized, officials hired a nuisance wildlife mitigation company. Its operator killed four coyotes and got voice-mail death threats. A brick was tossed through a city official's window, city-council members were peppered with threatening emails and letters, and the FBI was called in. After Princeton began culling deer 12 years ago, someone splattered the mayor's car with deer innards.
Welcome to the nature wars, in which Americans fight each other over too much of a good thing—expanding wildlife populations produced by our conservation and environmental successes. We now routinely encounter wild birds and animals that our parents and grandparents rarely saw. As their numbers have grown, wild creatures have spread far beyond their historic ranges into new habitats, including ours. It is very likely that in the eastern United States today more people live in closer proximity to more wildlife than anywhere on Earth at any time in history.
In a world full of eco-woes like species extinctions, this should be wonderful news—unless, perhaps, you are one of more than 4,000 drivers who will hit a deer today, or your child's soccer field is carpeted with goose droppings, or feral cats have turned your bird feeder into a fast-food outlet, or wild turkeys have eaten your newly planted seed corn, or beavers have flooded your driveway, or bears are looting your trash cans. And that's just the beginning.
by Jim Sterba, WSJ | Read more:
Illustration: Jesse Lenz
Saturday, November 3, 2012
The Art of Waiting
It's spring when I realize that I may never have children, and around that time the thirteen-year cicadas return, burrowing out of neat, round holes in the ground to shed their larval shells, sprout wings, and fly to the treetops, filling the air with the sound of their singular purpose: reproduction. In the woods where I live, an area mostly protected from habitat destruction, the males’ mating song, a vibrating, whooshing, endless hum, a sound at once faraway and up-close, makes me feel like I am living inside a seashell.

Near the river, where the song is louder, their discarded larval shells—translucent amber bodies, weightless and eerie—crunch underfoot on my daily walks. Across the river, in a nest constructed near the top of a tall, spindly pine, bald eagles take turns caring for two new eaglets. Baby turtles, baby snakes, and ducklings appear on the water. Under my parents’ porch, three feral cats give birth in quick succession. And on the news, a miracle pregnancy: Jamani, an eleven-year-old female gorilla at the North Carolina Zoo, is expecting, the first gorilla pregnancy there in twenty-two years. (...)
Like ours, the animal world is full of paradoxical examples of gentleness, brutality, and suffering, often performed in the service of reproduction. Female black widow spiders sometimes devour their partners after a complex and delicate mating dance. Bald eagle parents, who mate for life and share the responsibility of rearing young, will sometimes look on impassively as the stronger eaglet kills its sibling. At the end of their life cycle, after swimming thousands of miles in salt water, Pacific salmon swim up their natal, freshwater streams to spawn, while the fresh water decays their flesh. Animals will do whatever it takes to ensure reproductive success.
For humans, “whatever it takes” has come to mean in vitro fertilization (IVF), a procedure developed in the 1970s that involves the hormonal manipulation of a woman’s cycle followed by the harvest and fertilization of her eggs, which are transferred as embryos to her uterus. Nearly 4 million babies worldwide have been born through IVF, which has become a multibillion-dollar industry.
“Test-tube baby,” says another woman at the infertility support group, a young ER doctor who has given herself five at-home inseminations and is thinking of moving on to IVF. “I really hate that term. It’s a baby. That’s all it is.” She has driven seventy miles to talk to seven other women about the stress and isolation of infertility.
In the clinics, they call what the doctors and lab technicians do ART—assisted reproductive technology—softening the idea of the test-tube baby, the lab-created human. Art is something human, social, nonthreatening. Art does not clone or copy, but creates. It is often described as priceless, timeless, healing. It is far from uncommon to spend large amounts of money on art. It’s an investment.
All of these ideas soothe, whether we think them through or not, just as the experience of treating infertility, while often painful and undignified, soothes as well. For the woman, treating infertility is about nurturing her body, which will hopefully produce eggs and a rich uterine lining where a fertilized egg could implant. All of the actions she might take in a given month—abstaining from caffeine and alcohol, taking Clomid or Femara, injecting herself with Gonal-f or human chorionic gonadotropin, charting her temperature and cervical mucus on a specialized calendar—are essentially maternal, repetitive, and self-sacrificing. In online message boards, where women gather to talk about their Clomid cycles and inseminations and IVF cycles, a form of baby talk is used to discuss the organs and cells of the reproductive process. Ovarian follicles are “follies,” embryos are “embies,” and frozen embryos—the embryos not used in an IVF cycle, which are frozen for future tries—are “snowbabies.” The frequent ultrasounds given to women in a treatment cycle, which monitor the growth of follicles and the endometrial lining, are not unlike the ultrasounds of pregnant women in the early stages of pregnancy. There is a wand, a screen, and something growing.
And always: something more to do, something else to try. It doesn’t take long, in an ART clinic, to spend tens of thousands of dollars on tests, medicine, and procedures. When I began to wonder why I could not conceive, I said the most I would do was read a book and chart my temperature. My next limit was pills: I would take them, but no more than that. Next was intrauterine insemination, a relatively inexpensive and low-tech procedure that requires no sedation. Compared to the women in my support group, women who leave the room to give themselves injections in the hospital bathroom, I’m a lightweight. Often during their discussions of medications and procedures I have no idea what they’re talking about, and part of the reason I attend each month is to listen to their horror stories. I’m hoping to detach from the process, to see what I could spare myself if I gave up.
But after three years of trying, it’s hard to give up. I know that it would be better for the planet if I did (if infinitesimally so), better for me, in some ways, as a writer. Certainly giving up makes financial sense. Years ago, when I saw such decisions as black or white, right or wrong, I would have felt it was selfish and wasteful to spend thousands of dollars on unnecessary medical procedures. Better, the twenty-two-year-old me would have argued, to donate the money to an orphanage or a children’s hospital. Better to adopt.
The thirty-four-year-old me has careful but limited savings, knows how difficult adoption is, and desperately wants her body to work the way it is supposed to.
by Belle Boggs, Orion | Read more:
Art: Lorna Stevens
Once the Wild is Gone
Conservationists love charismatic species such as elephants. They appear on brochures, websites, and logos. The catastrophic decline in elephant numbers due to illegal hunting in the 1970s (and again now) provides one of the longest-running and most clear-cut stories about the plight of wildlife in the modern world. Who could forget the images of elephant carcasses, with their tusks removed, rotting in the bush? Or the huge pile of confiscated ivory set on fire by Daniel Arap Moi, Kenya’s President, in 1989?
Tourists also love elephants, and wildlife holidays in game reserves and parks offer a deeply romantic experience of wild creatures and people in apparent harmony in a remote, unspoiled land. In establishing protected areas for species such as elephants, conservation creates special places where the normal destructive rules of engagement between people and nature do not seem to apply.
However, nature reserves and national parks — or, in broad terms, ‘protected areas’ — are much more than a romantic idea. In the Anthropocene era, humankind is an increasingly dominant ecological force across the planet, from the tropics to the poles. Biodiversity is in decline everywhere, and the human impact on nature includes over-harvesting and overfishing, agricultural intensification and the growth of cities, toxic chemicals, ocean acidification, climate change, and many others. There is a real possibility of reaching ‘tipping points’, or changes that cause permanent shifts in the state of global ecological systems.
The loss of global biodiversity is the focus of huge efforts by charitable foundations, non-governmental organisations, and governments. The nature of the challenge is widely researched and, broadly, well-understood, yet international biodiversity targets are not being met. Recognising this, parties to the Convention on Biological Diversity pledged in 2010 to create more and better protected areas (at sea as well as on land). This is the familiar strategy of setting aside spaces for nature, which has dominated modern conservation since the late 19th century. (...)
Part of the problem is biological. Protected areas such as national parks do help preserve the animals and plants inside them, if the areas are large enough. Yet, despite the fact that there has been a huge increase in both the number and extent of protected areas through the 20th century, biodiversity loss has continued apace, accelerating in many regions. What is going wrong?
The problem is that protected areas become ecological islands. In the 1960s, a famous series of experiments on patterns of extinction and immigration were conducted in the islets of the Florida Keys by EO Wilson and his student Daniel Simberloff. Their findings became the basis of the ‘theory of island biogeography’. Simply put, islands lose species: the smaller the island, the faster they are lost. Since then, ecologists have recognised that these islands of habitat need not be surrounded by a sea of water. In Amazonia, ecologists conducted experiments on land that had been converted from forest to farms: islands of trees in a sea of dirt. They preserved square blocks of forest of different dimensions and studied the effect on diversity. Edge effects — the increase of sun, wind and weeds at the boundary between forest and cleared land — changed the microclimate of the forest, and species were lost. The smaller the remnant forest patch, the faster the species disappeared.
Landscape ecology, the science of animal populations, and studies of ecological networks all point the same way. Small protected areas surrounded by land without suitable habitat will not be sufficient to protect global biodiversity. And for large mammals, a park that is ‘too small’ might in fact be very large indeed. One of the greatest conservation challenges in Africa is to manage elephants, whose enormous ranges cannot be contained even in the greatest of parks.
One response is to seek more and bigger reserves, or to build corridors between them (‘more, bigger, better and joined’ was the slogan of a UK Government report Making Space for Nature in 2010). Yet, at most, a protected area strategy will create biodiverse islands on a fraction of the Earth’s surface (perhaps 17 per cent), leaving the rest of the Earth (to which humanity is restricted) radically transformed, and perhaps permanently impoverished in diversity.
Science is not the only critic of protected areas. They are often resisted and subverted by the people who have to live with them as neighbours. To understand why so many people around the world feel a burning resentment of protected areas from which they are excluded, we need to know more about their history, which starts in the 19th century — the heyday of empire and expansion of the Western world.
The Parenting Trap
Under no circumstances are you to cut this out and stick it on the fridge door. Or put it in the file marked “Kids’ Stuff.” There’s nothing here for you. Nothing to do, nothing to act on. No consciousness-raising or attitude-flipping. No strategies or slogans. There is no help. And absolutely no solace. Because, really, what the world doesn’t need now is any more advice on raising children. We’re done with the finger wagging and the head patting. We’ve tried everything and we’ve read everything. We’ve asked, tweeted, blogged, prayed, and read it all. We’ve sat up at night and commiserated with other parents when we should have been having sex or at least paying off the sleep deficit. We’ve done everything, and still it’s like a cinnamon-and-lavender-scented Gettysburg out there.
Why don’t we just stop trying and do nothing? Because nothing can’t make us and the kids feel any worse than we feel now.
I have two lots of kids, a boy and a girl and a boy and a girl. They neatly bookend my responsibilities as a parent. The elder girl is in her last year of college. The youngest two are just starting the times table and phonetics, and the older boy is somewhere in Southeast Asia, on what he calls his “gap life,” collecting infections and tattoos of what he thinks are Jim Morrison lyrics written in pretty, curly, local languages but in fact probably say, “I like cock.”
Having spent a great deal of money to educate the first two, I realized along the way that I’ve learned nothing. But then, none of us have any idea what we’re doing. That’s right, none of us know anything. I stand at the school gates and watch the fear in the eyes of other fathers. The barely contained panic as they herd their offspring, already looking like hobbit Sherpas, carrying enormous schoolbags full of folders and books and photocopied letters and invitations to birthdays and concerts and playdates and football and after-school math clubs. You know my younger kids carry more paperwork than I do? And my job is paperwork. And they can’t read.
In the 100 years since we really got serious about education as a universally good idea, we’ve managed to take the 15 years of children’s lives that should be the most carefree, inquisitive, and memorable and fill them with a motley collection of stress and a neurotic fear of failure. Education is a dress-up box of good intentions, swivel-eyed utopianism, cruel competition, guilt, snobbery, wish fulfillment, special pleading, government intervention, bureaucracy, and social engineering. And no one is smart enough now to understand how we can stop it. Parents have no rational defense against the byzantine demands of the education-industrial complex. But this multi-national business says that they’re acting in the children’s best interests. And we can only react emotionally to the next Big Idea or the Cure or the Shortcut to Happiness.
by A.A. Gill, Vanity Fair | Read more:
Photos: Left, from Classic Stock/The Image Works, Right, Hulton-Deutsch Collection/Corbis; Digital Colorization by Lorna Clark.