Wednesday, November 7, 2012

I-502: The End of Prohibition

Washington enthusiastically leapt into history Tuesday, becoming the first state, with Colorado, to reject federal drug-control policy and legalize recreational marijuana use.

Initiative 502 was winning 55 to 45 percent, with support from more than half of Washington's counties, rural and urban.

The vote puts Washington and Colorado to the left of the Netherlands on marijuana law, and makes them the nexus of a new social experiment with uncertain consequences. National and international media watched as vote counts rolled into I-502's election-night party in Seattle amid jubilant cheers.

"I'm going to go ahead and give my victory speech right now. After this I can go sit down and stop shaking," said Alison Holcomb, I-502's campaign manager and primary architect.

"Today the state of Washington looked at 75 years of national marijuana prohibition and said it is time for a new approach," she said.

As of Dec. 6, it will no longer be illegal for adults 21 and over to possess an ounce of marijuana. A new "drugged driving" law for marijuana impairment also kicks in then.

Tuesday's vote also begins a yearlong process for the state Liquor Control Board to set rules for heavily taxed and regulated sales at state-licensed marijuana stores, which are estimated to raise $1.9 billion in new revenue over five years.

Many legal experts expect the U.S. Justice Department, which remained silent during presidential-year politics, to push back and perhaps sue to block I-502 based on federal supremacy.

But Seattle City Attorney Pete Holmes said Seattle's U.S. Attorney Jenny Durkan told him Tuesday the federal government "has no plans, except to talk."

by Jonathan Martin, Seattle Times |  Read more:
Image: Wikipedia

Tuesday, November 6, 2012

Skyfall: The New Serious

The rapture inspired by Skyfall in critics and public alike might have surprised Bond fans of the past. For the franchise's 23rd installment lacks what some would have considered its quintessential ingredient.

What used to distinguish 007 from previous thriller heroes was his unique brand of ironic detachment. Ian Fleming's books demanded to be taken straight. The earlier films mocked their source material's vanity, as well as the thriller genre, love, death and Her Majesty's secret service. Their studied cheesiness mocked the mockery itself.

In Skyfall, Daniel Craig's Bond delivers a scattering of old-style quips, but the chronic flippancy from which they used to spring has disappeared. Indeed, the film's lack of larkiness is the point of one of the cracks. Ben Whishaw's Q, favouring practicality over hilarity, offers Bond only a gun and a radio tracker. When this produces a raised eyebrow, he says: "Were you expecting an exploding pen? We don't really go in for that any more." Thus frugally equipped, our hero confronts a world pervaded by guilt, doubt, grief and foreboding rather than the joshing sadism of his previous outings.

The asperity of that world is no novelty for Craig's 007. In Casino Royale, he suffered the humiliation of being tortured in the nude. Even more startlingly, he declared himself unambiguously in love. Quantum of Solace provided him with the psychological driver for his behaviour that had previously been considered unnecessary.

Still, James Bond is not the only screen hero to have sobered up. When his mirthfulness was at its height, it infected his big-screen beefcake peers. In the 1990 version of Total Recall, Arnold Schwarzenegger tries to outdo Bond in homicidal gibes. Impaling an enemy on a drill, he remarks: "Screw you!" When his wife tells him he can't hurt her because they are married, he shoots her in the forehead and says: "Consider that a divorce." This summer's reworking of the story, on the other hand, was glumly earnest, offering social and political allusions in place of flippancy.

The cheery Batman of 1966 has become the grim and agonised Dark Knight. Prometheus aspired to a portentousness of which Alien felt no need. The teen-flick turned sombre in The Hunger Games, while The Amazing Spider-Man spent so much time grappling with existential angst that he had little left for derring-do. Inception presented more of a mental puzzle than a white-knuckle ride. Even Harry Potter felt obliged to exit amid such unalloyed grimness that there were fears he might scare the children.

Cinema still plays host to gross-out, farce and facetiousness; yet it is darkness, deliberation and doom that are doing some of the best business.

by David Cox, The Guardian |  Read more:
Photograph: Sportsphoto Ltd/Allstar

Eye Am a Camera: Surveillance and Sousveillance in the Glassage

Digital eye glasses like Google’s Project Glass, and my earlier Digital Eye Glass, will transform society because they introduce a two-sided surveillance and sousveillance.

Not only will authorities and shops be watching us and recording our comings and goings (surveillance as we know it today), but we will also be watching and recording them (sousveillance) through small wearable computers like Digital Eye Glass. This affects secrecy, not just privacy. As one of the early inventors and developers of wearable computing and of reality-augmenting and reality-mediating technologies, I was asked by TIME Tech to write about the history of these technologies and to offer predictions about their future.

Through the Glass

Society has entered the era of augmented and augmediated reality. Most of us use smartphones, which are, in some sense, wearable computers. Many smartphone apps overlay information onto the real world, and this is a good example of augmented reality. Augmediated reality serves to both augment and mediate our surroundings. Soon, the smartphone will become eyeglass-based so that these overlays can augment and mediate our everyday lives. Companies like Google and Apple will soon be bringing out products for wearable computing in everyday life. The intended purpose of these products is the hands-free display of information currently available to most smartphone users. The small screen on the glass flashes information right on cue, which, for instance, allows the eye glass wearer to get directions in the city, find a book in a store and even videoconference with a friend. (...)

Opposition from Authorities and Shops

In my high school days, the opposition to my technology was mostly due to peer pressure — a reaction to its overall strangeness in being ahead of its time. But now that this peer pressure has in fact reversed (individual people want this technology now), a new kind of opposition is emerging. This opposition comes not from peers, but from authorities and shops. The very authorities that are installing surveillance cameras on buildings and light posts are afraid of cameras being installed on people. For example, I was wearing my Digital Eye Glass while eating at McDonald’s and was suddenly physically assaulted by McDonald’s employees. See “Physical assault by McDonald’s for wearing Digital Eye Glass.” They claimed they were enforcing (as vigilantes, perhaps) a privacy law that does not even exist. See their side of the story in “Computerized seeing aids forbidden in McDonald’s.”

Although the Eye Glass is not a recording device, damage from such an attack causes it to retain temporarily stored data that would otherwise have been overwritten. In this sense, the perpetrators of the attack have made what would otherwise not have been a recording device into one.
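
Mann's description amounts to a small data-structure point: the device holds recent frames in a fixed-size buffer that continually overwrites itself, so in normal operation nothing persists. Below is a minimal sketch of that idea in Python; it is a hypothetical illustration of the mechanism, not the Eye Glass's actual firmware.

```python
from collections import deque

class GlassFrameBuffer:
    """Hypothetical sketch (not the Eye Glass's real firmware): a
    fixed-size ring buffer whose contents are continually overwritten,
    so the device is not a recording device in normal operation."""

    def __init__(self, capacity=30):          # roughly one second of video frames
        self.frames = deque(maxlen=capacity)  # newest frame displaces the oldest
        self.damaged = False

    def capture(self, frame):
        if not self.damaged:
            self.frames.append(frame)         # still cycling: nothing is kept for long

    def on_damage(self):
        """If the device is broken mid-operation, the overwrite cycle stops
        and whatever happened to be in the buffer at that moment survives."""
        self.damaged = True
        return list(self.frames)
```

On this reading, an assailant who damages the device freezes the buffer, which is precisely how an attack can turn a non-recording device into a recording one.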

Ironically, the very establishments that oppose wearable cameras are usually the places where lots of surveillance is used. Thus I coined the new word “McVeillance” to denote a highly mass-produced (“McDonaldized”) form of veillance, in the same way that a “McMansion” is a mass-produced mansion. McVeillance also implies a prohibition on individual veillance; for example, a prohibition on what we call “sousveillance”. The term “sousveillance” stems from the contrasting French words sur, meaning “above”, and sous, meaning “below”. So “surveillance” denotes the “eye-in-the-sky” watching from above, whereas “sousveillance” denotes bringing the camera or other means of observation down to human level, either physically (mounting cameras on people rather than on buildings), or hierarchically (ordinary people doing the watching, rather than higher authorities, large entities or architectures doing the watching).

Thus, McVeillance, for example, is the installation of a large number of security cameras in a restaurant while at the same time physically assaulting guests for using their own camera to photograph the menu.

by Steve Mann, Time Tech |  Read more:
Photo: Steve Mann

Monday, November 5, 2012

The Anatomies of Bureaucracy

"The underlying bureaucratic key is the ability to deal with boredom. To function effectively in an environment that precludes everything vital and human. To breathe, so to speak, without air. The key is the ability, whether innate or conditioned, to find the other side of the rote, the picayune, the meaningless, the repetitive, the pointlessly complex. To be, in a word, unborable. It is the key to modern life. If you are immune to boredom, there is literally nothing you cannot accomplish."
~ David Foster Wallace

Among the things that Hurricane Sandy draws to our attention are all of the bureaucratic forces that quietly and almost imperceptibly but decisively shape our lives and the world we inhabit: institutions like FEMA, City Hall, the NYPD, the Department of Sanitation, Con Edison, and so forth. Catastrophes tend to offer them a moment to step into the spotlight and either dazzle or utterly fail. One of the reasons their emergence in the public’s attention is interesting is that the work they do in non-catastrophic circumstances is so workmanlike and dull that it’s boring to even think about.

In one of the more amusing passages in David Foster Wallace’s The Pale King, a character mistakenly enters the wrong university classroom and finds himself developing an unexpected interest in accounting. The Jesuit accounting professor delivers remarkably fascinating reflections on the subject during his lectures, at one point making the following claim:

Enduring tedium over real time in a confined space is what real courage is… The truth is that the heroism of your childhood entertainments was not true valour. It was theatre. The grand gesture, the moment of choice, the mortal danger, the external foe, the climactic battle whose outcome resolves all – all designed to appear heroic, to excite and gratify an audience… Gentlemen, welcome to the world of reality – there is no audience. No one to applaud, to admire… actual heroism receives no ovation, entertains no one. No one queues up to see it. No one is interested.

The real heroes, it seems, are perhaps not those who make grand gestures or defeat foes but rather people like accountants: those who toil in obscurity and make the wheels of commerce and bureaucracy turn. Wallace called his last novel a “portrait of bureaucracy,” and the portrait it offers is both horrifying and hopeful. His work explores this dialectic of ecstasy and crushing boredom, and the relation of freedom and rigid structure. Most intriguing is the way he understands the ecstasies and freedoms to be found even in the most boring and structured of scenarios—like working for the IRS.

The question of whether he is actually endorsing bureaucracy remains an open one, but more interesting to consider are the heroic pleasures he insists exist in the boredom of being a cog in a machine. At the very least it provides him an occasion to test out his unyielding belief that “Almost anything that you pay close, direct attention to becomes interesting.” All of this is well-known to anyone who reads or reads about Wallace, and it’s one of his major contributions to literary and to some degree even political discourse.

His last novel got me thinking about how other writers have grappled with life in a bureaucracy. One of Whitman’s greatest poems, for instance, is about the generally boring and unconscious experience of commuting. Granted, crossing Brooklyn Ferry is probably more interesting than taking the subway, but still—part of what makes that poem about the profoundly human dimensions of the daily commute so interesting is that it takes place in the context of going to or coming home from one’s job.

Much more recently, “The Office” and Office Space wrung a fair amount of humor out of the boredom and fellow-feeling of a bureaucratic life. Part of what’s funny and even tender and moving about these works is that everyone in a bureaucracy is constantly desperately seeking ways to retrieve some human element from the otherwise crushing banalities of the workplace. The fleeting and/or enduring romances, for instance, are compelling because they emerge in the context of the featureless terrains of corporate America.

When I worked as a temp at a huge accounting firm in Chicago for a brief while several years ago, I remember being shocked to discover that my boss—a partner in the firm with a magisterial view of the city stretching out below his window—spent a fair amount of time playing solitaire on his computer (which I could see reflected in the window whenever I poked my head in to tell him something or other). The fact that even the bureaucrats poach time back from the machine is still a surprising thing to consider.

Orwell and Kafka are probably the first writers who come to mind, but for me, the really great tale of life in a bureaucracy is Melville’s “Bartleby, the Scrivener,” a “tale of Wall Street.” One of the things that makes Melville’s story so compelling is Bartleby’s strange relationship to the bureaucracy he is a part of (granted, it’s a small bureaucracy—a lawyer’s office—but it’s a bureaucracy in miniature, and a wheel within other wheels). Bartleby is both part of the bureaucracy and not; he seems indifferent to the whole thing. It’s hard to figure out what’s going on in his head at any given moment, and he seems to recognize the need to work but also not much to care, preferring simply “not to.” It’s not exactly saying No! in thunder. And it’s surprising how much Bartleby anticipates and informs later iterations. Turkey and Nippers cannot but remind one of the cast of peculiar characters in “The Office,” and of course the Lawyer (who narrates the story) contains in his bones the DNA of David Brent.

If to be unborable is the key to modern life, then there are many figures in literature that might help us think about living in a bureaucracy. Perhaps some of us have managed to escape the direct tentacles of the network of bureaucracies that surrounds us, but as Melville once said in a slightly different context, “who ain’t a slave?” We are all implicated in one way or another, whether we want to be or not. One question that underlies most of this discussion is about how we choose to inhabit this role, or perhaps, as Wallace would have it, what we choose to pay attention to. The deeper and perhaps more troubling question, though, is whether or not learning to find pleasure and interest in this role is to be complicit in some of the dehumanizing structures and forces that generate this very pleasure and interest. To be unbored is no doubt crucial to living a full and happy life, but is it, in short, a good thing?

I’m still not sure.

How Not to Abolish the Electoral College

Another U.S. presidential election is upon us, and once again the electoral college looms large as a threat to the legitimacy of government and people's faith in democracy. On the eve of what may be another split between the electoral college and the nationwide popular vote total, we are no closer to a direct popular election than we were twelve years ago when the winner was decided by the U.S. Supreme Court.

But that may not be such a bad thing for those of us who want to see the electoral college abolished. In fact, the best chance for abolition may lie in sharing the pain by reversing the party polarity of the 2000 split: i.e., for President Obama to win the electoral college and Mitt Romney to win the popular vote. With the likelihood that the electoral college will favor the Democrats for at least the next few elections, our best hope may lie in a split that infuriates Republicans so deeply that they would clamor for reform as Democrats did after 2000.

Perhaps the worst idea out there for ending the reign of the electoral college is an effort called the National Popular Vote Interstate Compact (NPVIC). The NPVIC reminds us of all that's wrong with the clause in the Constitution that leaves the choosing of the electors to the states. The more we mess with the state statutes governing the awarding of electoral votes, the more we may regress to a past when popular votes for U.S. President were not held at all by the states.

In my last column on the electoral college, I tried to overturn, with simple arithmetic, the widely-held myth that small states benefit from the electoral college. One encounters this myth everywhere including, most recently, Andrew Tanenbaum's widely-followed website electoral-vote.com. As I've argued in the past, the more partisan a state's presidential vote happens to be, the more that state will underperform in the electoral college, as opposed to the effect that that state would have on a nationwide popular vote, regardless of the size of the state. Thus, states like Utah, Wyoming, Idaho, and Alaska—usually among the most partisan in recent presidential elections—have a greater impact on the nationwide vote total than they do on the electoral college. Despite the obstacles, the safest, surest way to abolish the electoral college—without causing a host of new problems—is through constitutional amendment, not by the NPVIC, for reasons I will explain.
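
To make the arithmetic concrete, here is a minimal sketch using hypothetical round numbers, not actual returns: a state's electoral-college weight is fixed no matter how lopsided its vote, while its contribution to a nationwide popular-vote margin grows with both turnout and partisan tilt.

```python
# Hypothetical round numbers, not 2012 returns.
TOTAL_ELECTORAL_VOTES = 538

def ec_share(electoral_votes):
    """A state's share of the electoral college -- identical whether
    the state is carried by one vote or by forty points."""
    return electoral_votes / TOTAL_ELECTORAL_VOTES

def net_popular_margin(turnout, winner_share):
    """Net votes a state adds to the national margin in a two-candidate
    race: the winner's votes minus the loser's."""
    return round(turnout * (2 * winner_share - 1))

# A small, deeply partisan state: 250,000 voters, 70-30.
print(f"{ec_share(3):.4f}")                 # 0.0056 of the electoral college
print(net_popular_margin(250_000, 0.70))    # 100,000 net votes nationally

# A large swing state: 5,000,000 voters, 51-49.
print(f"{ec_share(29):.4f}")                # 0.0539 of the electoral college
print(net_popular_margin(5_000_000, 0.51))  # also 100,000 net votes
```

On these made-up figures, the lopsided small state moves a national popular-vote total exactly as much as the huge swing state, yet in the electoral college every one of its votes past 50 percent plus one is wasted; that is the sense in which partisan states underperform.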

by Jeff Strabone, 3 Quarks Daily |  Read more: 

Jon Measures
via:

Looking Into the Future

"Can AIDS be cured?" That was the question being whispered in the back rooms and satellite meetings of the 19th International AIDS Conference, held in Washington, DC, this week. The conference’s formal business was to keep up the momentum behind the most successful public-health campaign of the past 30 years: the taming, at the cost of a few pills a day, of an infection that was once an inevitable killer. It still kills. About 1.7m people succumbed last year. But that figure is down from 2.3m in 2005 (see chart 1), and is expected to continue falling. Now, therefore, some people are starting to look beyond the antiretroviral (ARV) drugs which have brought this success. They are asking if something else could do even better.

The drugs work, and are getting cheaper by the year: a report released during the conference by the Clinton Foundation, an American global-health charity, put the annual cost of treatment at $200; it used to be $10,000. But once on them, you are on them for life. Stop, and the virus crawls out of cellular hidey-holes that ARVs cannot reach and rapidly reinfects you. This has implications both for patients, whose lives are constrained by the need for constant medication, and taxpayers, who bear most of the cost of this indefinite treatment.

Many of those taxpayers do not live in the rich world but in the worst-afflicted countries. A new estimate by UNAIDS, the United Nations agency charged with combating the disease, suggests that more than half of the cost of treating and preventing AIDS is now borne by these countries, rather than paid for by international agencies (see chart 2). As many of these countries have high economic growth rates, that is only right and proper. But it does mean that they, too, have a strong interest in a cure. And researchers would like to provide them with one.

The road to Berlin

A race is therefore on to work out how to flush the virus from its hiding places and get rid of it completely. Several clues suggest a cure may be possible. But no one knows which route will lead to it.

One of those routes passes through Timothy Brown. Mr Brown, pictured above, is known as the Berlin patient. He was living in that city in 2007 when he underwent radical treatment for leukaemia. This required the destruction of his immune system—the source of the cancer—and its replacement using stem cells transplanted from the bone marrow of a donor, which allowed him to grow a new (but alien) immune system.

Mr Brown did not just have leukaemia. He was also infected with HIV. So his doctor, with his permission, tried an experiment. The doctor searched for and found a donor who had a rare genetic mutation which confers immunity to HIV infection by disabling a protein on cell surfaces to which the virus attaches itself in order to gain entry to a cell.

After the transplant, the virus seemed to disappear from Mr Brown’s body. Traces of viral genes were found recently, but these may have been contamination, and in any case they did not amount to entire, working viruses. There is no disputing, however, that Mr Brown no longer needs drugs to stay healthy, and has not needed them for five years.

No one is suggesting immune-system transplants as a treatment for AIDS. They are far too dangerous and costly. The intriguing point about Mr Brown’s procedure is that it would have been expected to destroy directly only one of the hiding places of the virus: immune-system cells squirrelled away in a quiescent state as the system’s memory. (These allow it to recognise and respond to infections experienced in the past.) Other reservoirs, particularly certain brain cells, would not have been affected directly—and in Mr Brown’s case checking his brain to find out what is going on would be grossly unethical.

Clearly, it is dangerous to draw conclusions from a single example. But if quiescent memory cells are the main source of viral rebound, that would simplify the task of finding a cure. And many groups of researchers are trying to do just that, by waking up the memory cells so that ARVs can get at the virus within them.

by The Economist |  Read more:
Photo: Eyevine

The Visitor


‘I have a great deal of company in my house; especially in the morning, when nobody calls.’ Henry David Thoreau’s remark about his experience of solitude expresses many of the common ideas we have about the work — and the apparent privileges — of being alone. As he put it so vividly in Walden (1854), his classic account of the time he spent alone in the Massachusetts woods, he went there to ‘live deep and suck out all the marrow of life’. Similarly, when I retreat into solitude, I hope to reconnect with a wider, more-than-human world and by so doing become more fully alive, recovering what the Gospel of Thomas called ‘he who was, before he came into being’.

It has always been a key step on the ‘way’ or ‘path’ in Taoist philosophy (‘way’ being the literal translation of Tao) to go into the wilderness and lay oneself bare to whatever one finds there, whether that be the agonies of St Anthony, or the detachment of the Taoist masters. Alone in the wild, we shed the conventions that keep society ticking over — freedom from the clock, in particular, is a hugely important factor. We are opened up to other, less conventional, customs: in the wild, animals may talk to us, birds will sometimes guide us to water or light, the wind may become a second skin. In the wild, we may even find our true bodies, creaturely and vivid and indivisible from the rest of creation — but this comes only when we break free, not just from the constraints of clock and calendar and social convention, but also from the sometimes-clandestine hopes, expectations and fears with which we arrived.

For many of us, solitude is tempting because it is ‘the place of purification’, as the Israeli philosopher Martin Buber called it. Our aspiration for travelling to that place might be the simple pleasure of being away, unburdened by the pettiness and corruption of the day-to-day round. For me, being alone is about staying sane in a noisy and cluttered world – I have what the Canadian pianist Glenn Gould called a ‘high solitude quotient’ — but it is also a way of opening out a creative space, to give myself a chance to be quiet enough to see or hear what happens next.

There are those who are inclined to be purely temporary dwellers in the wilderness, who don’t stay long. As soon as they are renewed by a spell of lonely contemplation, they are eager to return to the everyday fray. Meanwhile, the committed wilderness dwellers are after something more. Yet, even if contemplative solitude gives them a glimpse of the sublime (or, if they are so disposed, the divine), questions arise immediately afterwards. What now? What is the purpose of this solitude? Whom does it serve?

To take oneself out into the wilderness as part of a spiritual quest is one thing, but to remain there in a kind of barren ecstasy is another. The Anglo-American mystic Thomas Merton argues that ‘there is no greater disaster in the spiritual life than to be immersed in unreality, for life is maintained and nourished in us by our vital relation with realities outside and above us. When our life feeds on unreality, it must starve.’ If practised as part of a living spiritual path, he says, and not simply as an escape from corruption or as an expression of misanthropy, ‘your solitude will bear immense fruit in the souls of men you will never see on earth’. It is a point Ralph Waldo Emerson, Thoreau’s friend and teacher, also makes. Solitude is essential to the spiritual path, he argues, but ‘we require such solitude as shall hold us to its revelations when we are in the streets and in palaces … it is not the circumstances of seeing more or fewer people but the readiness of sympathy that imports’.

by John Burnside, Aeon |  Read more:
Illustration: Sarah Maycock

The Permanent Militarization of America

In 1961, President Dwight D. Eisenhower left office warning of the growing power of the military-industrial complex in American life. Most people know the term the president popularized, but few remember his argument.

In his farewell address, Eisenhower called for a better equilibrium between military and domestic affairs in our economy, politics and culture. He worried that the defense industry’s search for profits would warp foreign policy and, conversely, that too much state control of the private sector would cause economic stagnation. He warned that unending preparations for war were incongruous with the nation’s history. He cautioned that war and warmaking took up too large a proportion of national life, with grave ramifications for our spiritual health.

The military-industrial complex has not emerged in quite the way Eisenhower envisioned. The United States spends an enormous sum on defense — over $700 billion last year, about half of all military spending in the world — but in terms of our total economy, it has steadily declined to less than 5 percent of gross domestic product from 14 percent in 1953. Defense-related research has not produced an ossified garrison state; in fact, it has yielded a host of beneficial technologies, from the Internet to civilian nuclear power to GPS navigation. The United States has an enormous armaments industry, but it has not hampered employment and economic growth. In fact, Congress’s favorite argument against reducing defense spending is the job loss such cuts would entail.

Nor has the private sector infected foreign policy in the way that Eisenhower warned. Foreign policy has become increasingly reliant on military solutions since World War II, but we are a long way from the Marines’ repeated occupations of Haiti, Nicaragua and the Dominican Republic in the early 20th century, when commercial interests influenced military action. Of all the criticisms of the 2003 Iraq war, the idea that it was done to somehow magically decrease the cost of oil is the least credible. Though it’s true that mercenaries and contractors have exploited the wars of the past decade, hard decisions about the use of military force are made today much as they were in Eisenhower’s day: by the president, advised by the Joint Chiefs of Staff and the National Security Council, and then more or less rubber-stamped by Congress. Corporations do not get a vote, at least not yet.

But Eisenhower’s least heeded warning — concerning the spiritual effects of permanent preparations for war — is more important now than ever. Our culture has militarized considerably since Eisenhower’s era, and civilians, not the armed services, have been the principal cause. From lawmakers’ constant use of “support our troops” to justify defense spending, to TV programs and video games like “NCIS,” “Homeland” and “Call of Duty,” to NBC’s shameful and unreal reality show “Stars Earn Stripes,” Americans are subjected to a daily diet of stories that valorize the military while the storytellers pursue their own opportunistic political and commercial agendas. Of course, veterans should be thanked for serving their country, as should police officers, emergency workers and teachers. But no institution — particularly one financed by the taxpayers — should be immune from thoughtful criticism.

Like all institutions, the military works to enhance its public image, but this is just one element of militarization. Most of the political discourse on military matters comes from civilians, who are more vocal about “supporting our troops” than the troops themselves. It doesn’t help that there are fewer veterans in Congress today than at any previous point since World War II. Those who have served are less likely to offer unvarnished praise for the military, for it, like all institutions, has its own frustrations and failings. But for non-veterans — including about four-fifths of all members of Congress — there is only unequivocal, unhesitating adulation. The political costs of anything else are just too high.

by Aaron B. O'Connell, NY Times |  Read more:
Photo: Wikipedia

Sunday, November 4, 2012


Joel Philip Myers
via:

Buzz Off


[ed. As if real mosquitoes aren't irritating enough...]

Is this a mosquito?

No. It’s an insect spy drone for urban areas, already in production, funded by the US Government. It can be remotely controlled and is equipped with a camera and a microphone. It can land on you, and it may have the potential to take a DNA sample or leave RFID tracking nanotechnology on your skin. It can fly through an open window, or it can attach to your clothing until you take it in your home.
via:

Chiều trên phố [Afternoon on the Street] (by Cường Đỗ Mạnh)
via:

Tracking the Trackers

The life of a politician in campaign mode is brutal. Go here, say this, go there, say that, smile, smile, smile, smile, shake hands, remember policy positions, learn new policy positions, learn talking points, learn names, attend the next rally, the next 7 a.m. breakfast, the next evening debate, the next lunchtime forum, keep your bladder in check, keep your libido in check, kiss ass, kiss babies, kiss spouse who is perfect and without whom etc., fundraise, fundraise, fundraise, and through all of it, never make a mistake, ever.

Not easy. But now consider the job of the person who has to constantly follow this politician around. Not this politician's pen- and Purell-carrying body man. Not the spokesperson who keeps the media at bay. Someone else. Someone from the opposing party, someone whose job is literally just to follow this politician everywhere and record everything that happens. The tracker.

If it takes a certain kind of fanatical drive to be a politician running for high office—and it does—then it takes a slightly different but equally fanatical drive to be the person who watches that politician, day in and day out, for an entire campaign season. It takes a guy like, say, Keith Schipper.

Schipper is 25 years old, he's a Republican, and on this day in March he's trying to talk his way into an event being put on by the Democratic candidate for governor, Jay Inslee, in an office park in Kent. Schipper's small Canon HD video camera is stashed in the pocket of his coat, ready to be pulled out in an instant. His rap about the people's right to know is cued up.

No dice. Inslee's people made Schipper the second he walked in the door. They've researched him, and they've researched their rights. This green-vehicle-manufacturing company is unquestionably private property, and Schipper's not welcome.

He gets the boot and gamely heads back to his messy green Nissan Pathfinder. No big loss. There will be a public Inslee event soon, no doubt, and Schipper will be there, by rights un-ejectable. I follow him out into the parking lot because I'm curious, and as Schipper drives off, I notice a University of Washington sticker on his back window.

Schipper studied political science and philosophy at the UW. I know what he studied because I decided to track Schipper a bit after that first encounter. Researched his history. Watched him at political events. Noted the tin of chew he keeps in the right pocket of his pants. Followed his Twitter feed, where he talks of "pounding Monsters on a long drive home from Spokane" and boasts that "sicking the police on a bunch of #UW students may very well end up being my most favorite thing I did in this election cycle."

I didn't just track him surreptitiously. I tried to get an interview with Schipper through his bosses at the state Republican Party but was ignored. I also tried to message him through Facebook. No answer. But that was fine. As Schipper knows, a core truth of tracker life is that the person you're following will show up in public eventually.

It's odd, though, the coyness of trackers. They're supposedly devoted to the idea that nothing should be hidden from the voters anymore, but they're not exactly eager to have themselves described to voters. Maybe it's because they don't want to become the story and distract from whatever campaign narrative they're trying to push. Maybe they know that tracking comes off as unseemly to a lot of people. Maybe they want to try to avoid having "Shame on you!" shouted at them at events, as happened to a Democratic tracker in Florida recently (video seemed to show her leaving the event, a memorial for Vietnam veterans, crying). Or perhaps it's just that trackers are so intimately familiar with how quickly one captured moment can come to define a person—like the moment that solidified the current obsession with tracking candidates, Republican Senate candidate George Allen's "Macaca Moment" on the campaign trail in Virginia several elections ago.

On that day in August 2006, at a campaign stop, Allen pointed at a Democratic tracker who had been following him everywhere and who happened to be Indian American. He said, "This fellow here over here with the yellow shirt, Macaca or whatever his name is, he's with my opponent, he's following us around everywhere." Video of Allen losing his cool went viral, he lost the election, and the rest is tracker history.

It's the kind of moment all trackers now hope to capture, a moment not unlike the one that a certain still-anonymous individual captured earlier this year at a private Romney fundraiser in Florida at which the candidate talked about 47 percent of Americans acting like "victims" who can't be bothered to "take personal responsibility and care for their lives." And just like the person who captured that "47 percent" remark, most trackers (and their handlers) remain reluctant to take a bow in public. When I called the state Democratic Party and asked them to put me in touch with their gubernatorial tracker, Zach Wurtz—aka "Zach the Track"—no one was very excited about the idea. But I kept shaking the tree, and one day earlier this month, I got a text from Wurtz telling me that he would be at an upcoming forum featuring Inslee and the Republican candidate for governor, Rob McKenna. I made it my business to be there.

by Eli Sanders, The Stranger |  Read more:
Photo: Kelly O

Robert Glasper


Your Employee Is an Online Celebrity. Now What Do You Do?

Meet your newest management headache: the co-branded employee.

A growing number of professionals are using social media to build a personal, public identity—a brand of their own—based on their work. Think of an accountant who writes a widely read blog about auditing, or a sales associate who has attracted a big following online by tweeting out his store's latest deals.

Co-branded employees may exist largely below the radar now, but that's changing fast, and employers need to start preparing for the ever-greater challenges they pose for managers, co-workers and companies. Their activities can either complement a company's own brand image or clash with it. Companies that fail to make room for co-branded employees—or worse yet, embrace them without thinking through the implications—risk alienating or losing their best employees, or confusing or even burning their corporate brand.

Part of this change is generational. Younger employees show up on the job with an existing social-media presence, which they aren't about to abandon—especially since they see their personal brands lasting longer than any single job or career.

Social-media services like LinkedIn and Facebook also encourage users to build networks and share their professional as well as personal expertise. And increasingly, companies are recognizing that these activities have a business value. When a management consultant leads a large LinkedIn group, he builds a valuable source of referrals and recruitment prospects; when a lawyer tweets the latest legal news, she positions her firm as the go-to experts in that field. How can an employer resist?

And yet, there is a downside: Co-branded employees can raise tough questions about how to contain their online activities—and how to compensate them. It also isn't easy for managers to balance responsibilities among the bloggers and nonbloggers within a team. And it takes an effort to make sure employees' brands align with the company's.

To ensure that co-branded employees benefit a company, rather than undermine it, managers need to consider these questions:

by Alexandra Samuel, WSJ |  Read more:
Illustration: Viktor Koen

America Gone Wild

This year, Princeton, N.J., has hired sharpshooters to cull 250 deer from the town's herd of 550 over the winter. The cost: $58,700. Columbia, S.C., is spending $1 million to rid its drainage systems of beavers and their dams. The 2009 "miracle on the Hudson," when US Airways Flight 1549 had to make an emergency landing after its engines ingested Canada geese, saved all 155 passengers and crew, but the $60 million Airbus A320 was a complete loss. In the U.S., the total cost of wildlife damage to crops, landscaping and infrastructure now exceeds $28 billion a year ($1.5 billion from deer-vehicle crashes alone), according to Michael Conover of Utah State University, who monitors conflicts between people and wildlife.

The resurgence of wildlife in the U.S. has led to an increase in conflict between wildlife and people.

Those conflicts often pit neighbor against neighbor. After a small dog in Wheaton, Ill., was mauled by a coyote and had to be euthanized, officials hired a nuisance wildlife mitigation company. Its operator killed four coyotes and got voice-mail death threats. A brick was tossed through a city official's window, city-council members were peppered with threatening emails and letters, and the FBI was called in. After Princeton began culling deer 12 years ago, someone splattered the mayor's car with deer innards.

Welcome to the nature wars, in which Americans fight each other over too much of a good thing—expanding wildlife populations produced by our conservation and environmental successes. We now routinely encounter wild birds and animals that our parents and grandparents rarely saw. As their numbers have grown, wild creatures have spread far beyond their historic ranges into new habitats, including ours. It is very likely that in the eastern United States today more people live in closer proximity to more wildlife than anywhere on Earth at any time in history.

In a world full of eco-woes like species extinctions, this should be wonderful news—unless, perhaps, you are one of more than 4,000 drivers who will hit a deer today, or your child's soccer field is carpeted with goose droppings, or feral cats have turned your bird feeder into a fast-food outlet, or wild turkeys have eaten your newly planted seed corn, or beavers have flooded your driveway, or bears are looting your trash cans. And that's just the beginning.

by Jim Sterba, WSJ |  Read more:
Illustration: Jesse Lenz