Thursday, November 8, 2012

Given Tablets but No Teachers, Ethiopian Children Teach Themselves

The One Laptop Per Child project started as a way of delivering technology and resources to schools in countries with little or no education infrastructure, using inexpensive computers to improve traditional curricula. What the OLPC Project has realized over the last five or six years, though, is that teaching kids stuff is really not that valuable. Yes, knowing all your state capitals and how to spell "neighborhood" properly and whatnot isn't a bad thing, but memorizing facts and procedures isn't going to inspire kids to go out and learn by teaching themselves, which is the key to a good education. Instead, OLPC is trying to figure out a way to teach kids to learn, which is what this experiment is all about.

Rather than give out laptops (they're actually Motorola Xoom tablets plus solar chargers running custom software) to kids in schools with teachers, the OLPC Project decided to try something completely different: it delivered some boxes of tablets to two villages in Ethiopia, taped shut, with no instructions whatsoever. Just like, "hey kids, here's this box, you can open it if you want, see ya!"

Just to give you a sense of what these villages in Ethiopia are like, the kids (and most of the adults) there have never seen a printed word. No books, no newspapers, no street signs, no labels on packaged foods or goods. Nothing. And these villages aren't unique in that respect; there are many of them in Africa where the literacy rate is close to zero. So you might think that if you're going to give out fancy tablet computers, it would be helpful to have someone along to show these people how to use them, right?

But that's not what OLPC did. They just left the boxes there, sealed up, containing one tablet for every kid in each of the villages (nearly a thousand tablets in total), pre-loaded with a custom English-language operating system and SD cards with tracking software on them to record how the tablets were used. Here's how it went down, as related by OLPC founder Nicholas Negroponte at MIT Technology Review's EmTech conference last week:
"We left the boxes in the village. Closed. Taped shut. No instruction, no human being. I thought, the kids will play with the boxes! Within four minutes, one kid not only opened the box, but found the on/off switch. He'd never seen an on/off switch. He powered it up. Within five days, they were using 47 apps per child per day. Within two weeks, they were singing ABC songs [in English] in the village. And within five months, they had hacked Android. Some idiot in our organization or in the Media Lab had disabled the camera! And they figured out it had a camera, and they hacked Android."
by Evan Ackerman, DVICE |  Read more:

Wednesday, November 7, 2012

Spotify and its Discontents

Walking—dazed—through a flea market in Greenwich Village, taking in the skewered meats, the empanadas, and the dumb T-shirts, I came across a fugitive salesman, probably near sixty years old, in a ripped Allman Brothers T-shirt. Like a technicolor mirage, he was hawking CDs and singing along to “You Don’t Love Me,” which was blaring, a tad trebly, from a boom box atop a fold-out table.

I spent much of my time, and some of my happiest hours, hanging out in record stores—until they all disappeared about five years ago. So I was relieved to see this funky dude and his valuables, because I knew how to behave in the presence of funky dudes and their coveted records. I nodded and smiled and proceeded to evaluate the merchandise. As I flipped through his CDs, the cases clacked against one another, and the familiar sound not only restored a sense of equilibrium within me, but generated the rumblings of anticipation—a fluttery response in my gut, the slim possibility that, hidden within the stacks, was an album that would alter my perception, just enough, so as to restore the wonder and blue-sky beauty of the ordinary world.

I spotted a Paul Butterfield recording that looked promising, and though I no longer owned a CD player, I wanted to buy it. But the cost was fifteen bucks, and I only had ten in my wallet, and there was no A.T.M. in sight. Holding the Butterfield CD, wrapped in cellophane, the psychedelic cover a piece of artwork in its own right, it suddenly seemed inconceivable I could live without the album, lest I wished to lead a life of regret and unfulfilled desire.

Since I was at a flea market, I called out to the proprietor and offered him my ten bucks, to which he replied, in so many words, that I could take said money and, for all he cared, chuck it in the East River—fifteen bucks or no deal. Well, I told him, that’s not very nice, and a bit unreasonable, as I could easily, in the comfort of my own home, dig up the Butterfield album on Spotify, and listen to it for free. “Whatever you prefer,” he said, as if there were really a choice. I responded with a big laugh—a chortle—the condescending kind reserved for prickly geezers who, despite overwhelming evidence to the contrary, cling to the last vestiges of a dead world. “Whipping Post” began to play out of his boom box, and, as he sang along, I walked away, his raspy voice, and Berry Oakley’s distorted bass line, fading in my wake.

After returning to my apartment, I powered up my computer and, without any hassle, quickly located the Butterfield album, whose track listing was laid out on the screen like an account ledger. This was supposed to be a victory of sorts, but I was quickly overcome by the blunt banality of the moment. In front of me was not only the album I desired, but also every other Butterfield recording ever made. And once I sampled and sated my hunger for Paul Butterfield’s blues, I could locate just about any recording ever made. But what, I wondered, were the consequences?

by Mike Spies, New Yorker |  Read more:

Thoughts on The Facts of the Matter


What is the relationship of truth and invention in literary nonfiction? Over at TriQuarterly, an anonymous post called “The Facts of the Matter” frames the issue in a fascinating way. Presented as a personal essay, written by a middle-aged male author who, as an undergraduate at Yale, sexually assaulted “a girl I liked,” it is a meditation on revelation, narrative and construction, raising questions about the interplay of fact and narrative by admitting to a brutal truth.

Or is it? An editor’s note suggests that something else may be at work. “When we received this anonymous nonfiction submission,” it reads, “it caused quite a stir. One staff member insisted we call the New Haven, Ct., police immediately to report the twentieth-century crime it recounts. But first, we figured out by the mailing address that the author was someone whose work had been solicited for TriQuarterly. Other questions remained. What animal was this? A memoir? Essay? Craft essay? Fictional autobiography? Should we publish it with an introduction, a warning -- and what should we say? The author later labeled it ‘meta-nonfiction.’ We thought it was worth publishing for the issues it raises.”

And what issues are those? First, I think, is anonymity, which puts a barrier between writer and reader that belies the intentions of the form. A key faith of the personal essay, after all, is its intimacy, the idea that we are in the presence of a writer, working under his or her own name and in his or her own voice, as something profound is explored.

That exploration doesn’t have to be dramatic -- I think of Bernard Cooper’s meditation on sighing or Joan Didion’s riff on migraines -- but at its heart is authorial identity. And the first building block of identity is a name. This is one of the key ways we position ourselves, as readers, in an essay: to know who is speaking, and why. For that reason, the anonymity here makes me immediately suspicious, as if the essay were a kind of con.

And yet, what essay -- or for that matter, what novel, story, film, song, painting -- isn’t a con at the most basic level, a manipulation of memory and experience, a shaping of the chaos of the world? This is the paradox of art, that it is both utterly necessary and utterly invented, and it is the paradox of this post, as well.

As the anonymous author notes: “Would it matter to know my name, my race, or hers, or is a piece of nonfiction more potent for not knowing who I am, for not being able to make this personal, singular, my problem, not yours? Is it discretion not to reveal more of the facts, protecting her identity, or am I merely protecting my own? How much telling is a factual tale, and how much telling is too much? (Does it matter that I’ve never told anyone this?)”

by David L. Ulin, LA Times |  Read more:

Jean Négulesco, Guitar and White Vase, 1929

I-502: The End of Prohibition

Washington enthusiastically leapt into history Tuesday, becoming the first state, with Colorado, to reject federal drug-control policy and legalize recreational marijuana use.

Initiative 502 was winning 55 to 45 percent, with support from more than half of Washington's counties, rural and urban.

The vote puts Washington and Colorado to the left of the Netherlands on marijuana law, and makes them the nexus of a new social experiment with uncertain consequences. National and international media watched as vote counts rolled into I-502's election-night party in Seattle amid jubilant cheers.

"I'm going to go ahead and give my victory speech right now. After this I can go sit down and stop shaking," said Alison Holcomb, I-502's campaign manager and primary architect.

"Today the state of Washington looked at 75 years of national marijuana prohibition and said it is time for a new approach," she said.

As of Dec. 6, it will no longer be illegal for adults 21 and over to possess an ounce of marijuana. A new "drugged driving" law for marijuana impairment also kicks in then.

Tuesday's vote also begins a yearlong process for the state Liquor Control Board to set rules for heavily taxed and regulated sales at state-licensed marijuana stores, which are estimated to raise $1.9 billion in new revenue over five years.

Many legal experts expect the U.S. Justice Department, which remained silent during presidential-year politics, to push back and perhaps sue to block I-502 based on federal supremacy.

But Seattle City Attorney Pete Holmes said Seattle's U.S. Attorney Jenny Durkan told him Tuesday the federal government "has no plans, except to talk."

by Jonathan Martin, Seattle Times |  Read more:
Image: Wikipedia

Tuesday, November 6, 2012

Skyfall: The New Serious

The rapture inspired by Skyfall in critics and public alike might have surprised Bond fans of the past. For the franchise's 23rd installment lacks what some would have considered its quintessential ingredient.

What used to distinguish 007 from previous thriller heroes was his unique brand of ironic detachment. Ian Fleming's books demanded to be taken straight. The earlier films mocked their source material's vanity, as well as the thriller genre, love, death and Her Majesty's secret service. Their studied cheesiness mocked the mockery itself.

In Skyfall, Daniel Craig's Bond delivers a scattering of old-style quips, but the chronic flippancy from which they used to spring has disappeared. Indeed, the film's lack of larkiness is the point of one of the cracks. Ben Whishaw's Q, favouring practicality over hilarity, offers Bond only a gun and a radio tracker. When this produces a raised eyebrow, he says: "Were you expecting an exploding pen? We don't really go in for that any more." Thus frugally equipped, our hero confronts a world pervaded by guilt, doubt, grief and foreboding rather than the joshing sadism of his previous outings.

The asperity of that world is no novelty for Craig's 007. In Casino Royale, he suffered the humiliation of being tortured in the nude. Even more startlingly, he declared himself unambiguously in love. Quantum of Solace provided him with the psychological driver for his behaviour that had previously been considered unnecessary.

Still, James Bond is not the only screen hero to have sobered up. When his mirthfulness was at its height, it infected his big-screen beefcake peers. In the 1990 version of Total Recall, Arnold Schwarzenegger tries to outdo Bond in homicidal gibes. Impaling an enemy on a drill, he remarks: "Screw you!" When his wife tells him he can't hurt her because they are married, he shoots her in the forehead and says: "Consider that a divorce." This summer's reworking of the story, on the other hand, was glumly earnest, offering social and political allusions in place of flippancy.

The cheery Batman of 1966 has become the grim and agonised Dark Knight. Prometheus aspired to a portentousness of which Alien felt no need. The teen-flick turned sombre in The Hunger Games, while The Amazing Spider-Man spent so much time grappling with existential angst that he had little left for derring-do. Inception presented more of a mental puzzle than a white-knuckle ride. Even Harry Potter felt obliged to exit amid such unalloyed grimness that there were fears he might scare the children.

Cinema still plays host to gross-out, farce and facetiousness; yet it is darkness, deliberation and doom that are doing some of the best business.

by David Cox, The Guardian |  Read more:
Photograph: Sportsphoto Ltd/Allstar

Eye Am a Camera: Surveillance and Sousveillance in the Glassage

Digital eye glasses like Google’s Project Glass, and my earlier Digital Eye Glass, will transform society because they introduce a two-sided surveillance and sousveillance.

Not only will authorities and shops be watching us and recording our comings and goings (surveillance as we know it today), but we will also be watching and recording them (sousveillance) through small wearable computers like Digital Eye Glass. This affects secrecy, not just privacy. As one of the early inventors and developers of wearable computing and reality augmenting and mediating, I was asked by TIME Tech to write about the history and future predictions of these technologies.

Through the Glass

Society has entered the era of augmented and augmediated reality. Most of us use smartphones, which are, in some sense, wearable computers. Many smartphone apps overlay information onto the real world, and this is a good example of augmented reality. Augmediated reality serves to both augment and mediate our surroundings. Soon, the smartphone will become eyeglass-based so that these overlays can augment and mediate our everyday lives. Companies like Google and Apple will soon be bringing out products for wearable computing in everyday life. The intended purpose of these products is the hands-free displaying of information currently available to most smartphone users. The small screen on the glass flashes information right on cue, which, for instance, allows the eye glass wearer to get directions in the city, find a book in a store and even videoconference with a friend. (...)

Opposition from Authorities and Shops

In my high school days, the opposition to my technology was mostly due to peer pressure — simply to its overall strangeness in being ahead of its time. But now that this peer pressure is in fact reversed (individual people want this now), a new kind of opposition is emerging. This opposition comes not from peers, but from authorities and shops. The very authorities that are installing surveillance cameras on buildings and light posts are afraid of cameras being installed on people. For example, I was wearing my Digital Eye Glass while eating at McDonald’s and was suddenly physically assaulted by McDonald’s employees. See “Physical assault by McDonald’s for wearing Digital Eye Glass.” They claimed they were enforcing (as vigilantes, perhaps) a privacy law that does not even exist. See their side of the story in “Computerized seeing aids forbidden in McDonald’s.”

Although the Eye Glass is not a recording device, damage inflicted during such an attack causes it to retain temporarily stored data that would otherwise have been overwritten. In this sense, the perpetrators of the attack have turned what would otherwise not have been a recording device into one.

Ironically, the very establishments that oppose wearable cameras are usually the places where lots of surveillance is used. Thus I coined the new word “McVeillance” to denote a highly mass-produced (“McDonaldized”) form of veillance, in the same way that a “McMansion” is a mass-produced mansion. McVeillance also implies a prohibition on individual veillance; for example, a prohibition on what we call “sousveillance”. The term “sousveillance” stems from the contrasting French words sur, meaning “above”, and sous, meaning “below”. So “surveillance” denotes the “eye-in-the-sky” watching from above, whereas “sousveillance” denotes bringing the camera or other means of observation down to human level, either physically (mounting cameras on people rather than on buildings), or hierarchically (ordinary people doing the watching, rather than higher authorities, large entities or architectures doing the watching).

Thus, McVeillance, for example, is the installation of a large number of security cameras in a restaurant while at the same time physically assaulting guests for using their own camera to photograph the menu.

by Steve Mann, Time Tech |  Read more:
Photo: Steve Mann

Monday, November 5, 2012

The Anatomies of Bureaucracy

“The underlying bureaucratic key is the ability to deal with boredom. To function effectively in an environment that precludes everything vital and human. To breathe, so to speak, without air. The key is the ability, whether innate or conditioned, to find the other side of the rote, the picayune, the meaningless, the repetitive, the pointlessly complex. To be, in a word, unborable. It is the key to modern life. If you are immune to boredom, there is literally nothing you cannot accomplish.”
~ David Foster Wallace
One of the things that Hurricane Sandy draws to our attention is all of the bureaucratic forces that quietly and almost imperceptibly but decisively shape our lives and the world we inhabit. Bureaucratic institutions like FEMA, City Hall, the NYPD, the Department of Sanitation, Con Edison, and so forth. Catastrophes tend to offer them a moment to step into the spotlight and either dazzle or utterly fail. One of the reasons their emergence into public attention is interesting is that the work they do in non-catastrophic circumstances is so workmanlike and dull that it’s boring to even think about.

In one of the more amusing passages in David Foster Wallace’s The Pale King, a character mistakenly enters the wrong university classroom and finds himself developing an unexpected interest in accounting. The Jesuit accounting professor delivers remarkably fascinating reflections on the subject during his lectures, at one point making the following claim:

Enduring tedium over real time in a confined space is what real courage is… The truth is that the heroism of your childhood entertainments was not true valour. It was theatre. The grand gesture, the moment of choice, the mortal danger, the external foe, the climactic battle whose outcome resolves all – all designed to appear heroic, to excite and gratify an audience… Gentlemen, welcome to the world of reality – there is no audience. No one to applaud, to admire… actual heroism receives no ovation, entertains no one. No one queues up to see it. No one is interested.

The real heroes, it seems, are perhaps not those who make grand gestures or defeat foes but rather people like accountants: those who toil in obscurity and make the wheels of commerce and bureaucracy turn. Wallace called his last novel a “portrait of bureaucracy,” and the portrait it offers is both horrifying and hopeful. His work explores this dialectic of ecstasy and crushing boredom, and the relation of freedom and rigid structure. Most intriguing is the way he understands the ecstasies and freedoms to be found even in the most boring and structured of scenarios—like working for the IRS.

The question of whether he is actually endorsing bureaucracy remains an open one, but more interesting to consider are the heroic pleasures he insists exist in the boredom of being a cog in a machine. At the very least it provides him an occasion to test out his unyielding belief that “Almost anything that you pay close, direct attention to becomes interesting.” All of this is well-known to anyone who reads or reads about Wallace, and it’s one of his major contributions to literary and to some degree even political discourse.

His last novel got me thinking about how other writers have grappled with life in a bureaucracy. One of Whitman’s greatest poems, “Crossing Brooklyn Ferry,” for instance, is about the generally boring and unconscious experience of commuting. Granted, crossing Brooklyn Ferry is probably more interesting than taking the subway, but still—part of what makes that poem about the profoundly human dimensions of the daily commute so interesting is that it takes place in the context of going to or coming home from one’s job.

Much more recently, “The Office” and Office Space wrung a fair amount of humor out of the boredom and fellow-feeling of a bureaucratic life. Part of what’s funny and even tender and moving about these works is that everyone in a bureaucracy is constantly desperately seeking ways to retrieve some human element from the otherwise crushing banalities of the workplace. The fleeting and/or enduring romances, for instance, are compelling because they emerge in the context of the featureless terrains of corporate America.

When I worked as a temp at a huge accounting firm for a brief while in Chicago several years ago, I remember being shocked to discover that my boss—a partner in the firm with a magisterial view of the city stretching out below his window—spent a fair amount of time playing solitaire on his computer (which I could see reflected in the window whenever I poked my head in to tell him something or other). The fact that even the bureaucrats poach time back from the machine is still a surprising thing to consider.

Orwell and Kafka are probably the first writers who come to mind, but for me, the really great tale of life in a bureaucracy is Melville’s “Bartleby the Scrivener,” a “tale of Wall Street.” One of the things that makes Melville’s story so compelling is Bartleby’s strange relationship to the bureaucracy he is a part of (granted, it’s a small bureaucracy—a lawyer’s office, but it’s a bureaucracy in miniature, and a wheel within other wheels). Bartleby is both part of the bureaucracy and not; he seems indifferent to the whole thing. It’s hard to figure out what’s going on in his head at any given moment, and he seems to recognize the need to work but also not much to care, preferring simply “not to.” It’s not exactly saying No! in thunder. And it’s surprising how much Bartleby anticipates and informs later iterations. Turkey and Nippers cannot but remind one of the cast of peculiar characters in “The Office,” and of course the Lawyer (who narrates the story) contains in his bones the DNA of David Brent.

If to be unborable is the key to modern life, then there are many figures in literature that might help us think about living in a bureaucracy. Perhaps some of us have managed to escape the direct tentacles of the network of bureaucracies that surrounds us, but as Melville once said in a slightly different context, “who ain’t a slave?” We are all implicated in one way or another, whether we want to be or not. One question that underlies most of this discussion is about how we choose to inhabit this role, or perhaps, as Wallace would have it, what we choose to pay attention to. The deeper and perhaps more troubling question, though, is whether or not learning to find pleasure and interest in this role is to be complicit in some of the dehumanizing structures and forces that generate this very pleasure and interest. To be unbored is no doubt crucial to living a full and happy life, but is it, in short, a good thing?

I’m still not sure.

How Not to Abolish the Electoral College

Another U.S. presidential election is upon us, and once again the electoral college looms large as a threat to the legitimacy of government and people's faith in democracy. On the eve of what may be another split between the electoral college and the nationwide popular vote total, we are no closer to a direct popular election than we were twelve years ago when the winner was decided by the U.S. Supreme Court.

But that may not be such a bad thing for those of us who want to see the electoral college abolished. In fact, the best chance for abolition may lie in sharing the pain by reversing the party polarity of the 2000 split: i.e., for President Obama to win the electoral college and Mitt Romney to win the popular vote. With the likelihood that the electoral college will favor the Democrats for at least the next few elections, our best hope may lie in a split that infuriates Republicans so deeply that they would clamor for reform as Democrats did after 2000.

Perhaps the worst idea out there for ending the reign of the electoral college is an effort called the National Popular Vote Interstate Compact (NPVIC). The NPVIC reminds us of all that's wrong with the clause in the Constitution that leaves the choosing of the electors to the states. The more we mess with the state statutes governing the awarding of electoral votes, the more we may regress to a past when popular votes for U.S. President were not held at all by the states.

In my last column on the electoral college, I tried to overturn, with simple arithmetic, the widely held myth that small states benefit from the electoral college. One encounters this myth everywhere including, most recently, Andrew Tanenbaum's widely followed website electoral-vote.com. As I've argued in the past, the more partisan a state's presidential vote happens to be, the more that state will underperform in the electoral college, as opposed to the effect that that state would have on a nationwide popular vote, regardless of the size of the state. Thus, states like Utah, Wyoming, Idaho, and Alaska—usually among the most partisan in recent presidential elections—have a greater impact on the nationwide vote total than they do on the electoral college. Despite the obstacles, the safest, surest way to abolish the electoral college—without causing a host of new problems—is through constitutional amendment, not by the NPVIC, for reasons I will explain.

by Jeff Strabone, 3 Quarks Daily |  Read more: 

Jon Measures

Looking Into the Future

“Can AIDS be cured?” That was the question being whispered in the back rooms and satellite meetings of the 19th International AIDS Conference, held in Washington, DC, this week. The conference’s formal business was to keep up the momentum behind the most successful public-health campaign of the past 30 years: the taming, at the cost of a few pills a day, of an infection that was once an inevitable killer. It still kills. About 1.7m people succumbed last year. But that figure is down from 2.3m in 2005 (see chart 1), and is expected to continue falling. Now, therefore, some people are starting to look beyond the antiretroviral (ARV) drugs which have brought this success. They are asking if something else could do even better.

The drugs work, and are getting cheaper by the year: a report released during the conference by the Clinton Foundation, an American global-health charity, put the annual cost of treatment at $200; it used to be $10,000. But once on them, you are on them for life. Stop, and the virus crawls out of cellular hidey-holes that ARVs cannot reach and rapidly reinfects you. This has implications both for patients, whose lives are constrained by the need for constant medication, and taxpayers, who bear most of the cost of this indefinite treatment.

Many of those taxpayers do not live in the rich world but in the worst-afflicted countries. A new estimate by UNAIDS, the United Nations agency charged with combating the disease, suggests that more than half of the cost of treating and preventing AIDS is now borne by these countries, rather than paid for by international agencies (see chart 2). As many of these countries have high economic growth rates, that is only right and proper. But it does mean that they, too, have a strong interest in a cure. And researchers would like to provide them with one.

The road to Berlin

A race is therefore on to work out how to flush the virus from its hiding places and get rid of it completely. Several clues suggest a cure may be possible. But no one knows which route will lead to it.

One of those routes passes through Timothy Brown. Mr Brown, pictured above, is known as the Berlin patient. He was living in that city in 2007 when he underwent radical treatment for leukaemia. This required the destruction of his immune system—the source of the cancer—and its replacement using stem cells transplanted from the bone marrow of a donor, which allowed him to grow a new (but alien) immune system.

Mr Brown did not just have leukaemia. He was also infected with HIV. So his doctor, with his permission, tried an experiment. The doctor searched for and found a donor who had a rare genetic mutation which confers immunity to HIV infection by disabling a protein on cell surfaces to which the virus attaches itself in order to gain entry to a cell.

After the transplant, the virus seemed to disappear from Mr Brown’s body. Traces of viral genes were found recently, but these may have been contamination, and in any case they did not amount to entire, working viruses. There is no disputing, however, that Mr Brown no longer needs drugs to stay healthy, and has not needed them for five years.

No one is suggesting immune-system transplants as a treatment for AIDS. They are far too dangerous and costly. The intriguing point about Mr Brown’s procedure is that it would have been expected to destroy directly only one of the hiding places of the virus: immune-system cells squirrelled away in a quiescent state as the system’s memory. (These allow it to recognise and respond to infections experienced in the past.) Other reservoirs, particularly certain brain cells, would not have been affected directly—and in Mr Brown’s case checking his brain to find out what is going on would be grossly unethical.

Clearly, it is dangerous to draw conclusions from a single example. But if quiescent memory cells are the main source of viral rebound, that would simplify the task of finding a cure. And many groups of researchers are trying to do just that, by waking up the memory cells so that ARVs can get at the virus within them.

by The Economist |  Read more:
Photo: Eyevine

The Visitor


‘I have a great deal of company in my house; especially in the morning, when nobody calls.’ Henry David Thoreau’s remark about his experience of solitude expresses many of the common ideas we have about the work — and the apparent privileges — of being alone. As he put it so vividly in Walden (1854), his classic account of the time he spent alone in the Massachusetts woods, he went there to ‘live deep and suck out all the marrow of life’. Similarly, when I retreat into solitude, I hope to reconnect with a wider, more-than-human world and by so doing become more fully alive, recovering what the Gospel of Thomas called, ‘he who was, before he came into being’.

It has always been a key step on the ‘way’ or ‘path’ in Taoist philosophy (‘way’ being the literal translation of Tao) to go into the wilderness and lay oneself bare to whatever one finds there, whether that be the agonies of St Anthony, or the detachment of the Taoist masters. Alone in the wild, we shed the conventions that keep society ticking over — freedom from the clock, in particular, is a hugely important factor. We are opened up to other, less conventional, customs: in the wild, animals may talk to us, birds will sometimes guide us to water or light, the wind may become a second skin. In the wild, we may even find our true bodies, creaturely and vivid and indivisible from the rest of creation — but this comes only when we break free, not just from the constraints of clock and calendar and social convention, but also from the sometimes-clandestine hopes, expectations and fears with which we arrived.

For many of us, solitude is tempting because it is ‘the place of purification’, as the Israeli philosopher Martin Buber called it. Our aspiration for travelling to that place might be the simple pleasure of being away, unburdened by the pettiness and corruption of the day-to-day round. For me, being alone is about staying sane in a noisy and cluttered world – I have what the Canadian pianist Glenn Gould called a ‘high solitude quotient’ — but it is also a way of opening out a creative space, to give myself a chance to be quiet enough to see or hear what happens next.

There are those who are inclined to be purely temporary dwellers in the wilderness, who don’t stay long. As soon as they are renewed by a spell of lonely contemplation, they are eager to return to the everyday fray. Meanwhile, the committed wilderness dwellers are after something more. Yet, even if contemplative solitude gives them a glimpse of the sublime (or, if they are so disposed, the divine), questions arise immediately afterwards. What now? What is the purpose of this solitude? Whom does it serve?

To take oneself out into the wilderness as part of a spiritual quest is one thing, but to remain there in a kind of barren ecstasy is another. The Anglo-American mystic Thomas Merton argues that ‘there is no greater disaster in the spiritual life than to be immersed in unreality, for life is maintained and nourished in us by our vital relation with realities outside and above us. When our life feeds on unreality, it must starve.’ If practised as part of a living spiritual path, he says, and not simply as an escape from corruption or as an expression of misanthropy, ‘your solitude will bear immense fruit in the souls of men you will never see on earth’. It is a point Ralph Waldo Emerson, Thoreau’s friend and teacher, also makes. Solitude is essential to the spiritual path, he argues, but ‘we require such solitude as shall hold us to its revelations when we are in the streets and in palaces … it is not the circumstances of seeing more or fewer people but the readiness of sympathy that imports’.

by John Burnside, Aeon |  Read more:
Illustration: Sarah Maycock

The Permanent Militarization of America

In 1961, President Dwight D. Eisenhower left office warning of the growing power of the military-industrial complex in American life. Most people know the term the president popularized, but few remember his argument.

In his farewell address, Eisenhower called for a better equilibrium between military and domestic affairs in our economy, politics and culture. He worried that the defense industry’s search for profits would warp foreign policy and, conversely, that too much state control of the private sector would cause economic stagnation. He warned that unending preparations for war were incongruous with the nation’s history. He cautioned that war and warmaking took up too large a proportion of national life, with grave ramifications for our spiritual health.

The military-industrial complex has not emerged in quite the way Eisenhower envisioned. The United States spends an enormous sum on defense — over $700 billion last year, about half of all military spending in the world — but in terms of our total economy, it has steadily declined to less than 5 percent of gross domestic product from 14 percent in 1953. Defense-related research has not produced an ossified garrison state; in fact, it has yielded a host of beneficial technologies, from the Internet to civilian nuclear power to GPS navigation. The United States has an enormous armaments industry, but it has not hampered employment and economic growth. In fact, Congress’s favorite argument against reducing defense spending is the job loss such cuts would entail.

Nor has the private sector infected foreign policy in the way that Eisenhower warned. Foreign policy has become increasingly reliant on military solutions since World War II, but we are a long way from the Marines’ repeated occupations of Haiti, Nicaragua and the Dominican Republic in the early 20th century, when commercial interests influenced military action. Of all the criticisms of the 2003 Iraq war, the idea that it was done to somehow magically decrease the cost of oil is the least credible. Though it’s true that mercenaries and contractors have exploited the wars of the past decade, hard decisions about the use of military force are made today much as they were in Eisenhower’s day: by the president, advised by the Joint Chiefs of Staff and the National Security Council, and then more or less rubber-stamped by Congress. Corporations do not get a vote, at least not yet.

But Eisenhower’s least heeded warning — concerning the spiritual effects of permanent preparations for war — is more important now than ever. Our culture has militarized considerably since Eisenhower’s era, and civilians, not the armed services, have been the principal cause. From lawmakers’ constant use of “support our troops” to justify defense spending, to TV programs and video games like “NCIS,” “Homeland” and “Call of Duty,” to NBC’s shameful and unreal reality show “Stars Earn Stripes,” Americans are subjected to a daily diet of stories that valorize the military while the storytellers pursue their own opportunistic political and commercial agendas. Of course, veterans should be thanked for serving their country, as should police officers, emergency workers and teachers. But no institution — particularly one financed by the taxpayers — should be immune from thoughtful criticism.

Like all institutions, the military works to enhance its public image, but this is just one element of militarization. Most of the political discourse on military matters comes from civilians, who are more vocal about “supporting our troops” than the troops themselves. It doesn’t help that there are fewer veterans in Congress today than at any previous point since World War II. Those who have served are less likely to offer unvarnished praise for the military, for it, like all institutions, has its own frustrations and failings. But for non-veterans — including about four-fifths of all members of Congress — there is only unequivocal, unhesitating adulation. The political costs of anything else are just too high.

by Aaron B. O'Connell, NY Times |  Read more:
Photo: Wikipedia

Sunday, November 4, 2012


Joel Philip Myers
via:

Buzz Off


[ed. As if real mosquitoes aren't irritating enough...]

Is this a mosquito?

No. It’s an insect spy drone for urban areas, already in production, funded by the US Government. It can be remotely controlled and is equipped with a camera and a microphone. It can land on you, and it may have the potential to take a DNA sample or leave RFID tracking nanotechnology on your skin. It can fly through an open window, or it can attach to your clothing until you take it in your home.
via: