Thursday, November 8, 2012
Noam Chomsky on Where Artificial Intelligence Went Wrong
In May of last year, during the 150th anniversary of the Massachusetts Institute of Technology, a symposium on "Brains, Minds and Machines" took place, where leading computer scientists, psychologists and neuroscientists gathered to discuss the past and future of artificial intelligence and its connection to the neurosciences.
The gathering was meant to inspire multidisciplinary enthusiasm for the revival of the scientific question from which the field of artificial intelligence originated: how does intelligence work? How does our brain give rise to our cognitive abilities, and could this ever be implemented in a machine?
Noam Chomsky, speaking in the symposium, wasn't so enthused. Chomsky critiqued the field of AI for adopting an approach reminiscent of behaviorism, except in more modern, computationally sophisticated form. Chomsky argued that the field's heavy use of statistical techniques to pick regularities in masses of data is unlikely to yield the explanatory insight that science ought to offer. For Chomsky, the "new AI" -- focused on using statistical learning techniques to better mine and predict data -- is unlikely to yield general principles about the nature of intelligent beings or about cognition.
This critique sparked an elaborate reply to Chomsky from Google's director of research and noted AI researcher, Peter Norvig, who defended the use of statistical models and argued that AI's new methods and definition of progress are not far off from what happens in the other sciences.
Chomsky acknowledged that the statistical approach might have practical value, just as in the example of a useful search engine, and is enabled by the advent of fast computers capable of processing massive data. But as far as a science goes, Chomsky would argue it is inadequate, or more harshly, kind of shallow. We wouldn't have taught the computer much about what the phrase "physicist Sir Isaac Newton" really means, even if we can build a search engine that returns sensible hits to users who type the phrase in.
It turns out that related disagreements have been pressing biologists who try to understand more traditional biological systems of the sort Chomsky likened to the language faculty. Just as the computing revolution enabled the massive data analysis that fuels the "new AI", so has the sequencing revolution in modern biology given rise to the blooming fields of genomics and systems biology. High-throughput sequencing, a technique by which millions of DNA molecules can be read quickly and cheaply, turned the sequencing of a genome from a decade-long expensive venture to an affordable, commonplace laboratory procedure. Rather than painstakingly studying genes in isolation, we can now observe the behavior of a system of genes acting in cells as a whole, in hundreds or thousands of different conditions.
The sequencing revolution has just begun and a staggering amount of data has already been obtained, bringing with it much promise and hype for new therapeutics and diagnoses for human disease. For example, when a conventional cancer drug fails to work for a group of patients, the answer might lie in the genome of the patients, which might have a special property that prevents the drug from acting. With enough data comparing the relevant features of genomes from these cancer patients and the right control groups, custom-made drugs might be discovered, leading to a kind of "personalized medicine." Implicit in this endeavor is the assumption that with enough sophisticated statistical tools and a large enough collection of data, signals of interest can be weeded out from the noise in large and poorly understood biological systems.
The success of fields like personalized medicine and other offshoots of the sequencing revolution and the systems-biology approach hinge upon our ability to deal with what Chomsky called "masses of unanalyzed data" -- placing biology in the center of a debate similar to the one taking place in psychology and artificial intelligence since the 1960s.
Systems biology did not rise without skepticism. The great geneticist and Nobel Prize-winning biologist Sydney Brenner once defined the field as "low input, high throughput, no output science." Brenner, a contemporary of Chomsky who also participated in the same symposium on AI, was equally skeptical about new systems approaches to understanding the brain. When describing an up-and-coming systems approach to mapping brain circuits called Connectomics, which seeks to map the wiring of all neurons in the brain (i.e. diagramming which nerve cells are connected to others), Brenner called it a "form of insanity."
Brenner's catch-phrase bite at systems biology and related techniques in neuroscience is not far off from Chomsky's criticism of AI. An unlikely pair, systems biology and artificial intelligence both face the same fundamental task of reverse-engineering a highly complex system whose inner workings are largely a mystery. Yet, ever-improving technologies yield massive data related to the system, only a fraction of which might be relevant. Do we rely on powerful computing and statistical approaches to tease apart signal from noise, or do we look for the more basic principles that underlie the system and explain its essence? The urge to gather more data is irresistible, though it's not always clear what theoretical framework these data might fit into. These debates raise an old and general question in the philosophy of science: What makes a satisfying scientific theory or explanation, and how ought success be defined for science?
by Yarden Katz, The Atlantic | Read more:
Photo: Graham Gordon Ramsay
Jan Zrzavy: Girl Friends, 1923 - Oil on Canvas (Centre for Modern and Contemporary Art, Veletrzni (Trades Fair) Palace, Prague)
via:
Can't a Guy Just Make Some Friends Around Here? Maybe.
A little more than a year ago, I moved into a West Plaza apartment. It was neat and spacious, with hardwood floors for the rug I'd bought in Istanbul but never stepped on. And because it was walking distance from both the Plaza and Westport, I could mosey to a coffee shop or stagger home drunk from a bar without even glancing at my car. Best of all, it was cheap enough that I could live alone and realize one of the defining fantasies of many 20-something men: I'd be responsible for every mess I created, without fighting with a roommate for kitchen-counter space or bathroom time as if we were sharing the West Bank. This was city living as personal libertarian utopia.
Affordable rent and abundant space, after all, are what Kansas City is supposed to be about. After a rough few months in Washington, D.C. — where I hadn't liked the government achievatrons I met at parties and where an invasive landlord (who was cheating on her property taxes) had kicked me out of an egregiously expensive apartment (which was being rented illegally) — I was ready to build a better life in KC.
And a better life in KC was so easy — at first.
The Lattéland on Jefferson Street had a fine patio, so I spent a lot of time reading outside there. The Cinemark a few blocks from my apartment cost only $6 a show, so I saw every Zac Efron and Ryan Reynolds abomination that rolled through. (Because for six bucks, I'll watch anything in a theater.) These things I did alone, which was fine for a while.
Yet the warning signs of loneliness started to emerge. The apartment I'd chosen for its spaciousness began to look to me like that photo of Steve Jobs' living room, which had nothing but a rug and a lamp. The cheap movies I saw were stupid romantic comedies, and watching them alone lost its irony. And there were no co-workers to run into at the coffee shop after work because I was a freelancer with no co-workers. (...)
Last fall, I also spent a lot of time in downtown Cairo, which was like living inside a human beehive. The crush of people packed into small spaces, combined with more outgoing social norms between strangers, means that every time you step outside is a trip into the unexpected. It's almost impossible not to meet people. To live in Cairo is to share in a high-density experience, one in which all the human molecules jostle together to create a kind of social friction you rarely see in Kansas City. I thought about all the times I'd walked along Roanoke Parkway without passing another person on foot.
But you don't have to travel abroad to know that the way we live in Kansas City — by ourselves, in spread-out homes, often away from our families and detached from our friends, wedged into our cars — is a historical aberration and exceptional compared with many other parts of the world. And in recent years, various media outlets have singled us out for some embarrassing stats, telling the rest of the country what we already knew about ourselves. In 2009, for example, Forbes crunched population data for America's 40 largest metropolitan areas and ranked Kansas City dead last for the number of single people. Perhaps correspondingly, KC's number of bars, restaurants and nightclubs per capita didn't rank much better.
That same year, according to U.S. Census data, more than 30 percent of American 20-somethings moved. No other age group uproots itself as frequently. We move because of new jobs or new relationships, and we arrive with few attachments. We're looking for those bars and restaurants and clubs, and the ongoing renaissance of KC's downtown offers some encouragement that life for young, urban-minded people is getting a little more vibrant.
So maybe, I thought, it was time to look for some new friends. But when I set out to do that, I found that my fellow Kansas Citians were feeling all kinds of lonely. And some weren't shy about admitting it.
At least, that's what I learned from Craigslist.
by Matt Pearce, The Pitch | Read more:
Illustration: Shannon Freshwater
Forget About 2012, the Race for the White House in 2016 has Already Begun
[ed. See also: this interesting piece on Obama's place in history.]
The 2012 presidential election is over; let the 2016 election begin! Without skipping a beat, the endless electoral process in the US has started over. It is the American equivalent of: “The king is dead. Long live the king.” It is why political junkies across the world love American politics. It never ends. So what do we know about the next race?
With Barack Obama still in the White House and unable to run again due to term limits, both parties are looking for a new candidate. In the Republican Party, it used to be that if you didn’t win the presidency one year, you could put yourself forward the next. Richard Nixon lost to John F Kennedy in 1960 but bounced back in 1968 to beat Hubert Humphrey. Now voters are so fickle and campaigns so punishing that you only get one shot. Americans don’t like losers and the electoral battlefield is strewn with the corpses of failed presidential candidates who overnight became unpersons. So bye-bye, Mitt. Missing you already. As F. Scott Fitzgerald observed, there are no second acts in American lives.
Team Billary
Yet politicians who merely fail to win their party’s nomination can keep plugging away. Hillary Clinton was beaten to the punch by Obama in 2008 but is expected to run in 2016. That explains why Bill Clinton has been going hoarse urging voters to back Obama. With Romney in the White House, Hillary would find it hard to unseat him. So, she needed Romney to lose and there was no better way to ensure that than to set her husband on to him. Having loyally served Obama as secretary of state, Hillary expects the president to repay the compliment and back her bid. The general view among Democrats is that if Hillary wants the nomination, it’s hers. They feel that her impeccable performance as senator for New York, then at the state department, has repaired the reputation for divisiveness and aggression that she acquired when she was first lady. She will be 69 in 2016.
It is not quite that easy. The Clintons may be the first couple of Democratic politics but after Obama’s rope-a-dope in the Denver debate, Vice-President Joe Biden became a Democratic hero by upstaging and smothering his rival, Paul Ryan, in their televised debate. No one would begrudge Biden a run at the primaries. Despite failing in 1988, when he was caught plagiarising a speech by Neil Kinnock, and in 2008, when he lost out to Obama and Hillary, his ambition remains undimmed. He will be 74 in 2016.
by Nicholas Wapshott, The New Statesman | Read more:
Photograph: Getty Images
Given Tablets but No Teachers, Ethiopian Children Teach Themselves
The One Laptop Per Child project started as a way of delivering technology and resources to schools in countries with little or no education infrastructure, using inexpensive computers to improve traditional curricula. What the OLPC Project has realized over the last five or six years, though, is that teaching kids stuff is really not that valuable. Yes, knowing all your state capitals and how to spell "neighborhood" properly and whatnot isn't a bad thing, but memorizing facts and procedures isn't going to inspire kids to go out and learn by teaching themselves, which is the key to a good education. Instead, OLPC is trying to figure out a way to teach kids to learn, which is what this experiment is all about.
Rather than give out laptops (they're actually Motorola Zoom tablets plus solar chargers running custom software) to kids in schools with teachers, the OLPC Project decided to try something completely different: it delivered some boxes of tablets to two villages in Ethiopia, taped shut, with no instructions whatsoever. Just like, "hey kids, here's this box, you can open it if you want, see ya!"
Just to give you a sense of what these villages in Ethiopia are like, the kids (and most of the adults) there have never seen a word. No books, no newspapers, no street signs, no labels on packaged foods or goods. Nothing. And these villages aren't unique in that respect; there are many of them in Africa where the literacy rate is close to zero. So you might think that if you're going to give out fancy tablet computers, it would be helpful to have someone along to show these people how to use them, right?
But that's not what OLPC did. They just left the boxes there, sealed up, containing one tablet for every kid in each of the villages (nearly a thousand tablets in total), pre-loaded with a custom English-language operating system and SD cards with tracking software on them to record how the tablets were used. Here's how it went down, as related by OLPC founder Nicholas Negroponte at MIT Technology Review's EmTech conference last week:
"We left the boxes in the village. Closed. Taped shut. No instruction, no human being. I thought, the kids will play with the boxes! Within four minutes, one kid not only opened the box, but found the on/off switch. He'd never seen an on/off switch. He powered it up. Within five days, they were using 47 apps per child per day. Within two weeks, they were singing ABC songs [in English] in the village. And within five months, they had hacked Android. Some idiot in our organization or in the Media Lab had disabled the camera! And they figured out it had a camera, and they hacked Android."by Evan Ackerman, DVICE | Read more:
Wednesday, November 7, 2012
Spotify and its Discontents
Walking—dazed—through a flea market in Greenwich Village, taking in the skewered meats, the empanadas, and the dumb T-shirts, I came across a fugitive salesman, probably near sixty years old, in a ripped Allman Brothers T-shirt. Like a technicolor mirage, he was hawking CDs and singing along to “You Don’t Love Me,” which was blaring, a tad trebly, from a boom box atop a fold-out table.
I spent much of my time, and some of my happiest hours, hanging out in record stores—until they all disappeared about five years ago. So I was relieved to see this funky dude and his valuables, because I knew how to behave in the presence of funky dudes and their coveted records. I nodded and smiled and proceeded to evaluate the merchandise. As I flipped through his CDs, the cases clacked against one another, and the familiar sound not only restored a sense of equilibrium within me, but generated the rumblings of anticipation—a fluttery response in my gut, the slim possibility that, hidden within the stacks, was an album that would alter my perception, just enough, so as to restore the wonder and blue-sky beauty of the ordinary world.
I spotted a Paul Butterfield recording that looked promising, and though I no longer owned a CD player, I wanted to buy it. But the cost was fifteen bucks, and I only had ten in my wallet, and there was no A.T.M. in sight. Holding the Butterfield CD, wrapped in cellophane, the psychedelic cover a piece of artwork in its own right, it suddenly seemed inconceivable I could live without the album, lest I wished to lead a life of regret and unfulfilled desire.
Since I was at a flea market, I called out to the proprietor and offered him my ten bucks, to which he replied, in so many words, that I could take said money and, for all he cared, chuck it in the East River—fifteen bucks or no deal. Well, I told him, that’s not very nice, and a bit unreasonable, as I could easily, in the comfort of my own home, dig up the Butterfield album on Spotify, and listen to it for free. “Whatever you prefer,” he said, as if there were really a choice. I responded with a big laugh—a chortle—the condescending kind reserved for prickly geezers who, despite overwhelming evidence to the contrary, cling to the last vestiges of a dead world. “Whipping Post” began to play out of his boom box, and, as he sang along, I walked away, his raspy voice, and Berry Oakley’s distorted bass line, fading in my wake.
After returning to my apartment, I powered up my computer and, without any hassle, quickly located the Butterfield album, whose track listing was laid out on the screen like an account ledger. This was supposed to be a victory of sorts, but I was quickly overcome by the blunt banality of the moment. In front of me was not only the album I desired, but also every other Butterfield recording ever made. And once I sampled and sated my hunger for Paul Butterfield’s blues, I could locate just about any recording ever made. But what, I wondered, were the consequences?
by Mike Spies, New Yorker | Read more:
Thoughts on The Facts of the Matter
Or is it? An editor’s note suggests that something else may be at work. “When we received this anonymous nonfiction submission,” it reads, “it caused quite a stir. One staff member insisted we call the New Haven, Ct., police immediately to report the twentieth-century crime it recounts. But first, we figured out by the mailing address that the author was someone whose work had been solicited for TriQuarterly. Other questions remained. What animal was this? A memoir? Essay? Craft essay? Fictional autobiography? Should we publish it with an introduction, a warning -- and what should we say? The author later labeled it ‘meta-nonfiction.’ We thought it was worth publishing for the issues it raises.”
And what issues are those? First, I think, is anonymity, which puts a barrier between writer and reader that belies the intentions of the form. A key faith of the personal essay, after all, is its intimacy, the idea that we are in the presence of a writer, working under his or her own name and in his or her own voice, as something profound is explored.
That exploration doesn’t have to be dramatic -- I think of Bernard Cooper’s meditation on sighing or Joan Didion’s riff on migraines -- but at its heart is authorial identity. And the first building block of identity is a name. This is one of the key ways we position ourselves, as readers, in an essay: to know who is speaking, and why. For that reason, the anonymity here makes me immediately suspicious, as if the essay were a kind of con.
And yet, what essay -- or for that matter, what novel, story, film, song, painting -- isn’t a con at the most basic level, a manipulation of memory and experience, a shaping of the chaos of the world? This is the paradox of art, that it is both utterly necessary and utterly invented, and it is the paradox of this post, as well.
As the anonymous author notes: “Would it matter to know my name, my race, or hers, or is a piece of nonfiction more potent for not knowing who I am, for not being able to make this personal, singular, my problem, not yours? Is it discretion not to reveal more of the facts, protecting her identity, or am I merely protecting my own? How much telling is a factual tale, and how much telling is too much? (Does it matter that I’ve never told anyone this?)”
by David L. Ulin, LA Times | Read more:
I-502: The End of Prohibition
Washington enthusiastically leapt into history Tuesday, becoming the first state, with Colorado, to reject federal drug-control policy and legalize recreational marijuana use.
Initiative 502 was winning 55 to 45 percent, with support from more than half of Washington's counties, rural and urban.
The vote puts Washington and Colorado to the left of the Netherlands on marijuana law, and makes them the nexus of a new social experiment with uncertain consequences. National and international media watched as vote counts rolled into I-502's election-night party in Seattle amid jubilant cheers.
"I'm going to go ahead and give my victory speech right now. After this I can go sit down and stop shaking," said Alison Holcomb, I-502's campaign manager and primary architect.
"Today the state of Washington looked at 75 years of national marijuana prohibition and said it is time for a new approach," she said.
As of Dec. 6, it will no longer be illegal for adults 21 and over to possess an ounce of marijuana. A new "drugged driving" law for marijuana impairment also kicks in then.
Tuesday's vote also begins a yearlong process for the state Liquor Control Board to set rules for heavily taxed and regulated sales at state-licensed marijuana stores, which are estimated to raise $1.9 billion in new revenue over five years.
Many legal experts expect the U.S. Justice Department, which remained silent during presidential-year politics, to push back and perhaps sue to block I-502 based on federal supremacy.
But Seattle City Attorney Pete Holmes said Seattle's U.S. Attorney Jenny Durkan told him Tuesday the federal government "has no plans, except to talk."
by Jonathan Martin, Seattle Times | Read more:
Image: Wikipedia
Tuesday, November 6, 2012
Skyfall: The New Serious
The rapture inspired by Skyfall in critics and public alike might have surprised Bond fans of the past. For the franchise's 23rd installment lacks what some would have considered its quintessential ingredient.
What used to distinguish 007 from previous thriller heroes was his unique brand of ironic detachment. Ian Fleming's books demanded to be taken straight. The earlier films mocked their source material's vanity, as well as the thriller genre, love, death and Her Majesty's secret service. Their studied cheesiness mocked the mockery itself.
In Skyfall, Daniel Craig's Bond delivers a scattering of old-style quips, but the chronic flippancy from which they used to spring has disappeared. Indeed, the film's lack of larkiness is the point of one of the cracks. Ben Whishaw's Q, favouring practicality over hilarity, offers Bond only a gun and a radio tracker. When this produces a raised eyebrow, he says: "Were you expecting an exploding pen? We don't really go in for that any more." Thus frugally equipped, our hero confronts a world pervaded by guilt, doubt, grief and foreboding rather than the joshing sadism of his previous outings.
The asperity of that world is no novelty for Craig's 007. In Casino Royale, he suffered the humiliation of being tortured in the nude. Even more startlingly, he declared himself unambiguously in love. Quantum of Solace provided him with the psychological driver for his behaviour that had previously been considered unnecessary.
Still, James Bond is not the only screen hero to have sobered up. When his mirthfulness was at its height, it infected his big-screen beefcake peers. In the 1990 version of Total Recall, Arnold Schwarzenegger tries to outdo Bond in homicidal gibes. Impaling an enemy on a drill, he remarks: "Screw you!" When his wife tells him he can't hurt her because they are married, he shoots her in the forehead and says: "Consider that a divorce." This summer's reworking of the story, on the other hand, was glumly earnest, offering social and political allusions in place of flippancy.
The cheery Batman of 1966 has become the grim and agonised Dark Knight. Prometheus aspired to a portentousness of which Alien felt no need. The teen-flick turned sombre in The Hunger Games, while The Amazing Spider-Man spent so much time grappling with existential angst that he had little left for derring-do. Inception presented more of a mental puzzle than a white-knuckle ride. Even Harry Potter felt obliged to exit amid such unalloyed grimness that there were fears he might scare the children.
Cinema still plays host to gross-out, farce and facetiousness; yet it is darkness, deliberation and doom that are doing some of the best business.
by David Cox, The Guardian | Read more:
Photograph: Sportsphoto Ltd/Allstar
Eye Am a Camera: Surveillance and Sousveillance in the Glassage
Digital eye glasses like Google’s Project Glass, and my earlier Digital Eye Glass, will transform society because they introduce a two-sided surveillance and sousveillance.
Not only will authorities and shops be watching us and recording our comings and goings (surveillance as we know it today), but we will also be watching and recording them (sousveillance) through small wearable computers like Digital Eye Glass. This affects secrecy, not just privacy. As one of the early inventors and developers of wearable computing and reality augmenting and mediating, I was asked by TIME Tech to write about the history and future predictions of these technologies.
Through the Glass
Society has entered the era of augmented and augmediated reality. Most of us use smartphones, which are, in some sense, wearable computers. Many smartphone apps overlay information onto the real world and this is a good example of augmented reality. Augmediated reality serves to both augment and mediate our surroundings.
Soon, the smartphone will become eyeglass-based so that these overlays can augment and mediate our everyday lives. Companies like Google and Apple will soon be bringing out products for wearable computing in everyday life. The intended purpose of these products is the hands-free displaying of information currently available to most smartphone users. The small screen on the glass flashes information right on cue which, for instance, allows the eye glass wearer to get directions in the city, find a book in a store and even videoconference with a friend. (...)
Opposition from Authorities and Shops
In my high school days, the opposition to my technology was mostly due to peer pressure — simply to its overall strangeness in being ahead of its time. But now that this peer pressure is in fact reversed (individual people want this now), a new kind of opposition is emerging. This opposition comes not from peers, but from authorities and shops. The very authorities that are installing surveillance cameras on buildings and light posts are afraid of cameras being installed on people. For example, I was wearing my Digital Eye Glass while eating at McDonald’s and was suddenly physically assaulted by McDonald’s employees. See “Physical assault by McDonald’s for wearing Digital Eye Glass.” They claimed they were enforcing (as vigilantes perhaps), a privacy law that does not even exist. See their side of the story in “Computerized seeing aids forbidden in McDonald’s.”
Although not a recording device, in the presence of such an attack, damage to the device causes it to retain temporarily stored data that would have otherwise been overwritten. In this sense the perpetrators of the attack have made what would otherwise not have been a recording device into one.
Ironically, the very establishments that oppose wearable cameras are usually the places where lots of surveillance is used. Thus I coined the new word “McVeillance” to denote a highly mass-produced (“McDonaldized”) form of veillance, in the same way that a “McMansion” is a mass-produced mansion. McVeillance also implies a prohibition on individual veillance; for example, a prohibition on what we call “sousveillance”. The term “sousveillance” stems from the contrasting French words sur, meaning “above”, and sous, meaning “below”. So “surveillance” denotes the “eye-in-the-sky” watching from above, whereas “sousveillance” denotes bringing the camera or other means of observation down to human level, either physically (mounting cameras on people rather than on buildings), or hierarchically (ordinary people doing the watching, rather than higher authorities, large entities or architectures doing the watching).
Thus, McVeillance, for example, is the installation of a large number of security cameras in a restaurant while at the same time physically assaulting guests for using their own camera to photograph the menu.
by Steve Mann, Time Tech | Read more:
Photo: Steve Mann
Monday, November 5, 2012
The Anatomies of Bureaucracy
"The underlying bureaucratic key is the ability to deal with boredom. To function effectively in an environment that precludes everything vital and human. To breathe, so to speak, without air. The key is the ability, whether innate or conditioned, to find the other side of the rote, the picayune, the meaningless, the repetitive, the pointlessly complex. To be, in a word, unborable. It is the key to modern life. If you are immune to boredom, there is literally nothing you cannot accomplish.”
~ David Foster Wallace
One of the things that Hurricane Sandy draws to our attention is all of the bureaucratic forces that quietly and almost imperceptibly but decisively shape our lives and the world we inhabit. Bureaucratic institutions like FEMA, City Hall, the NYPD, the Department of Sanitation, Con Edison, and so forth. Catastrophes tend to offer them a moment to step into the spotlight and either dazzle or utterly fail. One of the reasons their emergence in the public’s attention is interesting is that the work they do in non-catastrophic circumstances is so workmanlike and dull that it’s boring to even think about.
In one of the more amusing passages in David Foster Wallace’s The Pale King, a character mistakenly enters the wrong university classroom and finds himself developing an unexpected interest in accounting. The Jesuit accounting professor delivers remarkably fascinating reflections on the subject during his lectures, at one point making the following claim:
Enduring tedium over real time in a confined space is what real courage is… The truth is that the heroism of your childhood entertainments was not true valour. It was theatre. The grand gesture, the moment of choice, the mortal danger, the external foe, the climactic battle whose outcome resolves all – all designed to appear heroic, to excite and gratify an audience… Gentlemen, welcome to the world of reality – there is no audience. No one to applaud, to admire… actual heroism receives no ovation, entertains no one. No one queues up to see it. No one is interested.
The real heroes, it seems, are perhaps not those who make grand gestures or defeat foes but rather people like accountants: those who toil in obscurity and make the wheels of commerce and bureaucracy turn. Wallace called his last novel a “portrait of bureaucracy,” and the portrait it offers is both horrifying and hopeful. His work explores this dialectic of ecstasy and crushing boredom, and the relation of freedom and rigid structure. Most intriguing is the way he understands the ecstasies and freedoms to be found even in the most boring and structured of scenarios—like working for the IRS.
The question of whether he is actually endorsing bureaucracy remains an open one, but more interesting to consider are the heroic pleasures he insists exist in the boredom of being a cog in a machine. At the very least it provides him an occasion to test out his unyielding belief that “Almost anything that you pay close, direct attention to becomes interesting.” All of this is well-known to anyone who reads or reads about Wallace, and it’s one of his major contributions to literary and to some degree even political discourse.
His last novel got me thinking about how other writers have grappled with life in a bureaucracy. One of Whitman’s greatest poems, for instance, is about the generally boring and unconscious experience of commuting. Granted, crossing Brooklyn Ferry is probably more interesting than taking the subway, but still—part of what makes that poem about the profoundly human dimensions of the daily commute so interesting is that it takes place in the context of going to or coming home from one’s job.
Much more recently, “The Office” and Office Space wrung a fair amount of humor out of the boredom and fellow-feeling of a bureaucratic life. Part of what’s funny and even tender and moving about these works is that everyone in a bureaucracy is constantly desperately seeking ways to retrieve some human element from the otherwise crushing banalities of the workplace. The fleeting and/or enduring romances, for instance, are compelling because they emerge in the context of the featureless terrains of corporate America.
When I worked as a temp at a huge accounting firm for a brief while in Chicago several years ago, I remember being shocked to discover that my boss—a partner in the firm with a magisterial view of the city stretching out below his window—spent a fair amount of time playing solitaire on his computer (which I could see reflected in the window whenever I poked my head in to tell him something or other). The fact that even the bureaucrats poach time back from the machine is still a surprising thing to consider.
Orwell and Kafka are probably the first writers who come to mind, but for me, the really great tale of life in a bureaucracy is Melville’s “Bartleby the Scrivener,” a “tale of Wall Street.” One of the things that makes Melville’s story so compelling is Bartleby’s strange relationship to the bureaucracy he is a part of (granted, it’s a small bureaucracy—a lawyer’s office, but it’s a bureaucracy in miniature, and a wheel within other wheels). Bartleby is both part of the bureaucracy and not; he seems indifferent to the whole thing. It’s hard to figure out what’s going on in his head at any given moment, and he seems to recognize the need to work but also not much to care, preferring simply "not to." It's not exactly saying No! In Thunder. And it’s surprising how much Bartleby anticipates and informs later iterations. Turkey and Nippers cannot but remind one of the cast of peculiar characters in “The Office,” and of course the Lawyer (who narrates the story) contains in his bones the DNA of David Brent.
If to be unborable is the key to modern life, then there are many figures in literature that might help us think about living in a bureaucracy. Perhaps some of us have managed to escape the direct tentacles of the network of bureaucracies that surrounds us, but as Melville once said in a slightly different context, “who ain’t a slave?” We are all implicated in one way or another, whether we want to be or not. One question that underlies most of this discussion is about how we choose to inhabit this role, or perhaps, as Wallace would have it, what we choose to pay attention to. The deeper and perhaps more troubling question, though, is whether or not learning to find pleasure and interest in this role is to be complicit in some of the dehumanizing structures and forces that generate this very pleasure and interest. To be unbored is no doubt crucial to living a full and happy life, but is it, in short, a good thing?
I’m still not sure.
How Not to Abolish the Electoral College
Another U.S. presidential election is upon us, and once again the electoral college looms large as a threat to the legitimacy of government and people's faith in democracy. On the eve of what may be another split between the electoral college and the nationwide popular vote total, we are no closer to a direct popular election than we were twelve years ago when the winner was decided by the U.S. Supreme Court.
But that may not be such a bad thing for those of us who want to see the electoral college abolished. In fact, the best chance for abolition may lie in sharing the pain by reversing the party polarity of the 2000 split: i.e., for President Obama to win the electoral college and Mitt Romney to win the popular vote. With the likelihood that the electoral college will favor the Democrats for at least the next few elections, our best hope may lie in a split that infuriates Republicans so deeply that they would clamor for reform as Democrats did after 2000.
Perhaps the worst idea out there for ending the reign of the electoral college is an effort called the National Popular Vote Interstate Compact (NPVIC). The NPVIC reminds us of all that's wrong with the clause in the Constitution that leaves the choosing of the electors to the states. The more we mess with the state statutes governing the awarding of electoral votes, the more we may regress to a past when popular votes for U.S. President were not held at all by the states.
In my last column on the electoral college, I tried to overturn, with simple arithmetic, the widely-held myth that small states benefit from the electoral college. One encounters this myth everywhere including, most recently, Andrew Tanenbaum's widely-followed website electoral-vote.com. As I've argued in the past, the more partisan a state's presidential vote happens to be, the more that state will underperform in the electoral college, as opposed to the effect that that state would have on a nationwide popular vote, regardless of the size of the state. Thus, states like Utah, Wyoming, Idaho, and Alaska—usually among the most partisan in recent presidential elections—have a greater impact on the nationwide vote total than they do on the electoral college. Despite the obstacles, the safest, surest way to abolish the electoral college—without causing a host of new problems—is through constitutional amendment, not by the NPVIC, for reasons I will explain.
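[ed. The arithmetic behind that claim can be sketched with a few invented numbers. The short Python snippet below is a toy illustration only — the vote totals are hypothetical and come from neither Strabone's column nor any real election — but it shows why a heavily partisan state's surplus margin counts in full toward a national popular vote while adding nothing extra under winner-take-all allocation.]

# Toy illustration with invented numbers -- not data from the column.
# Under winner-take-all, votes beyond the bare majority needed to carry a
# state add nothing to the electoral-college result, while every vote still
# counts toward a national popular-vote margin.

states = {
    # name: (votes_for_A, votes_for_B, electoral_votes) -- all hypothetical
    "Small, lopsided":    (200_000,   100_000,   3),
    "Small, competitive": (151_000,   149_000,   3),
    "Large, lopsided":    (2_600_000, 1_300_000, 20),
    "Large, competitive": (1_960_000, 1_940_000, 20),
}

for name, (a, b, ev) in states.items():
    national_margin = a - b            # full weight in a national popular vote
    needed = (a + b) // 2 + 1          # votes actually needed to carry the state
    surplus = max(a, b) - needed       # margin that moves no electoral votes
    print(f"{name}: popular-vote margin {national_margin:+,}, "
          f"electoral votes {ev}, surplus (wasted) votes {surplus:,}")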
by Jeff Strabone, 3 Quarks Daily | Read more: