Friday, November 9, 2012


nancy mccarthy
via:

How Human Beings Almost Vanished From Earth In 70,000 B.C.

Add all of us up, all 7 billion human beings on Earth, and clumped together we weigh roughly 750 billion pounds. That, says Harvard biologist E.O. Wilson, is more than 100 times the biomass of any large animal that's ever walked the Earth. And we're still multiplying. Most demographers say we will hit 9 billion before we peak, and what happens then?
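
A quick back-of-the-envelope check of that figure, using assumed numbers of my own rather than anything from Wilson: 7 billion people at a global average of very roughly 107 pounds each, children included, works out to about 750 billion pounds.

    # Rough sanity check of the ~750 billion pound figure.
    # The ~107 lb global average body weight is my own illustrative assumption.
    population = 7_000_000_000
    avg_weight_lb = 107
    total_lb = population * avg_weight_lb
    print(f"{total_lb:.2e} lb")  # ~7.5e11, i.e. roughly 750 billion pounds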

Well, we've waxed. So we can wane. Let's just hope we wane gently. Because once in our history, the world-wide population of human beings skidded so sharply we were down to roughly a thousand reproductive adults. One study says we hit as low as 40.

Forty? Come on, that can't be right. Well, the technical term is 40 "breeding pairs" (children not included). More likely there was a drastic dip and then 5,000 to 10,000 bedraggled Homo sapiens struggled together in pitiful little clumps hunting and gathering for thousands of years until, in the late Stone Age, we humans began to recover. But for a time there, says science writer Sam Kean, "We damn near went extinct."

I'd never heard of this almost-blinking-out. That's because I'd never heard of Toba, the "supervolcano." It's not a myth. While details may vary, Toba happened.

Toba, The Supervolcano

Once upon a time, says Sam, around 70,000 B.C., a volcano called Toba, on Sumatra in Indonesia, went off, blowing roughly 650 cubic miles of vaporized rock into the air. It is the largest volcanic eruption we know of, dwarfing everything else...

That eruption dropped roughly six centimeters of ash — the layer can still be seen on land — over all of South Asia, the Indian Ocean, the Arabian Sea and the South China Sea. According to the Volcanic Explosivity Index, the Toba eruption scored an "8", which translates to "mega-colossal" — that's two orders of magnitude greater than the largest volcanic eruption in historic times, the 1815 eruption of Mount Tambora in Indonesia, which caused the 1816 "Year Without a Summer" in the northern hemisphere.

With so much ash, dust and vapor in the air, Sam Kean says it's a safe guess that Toba "dimmed the sun for six years, disrupted seasonal rains, choked off streams and scattered whole cubic miles of hot ash (imagine wading through a giant ashtray) across acres and acres of plants." Berries, fruits, trees, African game became scarce; early humans, living in East Africa just across the Indian Ocean from Mount Toba, probably starved, or at least, he says, "It's not hard to imagine the population plummeting."

by Robert Krulwich, NPR |  Read more:
Illustration: Robert Krulwich

Leaving Digital for DIY

Wired's long-time editor in chief, Chris Anderson, announced on Friday that he was leaving the magazine to become CEO of his DIY-drone company, 3D Robotics. This move comes a month after the release of his latest book, Makers: The New Industrial Revolution. In an interview last week (and a brief follow-up after Friday's announcement), Anderson talked with me about today's biggest revolution in how and where we actually make things. If the last few decades have been about big digital forces — the Internet, social media — he notes that the future will be about applying all of that in the real world. "Wondrous as the Web is," he writes, "it doesn’t compare to the real world. Not in economic size (online commerce is less than 10 percent of all sales) and not in its place in our lives. The digital revolution has been largely limited to screens." But, he adds, the salient fact remains that "we live in homes, drive in cars, and work in offices." And it is that physical part of the economy that is undergoing the biggest and most fundamental change. (...)

Some people hear the word "maker" and imagine we are going back to the past, a world of artisans using traditional tools to make craft products. From reading your book, that’s not exactly what you mean. You're talking about a blurring of what might be called the analog and digital worlds. Tell us more about how you see this playing out.

The "Maker Movement" is simply what happened when the web revolution hit the real world. The term, in its current sense, was first coined in 2005 by Dale Dougherty of the tech book publisher O’Reilly, to describe what he saw as a resurgence of tinkering, that great American tradition. But rather than isolated hobbyists in their garages the way it used to be, this was coming out of Web communities and increasingly using digital tools, from 3D printers, which were just then starting to be available for regular consumers, and to a new generation of free and easy CAD software programs. ...The world’s factories are now increasingly open to anyone via the web, creating what amounts to "cloud manufacturing." And huge Maker communities have grown around sites such as Kickstarter and Etsy. In Silicon Valley, the phrase is that "hardware is the new software." The web's powerful innovation model can now be applied to making real stuff. As a result, we’re going from the "tinkerer" phase of this movement to entrepreneurship, too. What began as a social revolution is starting to look like an industrial revolution.

What are the key technological innovations and shifts that are enabling and powering the revolution in making things?

There are really two: the first on the desktop and the second in the cloud.

On the desktop, it's been the arrival of cheap and easy-to-use digital fabrication tools for consumers. Although these technologies, from 3D printers to laser cutters and CNC machines, have been used in industry for decades, they've only reached the consumer desktop over the past few years. Five years ago, that started with the RepRap project, which was an open-source 3D printer design that could be assembled as a kit and led to the first MakerBots.

Call that the Apple II phase, where the machines were mostly sold to geeks who were willing to put up with a lot of complexity to experiment with an exciting new technology. But over the past year, to extend the analogy, we've entered the Macintosh phase: consumer 3D printers that come ready to run, and just work out of the box with simple software.

That allows anyone to fabricate complex objects, with no special machine-shop skills or tools. In the same way that the first consumer laser printers, back in the 1980s, were able to hide all the complexity of professional printing behind a simple menu item that said "Print," today’s 3D printers hide the complexity of computer-controlled fabrication behind a simple menu item that says "Make."

That desktop manufacturing revolution is great for making a few of something, as a custom product or prototype, but it should not be confused with mass production. It can take an hour or more to 3D-print a single object. So how do we get from there to an industrial revolution? Enter the second enabling technology: the cloud.

Over the past decade, the world’s factories have embraced the Web. Thanks to online marketplaces such as Alibaba (in China) and MFG.com (in the U.S.), factories that would once only work for big commercial customers will now take orders from anyone. That means that once you've prototyped your widget on your desktop, you can send the same digital design to a big factory to be turned into a form that can be mass-produced. You don't need to be a company, and typically such factories are willing to work at any scale, from hundreds to hundreds of thousands.

Once, to get into manufacturing, you needed to own a factory. Then, with outsourcing, you needed to at least know someone who owned a factory. Now all you need is a web browser and a credit card to get robots in China to work for you!

by Richard Florida, Atlantic Cities |  Read more:
Photo: Creative Commons by Joi Ito

The Heart Grows Smarter

If you go back and read a bunch of biographies of people born 100 to 150 years ago, you notice a few things that were more common then than now.

First, many more families suffered the loss of a child, which had a devastating and historically underappreciated impact on their overall worldviews.

Second, and maybe related, many more children grew up in cold and emotionally distant homes, where fathers, in particular, barely knew their children and found it impossible to express their love for them.

It wasn’t only parents who were emotionally diffident; it was the people who studied them. In 1938, a group of researchers began an intensive study of 268 students at Harvard University. The plan was to track them through their entire lives, measuring, testing and interviewing them every few years to see how lives develop.

In the 1930s and 1940s, the researchers didn’t pay much attention to the men’s relationships. Instead, following the intellectual fashions of the day, they paid a lot of attention to the men’s physiques. Did they have a “masculine” body type? Did they show signs of vigorous genetic endowments?

But as this study — the Grant Study — progressed, the power of relationships became clear. The men who grew up in homes with warm parents were much more likely to become first lieutenants and majors in World War II. The men who grew up in cold, barren homes were much more likely to finish the war as privates.

Body type was useless as a predictor of how the men would fare in life. So was birth order or political affiliation. Even social class had a limited effect. But having a warm childhood was powerful. As George Vaillant, the study director, sums it up in “Triumphs of Experience,” his most recent summary of the research, “It was the capacity for intimate relationships that predicted flourishing in all aspects of these men’s lives.”

Of the 31 men in the study incapable of establishing intimate bonds, only four are still alive. Of those who were better at forming relationships, more than a third are living.

It’s not that the men who flourished had perfect childhoods. Rather, as Vaillant puts it, “What goes right is more important than what goes wrong.” The positive effect of one loving relative, mentor or friend can overwhelm the negative effects of the bad things that happen.

In case after case, the magic formula is capacity for intimacy combined with persistence, discipline, order and dependability. The men who could be affectionate about people and organized about things had very enjoyable lives.

But a childhood does not totally determine a life. The beauty of the Grant Study is that, as Vaillant emphasizes, it has followed its subjects for nine decades. The big finding is that you can teach an old dog new tricks. The men kept changing all the way through, even in their 80s and 90s.

by David Brooks, NY Times |  Read more:
Illustration: via

Thursday, November 8, 2012


Sydney Bella Sparrow, Three Greens Convene 2009
via:

Noam Chomsky on Where Artificial Intelligence Went Wrong

In May of last year, during the 150th anniversary of the Massachusetts Institute of Technology, a symposium on "Brains, Minds and Machines" took place, where leading computer scientists, psychologists and neuroscientists gathered to discuss the past and future of artificial intelligence and its connection to the neurosciences.

The gathering was meant to inspire multidisciplinary enthusiasm for the revival of the scientific question from which the field of artificial intelligence originated: how does intelligence work? How does our brain give rise to our cognitive abilities, and could this ever be implemented in a machine?

Noam Chomsky, speaking in the symposium, wasn't so enthused. Chomsky critiqued the field of AI for adopting an approach reminiscent of behaviorism, except in more modern, computationally sophisticated form. Chomsky argued that the field's heavy use of statistical techniques to pick out regularities in masses of data is unlikely to yield the explanatory insight that science ought to offer. For Chomsky, the "new AI" -- focused on using statistical learning techniques to better mine and predict data -- is unlikely to yield general principles about the nature of intelligent beings or about cognition.

This critique sparked an elaborate reply to Chomsky from Google's director of research and noted AI researcher, Peter Norvig, who defended the use of statistical models and argued that AI's new methods and definition of progress are not far off from what happens in the other sciences.

Chomsky acknowledged that the statistical approach might have practical value, just as in the example of a useful search engine, and is enabled by the advent of fast computers capable of processing massive data. But as far as a science goes, Chomsky would argue it is inadequate, or more harshly, kind of shallow. We wouldn't have taught the computer much about what the phrase "physicist Sir Isaac Newton" really means, even if we can build a search engine that returns sensible hits to users who type the phrase in.
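
To make that distinction concrete, here is a deliberately tiny sketch, my own illustration rather than anything presented at the symposium, of the purely statistical style of prediction Chomsky is describing: it learns which words tend to follow which in a body of text, and nothing else.

    from collections import Counter, defaultdict

    # Toy bigram model: count which word follows which, then "predict" the most
    # frequent successor. It captures a statistical regularity in the data but
    # has no notion of what any of the words mean.
    corpus = ("physicist sir isaac newton formulated the laws of motion "
              "the laws of gravity were described by newton").split()

    bigrams = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        bigrams[prev][nxt] += 1

    def predict(word):
        counts = bigrams.get(word)
        return counts.most_common(1)[0][0] if counts else None

    print(predict("laws"))  # 'of' -- a regularity found, nothing understood

Scaled up to web-sized corpora and far more elaborate models, this is the kind of pattern-mining that can power a very useful search engine while, in Chomsky's terms, explaining nothing about what the words mean.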

It turns out that related disagreements have been pressing biologists who try to understand more traditional biological systems of the sort Chomsky likened to the language faculty. Just as the computing revolution enabled the massive data analysis that fuels the "new AI", so has the sequencing revolution in modern biology given rise to the blooming fields of genomics and systems biology. High-throughput sequencing, a technique by which millions of DNA molecules can be read quickly and cheaply, turned the sequencing of a genome from a decade-long expensive venture to an affordable, commonplace laboratory procedure. Rather than painstakingly studying genes in isolation, we can now observe the behavior of a system of genes acting in cells as a whole, in hundreds or thousands of different conditions.

The sequencing revolution has just begun and a staggering amount of data has already been obtained, bringing with it much promise and hype for new therapeutics and diagnoses for human disease. For example, when a conventional cancer drug fails to work for a group of patients, the answer might lie in the genome of the patients, which might have a special property that prevents the drug from acting. With enough data comparing the relevant features of genomes from these cancer patients and the right control groups, custom-made drugs might be discovered, leading to a kind of "personalized medicine." Implicit in this endeavor is the assumption that with enough sophisticated statistical tools and a large enough collection of data, signals of interest can be weeded out from the noise in large and poorly understood biological systems.
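
As a deliberately simplified picture of what "weeding signals of interest out from the noise" can mean in practice, here is a sketch using simulated data invented purely for illustration; the feature and the numbers are assumptions of mine, not anything from the article or from a real study.

    import numpy as np
    from scipy.stats import ttest_ind

    # Simulated expression levels of one hypothetical genomic feature in
    # patients who did and did not respond to a drug.
    rng = np.random.default_rng(0)
    responders = rng.normal(loc=5.0, scale=1.0, size=50)
    non_responders = rng.normal(loc=5.6, scale=1.0, size=50)

    # A basic statistical test: is the difference between the two groups
    # larger than what random noise alone would plausibly produce?
    stat, p_value = ttest_ind(responders, non_responders)
    print(f"t = {stat:.2f}, p = {p_value:.4f}")  # a small p suggests signal rather than noise

Real studies involve thousands of features, confounders and multiple-testing corrections, which is exactly why the statistical machinery, and the argument over what it actually explains, looms so large.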

The success of fields like personalized medicine and other offshoots of the sequencing revolution and the systems-biology approach hinges upon our ability to deal with what Chomsky called "masses of unanalyzed data" -- placing biology in the center of a debate similar to the one taking place in psychology and artificial intelligence since the 1960s.

Systems biology did not rise without skepticism. The great geneticist and Nobel Prize-winning biologist Sydney Brenner once defined the field as "low input, high throughput, no output science." Brenner, a contemporary of Chomsky who also participated in the same symposium on AI, was equally skeptical about new systems approaches to understanding the brain. When describing an up-and-coming systems approach to mapping brain circuits called Connectomics, which seeks to map the wiring of all neurons in the brain (i.e. diagramming which nerve cells are connected to others), Brenner called it a "form of insanity."

Brenner's catch-phrase bite at systems biology and related techniques in neuroscience is not far off from Chomsky's criticism of AI. An unlikely pair, systems biology and artificial intelligence both face the same fundamental task of reverse-engineering a highly complex system whose inner workings are largely a mystery. Yet, ever-improving technologies yield massive data related to the system, only a fraction of which might be relevant. Do we rely on powerful computing and statistical approaches to tease apart signal from noise, or do we look for the more basic principles that underlie the system and explain its essence? The urge to gather more data is irresistible, though it's not always clear what theoretical framework these data might fit into. These debates raise an old and general question in the philosophy of science: What makes a satisfying scientific theory or explanation, and how ought success be defined for science?

by Yarden Katz, The Atlantic |  Read more:
Photo: Graham Gordon Ramsay

Jan Zrzavy: Girl Friends, 1923 - Oil on Canvas (Centre for Modern and Contemporary Art, Veletrzni (Trades Fair) Palace, Prague)
via:

Ryan Adams


Can't a Guy Just Make Some Friends Around Here? Maybe.

A little more than a year ago, I moved into a West Plaza apartment. It was neat and spacious, with hardwood floors for the rug I'd bought in Istanbul but never stepped on. And because it was walking distance from both the Plaza and Westport, I could mosey to a coffee shop or stagger home drunk from a bar without even glancing at my car. Best of all, it was cheap enough that I could live alone and realize one of the defining fantasies of many 20-something men: I'd be responsible for every mess I created, without fighting with a roommate for kitchen-counter space or bathroom time as if we were sharing the West Bank. This was city living as personal libertarian utopia.

Affordable rent and abundant space, after all, are what Kansas City is supposed to be about. After a rough few months in Washington, D.C. — where I hadn't liked the government achievatrons I met at parties and where an invasive landlord (who was cheating on her property taxes) had kicked me out of an egregiously expensive apartment (which was being rented illegally) — I was ready to build a better life in KC.

And a better life in KC was so easy — at first.

The Lattéland on Jefferson Street had a fine patio, so I spent a lot of time reading outside there. The Cinemark a few blocks from my apartment cost only $6 a show, so I saw every Zac Efron and Ryan Reynolds abomination that rolled through. (Because for six bucks, I'll watch anything in a theater.) These things I did alone, which was fine for a while.

Yet the warning signs of loneliness started to emerge. The apartment I'd chosen for its spaciousness began to look to me like that photo of Steve Jobs' living room, which had nothing but a rug and a lamp. The cheap movies I saw were stupid romantic comedies, and watching them alone lost its irony. And there were no co-workers to run into at the coffee shop after work because I was a freelancer with no co-workers. (...)

Last fall, I also spent a lot of time in downtown Cairo, which was like living inside a human beehive. The crush of people packed into small spaces, combined with more outgoing social norms between strangers, means that every time you step outside is a trip into the unexpected. It's almost impossible not to meet people. To live in Cairo is to share in a high-density experience, one in which all the human molecules jostle together to create a kind of social friction you rarely see in Kansas City. I thought about all the times I'd walked along Roanoke Parkway without passing another person on foot.

But you don't have to travel abroad to know that the way we live in Kansas City — by ourselves, in spread-out homes, often away from our families and detached from our friends, wedged into our cars — is a historical aberration and exceptional compared with many other parts of the world. And in recent years, various media outlets have singled us out for some embarrassing stats, telling the rest of the country what we already knew about ourselves. In 2009, for example, Forbes crunched population data for America's 40 largest metropolitan areas and ranked Kansas City dead last for the number of single people. Perhaps correspondingly, KC's number of bars, restaurants and nightclubs per capita didn't rank much better.

That same year, according to U.S. Census data, more than 30 percent of American 20-somethings moved. No other age group uproots itself as frequently. We move because of new jobs or new relationships, and we arrive with few attachments. We're looking for those bars and restaurants and clubs, and the ongoing renaissance of KC's downtown offers some encouragement that life for young, urban-minded people is getting a little more vibrant.

So maybe, I thought, it was time to look for some new friends. But when I set out to do that, I found that my fellow Kansas Citians were feeling all kinds of lonely. And some weren't shy about admitting it.

At least, that's what I learned from Craigslist.

by Matt Pearce, The Pitch |  Read more:
Illustration: Shannon Freshwater

Forget About 2012, the Race for the White House in 2016 has Already Begun

[ed. See also: this interesting piece on Obama's place in history.]

The 2012 presidential election is over; let the 2016 election begin! Without skipping a beat, the endless electoral process in the US has started over. It is the American equivalent of: “The king is dead. Long live the king.” It is why political junkies across the world love American politics. It never ends. So what do we know about the next race?

With Barack Obama still in the White House and unable to run again due to term limits, both parties are looking for a new candidate. In the Republican Party, it used to be that if you didn’t win the presidency one year, you could put yourself forward the next. Richard Nixon lost to John F Kennedy in 1960 but bounced back in 1968 to beat Hubert Humphrey. Now voters are so fickle and campaigns so punishing that you only get one shot. Americans don’t like losers and the electoral battlefield is strewn with the corpses of failed presidential candidates who overnight became unpersons. So bye-bye, Mitt. Missing you already. As F. Scott Fitzgerald observed, there are no second acts in American lives.

Team Billary

Yet politicians who merely fail to win their party’s nomination can keep plugging away. Hillary Clinton was beaten to the punch by Obama in 2008 but is expected to run in 2016. That explains why Bill Clinton has been going hoarse urging voters to back Obama. With Romney in the White House, Hillary would find it hard to unseat him. So, she needed Romney to lose and there was no better way to ensure that than to set her husband on to him. Having loyally served Obama as secretary of state, Hillary expects the president to repay the compliment and back her bid. The general view among Democrats is that if Hillary wants the nomination, it’s hers. They feel that her impeccable performance as senator for New York, then at the state department, has repaired the reputation for divisiveness and aggression that she acquired when she was first lady. She will be 69 in 2016.

It is not quite that easy. The Clintons may be the first couple of Democratic politics but after Obama’s rope-a-dope in the Denver debate, Vice-President Joe Biden became a Democratic hero by upstaging and smothering his rival, Paul Ryan, in their televised debate. No one would begrudge Biden a run at the primaries. Despite failing in 1988, when he was caught plagiarising a speech by Neil Kinnock, and in 2008, when he lost out to Obama and Hillary, his ambition remains undimmed. He will be 74 in 2016.

by Nicholas Wapshott, The New Statesman |  Read more:
Photograph: Getty Images

Given Tablets but No Teachers, Ethiopian Children Teach Themselves

The One Laptop Per Child project started as a way of delivering technology and resources to schools in countries with little or no education infrastructure, using inexpensive computers to improve traditional curricula. What the OLPC Project has realized over the last five or six years, though, is that teaching kids stuff is really not that valuable. Yes, knowing all your state capitals and how to spell "neighborhood" properly and whatnot isn't a bad thing, but memorizing facts and procedures isn't going to inspire kids to go out and learn by teaching themselves, which is the key to a good education. Instead, OLPC is trying to figure out a way to teach kids to learn, which is what this experiment is all about.

Rather than give out laptops (they're actually Motorola Xoom tablets plus solar chargers running custom software) to kids in schools with teachers, the OLPC Project decided to try something completely different: it delivered some boxes of tablets to two villages in Ethiopia, taped shut, with no instructions whatsoever. Just like, "hey kids, here's this box, you can open it if you want, see ya!"

Just to give you a sense of what these villages in Ethiopia are like, the kids (and most of the adults) there have never seen a written word. No books, no newspapers, no street signs, no labels on packaged foods or goods. Nothing. And these villages aren't unique in that respect; there are many of them in Africa where the literacy rate is close to zero. So you might think that if you're going to give out fancy tablet computers, it would be helpful to have someone along to show these people how to use them, right?

But that's not what OLPC did. They just left the boxes there, sealed up, containing one tablet for every kid in each of the villages (nearly a thousand tablets in total), pre-loaded with a custom English-language operating system and SD cards with tracking software on them to record how the tablets were used. Here's how it went down, as related by OLPC founder Nicholas Negroponte at MIT Technology Review's EmTech conference last week:
"We left the boxes in the village. Closed. Taped shut. No instruction, no human being. I thought, the kids will play with the boxes! Within four minutes, one kid not only opened the box, but found the on/off switch. He'd never seen an on/off switch. He powered it up. Within five days, they were using 47 apps per child per day. Within two weeks, they were singing ABC songs [in English] in the village. And within five months, they had hacked Android. Some idiot in our organization or in the Media Lab had disabled the camera! And they figured out it had a camera, and they hacked Android."

by Evan Ackerman, DVICE |  Read more:

Wednesday, November 7, 2012

Spotify and its Discontents

Walking—dazed—through a flea market in Greenwich Village, taking in the skewered meats, the empanadas, and the dumb T-shirts, I came across a fugitive salesman, probably near sixty years old, in a ripped Allman Brothers T-shirt. Like a technicolor mirage, he was hawking CDs and singing along to “You Don’t Love Me,” which was blaring, a tad trebly, from a boom box atop a fold-out table.

I spent much of my time, and some of my happiest hours, hanging out in record stores—until they all disappeared about five years ago. So I was relieved to see this funky dude and his valuables, because I knew how to behave in the presence of funky dudes and their coveted records. I nodded and smiled and proceeded to evaluate the merchandise. As I flipped through his CDs, the cases clacked against one another, and the familiar sound not only restored a sense of equilibrium within me, but generated the rumblings of anticipation—a fluttery response in my gut, the slim possibility that, hidden within the stacks, was an album that would alter my perception, just enough, so as to restore the wonder and blue-sky beauty of the ordinary world.

I spotted a Paul Butterfield recording that looked promising, and though I no longer owned a CD player, I wanted to buy it. But the cost was fifteen bucks, and I only had ten in my wallet, and there was no A.T.M. in sight. Holding the Butterfield CD, wrapped in cellophane, the psychedelic cover a piece of artwork in its own right, it suddenly seemed inconceivable I could live without the album, lest I wished to lead a life of regret and unfulfilled desire.

Since I was at a flea market, I called out to the proprietor and offered him my ten bucks, to which he replied, in so many words, that I could take said money and, for all he cared, chuck it in the East River—fifteen bucks or no deal. Well, I told him, that’s not very nice, and a bit unreasonable, as I could easily, in the comfort of my own home, dig up the Butterfield album on Spotify, and listen to it for free. “Whatever you prefer,” he said, as if there were really a choice. I responded with a big laugh—a chortle—the condescending kind reserved for prickly geezers who, despite overwhelming evidence to the contrary, cling to the last vestiges of a dead world. “Whipping Post” began to play out of his boom box, and, as he sang along, I walked away, his raspy voice, and Berry Oakley’s distorted bass line, fading in my wake.

After returning to my apartment, I powered up my computer and, without any hassle, quickly located the Butterfield album, whose track listing was laid out on the screen like an account ledger. This was supposed to be a victory of sorts, but I was quickly overcome by the blunt banality of the moment. In front of me was not only the album I desired, but also every other Butterfield recording ever made. And once I sampled and sated my hunger for Paul Butterfield’s blues, I could locate just about any recording ever made. But what, I wondered, were the consequences?

by Mike Spies, New Yorker |  Read more:

Thoughts on The Facts of the Matter


What is the relationship of truth and invention in literary nonfiction? Over at TriQuarterly, an anonymous post called “The Facts of the Matter” frames the issue in a fascinating way. Presented as a personal essay, written by a middle-aged male author who, as an undergraduate at Yale, sexually assaulted “a girl I liked,” it is a meditation on revelation, narrative and construction, raising questions about the interplay of fact and narrative by admitting to a brutal truth.

Or is it? An editor’s note suggests that something else may be at work. “When we received this anonymous nonfiction submission,” it reads, “it caused quite a stir. One staff member insisted we call the New Haven, Ct., police immediately to report the twentieth-century crime it recounts. But first, we figured out by the mailing address that the author was someone whose work had been solicited for TriQuarterly. Other questions remained. What animal was this? A memoir? Essay? Craft essay? Fictional autobiography? Should we publish it with an introduction, a warning -- and what should we say? The author later labeled it ‘meta-nonfiction.’ We thought it was worth publishing for the issues it raises.”

And what issues are those? First, I think, is anonymity, which puts a barrier between writer and reader that belies the intentions of the form. A key faith of the personal essay, after all, is its intimacy, the idea that we are in the presence of a writer, working under his or her own name and in his or her own voice, as something profound is explored.

That exploration doesn’t have to be dramatic -- I think of Bernard Cooper’s meditation on sighing or Joan Didion’s riff on migraines -- but at its heart is authorial identity. And the first building block of identity is a name. This is one of the key ways we position ourselves, as readers, in an essay: to know who is speaking, and why. For that reason, the anonymity here makes me immediately suspicious, as if the essay were a kind of con.

And yet, what essay -- or for that matter, what novel, story, film, song, painting -- isn’t a con at the most basic level, a manipulation of memory and experience, a shaping of the chaos of the world? This is the paradox of art, that it is both utterly necessary and utterly invented, and it is the paradox of this post, as well.

As the anonymous author notes: “Would it matter to know my name, my race, or hers, or is a piece of nonfiction more potent for not knowing who I am, for not being able to make this personal, singular, my problem, not yours? Is it discretion not to reveal more of the facts, protecting her identity, or am I merely protecting my own? How much telling is a factual tale, and how much telling is too much? (Does it matter that I’ve never told anyone this?)”

by David L. Ulin, LA Times |  Read more:

Jean Négulesco: Guitar and White Vase, 1929

I-502: The End of Prohibition

Washington enthusiastically leapt into history Tuesday, becoming the first state, with Colorado, to reject federal drug-control policy and legalize recreational marijuana use.

Initiative 502 was winning 55 to 45 percent, with support from more than half of Washington's counties, rural and urban.

The vote puts Washington and Colorado to the left of the Netherlands on marijuana law, and makes them the nexus of a new social experiment with uncertain consequences. National and international media watched as vote counts rolled into I-502's election-night party in Seattle amid jubilant cheers.

"I'm going to go ahead and give my victory speech right now. After this I can go sit down and stop shaking," said Alison Holcomb, I-502's campaign manager and primary architect.

"Today the state of Washington looked at 75 years of national marijuana prohibition and said it is time for a new approach," she said.

As of Dec. 6, it will no longer be illegal for adults 21 and over to possess an ounce of marijuana. A new "drugged driving" law for marijuana impairment also kicks in then.

Tuesday's vote also begins a yearlong process for the state Liquor Control Board to set rules for heavily taxed and regulated sales at state-licensed marijuana stores, which are estimated to raise $1.9 billion in new revenue over five years.

Many legal experts expect the U.S. Justice Department, which remained silent during presidential-year politics, to push back and perhaps sue to block I-502 based on federal supremacy.

But Seattle City Attorney Pete Holmes said Seattle's U.S. Attorney Jenny Durkan told him Tuesday the federal government "has no plans, except to talk."

by Jonathan Martin, Seattle Times |  Read more:
Image: Wikipedia