Monday, January 14, 2013

Time Passing

So here's the problem. If you don't believe in God or an afterlife; or if you believe that the existence of God or an afterlife are fundamentally unanswerable questions; or if you do believe in God or an afterlife but you accept that your belief is just that, a belief, something you believe rather than something you know -- if any of that is true for you, then death can be an appalling thing to think about. Not just frightening, not just painful. It can be paralyzing. The fact that your lifespan is an infinitesimally tiny fragment in the life of the universe, and that there is, at the very least, a strong possibility that when you die, you disappear completely and forever, and that in five hundred years nobody will remember you and in five billion years the Earth will be boiled into the sun: this can be a profound and defining truth about your existence that you reflexively repulse, that you flinch away from and refuse to accept or even think about, consistently pushing to the back of your mind whenever it sneaks up, for fear that if you allow it to sit in your mind even for a minute, it will swallow everything else. It can make everything you do, and everything anyone else does, seem meaningless, trivial to the point of absurdity. It can make you feel erased, wipe out joy, make your life seem like ashes in your hands. Those of us who are skeptics and doubters are sometimes dismissive of people who fervently hold beliefs they have no evidence for simply because they find them comforting -- but when you're in the grip of this sort of existential despair, it can be hard to feel like you have anything but that handful of ashes to offer them in exchange.

But here's the thing. I think it's possible to be an agnostic, or an atheist, or to have religious or spiritual beliefs that you don't have certainty about, and still feel okay about death. I think there are ways to look at death, ways to experience the death of other people and to contemplate our own, that allow us to feel the value of life without denying the finality of death. I can't make myself believe in things I don't actually believe -- Heaven, or reincarnation, or a greater divine plan for our lives -- simply because believing those things would make death easier to accept. And I don't think I have to, or that anyone has to. I think there are ways to think about death that are comforting, that give peace and solace, that allow our lives to have meaning and even give us more of that meaning -- and that have nothing whatsoever to do with any kind of God, or any kind of afterlife.

Here's the first thing. The first thing is time, and the fact that we live in it. Our existence and experience are dependent on the passing of time, and on change. No, not dependent -- dependent is too weak a word. Time and change are integral to who we are, the foundation of our consciousness, and its warp and weft as well. I can't imagine what it would mean to be conscious without passing through time and being aware of it. There may be some form of existence outside of time, some plane of being in which change and the passage of time are an illusion, but it certainly isn't ours.

And inherent in change is loss. The passing of time has loss and death woven into it: each new moment kills the moment before it, and its own death is implied in the moment that comes after. There is no way to exist in the world of change without accepting loss, if only the loss of a moment in time: the way the sky looks right now, the motion of the air, the number of birds in the tree outside your window, the temperature, the placement of your body, the position of the people in the street. It's inherent in the nature of having moments: you never get to have this exact one again.

And a good thing, too. Because all the things that give life joy and meaning -- music, conversation, eating, dancing, playing with children, reading, thinking, making love, all of it -- are based on time passing, and on change, and on the loss of an infinitude of moments passing through us and then behind us. Without loss and death, we don't get to have existence. We don't get to have Shakespeare, or sex, or five-spice chicken, without allowing their existence and our experience of them to come into being and then pass on. We don't get to listen to Louis Armstrong without letting the E-flat disappear and turn into a G. We don't get to watch "Groundhog Day" without letting each frame of it pass in front of us for a 24th of a second and then move on. We don't get to walk in the forest without passing by each tree and letting it fall behind us; we don't even get to stand still in the forest and gaze at one tree for hours without seeing the wind blow off a leaf, a bird break off a twig for its nest, the clouds moving behind it, each manifestation of the tree dying and a new one taking its place.

And we wouldn't want to have it if we could. The alternative would be time frozen, a single frame of the film, with nothing to precede it and nothing to come after. I don't think any of us would want that. And if we don't want that, if instead we want the world of change, the world of music and talking and sex and whatnot, then it is worth our while to accept, and even love, the loss and the death that make it possible.

Here's the second thing. Imagine, for a moment, stepping away from time, the way you'd step back from a physical place, to get a better perspective on it. Imagine being outside of time, looking at all of it as a whole -- history, the present, the future -- the way the astronauts stepped back from the Earth and saw it whole.

Keep that image in your mind. Like a timeline in a history class, but going infinitely forward and infinitely back. And now think of a life, a segment of that timeline, one that starts in, say, 1961, and ends in, say, 2037. Does that life go away when 2037 turns into 2038? Do the years 1961 through 2037 disappear from time simply because we move on from them and into a new time, any more than Chicago disappears when we leave it behind and go to California?

It does not. The time that you live in will always exist, even after you've passed out of it, just like Paris exists before you visit it, and continues to exist after you leave. And the fact that people in the 23rd century will probably never know you were alive... that doesn't make your life disappear, any more than Paris disappears if your cousin Ethel never sees it. Your segment on that timeline will always have been there. The fact of your death doesn't make the time that you were alive disappear.

And it doesn't make it meaningless. Yes, stepping back and contemplating all of time and space can be daunting, can make you feel tiny and trivial. And that perception isn't entirely inaccurate. It's true; the small slice of time that we have is no more important than the infinitude of time that came before we were born, or the infinitude that will follow after we die.

But it's no less important, either.

I don't know what happens when we die. I don't know if we come back in a different body, or if we get to hover over time and space and view it in all its glory and splendor, or if our souls dissolve into the world-soul the way our bodies dissolve into the ground, or if, as seems very likely, we simply disappear. I have no idea. And I don't know that it matters. What matters is that we get to be alive. We get to be conscious. We get to be connected with each other, and with the world, and we get to be aware of that connection and to spend a few years mucking about in its possibilities. We get to have a slice of time and space that's ours. As it happened, we got the slice that has Beatles records and Thai restaurants and AIDS and the Internet. People who came before us got the slice that had horse-drawn carriages and whist and dysentery, or the one that had stone huts and Viking invasions and pigs in the yard. And the people who come after us will get the slice that has, I don't know, flying cars and soybean pies and identity chips in their brains. But our slice is no less important because it comes when it does, and it's no less important because we'll leave it someday. The fact that time will continue after we die does not negate the time that we were alive. We are alive now, and nothing can erase that.

Greta Christina's Blog (repost)

Creativity vs. Innovation

Creativity

Innovation

Sometimes there's a difference

[ed. Repost]
[ed. Is it just me, or is it hard to find good new music these days? Here are a few favorites from the archives.]




Walmart, Your One-Stop Destination


When Adam Lanza entered Sandy Hook Elementary School on Friday, December 14, inexplicably bent on ending as many lives as possible, he was carrying a Bushmaster AR-15 assault rifle and several high-capacity magazines. Sadly, this isn’t the first time the country has had to deal with the aftermath of a horrific shooting spree, nor is it the first time we’ve encountered an AR-15 in this context: only days earlier, it was the weapon of choice for a shooting at an Oregon mall that killed two people. Five months earlier, it was used by James Holmes in an attack that wounded fifty-eight people and killed twelve in an Aurora, Colorado, movie theater. And several years before that, a man and his teenage accomplice used a Bushmaster AR-15 to terrorize the Washington, DC, area with a series of random shootings.

Although it is not yet clear where the Bushmaster AR-15 used by Lanza (and registered to his mother) was purchased, the model is familiar to many Walmart shoppers. It’s on sale at about 1,700 Walmart stores nationwide, though the retail chain pulled the weapon from its website three days after the attack. While the deadly rampage in Connecticut has finally and unmistakably highlighted the madness of making such weapons readily available, it’s a concern that many people with a Walmart in their community have been trying to address for years. Several months back, the Rev. Greg Brown had a troubling conversation with two members of his youth group from the northwest side of South Bend, Indiana. “They were honor roll students and little young folks that love the Lord,” Brown recounted. “One of the kids came up to me and said, ‘Rev, you ain’t gonna believe what happened the other day at Walmart.’” The kids went on to describe how, on a recent visit to the big-box store, a man asked them to fill up a gym bag with ammunition and sneak it out of the store for him. They declined.

Walmart’s ammunition sales have troubled Brown since at least 2009, when two teenagers shoplifted bullets from the local Walmart, shot at an employee who tried to stop them in the parking lot, and then embarked on a citywide robbery spree in which one man was seriously injured. When Brown headed down to the store to see how easy it would be to steal ammunition, he was shocked. Not only were there bullets arrayed on the unlocked shelves; there were rows of guns as well, including assault rifles.

South Bend has the highest rate of violent crime per capita in Indiana, well more than double the national median. Brown was outraged that Walmart was even selling these weapons, let alone that they were unlocked and under the supervision of hourly employees without specific training in firearm handling and sales. (Brown says a former Marine handles the gun sales at a nearby Dick’s Sporting Goods.) “It’s totally wrong, and it’s totally unacceptable,” he said. “You look back there and see a dad holding a gun, his son pulling on his pocket. And the son knows the gun is going home. The son’s going to know where the gun is.”

South Bend isn’t the only place where Walmart is stocking guns, including combat-style weapons and gun-related paraphernalia. The big-box chain at one point sold guns in only about a third of its stores, mainly in remote rural areas where hunting is popular. But in 2011, without much fanfare, Walmart expanded gun sales to about half of its 3,982 stores nationwide, including those in more urban areas like Albuquerque and Spokane.

The expansion of gun sales at Walmart came after a five-year slowdown. In 2006, the chain announced that it was rolling back gun sales, citing declining profit margins on the relatively expensive weapons, which even at Walmart can retail for hundreds of dollars. But in 2011, company executives were looking at eight straight quarters of declining sales at stores open for a year or more—the worst slump in Walmart’s history.

They must also have noticed that Barack Obama’s inauguration had sparked a rally in gun sales, which have steadily increased every year since 2008. The government isn’t allowed to track firearm sales, but the FBI does release figures on how many retailers ask it to run background checks—a relatively reliable indicator of total gun sales, although likely a lowball estimate, since a person can buy multiple guns on a single background check, and many gun shows aren’t required to perform such checks. In 2007, retailers asked the FBI for just over 11 million background checks; by the end of 2009, 14 million checks were requested—a 27 percent increase.

In April 2011, Walmart began stocking guns in more and more stores, expanding the sales to 1,750 outlets nationwide. By the end of that year, the FBI received 16.4 million background check requests; the number is 16.8 million this year. Overall Walmart sales figures are back on track after the 2011 slump, and executive vice president Duncan Mac Naughton told shareholders at a meeting in October 2012 that gun sales in particular are a staple of the chain’s strategy to continue boosting its numbers. He said that over the past twenty-six months, gun sales at Walmart stores open for a year or more were up an astonishing 76 percent, while ammunition sales were up 30 percent. Walmart is now the biggest seller of firearms and ammunition in America.

“This gun thing, it’s really just a nightmare,” says Bertha Lewis, president of the Black Institute, which has been organizing Walmart workers this year to protest wages and working conditions. Given its aggressive gun sales, Walmart’s logo “shouldn’t be a smiley face; it should be an automatic weapon,” she adds.

by George Zornick, The Nation |  Read more:
Photo: Uncredited

Do I Smell a Metaphor Melting?

Edge has a fascinating, discursive new interview with the renowned philosopher-of-mind Daniel C. Dennett. As someone who has a deep distrust of the popular metaphor that portrays the brain as a computer, I was struck by something Dennett says near the start:
“The vision of the brain as a computer, which I still champion, is changing so fast. The brain’s a computer, but it’s so different from any computer that you’re used to. It’s not like your desktop or your laptop at all, and it’s not like your iPhone except in some ways. It’s a much more interesting phenomenon.”
Normally, the explanatory power of a metaphor comes from describing a thing we don’t understand in terms of a thing we do understand. But this brain-as-computer metaphor now seems to be diverging from that model. The computer in the metaphor seems to be something very different from what we mean when we talk about a “computer.” The part of the metaphor that is supposed to be concrete has turned into a mystery fluid.
The brain is like a computer! 
Cool. What kind of computer is the brain like? 
It’s not actually like any computer that’s ever been invented. 
So what kind of computer is it like? 
It’s like the unique form of a computer that we call a brain. 
So the brain is like a brain? 
Yes, exactly.
It sounds like it’s time for a new metaphor.

The new explanatory metaphor Dennett is proposing, or at least playing with, doesn’t sound much at all like a digital computer, even if there’s computation of some sort going on:
“We’re getting away from the rigidity of that model, which was worth trying for all it was worth. You go for the low-hanging fruit first. First, you try to make minds as simple as possible. You make them as much like digital computers, as much like von Neumann machines, as possible. It doesn’t work.”
The new metaphor, like the brain itself, is much more interesting:
“Each neuron is imprisoned in your brain. I now think of these as cells within cells, as cells within prison cells. Realize that every neuron in your brain, every human cell in your body (leaving aside all the symbionts), is a direct descendent of eukaryotic cells that lived and fended for themselves for about a billion years as free-swimming, free-living little agents. They fended for themselves, and they survived. 
They had to develop an awful lot of know-how, a lot of talent, a lot of self-protective talent to do that. When they joined forces into multi-cellular creatures, they gave up a lot of that. They became, in effect, domesticated. They became part of larger, more monolithic organizations. … [B]ut in the brain I think that (and this is my wild idea) maybe only in one species, us, and maybe only in the obviously more volatile parts of the brain, the cortical areas, some little switch has been thrown in the genetics that, in effect, makes our neurons a little bit feral, a little bit like what happens when you let sheep or pigs go feral, and they recover their wild talents very fast. 
Maybe a lot of the neurons in our brains are not just capable but, if you like, motivated to be more adventurous, more exploratory or risky in the way they comport themselves, in the way they live their lives. They’re struggling amongst themselves with each other for influence, just for staying alive, and there’s competition going on between individual neurons. As soon as that happens, you have room for cooperation to create alliances, and I suspect that a more free-wheeling, anarchic organization is the secret of our greater capacities of creativity, imagination, thinking outside the box and all that, and the price we pay for it is our susceptibility to obsessions, mental illnesses, delusions and smaller problems.”
A pack of feral pigs going rogue in a jailhouse: Now, that sounds a lot like my brain. Much more so than does an iMac running Microsoft Office.

by Nicholas Carr, Rough Type |  Read more:
Photo by jennystiles315

The Flu, Explained

This year's flu season is no joke: On Friday, the Centers for Disease Control and Prevention announced that it had reached epidemic status. Although experts believe that the season may have peaked in most places, flu incidence is still thought to be very high. The media blitz about the flu seems to be an epidemic of its own—so I spoke to several experts to set the record straight on some of the most common flu questions.

Where in the US is the flu worst right now?

It's sort of hard to tell, since the CDC is not releasing any real-time data; its stats are about a week old. Also, maps like the one below don't track the flu itself, just flu-like symptoms. Here's a look at the CDC's symptom activity map for the week that ended on January 5:

Map courtesy of CDC

How do I even know I have the flu? How can my doctor tell?

To know for certain, you'd need to have a blood test. But most doctors won't do that, since it won't really change the treatment (rest, drink fluids). But there are some key differences between a bad cold and the flu, says CDC spokesman Curtis Allen. "You will be running a high temperature for several days, and it will keep you in bed for a week or more," he says. But the most distinctive feature of the flu is its sudden onset. "You could be feeling fine at 10 and very sick at noon."

If the flu season has peaked, should I still get a flu shot?

Yes. A typical flu season is 10 to 12 weeks long—so if it just peaked, that means there's still another five or six weeks left. The caveat: The shot takes about two weeks to kick in, so even if you got the shot today, you could still come down with the flu, says Allen. Even if you think you've already had the flu this year, you should get a shot; it's possible (though unlikely) that you could still come down with a different strain. (...)

How do they figure out what to put in those shots, anyway?

You can thank the Southern Hemisphere for this year's vaccine. During our summer, it's flu season down under. Public-health experts monitor the strains there and use that information to predict which strains will hit us down the road. This year, they did a pretty good job of figuring out which three strains to include in the vaccines; the CDC estimates that 90 percent of this year's flu cases are from one of the three strains in the vaccine. Next year, Allen hopes that the vaccine will include four strains.

That said, flu prediction is not an exact science, says Jeffrey Shaman, a flu researcher and assistant professor in the department of environmental health sciences at Columbia University's Mailman School of Public Health. "They monitor the flu, they see what's happening," he says. "People look at that and get a feel for the patterns. It's what we used to do for weather before we had sophisticated models." Shaman is trying to improve this science with a new tool that uses real-time data from Google Flu Trends, which employs users' search terms ("flu," "fever," "body aches," etc.) to paint a picture of the location and severity of the flu. (It's actually pretty accurate.) Since Google's information is constantly updating, Shaman's models are continuously learning new information about how the flu behaves—thus increasing their accuracy. Shaman says his model can offer pretty accurate predictions of flu timing and case numbers at the municipal level. He declined, however, to offer a prediction for this year.

by Kiera Butler, Mother Jones |  Read more:
Photo: USACE Europe District/Flickr

Sunday, January 13, 2013


Josef Stoitzner (Austrian, 1884-1951), Aus den Tauern, c. 1915. Colour woodcut.

The New Monopolists

Ask Jack Dorsey, the co-founder of the social network Twitter and the mobile-payment start-up Square, what his two companies have in common, and he has a quick answer: “They’re both utilities.” Mark Zuckerberg might agree: he spent years trying to convince people that Facebook is not a social network but a “social utility.”

It’s an intriguing choice of words for such of-the-moment entrepreneurs. Utilities tend to be boring, slow-growing beasts. They also—and this is the more important point—tend to be monopolies that are either regulated heavily by governments or owned outright by them.

Indeed, once they get beyond a certain size, technology companies do become wary of the word. Google has been called a utility by lots of people, but you won’t hear the company’s executives using the term (at least, I couldn’t find any examples). And Zuckerberg, when asked in 2010 whether, as a utility, Facebook ought to be regulated, said he hadn’t meant the word that way at all: “Something that’s cool can fade. But something that’s useful won’t. That’s what I meant by utility.”

Yet there are lots of useful things in the world—clothing, breakfast, this issue of The Atlantic—that no one would ever think of calling a utility. Yes, there is an innocuous class of computer software known as utilities. But what companies like Twitter, Square, and Facebook—not to mention Google, Amazon, and Apple—aspire to, and in some cases have achieved, is a status similar to that of traditional utilities like Ma Bell. They attempt to position themselves such that customers can’t get around them, or can’t afford to leave them. And when they succeed, they start appearing to some customers, would-be competitors, and regulators like scary monopolies that somebody needs to do something about.

The connection between attractive business opportunity and monopoly is not new. Pursuing a “short run” monopoly, the economist Joseph Schumpeter wrote in 1942, is what profit-seeking enterprises do—in the process, driving significant innovation and economic growth. In the 1970s, the business-school discipline of strategy arose as the study of how to build and defend these short-run monopolies—a sort of mirror image of the antitrust classes long found in law schools. “Strategy is antitrust with a minus sign in front of it,” says the Columbia Law School professor Tim Wu, who has taught both subjects. That is, strategy tries to maximize what antitrust tries to minimize.

What is new is that the path from looking for an edge to being attacked as a monopoly has gotten a lot shorter—and that gaining a monopoly seems such a plausible goal within some of the fastest-growing parts of the economy. Standard Oil had been in business for 36 years when the Justice Department sued it for antitrust violations; AT&T for 97. By comparison, Microsoft was just 15 when federal regulators started looking into its business practices, 23 when Justice sued. Google, a mere 14 years old, is already under antitrust investigation.

Then there’s Facebook, which turns 9 in February. The company has not yet been the target of significant antitrust attention. But its ubiquity and reach into users’ daily lives give it a status that—socially if not economically—really does feel like that of a monopoly utility. Every tweak Facebook makes to its privacy settings or its look sparks heated public discussion. Last summer, the company had to agree to regular audits of its privacy policy, mandated by the Federal Trade Commission. One even hears occasional calls (meant more as thought experiments than as serious policy proposals, but still) for it to be nationalized.

Today’s technology entrepreneurs are well aware of the tight link between profit and monopoly. Few are as open about it as the PayPal co-founder and early Facebook investor Peter Thiel, who has described monopoly as the natural goal of any smart tech entrepreneur. But everybody gets the basic idea. “There’s a joke in Silicon Valley,” says the UC Berkeley economist Carl Shapiro: “ ‘You know you’ve really made it when you’ve got antitrust problems.’ That’s the sign of success.”

The modern theory of monopoly began its rise in the mid-1980s, when a handful of scholars—Shapiro among them—noted some salient characteristics of a fast-growing new industry. Many information-technology businesses, observed Stanford’s W. Brian Arthur, benefit from increasing returns: as they make more of something, the cost per piece keeps falling. This is especially true of software, for which the cost per piece moves quickly to zero. (Increasing returns had been deemed in the late 19th century to be the mark of a natural monopoly, an industry that would inevitably be dominated by one entity.)

Another trait that characterized many technology businesses, these same scholars observed, was lock-in, or prohibitive switching costs. Companies that committed to getting their mainframe computers from, say, IBM would eventually find switching to another provider hugely expensive and disruptive. (Later, with the PC, Microsoft was able to shift the lock-in from hardware to software.)

But most intriguing of all was the enormous power of network effects. A telephone “without a connection at the other end of the line … is one of the most useless things in the world,” AT&T President Theodore N. Vail wrote in the company’s annual report in 1908. “Its value depends on the connection with the other telephone—and increases with the number of connections.” In 1980, Bob Metcalfe, an inventor trying to persuade people to buy his $5,000 Ethernet cards, which connected computers in a local area network, came up with a formula that expressed the value of a network as the number of connections squared. The specifics of “Metcalfe’s Law” have frequently been challenged, but the basic idea that networks add value exponentially as they grow has not.

For society as a whole, though, these phenomena can have a dark side. In a famous paper, the Stanford economic historian Paul David described in 1985 how the ubiquitous QWERTY keyboard layout had been devised mainly to prevent jamming of primitive typewriter mechanisms. Later, as typewriters improved, there were repeated attempts to supplant QWERTY with configurations that allowed for faster typing. But by then the layout’s high switching costs had made it an impregnable standard. Economic forces, wrote David, “drove the industry prematurely into standardization on the wrong system.”

by Justin Fox, The Atlantic |  Read more:
Illustration: Christoph Niemann

Saturday, January 12, 2013

The George Saunders Experiment


[ed. I just finished reading Saunders' In Persuasion Nation and all I can say is 'wow'. Almost the same dream-sense of reality you'd get reading Gabriel Garcia Marquez or Franz Kafka, but with a contemporary slant. If you'd like a sample of his work, check out this previous post: The Semplica-Girl Diaries.]

You could call this desire — to really have that awareness, to be as open as possible, all the time, to beauty and cruelty and stupid human fallibility and unexpected grace — the George Saunders Experiment. It’s the trope of all tropes to say that a writer is “the writer for our time.” Still, if we were to define “our time” as a historical moment in which the country we live in is dropping bombs on people about whose lives we have the most abstracted and unnuanced ideas, and who have the most distorted notions of ours; or a time in which some of us are desperate simply for a job that would lead to the ability to purchase a few things that would make our kids happy and result in an uptick in self- and family esteem; or even just a time when a portion of the population occasionally feels scared out of its wits for reasons that are hard to name, or overcome with emotion when we see our children asleep, or happy when we risk revealing ourselves to someone and they respond with kindness — if we define “our time” in these ways, then George Saunders is the writer for our time.

This week, Saunders’s fourth book of stories, “Tenth of December,” will be published by Random House. He is 54 years old and published his first book, “CivilWarLand in Bad Decline,” in 1996, when he was 37. Since then there have been two other collections, “Pastoralia” and “In Persuasion Nation”; a novella, “The Brief and Frightening Reign of Phil”; a children’s book, “The Very Persistent Gappers of Frip”; and a collection of reported nonfiction, essays and short humor pieces called “The Braindead Megaphone.”

When “CivilWarLand” first came out, there was a lot of talk about Saunders as a new, savage, satirical voice bursting onto the scene, though he’d been publishing the stories one at a time over eight years, writing them while making a living at a day job preparing technical reports for a company called the Radian Corporation, in Rochester. His stories are set in what might be described as a just slightly futuristic America or, maybe better, present-day America, where, because of the exigencies of capitalism, things have gotten a little weird. These initial stories often take place in theme parks gone to seed or soul-withering exurban office strips, but the stories themselves are overflowing with vitality; they are sometimes very dark but they are also very, very funny. The characters speak in a strange new language — a kind of heightened bureaucratese, or a passively received vernacular that is built around self-improvement clichés (“It made me livid and twice that night I had to step into a closet and perform my Hatred Abatement Breathing”) — and this lends them the feeling of allegory, though they are something else too, that’s harder to place. The book was published right around the same time as David Foster Wallace’s “Infinite Jest,” and it felt back then as if those two writers (and a handful of others) were busy establishing the new terms for contemporary American fiction. (...)

That kind of thing has been said a lot about Saunders since then. For people who pay close attention to the state of American fiction, he has become a kind of superhero. His stories now appear regularly in The New Yorker, he has been anthologized all over the place, and he has won a bunch of awards, among them a “genius grant” in 2006 from the MacArthur Foundation, which described him as a “highly imaginative author [who] continues to influence a generation of young writers and brings to contemporary American fiction a sense of humor, pathos and literary style all his own.” As Joshua Ferris recently wrote in an introduction for the reissue last fall, in e-book form, of “CivilWarLand”: “Part of the reason it’s so hard to talk about him is the shared acknowledgment among writers that Saunders is somehow a little more than just a writer. . . . [He] writes like something of a saint. He seems in touch with some better being.”

It is true that if there exists a “writer’s writer,” Saunders is the guy. “There is really no one like him,” Lorrie Moore wrote. “He is an original — but everyone knows that.” Tobias Wolff, who taught Saunders when he was in the graduate writing program at Syracuse in the mid-’80s, said, “He’s been one of the luminous spots of our literature for the past 20 years,” and then added what may be the most elegant compliment I’ve ever heard paid to another person: “He’s such a generous spirit, you’d be embarrassed to behave in a small way around him.” And Mary Karr, who has been a colleague of Saunders’s at Syracuse since he joined the faculty in the mid-’90s (and who also, incidentally, is a practicing Catholic with a wonderful singing voice and a spectacularly inventive foul mouth), told me, “I think he’s the best short-story writer in English alive.”

Aside from all the formal invention and satirical energy of Saunders’s fiction, the main thing about it, which tends not to get its due, is how much it makes you feel. I’ve loved Saunders’s work for years and spent a lot of hours with him over the past few months trying to understand how he’s able to do what he does, but it has been a real struggle to find an accurate way to express my emotional response to his stories. One thing is that you read them and you feel known, if that makes any sense. Or, possibly even woollier, you feel as if he understands humanity in a way that no one else quite does, and you’re comforted by it. Even if that comfort often comes in very strange packages, like say, a story in which a once-chaste aunt comes back from the dead to encourage her nephew, who works at a male-stripper restaurant (sort of like Hooters, except with guys, and sleazier), to start unzipping and showing his wares to the patrons, so he can make extra tips and help his family avert a tragic future that she has foretold.

Junot Díaz described the Saunders effect to me this way: “There’s no one who has a better eye for the absurd and dehumanizing parameters of our current culture of capital. But then the other side is how the cool rigor of his fiction is counterbalanced by this enormous compassion. Just how capacious his moral vision is sometimes gets lost, because few people cut as hard or deep as Saunders does.”

by Joel Lovell, NY Times |  Read more:
Photo: Damon Winter/The New York Times

The Inspiring Heroism of Aaron Swartz


[ed. If you're not familiar with Aaron Swartz - the struggles he endured and the causes he championed - please click on the links provided to get a fuller picture of his legacy. Here's another one.]

Aaron Swartz, the computer programmer and internet freedom activist, committed suicide on Friday in New York at the age of 26. As the incredibly moving remembrances from his friends such as Cory Doctorow and Larry Lessig attest, he was unquestionably brilliant but also - like most everyone - a complex human being plagued by demons and flaws. For many reasons, I don't believe in whitewashing someone's life or beatifying them upon death. But, to me, much of Swartz's tragically short life was filled with acts that are genuinely and, in the most literal and noble sense, heroic. I think that's really worth thinking about today.

At the age of 14, Swartz played a key role in developing the RSS software that is still widely used to enable people to manage what they read on the internet. As a teenager, he also played a vital role in the creation of Reddit, the wildly popular social networking news site. When Conde Nast purchased Reddit, Swartz received a substantial sum of money at a very young age. He became something of a legend in the internet and programming world before he was 18. His path to internet mogul status and the great riches it entails was clear, easy and virtually guaranteed: a path which so many other young internet entrepreneurs have found irresistible, monomaniacally devoting themselves to making more and more money long after they have more than they could ever hope to spend.

But rather obviously, Swartz had little interest in devoting his life to his own material enrichment, despite how easy it would have been for him. As Lessig wrote: "Aaron had literally done nothing in his life 'to make money' . . . Aaron was always and only working for (at least his conception of) the public good."

Specifically, he committed himself to the causes in which he so passionately believed: internet freedom, civil liberties, making information and knowledge as available as possible. Here he is in his May 2012 keynote address at the Freedom To Connect conference discussing the role he played in stopping SOPA, the movie-industry-demanded legislation that would have vested the government with dangerous censorship powers over the internet.

Critically, Swartz didn't commit himself to these causes merely by talking about them or advocating for them. He repeatedly sacrificed his own interests, even his liberty, in order to defend these values and challenge and subvert the most powerful factions that were their enemies. That's what makes him, in my view, so consummately heroic.

In 2008, Swartz targeted Pacer, the online service that provides access to court documents for a per-page fee. What offended Swartz and others was that people were forced to pay for access to public court documents that were created at public expense. Along with a friend, Swartz created a program to download millions of those documents and then, as Doctorow wrote, "spent a small fortune fetching a titanic amount of data and putting it into the public domain." For that act of civil disobedience, he was investigated and harassed by the FBI, but never charged.

But in July 2011, Swartz was arrested for allegedly targeting JSTOR, the online publishing company that digitizes and distributes scholarly articles written by academics and then sells them, often at a high price, to subscribers. As Maria Bustillos detailed, none of the money goes to the actual writers (usually professors) who wrote the scholarly articles - they are usually not paid for writing them - but instead goes to the publishers.

This system offended Swartz (and many other free-data activists) for two reasons: it charged large fees for access to these articles but did not compensate the authors, and worse, it ensured that huge numbers of people are denied access to the scholarship produced by America's colleges and universities. The indictment filed against Swartz alleged that he used his access as a Harvard fellow to the JSTOR system to download millions of articles with the intent to distribute them online for free; when he was detected and his access was cut off, the indictment claims he then trespassed into an MIT computer-wiring closet in order to physically download the data directly onto his laptop.

Swartz never distributed any of these downloaded articles. He never intended to profit even a single penny from anything he did, and never did profit in any way. He had every right to download the articles as an authorized JSTOR user; at worst, he intended to violate the company's "terms of service" by making the articles available to the public. Once arrested, he returned all copies of everything he downloaded and vowed not to use them. JSTOR told federal prosecutors that it had no intent to see him prosecuted, though MIT remained ambiguous about its wishes.

But federal prosecutors ignored the wishes of the alleged "victims". Led by a federal prosecutor in Boston notorious for her overzealous prosecutions, the DOJ threw the book at him, charging Swartz with multiple felonies which carried a total sentence of several decades in prison and $1 million in fines.

Swartz's trial on these criminal charges was scheduled to begin in two months. He adamantly refused to plead guilty to a felony because he did not want to spend the rest of his life as a convicted felon with all the stigma and rights-denials that entails. The criminal proceedings, as Lessig put it, already put him in a predicament where "his wealth [was] bled dry, yet unable to appeal openly to us for the financial help he needed to fund his defense, at least without risking the ire of a district court judge."

To say that the DOJ's treatment of Swartz was excessive and vindictive is an extreme understatement. When I wrote about Swartz's plight last August, I wrote that he was "being prosecuted by the DOJ with obscene over-zealousness". Timothy Lee wrote the definitive article in 2011 explaining why, even if all the allegations in the indictment are true, the only real crime committed by Swartz was basic trespassing, for which people are punished, at most, with 30 days in jail and a $100 fine, about which Lee wrote: "That seems about right: if he's going to serve prison time, it should be measured in days rather than years."

Nobody knows for sure why federal prosecutors decided to pursue Swartz so vindictively, as though he had committed some sort of major crime that deserved many years in prison and financial ruin. Some theorized that the DOJ hated him for his serial activism and civil disobedience. Others speculated that, as Doctorow put it, "the feds were chasing down all the Cambridge hackers who had any connection to Bradley Manning in the hopes of turning one of them."

I believe it has more to do with what I told the New York Times' Noam Cohen for an article he wrote on Swartz's case. Swartz's activism, I argued, was waged as part of one of the most vigorously contested battles - namely, the war over how the internet is used and who controls the information that flows on it - and that was his real crime in the eyes of the US government: challenging its authority and those of corporate factions to maintain a stranglehold on that information. In that above-referenced speech on SOPA, Swartz discussed the grave dangers to internet freedom and free expression and assembly posed by the government's efforts to control the internet with expansive interpretations of copyright law and other weapons to limit access to information.

That's a major part of why I consider him heroic. He wasn't merely sacrificing himself for a cause. It was a cause of supreme importance to people and movements around the world - internet freedom - and he did it by knowingly confronting the most powerful state and corporate factions because he concluded that was the only way to achieve these ends.

by Glenn Greenwald, The Guardian |  Read more:
Image: IMG_9892.JPG, a Creative Commons Attribution (2.0) image from quinn's photostream

Mike Rooney - Key’s Eatery, 2012

Red shoes

Slim Harpo


The Case for Walkability as an Economic Development Tool

A terrific street redesign is assisting economic development in a southern California community that has suffered from changing economic conditions but is nevertheless seeing significant population growth. This is a story of municipal foresight, excellent recent planning, and green ambition.

Lancaster is a fast-growing city of a little over 150,000 in far northern Los Angeles County, about 70 miles from downtown Los Angeles. Its population has more than tripled since 1980; it increased by nearly a third from 2000 to 2010. It is racially mixed (38 percent Latino, 34 percent white, 20 percent African-American) and, like so many fast-growing western cities, decidedly sprawling. The satellite view on Google Earth reveals a patchwork pattern of leapfrog development, carved out of the desert. It is a city with a very suburban character.


Lancaster’s economic condition isn’t among the country’s very worst, but it certainly has been better. According to City-Data.com, the median price of home sales in the city plummeted by almost two thirds from 2007 to 2009, from $350,000 to about $125,000, more or less where it still stands. As of August 2012, unemployment stood at 15.7 percent, way above the state average of 10.4 percent. Not far from Edwards Air Force Base and related industry, the city’s fortunes have long been associated with aerospace engineering and defense contractors, but some major employers, including Lockheed-Martin, have been moving their investments elsewhere in recent years.

Sprawl and disinvestment have also left scars. Greg Konar writes in the San Diego Planning Journal:
By the late 1980s the City’s historic downtown was in serious decline. Most retailers and commercial services had long since migrated to commercial centers and strip malls in other parts of the city. For years big box retailers and regional malls had captured nearly all new commercial growth. Much of it was concentrated along the Antelope Valley Freeway (I-14). Meanwhile the historic downtown deteriorated rapidly. Crime became an increasing problem and the surrounding older neighborhoods were suffering.
Before and after shots of Lancaster Boulevard. Images courtesy of the City of Lancaster

That’s a pattern all too typical of America in the late 20th century, but Lancaster moved to do something about it, including in 2008 the adoption of a form-based zoning code for the downtown Lancaster Boulevard corridor. (Form-based codes encourage walkability by encouraging mixed uses and a pedestrian-friendly streetscape.) The city also hired the well-known architecture and planning firm Moule & Polyzoides to capitalize on the opportunities created by the code by redesigning the boulevard to attract businesses and people.

The ramblas on the remade Lancaster boulevard. Image courtesy of Moule and Polyzoides

The results – a rejuvenated section of downtown now named THE BLVD – have been spectacular, as the photos accompanying this article show. The project has won multiple awards, including EPA’s top national award for smart growth achievement. Moule & Polyzoides describe the design features:
Among the Plan’s key elements are wide, pedestrian-friendly sidewalks, awnings and arcades, outdoor dining, single travel lanes, enhanced crosswalks, abundant street trees and shading, and added lighting, gateways and public art. Lancaster Boulevard has been transformed into an attractive shopping destination, a magnet for pedestrian activity and a venue for civic gatherings.
Greg Konar's article, which I cited above, provides an excellent review of what makes the design features of the project work so well.

The remade Lancaster boulevard. Image courtesy of Moule and Polyzoides

Justly proud of their work, the architects recount some of what’s happened in the area since the project was completed:
  • 49 new businesses along the boulevard and an almost doubling of revenue generated compared to just before the work began.
  • An almost 10 percent rise in downtown property values.
  • 800 new permanent jobs, 1,100 temporary construction jobs, and an estimated $273 million in economic output.
  • 800 new and rehabbed homes.
  • Dramatically increased roadway safety, with traffic collisions cut in half and collisions with personal injury cut by 85 percent.

by Kaid Benfield, Atlantic Cities |  Read more:

Town of Whispers

One bottle of Astroglide. Four bottles of baby oil. One Nikon 35-mm. camera, one Sony camcorder, one Samsung camcorder, one Sony Handycam, one Pentax camera, one JVC camcorder, a black Fuji camera, two Canon Rebel cameras. Lots of laptops, DVDs, external hard drives, and condoms. A “Domination Fetish” sheet. A white envelope stuffed with $1,000 in cash. Surveillance glasses and black night-vision glasses. Eight Express Mail labels addressed to Strong Investigations. A notebook and a black leather appointment book, both filled with names. Excel spreadsheets containing e-mail addresses and phone numbers. Ledgers of sexual acts, with a monetary value given to each one, and hours of video recordings of many of them. A CD labeled “Yeah.”

These are just some of the items that were taken from the homes, offices, and cars of 30-year-old Alexis Wright, who made part of her living teaching a popular Latin-inspired fitness class called Zumba, and 57-year-old Mark Strong Sr., an insurance salesman. They’re the two figures at the center of a prostitution scandal that has captured the attention of the world. In court documents, the police allege not only that Wright was a prostitute but that she shared her professional encounters with Strong, either by sending him digital tapes of them or live video via Skype. Wright was sexually involved with Strong, who was also a licensed private investigator. Allegedly, she asked him to run her clients’ license‑plate numbers through the state motor-vehicle database, presumably to get their real identities. In the affidavit for Strong’s arrest warrant, the police say that “the numerous sex acts were video recorded unbeknownst to the males she was having sex with.”

Both Wright and Strong, who were indicted in October on a combined 165 charges consisting mainly of engaging in prostitution (her), promotion of prostitution (both), violation of privacy (both), and in her case benefits and tax fraud, have pleaded not guilty. In a press release, Strong called the charges “untrue,” and said, “I have made some bad choices but have broken no laws.” (Neither Wright nor Strong would be interviewed for this article.)

Wright’s choice of a locale in which to conduct her affairs was either inspired or twisted or both, depending on your point of view: the lovely, quaint seaside town of Kennebunk, Maine, population 10,798, home of Tom’s of Maine toothpaste, the heart of the land originally settled by the Puritans, just a hop, skip, and jump from Kennebunkport, where the Bush family has its Walker’s Point summer compound.

“Prostitution is not what Kennebunk wants to be known for,” a local morning drive-time radio host and divorce attorney named Ken Altshuler told TV-and-radio personality and addiction specialist Dr. Drew as the scandal was first breaking. “We’re a beautiful town, [a] tourist town.”

But what really made the story of the “Zumba Madam” go viral was Wright’s meticulous recordkeeping. Thanks to that, prosecutors can do something they often can’t in this type of case: figure out who the alleged johns were and charge them with crimes, too. The record of who did what has become known simply as “The List”—at one point, it was rumored to include 174 names—and the tantalizing prospect of finding out who is on it caused a feeding frenzy among the national media, from the Today show to Good Morning America to CNN. “The town was literally under siege,” says Laura Dolce, the editor of the local paper, the York County Coast Star. “You couldn’t walk down Main Street without being hounded by media. You couldn’t go into a coffee shop without reporters’ trying to overhear what people were talking about.” Who might be on the list? A member of the Bush family? Someone from the Secret Service? General Petraeus? Dolce says that she was asked about all three. The answers were no, no, and no. But, really, it could have been—and could still be—anyone.

by Bethany McLean, Vanity Fair |  Read more:
Photographs by Michele Stapleton

Friday, January 11, 2013