Saturday, November 10, 2012

Start Me Up Once More

“You can’t get away from that number,” Keith Richards said with a chuckle by telephone from Paris, where the Rolling Stones have been rehearsing for arena concerts and have played guerrilla club and theater shows. The Stones, led by Mick Jagger and Mr. Richards (although the other members have changed), played their first gig in 1962. And with less than two months remaining in this anniversary year, the machinery of commemoration and promotion has swung into motion.

There are arena concerts scheduled in London (Nov. 25 and 29) and Newark (Dec. 13 and 15). There are documentaries new (on HBO) and old (on DVD), as well as a comprehensive retrospective of Rolling Stones films and videos at the Museum of Modern Art from Nov. 15 to Dec. 2. There are even two new Stones songs recorded this year: “Doom and Gloom,” a Jagger song that mentions fracking, and “One More Shot,” written by Mr. Richards.

In one way the Stones have been doing the same thing for half a century: playing obstinately unpolished rock ’n’ roll. It’s American music — blues, country, R&B, gospel — refracted through English sensibilities while ditching decorum and riding the backbeat. Yet around that music, every conceivable meaning has changed.

What once was taken as radical, wanton, even dangerous has become old-school and privileged; tickets for the band’s two shows at the Prudential Center in Newark run $95 to $750 plus fees. (The Dec. 15 show will also be a pay-per-view broadcast.) The songs that once outraged parents are now oldies to pass on to the grandchildren. “You’d gone all the way from ‘It’s too dangerous to go’ to people bringing their children” to shows, Mr. Jagger said from Paris. “It became a family outing.” And a band that was once synonymous with a riotous volatility has become — despite all commercial, cultural and chemical odds — a symbol of stability. Members now describe the band with an unexpected word for the Rolling Stones: discipline. “It requires quite a bit of discipline to be a Rolling Stone,” Mr. Richards said. “Although it seems to be shambolic, it’s a very disciplined bunch.”

Interviewed separately, the guitarist Ronnie Wood, who joined the band in 1975, agreed. “No matter what was going on the outside, no matter how much we whooped it up,” he said, “we felt a responsibility, and we still do, to make great music.”

Simple familiarity, through the passage of time and generations, is one reason the Stones’ popularity has endured. Yet since the late 1980s, when the Stones pulled themselves together to make “Steel Wheels” and return to the stadium circuit, arguably every tour and album has been largely a victory lap for what they accomplished in their first 20 years.

By then Mr. Jagger and Mr. Richards had forged a catalog of great songs as diverse as — for starters — “The Last Time,” “(I Can’t Get No) Satisfaction,” “Ruby Tuesday,” “No Expectations,” “Honky Tonk Women,” “Brown Sugar” and “Gimme Shelter.” There’s no naïveté in Stones songs; they have worn well.

The band’s box office potential is unmistakable. Latter-day Stones studio albums, when they get around to making them — the last one was “A Bigger Bang,” back in 2005 — have each sold at least a million copies in the United States without major hit singles. Mr. Richards’s 2010 autobiography, “Life,” topped The New York Times’s best-seller list — and deserved to, with its frank and kaleidoscopic mingling of music lore, drug chronicles, romance, strife, loyalty, score-settling and improbable survival. The Stones dependably sell out arena tours. The fascination continues.

Nostalgia and durable songs are part of the Stones’ perpetual appeal. So are the big-stage rock spectacles that the Stones helped pioneer, with inflatable appendages, pyrotechnics or perhaps a cherry-picker lifting Mr. Jagger over the crowd. (Now Taylor Swift rides one.)

It doesn’t hurt ticket sales that Mr. Jagger, at 69, is still limber enough to prance, twitch and shimmy all over a stage; when Maroon 5 had a hit with “Moves Like Jagger,” younger listeners needed no footnote. In a heartening sight for his less spry contemporaries and baby boomer fans, Mr. Jagger had enough rock-star rambunctiousness to steal the show completely from hit makers less than half his age at the 2011 Grammy Awards. (“That’s pretty easy,” Mr. Jagger said from Paris. “If you’re only doing one number, you can tear anything up.”) Video from the Stones’ first concert since 2007, on Oct. 25 at the club Le Trabendo in Paris, shows a band that’s grizzled and scrappy but still game.

Onstage and, far more often than not, in the studio, the Rolling Stones keep their sound loose: it’s practiced and not to be mistaken for sloppy, precisely imprecise. Above Charlie Watts’s drumming the band’s two guitars share a musical cat’s cradle, constantly twining, unraveling, reconfiguring. “We’re always sliding between rhythm and lead,” Mr. Richards said. “It’s an intuitive thing, instinctive. You couldn’t map it.”

by Jon Pareles, NY Times |  Read more:
Photo: Bob Gruen

The Mad Max Economy

[ed. See also, Market-Based Disaster Justice.]

Folks here don’t wish disaster on their fellow Americans. They didn’t pray for Hurricane Sandy to come grinding up the East Coast, tearing lives apart and plunging millions into darkness.

But the fact is, disasters are good business in Waukesha. And, lately, there have been a lot of disasters.

This Milwaukee suburb, once known for its curative spring waters and, more recently, for being a Republican stronghold in a state that President Obama won on Election Day, happens to be the home of one of the largest makers of residential generators in the country. So when the lights go out in New York — or on the storm-savaged Jersey Shore or in tornado-hit Missouri or wherever — the orders come pouring in like a tidal surge.

It’s all part of what you might call the Mad Max Economy, a multibillion-dollar-a-year collection of industries that thrive when things get really, really bad. Weather radios, kerosene heaters, D batteries, candles, industrial fans for drying soggy homes — all are scarce and coveted in the gloomy aftermath of Hurricane Sandy and her ilk.

It didn’t start with the last few hurricanes, either. Modern Mad Max capitalism has been around a while, decades even, growing out of something like old-fashioned self-reliance, political beliefs and post-apocalyptic visions. The Cold War may have been the start, when schoolchildren dove under desks and ordinary citizens dug bomb shelters out back. But economic fears, as well as worries about climate change and an unreliable electrical grid, have all fed it.

Driven of late by freakish storms, this industry is growing fast, well beyond the fringe groups that first embraced it. And by some measures, it’s bigger than ever.

Businesses like Generac Power Systems, one of three companies in Wisconsin turning out generators, are just the start.

The market for gasoline cans, for example, was flat for years. No longer. “Demand for gas cans is phenomenal, to the point where we can’t keep up with demand,” says Phil Monckton, vice president for sales and marketing at Scepter, a manufacturer based in Scarborough, Ontario. “There was inventory built up, but it is long gone.”

Even now, nearly two weeks after the superstorm made landfall in New Jersey, batteries are a hot commodity in the New York area. Win Sakdinan, a spokesman for Duracell, says that when the company gave away D batteries in the Rockaways, a particularly hard-hit area, people “held them in their hands like they were gold.”

by Andrew Martin, NY Times |  Read more:
Illustration: Karsten Moran for The New York Times

Why Private Email Accounts Are a National Security Issue


Private e-mail services like Google’s, though considered significantly more secure than most, are still susceptible to foreign intrusion. And it happens. Technology writers have sometimes discussed what one writer called the “password fallacy,” the false sense of safety created by access systems such as Google’s that balance security against ease of use. Even with Google’s extra security features, the company must avoid making security so onerous that it drives customers away, leaving the account an easier target for foreign hackers even before Petraeus possibly started sharing access and thus diluting its integrity. And, as a Wired magazine investigation demonstrated in August, personal e-mail accounts often allow hackers access to other personal accounts, worsening both the infiltration and the damage.

All of this might sound a little overly apprehensive – really, U.S. national security is compromised because the CIA director’s personal Gmail account might have been a little easier to hack? – until you start looking at the scale and sophistication of foreign attempts to infiltrate U.S. data sources. Chinese hacking efforts, perhaps the best-known but nowhere near the only threat to U.S. networks and computers, suggest the enormous scope and ferocious drive of foreign government hackers.

Some Americans who have access to sensitive information and who travel to China describe going to tremendous lengths to minimize government efforts to seize their data. Some copy and paste their passwords from USB thumb drives rather than type them out, for fear of key-logging software. They carry “loaner” laptops and cellphones and pull out cellphone batteries during sensitive meetings, worried that the microphone could be switched on remotely. The New York Times called such extreme measures, which also apply in other countries, “standard operating procedure for officials at American government agencies.”

by Max Fisher, Washington Post |  Read more:
Image: Pete Souza/The White House via Getty Images

C C Barton - hand watercolored etchings
via:

Bruce Cohen, Untitled (Pink Balcony), 2008
via:

My Wife's Lover

[ed. Interesting...the things you find in newspaper advice columns. This one dated July 13, 2012.]

My wife is having an affair with a government executive. His role is to manage a project whose progress is seen worldwide as a demonstration of American leadership. (This might seem hyperbolic, but it is not an exaggeration.) I have met with him on several occasions, and he has been gracious. (I doubt if he is aware of my knowledge.) I have watched the affair intensify over the last year, and I have also benefited from his generosity. He is engaged in work that I am passionate about and is absolutely the right person for the job. I strongly feel that exposing the affair will create a major distraction that would adversely impact the success of an important effort. My issue: Should I acknowledge this affair and finally force closure? Should I suffer in silence for the next year or two for a project I feel must succeed? Should I be “true to my heart” and walk away from the entire miserable situation and put the episode behind me? NAME WITHHELD

by Chuck Klosterman, The Ethicist, NY Times | Read more:

Fantasyland

Mitt Romney is already slithering into the mists of history, or at least La Jolla, gone and soon to be forgotten. A weightless figure unloved and distrusted by even his own supporters, he was always destined, win or lose, to be a transitory front man for a radical-right GOP intent on barreling full-speed down the Randian path laid out by its true 2012 standard-bearer, Paul Ryan. But as was said of another unsuccessful salesman who worked the New England territory, attention must be paid to Mitt as the door slams behind him in the aftermath of Barack Obama’s brilliant victory. Though Romney has no political heirs in his own party or elsewhere, he does leave behind a cultural legacy of sorts. He raised Truthiness to a level of chutzpah beyond Stephen Colbert’s fertile imagination, and on the grandest scale. That a presidential hopeful so cavalierly mendacious could get so close to the White House, winning some 48 percent of the popular vote, is no small accomplishment. The American weakness that Romney both apotheosized and exploited in achieving this feat—our post-fact syndrome where anyone on the public stage can make up anything and usually get away with it—won’t disappear with him. A slicker liar could have won, and still might.

All politicians lie, and some of them, as Bob Kerrey famously said of Bill Clinton in 1996, are “unusually good” at it. Every campaign (certainly including Obama’s) puts up ads that stretch or obliterate the truth. But Romney’s record was exceptional by any standard. The blogger Steve Benen, who meticulously curated and documented Mitt’s false statements during 2012, clocked a total of 917 as Election Day arrived. Those lies, which reached a crescendo with the last-ditch ads accusing a bailed-out Chrysler of planning to ship American jobs to China, are not to be confused with the Romney flip-flops. The Etch-A-Sketches were a phenomenon of their own; if the left and right agreed about anything this year, it was that trying to pin down where Mitt “really” stood on any subject was a fool’s errand. His biography was no less Jell-O-like: There were the still-opaque dealings at Bain, and those Olympics, and a single (disowned) term in public service, and his churchgoing—and what else had he been up to for 65 years? We never did see those tax returns. We never did learn the numbers that might validate the Romney-Ryan budget. Given that Romney had about as much of a human touch with voters as an ATM, it sometimes seemed as if a hologram were running for president. Yet some 57 million Americans took him seriously enough to drag themselves to the polls and vote for a duplicitous cipher. Not all of this can be attributed to the unhinged Obama hatred typified by Mary Matalin’s postelection characterization of the president as “a political narcissistic sociopath.”

As GOP politicians and pundits pile on Romney in defeat, they often argue that he was done in by not being severely conservative enough; if only he’d let Ryan be Ryan, voters would have been won over by right-wing orthodoxy offering a clear-cut alternative to Obama’s alleged socialism. In truth, Romney was a perfect embodiment of the current GOP. As much as the Republican Party is a radical party, and a nearly all-white party, it has also become the Fantasyland Party. It’s an isolated and gated community impervious to any intrusions of reality from the “real America” it solipsistically claims to represent. This year’s instantly famous declaration by the Romney pollster Neil Newhouse that “we’re not going to let our campaign be dictated by fact-checkers” crystallized the mantra of the entire GOP. The Republican faithful at strata both low and high, from Rush’s dittoheads to the think-tank-affiliated intellectuals, have long since stopped acknowledging any empirical evidence that disputes their insular worldview, no matter how grounded that evidence might be in (God forbid) science or any other verifiable reality, like, say, Census reports or elementary mathematics. No wonder Romney shunned the word Harvard, which awarded him two degrees, even more assiduously than he did Mormon.

At the policy level, this is the GOP that denies climate change, that rejects Keynesian economics, and that identifies voter fraud where there is none. At the loony-tunes level, this is the GOP that has given us the birthers, websites purporting that Obama was lying about Osama bin Laden’s death, and not one but two (failed) senatorial candidates who redefined rape in defiance of medical science and simple common sense. It’s the GOP that demands the rewriting of history (and history textbooks), still denying that Barry Goldwater’s opposition to the Civil Rights Act of 1964 and Richard Nixon’s “southern strategy” transformed the party of Lincoln into a haven for racists. Such is the conservative version of history that when the website Right Wing News surveyed 43 popular conservative bloggers to determine the “worst figures in American history” two years ago, Jimmy Carter, Obama, and FDR led the tally, all well ahead of Benedict Arnold, Timothy McVeigh, and John Wilkes Booth.

The good news for Democrats this year was that the right’s brand of magical thinking (or non-thinking) bit the GOP in the ass, persuading it to disregard all the red flags and assume even a figure as hollow as Romney could triumph. (Retaking the Senate was once thought to be a lock, too.) The books chronicling what happened in 2012 will devote much attention to the failings of Romney’s campaign and to the ruthlessness and surgical rigor of Obama’s. But an equally important part of this history is the extraordinary lengths to which the grandees of the GOP—not just basket cases like Dick “Landslide!” Morris and Glenn Beck, but the supposed adults regarded by the Beltway Establishment and mainstream media as serious figures—enabled their party’s self-immolating denial of political reality. This was the election in which even George Will (who predicted a 321 Electoral College win for Romney) surrendered to the cult of the talk-radio base and drank the Kool-Aid without realizing it had been laced with political cyanide. If a tea-party voter in Texas was shocked that Obama won, he was no less thunderstruck than the Romney campaign, or Karl Rove. Rove’s remarkably graphic public meltdown on Fox News—babbling gibberish about how his Ohio numbers showed a path for Romney even after the election was lost—marked not just the end of his careers as a self-styled political brainiac and as a custodian of hundreds of millions of dollars in super-PAC money. It was an epic on-camera dramatization of his entire cohort’s utter estrangement from reality.

by Frank Rich, New York Magazine |  Read more:
Photo: Christopher Anderson/Magnum Photos/New York Magazine

Friday, November 9, 2012


nancy mccarthy
via:

How Human Beings Almost Vanished From Earth In 70,000 B.C.

Add all of us up, all 7 billion human beings on earth, and clumped together we weigh roughly 750 billion pounds. That, says Harvard biologist E.O. Wilson, is more than 100 times the biomass of any large animal that's ever walked the Earth. And we're still multiplying. Most demographers say we will hit 9 billion before we peak, and what happens then?

Well, we've waxed. So we can wane. Let's just hope we wane gently. Because once in our history, the world-wide population of human beings skidded so sharply we were down to roughly a thousand reproductive adults. One study says we hit as low as 40.

Forty? Come on, that can't be right. Well, the technical term is 40 "breeding pairs" (children not included). More likely there was a drastic dip and then 5,000 to 10,000 bedraggled Homo sapiens struggled together in pitiful little clumps hunting and gathering for thousands of years until, in the late Stone Age, we humans began to recover. But for a time there, says science writer Sam Kean, "We damn near went extinct."

I'd never heard of this almost-blinking-out. That's because I'd never heard of Toba, the "supervolcano." It's not a myth. While details may vary, Toba happened.

Toba, The Supervolcano

Once upon a time, says Sam, around 70,000 B.C., a volcano called Toba, on Sumatra, in Indonesia, went off, blowing roughly 650 cubic miles of vaporized rock into the air. It is the largest volcanic eruption we know of, dwarfing everything else...

That eruption dropped roughly six centimeters of ash — the layer can still be seen on land — over all of South Asia, the Indian Ocean, the Arabian Sea and the South China Sea. According to the Volcanic Explosivity Index, the Toba eruption scored an "8", which translates to "mega-colossal" — that's two orders of magnitude greater than the largest volcanic eruption in historic times at Mount Tambora in Indonesia, which caused the 1816 "Year Without a Summer" in the northern hemisphere.

With so much ash, dust and vapor in the air, Sam Kean says it's a safe guess that Toba "dimmed the sun for six years, disrupted seasonal rains, choked off streams and scattered whole cubic miles of hot ash (imagine wading through a giant ashtray) across acres and acres of plants." Berries, fruits, trees, African game became scarce; early humans, living in East Africa just across the Indian Ocean from Mount Toba, probably starved, or at least, he says, "It's not hard to imagine the population plummeting."

by Robert Krulwich, NPR |  Read more:
Illustration: Robert Krulwich

Leaving Digital for DIY

Wired's long-time editor in chief, Chris Anderson, announced on Friday that he was leaving the magazine to become CEO of his DIY-drone company, 3D Robotics. This move comes a month after the release of his latest book, Makers: The New Industrial Revolution. In an interview last week (and a brief follow-up after Friday's announcement), Anderson talked with me about today's biggest revolution in how and where we actually make things. If the last few decades have been about big digital forces — the Internet, social media — he notes that the future will be about applying all of that in the real world. "Wondrous as the Web is," he writes, "it doesn’t compare to the real world. Not in economic size (online commerce is less than 10 percent of all sales) and not in its place in our lives. The digital revolution has been largely limited to screens." But, he adds, the salient fact remains that "we live in homes, drive in cars, and work in offices." And it is that physical part of the economy that is undergoing the biggest and most fundamental change. (...)

Some people hear the word "maker" and imagine we are going back to the past, a world of artisans using traditional tools to make craft products. From reading your book, that’s not exactly what you mean. You're talking about a blurring of what might be called the analog and digital worlds. Tell us more about how you see this playing out.

The "Maker Movement" is simply what happened when the web revolution hit the real world. The term, in its current sense, was first coined in 2005 by Dale Dougherty of the tech book publisher O’Reilly, to describe what he saw as a resurgence of tinkering, that great American tradition. But rather than isolated hobbyists in their garages the way it used to be, this was coming out of Web communities and increasingly using digital tools, from 3D printers, which were just then starting to be available for regular consumers, and to a new generation of free and easy CAD software programs. ...The world’s factories are now increasingly open to anyone via the web, creating what amounts to "cloud manufacturing." And huge Maker communities have grown around sites such as Kickstarter and Etsy. In Silicon Valley, the phrase is that "hardware is the new software." The web's powerful innovation model can now be applied to making real stuff. As a result, we’re going from the "tinkerer" phase of this movement to entrepreneurship, too. What began as a social revolution is starting to look like an industrial revolution.

What are the key technological innovations and shifts that are enabling and powering the revolution in making things?

There are really two: the first on the desktop and the second in the cloud.

On the desktop, it's been the arrival of cheap and easy-to-use digital fabrication tools for consumers. Although the technologies, from 3D printers to laser cutters and CNC machines, have been used in industry for decades, they've only reached the consumer desktop over the past few years. Five years ago, that started with the RepRap project, which was an open-source 3D printer design that could be assembled as a kit and led to the first MakerBots.

Call that the Apple II phase, where the machines were mostly sold to geeks who were willing to put up with a lot of complexity to experiment with an exciting new technology. But over the past year, to extend the analogy, we've entered the Macintosh phase: consumer 3D printers that come ready to run, and just work out of the box with simple software.

That allows anyone to fabricate complex objects, with no special machine-shop skills or tools. In the same way that the first consumer laser printers, back in the 1980s, were able to hide all the complexity of professional printing behind a simple menu item that said "Print," today’s 3D printers hide the complexity of computer-controlled fabrication behind a simple menu item that says "Make."

That desktop manufacturing revolution is great for making a few of something, as a custom product or prototype, but it should not be confused with mass production. It can take an hour or more to 3D-print a single object. So how do we get from there to an industrial revolution? Enter the second enabling technology: the cloud.

Over the past decade, the world’s factories have embraced the Web. Thanks to online marketplaces such as Alibaba (in China) and MFG.com (in the U.S.), factories that would once only work for big commercial customers will now take orders from anyone. That means that once you've prototyped your widget on your desktop, you can send the same digital design to a big factory to be turned into a form that can be mass-produced. You don't need to be a company, and typically such factories are willing to work at any scale, from hundreds to hundreds of thousands.

Once, to get into manufacturing, you needed to own a factory. Then, with outsourcing, you needed to at least know someone who owned a factory. Now all you need is a web browser and a credit card to get robots in China to work for you!

by Richard Florida, Atlantic Cities |  Read more:
Photo: Creative Commons by Joi Ito

The Heart Grows Smarter

If you go back and read a bunch of biographies of people born 100 to 150 years ago, you notice a few things that were more common then than now.

First, many more families suffered the loss of a child, which had a devastating and historically underappreciated impact on their overall worldviews.

Second, and maybe related, many more children grew up in cold and emotionally distant homes, where fathers, in particular, barely knew their children and found it impossible to express their love for them.

It wasn’t only parents who were emotionally diffident; it was the people who studied them. In 1938, a group of researchers began an intensive study of 268 students at Harvard University. The plan was to track them through their entire lives, measuring, testing and interviewing them every few years to see how lives develop.

In the 1930s and 1940s, the researchers didn’t pay much attention to the men’s relationships. Instead, following the intellectual fashions of the day, they paid a lot of attention to the men’s physiognomy. Did they have a “masculine” body type? Did they show signs of vigorous genetic endowments?

But as this study — the Grant Study — progressed, the power of relationships became clear. The men who grew up in homes with warm parents were much more likely to become first lieutenants and majors in World War II. The men who grew up in cold, barren homes were much more likely to finish the war as privates.

Body type was useless as a predictor of how the men would fare in life. So was birth order or political affiliation. Even social class had a limited effect. But having a warm childhood was powerful. As George Vaillant, the study director, sums it up in “Triumphs of Experience,” his most recent summary of the research, “It was the capacity for intimate relationships that predicted flourishing in all aspects of these men’s lives.”

Of the 31 men in the study incapable of establishing intimate bonds, only four are still alive. Of those who were better at forming relationships, more than a third are living.

It’s not that the men who flourished had perfect childhoods. Rather, as Vaillant puts it, “What goes right is more important than what goes wrong.” The positive effect of one loving relative, mentor or friend can overwhelm the negative effects of the bad things that happen.

In case after case, the magic formula is capacity for intimacy combined with persistence, discipline, order and dependability. The men who could be affectionate about people and organized about things had very enjoyable lives.

But a childhood does not totally determine a life. The beauty of the Grant Study is that, as Vaillant emphasizes, it has followed its subjects for nine decades. The big finding is that you can teach an old dog new tricks. The men kept changing all the way through, even in their 80s and 90s.

by David Brooks, NY Times |  Read more:
Illustration: via

Thursday, November 8, 2012


Sydney Bella Sparrow, Three Greens Convene 2009
via:

Noam Chomsky on Where Artificial Intelligence Went Wrong

In May of last year, during the 150th anniversary of the Massachusetts Institute of Technology, a symposium on "Brains, Minds and Machines" took place, where leading computer scientists, psychologists and neuroscientists gathered to discuss the past and future of artificial intelligence and its connection to the neurosciences.

The gathering was meant to inspire multidisciplinary enthusiasm for the revival of the scientific question from which the field of artificial intelligence originated: how does intelligence work? How does our brain give rise to our cognitive abilities, and could this ever be implemented in a machine?

Noam Chomsky, speaking in the symposium, wasn't so enthused. Chomsky critiqued the field of AI for adopting an approach reminiscent of behaviorism, except in more modern, computationally sophisticated form. Chomsky argued that the field's heavy use of statistical techniques to pick regularities in masses of data is unlikely to yield the explanatory insight that science ought to offer. For Chomsky, the "new AI" -- focused on using statistical learning techniques to better mine and predict data -- is unlikely to yield general principles about the nature of intelligent beings or about cognition.

This critique sparked an elaborate reply to Chomsky from Google's director of research and noted AI researcher, Peter Norvig, who defended the use of statistical models and argued that AI's new methods and definition of progress are not far off from what happens in the other sciences.

Chomsky acknowledged that the statistical approach might have practical value, just as in the example of a useful search engine, and is enabled by the advent of fast computers capable of processing massive data. But as far as a science goes, Chomsky would argue it is inadequate, or more harshly, kind of shallow. We wouldn't have taught the computer much about what the phrase "physicist Sir Isaac Newton" really means, even if we can build a search engine that returns sensible hits to users who type the phrase in.

It turns out that related disagreements have been pressing biologists who try to understand more traditional biological systems of the sort Chomsky likened to the language faculty. Just as the computing revolution enabled the massive data analysis that fuels the "new AI", so has the sequencing revolution in modern biology given rise to the blooming fields of genomics and systems biology. High-throughput sequencing, a technique by which millions of DNA molecules can be read quickly and cheaply, turned the sequencing of a genome from a decade-long expensive venture to an affordable, commonplace laboratory procedure. Rather than painstakingly studying genes in isolation, we can now observe the behavior of a system of genes acting in cells as a whole, in hundreds or thousands of different conditions.

The sequencing revolution has just begun and a staggering amount of data has already been obtained, bringing with it much promise and hype for new therapeutics and diagnoses for human disease. For example, when a conventional cancer drug fails to work for a group of patients, the answer might lie in the genome of the patients, which might have a special property that prevents the drug from acting. With enough data comparing the relevant features of genomes from these cancer patients and the right control groups, custom-made drugs might be discovered, leading to a kind of "personalized medicine." Implicit in this endeavor is the assumption that with enough sophisticated statistical tools and a large enough collection of data, signals of interest can be weeded out from the noise in large and poorly understood biological systems.

The success of fields like personalized medicine and other offshoots of the sequencing revolution and the systems-biology approach hinges upon our ability to deal with what Chomsky called "masses of unanalyzed data" -- placing biology in the center of a debate similar to the one taking place in psychology and artificial intelligence since the 1960s.

Systems biology did not rise without skepticism. The great geneticist and Nobel Prize-winning biologist Sydney Brenner once defined the field as "low input, high throughput, no output science." Brenner, a contemporary of Chomsky who also participated in the same symposium on AI, was equally skeptical about new systems approaches to understanding the brain. When describing an up-and-coming systems approach to mapping brain circuits called Connectomics, which seeks to map the wiring of all neurons in the brain (i.e., diagramming which nerve cells are connected to others), Brenner called it a "form of insanity."

Brenner's catch-phrase bite at systems biology and related techniques in neuroscience is not far off from Chomsky's criticism of AI. An unlikely pair, systems biology and artificial intelligence both face the same fundamental task of reverse-engineering a highly complex system whose inner workings are largely a mystery. Yet, ever-improving technologies yield massive data related to the system, only a fraction of which might be relevant. Do we rely on powerful computing and statistical approaches to tease apart signal from noise, or do we look for the more basic principles that underlie the system and explain its essence? The urge to gather more data is irresistible, though it's not always clear what theoretical framework these data might fit into. These debates raise an old and general question in the philosophy of science: What makes a satisfying scientific theory or explanation, and how ought success be defined for science?

by Yarden Katz, The Atlantic |  Read more:
Photo: Graham Gordon Ramsay

Jan Zrzavy: Girl Friends, 1923 - Oil on Canvas (Centre for Modern and Contemporary Art, Veletrzni (Trades Fair) Palace, Prague)
via:

Ryan Adams


Can't a Guy Just Make Some Friends Around Here? Maybe.

A little more than a year ago, I moved into a West Plaza apartment. It was neat and spacious, with hardwood floors for the rug I'd bought in Istanbul but never stepped on. And because it was walking distance from both the Plaza and Westport, I could mosey to a coffee shop or stagger home drunk from a bar without even glancing at my car. Best of all, it was cheap enough that I could live alone and realize one of the defining fantasies of many 20-something men: I'd be responsible for every mess I created, without fighting with a roommate for kitchen-counter space or bathroom time as if we were sharing the West Bank. This was city living as personal libertarian utopia.

Affordable rent and abundant space, after all, are what Kansas City is supposed to be about. After a rough few months in Washington, D.C. — where I hadn't liked the government achievatrons I met at parties and where an invasive landlord (who was cheating on her property taxes) had kicked me out of an egregiously expensive apartment (which was being rented illegally) — I was ready to build a better life in KC.

And a better life in KC was so easy — at first.

The Lattéland on Jefferson Street had a fine patio, so I spent a lot of time reading outside there. The Cinemark a few blocks from my apartment cost only $6 a show, so I saw every Zac Efron and Ryan Reynolds abomination that rolled through. (Because for six bucks, I'll watch anything in a theater.) These things I did alone, which was fine for a while.

Yet the warning signs of loneliness started to emerge. The apartment I'd chosen for its spaciousness began to look to me like that photo of Steve Jobs' living room, which had nothing but a rug and a lamp. The cheap movies I saw were stupid romantic comedies, and watching them alone lost its irony. And there were no co-workers to run into at the coffee shop after work because I was a freelancer with no co-workers. (...)

Last fall, I also spent a lot of time in downtown Cairo, which was like living inside a human beehive. The crush of people packed into small spaces, combined with more outgoing social norms between strangers, means that every time you step outside is a trip into the unexpected. It's almost impossible not to meet people. To live in Cairo is to share in a high-density experience, one in which all the human molecules jostle together to create a kind of social friction you rarely see in Kansas City. I thought about all the times I'd walked along Roanoke Parkway without passing another person on foot.

But you don't have to travel abroad to know that the way we live in Kansas City — by ourselves, in spread-out homes, often away from our families and detached from our friends, wedged into our cars — is a historical aberration and exceptional compared with many other parts of the world. And in recent years, various media outlets have singled us out for some embarrassing stats, telling the rest of the country what we already knew about ourselves. In 2009, for example, Forbes crunched population data for America's 40 largest metropolitan areas and ranked Kansas City dead last for the number of single people. Perhaps correspondingly, KC's number of bars, restaurants and nightclubs per capita didn't rank much better.

That same year, according to U.S. Census data, more than 30 percent of American 20-somethings moved. No other age group uproots itself as frequently. We move because of new jobs or new relationships, and we arrive with few attachments. We're looking for those bars and restaurants and clubs, and the ongoing renaissance of KC's downtown offers some encouragement that life for young, urban-minded people is getting a little more vibrant.

So maybe, I thought, it was time to look for some new friends. But when I set out to do that, I found that my fellow Kansas Citians were feeling all kinds of lonely. And some weren't shy about admitting it.

At least, that's what I learned from Craigslist.

by Matt Pearce, The Pitch |  Read more:
Illustration: Shannon Freshwater

Forget About 2012, the Race for the White House in 2016 Has Already Begun

[ed. See also: this interesting piece on Obama's place in history.]

The 2012 presidential election is over; let the 2016 election begin! Without skipping a beat, the endless electoral process in the US has started over. It is the American equivalent of: “The king is dead. Long live the king.” It is why political junkies across the world love American politics. It never ends. So what do we know about the next race?

With Barack Obama still in the White House and unable to run again due to term limits, both parties are looking for a new candidate. In the Republican Party, it used to be that if you didn’t win the presidency one year, you could put yourself forward the next. Richard Nixon lost to John F Kennedy in 1960 but bounced back in 1968 to beat Hubert Humphrey. Now voters are so fickle and campaigns so punishing that you only get one shot. Americans don’t like losers and the electoral battlefield is strewn with the corpses of failed presidential candidates who overnight became unpersons. So bye-bye, Mitt. Missing you already. As F. Scott Fitzgerald observed, there are no second acts in American lives.

Team Billary

Yet politicians who merely fail to win their party’s nomination can keep plugging away. Hillary Clinton was beaten to the punch by Obama in 2008 but is expected to run in 2016. That explains why Bill Clinton has been going hoarse urging voters to back Obama. With Romney in the White House, Hillary would find it hard to unseat him. So, she needed Romney to lose and there was no better way to ensure that than to set her husband on to him. Having loyally served Obama as secretary of state, Hillary expects the president to repay the compliment and back her bid. The general view among Democrats is that if Hillary wants the nomination, it’s hers. They feel that her impeccable performance as senator for New York, then at the state department, has repaired the reputation for divisiveness and aggression that she acquired when she was first lady. She will be 69 in 2016.

It is not quite that easy. The Clintons may be the first couple of Democratic politics but after Obama’s rope-a-dope in the Denver debate, Vice-President Joe Biden became a Democratic hero by upstaging and smothering his rival, Paul Ryan, in their televised debate. No one would begrudge Biden a run at the primaries. Despite failing in 1988, when he was caught plagiarising a speech by Neil Kinnock, and in 2008, when he lost out to Obama and Hillary, his ambition remains undimmed. He will be 74 in 2016.

by Nicholas Wapshott, The New Statesman |  Read more:
Photograph: Getty Images