Friday, September 14, 2012

The Machines Are Taking Over

[ed. How computerized tutors are learning to teach humans.]

In a 1984 paper that is regarded as a classic of educational psychology, Benjamin Bloom, a professor at the University of Chicago, showed that being tutored is the most effective way to learn, vastly superior to being taught in a classroom. The experiments headed by Bloom randomly assigned fourth-, fifth- and eighth-grade students to classes of about 30 pupils per teacher, or to one-on-one tutoring. Children tutored individually performed two standard deviations better than children who received conventional classroom instruction — a huge difference. (...)
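A quick way to feel the size of that effect: assuming roughly normally distributed test scores (an assumption for illustration, not a figure quoted from Bloom's paper here), a student sitting two standard deviations above the classroom mean outscores about 98 percent of that classroom. A minimal check in Python:

```python
from statistics import NormalDist

# A score two standard deviations above the mean of a (roughly) normal
# distribution beats about 97.7% of the distribution -- the scale of the
# tutoring effect Bloom reported.
print(NormalDist().cdf(2.0))   # ~0.9772
```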

The morning after I watched Tyler Rogers do his homework, I sat in on his math class at Grafton Middle School. As he and his classmates filed into the classroom, I talked with his teacher, Kim Thienpont, who has taught middle school for 10 years. “As teachers, we get all this training in ‘differentiated instruction’ — adapting our teaching to the needs of each student,” she said. “But in a class of 20 students, with a certain amount of material we have to cover each day, how am I really going to do that?”

ASSISTments, Thienpont told me, made this possible, echoing what I heard from another area math teacher, Barbara Delaney, the day before. Delaney teaches sixth-grade math in nearby Bellingham. Each time her students use the computerized tutor to do their homework, the program collects data on how well they’re doing: which problems they got wrong, how many times they used the hint button. The information is automatically collated into a report, which is available to Delaney on her own computer before the next morning’s class. (Reports on individual students can be accessed by their parents.) “With ASSISTments, I know none of my students are falling through the cracks,” Delaney told me.
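The nightly report Delaney describes is, at its core, a simple aggregation over per-problem logs. Here is a minimal sketch of that idea in Python; the field names and the build_report helper are hypothetical illustrations, not the actual ASSISTments data model or API.

```python
from collections import defaultdict

# Hypothetical log entries: one record per student per homework problem.
# Field names are illustrative only, not the real ASSISTments schema.
logs = [
    {"student": "Tyler", "problem": 1, "correct": False, "hints_used": 2},
    {"student": "Tyler", "problem": 2, "correct": True,  "hints_used": 0},
    {"student": "Maya",  "problem": 1, "correct": False, "hints_used": 1},
    {"student": "Maya",  "problem": 2, "correct": True,  "hints_used": 3},
]

def build_report(entries):
    """Collate raw answer logs into per-student and per-problem summaries."""
    by_student = defaultdict(lambda: {"wrong": [], "hints": 0})
    missed_by = defaultdict(int)  # how many students missed each problem
    for e in entries:
        record = by_student[e["student"]]
        record["hints"] += e["hints_used"]
        if not e["correct"]:
            record["wrong"].append(e["problem"])
            missed_by[e["problem"]] += 1
    return dict(by_student), dict(missed_by)

students, hardest = build_report(logs)
print(students)  # which problems each student missed, and total hint presses
print(hardest)   # count of students who missed each problem
```

The second summary is the kind of tally that would surface the "common wrong answers" Thienpont goes over in class.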

After completing a few warm-up problems on their school’s iPod Touches, the students turned to the front of the room, where Thienpont projected a spreadsheet of the previous night’s homework. Like stock traders going over the day’s returns, the students scanned the data, comparing their own grades with the class average and picking out the problems that gave their classmates trouble. (“If you got a question wrong, but a lot of other people got it wrong, too, you don’t feel so bad,” Tyler explained.)

Thienpont began by going over “common wrong answers” — incorrect solutions that many students arrived at by following predictable but mistaken lines of reasoning. Or perhaps, not so predictable. “Sometimes I’m flabbergasted by the thing all the students get wrong,” Thienpont said. “It’s often a mistake I never would have expected.” Human teachers and tutors are susceptible to what cognitive scientists call the “expert blind spot” — once we’ve mastered a body of knowledge, it’s hard to imagine what novices don’t know — but computers have no such mental block. Highlighting “common wrong answers” allows Thienpont to address shared misconceptions without putting any one student on the spot.

I saw another unexpected effect of computerized tutoring in Delaney’s Bellingham classroom. After explaining how to solve a problem that many got wrong on the previous night’s homework, Delaney asked her students to come up with a hint for the next year’s class. Students called out suggested clues, and after a few tries, they arrived at a concise tip. “Congratulations!” she said. “You’ve just helped next year’s sixth graders learn math.” When Delaney’s future pupils press the hint button in ASSISTments, the former students’ advice will appear.

Unlike the proprietary software sold by Carnegie Learning, or by education-technology giants like Pearson, ASSISTments was designed to be modified by teachers and students, in a process Heffernan likens to the crowd-sourcing that created Wikipedia. His latest inspiration is to add a button to each page of ASSISTments that will allow students to access a Web page where they can get more information about, say, a relevant math concept. Heffernan and his W.P.I. colleagues are now developing a system of vetting and ranking the thousands of math-related sites on the Internet.

by Annie Murphy Paul, NY Times |  Read more:
Illustration by Tim Enthoven

Thursday, September 13, 2012

Healthcare's "Massive Transfer of Wealth"

[ed. It really is a massive transfer of wealth to health insurers and health care providers. The end years are expensive - no matter how much you think you've saved to sustain some measure of financial security, you never know if it will be enough. Then, there's the added indignity of having essentially zero control over when, or how, you exit this life. There has to be a better way.]

Here are excerpts from one family's story about the financial aspects of end-of-life-related healthcare:
My aunt, aged 94, died last week. In and of itself, there is nothing remarkable in this statement, except for the fact that she died a pauper and on medical assistance as a ward of the state of Minnesota... 
My aunt and her husband, who died in 1985, were hardworking Americans. The children of Polish immigrants, they tried to live by the rules. Combined, they worked for a total of 80 years in a variety of low-level, white-collar jobs. If they collectively earned $30,000 in any given year, that would have been a lot. 
Yet, somehow, my aunt managed to save more than $250,000. She also received small pensions from the Teamsters Union and the state of California, along with Social Security and a tiny private annuity. In the last decade of her life, her monthly income amounted to about $1,500... 
But when she fell ill and had to be placed in assisted living, and finally in a nursing home, her financial fate was sealed. Although she had Medicare and Medicare supplemental insurance, neither of these covered the costs of long-term care. Her savings were now at risk, at a rate of $60,000 a year... 
In the end, she spent everything she had to qualify for Medicaid in Minnesota, which she was on for the last year of her life. This diligent, responsible American woman was pauperized simply because she had the indecency to get terminally ill... 
Though I have not been able to find statistics on the subject, I am certain that there will be a massive transfer of wealth over the next two or three decades, amounting to hundreds of billions of dollars or more, from people just like my aunt to health insurers and health care providers... 
This week, I was about to close out her checking account in the amount of $215, the sum total of her wealth. But I received, in the mail, a bill from a health care provider in the amount of $220. Neither Medicare nor her supplemental insurer will pay it, because it is an unspecified "service not covered."

More details of the story at the StarTribune. Of course, it’s just one family’s story. Repeated hundreds of thousands of times across the country.
My own mother, age 94, has asked me, "when the time comes" to "put her down."

by Minnesotastan, TYWKIWDBI |  Read more:

© Chris Ware/The New Yorker

Tyranny of Merit


The ideal of meritocracy has deep roots in this country. Jefferson dreamed of a “natural aristocracy.” But the modern meritocracy dates only to the 1930s, when Harvard President James Bryant Conant directed his admissions staff to find a measure of ability to supplement the old boys’ network. They settled on the exam we know as the SAT.

In the decades following World War II, standardized testing replaced the gentleman’s agreements that had governed the Ivy League. First Harvard, then Yale and the rest filled with the sons and eventually daughters of Jews, blue-collar workers, and other groups whose numbers had previously been limited.

After graduation, these newly pedigreed men and women flocked to New York and Washington. There, they took jobs once filled by products of New England boarding schools. One example is Lloyd Blankfein, the Bronx-born son of a Jewish postal clerk, who followed Harvard College and Harvard Law School with a job at a white-shoe law firm, which he left to join Goldman Sachs.

Hayes applauds the replacement of the WASP ascendancy with a more diverse cohort. The core of his book, however, argues that the principle on which they rose inevitably undermines itself.

The argument begins with the observation that meritocracy does not oppose unequal social and economic outcomes. Rather, it tries to justify inequality by offering greater rewards to the talented and hardworking.

The problem is that the effort presumes that everyone has the same chance to compete under the same rules. That may be true at the outset. But equality of opportunity tends to be subverted by the inequality of outcome that meritocracy legitimizes. In short, according to Hayes, “those who are able to climb up the ladder will find ways to pull it up after them, or to selectively lower it down to allow their friends, allies and kin to scramble up. In other words: ‘whoever says meritocracy says oligarchy.’”

With a nod to the early 20th-century German sociologist Robert Michels, Hayes calls this paradox the “Iron Law of Meritocracy.” (...)

Hayes oversells his argument as a unified explanation of the “fail decade.” Although it elucidates some aspects of the Iraq War, Katrina debacle, and financial crisis, these disasters had other causes. Nevertheless, the Iron Law of Meritocracy shows why our elites take the form they do and how they fell so out of touch with reality. In Hayes’s account, the modern elite is caught in a feedback loop that makes it less and less open and more and more isolated from the rest of the country.

What’s to be done? One answer is to rescue meritocracy by providing the poor and middle class with the resources to compete. A popular strategy focuses on education reform. If schools were better, the argument goes, poor kids could compete on an equal footing for entry into the elite. The attempt to rescue meritocracy by fixing education has become a bipartisan consensus, reflected in Bush’s “No Child Left Behind” and Obama’s “Race to the Top.”

Hayes rejects this option. The defect of meritocracy, in his view, is not the inequality of opportunity that it conceals, but the inequality of outcome that it celebrates. In other words, the problem is not that the son of a postal clerk has less chance to become a Wall Street titan than he used to. It’s that the rewards of a career on Wall Street have become so disproportionate to the rewards of the traditional professions, let alone those available to a humble civil servant.

by Samuel Goldman, The American Conservative |  Read more:
Illustration by Michael Hogue

How Do Our Brains Process Music?

I listen to music only at very specific times. When I go out to hear it live, most obviously. When I’m cooking or doing the dishes I put on music, and sometimes other people are present. When I’m jogging or cycling to and from work down New York’s West Side Highway bike path, or if I’m in a rented car on the rare occasions I have to drive somewhere, I listen alone. And when I’m writing and recording music, I listen to what I’m working on. But that’s it.

I find music somewhat intrusive in restaurants or bars. Maybe due to my involvement with it, I feel I have to either listen intently or tune it out. Mostly I tune it out; I often don’t even notice if a Talking Heads song is playing in most public places. Sadly, most music then becomes (for me) an annoying sonic layer that just adds to the background noise.

As music becomes less of a thing—a cylinder, a cassette, a disc—and more ephemeral, perhaps we will start to assign an increasing value to live performances again. After years of hoarding LPs and CDs, I have to admit I’m now getting rid of them. I occasionally pop a CD into a player, but I’ve pretty much completely converted to listening to MP3s either on my computer or, gulp, my phone! For me, music is becoming dematerialized, a state that is more truthful to its nature, I suspect. Technology has brought us full circle.

I go to at least one live performance a week, sometimes with friends, sometimes alone. There are other people there. Often there is beer, too. After more than a hundred years of technological innovation, the digitization of music has inadvertently had the effect of emphasizing its social function. Not only do we still give friends copies of music that excites us, but increasingly we have come to value the social aspect of a live performance more than we used to. Music technology in some ways appears to have been on a trajectory in which the end result is that it will destroy and devalue itself. It will succeed completely when it self-destructs. The technology is useful and convenient, but it has, in the end, reduced its own value and increased the value of the things it has never been able to capture or reproduce.

Technology has altered the way music sounds, how it’s composed and how we experience it. It has also flooded the world with music. The world is awash with (mostly) recorded sounds. We used to have to pay for music or make it ourselves; playing, hearing and experiencing it was exceptional, a rare and special experience. Now hearing it is ubiquitous, and silence is the rarity that we pay for and savor.

Does our enjoyment of music—our ability to find a sequence of sounds emotionally affecting—have some neurological basis? From an evolutionary standpoint, does enjoying music provide any advantage? Is music of any truly practical use, or is it simply baggage that got carried along as we evolved other more obviously useful adaptations? Paleontologist Stephen Jay Gould and biologist Richard Lewontin wrote a paper in 1979 claiming that some of our skills and abilities might be like spandrels—the architectural negative spaces above the curve of the arches of buildings—details that weren’t originally designed as autonomous entities, but that came into being as a result of other, more practical elements around them.

by David Byrne, Smithsonian | Read more:
Photo: Clayton Cubitt

Melody Gardot



Virginia Colback, “Yellow and Grey Abstract”, oil and cement on canvas

Wednesday, September 12, 2012

Tesla Boy


Whoa, Dude, Are We in a Computer Right Now?

Two years ago, Rich Terrile appeared on Through the Wormhole, the Science Channel’s show about the mysteries of life and the universe. He was invited onto the program to discuss the theory that the human experience can be boiled down to something like an incredibly advanced, metaphysical version of The Sims.

It’s an idea that every college student with a gravity bong and The Matrix on DVD has thought of before, but Rich is a well-regarded scientist, the director of the Center for Evolutionary Computation and Automated Design at NASA’s Jet Propulsion Laboratory, and is currently writing an as-yet-untitled book about the subject, so we’re going to go ahead and take him seriously.

The essence of Rich’s theory is that a “programmer” from the future designed our reality to simulate the course of what the programmer considers to be ancient history—for whatever reason, maybe because he’s bored.

According to Moore’s Law, which states that computing power doubles roughly every two years, all of this will be theoretically possible in the future. Sooner or later, we’ll get to a place where simulating a few billion people—and making them believe they are sentient beings with the ability to control their own destinies—will be as easy as sending a stranger a picture of your genitals on your phone.

This hypothesis—versions of which have been kicked around for centuries—is becoming the trippy notion of the moment for philosophers, with people like Nick Bostrom, the director of Oxford University’s Future of Humanity Institute, seriously considering the premise.

Until recently, the simulation argument hadn’t really attracted traditional researchers. That’s not to say Terrile is the first scientist to predict our ability to run realistic simulations (among others, Ray Kurzweil did that in his 1999 book The Age of Spiritual Machines), but he is one of the first to argue we might already be living inside one. Rich has even gone one step further by attempting to prove his theories through physics, citing things like the observable pixelation of the tiniest matter and the eerie similarities between quantum mechanics (the mathematical rules that govern our universe) and the creation of video game environments.

Just think: Whenever you fuck up there could be the intergalactic version of an overweight 13-year-old Korean boy controlling you and screaming “Shit!” into an Xbox headset. It sort of takes the edge off things.

VICE: When did you first surmise that our reality could be a computer simulation?
Rich Terrile: Unless you believe there’s something magical about consciousness—and I don’t, I believe it’s the product of a very sophisticated architecture within the human brain—then you have to assume that at some point it can be simulated by a computer, or in other words, replicated. There are two ways one might accomplish an artificial human brain in the future. One of them is to reverse-engineer it, but I think it would be far easier to evolve a circuit or architecture that could become conscious. Perhaps in the next ten to 30 years we’ll be able to incorporate artificial consciousness into our machines.

We’ll get there that fast?
Right now the fastest NASA supercomputers are cranking away at about double the speed of the human brain. If you make a simple calculation using Moore’s Law, you’ll find that these supercomputers, inside of a decade, will have the ability to compute an entire human lifetime of 80 years—including every thought ever conceived during that lifetime—in the span of a month.

That’s depressing.
Now brace yourself: In 30 years we expect that a PlayStation—they come out with a new PlayStation every six to eight years, so this would be a PlayStation 7—will be able to compute about 10,000 human lifetimes simultaneously in real time, or about a human lifetime in an hour.

There’s how many PlayStations worldwide? More than 100 million, certainly. So think of 100 million consoles, each one containing 10,000 humans. That means, by that time, conceptually, you could have more humans living in PlayStations than you have humans living on Earth today.
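Terrile’s chain of estimates is plain doubling arithmetic. The sketch below reproduces it in Python under the assumptions quoted in the interview (capacity doubles roughly every two years; about 100 million consoles; about 10,000 lifetimes per future console); the figures are back-of-the-envelope illustrations, not measurements.

```python
# Back-of-the-envelope Moore's Law projection using only figures quoted above.

DOUBLING_PERIOD_YEARS = 2.0  # assumed doubling time for computing capacity

def growth_factor(years: float) -> float:
    """How much raw computing capacity grows over the given span."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

print(growth_factor(10))  # 32.0    -- a decade of doubling
print(growth_factor(30))  # 32768.0 -- thirty years, the "PlayStation 7" horizon

# Final step of the interview's argument: consoles times lifetimes per console.
consoles = 100_000_000          # "more than 100 million" PlayStations
lifetimes_per_console = 10_000  # simulated lifetimes per future console
print(consoles * lifetimes_per_console)  # 1000000000000 -- a trillion simulated lives
```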

So there’s a possibility we’re living in a super advanced game in some bloodshot-eyed goober’s PlayStation right now?
Exactly. The supposition here is how do you know it’s not 30 years in the future now and you’re not one of these simulations? Let me go back a step here. As scientists, we put physical processes into mathematical frameworks, or into an equation. The universe behaves in a very peculiar way because it follows mathematics. Einstein said, “The most incomprehensible thing about the universe is that it’s comprehensible.” The universe does not have to work that way. It does not have to be so easy to abbreviate that I can basically write down a few pages of equations that contain enough information to simulate it.

The other interesting thing is that the natural world behaves exactly the same way as the environment of Grand Theft Auto IV. In the game, you can explore Liberty City seamlessly in phenomenal detail. I made a calculation of how big that city is, and it turns out it’s a million times larger than my PlayStation 3. You see exactly what you need to see of Liberty City when you need to see it, abbreviating the entire game universe into the console. The universe behaves in the exact same way. In quantum mechanics, particles do not have a definite state unless they’re being observed. Many theorists have spent a lot of time trying to figure out how you explain this. One explanation is that we’re living within a simulation, seeing what we need to see when we need to see it.

Which would explain why there have been reports of scientists observing pixels in the tiniest of microscopic images.
Right. The universe is also pixelated—in time, space, volume, and energy. There exists a fundamental unit that you cannot break down into anything smaller, which means the universe is made of a finite number of these units. This also means there are a finite number of things the universe can be; it’s not infinite, so it’s computable. And if it only behaves in a finite way when it’s being observed, then the question is: Is it being computed? Then there’s a mathematical parallel. If two things are mathematically equivalent, they’re the same. So the universe is mathematically equivalent to the simulation of the universe.
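The rendering analogy Terrile keeps returning to (compute a patch of the world only when someone looks at it, over a grid of finite units) maps onto the familiar programming pattern of lazy, memoized evaluation. The Python toy below illustrates that pattern only; the observe function and the grid are invented for illustration and say nothing about how Grand Theft Auto or physics actually work.

```python
import functools

GRID_SIZE = 10**9  # a discrete "world" far too large to precompute in full

@functools.lru_cache(maxsize=None)
def observe(x: int, y: int) -> int:
    """Compute the state of one cell only when it is observed; cache the result."""
    assert 0 <= x < GRID_SIZE and 0 <= y < GRID_SIZE
    return hash((x, y)) % 256  # deterministic stand-in for "what's really there"

print(observe(12, 34))                # computed now, on first observation
print(observe(12, 34))                # same value, served from the cache
print(observe.cache_info().currsize)  # only the cells actually observed are stored
```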

by Ben Makuch, Vice |  Read more:
Illustration by Julian Garcia

Bill Clinton Shows How It's Done


Bill Clinton spoke for nearly 50 minutes. His speech was dense, didactic and loaded with statistics and details. The paper version handed out to reporters took up four single-spaced pages in a tiny font, and he departed from it frequently. It may have been the most effective speech of either political convention.

The reason wasn't Clinton's oft-hyped "charisma," some kind of intangible political magnetism. Sure, Clinton has that -- a remarkable looseness and intimacy that draws listeners powerfully into his aura. But the strength of his speech came in its efforts to persuade.

Clinton made arguments. He talked through his reasoning. He went point by point through the case he wanted to make. He kept telling the audience he was talking to them and he wanted them to listen. In an age when so many political speeches are pure acts of rhetoric, full of stirring sentiments but utterly devoid of informational value -- when trying to win people over to your point of view is cynically assumed to be futile, so you settle for riling them up instead -- Clinton's felt like a whole different thing. In an era of detergent commercials, he delivered a real political speech.

by Molly Ball, The Atlantic |  Read more:

Coming Apart

Of the three attacks that have provoked the United States into a major war—in 1861, 1941, and 2001—only one came as a complete surprise. Fort Sumter had been under siege for months when, just before daybreak on April 12, 1861, Confederate batteries around Charleston Harbor, after giving an hour’s notice, opened fire on the Federal position. The Japanese attack at Pearl Harbor, on December 7, 1941, was a violent shock, but only in the nature and extent of the destruction: by then, most Americans had come to believe that the country would be dragged into the global war with Fascism one way or another, though their eyes were fixed on Europe, not the Pacific.

The attacks of 9/11 were the biggest surprise in American history, and for the past ten years we haven’t stopped being surprised. The war on terror has had no discernible trajectory, and, unlike other military conflicts, it’s almost impossible to define victory. You can’t document the war’s progress on a world map or chart it on a historical timetable in a way that makes any sense. A country used to a feeling of command and control has been whipsawed into a state of perpetual reaction, swinging wildly between passive fear and fevered, often thoughtless, activity, at a high cost to its self-confidence. Each new episode has been hard, if not impossible, to predict: from the first instant of the attacks to the collapse of the towers; from the decision to invade Iraq to the failure to find a single weapon of mass destruction; from the insurgency to the surge; from the return of the Taliban to the Arab Spring to the point-blank killing of bin Laden; from the financial crisis to the landslide election of Barack Obama and his nearly immediate repudiation.

Adam Goodheart’s new book, “1861: The Civil War Awakening,” shows that the start of the conflict was accompanied, in what was left of the Union, by a revolutionary surge of energy among young people, who saw the dramatic events of that year in terms of the ideals of 1776. Almost two years before the Emancipation Proclamation, millions of Americans already understood that this was to be a war for or against slavery. Goodheart writes, “The war represented the overdue effort to sort out the double legacy of America’s founders: the uneasy marriage of the Declaration’s inspired ideals with the Constitution’s ingenious expedients.”

Pearl Harbor was similarly clarifying. It put an instant end to the isolationism that had kept American foreign policy in a chokehold for two decades. In the White House on the night of December 7th, Franklin Roosevelt’s Navy Secretary, Frank Knox, whispered to Secretary of Labor Frances Perkins, “I think the boss must have a great load off his mind. . . . At least we know what to do now.” The Second World War brought a truce in the American class war that had raged throughout the thirties, and it unified a bitterly divided country. By the time of the Japanese surrender, the Great Depression was over and America had been transformed.

This isn’t to deny that there were fierce arguments, at the time and ever since, about the causes and goals of both the Civil War and the Second World War. But 1861 and 1941 each created a common national narrative (which happened to be the victors’ narrative): both wars were about the country’s survival and the expansion of the freedoms on which it was founded. Nothing like this consensus has formed around September 11th. On the interstate south of Mount Airy, there’s a recruiting billboard with the famous image of marines raising the flag at Iwo Jima, and the slogan “For Our Nation. For Us All.” In recent years, “For Us All” has been a fantasy. Indeed, the decade since the attacks has destroyed the very possibility of a common national narrative in this country.

The attacks, so unforeseen, presented a tremendous challenge, one that a country in better shape would have found a way to address. This challenge began on the level of definition and understanding. The essential problem was one of asymmetry: the enemy was nineteen Arab men in suits, holding commercial-airline tickets. They were under the command not of a government but, rather, of a shadowy organization whose name no one could pronounce, consisting of an obscure Saudi-in-exile and his several thousand followers hiding out in the Afghan desert. The damage caused by the attacks spread outward from Ground Zero through the whole global economy—but, even so, these acts of terrorism were different only in degree from earlier truck, car, and boat bombings. When other terrorists had tried, in 1993, what the hijackers achieved in 2001, their failure to bring down one of the Twin Towers had been categorized as a crime, to be handled by a federal court. September 11th, too, was a crime—one that, by imagination, skill, and luck, produced the effects of a war.

But it was also a crime linked to one of the largest and most destructive political tendencies in the modern world: radical Islamism. Al Qaeda was its self-appointed vanguard, but across the Muslim countries there were other, more local organizations that, for nearly three decades, had been killing thousands of people in the name of this ideology. Several regimes—Iran, Sudan, Saudi Arabia, Pakistan—officially subscribed to some variant of radical Islamism, tolerating or even supporting terrorists. Millions of Muslims, while not adherents of Al Qaeda’s most nihilistic fantasies, identified with its resentments and welcomed the attacks as overdue justice against American tyranny.

A crime that felt like a war, waged by a group of stateless men occupying the fringe of a widespread ideology, who called themselves holy warriors and wanted to provoke the superpower into responding with more war: this was something entirely new. It raised vexing questions about the nature of the conflict, the enemy, and the best response, questions made all the more difficult by America’s habitual isolation, and its profound indifference to world events that had set in after the Cold War.

No one appeared more surprised on September 11th, more caught off guard, than President Bush. The look of startled fear on his face neither reflected nor inspired the quiet strength and resolve that he kept asserting as the country’s response. In reaction to his own unreadiness, Bush immediately overreached for an answer. In his memoir, “Decision Points,” Bush describes his thinking as he absorbed the news in the Presidential limousine, on Route 41 in Florida: “The first plane could have been an accident. The second was definitely an attack. The third was a declaration of war.” In the President’s mind, 9/11 was elevated to an act of war by the number of planes. Later that day, at Offutt Air Force Base, in Nebraska, he further refined his interpretation, telling his National Security Council by videoconference, “We are at war against terror.”

Those were fateful words. Defining the enemy by its tactic was a strange conceptual diversion that immediately made the focus too narrow (what about the ideology behind the terror?) and too broad (were we at war with all terrorists and their supporters everywhere?). The President could have said, “We are at war against Al Qaeda,” but he didn’t. Instead, he escalated his rhetoric, in an attempt to overpower any ambiguities. Freedom was at war with fear, he told the country, and he would not rest until the final victory. In short, the new world of 2001 looked very much like the bygone worlds of 1861 and 1941. The President took inspiration from a painting, in the White House Treaty Room, depicting Lincoln on board a steamship with Generals Grant and Sherman: it reminded Bush of Lincoln’s “clarity of purpose.” The size of the undertaking seemed to give Bush a new comfort. His entire sense of the job came to depend on being a war President.

What were the American people to do in this vast new war? In his address to Congress on September 20, 2001—the speech that gave his most eloquent account of the meaning of September 11th—the President told Americans to live their lives, hug their children, uphold their values, participate in the economy, and pray for the victims. These quiet continuities were supposed to be reassuring, but instead they revealed the unreality that lay beneath his call to arms. Wasn’t there anything else? Should Americans enlist in the armed forces, join the foreign service, pay more taxes, do volunteer work, study foreign languages, travel to Muslim countries? No—just go on using their credit cards. Bush’s Presidency would emulate Woodrow Wilson’s and Warren G. Harding’s simultaneously. Never was the mismatch between the idea of the war and the war itself more apparent. Everything had changed, Bush announced, but not to worry—nothing would change.

When Bush met with congressional leaders after the attacks, Senator Tom Daschle, the South Dakota Democrat, cautioned against the implications of the word “war.” “I disagreed,” Bush later wrote. “If four coordinated attacks by a terrorist network that had pledged to kill as many Americans as possible was not an act of war, then what was it? A breach of diplomatic protocol?” Rather than answering with an argument, Bush took a shot at Daschle’s judgment and, by implication, his manhood. Soon after the attacks, William Bennett, the conservative former Education Secretary, published a short book called “Why We Fight: Moral Clarity and the War on Terrorism.” The title suggested that anyone experiencing anything short of total clarity was suspect.

From the start, important avenues of inquiry were marked with warning signs by the Administration. Those who ventured down them would pay a price. The conversation that a mature democracy should have held never happened, because this was no longer a mature democracy.

by George Packer, New Yorker |  Read more:
Illustration: Guy Billout

Apple Says New iPhone 5 Feature Gives Life Meaning

Apple rocked the gadget world today with the news that the iPhone 5 includes a new feature that gives shape and purpose to previously empty and meaningless lives.

As Apple explained at its launch of the device, the new feature is an improved version of its personal assistant, Siri, that has been endowed with a quality missing from the previous model: empathy.

In a demonstration before a hushed crowd of Apple enthusiasts, an app developer named Josh asked the new Siri, “Why didn’t my parents love me?”

Siri’s response, “Your parents were too self-absorbed and narcissistic to recognize your essential beauty and value as a human being,” brought many in the Yerba Buena Center audience close to tears.

Apple C.E.O. Tim Cook closed out the launch with perhaps his boldest claim to date about the company’s new phone: “We believe that the iPhone 5 will make your current relationship obsolete.”

Wall Street rallied on the news, with tech analysts expecting millions of Apple customers to purchase an iPhone 5 to replace their existing boyfriend, girlfriend, or spouse.

But in the words of Apple devotee Tracy Klugian, who was present at today’s launch, such expectations are overdone: “Most Apple snobs I know started putting their Apple products before their relationships a long time ago.”

by Andy Borowitz, New Yorker |  Read more:
Photograph by Tony Avelar/Bloomberg/Getty Images

Małgorzata Biegańska. Lost Keys. Pen and ink

Obama’s Way


[ed. Michael Lewis' long anticipated and quite amazing article on Barack Obama, and what daily life is like for the President of the United States.]

To understand how air-force navigator Tyler Stark ended up in a thornbush in the Libyan desert in March 2011, one must understand what it’s like to be president of the United States—and this president in particular. Hanging around Barack Obama for six months, in the White House, aboard Air Force One, and on the basketball court, Michael Lewis learns the reality of the Nobel Peace Prize winner who sent Stark into combat.

At nine o’clock one Saturday morning I made my way to the Diplomatic Reception Room, on the ground floor of the White House. I’d asked to play in the president’s regular basketball game, in part because I wondered how and why a 50-year-old still played a game designed for a 25-year-old body, in part because a good way to get to know someone is to do something with him. I hadn’t the slightest idea what kind of a game it was. The first hint came when a valet passed through bearing, as if they were sacred objects, a pair of slick red-white-and-blue Under Armour high-tops with the president’s number (44) on the side. Then came the president, looking like a boxer before a fight, in sweats and slightly incongruous black rubber shower shoes. As he climbed into the back of a black S.U.V., a worried expression crossed his face. “I forgot my mouth guard,” he said. Your mouth guard? I think. Why would you need a mouth guard?

“Hey, Doc,” he shouted to the van holding the medical staff that travels with him wherever he goes. “You got my mouth guard?” The doc had his mouth guard. Obama relaxed back in his seat and said casually that he didn’t want to get his teeth knocked out this time, “since we’re only 100 days away.” From the election, he meant, then he smiled and showed me which teeth, in some previous basketball game, had been knocked out. “Exactly what kind of game is this?” I asked, and he laughed and told me not to worry. He doesn’t. “What happens is, as I get older, the chances I’m going to play well go down. When I was 30 there was, like, a one-in-two chance. By the time I was 40 it was more like one in three or one in four.” He used to focus on personal achievement, but as he can no longer achieve so much personally, he’s switched to trying to figure out how to make his team win. In his decline he’s maintaining his relevance and sense of purpose.

Basketball hadn’t appeared on the president’s official schedule, and so we traveled the streets of Washington unofficially, almost normally. A single police car rode in front of us, but there were no motorcycles or sirens or whirring lights: we even stopped at red lights. It still took only five minutes to get to the court inside the F.B.I. The president’s game rotates around several federal courts, but he prefers the F.B.I.’s because it is a bit smaller than a regulation court, which also reduces the advantages of youth. A dozen players were warming up. I recognized Arne Duncan, the former captain of the Harvard basketball team and current secretary of education. Apart from him and a couple of disturbingly large and athletic guys in their 40s, everyone appeared to be roughly 28 years old, roughly six and a half feet tall, and the possessor of a 30-inch vertical leap. It was not a normal pickup basketball game; it was a group of serious basketball players who come together three or four times each week. Obama joins when he can. “How many of you played in college?” I asked the only player even close to my height. “All of us,” he replied cheerfully and said he’d played point guard at Florida State. “Most everyone played pro too—except for the president.” Not in the N.B.A., he added, but in Europe and Asia.

Overhearing the conversation, another player tossed me a jersey and said, “That’s my dad on your shirt. He’s the head coach at Miami.” Having highly developed fight-or-flight instincts, I realized in only about 4 seconds that I was in an uncomfortable situation, and it took only another 10 to figure out just how deeply I did not belong. Oh well, I thought, at least I can guard the president. Obama played in high school, on a team that won the Hawaii state championship. But he hadn’t played in college, and even in high school he hadn’t started. Plus, he hadn’t played in several months, and he was days away from his 51st birthday: how good could he be? (...)

From the time his wife goes to bed, around 10 at night, until he finally retires, at 1, Barack Obama enjoys the closest thing he experiences to privacy: no one but him really knows exactly where he is or what he’s up to. He can’t leave his house, of course, but he can watch ESPN, surf his iPad, read books, dial up foreign leaders in different time zones, and any number of other activities that feel almost normal. He can also wrestle his mind back into the state it would need to be if, say, he wanted to write.

And so, in a funny way, the president’s day actually starts the night before. When he awakens at seven, he already has a jump on things. He arrives at the gym on the third floor of the residence, above his bedroom, at 7:30. He works out until 8:30 (cardio one day, weights the next), then showers and dresses in either a blue or gray suit. “My wife makes fun of how routinized I’ve become,” he says. He’d moved a long way in this direction before he became president, but the office has moved him even further. “It’s not my natural state,” he says. “Naturally, I’m just a kid from Hawaii. But at some point in my life I overcompensated.” After a quick breakfast and a glance at the newspapers—most of which he’s already read on his iPad—he reviews his daily security briefing. When he first became president he often was surprised by the secret news; now he seldom is. “Maybe once a month.”

One summer morning I met him outside the private elevator that brings him down from the residence. His morning commute, of roughly 70 yards, started in the ground-floor center hall, and continued past a pair of oil paintings, of Rosalynn Carter and Betty Ford, and through two sets of double doors, guarded by a Secret Service officer. After a short walk along a back porch, guarded by several other men in black, he passed through a set of French doors into the reception area outside the Oval Office. His secretary, Anita, was already at her desk. Anita, he explained, has been with him since he campaigned for the Senate, back in 2004. As political attachments go, eight years isn’t a long time; in his case, it counts as forever. Eight years ago he could have taken a group tour of the White House and no one would have recognized him.

Passing Anita, the president walked into the Oval Office. “When I’m in Washington I spend half my time in this place,” he said. “It’s surprisingly comfortable.” During the week he is never alone in the office, but on weekends he can come down and have the place to himself. The first time Obama set foot in this room was right after he’d been elected, to pay a call on George Bush. The second time was the first day he arrived for work—and the first thing he did was call in several junior people who had been with him since long before anyone cared who he was so they might see how it felt to sit in the Oval Office. “Let’s just stay normal,” he said to them.

by Michael Lewis, Vanity Fair |  Read more:
Photo: Pete Souza

A Class to Teach You How to Use Google

Think you know how to use Google? Think again.

One of the search engine’s biggest strengths is its simplicity — type anything into the search box and you’re off. But people could get a lot more out of Google, the company says, if they learned a few expert techniques, like searching by color, time or image. So Google is offering a free online course to teach search skills.

“It’s like a car you never take out of first gear,” said Dan Russell, whose title at Google is the joyful-sounding über tech lead for search quality and user happiness. “Sure, you can drive around town just fine, but if I show you second or third gear, you can get a lot more done. You could be a Formula One racing car driver. There’s all kinds of stuff there, but man, once I show it to you, you’ve got power.”

Google first offered the class in July, when 155,000 people signed up and improved their search skills 40 percent on average, according to assessments before and after the course. Registration for the next course began Tuesday morning and the first class is Sept. 24. There are three classes a week for two weeks, each a 50-minute video plus quizzes. Students can watch the videos anytime, but if they watch them at class time, they can participate in forums with other students and teaching assistants. (People can also watch the videos without signing up for the course, but they will not get a certificate of completion — potentially the new sign of cachet alongside college diplomas on office walls.)

When Mr. Russell is not teaching, he studies how people use Google. What he has discovered, which he says is true across computer applications, is that most people learn the minimum amount that they need to get the job done and then stop exploring. They rarely change default settings, for example, or try out advanced features.

But do people really need a course to teach them how to use Google? Not at the most basic level, Mr. Russell said, but Google often adds new features and people can get more out of the search engine if they know about them. For example, he said, many people don’t realize that they can drag an image into the search box to find out what it is, rearrange news results by date or convert 20,000 leagues to miles. (Gadgetwise has a few tips.)

by Claire Cain Miller, NY Times |  Read more:

Crowd-funding a Career



Just before midnight last Thursday in an industrial parking lot in Brooklyn, the singer Amanda Palmer stood before a few hundred of her fans in a dress made of balloons, urging anyone with pins or scissors to pop the garment away and reveal her nude body beneath it.

It was a typically theatrical gesture by Ms. Palmer, a 36-year-old performer who calls her style “punk cabaret.” But it also symbolized the extent to which she has opened herself up to her fans, intimately and unconventionally, to cultivate her career.

The performance was part of a nightlong party to celebrate the nearly $1.2 million she has raised for her new album on the crowdfunding site Kickstarter, with 24,883 fans making contributions ranging from $1 to download the album to $10,000 for a private dinner.

“It doesn’t feel like a windfall,” Ms. Palmer said in an interview before the party. “It feels like the accumulated reward for years and years of work.”

Ms. Palmer is one of music’s most productive users of social media, galvanizing a modest fan base — her last album sold only 36,000 copies, and she tours small clubs and theaters — through constant interaction that blurs the usual line between performer and audience. She posts just-written songs to YouTube and is a prolific correspondent on Twitter, soliciting creative feedback from her 562,000 followers and selling tens of thousands of dollars of merchandise in flash sales. That engagement has brought her rare loyalty. (...)

The $1,192,793 Ms. Palmer raised in the monthlong campaign for her album, “Theater Is Evil,” is by far the most for any music campaign on Kickstarter, where the average successful project brings in about $5,000. (...)

Despite its handmade touch, Ms. Palmer’s business is not entirely do-it-yourself. She has experienced managers and publicists behind her, and every step of her fund-raising campaign was choreographed. New songs, video teasers, photos and behind-the-scenes blog posts were spread out to stoke fan interest. As with any well-executed marketing plan, sales jumped whenever fans were goosed with new media.

by Ben Sisario, NY Times |  Read more:
Photo: Rahav Segev