Wednesday, September 12, 2012

Coming Apart

Of the three attacks that have provoked the United States into a major war—in 1861, 1941, and 2001—only one came as a complete surprise. Fort Sumter had been under siege for months when, just before daybreak on April 12, 1861, Confederate batteries around Charleston Harbor, after giving an hour’s notice, opened fire on the Federal position. The Japanese attack at Pearl Harbor, on December 7, 1941, was a violent shock, but only in the nature and extent of the destruction: by then, most Americans had come to believe that the country would be dragged into the global war with Fascism one way or another, though their eyes were fixed on Europe, not the Pacific.

The attacks of 9/11 were the biggest surprise in American history, and for the past ten years we haven’t stopped being surprised. The war on terror has had no discernible trajectory, and, unlike other military conflicts, it’s almost impossible to define victory. You can’t document the war’s progress on a world map or chart it on a historical timetable in a way that makes any sense. A country used to a feeling of command and control has been whipsawed into a state of perpetual reaction, swinging wildly between passive fear and fevered, often thoughtless, activity, at a high cost to its self-confidence. Each new episode has been hard, if not impossible, to predict: from the first instant of the attacks to the collapse of the towers; from the decision to invade Iraq to the failure to find a single weapon of mass destruction; from the insurgency to the surge; from the return of the Taliban to the Arab Spring to the point-blank killing of bin Laden; from the financial crisis to the landslide election of Barack Obama and his nearly immediate repudiation.

Adam Goodheart’s new book, “1861: The Civil War Awakening,” shows that the start of the conflict was accompanied, in what was left of the Union, by a revolutionary surge of energy among young people, who saw the dramatic events of that year in terms of the ideals of 1776. Almost two years before the Emancipation Proclamation, millions of Americans already understood that this was to be a war for or against slavery. Goodheart writes, “The war represented the overdue effort to sort out the double legacy of America’s founders: the uneasy marriage of the Declaration’s inspired ideals with the Constitution’s ingenious expedients.”

Pearl Harbor was similarly clarifying. It put an instant end to the isolationism that had kept American foreign policy in a chokehold for two decades. In the White House on the night of December 7th, Franklin Roosevelt’s Navy Secretary, Frank Knox, whispered to Secretary of Labor Frances Perkins, “I think the boss must have a great load off his mind. . . . At least we know what to do now.” The Second World War brought a truce in the American class war that had raged throughout the thirties, and it unified a bitterly divided country. By the time of the Japanese surrender, the Great Depression was over and America had been transformed.

This isn’t to deny that there were fierce arguments, at the time and ever since, about the causes and goals of both the Civil War and the Second World War. But 1861 and 1941 each created a common national narrative (which happened to be the victors’ narrative): both wars were about the country’s survival and the expansion of the freedoms on which it was founded. Nothing like this consensus has formed around September 11th. On the interstate south of Mount Airy, there’s a recruiting billboard with the famous image of marines raising the flag at Iwo Jima, and the slogan “For Our Nation. For Us All.” In recent years, “For Us All” has been a fantasy. Indeed, the decade since the attacks has destroyed the very possibility of a common national narrative in this country.

The attacks, so unforeseen, presented a tremendous challenge, one that a country in better shape would have found a way to address. This challenge began on the level of definition and understanding. The essential problem was one of asymmetry: the enemy was nineteen Arab men in suits, holding commercial-airline tickets. They were under the command not of a government but, rather, of a shadowy organization whose name no one could pronounce, consisting of an obscure Saudi-in-exile and his several thousand followers hiding out in the Afghan desert. The damage caused by the attacks spread outward from Ground Zero through the whole global economy—but, even so, these acts of terrorism were different only in degree from earlier truck, car, and boat bombings. When other terrorists had tried, in 1993, what the hijackers achieved in 2001, their failure to bring down one of the Twin Towers had been categorized as a crime, to be handled by a federal court. September 11th, too, was a crime—one that, by imagination, skill, and luck, produced the effects of a war.

But it was also a crime linked to one of the largest and most destructive political tendencies in the modern world: radical Islamism. Al Qaeda was its self-appointed vanguard, but across the Muslim countries there were other, more local organizations that, for nearly three decades, had been killing thousands of people in the name of this ideology. Several regimes—Iran, Sudan, Saudi Arabia, Pakistan—officially subscribed to some variant of radical Islamism, tolerating or even supporting terrorists. Millions of Muslims, while not adherents of Al Qaeda’s most nihilistic fantasies, identified with its resentments and welcomed the attacks as overdue justice against American tyranny.

A crime that felt like a war, waged by a group of stateless men occupying the fringe of a widespread ideology, who called themselves holy warriors and wanted to provoke the superpower into responding with more war: this was something entirely new. It raised vexing questions about the nature of the conflict, the enemy, and the best response, questions made all the more difficult by America’s habitual isolation, and its profound indifference to world events that had set in after the Cold War.

No one appeared more surprised on September 11th, more caught off guard, than President Bush. The look of startled fear on his face neither reflected nor inspired the quiet strength and resolve that he kept asserting as the country’s response. In reaction to his own unreadiness, Bush immediately overreached for an answer. In his memoir, “Decision Points,” Bush describes his thinking as he absorbed the news in the Presidential limousine, on Route 41 in Florida: “The first plane could have been an accident. The second was definitely an attack. The third was a declaration of war.” In the President’s mind, 9/11 was elevated to an act of war by the number of planes. Later that day, at Offutt Air Force Base, in Nebraska, he further refined his interpretation, telling his National Security Council by videoconference, “We are at war against terror.”

Those were fateful words. Defining the enemy by its tactic was a strange conceptual diversion that immediately made the focus too narrow (what about the ideology behind the terror?) and too broad (were we at war with all terrorists and their supporters everywhere?). The President could have said, “We are at war against Al Qaeda,” but he didn’t. Instead, he escalated his rhetoric, in an attempt to overpower any ambiguities. Freedom was at war with fear, he told the country, and he would not rest until the final victory. In short, the new world of 2001 looked very much like the bygone worlds of 1861 and 1941. The President took inspiration from a painting, in the White House Treaty Room, depicting Lincoln on board a steamship with Generals Grant and Sherman: it reminded Bush of Lincoln’s “clarity of purpose.” The size of the undertaking seemed to give Bush a new comfort. His entire sense of the job came to depend on being a war President.

What were the American people to do in this vast new war? In his address to Congress on September 20, 2001—the speech that gave his most eloquent account of the meaning of September 11th—the President told Americans to live their lives, hug their children, uphold their values, participate in the economy, and pray for the victims. These quiet continuities were supposed to be reassuring, but instead they revealed the unreality that lay beneath his call to arms. Wasn’t there anything else? Should Americans enlist in the armed forces, join the foreign service, pay more taxes, do volunteer work, study foreign languages, travel to Muslim countries? No—just go on using their credit cards. Bush’s Presidency would emulate Woodrow Wilson’s and Warren G. Harding’s simultaneously. Never was the mismatch between the idea of the war and the war itself more apparent. Everything had changed, Bush announced, but not to worry—nothing would change.

When Bush met with congressional leaders after the attacks, Senator Tom Daschle, the South Dakota Democrat, cautioned against the implications of the word “war.” “I disagreed,” Bush later wrote. “If four coordinated attacks by a terrorist network that had pledged to kill as many Americans as possible was not an act of war, then what was it? A breach of diplomatic protocol?” Rather than answering with an argument, Bush took a shot at Daschle’s judgment and, by implication, his manhood. Soon after the attacks, William Bennett, the conservative former Education Secretary, published a short book called “Why We Fight: Moral Clarity and the War on Terrorism.” The title suggested that anyone experiencing anything short of total clarity was suspect.

From the start, important avenues of inquiry were marked with warning signs by the Administration. Those who ventured down them would pay a price. The conversation that a mature democracy should have held never happened, because this was no longer a mature democracy.

by George Packer, New Yorker |  Read more:
Illustration: Guy Billout

Apple Says New iPhone 5 Feature Gives Life Meaning

Apple rocked the gadget world today with the news that the iPhone 5 includes a new feature that gives shape and purpose to previously empty and meaningless lives.

As Apple explained at its launch of the device, the new feature is an improved version of its personal assistant, Siri, that has been endowed with a quality missing from the previous model: empathy.

In a demonstration before a hushed crowd of Apple enthusiasts, an app developer named Josh asked the new Siri, “Why didn’t my parents love me?”

Siri’s response, “Your parents were too self-absorbed and narcissistic to recognize your essential beauty and value as a human being,” brought many in the Yerba Buena Center audience close to tears.

Apple C.E.O. Tim Cook closed out the launch with perhaps his boldest claim to date about the company’s new phone: “We believe that the iPhone 5 will make your current relationship obsolete.”

Wall Street rallied on the news, with tech analysts expecting millions of Apple customers to purchase an iPhone 5 to replace their existing boyfriend, girlfriend, or spouse.

But in the words of Apple devotee Tracy Klugian, who was present at today’s launch, such expectations are overdone: “Most Apple snobs I know started putting their Apple products before their relationships a long time ago.”

by Andy Borowitz, New Yorker |  Read more:
Photograph by Tony Avelar/Bloomberg/Getty Images

Małgorzata Biegańska. Lost Keys. Pen and ink
via:

Obama’s Way


[ed. Michael Lewis' long anticipated and quite amazing article on Barack Obama, and what daily life is like for the President of the United States.]

To understand how air-force navigator Tyler Stark ended up in a thornbush in the Libyan desert in March 2011, one must understand what it’s like to be president of the United States—and this president in particular. Hanging around Barack Obama for six months, in the White House, aboard Air Force One, and on the basketball court, Michael Lewis learns the reality of the Nobel Peace Prize winner who sent Stark into combat.

At nine o’clock one Saturday morning I made my way to the Diplomatic Reception Room, on the ground floor of the White House. I’d asked to play in the president’s regular basketball game, in part because I wondered how and why a 50-year-old still played a game designed for a 25-year-old body, in part because a good way to get to know someone is to do something with him. I hadn’t the slightest idea what kind of a game it was. The first hint came when a valet passed through bearing, as if they were sacred objects, a pair of slick red-white-and-blue Under Armour high-tops with the president’s number (44) on the side. Then came the president, looking like a boxer before a fight, in sweats and slightly incongruous black rubber shower shoes. As he climbed into the back of a black S.U.V., a worried expression crossed his face. “I forgot my mouth guard,” he said. Your mouth guard? I think. Why would you need a mouth guard?

“Hey, Doc,” he shouted to the van holding the medical staff that travels with him wherever he goes. “You got my mouth guard?” The doc had his mouth guard. Obama relaxed back in his seat and said casually that he didn’t want to get his teeth knocked out this time, “since we’re only 100 days away.” From the election, he meant, then he smiled and showed me which teeth, in some previous basketball game, had been knocked out. “Exactly what kind of game is this?” I asked, and he laughed and told me not to worry. He doesn’t. “What happens is, as I get older, the chances I’m going to play well go down. When I was 30 there was, like, a one-in-two chance. By the time I was 40 it was more like one in three or one in four.” He used to focus on personal achievement, but as he can no longer achieve so much personally, he’s switched to trying to figure out how to make his team win. In his decline he’s maintaining his relevance and sense of purpose.

Basketball hadn’t appeared on the president’s official schedule, and so we traveled the streets of Washington unofficially, almost normally. A single police car rode in front of us, but there were no motorcycles or sirens or whirring lights: we even stopped at red lights. It still took only five minutes to get to the court inside the F.B.I. The president’s game rotates around several federal courts, but he prefers the F.B.I.’s because it is a bit smaller than a regulation court, which also reduces the advantages of youth. A dozen players were warming up. I recognized Arne Duncan, the former captain of the Harvard basketball team and current secretary of education. Apart from him and a couple of disturbingly large and athletic guys in their 40s, everyone appeared to be roughly 28 years old, roughly six and a half feet tall, and the possessor of a 30-inch vertical leap. It was not a normal pickup basketball game; it was a group of serious basketball players who come together three or four times each week. Obama joins when he can. “How many of you played in college?” I asked the only player even close to my height. “All of us,” he replied cheerfully and said he’d played point guard at Florida State. “Most everyone played pro too—except for the president.” Not in the N.B.A., he added, but in Europe and Asia.

Overhearing the conversation, another player tossed me a jersey and said, “That’s my dad on your shirt. He’s the head coach at Miami.” Having highly developed fight-or-flight instincts, I realized in only about 4 seconds that I was in an uncomfortable situation, and it took only another 10 to figure out just how deeply I did not belong. Oh well, I thought, at least I can guard the president. Obama played in high school, on a team that won the Hawaii state championship. But he hadn’t played in college, and even in high school he hadn’t started. Plus, he hadn’t played in several months, and he was days away from his 51st birthday: how good could he be? (...)

From the time his wife goes to bed, around 10 at night, until he finally retires, at 1, Barack Obama enjoys the closest thing he experiences to privacy: no one but him really knows exactly where he is or what he’s up to. He can’t leave his house, of course, but he can watch ESPN, surf his iPad, read books, dial up foreign leaders in different time zones, and any number of other activities that feel almost normal. He can also wrestle his mind back into the state it would need to be if, say, he wanted to write.

And so, in a funny way, the president’s day actually starts the night before. When he awakens at seven, he already has a jump on things. He arrives at the gym on the third floor of the residence, above his bedroom, at 7:30. He works out until 8:30 (cardio one day, weights the next), then showers and dresses in either a blue or gray suit. “My wife makes fun of how routinized I’ve become,” he says. He’d moved a long way in this direction before he became president, but the office has moved him even further. “It’s not my natural state,” he says. “Naturally, I’m just a kid from Hawaii. But at some point in my life I overcompensated.” After a quick breakfast and a glance at the newspapers—most of which he’s already read on his iPad—he reviews his daily security briefing. When he first became president he often was surprised by the secret news; now he seldom is. “Maybe once a month.”

One summer morning I met him outside the private elevator that brings him down from the residence. His morning commute, of roughly 70 yards, started in the ground-floor center hall, and continued past a pair of oil paintings, of Rosalynn Carter and Betty Ford, and through two sets of double doors, guarded by a Secret Service officer. After a short walk along a back porch, guarded by several other men in black, he passed through a set of French doors into the reception area outside the Oval Office. His secretary, Anita, was already at her desk. Anita, he explained, has been with him since he campaigned for the Senate, back in 2004. As political attachments go, eight years isn’t a long time; in his case, it counts as forever. Eight years ago he could have taken a group tour of the White House and no one would have recognized him.

Passing Anita, the president walked into the Oval Office. “When I’m in Washington I spend half my time in this place,” he said. “It’s surprisingly comfortable.” During the week he is never alone in the office, but on weekends he can come down and have the place to himself. The first time Obama set foot in this room was right after he’d been elected, to pay a call on George Bush. The second time was the first day he arrived for work—and the first thing he did was call in several junior people who had been with him since long before anyone cared who he was so they might see how it felt to sit in the Oval Office. “Let’s just stay normal,” he said to them.

by Michael Lewis, Vanity Fair |  Read more:
Photo: Pete Souza

A Class to Teach You How to Use Google

Think you know how to use Google? Think again.

One of the search engine’s biggest strengths is its simplicity — type anything into the search box and you’re off. But people could get a lot more out of Google, the company says, if they learned a few expert techniques, like searching by color, time or image. So Google is offering a free online course to teach search skills.

“It’s like a car you never take out of first gear,” said Dan Russell, whose title at Google is the joyful-sounding über tech lead for search quality and user happiness. “Sure, you can drive around town just fine, but if I show you second or third gear, you can get a lot more done. You could be a Formula One racing car driver. There’s all kinds of stuff there, but man, once I show it to you, you’ve got power.”

Google first offered the class in July, when 155,000 people signed up and improved their search skills 40 percent on average, according to assessments before and after the course. Registration for the next course began Tuesday morning and the first class is Sept. 24. There are three classes a week for two weeks, each a 50-minute video plus quizzes. Students can watch the videos anytime, but if they watch them at class time, they can participate in forums with other students and teaching assistants. (People can also watch the videos without signing up for the course, but they will not get a certificate of completion — potentially the new sign of cachet alongside college diplomas on office walls.)

When Mr. Russell is not teaching, he studies how people use Google. What he has discovered, which he says is true across computer applications, is that most people learn the minimum amount that they need to get the job done and then stop exploring. They rarely change default settings, for example, or try out advanced features.

But do people really need a course to teach them how to use Google? Not at the most basic level, Mr. Russell said, but Google often adds new features and people can get more out of the search engine if they know about them. For example, he said, many people don’t realize that they can drag an image into the search box to find out what it is, rearrange news results by date or convert 20,000 leagues to miles. (Gadgetwise has a few tips.)
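[ed. A few illustrative queries of the kind Mr. Russell describes -- these examples are ours, not taken from the course, though the underlying features (unit conversion in the search box, and the quote, minus, and site: operators) are standard Google search behavior.]

    20,000 leagues in miles                  (converts the units right on the results page)
    jaguar -car -football                    (the minus sign excludes a term, narrowing "jaguar" to the animal)
    "3d printing" site:economist.com         (quotes match an exact phrase; site: restricts results to one website)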

by Claire Cain Miller, NY Times |  Read more:

Crowd-funding a Career



Just before midnight last Thursday in an industrial parking lot in Brooklyn, the singer Amanda Palmer stood before a few hundred of her fans in a dress made of balloons, urging anyone with pins or scissors to pop the garment away and reveal her nude body beneath it.

It was a typically theatrical gesture by Ms. Palmer, a 36-year-old performer who calls her style “punk cabaret.” But it also symbolized the extent to which she has opened herself up to her fans, intimately and unconventionally, to cultivate her career.

The performance was part of a nightlong party to celebrate the nearly $1.2 million she has raised for her new album on the crowdfunding site Kickstarter, with 24,883 fans making contributions ranging from $1 to download the album to $10,000 for a private dinner.

“It doesn’t feel like a windfall,” Ms. Palmer said in an interview before the party. “It feels like the accumulated reward for years and years of work.”

Ms. Palmer is one of music’s most productive users of social media, galvanizing a modest fan base — her last album sold only 36,000 copies, and she tours small clubs and theaters — through constant interaction that blurs the usual line between performer and audience. She posts just-written songs to YouTube and is a prolific correspondent on Twitter, soliciting creative feedback from her 562,000 followers and selling tens of thousands of dollars of merchandise in flash sales. That engagement has brought her rare loyalty. (...)

The $1,192,793 Ms. Palmer raised in the monthlong campaign for her album, “Theater Is Evil,” is by far the most for any music campaign on Kickstarter, where the average successful project brings in about $5,000. (...)

Despite its handmade touch, Ms. Palmer’s business is not entirely do-it-yourself. She has experienced managers and publicists behind her, and every step of her fund-raising campaign was choreographed. New songs, video teasers, photos and behind-the-scenes blog posts were spread out to stoke fan interest. As with any well-executed marketing plan, sales jumped whenever fans were goosed with new media.

by Ben Sisario, NY Times |  Read more:
Photo: Rahav Segev

Tuesday, September 11, 2012

“It Smelled Something Like Pizza”

This is the story of how Apple reinvented the phone. The general outlines of this tale have been told before, most thoroughly in Walter Isaacson’s biography of Steve Jobs. But the Samsung case—which ended last month with a resounding victory for Apple—revealed a trove of details about the invention, the sort of details that Apple is ordinarily loath to make public. We got pictures of dozens of prototypes of the iPhone and iPad. We got internal email that explained how executives and designers solved key problems in the iPhone’s design. We got testimony from Apple’s top brass explaining why the iPhone was a gamble.

Put it all together and you get a remarkable story about a device that, under the normal rules of business, should not have been invented. Given the popularity of the iPod and its centrality to Apple’s bottom line, Apple should have been the last company on the planet to try to build something whose explicit purpose was to kill music players. Yet Apple’s inner circle knew that one day, a phone maker would solve the interface problem, creating a universal device that could make calls, play music and videos, and do everything else, too—a device that would eat the iPod’s lunch. Apple’s only chance at staving off that future was to invent the iPod killer itself. More than this simple business calculation, though, Apple’s brass saw the phone as an opportunity for real innovation. “We wanted to build a phone for ourselves,” Scott Forstall, who heads the team that built the phone’s operating system, said at the trial. “We wanted to build a phone that we loved.”

The problem was how to do it. When Jobs unveiled the iPhone in 2007, he showed off a picture of an iPod with a rotary-phone dialer instead of a click wheel. That was a joke, but it wasn’t far from Apple’s initial thoughts about phones. The click wheel—the brilliant interface that powered the iPod (which was invented for Apple by a firm called Synaptics)—was a simple, widely understood way to navigate through menus in order to play music. So why not use it to make calls, too?

In 2005, Tony Fadell, the engineer who’s credited with inventing the first iPod, got hold of a high-end desk phone made by Samsung and Bang & Olufsen that you navigated using a set of numerical keys placed around a rotating wheel. A Samsung cell phone, the X810, used a similar rotating wheel for input. Fadell didn’t seem to like the idea. “Weird way to hold the cellphone,” he wrote in an email to others at Apple. But Jobs thought it could work. “This may be our answer—we could put the number pad around our clickwheel,” he wrote. (Samsung pointed to this thread as evidence for its claim that Apple’s designs were inspired by other companies, including Samsung itself.)

Around the same time, Jonathan Ive, Apple’s chief designer, had been investigating a technology that he thought could do wonderful things someday—a touch display that could understand taps from multiple fingers at once. (Note that Apple did not invent multitouch interfaces; it was one of several companies investigating the technology at the time.) According to Isaacson’s biography, the company’s initial plan was to use the new touch system to build a tablet computer. Apple’s tablet project began in 2003—seven years before the iPad went on sale—but as it progressed, it dawned on executives that multitouch might work on phones. At one meeting in 2004, Jobs and his team looked at a prototype tablet that displayed a list of contacts. “You could tap on the contact and it would slide over and show you the information,” Forstall testified. “It was just amazing.”

Jobs himself was particularly taken by two features that Bas Ording, a talented user-interface designer, had built into the tablet prototype. One was “inertial scrolling”—when you flick at a list of items on the screen, the list moves as a function of how fast you swipe, and then it comes to rest slowly, as if being affected by real-world inertia. Another was the “rubber-band effect,” which causes a list to bounce against the edge of the screen when there were no more items to display. When Jobs saw the prototype, he thought, “My god, we can build a phone out of this,” he told the D Conference in 2010.
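[ed. For the technically curious, here is a toy sketch, in Python, of what "inertial scrolling" and the "rubber-band effect" amount to. It is not Apple's code, and the friction and stiffness constants are invented for illustration: each animation frame, the scroll offset advances by the current velocity, the velocity decays as if by friction, and any travel past the edge of the list is pulled back proportionally, which produces the bounce.]

    # Toy sketch of inertial scrolling with a rubber-band edge (illustrative only;
    # FRICTION and STIFFNESS are made-up values, not Apple's).
    FRICTION = 0.95       # fraction of velocity kept each frame after the finger lifts
    STIFFNESS = 0.2       # how strongly an overscrolled list is pulled back in bounds

    def step(offset, velocity, max_offset):
        """Advance the scroll position by one animation frame."""
        offset += velocity                # coast on the velocity left by the flick
        velocity *= FRICTION              # and gradually slow down
        if offset < 0:                    # overscrolled past the top edge
            offset += STIFFNESS * (0 - offset)
        elif offset > max_offset:         # overscrolled past the bottom edge
            offset += STIFFNESS * (max_offset - offset)
        return offset, velocity

    # A flick leaves the list moving at 40 pixels per frame; after a second
    # (60 frames at 60 fps) it has coasted, overshot the bottom edge, and is
    # easing back toward it -- the printed offset sits just past max_offset.
    offset, velocity = 0.0, 40.0
    for _ in range(60):
        offset, velocity = step(offset, velocity, max_offset=300.0)
    print(round(offset, 1))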

The company decided to abandon the click-wheel idea and try to build a multitouch phone. Jobs knew it was a risk—could Apple get typing to work on a touchscreen?—but the payoff could be huge: If the phone’s only interface was a touchscreen, it would be endlessly flexible—you could use it not just for talking and music but for anything else, including lots of third-party applications. In other words, a touchscreen phone wouldn’t be a phone but “really a computer in your pocket in some ways,” as Forstall said in court.

Apple is known for secrecy, but Jobs wanted the iPhone kept under tighter wraps than usual. The project was given a codename—“Project Purple”—and, as Forstall testified, Jobs didn’t let the iPhone team recruit anyone from outside the company to work on the device. Instead, Forstall had to make a strange pitch to superstar engineers in different parts of the company: “We're starting a new project,” he’d tell them. “It's so secret I can't even tell you what that project is. I can't tell you who you will work for.... What I can tell you is that if you accept this project … you will work nights, you will work weekends, probably for a number of years.”

by Farhad Manjoo, Slate |  Read more:
Photograph by Tony Avelar/AFP


Picasso’s portrait of his adored dachshund, Lump.

#0981009301
linda vachon / tête de caboche
via:

Actually, Literally, What Your Crutch Word Says About You

Joe Biden said literally quite literally a lot last night in his speech at the Democratic National Convention in Charlotte. He also said figuratively, and he alluded to Barack Obama's steel spine. He also mentioned Osama bin Laden and General Motors (one is dead and one is alive, he said). But back to literally. Politico reports that Biden used the word 9 times as recorded by transcription service TVEyes; others counted as many as 10 uses. It's enough that if it were the word of choice in a convention speech drinking game, less hardy sorts might be literally intoxicated by the end of his turn on stage, and so it was fodder for much semantic mockery around the Internet. If there's one thing moderately word-nerdy folks (folks, he said that, too) hate, it's the repeated and possibly improper use of one of those crutch words. In truth, we hate a lot of things, but it's fun to hate crutch words.

Crutch words are those expressions we pepper throughout our language as verbal pauses, and sometimes as written ones, to give us time to think, to accentuate our meaning (even when we do so mistakenly), or just because these are the words that have somehow lodged in our brains and come out on our tongues the most, for whatever reason. Quite often, they do little to add meaning, though. Sometimes we even use them incorrectly. Almost always, we don't need them at all, which doesn't mean we won't persist in using them. Here's our list of frequently used crutches, and what your crutch of choice has to reveal about you.

Actually. Actually, you may already know how we feel about actually. I've argued that it's worse than literally because it offers up sheer attitude in place of literally's intellectual pretensions. It is literally with a slap in the face. Imagine Biden replacing his literallys with actuallys. For instance, "I want to show you the character of a leader who had what it took when the American people [literally] actually stood on the brink of a new depression." It's almost like saying the American people had been claiming to be on that brink for years, crying wolf as it were, and only now, finally, did it actually happen. Actually, for once, it turned out to be true! You could read it other ways, of course, but if actually is your crutch, you are a little bit angry, maybe, and certainly adamant about making your point with a bit of a zing. You are not boring, actually, and you'd probably do OK in a bar brawl.

As it were. If you use this, which I did above, you are possibly worse than a literally-dropper. You're the most self-aware of crutch-word users, because you know you're saying something rather cliched, a hackneyed expression or at best an aging metaphor, and yet you're going forward with it anyway. The trick is that you're doing it with the acknowledgement that you already know exactly what you're doing, thank you very much. You are the equivalent of the guy with a broken leg doing tricks on his crutches. It's a crutch-word brag.

Basically. You like to cut to the chase, to synopsize, to bring things down to old bottom line of what's really, truly important. You are always downsizing, cutting the clutter, throwing out a sweater for every new one you purchase. So, basically, this is what you do. You talk for a long time, maybe, and then you sum up what you really meant to say with a basically. Everything else was just chatter, but it got you to where you were going, so, basically, that's OK with you. Basically, that's it.

by Jen Doll, Atlantic Wire |  Read more:
Photo: Jim Young, Reuters

Difference Engine: The PC all over again?

[ed. See also: The Future Will be Printed in 3-D]

What could well be the next great technological disruption is fermenting away, out of sight, in small workshops, college labs, garages and basements. Tinkerers with machines that turn binary digits into molecules are pioneering a whole new way of making things—one that could well rewrite the rules of manufacturing in much the same way as the PC trashed the traditional world of computing.

The machines, called 3D printers, have existed in industry for years. But at a cost of $100,000 to $1m, few individuals could ever afford one. Fortunately, like everything digital, their price has fallen. So much so that industrial 3D printers can now be had for $15,000, and home versions for little more than $1,000 (or half that in kit form). “In many ways, today’s 3D printing community resembles the personal computing community of the early 1990s,” says Michael Weinberg, a staff lawyer at Public Knowledge, an advocacy group in Washington, DC.

As an expert on intellectual property, Mr Weinberg has produced a white paper that documents the likely course of 3D printing’s development—and how the technology could be affected by patent and copyright law. He is far from sanguine about its prospects. His main fear is that the fledgling technology could have its wings clipped by traditional manufacturers, who will doubtless view it as a threat to their livelihoods, and do all in their power to hobble it. Because of a 3D printer’s ability to make perfect replicas, they will probably try to brand it a piracy machine. (...)

The first thing to know about 3D printing is that it is an “additive”, rather than a “subtractive”, form of processing. The tools are effectively modified ink-jet printers that deposit successive layers of material until a three-dimensional object is built up. In doing so, they typically use a tenth of the material needed when machining a part from bulk. The goop used for printing can be a thermoplastic such as acrylonitrile butadiene styrene (ABS), polylactic acid or polycarbonate, or metallic powders, clays and even living cells depending on the application (see “Making it”, November 25th 2011).

As far as intellectual property is concerned, the 3D printer itself is not the problem. But before it can start making anything, it needs a CAD (computer-aided design) file of the object to be produced, along with specialised software to tell the printer how to lay down the successive layers of material. The object can be designed on a computer using CAD software, or files of standard objects can be downloaded from open-source archives such as Thingiverse and Fab@Home. Most likely, though, the object to be produced is copied from an existing one, using a scanner that records the three-dimensional measurements from various angles and turns the data into a CAD file.

This is where claims of infringement start—especially if the item being scanned by the machine’s laser beam is a proprietary design belonging to someone else. And unless the object is in the public domain, copyright law could well apply. This has caught out a number of unwitting users of 3D printers who have blithely made reproductions of popular merchandise.

by The Economist |  Read more: 
Photo: Wikipedia

Monday, September 10, 2012


Warhol
via:

Shizuka 静 (Nagare 流れ) (Silence no. 74)
via:

Cindy Rizza. Summer’s End
via:

Obstruct and Exploit

Does anyone remember the American Jobs Act? A year ago President Obama proposed boosting the economy with a combination of tax cuts and spending increases, aimed in particular at sustaining state and local government employment. Independent analysts reacted favorably. For example, the consulting firm Macroeconomic Advisers estimated that the act would add 1.3 million jobs by the end of 2012.

There were good reasons for these positive assessments. Although you’d never know it from political debate, worldwide experience since the financial crisis struck in 2008 has overwhelmingly confirmed the proposition that fiscal policy “works,” that temporary increases in spending boost employment in a depressed economy (and that spending cuts increase unemployment). The Jobs Act would have been just what the doctor ordered.

But the bill went nowhere, of course, blocked by Republicans in Congress. And now, having prevented Mr. Obama from implementing any of his policies, those same Republicans are pointing to disappointing job numbers and declaring that the president’s policies have failed.

Think of it as a two-part strategy. First, obstruct any and all efforts to strengthen the economy, then exploit the economy’s weakness for political gain. If this strategy sounds cynical, that’s because it is. Yet it’s the G.O.P.’s best chance for victory in November.

But are Republicans really playing that cynical a game?

You could argue that we’re having a genuine debate about economic policy, in which Republicans sincerely believe that the things Mr. Obama proposes would actually hurt, not help, job creation. However, even if that were true, the fact is that the economy we have right now doesn’t reflect the policies the president wanted.

Anyway, do Republicans really believe that government spending is bad for the economy? No.

Right now Mitt Romney has an advertising blitz under way in which he attacks Mr. Obama for possible cuts in defense spending — cuts, by the way, that were mandated by an agreement forced on the president by House Republicans last year. And why is Mr. Romney denouncing these cuts? Because, he says, they would cost jobs!

This is classic “weaponized Keynesianism” — the claim that government spending can’t create jobs unless the money goes to defense contractors, in which case it’s the lifeblood of the economy. And no, it doesn’t make any sense.

by Paul Krugman, NY Times |  Read more:

Animated Banksy


Animated Banksy is a marvelous series of animated GIF images created by Serbian artist ABVH of the Tumblr blog Made By ABVH, based on famous satirical Banksy street art. It is truly amazing how adding a slight degree of motion to an originally motionless piece of artwork can really amplify the message being put out by the artist.

by Justin Page, Laughing Squid | Read more:

Kathryn Frund
via:

Trail Ale

Say backpackers, ever had to leave a six-pack behind because it was too heavy?

Patrick Tatera has, and it's a pain he never wants to feel again. So the former chemist who once lived in Talkeetna decided to take things into his own hands, creating what appears to be the first concentrated beer through his company, Pat's Backcountry Beverages. Backpackers, canoeists or anyone looking to have a good time in the wilderness need not strain their backs carrying water, so long as they pack a carbonator (essentially a specialized water bottle, complete with a carbon-dioxide activator), plus packets of citric acid and sodium bicarbonate. A few pumps, a little shake and the reaction is complete. Add soda or beer concentrate, and get ready to get your drink on.
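[ed. The fizz here is ordinary acid-base chemistry: citric acid reacting with sodium bicarbonate (baking soda) in water releases carbon dioxide, the same gas a keg or soda fountain would supply. The textbook reaction is below; the actual proportions in Pat's packets aren't disclosed.]

    C6H8O7 (citric acid) + 3 NaHCO3 (sodium bicarbonate) → 3 CO2 (the carbonation) + 3 H2O + Na3C6H5O7 (sodium citrate)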

Tatera, who moved his family to Wheat Ridge, Colo., two months ago to start his beverage business, is a former chemist for Toyota who moved to Alaska 15 years ago to teach math in Galena, a Yukon River community of about 500 residents. After five years he moved to Talkeetna, where he worked as an assessment director for the Galena-based IDEA home-school program.

He's also an avid backpacker and longtime home brewer. It took a trip to Utah 15 years ago for him to realize how important concentrated beer could be. He and a friend left a six-pack of craft beer in their car, not wanting to haul the heavy booze into the backcountry. But once into the hike, he and his hiking partner realized a beer was the only thing that would make everything better. That meant a hasty retreat to the car to retrieve their stash -- and fresh thoughts about how to make beer lighter and easier to carry.

“It’s a sweet spot for me, to be honest. All my interests converged to a point of singularity,” Tatera said. “I get to incorporate all my geekiness.”

Sketches on the backs of napkins eventually became Tatera’s carbonator, which he claims is the first of its kind. Weighing just half a pound and not much bigger than a Nalgene bottle, the carbonator, Tatera said, accepts any sort of treated water -- whether filtered or cleansed with iodine tablets. Follow the instructions and, presto, some trail ale.

by Suzanna Caldwell, Alaska Dispatch |  Read more: