Friday, December 2, 2011

Light Show


[ed. I shy away from product endorsements as a rule, but the roll-out for this one is quite spectacular.] 

On Monday, 28 November, the Nokia Lumia 800 with Windows Phone (http://nokia.ly/uBVXxw) brought deadmau5 and the world's most advanced 4D technology together to create an amazing free light show at Millbank Tower, London.

Millbank was plunged into darkness, with the iconic tower acting as the canvas for a never-before-seen spectacle. Each of the 120-metre-high building's 800 windows was covered with vinyl as 16 powerful projectors, stationed 300 metres away on the other side of the river, beamed 3D images onto the structure. Huge butterflies flew across the London skyline, and the tower appeared to twist, pulsate and even collapse. Billed as the "future of live events," the show was accompanied by music from super-producer deadmau5, who created exclusive remixes for the performance -- adding the fourth dimension.

Health Care for a Changing Work Force

[ed. Co-ops will be an increasing trend in this era of diminished expectations and economic uncertainty; all kinds of co-ops: financial, health, assisted living, food, etc. There is strength (and protection) in numbers.]

Big institutions are often slow to awaken to major social transformations. Microsoft was famously late to grasp the importance of the Internet. American auto manufacturers were slow to identify the demand for fuel-efficient cars. And today, the United States government is making a similar mistake: it still doesn’t seem to recognize that Americans no longer work the way they used to.

Today, some 42 million people — about a third of the United States work force — do not have jobs in the traditional sense. They fall into a catchall category the government calls “contingent” workers. These people — independent contractors, freelancers, temp workers, part-timers, people between jobs — typically work on a project-to-project basis for a variety of clients, and most are outcasts from the traditional system of benefits that provide economic security to Americans. Even as the economy has changed, employment benefits are still based on an outdated industrial-era model in which workers are expected to stay with a single company for years, if not their whole careers.

For most of the 20th century, it was efficient to link benefits to jobs this way. But today, more and more work falls outside the one-to-one, employee-to-employer relationship. Work is decentralized, workers are mobile, and working arrangements are fluid. However, the risks of life haven’t gone away: people still need protections. They just need a different system to distribute them. They need benefits that they can carry around, like their laptops. As things stand, millions of independent workers go without health and unemployment insurance, protection against discrimination and unpaid wages, and pension plans. It makes no sense.

One of the social innovators to recognize this problem early and act on it was Sara Horowitz, the founder of the Freelancers Union, which has more than 165,000 members across all 50 states. At Fixes, we highlight practical applications of ideas that have the potential to achieve widespread impact. That means looking at how ideas take root in institutions that become part of the fabric of society.

In the early 20th century, a wave of new institutions — including the early labor unions and hundreds of civil society organizations like Rotary International, the Boy and Girl Scouts, and the N.A.A.C.P. — reshaped the American landscape. Today, the Freelancers Union offers a glimpse of the kind of social enterprise — mission-driven and pragmatic, market-savvy and cooperative — that is likely to proliferate in the coming years to meet the needs of a fast-changing work force and society.

by David Bornstein, NY Times | Read more:
Photo: Carolyn Silveira

Pakayla Biehn
via:

My Next Life

“In my next life I want to live my life backwards. You start out dead and get that out of the way. Then you wake up in an old people’s home feeling better every day. You get kicked out for being too healthy, go collect your pension, and then when you start work, you get a gold watch and a party on your first day. You work for 40 years until you’re young enough to enjoy your retirement. You party, drink alcohol, and are generally promiscuous, then you are ready for high school. You then go to primary school, you become a kid, you play. You have no responsibilities, you become a baby until you are born. And then you spend your last 9 months floating in luxurious spa-like conditions with central heating and room service on tap, larger quarters every day and then Voila! You finish off as an orgasm!” ― Woody Allen

[ed. One of my favorite Woody Allen quotes.]

Woody Allen, born December 1, 1935.
h/t GS

Zach Wahls Speaks About Family


Zach Wahls, a 19-year-old University of Iowa student, spoke about the strength of his family during a public forum on House Joint Resolution 6 in the Iowa House of Representatives. Wahls, who has two mothers, spoke in opposition to the resolution, which would end civil unions in Iowa.

The Virtuous Cycle

[ed. Great article from one of the 1 percent.]

It is a tenet of American economic beliefs, and an article of faith for Republicans that is seldom contested by Democrats: If taxes are raised on the rich, job creation will stop.

Trouble is, sometimes the things that we know to be true are dead wrong. For the larger part of human history, for example, people were sure that the sun circles the Earth and that we are at the center of the universe. It doesn’t, and we aren’t. The conventional wisdom that the rich and businesses are our nation’s “job creators” is every bit as false.

I’m a very rich person. As an entrepreneur and venture capitalist, I’ve started or helped get off the ground dozens of companies in industries including manufacturing, retail, medical services, the Internet and software. I founded the Internet media company aQuantive Inc., which was acquired by Microsoft Corp. (MSFT) in 2007 for $6.4 billion. I was also the first non-family investor in Amazon.com Inc. (AMZN).

Even so, I’ve never been a “job creator.” I can start a business based on a great idea, and initially hire dozens or hundreds of people. But if no one can afford to buy what I have to sell, my business will soon fail and all those jobs will evaporate.

That’s why I can say with confidence that rich people don’t create jobs, nor do businesses, large or small. What does lead to more employment is the feedback loop between customers and businesses. And only consumers can set in motion a virtuous cycle that allows companies to survive and thrive and business owners to hire. An ordinary middle-class consumer is far more of a job creator than I ever have been or ever will be.

by Nick Hanauer, Bloomberg |  Read more:

Thursday, December 1, 2011

Fitz and the Tantrums, Daryl Hall


Space Invaders

Can I let you in on a secret? Typing two spaces after a period is totally, completely, utterly, and inarguably wrong.

And yet people who use two spaces are everywhere, their ugly error crossing every social boundary of class, education, and taste. You'd expect, for instance, that anyone savvy enough to read Slate would know the proper rules of typing, but you'd be wrong; every third e-mail I get from readers includes the two-space error. (In editing letters for "Dear Farhad," my occasional tech-advice column, I've removed enough extra spaces to fill my forthcoming volume of melancholy epic poetry, The Emptiness Within.) The public relations profession is similarly ignorant; I've received press releases and correspondence from the biggest companies in the world that are riddled with extra spaces. Some of my best friends are irredeemable two-spacers, too, and even my wife has been known to use an unnecessary extra space every now and then (though she points out that she does so only when writing to other two-spacers, just to make them happy).

What galls me about two-spacers isn't just their numbers. It's their certainty that they're right. Over Thanksgiving dinner last year, I asked people what they considered to be the "correct" number of spaces between sentences. The diners included doctors, computer programmers, and other highly accomplished professionals. Everyone—everyone!—said it was proper to use two spaces. Some people admitted to slipping sometimes and using a single space—but when writing something formal, they were always careful to use two. Others explained they mostly used a single space but felt guilty for violating the two-space "rule." Still others said they used two spaces all the time, and they were thrilled to be so proper. When I pointed out that they were doing it wrong—that, in fact, the correct way to end a sentence is with a period followed by a single, proud, beautiful space—the table balked. "Who says two spaces is wrong?" they wanted to know.

Typographers, that's who. The people who study and design the typewritten word decided long ago that we should use one space, not two, between sentences. That convention was not arrived at casually. James Felici, author of The Complete Manual of Typography, points out that the early history of type is one of inconsistent spacing. Hundreds of years ago some typesetters would end sentences with a double space, others would use a single space, and a few renegades would use three or four spaces. Inconsistency reigned in all facets of written communication; there were few conventions regarding spelling, punctuation, character design, and ways to add emphasis to type. But as typesetting became more widespread, its practitioners began to adopt best practices. Felici writes that typesetters in Europe began to settle on a single space around the early 20th century. America followed soon after.

by Farhad Manjoo, Slate |  Read more:

We Are All Expendable Now

One of the signatures of the Great Recession is the fact that we have sustained, long-term high unemployment along with a labor shortage. While unskilled blue-collar workers can't find a job, corporations like American Electric Power are struggling to find enough trained electricians, pipe-fitters, and other skilled workers.

This is not just a product of the recession, but a long-term structural issue: the "skills gap" that differentiates the fate of those workers who have acquired marketable knowledge and skills and those who have not. The unskilled can get by when the economy is good, but they can't get ahead, and when there is a prolonged period of economic malaise they find that they are expendable, and they are simply pushed out of the economy.

The fate of the unskilled laborer is only going to get worse. And while this is now primarily hitting blue-collar workers without college degrees, a different sort of "skills gap" is beginning to open up for white-collar workers. Whole classes of professionals who think of themselves as irreplaceable skilled workers--in many cases, highly skilled workers--are soon going to discover how much of what they do can be automated or outsourced. We will all be expendable soon.

It is not new to talk about the need to acquire "irreplaceable" skills. But what is not properly appreciated is the scope of the challenge this poses to people in all kinds of jobs, and the exact defining characteristic of what will make a skill "irreplaceable."

The basic rule of economics after the Industrial Revolution is: if a task can be automated, it will be. Or to put it differently, if a worker can be replaced by a machine, he will be. Call it the principle of expendability. The only thing that has changed since the first power loom is the number and nature of the tasks that can be automated. The first thing the Industrial Revolution did was to automate physical tasks. But now we are beginning to automate mental tasks, and what we are just beginning to see is the scope of the mental work that can be automated. It is much wider than you probably think.

An awful lot of work that is usually considered to require human intelligence really doesn't. Instead, these tasks require complex memorization and pattern recognition, perceptual-level skills that can be reduced to mechanical, digitized processes and done by a machine. These include many tasks that currently fill the days of highly educated, well-paid professionals.

Take doctors. A recent article by Farhad Manjoo, the technology columnist for Slate, describes how computers have begun to automate the screening of cervical cancer tests. A task that used to be done by two physicians, who could only process 90 images per day, can now be done with better results by one doctor and a machine, processing 170 images per day.

Or take lawyers. A lot of work done in the legal profession consists of asking a client a series of simple questions about his needs, using the answers to select a standard, well-established legal procedure (such as incorporation or the writing of a will), and then filling out forms by plugging in "boilerplate" language. All of this can be programmed into a database and done by computers online, as it now is by services such as LegalZoom.com.

Everywhere you look, you see the same trend. A huge volume of trading on the stock exchanges is now done by computer programs, not floor traders. Or take customer service, which might seem to require someone who can understand questions and reply with a comforting human voice. Well, meet Siri.

by Robert Tracinski, Real Clear Markets |  Read more:

Kill Bill 3


Yangon, Myanmar — Secretary of State Hillary Rodham Clinton said on Thursday that the United States would loosen some restrictions on international financial assistance and development programs in Myanmar, in response to a nascent political and economic opening in the country.

by Steven Lee Myers, NY Times | Read more:
Photo: Saul Loeb/Agence France-Presse — Getty Images

The Broken Contract

Inequality and American Decline

Iraq was one of those wars where people actually put on pounds. A few years ago, I was eating lunch with another reporter at an American-style greasy spoon in Baghdad's Green Zone. At a nearby table, a couple of American contractors were finishing off their burgers and fries. They were wearing the contractor's uniform: khakis, polo shirts, baseball caps, and Department of Defense identity badges in plastic pouches hanging from nylon lanyards around their necks. The man who had served their food might have been the only Iraqi they spoke with all day. The Green Zone was set up to make you feel that Iraq was a hallucination and you were actually in Normal, Illinois. This narcotizing effect seeped into the consciousness of every American who hunkered down and worked and partied behind its blast walls -- the soldier and the civilian, the diplomat and the journalist, the important and the obscure. Hardly anyone stayed longer than a year; almost everyone went home with a collection of exaggerated war stories, making an effort to forget that they were leaving behind shoddy, unfinished projects and a country spiraling downward into civil war. As the two contractors got up and ambled out of the restaurant, my friend looked at me and said, "We're just not that good anymore."

The Iraq war was a kind of stress test applied to the American body politic. And every major system and organ failed the test: the executive and legislative branches, the military, the intelligence world, the for-profits, the nonprofits, the media. It turned out that we were not in good shape at all -- without even realizing it. Americans just hadn't tried anything this hard in around half a century. It is easy, and completely justified, to blame certain individuals for the Iraq tragedy. But over the years, I've become more concerned with failures that went beyond individuals, and beyond Iraq -- concerned with the growing arteriosclerosis of American institutions. Iraq was not an exceptional case. It was a vivid symptom of a long-term trend, one that worsens year by year. The same ailments that led to the disastrous occupation were on full display in Washington this past summer, during the debt-ceiling debacle: ideological rigidity bordering on fanaticism, an indifference to facts, an inability to think beyond the short term, the dissolution of national interest into partisan advantage.

Was it ever any different? Is it really true that we're just not that good anymore? As a thought experiment, compare your life today with that of someone like you in 1978. Think of an educated, reasonably comfortable couple perched somewhere within the vast American middle class of that year. And think how much less pleasant their lives are than yours. The man is wearing a brown and gold polyester print shirt with a flared collar and oversize tortoiseshell glasses; she's got on a high-waisted, V-neck rayon dress and platform clogs. Their morning coffee is Maxwell House filter drip. They drive an AMC Pacer hatchback, with a nonfunctioning air conditioner and a tape deck that keeps eating their eight-tracks. When she wants to make something a little daring for dinner, she puts together a pasta primavera. They type their letters on an IBM Selectric, the new model with the corrective ribbon. There is only antenna television, and the biggest thing on is Laverne and Shirley. Long-distance phone calls cost a dollar a minute on weekends; air travel is prohibitively expensive. The city they live near is no longer a place where they spend much time: trash on the sidewalks, junkies on the corner, vandalized pay phones, half-deserted subway cars covered in graffiti.

By contemporary standards, life in 1978 was inconvenient, constrained, and ugly. Things were badly made and didn't work very well. Highly regulated industries, such as telecommunications and airlines, were costly and offered few choices. The industrial landscape was decaying, but the sleek information revolution had not yet emerged to take its place. Life before the Android, the Apple Store, FedEx, HBO, Twitter feeds, Whole Foods, Lipitor, air bags, the Emerging Markets Index Fund, and the pre-K Gifted and Talented Program prep course is not a world to which many of us would willingly return.

The surface of life has greatly improved, at least for educated, reasonably comfortable people -- say, the top 20 percent, socioeconomically. Yet the deeper structures, the institutions that underpin a healthy democratic society, have fallen into a state of decadence. We have all the information in the universe at our fingertips, while our most basic problems go unsolved year after year: climate change, income inequality, wage stagnation, national debt, immigration, falling educational achievement, deteriorating infrastructure, declining news standards. All around, we see dazzling technological change, but no progress. Last year, a Wall Street company that few people have ever heard of dug an 800-mile trench under farms, rivers, and mountains between Chicago and New York and laid fiber-optic cable connecting the Chicago Mercantile Exchange and the New York Stock Exchange. This feat of infrastructure building, which cost $300 million, shaves three milliseconds off high-speed, high-volume automated trades -- a big competitive advantage. But passenger trains between Chicago and New York run barely faster than they did in 1950, and the country no longer seems capable, at least politically, of building faster ones. Just ask people in Florida, Ohio, and Wisconsin, whose governors recently refused federal money for high-speed rail projects.

We can upgrade our iPhones, but we can't fix our roads and bridges. We invented broadband, but we can't extend it to 35 percent of the public. We can get 300 television channels on the iPad, but in the past decade 20 newspapers closed down all their foreign bureaus. We have touch-screen voting machines, but last year just 40 percent of registered voters turned out, and our political system is more polarized, more choked with its own bile, than at any time since the Civil War. There is nothing today like the personal destruction of the McCarthy era or the street fights of the 1960s. But in those periods, institutional forces still existed in politics, business, and the media that could hold the center together. It used to be called the establishment, and it no longer exists. Solving fundamental problems with a can-do practicality -- the very thing the world used to associate with America, and that redeemed us from our vulgarity and arrogance -- now seems beyond our reach.

THE UNWRITTEN CONTRACT

Why and how did this happen? Those are hard questions. A roundabout way of answering them is to first ask, when did this start to happen? Any time frame has an element of arbitrariness, and also contains the beginning of a theory. Mine goes back to that shabby, forgettable year of 1978. It is surprising to say that in or around 1978, American life changed -- and changed dramatically. It was, like this moment, a time of widespread pessimism -- high inflation, high unemployment, high gas prices. And the country reacted to its sense of decline by moving away from the social arrangement that had been in place since the 1930s and 1940s.

by George Packer, Foreign Affairs |  Read more:

Anne Penman Sweet, Ice Road
via:

Lois Dodd
via:

Our Parallel, Secret Justice System

Just a quick update on a big piece of news that came through yesterday. In one of the more severe judicial ass-whippings you’ll ever see, federal Judge Jed Rakoff rejected a slap-on-the-wrist fraud settlement the SEC had cooked up for Citigroup.

I wrote about this story a few weeks back when Rakoff sent signals that he was unhappy with the SEC’s dirty deal with Citi, but yesterday he took this story several steps further.

Rakoff’s 15-page final ruling read like a political document, serving not just as a rejection of this one deal but as a broad and unequivocal indictment of the regulatory system as a whole. He particularly targeted the SEC’s longstanding practice of greenlighting relatively minor fines and financial settlements alongside de facto waivers of civil liability for the guilty – banks commit fraud and pay small fines, but in the end the SEC allows them to walk away without admitting to criminal wrongdoing.

This practice is a legal absurdity for several reasons. By accepting hundred-million-dollar fines without a full public venting of the facts, the SEC is leveling seemingly significant punishments without telling the public what the defendant is being punished for. This has essentially created a parallel or secret criminal justice system, in which both crime and punishment are adjudicated behind closed doors.  (...)

Judge Rakoff blew a big hole in that practice yesterday. His ruling says secret justice is not justice, and that the government cannot hand out punishments without telling the public what the punishments are for.

by Matt Taibbi, Rolling Stone | Read more:

Are Freeways Doomed?


Everyone freak out: Carmageddon is back. Right now, several U.S. cities are scheming to shut down major freeways — permanently. In the push to take back cities from cars, this is what you’d call throwing down the gauntlet.

The drive to tear down the huge freeways that many blame for the inner-city blight of the ’60s and ’70s is one of the most dramatic signs of the new urban order. Proponents of such efforts have data to show that freeway removal is not at all bizarre, that we can return to human-size streets without causing a gridlock apocalypse. And that may be true. But pulling down these shrines to the automobile also feels like a bold rewriting of America’s 20th-century urban script: Revenge of the Pedestrian. This time it’s personal.

Ready or not, decision time is upon us. Many of these highways were built to last between 40 and 50 years — they’ll soon need to be either repaired or reinvented. “What’s going to happen in the next 10 years when we need to make a big investment to prevent them from collapsing like the one in Minneapolis?” asks John Renne, professor of urban studies at the University of New Orleans.

For some cities, this means a once-in-a-lifetime opportunity to reclaim a vast amount of downtown land and turn it into the public space of their dreams. A group in St. Louis is agitating for the removal of a one-and-a-half-mile stretch of Interstate 70, which would reunite the city center with the Mississippi River and Eero Saarinen’s Gateway Arch. Advocates there hope that by opening the city’s “front door,” as they call it, for the first time since 1964, they’ll set the stage for a renaissance of St. Louis’ depopulated downtown. Trenton, N.J., has a similar goal, and is looking at converting the four-lane highway that runs along the Delaware River into a vibrant waterfront of parks and buildings. And as New Orleans implements a new master plan for the city following Hurricane Katrina, anything seems possible — including a pitch to tear down the Claiborne Expressway, the freeway that divided several of the city’s historically black neighborhoods when it was erected decades ago. It would be replaced with a lively boulevard that reunites those neighborhoods in an infrastructural act of poetic justice.

It’s hard to overstate the gravity of such proposals. Few urban design initiatives can instantly transform a large swath of a city like building (or unbuilding) a freeway. San Francisco saw this in 1991, when, ahead of the tear-down trend, the city demolished the bay-adjacent double-decker Embarcadero Freeway after it was damaged in an earthquake. Today, the area where the Embarcadero once stood has evolved from a forbidding dead zone to a bustling waterfront and tourist magnet. Standing there now, you’d never guess it was once the site of 16 lanes of through-traffic.

by Will Doig, Salon | Read more:
Illustration: iStockphoto/Diverstudio

The Literary Cubs


Rebecca Chapman, who has a master of arts in English and comparative literature from Columbia University, hit bottom professionally last summer when she could not even get a job that did not pay. Vying for an internship at a boutique literary agency in Manhattan, Ms. Chapman, 25, had gone on three separate interviews with three people on three different days. “They couldn’t even send me an e-mail telling me I didn’t get it,” she said. 

It’s a story familiar to anyone seeking to break into the New York publishing world. Willie Osterweil, 25, an aspiring novelist who graduated magna cum laude from Cornell in 2009, found himself sweeping Brooklyn movie theaters for $7.25 an hour. And the closest that Helena Fitzgerald, a recent Columbia graduate, got was an interview at a top magazine, during which the editor dismissed her literary career dreams, telling her, “C’mon, that’s not realistic.”

Which explains, in a way, how they all ended up on a crisp November night, huddled together at an invitation-only party at a cramped, bookshelved apartment on the Upper East Side.

It was the weekly meeting of The New Inquiry, a scrappy online journal and roving clubhouse that functions as an Intellectuals Anonymous of sorts for desperate members of the city’s literary underclass barred from the publishing establishment. Fueled by B.Y.O.B. bourbon, impressive degrees and the angst that comes with being young and unmoored, members spend their hours filling the air with talk of Edmund Wilson and poststructuralism.

Lately, they have been catching the eye of the literary elite, earning praise that sounds as extravagantly brainy as the thesis-like articles that The New Inquiry uploads every few days. 

by Alex Williams, NY Times | Read more:
Photo: Deidre Schoo for The New York Times

Secret Fed Loans to Banks Revealed

[ed. An astonishing story gleaned from over 29,000 pages of Fed documents obtained under the Freedom of Information Act, after a legal fight that went all the way to the US Supreme Court.]

The Federal Reserve and the big banks fought for more than two years to keep details of the largest bailout in U.S. history a secret. Now, the rest of the world can see what it was missing.

The Fed didn’t tell anyone which banks were in trouble so deep they required a combined $1.2 trillion on Dec. 5, 2008, their single neediest day. Bankers didn’t mention that they took tens of billions of dollars in emergency loans at the same time they were assuring investors their firms were healthy. And no one calculated until now that banks reaped an estimated $13 billion of income by taking advantage of the Fed’s below-market rates, Bloomberg Markets magazine reports in its January issue.

Saved by the bailout, bankers lobbied against government regulations, a job made easier by the Fed, which never disclosed the details of the rescue to lawmakers even as Congress doled out more money and debated new rules aimed at preventing the next collapse.

A fresh narrative of the financial crisis of 2007 to 2009 emerges from 29,000 pages of Fed documents obtained under the Freedom of Information Act and central bank records of more than 21,000 transactions. While Fed officials say that almost all of the loans were repaid and there have been no losses, details suggest taxpayers paid a price beyond dollars as the secret funding helped preserve a broken status quo and enabled the biggest banks to grow even bigger.  (...)

$7.77 Trillion

The amount of money the central bank parceled out was surprising even to Gary H. Stern, president of the Federal Reserve Bank of Minneapolis from 1985 to 2009, who says he “wasn’t aware of the magnitude.” It dwarfed the Treasury Department’s better-known $700 billion Troubled Asset Relief Program, or TARP. Add up guarantees and lending limits, and the Fed had committed $7.77 trillion as of March 2009 to rescuing the financial system, more than half the value of everything produced in the U.S. that year.

“TARP at least had some strings attached,” says Brad Miller, a North Carolina Democrat on the House Financial Services Committee, referring to the program’s executive-pay ceiling. “With the Fed programs, there was nothing.”

Bankers didn’t disclose the extent of their borrowing. On Nov. 26, 2008, then-Bank of America (BAC) Corp. Chief Executive Officer Kenneth D. Lewis wrote to shareholders that he headed “one of the strongest and most stable major banks in the world.” He didn’t say that his Charlotte, North Carolina-based firm owed the central bank $86 billion that day.

by Bob Ivry, Bradley Keoun and Phil Kuntz, Bloomberg | Read more:

Wednesday, November 30, 2011

The Personal Computer Is Dead

The PC is dead. Rising numbers of mobile, lightweight, cloud-centric devices don't merely represent a change in form factor. Rather, we're seeing an unprecedented shift of power from end users and software developers on the one hand, to operating system vendors on the other—and even those who keep their PCs are being swept along. This is a little for the better, and much for the worse.

For decades we've enjoyed a simple way for people to create software and share or sell it to others. People bought general-purpose computers—PCs, including those that say Mac. Those computers came with operating systems that took care of the basics. Anyone could write and run software for an operating system, and up popped an endless assortment of spreadsheets, word processors, instant messengers, Web browsers, e-mail, and games. That software ranged from the sublime to the ridiculous to the dangerous—and there was no referee except the user's good taste and sense, with a little help from nearby nerds or antivirus software. (This worked so long as the antivirus software was not itself malware, a phenomenon that turned out to be distressingly common.)

Choosing an OS used to mean taking a bit of a plunge: since software was anchored to it, a choice of, say, Windows over Mac meant a long-term choice between different available software collections. Even if a software developer offered versions of its wares for each OS, switching from one OS to another typically meant having to buy that software all over again.

That was one reason we ended up with a single dominant OS for over two decades. People had Windows, which made software developers want to write for Windows, which made more people want to buy Windows, which made it even more appealing to software developers, and so on. In the 1990s, both the U.S. and European governments went after Microsoft in a legendary and yet, today, easily forgettable antitrust battle. Their main complaint? That Microsoft had put a thumb on the scale in competition between its own Internet Explorer browser and its primary competitor, Netscape Navigator. Microsoft did this by telling PC makers that they had to ensure that Internet Explorer was ready and waiting on the user's Windows desktop when the user unpacked the computer and set it up, whether the PC makers wanted to or not. Netscape could still be prebundled with Windows, as far as Microsoft was concerned. Years of litigation and oceans of legal documents can thus be boiled down into an essential original sin: an OS maker had unduly favored its own applications.

When the iPhone came out in 2007, its design was far more restrictive. No outside code at all was allowed on the phone; all the software on it was Apple's. What made this unremarkable—and unobjectionable—was that it was a phone, not a computer, and most competing phones were equally locked down. We counted on computers to be open platforms—hard to think of them any other way—and understood phones as appliances, more akin to radios, TVs, and coffee machines.

Then, in 2008, Apple announced a software development kit for the iPhone. Third-party developers would be welcome to write software for the phone, in just the way they'd done for years with Windows and Mac OS. With one epic exception: users could install software on a phone only if it was offered through Apple's iPhone App Store. Developers were to be accredited by Apple, and then each individual app was to be vetted, at first under standards that could be inferred only through what made it through and what didn't. For example, apps that emulated or even improved on Apple's own apps weren't allowed.

The original sin behind the Microsoft case was made much worse. The issue wasn't whether it would be possible to buy an iPhone without Apple's Safari browser. It was that no other browser would be permitted—or, if permitted, it would be only through Apple's ongoing sufferance. And every app sold for the iPhone would have 30 percent of its price (and later, that of its "in-app purchases") go to Apple. Famously proprietary Microsoft never dared to extract a tax on every piece of software written by others for Windows—perhaps because, in the absence of consistent Internet access in the 1990s through which to manage purchases and licenses, there'd be no realistic way to make it happen.

Fast forward 15 years, and that's just what Apple did with its iOS App Store.

by Jonathan Zittrain, MIT Technology Review | Continue reading:
Image:  Wikipedia