Tuesday, October 18, 2011

The Great Tech War Of 2012


by Farhad Manjoo, Fast Company

To state this as clearly as possible: The four American companies that have come to define 21st-century information technology and entertainment are on the verge of war. Over the next two years, Amazon, Apple, Facebook, and Google will increasingly collide in the markets for mobile phones and tablets, mobile apps, social networking, and more. This competition will be intense. Each of the four has shown competitive excellence, strategic genius, and superb execution that have left the rest of the world in the dust. HP, for example, tried to take a run at Apple head-on, with its TouchPad, the product of its $1.2 billion acquisition of Palm. HP bailed out after an embarrassingly short 49-day run, and it cost CEO Léo Apotheker his job. Microsoft's every move must be viewed as a reaction to the initiatives of these smarter, nimbler, and now, in the case of Apple, richer companies. When a company like Hulu goes on the block, these four companies are immediately seen as possible acquirers, and why not? They have the best weapons--weapons that will now be turned on one another as they seek more room to grow.

There was a time, not long ago, when you could sum up each company quite neatly: Apple made consumer electronics, Google ran a search engine, Amazon was a web store, and Facebook was a social network. How quaint that assessment seems today.  (...)

Amazon, Apple, Facebook, and Google don't recognize any borders; they feel no qualms about marching beyond the walls of tech into retailing, advertising, publishing, movies, TV, communications, and even finance. Across the economy, these four companies are increasingly setting the agenda. Bezos, Jobs, Zuckerberg, and Page look at the business world and justifiably imagine all of it funneling through their servers. Why not go for everything? And in their competition, each combatant is getting stronger, separating the quartet further from the rest of the pack.

Everyone reading this article is a customer of Amazon, Apple, Facebook, or Google, and most probably count on all four. This passion for the Fab Four of business is reflected in the blogosphere's panting coverage of their every move. ExxonMobil may sometimes be the world's most valuable company, but can you name its CEO? Do you scour the Internet for rumors about its next product? As the four companies encroach further and further into one another's space, consumers look forward to cooler and cooler products. The coming years will be fascinating to watch because this is a competition that might reinvent our daily lives even more than the four have changed our habits in the past decade. And that, dear reader, is why you need a program guide to the battle ahead.

Read more:

Image: From left: The late Apple cofounder Steve Jobs, Facebook CEO Mark Zuckerberg, Google CEO Larry Page, and Amazon CEO Jeff Bezos. | Photos courtesy of David Paul Morris/Getty Images (Jobs); Justin Sullivan/Getty Images (Zuckerberg); Chip East/Reuters (Page); Mario Tama/Getty Images (Bezos).

Quantum Levitation

[ed. For Nate.]


Article:  Were hoverboards real?

Slow Club


Monday, October 17, 2011

DVA

[ed. As I understand it: suppose the equity in your house declined by 25 percent; you then turn around and claim you actually made a 25 percent profit because now you can buy it back at a 25 percent discount (some time in the future). Bankers...]

by Joe Weisenthal, Business Insider

JPMorgan Q3 earnings have come in better than expected, at $1.02 per share.

But it's a bank report, so we're going to need to look deeper to see if it's good.

The first red flag is the Debt Valuation Adjustment: The company booked a big gain BECAUSE the market value of its own bonds fell significantly, meaning that, technically, on an accounting basis, the company's equity jumped. Read an explanation here.
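To make the mechanics concrete, here is a minimal sketch with made-up round numbers (not JPMorgan's actual positions): when a firm carries its own debt at fair value, widening credit spreads push the market price of its bonds down, the liability is marked down with them, and the difference flows through earnings as a gain.

# Minimal sketch of a Debt Valuation Adjustment, using hypothetical numbers.
# Assumes the firm has elected to carry its own debt at fair value.

face_value = 10_000_000_000    # $10B of the firm's own bonds outstanding
price_before = 1.00            # trading at par before credit spreads widen
price_after = 0.81             # trading at 81 cents after spreads widen

liability_before = face_value * price_before
liability_after = face_value * price_after

# The liability is marked down, and the difference is booked as income,
# even though nothing about the firm's operations has improved.
dva_gain = liability_before - liability_after
print(f"DVA 'gain' booked: ${dva_gain / 1e9:.2f}B")  # DVA 'gain' booked: $1.90B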

Here's their commentary:

Jamie Dimon, Chairman and Chief Executive Officer, commented: “The Firm reported third-quarter net income of $4.3 billion, representing a 13% return on tangible common equity. It is notable that these results included several significant items, including a $542 million pretax loss in Private Equity, $1.0 billion pretax of additional litigation expense in Corporate and a $1.9 billion pretax DVA gain. The DVA gain reflects an adjustment for the widening of the Firm’s credit spreads which could reverse in future periods and does not relate to the underlying operations of the company. All things considered, we believe the Firm’s returns were reasonable given the current environment.”

Back to School, Ryan Conners
via:

The Right to Die

by Alex Lickerman, M.D., Psychology Today

The notion that dying is a right seems nonsensical to argue: death is given to all of us equally without the need of anyone's sanction. The right to die well, on the other hand—well, that's another matter entirely. A good death is, in many cases, something our fellow human beings have great power to grant or deny, and is therefore, sadly, a right for which we must indeed fight.

The notion that we'd even need to fight for the right to die well has only come to make sense relatively recently, within the last forty years or so. Prior to that, our ability to prolong dying—meaning, keep extremely ill people going in hopes that they might overcome whatever health problem threatens even when the likelihood is vanishingly small—was actually fairly limited. But with the advent of modern intensive care units and all the amazing technology that's emerged in the last four decades, we can now often stretch out the quantity of our last days to weeks or even months. Unfortunately, a similar stretching of quality hasn't yet occurred; if anything, we see the opposite (to be fair, the same technology also stretches some lives to years and even decades, meaning it's enabled some people to recover from insults that in the past would have undoubtedly killed them).

Health providers don't wield this technology to prolong suffering intentionally. As I argued in a previous post, Knowing When To Stop, it's quite difficult to predict the timing of death, even in the terminally ill. In one sense, then, the horrific deaths many patients experience at the hands of modern medicine reflect our species' profound optimism bias. Even when in our hearts we know it's time to stop, we often don't.

Read more:

Europe: Just Getting Warmed Up

[ed. Pretty troubling assessment from John Hussman, one of the sharpest mutual fund managers in the business (in my opinion).]

by John Hussman

From my perspective, Wall Street's "relief" about the economy, and its willingness to set aside recession concerns, is a mistake born of confusion between leading indicators and lagging ones. Leading evidence is not only clear, but on a statistical basis is essentially certain that the U.S. economy, and indeed, the global economy, faces an oncoming recession. As Lakshman Achuthan notes on the basis of ECRI's own (and historically reliable) set of indicators, "We've entered a vicious cycle, and it's too late: a recession can't be averted." Likewise, lagging evidence is largely clear that the economy was not yet in a recession as of, say, August or September. The error that investors are inviting here is to treat lagging indicators as if they are leading ones.

The simple fact is that the measures that we use to identify recession risk tend to operate with a lead of a few months. Those few months are often critical, in the sense that the markets can often suffer deep and abrupt losses before coincident and lagging evidence demonstrates actual economic weakness. As a result, there is sometimes a "denial" phase between the point where the leading evidence locks onto a recession track, and the point where the coincident evidence confirms it. We saw exactly that sort of pattern prior to the last recession. While the recession evidence was in by November 2007 (see Expecting A Recession ), the economy enjoyed two additional months of payroll job growth, and new claims for unemployment trended higher in a choppy and indecisive way until well into 2008. Even after Bear Stearns failed in March 2008, the market briefly staged a rally that put it within about 10% of its bull market high. 

The "Last Place Aversion" Paradox



If ever Americans were up for a bit of class warfare, now would seem to be the time. The current financial downturn has led to a $700 billion taxpayer-financed bank bailout and an unemployment rate stuck stubbornly above nine percent. Onto this scene has stepped the Occupy Wall Street (OWS) movement, which seeks to bring together a disparate group of protesters united in their belief that the current income distribution is unfair. “The one thing we all have in common is that We are the 99% that will no longer tolerate the greed and corruption of the 1%,” says their website. In an era of bank bailouts and rising poverty – and where recent data show that the top 1 percent control as much as 35 percent of the total wealth in America – it would appear that the timing of this movement to reconsider the allocation of wealth could not be more perfect.

Or, maybe not.

Support for redistribution, surprisingly enough, has plummeted during the recession. For years, the General Social Survey has asked individuals whether “government should reduce income differences between the rich and the poor.” Agreement with this statement dropped dramatically between 2008 and 2010, the two most recent years of data available. Other surveys have shown similar results.

What might explain this trend? First, the change is not driven by wealthy white Republicans reacting against President Obama’s agenda: the drop is if anything slightly larger among minorities, and Americans who self-identify as having below average income show the same decrease in support for redistribution as wealthier Americans.

Our recent research suggests that, far from being surprised that many working-class individuals would oppose redistribution, we might actually expect their opposition to rise during times of turmoil – despite the fact that redistribution appears to be in their economic interest. Our work suggests that people exhibit a fundamental loathing for being near or in last place – what we call “last place aversion.” This fear can lead people near the bottom of the income distribution to oppose redistribution because it might allow people at the very bottom to catch up with them or even leapfrog past them.

Read more:
Image: David Shankbone

Amazon Rewrites the Rules of Publishing

by David Streitfeld

Amazon.com has taught readers that they do not need bookstores. Now it is encouraging writers to cast aside their publishers.

Amazon will publish 122 books this fall in an array of genres, in both physical and e-book form. It is a striking acceleration of the retailer’s fledgling publishing program that will place Amazon squarely in competition with the New York houses that are also its most prominent suppliers.

It has set up a flagship line run by a publishing veteran, Laurence Kirshbaum, to bring out brand-name fiction and nonfiction. It signed its first deal with the self-help author Tim Ferriss. Last week it announced a memoir by the actress and director Penny Marshall, for which it paid $800,000, a person with direct knowledge of the deal said.

Publishers say Amazon is aggressively wooing some of their top authors. And the company is gnawing away at the services that publishers, critics and agents used to provide.

Several large publishers declined to speak on the record about Amazon’s efforts. “Publishers are terrified and don’t know what to do,” said Dennis Loy Johnson of Melville House, who is known for speaking his mind.

“Everyone’s afraid of Amazon,” said Richard Curtis, a longtime agent who is also an e-book publisher. “If you’re a bookstore, Amazon has been in competition with you for some time. If you’re a publisher, one day you wake up and Amazon is competing with you too. And if you’re an agent, Amazon may be stealing your lunch because it is offering authors the opportunity to publish directly and cut you out.”

Read more:
graphic: via NY Times and Scott Eells/Bloomberg News

Flow


[ed. Quite hypnotic.  Click graphic to start playing.]

by Drym Shyuan

Fl0w is calm, atmospheric, entertaining, challenging and fun. It's a game I recently stumbled upon and it has completely won my heart. In this game you control an organism with basic structure. You move it in a world that looks like an ocean. The main objective is to feed on smaller organisms and grow bigger. But this is where the game gets interesting.

In fl0w, you discover a massive ecosystem that doesn't just consist of basic, small organisms for you to feast upon; it also has some enormous and hungry AI-controlled creatures that will do everything in their power to hunt you down. The world itself is separated into layers, and your objective is to become the largest creature in the ecosystem by completely consuming the final boss in the lowest depth - The Abyss, as I like to call it, for it is pure darkness. To move between the layers, you have to consume a flashing creature. There are two kinds of these harmless organisms: a blue one and a red one. The blue one will lift you to the layer above, while the red one will send you deeper toward the abyss.

A feature I like a lot about this simple Flash game is the artificial intelligence. The developer has created a masterpiece, in my opinion. In fl0w, you will encounter various AI-controlled organisms, some that look basic and others very complex. Most of the time, every organism is white, meaning its attitude is peaceful. But should any of them get angry, it will turn orange, which means it's time to run. Fortunately, this doesn't last long: once you have evaded the hostile organism, it will lose any interest in hunting you and turn white again. And should you take a bite out of another organism, it may turn blue, which means it's scared and it's time to hunt it down and consume it completely.
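That color-coded behavior is essentially a small state machine. Here is a rough sketch of how it might look, based purely on my reading of the game; the class, state names, and timer value are hypothetical illustrations, not the developer's actual code.

# A rough sketch of the creature behavior described above (my own reading
# of fl0w, not the developer's code). Names and numbers are hypothetical.
from enum import Enum, auto

class Mood(Enum):
    PEACEFUL = auto()  # drawn white: ignores the player
    HOSTILE = auto()   # drawn orange: chases the player for a while
    SCARED = auto()    # drawn blue: flees after being bitten

class Creature:
    def __init__(self):
        self.mood = Mood.PEACEFUL
        self.hostile_timer = 0

    def on_provoked(self):
        # An angered creature turns orange and hunts the player,
        # but only for a limited number of ticks.
        self.mood = Mood.HOSTILE
        self.hostile_timer = 300

    def on_bitten(self):
        # Taking a bite out of a creature can leave it scared (blue).
        self.mood = Mood.SCARED

    def tick(self):
        # Hostility wears off once the player has evaded the creature.
        if self.mood is Mood.HOSTILE:
            self.hostile_timer -= 1
            if self.hostile_timer <= 0:
                self.mood = Mood.PEACEFUL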

For more details and additional screenshots:

Read more:

But, Siri-ously

by Geoffrey A. Fowler

Now even your phone talks back.

One of the top draws to Apple's iPhone 4S is its new speech recognition software, called Siri, that's designed to talk back.

Matt Legend Gemmell, a software designer from Edinburgh, got a new Apple Inc. iPhone on Friday and asked it: "Who's your daddy?"

"You are," the phone answered, in the voice of an authoritative man.

Earlier, he commanded: "Beam me up." This time, the iPhone responded: "Sorry, Captain, your tricorder is in Airplane Mode."

The real science of artificial intelligence is finally catching up to science fiction. HAL 9000, the creepy sentient computer from the movie "2001: A Space Odyssey," has been incarnated, in the form of Siri, a virtual personal assistant that comes with Apple's new iPhone 4S, which arrived in stores Friday.

Real humans are responding to this alarming breakthrough by asking their iPhones ridiculous questions.

The good news is, Siri has a sense of humor.

Micah Gantman, the director of mobile business at software firm HasOffers.com in Seattle, asked his iPhone: "How much wood would a woodchuck chuck if a woodchuck could chuck wood?" It answered: "Depends if you're talking about African or European wood."

Nicky Kelly, a 40-year-old from Suffolk, U.K., asked her iPhone: "Tell me a joke." It answered: "Two iPhones walk into a bar...I forget the rest."

There are already websites to collect some of Siri's best material, including one called "S— That Siri Says." Some of the responses appear to be pre-programmed.

Google Inc. is in on the AI joke, too, with its smartphone and search technology. After 13 years of research, some of the world's smartest engineers have created algorithms able to answer questions such as "What's that movie that's backwards and the guy can't remember anything?" (Answer: "Memento.")

Hold a Google Android phone up to your mouth and ask "What's the answer to life, the universe, and everything?" It will answer, in text on the screen, "42," a reference to the favorite geek book "The Hitchhiker's Guide to the Galaxy."

A lot of work went into so much artificial sarcasm.

The creators of Siri put "deep thought" into the personality of their software, says Norman Winarsky, a co-founder of the company that was bought by Apple for $200 million in 2010. Siri was born out of an artificial intelligence project at SRI International, a research institute. 

Read more:

Sunday, October 16, 2011


by Felice Casorati
via:

Cultural Faux Pas in New York City

  • Don’t say you’re “from New York” when you’re from New Jersey or Long Island. There are very nice parts of New Jersey and Long Island; some very nice people live there. But this is not Boston - you don’t get to say you’re “from New York City” if you’re from slightly outside it. If your prevarication is discovered, this is a quick route to contempt.
  • Never ever ever EVER refer to the city as “the Big Apple.” If you say this, you are a tourist, and a clueless one at that. Using the phrases “only in New York!” and “a New York minute” falls in the same category, but they may be used, sparingly, by long-time residents, with a heavy dose of irony.
  • Don’t refer to the subway lines by their color. Instead, refer to them by their numbers and letters - e.g. it’s not the “Green Line,” it’s the “4, 5, 6.” When referring to a specific service along that line, each is called a “train,” rather than a “subway” - e.g. the “6 train,” not the “6 subway.” When referring to the entire system, it’s the “subway” - not the “Metro,” the “Underground,” etc.
  • Don’t wear “I Heart NY” t-shirts, or indeed any article of clothing that mentions New York in any capacity, with the exception of gear supporting a sports team.
  • If there is a wait for something or a bottleneck, don’t mob it - form a line. And when a line has been formed do NOT try to cut it. Seriously. This is for your own health.
  • When you get on a bus or step up to a subway turnstile, have your change or MetroCard ready. There’s a special circle of hell devoted to people who waste 20 seconds of everyone else’s time with their fumbling.
  • Don’t ask people where you can find good “New York Pizza.” In New York, it’s just called pizza - most New Yorkers don’t even know “New York Pizza” is a thing outside New York, or that there is a “New York-style” (see Where can you get New York-style Pizza in London? and its ilk). Just go to the local corner pizza shop and help yourself; I promise it’ll have “New York-style pizza” unless it says very explicitly otherwise.
  • Corollary to the above - do not say you prefer Chicago, New Haven or (God help you) California pizza. This is a direct route to a heated argument. 

Parking Space, Jen Hsieh
via:

Magical Thinking


by Ben Tarnoff

Fiction rarely influences politics anymore, either because fewer people read it or because it has fewer things to say. Yet novels have affected America in large and unsubtle ways: Uncle Tom’s Cabin and The Jungle shaped the contours of the national current no less profoundly than our periodic wars and bank panics. More recently, Ayn Rand’s tales of triumphant individualism, Atlas Shrugged and The Fountainhead, inspired a resilient strain of free-market fundamentalism that continues to color our economic life. A Russian immigrant who adored her adopted country, Rand strove to become American in all things, and in the process became an especially American sort of storyteller: the kind whose stories are a means to a social or political end. It’s an honored tradition in American writing, one that acquits fiction of its perennial charge of uselessness by making it practical, identifying problems and offering solutions—pragmatic books for the purpose of the country’s self-improvement.

Few novels have sought to improve America as radically as Edward Bellamy’s bestseller Looking Backward, 2000-1887, published in 1888. Bellamy, like Rand, used fiction to popularize a philosophy, and with comparable results: Looking Backward sold nearly half a million copies in its first decade and appeared in several languages around the world. The book found many prominent admirers, among them Mark Twain and William Jennings Bryan—the latter borrowed the language of his Cross of Gold speech from the novel’s final chapter. It inspired a political movement called Nationalism and energized generations of American progressives, from populists to New Dealers. More than a century later, it remains an indispensable zeitgeist book, an X-ray of the American body politic during the violent creation of modern industrial society, when many different futures felt possible.

Looking Backward is the story of Julian West, a wealthy young Bostonian who enters a hypnotist’s trance on May 30, 1887, and wakes up 113 years, three months, and eleven days later. Dr. Leete, an articulate citizen of the new century, greets him and explains, in the first of many leaps of logic required of the reader, that the hypnosis has perfectly preserved West’s body. He hasn’t aged an hour; Boston, however, has changed nearly beyond recognition. When West goes outside, he glimpses an idyll of tree-lined streets and majestic public buildings whose only familiar features are the Charles River and the islands of the harbor, gleaming through air clear of coal smoke.

The following chapters consist mainly of expositive conversations between West and Dr. Leete—a technique beloved by science-fiction authors then and since—as the newcomer struggles to understand the enormous changes that have taken place since the nineteenth century. America is now a nearly perfect society. Prosperity is evenly distributed, people are highly educated, and crime, corruption, and poverty have disappeared. Humanity has reached “a new plane of existence.” Most miraculous is how this rebirth came about: through a bloodless social evolution, beginning in the late nineteenth century and concluding in the early twentieth. The industrial trusts of the 1880s were the first phase: they simply continued to consolidate until all of the country’s capital became the Great Trust—a single corporation, nationalized for the public benefit.

Predictably, West is skeptical. “Human nature itself must have changed very much,” he says. Not human nature, Dr. Leete replies, but “the conditions of human life,” as governed by the great social mechanism whose elaborate workings he spends the rest of the book patiently describing. As the nation is now the only employer, its citizens are its employees. For a term of twenty-four years, from ages twenty-one to forty-five, they serve in an “industrial army” that runs the economy. New recruits begin as common laborers before they select a profession; testing and training ensure that each finds the vocation best suited to his abilities. Compensation remains the same regardless of productivity, even for those too weak to work. This sum isn’t paid in dollar bills, but in nontransferable units allotted to each citizen’s “credit card,” exchangeable for clothing, food, and other necessities from state-run stores. Better workers are rewarded with promotion through the officer grades, ascending through a chain of command that culminates in the president of the United States. These positions confer prestige but hold little power. The system works like a perpetual-motion machine, with a minimum of human intervention.  (...)

Predictions of the future always carry the imprint of their present, which makes them useful for understanding the past. Science fiction in particular tends to betray its age as visibly as tree trunks or residual radiocarbon. An author may imagine inventions that fail to appear—pneumatic mail, for instance—but his deeper assumptions about the parameters of the possible are what date his work most strongly.

Looking Backward reads very differently today than it did when Bellamy wrote it. Its utopian premise strains under the sobering weight of the twentieth century—the real twentieth century, the century of Hitler, Stalin, and Mao. The future produced nearly the opposite of what Bellamy had hoped. Large-scale social engineering led not only to utopian dreams but to genocides. Technology brought a better quality of life, but also made it easier to kill large numbers of people, whether with nuclear bombs or global warming. Bellamy expected moral progress to accelerate at the same rapid rate as science and industry; the last hundred years have made him look naive, even dystopian.

Yet what appears irrevocable in retrospect was anything but certain in 1888, and a vision that is eerily totalitarian today struck many Americans then as a plausible blueprint for a brighter tomorrow. Bellamy wasn’t blind to the inhumane aspects of modern industrial life. On the contrary: his book achieved popularity because it offered an elegant solution to the crisis of the late nineteenth century, when radically new forces remade a nation shattered by war, and set the tone for the social landscape we still live in.

Read more:
Illustration: Looking Backward, 2000-1887, by Edward Bellamy