Tuesday, December 27, 2011

Platform Wars


Want to learn MBA management skills and strategies for free? Thanks to "Platform Wars," a simulation game created by MIT’s Sloan School of Management, anyone can pick up elements of a business school education online by playing the role of an executive at a video game console manufacturer.

The simulator has been used for the past four years in business management classes taught by professor John Sterman. A user playing an executive at Nintendo, for example, might be tasked with figuring out how to help the Wii beat out competition from Microsoft's Xbox. The ultimate goal is to strategize against your competitor to maximize cumulative profit over 10 years. The player has to make all the applicable decisions to win the market—everything from setting the price of the console to determining the royalties video game makers will pay for the right to produce games for the platform.
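[ed. MIT hasn't published the simulator's internal model, but the flavor of the trade-off is easy to sketch. The toy Python model below, with entirely invented numbers, shows the feedback loop the game is built around: price and royalty decisions grow (or starve) the installed base, the installed base attracts game developers, and the game library feeds back into console demand.]

# Toy sketch only -- hypothetical numbers, not the Sloan simulator's model.
def simulate(price, royalty, years=10):
    """One console maker's cumulative profit over a ten-year platform war."""
    installed_base, games, profit = 0.0, 0.0, 0.0
    unit_cost = 250.0  # assumed cost to build each console
    for _ in range(years):
        # Network effect: a bigger game library attracts more console buyers.
        demand = max(0.0, (400 - price) * 10_000 * (1 + 0.001 * games))
        installed_base += demand
        # Developers write more games when the base is big and royalties are low.
        games += installed_base / 1_000_000 * (0.20 - royalty) * 100
        # Hardware margin plus royalty income on games sold for the platform.
        profit += demand * (price - unit_cost) + games * royalty * 500_000
    return profit

# Experiment: does a cheap console with low royalties beat a high-margin
# strategy once the feedback loops kick in?
print(simulate(price=260, royalty=0.07))
print(simulate(price=350, royalty=0.15))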

“Platform Wars” proved to be so popular at the business school that in late November, MIT—the home of the renowned OpenCourseWare program—decided to make the simulator available to the public on the MIT Sloan Teaching Innovation Resources website. Users can play as an individual or as a class. To fully equip gamers, Sterman is also providing free case studies and video explanations for both students and teachers.

Platform markets “are increasingly common in settings besides video games,” Sterman says, so the skills users can learn through Platform Wars are “applicable in many markets.” Figuring out how to ensure your product’s price, features, and complementary products stay competitive is in every business’s best interest. After all, we all know what happened in the real-world platform war between VHS and Betamax.

by Liz Dwyer, Good |  Read more:

Why We Make Bad Decisions


What role do our surroundings play in the choices we make? Consider that we are more likely to commit a “random” act of kindness toward a person who has already done something kind for us, that we are less likely to help someone in serious trouble when we’re in a crowd, and that we even choose professions based on the sound and spelling of our first names. It turns out the context in which we make our decisions has a huge impact on their outcomes.

In his new book “Situations Matter: Understanding How Context Transforms Your World,” author Sam Sommers, an associate professor of psychology at Tufts University, looks at what context can teach us about everything from test questions to romantic partners to career choices. Sommers offers a fascinating glimpse into the way our most important judgments are framed by the world around us.

Salon spoke with Sommers over the phone about Occupy Wall Street, online dating and Penn State’s Joe Paterno riot.

In the book you argue that this perception that, as you describe it, “What you see is what you get” is flawed and dangerous. Why are judgments based on first impressions misguided?

It’s our default assumption. It’s our fallback, automatic assumption about other people. It serves us well in a lot of respects. It makes the world a more predictable place. It allows us to make predictions about the world. But a variety of different research over the past few decades shows that this automatic judgment is a cognitive cutting of corners. It doesn’t give an accurate perspective on how human nature works. One of the really good examples is the quickness with which we turn to the “bad apple” explanation. When we read about bad behavior, whether it’s people committing crimes, rioting, etc., we immediately assume that that person is a bad apple, that we would never do something like that. It makes us feel better about ourselves at the end of the day, but it keeps us from solving some of the root issues at the heart of human nature.

by Hannah Tepper, Salon |  Read more:
Illustration: VLADGRIN via Shutterstock

Rejoice for Utopia is Nigh!

One hundred years ago an American immigrant invented science fiction.

Okay, that’s not true. Not even close. People have been building fantastic narratives out of scientific gobbledygook since the days of the Greeks. Lucian of Samosata imagined a trip to the moon over 17 centuries before Jules Verne took a whack at it. And decades before 1911, Verne and H.G. Wells wrote the stories that established the contours of the genre: fantastic voyages in space and time, alien encounters, technology run amok, and so forth. The term “science fiction” wouldn’t even be invented until 1929. But the genre as a coherent field of literary endeavour—as the thing that takes up a whole wall at your local Barnes & Noble or Waterstone’s—might not have come to be if it weren’t for a failed inventor-turned-publisher with aesthetic ambitions. Naive, utopian and romantic, a man named Hugo Gernsback ended up establishing a new strand of science fiction, one that helped shape (and was shaped by) the American century.

Gernsback had come to America in 1904 with the common immigrant dream of striking it rich. He planned to revolutionise battery technology, but when that didn’t pan out he turned to scientific-magazine publishing. He started out with mail-order catalogues for his imported radio-equipment business, but, as the years went on, his efforts took a more explicitly literary turn. Amazing Stories, which he founded in 1926, has a fair claim to being the first magazine dedicated solely to what he called “scientifiction”. It would go on to help define the genre, publishing the debuts of some of its greatest authors. The ever-expanding community of science-fiction readers and fans was so grateful it named its highest honour after him; there isn’t a science-fiction writer from Asimov to Zelazny who hasn’t coveted a Hugo trophy.

But in 1911 all that lay in the future—a topic which, to be fair, was something Gernsback was pretty interested in. As a young man of 27, he was witnessing a new century and a newly revitalised country all at once. America’s can-do spirit involved a gleeful embrace of technology (the trans-continental railroad! The wizard of Menlo Park: Thomas Edison! Henry Ford’s Model T!). New inventions, discoveries and achievements seemed to be rolling off the brand-new assembly line every day, and the factual articles of Modern Electrics, Gernsback’s magazine (its name a kind of romantic statement itself), were hardly capacious enough to contain the sense of possibility. And so he turned, diffidently, to fiction.

by Prospero, The Economist |  Read more:

Portlandia


A smugly enamored couple sit in a restaurant, their hands clasped as they fret over the menu. The chicken, for instance: can the waitress tell them a little bit about its provenance? Of course she can, because this is the kind of cool restaurant in Portland, Oregon, where patrons regularly seek elaborate assurances about the virtuousness of their food. The waitress informs the couple that the place serves only local, free-range, “heritage-breed, woodland-raised chicken that’s been fed a diet of sheep’s milk, soy, and hazelnuts.” But because the diners, Peter and Nance, are characters on “Portlandia”—a television comedy in which precious concerns spin into giddy lunacy—the conversation does not stop there. Peter, played by Fred Armisen, asks if the hazelnuts, too, are local. Nance, played by Carrie Brownstein, needs to know the size of the parcel of land where the chicken roamed freely. (Four acres.) The waitress excuses herself and returns to the table with a file folder and a photograph. “Here is the chicken you’ll be enjoying tonight,” she says, with therapeutic solemnity. “His name was Colin.” Peter seems appeased: “He looks like a happy little guy who runs around.” But then he wonders if the animal had “a lot of friends—other chickens as friends?” The waitress, who finds this a reasonable question, admits, “I don’t know that I can speak to that level of intimate knowledge about him.”

“Portlandia,” which débuted last winter, on the Independent Film Channel, and returns on January 6th, is the rare sketch-comedy series that has a sustained object of satire. It’s about life in hipster enclaves, and the self-consciousness that makes hipsters desperately disavow the label. Many of its characters are caught up in the prideful culture of D.I.Y. entrepreneurship, in which people reject office jobs in favor of becoming, say, an appliqué-pillow designer with a page on Etsy. (This season, a couple launch a business based on the catchphrase “We can pickle that!,” brining everything from eggs at an urban farm to a broken high heel found on the sidewalk.) “Portlandia” is an extended joke about what Freud called the narcissism of small differences: the need to distinguish oneself by minute shadings and to insist, with outsized militancy, on the importance of those shadings.

Brownstein, who is also one of the show’s writers and producers, told me, “In general, things in a place like Portland are really great, so little concerns become ridiculous. There are a lot of people here who can afford—financially but also psychologically—to be really, really concerned about buying local, for instance. It becomes mock epic. It’s like Alexander Pope’s ‘Rape of the Lock.’ I was standing in line at Whole Foods, and the guy in front of me says, ‘I really wish you guys sold locally made fresh pasta.’ And the cashier says, ‘Look, we do.’ And the guy says, ‘No, no—that’s from Seattle.’ Really? You don’t have a bigger battle?”

“Portlandia” presents a heightened version of the city’s twee urbanity: a company sells artisanal light bulbs, a hotel offers a manual typewriter to every guest, and a big local event is the Allergy Pride Parade. The mayor, played by Kyle MacLachlan, becomes an object of scandal when he’s “outed” as the bass guitarist in a middle-of-the-road reggae band. (The real Portland’s mayor, Sam Adams, who is openly gay, plays MacLachlan’s assistant on the show.) Armisen and Brownstein, wearing anthropologically precise wigs and outfits, portray most of the main characters: bicycle-rights activists, dumpster divers, campaigners against any theoretical attempt to bring the Olympics to Portland, animal lovers so out of touch that they free a pet dog tied up outside a restaurant. (“Who puts their dog on a pole like a stripper?”) Many characters recur, and, because they often seem to know one another, their intersections from sketch to sketch give the show the feel of a grownup “Sesame Street.” This childlike vibe has an edge to it, however; as an Armisen character explains at one point, Portland is “where young people go to retire.”

by Margaret Talbot, New Yorker |  Read more:
Photograph by Gabriele Stabile

Monday, December 26, 2011

What is the Greatest Invention?

Different writers at More Intelligent Life offer their own answers. Samantha Weinberg argues it is the Web. Edward Carr makes the case for the blade. Roger Highfield's candidate, the modern scientific method, is probably the answer I agree with most:
All great inventions rest on understanding how things work. And the greatest of all is the über-invention that has provided the insights on which other inventions depend: the modern scientific method, the realisation that we cannot grasp the way the world works by rational thought alone.
To gain meaningful insights into the scheme of things, logic has to be accompanied by asking probing questions of nature. To advance understanding, we need to devise rational conjectures and probe them to destruction through controlled tests, precise observations and clever analysis. The upshot is an unending dialogue between theory and experiment.
Unlike a traditional invention, the scientific method did not come into being at a particular time: its history is complex and stretches back long before 1833, when the term “scientist” was coined by the English polymath William Whewell. The method is not a concrete gadget like Gutenberg’s press, the computer or the Pill. Nor is it a brainwave like the non-geocentric universe, the Indo-Arab counting system or the theory of evolution. It is a fecund way of thinking on which the modern world rests. In relatively few generations, the rigorous application of the method has bootstrapped modern society through a non-linear accumulation of both knowledge and technology. Its impact on everyday life is ubiquitous and indisputable, even though a surprising number of people, including some senior politicians, have only a feeble grasp of its significance.
 via: 3 Quarks Daily and More Intelligent Life

Amazing Bamboo


[ed. One of the most versatile plants in the world, bamboo is classified as a grass and used for food, medicine, construction, furniture, textiles, paper, water processing, transportation, landscaping, and fishing.]

Bamboo is one of the fastest-growing plants on Earth, with reported growth rates of 100 cm (39 in) in 24 hours. However, the growth rate depends on local soil and climatic conditions as well as species, and a more typical rate for many commonly cultivated bamboos in temperate climates is in the range of 3-10 cm (1-4 in) per day during the growing period. During the Cretaceous period, vast fields of bamboo grew in the warmer regions of what is now Asia. Some of the largest timber bamboo can grow over 30 metres (98 ft) tall and reach 6-8 inches in diameter. However, the size range for mature bamboo is species dependent: the smallest bamboos reach only several inches high at maturity, while a typical height range covering many of the common bamboos grown in the United States is 15-40 feet, depending on species.

Bamboo, one of the “four gentlemen” (bamboo, orchid, plum blossom and chrysanthemum), plays such an important role in traditional Chinese culture that it is even regarded as a model for the behaviour of the gentleman. Because bamboo is upright, tenacious and hollow-hearted, people endow it with integrity, elegance and plainness, even though it is not physically strong. Ancient Chinese poets wrote countless poems praising bamboo, but they were really writing about people who resembled bamboo, expressing their understanding of what a true gentleman should be like. According to Laws, the ancient poet Bai Juyi (772-846) thought that to be a gentleman a man need not be physically strong, but he must be mentally strong: upright and persevering, and, just as bamboo is hollow-hearted, he should open his heart to accept whatever is beneficial and never harbour arrogance or prejudice.

Bamboo is not only a symbol of the gentleman; it also plays an important role in Buddhism, which was introduced into China in the first century. Because the canons of Buddhism forbid cruelty to animals, meat, eggs and fish were excluded from the diet. Yet people needed something nutritious to live on, and the tender bamboo shoot (called “sun” in Chinese) became a good choice: it is nutritious, and eating it violates no canon. Over thousands of years, the eating of bamboo shoots has become part of the cuisine, especially for monks. A Buddhist monk named Zan Ning wrote a manual of the bamboo shoot called “Sun Pu,” offering descriptions and recipes for many kinds of bamboo shoots. The bamboo shoot has always been a traditional dish on the Chinese dinner table, especially in southern China, and in ancient times anyone with the money to buy a big house with a yard would plant bamboo in the garden.

Bamboo is a necessary element of Chinese culture, and indeed of Asian civilization as a whole. People plant bamboo, eat bamboo shoots, paint bamboo, write poems about bamboo, and speak highly of gentlemen who are like bamboo. Bamboo is not only a plant but a part of people’s lives.

In Japan, a bamboo forest sometimes surrounds a Shinto shrine as part of a sacred barrier against evil. Many Buddhist temples also have bamboo groves.

via: Wikipedia
Photo: via

Sunday, December 25, 2011


Susan Brown “Orchid and Three Pears”
via:

Measuring the Human Pecking Order

Measuring power and influence on the web is a matter of huge interest. Indeed, algorithms that distill rankings from the pattern of links between webpages have made huge fortunes for companies such as Google.

One of the most famous of these is the Hyperlink-Induced Topic Search, or HITS, algorithm, which hypothesises that important pages fall into two categories--hubs and authorities: a good hub points to many good authorities, and a good authority is pointed to by many good hubs. This kind of thinking led directly to Google's search algorithm, PageRank.

The father of this idea is Jon Kleinberg, a computer scientist now at Cornell University in Ithaca, who has achieved a kind of cult status through this and other work. It's fair to say that Kleinberg's work has shaped the foundations of the online world.
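[ed. The hubs-and-authorities idea is simple enough to sketch in a few lines of Python. The snippet below, run on a made-up four-page web, is only an illustration of the iteration described above, not Kleinberg's code: authority scores are refreshed from the hub scores of incoming links, hub scores from the authority scores of outgoing links, and both are normalised each round until they settle.]

def hits(graph, iterations=50):
    """graph maps each page to the list of pages it links to."""
    pages = set(graph) | {p for links in graph.values() for p in links}
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # A page's authority score sums the hub scores of pages pointing to it...
        auth = {p: sum(hub[q] for q in graph if p in graph[q]) for p in pages}
        # ...and its hub score sums the authority scores of pages it points to.
        hub = {p: sum(auth[q] for q in graph.get(p, ())) for p in pages}
        # Normalise so the scores converge instead of growing without bound.
        for scores in (auth, hub):
            norm = sum(v * v for v in scores.values()) ** 0.5 or 1.0
            for p in scores:
                scores[p] /= norm
    return hub, auth

web = {"portal": ["nytimes", "salon"], "blog": ["nytimes"]}
hub, auth = hits(web)
# "portal" emerges as the strongest hub, "nytimes" as the strongest authority.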

Today, Kleinberg and a few pals put forward an entirely different way of measuring power and influence; one that may one day have equally far-reaching consequences.

by MIT Technology Review |  Read more:

The Muses of Insert, Delete and Execute

The literary history of the typewriter has its well-established milestones, from Mark Twain producing the first typewritten manuscript with “Life on the Mississippi” to Truman Capote famously dismissing Jack Kerouac’s “On the Road,” pounded out on a 120-foot scroll, with the quip “That’s not writing, that’s typing.”

The literary history of word processing is far murkier, but that isn’t stopping Matthew G. Kirschenbaum, an associate professor of English at the University of Maryland, from trying to recover it, one casual deletion and trashed document at a time.

Pay no attention to the neatly formatted and deceptively typo-free surfaces of the average Microsoft Word file, Mr. Kirschenbaum declared at a recent lunchtime lecture at the New York Public Library titled “Stephen King’s Wang,” a cheeky reference to that best-selling novelist’s first computer, bought in the early 1980s.

“The story of writing in the digital age is every bit as messy as the ink-stained rags that would have littered Gutenberg’s print shop or the hot molten lead of the Linotype machine,” Mr. Kirschenbaum said, before asking a question he hopes he can answer: “Who were the early adopters, the first mainstream authors to trade in their typewriters for WordStar and WordPerfect?”

The lecture was drawn from Mr. Kirschenbaum’s book “Track Changes: A Literary History of Word Processing,” which Harvard University Press is set to publish in 2013, or as soon as he can finish tapping it out on his iBuyPower 64-bit laptop, and on the vintage computers he has assembled at the university’s College Park campus, where he is also the associate director of the Maryland Institute for Technology in the Humanities. (...)

The study of word processing may sound like a peculiarly tech-minded task for an English professor, but literary scholars have become increasingly interested in studying how the tools of writing both shape literature and are reflected in it, whether it’s the quill pen of the Romantic poets or the early round typewriter, known as a writing ball, that Friedrich Nietzsche used to compose some aphoristic fragments. (“Our writing tools are also working on our thoughts,” Nietzsche typed.)

by Jennifer Schuessler, NY Times |  Read more:
Photo: Brendan Smialowski for The New York Times

Saturday, December 24, 2011

Feist



photo: markk

The Book of Books

The Bible is the model for and subject of more art and thought than those of us who live within its influence, consciously or unconsciously, will ever know.

Literatures are self-referential by nature, and even when references to Scripture in contemporary fiction and poetry are no more than ornamental or rhetorical — indeed, even when they are unintentional — they are still a natural consequence of the persistence of a powerful literary tradition. Biblical allusions can suggest a degree of seriousness or significance their context in a modern fiction does not always support. This is no cause for alarm. Every fiction is a leap in the dark, and a failed grasp at seriousness is to be respected for what it attempts. In any case, these references demonstrate that in the culture there is a well of special meaning to be drawn upon that can make an obscure death a martyrdom and a gesture of forgiveness an act of grace. Whatever the state of belief of a writer or reader, such resonances have meaning that is more than ornamental, since they acknowledge complexity of experience of a kind that is the substance of fiction. (...)

A number of the great works of Western literature address themselves very directly to questions that arise within Christianity. They answer to the same impulse to put flesh on Scripture and doctrine, to test them by means of dramatic imagination, that is visible in the old paintings of the Annunciation or the road to Damascus. How is the violence and corruption of a beloved city to be understood as part of an eternal cosmic order? What would be the consequences for the story of the expulsion from Eden, if the fall were understood as divine providence? What if Job’s challenge to God’s justice had not been overawed and silenced by the wild glory of creation? How would a society within (always) notional Christendom respond to the presence of a truly innocent and guileless man? Dante created his great image of divine intent, justice and grace as the architecture of time and being. Milton explored the ancient, and Calvinist, teaching that the first sin was a felix culpa, a fortunate fall, and providential because it prepared the way for the world’s ultimate reconciliation to God. So his Satan is glorious, and the hell prepared for his minions is strikingly tolerable. What to say about Melville? He transferred the great poem at the end of Job into the world of experience, and set against it a man who can only maintain the pride of his humanity until this world overwhelms him. His God, rejoicing in his catalog of the splendidly fierce and untamable, might ask, “Hast thou seen my servant Ahab?” And then there is Dostoyevsky’s “idiot” Prince Myshkin, who disrupts and antagonizes by telling the truth and meaning no harm, the Christ who says, “Blessed is he who takes no offense at me.”

Each of these works reflects a profound knowledge of Scripture and tradition on the part of the writer, the kind of knowledge found only among those who take them seriously enough to probe the deepest questions in their terms. These texts are not allegories, because in each case the writer has posed a problem within a universe of thought that is fully open to his questioning once its terms are granted. Here the use of biblical allusion is not symbolism or metaphor, which are both rhetorical techniques for enriching a narrative whose primary interest does not rest with the larger resonances of the Bible. In fact these great texts resemble Socratic dialogues in that each venture presupposes that meaning can indeed be addressed within the constraints of the form and in its language, while the meaning to be discovered through this argument cannot be presupposed. Like paintings, they render meaning as beauty.

by Marilynne Robinson, NY Times |  Read more:
Illustration by O.O.P.S

Friday, December 23, 2011


photo: markk

The Trouble with Scientific Secrets


[ed. This is a very big deal. Information sharing is a bedrock principle of scientific research and this is the first time a prohibition of this type has been requested.] 

In early September, the European Scientific Working Group on Influenza convened on Malta to hold its fourth conference. Researchers delivered a variety of technical reports on the state of influenza research and the prospects for vaccines. As was the case with the first three conferences, the world took little notice.

The data presented by one group, however, has so alarmed public-health officials throughout the world that yesterday the National Science Advisory Board for Biosecurity, a federal group established by the United States Department of Health and Human Services, asked the journals Science and Nature to refrain from publishing essential details of the research. It was the first time the group had made such a request. Officials said the report had implications for bioterrorism that were too obvious to ignore, and too powerful to make public.

The report in question involved avian influenza. At the conference, Ron Fouchier, a virologist from Erasmus Medical Center in Rotterdam, had announced that he and his colleagues had created a form of the H5N1 influenza virus—more commonly known as bird flu—that could pass easily among ferrets. Flu experts got the point instantly: ferrets are mammals; if they can be infected through the airborne transmission of H5N1, so, almost certainly, can we. This was the extremely bad news that the epidemiological world had been waiting for—but hoping never to hear—since avian influenza began to spread across Asia nearly a decade ago.

by Michael Specter, New Yorker |  Read more:
Photograph by Kin Cheung/AP Photo

Friday Book Club - The Marriage Plot

There was no predicting where Jeffrey Eugenides would go after his first two novels, so different were they in tone and form. “The Virgin Suicides” — humid, dreamlike, entranced — comes off as a single thought. “Middlesex,” a chatty multigenerational saga that winds its way from Turkey to Michigan to San Francisco to Berlin, sweeping together the burning of Smyrna, the rise and fall of Detroit, the immigrant experience, the Nation of Islam, the sins of Nixon and, of course, the lore and genetics of intersexuality, has as many moving parts as a Rube Goldberg machine. “The Virgin Suicides,” edged with antic wit and edging toward the surreal, glances in Nabokovian contempt at the petty preoccupations of “rangers and realists.” “Middlesex,” for all the novelty of its hermaphroditic protagonist, is straight-up realism, start to finish.

The books are far apart in quality, too. The language of “The Virgin Suicides” is taut and watchful from the first line, its mood a subtle synthesis of mystery and carnality. Like a myth, the novel imposes its own logic. In telling the story of five teenage sisters who kill themselves under the rapt gaze of the neighborhood boys, Eugenides showed a willingness to push to extremes, and the skill to bring it off once he got there. The book reminds me of Marilynne Robinson’s “Housekeeping,” another flaying first novel, both of them imagistically obsessive, spiritually uncompromising stories of water, light, death and girls.

You almost can’t believe the same person is responsible for “Middlesex.” Clanking prose, clunky exposition, transparent devices, telegraphed moves — the novel is “Midnight’s Children” without the magic, the intellect or the grand historical occasion, a hash of narrative contrivances with very little on its mind. In making these judgments, of course — the novel was a huge best seller and a Pulitzer Prize winner, to boot — I am joining a minority of perhaps no more than one. But I found the whole thing utterly unpersuasive. Take away its trendy theme and dollops of ethnic schmaltz (it could have been called “My Big Fat Greek Novel”), and “Middlesex” scarcely contains a single real character or genuine emotion.

“The Marriage Plot” is yet another departure — daylight realism, like “Middlesex,” but far more intimate in tone and scale. Instead of three generations, it presents us with three characters, college students leaving Brown in 1982, the year before Eugenides did: Madeleine Hanna, a beautiful, uncertain WASP; Leonard Bankhead, her sometime boyfriend, brilliant, brooding, charismatic, poor; and Mitchell Grammaticus, authorial surrogate, a Greek from Grosse Pointe, Mich., who yearns in alternation for Madeleine and God. The novel starts the day the three graduate, returns to college to give us the back story, then follows their first year out. Mitchell heads to Europe and India, seeking sanctity; the others keep house on Cape Cod, where Leonard works in a genetics lab and Madeleine applies to graduate school.

by William Deresiewicz, NY Times |  Read more:
Image via: The Hairpin

George Tooker, Government Bureau (1956)
via:

$100 Hand of Blackjack, Foxwoods Casino

I met Anthony in a poker game at the Diamond Club in New York City. He was fairly nondescript, just a normal everyday thirty-something white guy, business casual, head-down and putting-in-work in the pot-limit game. We were having a conversation around the table about blackjack. I had just made a comment about card counting when his head shot up.

“You count cards?” he asked me.

“A little,” I responded. I had no idea how to count cards. “Do you?”

“Do I!” He laughed.

It turns out Anthony, a finance industry flunky by day, had a small crew that hit Atlantic City and Foxwoods on the weekends and counted cards. He said a typical weekend haul was “nothing serious, maybe twenty or thirty grand.” It just so happened they were looking for some new talent, and would I like to go to Foxwoods with them for the weekend and give it a shot? It sounded like an adventure. The fact that I had no idea how to count cards never entered my mind before I enthusiastically agreed.

Card counting isn’t mathematically very complicated. You keep a running tally in your head of the high cards versus the low cards. Low cards add to the tally, high cards subtract from it. The higher the number, the more favorable the conditions for betting; the idea being that a shoe with a high concentration of high cards left in it will deal out more winning hands than one loaded with low cards. There’s more complexity to it than this, but that’s the basic gist. I went to the bookstore and bought a book on counting called “Blackjack for Blood.” I practiced on decks of cards at home. I thought I had it down. I felt like I was ready. Once again my overconfidence was not only unfounded but about to get me into trouble.
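[ed. For the curious, the hi-lo tally Hill describes fits in a few lines of Python. This is only an illustrative sketch of the scheme, not gambling advice; the “true count” adjustment at the end is one piece of the extra complexity he alludes to.]

# Hi-lo values: seeing low cards (2-6) dealt is good for the counter, since
# it means proportionally more tens and aces remain in the shoe.
HI_LO = {"2": 1, "3": 1, "4": 1, "5": 1, "6": 1,
         "7": 0, "8": 0, "9": 0,
         "10": -1, "J": -1, "Q": -1, "K": -1, "A": -1}

def running_count(cards_seen):
    """Tally the hi-lo value of every card dealt so far."""
    return sum(HI_LO[card] for card in cards_seen)

def true_count(cards_seen, decks_in_shoe=6):
    """Divide the running count by the decks left in the shoe: a +8 count
    with one deck remaining is far stronger than +8 with five remaining."""
    decks_left = max(decks_in_shoe - len(cards_seen) / 52, 0.5)
    return running_count(cards_seen) / decks_left

seen = ["5", "K", "2", "6", "9", "3"]  # +1 -1 +1 +1 +0 +1
print(running_count(seen), round(true_count(seen), 2))  # 3 0.51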

by David Hill, McSweeney's |  Read more:
Photo: Gammonish

Elwyn Lynn, Darkness at Noon, Mixed media, 175 x 175cm
via: