Monday, December 17, 2012

Daniel K. Inouye (September 7, 1924 – December 17, 2012)

Daniel K. Inouye died today of a respiratory ailment at a Bethesda, Md., hospital, ending a life of remarkable service to his country and Hawaii that included sacrificing his right arm in World War II combat and spending 50 years as a U.S. senator. He was 88.

The senator succumbed to respiratory complications at 5:01 p.m. Eastern time at the Walter Reed National Military Medical Center, where he had been hospitalized since Dec. 9. Inouye was first brought to George Washington University Hospital on Dec. 6 after fainting in a Senate office. He was transferred to Walter Reed three days later.

A statement from his office said that his wife, Irene Hirano Inouye, and his son Ken were at his side and that last rites were performed by Senate Chaplain Dr. Barry Black.

When he was asked recently how he wanted to be remembered, he said, "I represented the people of Hawaii and this nation honestly and to the best of my ability. I think I did OK," according to the statement.

His last words were, "Aloha."

"Senator Inouye's family would like to thank the doctors, nurses and staff at Walter Reed National Military Medical Center for the extraordinary care he received," the statement said.

Reaction to his death came swiftly from across the state and the nation.

“This keiki o ka aina, this child of Hawaii, has left us with a legacy I suspect we will never see again,” an emotional Gov. Neil Abercrombie said this afternoon.

Dante Carpenter, the chairman of the Democratic Party of Hawaii, said, "Our hearts are just full of grief, collectively as well as individually."

“We will all miss him, and that’s a gross understatement,” said Senate Majority Leader Harry Reid, D-Nev., who became emotional on the floor of the Senate. “No one’s been a better American than Sen. Inouye,” he said.

Inouye leaves an unparalleled legacy in Hawaii history — including Medal of Honor recipient, nine-term U.S. senator, and key figure in the Senate investigations of both the Watergate and Iran-Contra scandals. As the longest-serving member of the Senate, the Hawaii Democrat was president pro tempore — third in line to the presidency.

His death is a huge loss for Hawaii, which has come to rely on his decades of unwavering advocacy for the islands and his ability to direct billions of dollars in federal money to his home state. It was often said, only half-jokingly, that Hawaii had three major industries: tourism, the military, and Sen. Dan Inouye.

"He's long been known as a fierce protector of home-state interests," Christopher Deering, a political science professor at George Washington University in Washington, said before Inouye's death. "He's also been a highly respected inside player."

Daniel Ken Inouye was born in Honolulu on Sept. 7, 1924, to Japanese-American parents Hyotaro, a jewelry clerk, and Kame, a homemaker. He was named after the biblical prophet Daniel and the Rev. Daniel Klinefelter, a Methodist minister who helped raise the orphaned Kame. Inouye's parents met at church and always preached family honor and discipline, a blend of Japanese tradition and Methodist sensibility. Inouye was the eldest of four siblings — sister May and brothers John and Robert — who grew up in Moiliili and McCully.

Although the family was poor and Inouye said he did not wear shoes regularly until he attended McKinley High School, he once wrote of his family ethos, "there was a fanatic conviction that opportunity awaited those who had the heart and strength to pursue it."

As for many of his generation, the Dec. 7, 1941, Japanese attack on Pearl Harbor forever changed the trajectory of Inouye's life. He had wanted to be a doctor and had taken a first-aid course from the American Red Cross, but once President Franklin D. Roosevelt agreed in 1943 to let nisei volunteer for the war, Inouye volunteered for the Army and was assigned to what was to become one of the most decorated military units in history, the segregated 442nd Regimental Combat Team.

Inouye, a sergeant when the 442nd landed in Europe, was promoted to first lieutenant as the nisei unit moved through Italy, then France, then back to Italy in the waning days of the war.

In northern Italy in April 1945, as the war in Europe was coming to an end, Inouye moved his platoon against German troops near San Terenzo. He crawled up a slope and tossed two hand grenades into a German machine-gun nest. He stood up with his tommy gun and raked a second machine-gun nest before being shot in the stomach. But he kept charging until his right arm was hit by an enemy rifle grenade and shattered.

"I looked at it, stunned and disbelieving. It dangled there by a few bloody shreds of tissue, my grenade still clenched in a fist that suddenly didn't belong to me anymore," Inouye wrote in his 1967 autobiography, "Journey to Washington," written with Lawrence Elliott.

Inouye wrote that he pried the grenade out of his right hand and threw it at the German gunman, who was killed by the explosion. He continued firing his gun until he was shot in the right leg and knocked down the hillside. Badly wounded, he ordered his men to keep attacking and they took the ridge from the enemy.

by Derrick DePledge, Honolulu Star-Advertiser | Read more:


Workingman’s Bread

“How well can we live,” Juliet Corson asked herself, “if we are moderately poor?” On the radical origins of home economics.

As it was for many who went to school in the early 1990s, my junior-high experience with home economics was brief. In theory, I liked cooking, but the idea of doing it in a dour classroom outfitted with fridges and Formica conjured visions of trembling Jell-O molds and glaucous mounds of pistachio-cream salad, crumbly refrigerated biscuits and mushy pinwheels of deviled ham, all tasting the way the cafeteria smelled. I registered for Beginning Consumer Sciences not out of a great love for things domestic but because I wanted to avoid physical education (the only other available elective) and to get an easy A. But my first assignment, broccoli salad, proved unexpectedly difficult: I glopped on too much Miracle Whip and burned the bacon, mistakes that earned me a C- and a gentle admonition to “follow the recipe.”

Thanks to the guidance of a teacher both cheerful and good-natured — as you inevitably must be when supervising a roomful of 13-year-olds employing sharp knives and hot ranges — I managed to reverse my course. I soon found myself chopping, roasting, and frying with brio, turning out soggy but delicious pineapple upside-down cakes and loads of peanut brittle more salty than sweet.

As I was baking and cooking in that classroom, with its four small avocado-green ovens, little did I think that I was participating in something of cultural importance. But recently, academics and food critics have called for a return of home economics to high school curricula. In a 2011 National Public Radio interview, Michigan State University history professor Helen Zoe Veit sang the praises of instruction in the domestic arts. “Just by virtue of making foods at home,” she said, “you’re almost guaranteed to be making them much more healthfully than they would be if you buy them at a fast-food restaurant or in virtually any restaurant where fats and sugars are used in … enormous quantities.” This past summer, Slate ran an article by Torie Bosch, who claimed that “home ec is more valuable than ever in an age when junk food is everywhere, obesity is rampant, and few parents have time to cook for their children.” What’s more, she argued, a course in home economics could help students teach their cash-strapped families to stretch their dollars. Frugality and thrift, watchwords of austere times past, would once again see recession-wracked Americans through their present ill fortune.

Americans’ present ill fortune has persisted for a number of years now and threatens to grow worse, because most politicians appear to agree that present economic realities have made inevitable a rollback of New Deal programs. In a time of diminished prosperity, the thinking goes, citizens are summoned to tighten their belts further. Talk of austerity has the attractive effect of conflating notions of individual conduct with those of economic justice, and so shows itself utterly befitting a neoliberal age. Yet as history demonstrates, the present morality play involving a people perched perilously on a “fiscal cliff” has seen several dress rehearsals.

The association of clever cookery with economic security had long abided in pre-Roosevelt America. Early home economists thought a well-cooked roast could quell or eliminate everything from public drunkenness to factory riots. For them food was a way not only to ease the bitter pangs of poverty but also to curtail its more disruptive social repercussions. Juliet Corson, founder of the New York Cooking School, was one such advocate of better living through sensible meal prep. The good life could be had through good cooking, she believed. Eager to show the working classes how to make the most of available foodstuffs that, thanks to the railroad’s aggressive expansion over the course of the 19th century, had increased in both variety and quantity, she penned one of the most popular cooking guides of the era, Fifteen-Cent Dinners for Workingmen’s Families (1877). Its simple, sensible advice helped thousands weather the period’s financial panics and made Corson one of the century’s most notable social reformers. (...)

After the onset of the 1873 depression, Corson offered her services to the Women’s Educational and Industrial Society of New York, which provided vocational training to women at a time when thousands of them needed to support themselves but lacked the knowledge and means. It sought to keep these women in work to prevent them from succumbing to “moral degradation,” as one contemporary circular phrased it. The school’s administrators tapped Corson to teach culinary arts. Having had little previous experience beyond that of making coffee and grilling steak, she turned to the best European cookbooks for guidance. “The thoroughness of the German and the delicacy of the French” impressed her, and she synthesized these two Continental influences into “a philosophy of her own.”

This philosophy proved immensely popular, because it informed an approach to cooking that appeared almost effortless. Just about anyone could whip up tasty, economical meals. Women flocked to hear Corson, whose instructional method was immersive. With basket in hand, she led her students to Fulton Market for lessons on selecting fresh meat, fish, and vegetables before adjourning to the cooking school, where, behind a brightly polished range topped with copper saucepans and boilers, she demonstrated how best to prepare them. So successful were her lessons that they attracted the notice of upper-class colleagues. They encouraged her not only to continue penning articles and books but also to open her own school, which she did in 1876.

Corson’s New York Cooking School, initially based in her own home, charged tuition on a sliding scale and taught both practical and more advanced cooking skills to everyone from ladies’ maids to young housewives. Her specialty, however, remained economical cookery. The 34-page Fifteen-Cent Dinners for Workingmen’s Families elaborated its finer points. “The cheapest kinds of food are sometimes the most wholesome and strengthening,” Corson insisted. On three nickels (the equivalent of $2.78 today) the poorest laborer could eat, if not like a king, then perhaps like his boss. “In Europe provinces would live upon what is wasted in towns here,” Corson lamented. Fifteen-Cent Dinners revealed veritable plenty in the midst of apparent dearth. Simply by eliminating waste, Corson claimed, a household could find nourishment to spare.  (...)

Corson’s uniquely cosmopolitan approach to economical cookery won her audience’s approval. Critics hailed Fifteen-Cent Dinners as a panacea. The Chicago Tribune claimed that between its covers lay “the secret of economy which gives skill to conceal cheap things,” and another prominent newspaper, the Christian Union, assured readers that, if faithfully followed, the recipes would “put upon the rich man’s table food more nourishing and palatable than nine out of ten well-to-do people ever taste outside of first-class restaurants.” Yet Corson’s book failed to please union leaders, who accused its author of conspiring with capitalists to suppress wages, reasoning that if workers learned they could subsist quite well on 15 cents, they would lose interest in agitating for much more.

by Christine Baumgarthuber, The New Inquiry | Read more:
Illustration: imp kerr

just here… by Jaya Suberg
via:

The Uses of Difficulty

Jack White, the former frontman of the White Stripes and an influential figure among fellow musicians, likes to make things difficult for himself. He uses cheap guitars that won’t stay in shape or in tune. When performing, he positions his instruments in a way that is deliberately inconvenient, so that switching from guitar to organ mid-song involves a mad dash across the stage. Why? Because he’s on the run from what he describes as a disease that preys on every artist: "ease of use". When making music gets too easy, says White, it becomes harder to make it sing.

It’s an odd thought. Why would anyone make their work more difficult than it already is? Yet we know that difficulty can pay unexpected dividends. In 1966, soon after the Beatles had finished work on "Rubber Soul", Paul McCartney looked into the possibility of going to America to record their next album. The equipment in American studios was more advanced than anything in Britain, which had led the Beatles’ great rivals, the Rolling Stones, to make their latest album, "Aftermath", in Los Angeles. McCartney found that EMI’s contractual clauses made it prohibitively expensive to follow suit, and the Beatles had to make do with the primitive technology of Abbey Road.

Lucky for us. Over the next two years they made their most groundbreaking work, turning the recording studio into a magical instrument of its own. Precisely because they were working with old-fashioned machines, George Martin and his team of engineers were forced to apply every ounce of their ingenuity to solve the problems posed to them by Lennon and McCartney. Songs like "Tomorrow Never Knows", "Strawberry Fields Forever", and "A Day in the Life" featured revolutionary aural effects that dazzled and mystified Martin’s American counterparts.

Sometimes it’s only when a difficulty is removed that we realise what it was doing for us. For more than two decades, starting in the 1960s, the poet Ted Hughes sat on the judging panel of an annual poetry competition for British schoolchildren. During the 1980s he noticed an increasing number of long poems among the submissions, with some running to 70 or 80 pages. These poems were verbally inventive and fluent, but also "strangely boring". After making inquiries Hughes discovered that they were being composed on computers, then just finding their way into British homes.

You might have thought any tool which enables a writer to get words on to the page would be an advantage. But there may be a cost to such facility. In an interview with the Paris Review Hughes speculated that when a person puts pen to paper, "you meet the terrible resistance of what happened your first year at it, when you couldn’t write at all". As the brain attempts to force the unsteady hand to do its bidding, the tension between the two results in a more compressed, psychologically denser expression. Remove that resistance and you are more likely to produce a 70-page ramble. There is even some support for Hughes’s hypothesis from modern neuroscience: a study carried out by Professor Virginia Berninger at the University of Washington found that handwriting activated more of the brain than keyboard writing, including areas responsible for thinking and memory.

Our brains respond better to difficulty than we imagine. In schools, teachers and pupils alike often assume that if a concept has been easy to learn, then the lesson has been successful. But numerous studies have now found that when classroom material is made harder to absorb, pupils retain more of it over the long term, and understand it on a deeper level. Robert Bjork, of the University of California, coined the phrase “desirable difficulties” to describe the counter-intuitive notion that learning should be made harder by, for instance, spacing sessions further apart so that students have to make more effort to recall what they learnt last time. Psychologists at Princeton found that students remembered reading material better when it was printed in an ugly font.

by Ian Leslie, Intelligent Life |  Read more:
Illustration: Brett Ryder

Belize, National Geographic, January 1972. Michael E. Long
via:

Fucked Up


Street Art Isn't a Crime Until Somebody Steals It: Banksy in Miami


I’m finding it a little hard to feel upset at the Banksy “exhibition” that was on display in Art Miami and its sister fair CONTEXT this past week. Others have found reasons to boycott the affair, and Marc and Sara Schiller, two street art aficionados I respect, wrote on Wooster Collective that they are calling out the Miami Art Fair for letting all this happen: “Knowing that Banksy has condemned the show, they could have easily rejected the exhibition and not legitimized the stolen artwork. But they didn’t. And this tells you a lot about what their motivations are.”

RJ Rushmore of Vandalog echoed the Schillers’ sentiment and then highlighted the possible monetary motive for the display:
… as the Schillers note, Banksy’s best work really only works when experienced in context in which it was intended (whether that intended context be on the street or in a gallery), and bringing these pieces indoors probably makes most of them much much much weaker than they were on the street. 
This is certainly not the first time we’ve seen someone trying to make a buck off Banksy and it’s reasons like this that Banksy created Pest Control, a controversial committee which determines the authenticity of Banksy works on the market and which refuses to authenticate any street works or works not originally intended for resale.

Banksy, “Stop and Search” (2007), from Bethlehem, West Bank

I find the rejection of the display of these five iconic Banksys and the resulting anger a little misplaced. Certainly viewing the works in their original context would be desirable, but it’s also nearly impossible for most people. The work, like most street art, is often placed on private property, and in the process, the artist ceases to own it. The fate of the work is left in the hands of others, not the artist. This is the deal; it’s street art, after all. Where that work ends up is anyone’s guess. There was the obstacle of the art fair ticket price, but any American museumgoer is accustomed to paying to see art. Yes, these works were once free to see, but so is almost all art before it enters a fair or museum. Then again, this is about other issues, in my opinion.

Street art by its very nature is an act of faith in the public trust. You place the work — most often illegally — in public, and you kiss it goodbye. A photo online is usually the only residue of most of the ephemeral work. As proof of this concept, all you have to do is look back to the 1970s and ’80s, which was a rich period for street art, and realize that little, if any, of the work from the streets remains. This, I believe, is unfortunate.


Keith Haring’s subway drawings exhibited at the Brooklyn Museum

Not all the work was lost, though, and this isn’t the first time that work taken from the street has been exhibited outside of its original context. Keith Haring’s white chalk on black paper subway drawings of the late 1970s and early ’80s were a critical part of his rise in the public’s imagination. Many people tore down the works from the streets, particularly in the subway stations, immediately after they were made, and some of those people had the intention to sell them ASAP. In recent years, those works have been showcased in high-end galleries, like the Tony Shafrazi Gallery, and earlier this year, the Brooklyn Museum’s Keith Haring show featured a whole room of them. I don’t think many people would argue that these works don’t deserve to be in either of those places.

But let’s be clear about what the Banksy show is and what it isn’t. It is no longer street art; it is a historic artifact, much in the way Assyrian murals stripped from their original temples and public buildings are displayed in museums the world over. This is history, and it needs to be preserved.

The popularity of Banksy, I would argue, has propelled him into the realm of a cultural icon worthy of historic preservation. He’s the only artist since Warhol who has successfully become a household name, more so than Damien Hirst and Jeff Koons. What we’re really arguing about, I believe, is how his legacy should survive and what role the artist should have in those decisions.

by Hrag Vartanian, Hyperallergic | Read more:
Photos: Hrag Vartanian

Utopian for Beginners

There are so many ways for speakers of English to see the world. We can glimpse, glance, visualize, view, look, spy, or ogle. Stare, gawk, or gape. Peek, watch, or scrutinize. Each word suggests some subtly different quality: looking implies volition; spying suggests furtiveness; gawking carries an element of social judgment and a sense of surprise. When we try to describe an act of vision, we consider a constellation of available meanings. But if thoughts and words exist on different planes, then expression must always be an act of compromise.

Languages are something of a mess. They evolve over centuries through an unplanned, democratic process that leaves them teeming with irregularities, quirks, and words like “knight.” No one who set out to design a form of communication would ever end up with anything like English, Mandarin, or any of the more than six thousand languages spoken today.

“Natural languages are adequate, but that doesn’t mean they’re optimal,” John Quijada, a fifty-four-year-old former employee of the California State Department of Motor Vehicles, told me. In 2004, he published a monograph on the Internet that was titled “Ithkuil: A Philosophical Design for a Hypothetical Language.” Written like a linguistics textbook, the fourteen-page Web site ran to almost a hundred and sixty thousand words. It documented the grammar, syntax, and lexicon of a language that Quijada had spent three decades inventing in his spare time. Ithkuil had never been spoken by anyone other than Quijada, and he assumed that it never would be.

In his preface, Quijada wrote that his “greater goal” was “to attempt the creation of what human beings, left to their own devices, would never create naturally, but rather only by conscious intellectual effort: an idealized language whose aim is the highest possible degree of logic, efficiency, detail, and accuracy in cognitive expression via spoken human language, while minimizing the ambiguity, vagueness, illogic, redundancy, polysemy (multiple meanings) and overall arbitrariness that is seemingly ubiquitous in natural human language.”

Ithkuil has two seemingly incompatible ambitions: to be maximally precise but also maximally concise, capable of capturing nearly every thought that a human being could have while doing so in as few sounds as possible. Ideas that could be expressed only as a clunky circumlocution in English can be collapsed into a single word in Ithkuil. A sentence like “On the contrary, I think it may turn out that this rugged mountain range trails off at some point” becomes simply “Tram-mļöi hhâsmařpţuktôx.”

It wasn’t long after he released his manuscript on the Internet that a small community of language enthusiasts began to recognize what Quijada, a civil servant without an advanced degree, had accomplished. Ithkuil, one Web site declared, “is a monument to human ingenuity and design.” It may be the most complete realization of a quixotic dream that has entranced philosophers for centuries: the creation of a more perfect language.

Ithkuil’s first piece of press was a brief mention in 2004 in a Russian popular-science magazine called Computerra. An article titled “The Speed of Thought” noted remarkable similarities between Ithkuil and an imaginary language cooked up by the science-fiction writer Robert Heinlein for his novella “Gulf,” from 1949. Heinlein’s story describes a secret society of geniuses called the New Men who train themselves to think more rapidly and precisely using a language called Speedtalk, which is capable of condensing entire sentences into single words. Using their efficient language to communicate, the New Men plot to take over the world from the benighted “homo saps.”

Soon after the publication of the Russian article, Quijada began to receive a steady stream of letters from e-mail addresses ending in .ru, peppering him with arcane questions and requesting changes to the language to make its words easier to pronounce. Alexey Samons, a Russian software engineer based in Vladivostok, took on the monumental task of translating the Ithkuil Web site into Russian, and before long three Russian Web forums had sprung up to debate the merits and uses of Ithkuil.

At first, Quijada was bewildered by the interest emanating from Russia. “I was a third humbled, a third flattered, and a third intrigued,” he told me. “Beyond that, I just wanted to know: who are these people?”

by Joshua Foer, The New Yorker | Read more:
Photograph by Dan Winters.

Sunday, December 16, 2012


Le Temple Du Soleil by Mario Cliche
via:

Zhang Weimin (张伟民, Chinese, b. 1955)
A Moon Rising in the Dark Night
via:

Dr. NakaMats, the Man With 3300 Patents

One of the oldest chestnuts about inventions involves a 19th-century patent official who resigned because he thought nothing was left to invent. The yarn, which periodically pops up in print, is patently preposterous. “The story was an invention,” says Yoshiro Nakamatsu. “An invention built to last.”

He should know. Nakamatsu—Dr. NakaMats, if you prefer, or, as he prefers, Sir Dr. NakaMats—is an inveterate and inexorable inventor whose biggest claim to fame is the floppy disk. “I became father of the apparatus in 1950,” says Dr. NakaMats, who conceived it at the University of Tokyo while listening to Beethoven’s Symphony No. 5. “There was no mother.”

Though Dr. NakaMats received a Japanese patent in 1952, this virgin birth is disputed by IBM, which insists its own team of engineers developed the device in 1969. Still, to avoid conflicts, Big Blue struck a series of licensing agreements with him in 1979. “My method of digitizing analog technology was the start of Silicon Valley and the information revolution,” Dr. NakaMats says. His voice is low, slow and patronizing, solicitously deliberate. “I am a cross between Steve Jobs and Leonardo da Vinci.”

The floppy is only a short subject in the nonstop invention film that’s running in Dr. NakaMats’ brain. Among his other creations (he will earnestly tell you) are the CD, the DVD, the fax machine, the taxi meter, the digital watch, the karaoke machine, CinemaScope, spring-loaded shoes, fuel-cell-powered boots, an invisible “B-bust bra,” a water-powered engine, the world’s tiniest air conditioner, a self-defense wig that can be swung at an attacker, a pillow that prevents drivers from nodding off behind the wheel, an automated version of the popular Japanese game pachinko, a musical golf putter that pings when the ball is struck properly, a perpetual motion machine that runs on heat and cosmic energy and...much, much more, much of which has never made it out of the multiplex of his mind.

Dr. NakaMats is the progenitor of one other novelty related to floppies: Love Jet, a libido-boosting potion that can be sprayed on the genitalia. The computer component and the mail-order aphrodisiac—and the cash they generate—have taken the inventor of NakaMusic, NakaPaper and NakaVision out of the ranks of the faintly bonkers basement crackpot. The two great financial successes in his perpetual printout of ideas, they give him credibility. Nobody dares to completely kiss off his wilder inventions.

Indeed, Dr. NakaMats has won the grand prize at the International Exposition of Inventors a record 16 times, or so he says, and has been feted all over the world. To commemorate his 1988 visit to the United States, more than a dozen U.S. cities—from San Diego to Pittsburgh—held Dr. NakaMats Days. The State of Maryland made him an honorary citizen, Congress awarded him a Certificate of Special Recognition and then-president George H.W. Bush sent him a congratulatory letter. Dr. NakaMats even tossed out the first pitch at a Pittsburgh Pirates game.

Of all the tributes he says he has received, he is perhaps proudest of having been invested as a knight by the Sovereign Military Hospitaller Order of Saint John of Jerusalem of Rhodes and of Malta, an ancient Roman Catholic charitable order. “Which is why I should be addressed as Sir Dr. NakaMats,” he explains.

He’s saying this from behind a desk in an office of Dr. NakaMats House, a central Tokyo high-rise of his own design. Naturally, the front gate is shaped like a colossal floppy disk.

His office is a riot of not-quite-finished projects. A blackboard is slathered in mathematical equations. File folders are piled on chairs. Copies of books he has written—among them, Invention of Politics and How to Become a Superman Lying Down—are scattered on the floor. Everywhere Dr. NakaMats goes, he dislodges great stacks of scientific papers last examined in, say, 1997. While he rummages for a diagram of his Anti-Gravity Float-Vibrate 3-Dimensional Sonic System, a heap of magazines starts a sort of tsunami across the room, dislodging other heaps in its path. He looks straight ahead, firm and unsmiling.

Dr. NakaMats is lean, moderately intense and 84 years old. He wears a sharp, double-breasted pinstriped suit, a striped red tie with matching pocket square and an expression like Ahab looking for a crew to hunt the white whale. Scrupulously polite, he offers a visitor from the United States a cup of Dr. NakaMats Brain Drink (“Lose weight. Smooth skin. Avoid constipation”) and a plate of intellect-enhancing Dr. NakaMats Yummy Nutri Brain Snacks.

By his count, Dr. NakaMats has clocked 3,377 patents, or three times as many as Thomas Edison (1,093 and no longer counting). “The big difference between Edison and me,” he says, matter-of-factly, “is that he died when he was 84, while I am now just in the middle of my life.”

by Franz Lidz, Smithsonian |  Read more:
Photo: Yuriko Nakao

Ani DiFranco


the sky is grey, the sand is grey, and the ocean is grey.
i feel right at home in this stunning monochrome, alone in my way.
i smoke and i drink and every time i blink i have a tiny dream.
but as bad as i am i'm proud of the fact that i'm worse than i seem.
what kind of paradise am i looking for?
i've got everything i want and still i want more.
maybe some tiny shiny thing will wash up on the shore.
you walk through my walls like a ghost on tv.
you penetrate me and my little pink heart is on its little brown raft floating out to sea.
and what can i say but i'm wired this way and you're wired to me,
and what can i do but wallow in you unintentionally?
what kind of paradise am i looking for?
i've got everything i want and still i want more.
maybe some tiny shiny key will wash up on the shore.
regretfully, i guess i've got three simple things to say.
why me? why this now? why this way?
overtone's ringing, undertow's pulling away
under a sky that is grey on sand that is grey by an ocean that's grey.
what kind of paradise am i looking for?
i've got everything i want and still i want more.
maybe some tiny shiny key will wash up on the shore.

lyrics via:

Saturday, December 15, 2012

FDR's Four Freedoms and Global Security

On January 6, 1941, at a time when democracy was literally under siege in much of Europe and Asia, US President Franklin D. Roosevelt called upon his fellow countrymen to help the United States establish a world based on four essential human freedoms: Freedom of Speech and Expression; Freedom of Worship; Freedom from Want; and Freedom from Fear. At the time of the speech, all of Western Europe lay under the heel of the Nazi dictatorship, and with only Great Britain and the Royal Navy standing between Hitler's war machine and the United States, FDR felt it was crucial that the US do all it could to help the British wage war and carry on their resistance to German aggression. In the meantime, things were not much better in the Far East, where the militarist Japanese regime continued its aggressive war in China and had now moved into Indochina in the wake of the French defeat in Europe.

With democracy itself teetering on the brink of collapse, and with Hitler having declared that he had established a "New Order" of tyranny in Europe, FDR proposed that the United States promote the very antithesis of such an order: "a greater conception" based on a "moral order" that embraced the Four Freedoms as its fundamental guiding principles. It was to establish these principles that he called upon the American people to make the sacrifices needed to help America's allies win the war. America, he said, must become the great "arsenal of democracy," and by the time the United States formally entered the war in December 1941, establishing the Four Freedoms "everywhere in the world" had in essence become the war aims of the United States.

Few Americans, especially younger Americans, are familiar with the Four Freedoms, but the vision that FDR articulated in such simple yet eloquent language had an enormous impact not only on the war but also on the postwar world. For in calling for a world based on these fundamental human freedoms, FDR established a clear link between fundamental human rights and global security. Equally important, the rights that the Four Freedoms called for included not only those that are essentially political in nature, such as speech and worship, but also those that concern one's well-being and personal security: want and fear.

Inspired by these goals, the United States went on to direct the effort to establish the postwar multilateral economic and security apparatus (including the United Nations and the Universal Declaration of Human Rights, but also the IMF and World Bank) that would lead to an unprecedented period of economic prosperity, a prosperity that helped prevent the possible outbreak of a Third World War.

For the generation that fought the war, then, the promotion of human rights and the establishment of global security were inseparable. As we head into the year that will mark the 70th anniversary of FDR's Four Freedoms speech, we would do well to remember this, as well as his admonition that achieving the Four Freedoms "everywhere in the world" is not some "vision of a distant millennium. It is a definite basis for a kind of world attainable in our own time and generation."

by David Woolner, Next New Deal |  Read more:

Newtown and the Madness of Guns


[ed. As a rule, I don't print entire stories or opinion pieces because of copyright issues. However, in this case, I hope the New Yorker will grant me an exception. Click on the 'Read More' link for other perspectives from the magazine.]

After the mass gun murders at Virginia Tech, I wrote about the unfathomable image of cell phones ringing in the pockets of the dead kids, and of the parents trying desperately to reach them. And I said (as did many others), This will go on, if no one stops it, in this manner and to this degree in this country alone—alone among all the industrialized, wealthy, and so-called civilized countries in the world. There would be another, for certain.

Then there were—many more, in fact—and when the latest and worst one happened, in Aurora, I (and many others) said, this time in a tone of despair, that nothing had changed. And I (and many others) predicted that it would happen again, soon. And that once again, the same twisted voices would say, Oh, this had nothing to do with gun laws or the misuse of the Second Amendment or anything except some singular madman, of whom America for some reason seems to have a particularly dense sample.

And now it has happened again, bang, like clockwork, one might say: Twenty dead children—babies, really—in a kindergarten in a prosperous town in Connecticut. And a mother screaming. And twenty families told that their grade-schooler had died. After the Aurora killings, I did a few debates with advocates for the child-killing lobby—sorry, the gun lobby—and, without exception and with a mad vehemence, they told the same old lies: it doesn’t happen here more often than elsewhere (yes, it does); more people are protected by guns than killed by them (no, they aren’t—that’s a flat-out fabrication); guns don’t kill people, people do; and all the other perverted lies that people who can only be called knowing accessories to murder continue to repeat, people who are in their own way every bit as twisted and crazy as the killers whom they defend. (That they are often the same people who pretend outrage at the loss of a single embryo only makes the craziness still crazier.)

So let’s state the plain facts one more time, so that they can’t be mistaken: Gun massacres have happened many times in many countries, and in every other country, gun laws have been tightened to reflect the tragedy and the tragic knowledge of its citizens afterward. In every other country, gun massacres have subsequently become rare. In America alone, gun massacres, most often of children, happen with hideous regularity, and they happen with hideous regularity because guns are hideously and regularly available.

The people who fight and lobby and legislate to make guns regularly available are complicit in the murder of those children. They have made a clear moral choice: that the comfort and emotional reassurance they take from the possession of guns, placed in the balance even against the routine murder of innocent children, is of supreme value. Whatever satisfaction gun owners take from their guns—we know for certain that there is no prudential value in them—is more important than children’s lives. Give them credit: life is making moral choices, and that’s a moral choice, clearly made.

All of that is a truth, plain and simple, and recognized throughout the world. At some point, this truth may become so bloody obvious that we will know it, too. Meanwhile, congratulate yourself on living in the child-gun-massacre capital of the known universe.

by Adam Gopnik, The New Yorker |  Read more:
Photograph by Douglas Healey/Getty

The Ironman: Triathlete Executives' Ultimate Status Feat

On the Thursday before the 2012 Ironman World Championship in Kona, on the Big Island of Hawaii, Troy Ford stood in the lobby of the King Kamehameha’s Kona Beach Hotel. Around him were several gaunt men with shaved legs, hands steadying their composite bicycles costing upwards of $10,000 each. Ford is the director of the Ironman Executive Challenge program, or XC, as everyone calls it. For $9,000, or about 10 times the regular registration price, XC provides a way to VIP the Ironman, which, for the uninitiated, is a 2.4-mile open-water swim followed by a scorching 112-mile bike ride and a full 26.2-mile marathon run. It’s the hardest major endurance race in the world and the ultimate status bauble for a certain set of high-earning, high-achieving, high-VO2-max CEOs.

Ford, a sinewy 43-year-old with a shaved head, was waiting for two of his client-athletes: Jim Callerame, regional general manager of International Paper (IP), and Luis Alvarez, chief executive officer of Mexican fuel tank manufacturer SAG-Mecasa. Both needed their bikes tuned. For non-XC athletes, a bike tune-up requires a sweaty, anxious wait at an overburdened cycling shop and lost sleep over whether a year of training will be lost to some stoner bike mechanic who fails to true a wheel. Not so for Ford’s guys. Expected wait time: zero. “We’re going to walk right in,” Ford said, smiling.

XC provides its 25 athletes with what it refers to as “high-touch” service: breakfast with the pros, a seat up front at the welcome banquet, Ford at your disposal. He books your travel. He’ll find out your favorite snack is Oreos and have a pack waiting in your suite. When your kids get bored in the hotel restaurant, he’ll improvise with an entire box of Coffeemate creamers that they can use as building blocks.  (...)

Callerame was in Kona to clear an item from his bucket list. Just getting to the start line had been a feat. World Triathlon Corp. (WTC), which controls the Ironman brand, metes out slots for its events on a scarcity model. The 2,500 spots for the 2013 Ironman in Arizona sold out in less than a minute. The 2,500 slots for the 2013 Ironman Asia-Pacific Championship Melbourne sold out in five. There are 30 such events each year. Most Ironman customers hate to be denied. Andrew Messick, the CEO of WTC, describes them this way: “When you tell them about the hardest one-day endurance event in the world, they think, ‘I could do that!’ ” What makes getting a bib number for Kona even sweeter is that no berths are openly for sale. This year 84 of the nearly 2,000 spots went to pros, 1,668 to people who qualified by placing at the top of their age groups at earlier Ironman events, 205 were doled out through a lottery, and six were auctioned on EBay (EBAY). The top bidder paid $45,605.  (...)

Ford guided Callerame and Alvarez through the deafening beat of the Ironman expo—a carnival of metal-tube and tarpaulin tents hawking everything a triathlete could want—to a backroom with a mechanic, who immediately put Callerame’s bike on a stand. Given that nobody at the expo or on Ali’i Drive wears much clothing, one of the few ways to decipher status among Ironman aspirants is by the color-coded security bracelets on everybody’s wrists. These look like little hospital bands, and they’re in the registration packets. Orange means racer, yellow means family member, purple volunteer, and blue VIP. None of the athletes swarming around the mechanic seemed to notice Ford’s high-touch service, which is just how he likes it. Lots of big egos; best not to ruffle feathers.

Later, back at the King Kamehameha, Ford confessed that there was one perk he couldn’t guarantee: a VIP port-a-potty at the race start. “It would start a riot,” he said. “We’d need a full-time security person.” Not that all XC Ironmen wait in line for the loo. “We did have one XC guy a few years ago who was staying down the road at the Four Seasons. He rented a room at the King Kam, too, for the full three-day minimum, just in case he needed to poop.”

by Elizabeth Weil,  Bloomberg Businessweek |  Read more:
Photograph by Kramon