Wednesday, December 25, 2013

Tuesday, December 24, 2013

The Year We Broke the Internet

As winter storms were buffeting parts of the country last week, our collective attention was drawn halfway around the world to Egypt. Images of the pyramids and the Sphinx covered in snow had emerged, and were being shared tens of thousands of times on Facebook and Twitter. It wasn’t hard to see why. For some, sharing the photos was a statement on global warming. For others, sharing was about the triumph of discovery, like proud housecats dropping a half-chewed mouse of news on the Internet’s doorstep. For most, however, the photos were just another thoughtlessly processed and soon-forgotten item that represented our now-instinctual response to the unrelenting stream of information we’re subjected to every waking hour: Share first, ask questions later. Better yet: Let someone else ask the questions. Better still: What was the question again?

Needless to say, the photos were bullshit.

It’s hard not to note the tidy symbolism here. The Internet, like the Sphinx, is a ravenous beast that eats alive anyone who can’t answer its hoary riddle. We in the media have been struggling for twenty years to solve that riddle, and this year, the answer arrived: Big Viral, a Lovecraftian nightmare that has tightened its thousand-tentacled grip on our browsing habits with its traffic-at-all-costs mentality—veracity, newsworthiness, and relevance be damned. We solved the riddle, and then we got eaten anyway.

The Egypt photos weren’t the only viral hoax to hijack the social media conversation in the past month. Of the others, the most infamous was reality-TV producer Elan Gale’s in-flight pissing match with a fellow passenger, which he documented on Twitter, and which was shepherded along by BuzzFeed to the delight of hundreds of thousands of onlookers. That it was actually a prank rankled some, but even that turned out to be a boon for the sites that shared it: They got the clicks coming and going, both on the ramp-up and in the reveal. The story may well have been, in the words of Slate’s Dave Weigel, “the sort of shoddy reporting that would get a reporter at a small newspaper fired,” but it was also a perfect microcosm of the way the Internet works now.

“We’re not in the business of publishing hoaxes,” BuzzFeed’s news editor wrote in response to Weigel’s piece, “and we feel an enormous responsibility here to provide our readers with accurate, up-to-date information”—which sounds a bit like Altria’s health inspector saying they’re sorry they gave you cancer.

The fact is, that sort of double-dipping is what most of us who produce Internet content do, myself included. Give me the viral pictures, and I’ll give you the truth. And then, after an appropriate waiting period, I’ll give you the other truth, and capitalize on that traffic too. It’s almost a perfect callback to William Randolph Hearst’s infamous declaration on the eve of the Spanish-American War, “You furnish the pictures and I’ll furnish the war.” Even more fitting, historians don’t think he ever said anything like that. Then as now, it’s the myth that plays, not the reality. Today it just plays on an exponentially larger stage.

The media has long had its struggles with the truth—that’s nothing new. What is new is that we’re barely even apologizing for increasingly considering the truth optional. In fact, the mistakes, and the falsehoods, and the hoaxes are a big part of a business plan driven by the belief that big traffic absolves all sins, that success is a primary virtue. Haste and confusion aren’t bugs in the coding anymore, they’re features. Consider what Ryan Grim, Washington bureau chief for the Huffington Post, told The New York Times in its recent piece on a raft of hoaxes, including Gale’s kerfuffle, a child’s letter to Santa that included a handwritten Amazon URL, and a woman who wrote about her fictitious poverty so effectively that she pulled in some $60,000 in online donations. “The faster metabolism puts people who fact-check at a disadvantage,” Grim said. “If you throw something up without fact-checking it, and you’re the first one to put it up, and you get millions and millions of views, and later it’s proved false, you still got those views. That’s a problem. The incentives are all wrong.”

In other words, press “Publish” or perish.

by Luke O'Neil, Esquire |  Read more:
Image: uncredited

Christmas Song

She was his girl, he was her boyfriend
Soon to be his wife, make him her husband
A surprise on the way, any day, any day
One healthy little giggling, dribbling baby boy
The Wise Men came, three made their way
To shower him with love
While he lay in the hay
Shower him with love, love, love
Love love, love
Love, love was all around

Not very much of his childhood was known
Kept his mother Mary worried
Always out on his own
He met another Mary who for a reasonable fee
Less than reputable was known to be
His heart was full of love, love, love
Love, love, love
Love, love was all around

Lyrics

Written by: Dave Matthews via:

Man Ray, La femme et son poisson (1938).
via:
[ed. See also: here]

HSBC Settlement Proves the Drug War is a Joke

If you've ever been arrested on a drug charge, if you've ever spent even a day in jail for having a stem of marijuana in your pocket or "drug paraphernalia" in your gym bag, Assistant Attorney General and longtime Bill Clinton pal Lanny Breuer has a message for you: Bite me.

Breuer this week signed off on a settlement deal with the British banking giant HSBC that is the ultimate insult to every ordinary person who's ever had his life altered by a narcotics charge. Despite the fact that HSBC admitted to laundering billions of dollars for Colombian and Mexican drug cartels (among others) and violating a host of important banking laws (from the Bank Secrecy Act to the Trading With the Enemy Act), Breuer and his Justice Department elected not to pursue criminal prosecutions of the bank, opting instead for a "record" financial settlement of $1.9 billion, which as one analyst noted is about five weeks of income for the bank.

The bank's laundering transactions were so brazen that the NSA probably could have spotted them from space. Breuer admitted that drug dealers would sometimes come to HSBC's Mexican branches and "deposit hundreds of thousands of dollars in cash, in a single day, into a single account, using boxes designed to fit the precise dimensions of the teller windows."

This bears repeating: in order to more efficiently move as much illegal money as possible into the "legitimate" banking institution HSBC, drug dealers specifically designed boxes to fit through the bank's teller windows. Tony Montana's henchmen marching dufflebags of cash into the fictional "American City Bank" in Miami was actually more subtle than what the cartels were doing when they washed their cash through one of Britain's most storied financial institutions.

Though this was not stated explicitly, the government's rationale in not pursuing criminal prosecutions against the bank was apparently rooted in concerns that putting executives from a "systemically important institution" in jail for drug laundering would threaten the stability of the financial system. The New York Times put it this way:
Federal and state authorities have chosen not to indict HSBC, the London-based bank, on charges of vast and prolonged money laundering, for fear that criminal prosecution would topple the bank and, in the process, endanger the financial system. (...)
So you might ask, what's the appropriate financial penalty for a bank in HSBC's position? Exactly how much money should one extract from a firm that has been shamelessly profiting from business with criminals for years and years? Remember, we're talking about a company that has admitted to a smorgasbord of serious banking crimes. If you're the prosecutor, you've got this bank by the balls. So how much money should you take?

How about all of it? How about every last dollar the bank has made since it started its illegal activity? How about you dive into every bank account of every single executive involved in this mess and take every last bonus dollar they've ever earned? Then take their houses, their cars, the paintings they bought at Sotheby's auctions, the clothes in their closets, the loose change in the jars on their kitchen counters, every last freaking thing. Take it all and don't think twice. And then throw them in jail.

Sound harsh? It does, doesn't it? The only problem is, that's exactly what the government does just about every day to ordinary people involved in ordinary drug cases.

by Matt Taibbi, Rolling Stone |  Read more:
Image: MediaBistro

Imagining the Post-Antibiotics Future


Predictions that we might sacrifice the antibiotic miracle have been around almost as long as the drugs themselves. Penicillin was discovered in 1928, and battlefield casualties got the first non-experimental doses in 1943, quickly saving soldiers who had been close to death. But just two years later, the drug’s discoverer, Sir Alexander Fleming, warned that its benefit might not last. Accepting the 1945 Nobel Prize in Medicine, he said:
“It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them… There is the danger that the ignorant man may easily underdose himself and by exposing his microbes to non-lethal quantities of the drug make them resistant.”
As a biologist, Fleming knew that evolution was inevitable: sooner or later, bacteria would develop defenses against the compounds the nascent pharmaceutical industry was aiming at them. But what worried him was the possibility that misuse would speed the process up. Every inappropriate prescription and insufficient dose given in medicine would kill weak bacteria but let the strong survive. (As would the micro-dose “growth promoters” given in agriculture, which were invented a few years after Fleming spoke.) Bacteria can produce another generation in as little as twenty minutes; with tens of thousands of generations a year working out survival strategies, the organisms would soon overwhelm the potent new drugs.

Fleming’s prediction was correct. Penicillin-resistant staph emerged in 1940, while the drug was still being given to only a few patients. Tetracycline was introduced in 1950, and tetracycline-resistant Shigella emerged in 1959; erythromycin came on the market in 1953, and erythromycin-resistant strep appeared in 1968. As antibiotics became more affordable and their use increased, bacteria developed defenses more quickly. Methicillin arrived in 1960 and methicillin resistance in 1962; levofloxacin in 1996 and the first resistant cases the same year; linezolid in 2000 and resistance to it in 2001; daptomycin in 2003 and the first signs of resistance in 2004.

With antibiotics losing usefulness so quickly — and thus not making back the estimated $1 billion per drug it costs to create them — the pharmaceutical industry lost enthusiasm for making more. In 2004, there were only five new antibiotics in development, compared to more than 500 chronic-disease drugs for which resistance is not an issue — and which, unlike antibiotics, are taken for years, not days. Since then, resistant bugs have grown more numerous and, by sharing DNA with each other, have become even tougher to treat with the few drugs that remain. In 2009, and again this year, researchers in Europe and the United States sounded the alarm over an ominous form of resistance known as CRE, for which only one antibiotic still works.

Health authorities have struggled to convince the public that this is a crisis. In September, Dr. Thomas Frieden, the director of the U.S. Centers for Disease Control and Prevention, issued a blunt warning: “If we’re not careful, we will soon be in a post-antibiotic era. For some patients and some microbes, we are already there.” The chief medical officer of the United Kingdom, Dame Sally Davies — who calls antibiotic resistance as serious a threat as terrorism — recently published a book in which she imagines what might come next. She sketches a world where infection is so dangerous that anyone with even minor symptoms would be locked in confinement until they recover or die. It is a dark vision, meant to disturb. But it may actually underplay what the loss of antibiotics would mean.

by Maryn McKenna, Medium |  Read more:
Image: Eneas De Troya

Sounding the Alarm


At 2:46 p.m. on March 11, 2011, the Pacific Plate, just off Japan's northeast coast, suddenly thrust downward, unleashing a monstrous, 9.0-magnitude earthquake that rocked the country for the next six minutes. The massive Tohoku quake and resulting tsunami are believed to have killed at least 16,000 people and injured 6,000 more. Another 2,600 people are still missing and presumed dead. The quake was the most powerful to ever strike Japan, and was the fourth-largest ever recorded. It also was the first earthquake to be heard in outer space, and was the most expensive natural disaster in human history, generating $235 billion in total damage. But there was a silver lining, if you could call it that: Tohoku was also the first time that Japanese citizens were given the precious, if limited, gift of time.

That gift came in the form of Japan's earthquake early warning system, which detected the giant temblor just before it hit and immediately sent computer-generated alerts across the country to cellphones, TVs, schools, factories, and transit systems. Japan put the finishing touches on its $500 million early warning system in 2007, leaving four years — barely the blink of an eye in geological timescales — before the investment paid off.

And in 2011, by all accounts it did. Although it's impossible to quantify the number of lives that the system saved, there were reports in the quake's aftermath of schools having had time to get all their students under desks; of eleven 320-kilometer-per-hour bullet trains slowing to a stop; of more than 16,000 elevators automatically shutting down when the alarm system went off. In the sixty seconds before the giant temblor struck, roughly 52 million people received text-message warnings that the quake was fast approaching and that they needed to get out of harm's way.

In 2007, the same year that Japan finished building its early warning system, earthquake scientists roughly 5,000 miles away in California marked a related, albeit far humbler, benchmark. Richard Allen, director of the Seismological Laboratory at UC Berkeley, was in his office on October 30 when a 5.6-magnitude earthquake hit the Alum Rock section of San Jose. The quake caused only moderate shaking and very little damage, but Allen had reason to be excited: The event marked the first time his Berkeley group was able to test its own early warning system, set up just two weeks before. "It was our first proof-of-concept event," Allen recalled in a recent interview. Thirty minutes after the light shaking ended, Allen received an email showing that the system had successfully detected the right waves, done the right math, and made the right prediction about when and how strongly the quake would hit.

Yet this was only a researcher's victory. The tiny system his team had built produced no cascade of texts, no TV or radio transmissions, and no widespread notification that an earthquake was on its way. In the event of a disaster, the technology wasn't even in place for Allen himself to receive a real-time notification from his own system.

But this was not a case of Japan being light years ahead of the United States in terms of earthquake-science research. Instead, the wide technological gap between the two countries has more to do with each nation's sense of urgency about the dangers of earthquakes, and the need to prepare for them. In fact, back in 2003, Allen had co-written what essentially became the seminal scientific paper on quake predictions. His work showed that it's technically possible to predict the size and location of quakes right before they strike, and argued for the methods that became the basis for early warning systems, much like the one later built in Japan.

And yet a decade after Allen co-authored that paper, California, the second-most seismically active state in the nation (behind only Alaska), still has next to nothing in terms of a public seismic warning system. The technology exists and has for years, but the state legislature has failed to find or allocate the necessary funds to make it happen.

by Azeen Ghorayshi, East Bay Express |  Read more:
Image: Stephen Loewinsohn

The Sense of an Ending

[ed. I tend to avoid books that seem overly hyped and/or have conflicting reviews, so I came late to The Sense of an Ending, but it's a wonderful (if somewhat short) novel that you almost want to read twice once you've finished it. It resonated with me, anyway. I have the habit of dog-earing the left-hand corner of pages in sections that contain particularly poignant or insightful passages (so I can find them again). After dog-earing nearly every other page of this book, I finally gave up. See also: Life in Smoke and Mirrors]

The new book is a mystery of memory and missed opportunity. Tony Webster, a cautious, divorced man in his 60s who “had wanted life not to bother me too much, and had succeeded,” receives an unexpected bequest from a woman he’d met only once, 40 years earlier. The mother of his college girlfriend, Veronica, has bequeathed him £500 — a legacy that unsettles Tony, pushing him to get in touch with Veronica (their relationship had ended badly) and seek answers to certain unresolved questions.

Had he loved Veronica? (At the time, it was an emotion he had lacked the spine to own up to.) What had happened to the energetic boy he used to be, “book-hungry, sex-hungry, meritocratic, anarchistic,” who thought of himself as “being kept in some kind of holding pen, waiting to be released” into an engaged adult life of “passion and danger, ecstasy and despair”? And what ever became of the friend he and Veronica both knew back then, a brainy, idealistic boy named Adrian Finn? Gradually, Tony assembles his willfully forgotten past impressions and actions, joining together the links that connect him to these people, as if trying to form a “chain of individual responsibilities” that might explain how it happened that his life’s modest wages had resulted in “the accumulation, the multiplication, of loss.” (...)

Adrian’s indifference to playing it cool somehow made him the leader of the boys’ clique when they were teenagers; he became the one they looked up to. Yet Tony never emulated Adrian, and was guilty of the pose Adrian deplored: pretending not to care. He pays for this failure again and again, from his 20s to his 60s. “Does character develop over time?” Tony asks himself, wondering at the “larger holding pen” that has come to contain his adult life. Maybe character freezes sometime between the ages of 20 and 30, he speculates. “And after that, we’re just stuck with what we’ve got. We’re on our own. If so, that would explain a lot of lives, wouldn’t it? And also — if this isn’t too grand a word — our tragedy.” (...)

But who does Tony enfold into his “we”? His agonized analysis is entirely self-referential, as solitary and armored as the man himself. Decades earlier, Tony had accused Veronica of an “inability to imagine anyone else’s feelings or emotional life,” but it was he, not she, who was incapable of looking outside his own head. Barnes’s unreliable narrator is a mystery to himself, which makes the novel one unbroken, sizzling, satisfying fuse. Its puzzle of past causes is decoded by a man who is himself a puzzle. Tony resembles the people he fears, “whose main concern is to avoid further damage to themselves, at whatever cost,” and who wound others with a hypersensitivity that is insensitive to anything but their own needs. “I have an instinct for survival, for self-preservation,” he reflects. “Perhaps this is what Veronica called cowardice and I called being peaceable.”

by Liesl Schillinger, NY Times |  Read more:
Image: via:

Monday, December 23, 2013


Donna Watson, Edge of Light
via:

Frieda Meaney
via:

Surviving Anxiety


I’ve finally settled on a pre-talk regimen that enables me to avoid the weeks of anticipatory misery that the approach of a public-speaking engagement would otherwise produce.

Let’s say you’re sitting in an audience and I’m at the lectern. Here’s what I’ve likely done to prepare. Four hours or so ago, I took my first half milligram of Xanax. (I’ve learned that if I wait too long to take it, my fight-or-flight response kicks so far into overdrive that medication is not enough to yank it back.) Then, about an hour ago, I took my second half milligram of Xanax and perhaps 20 milligrams of Inderal. (I need the whole milligram of Xanax plus the Inderal, which is a blood-pressure medication, or beta-blocker, that dampens the response of the sympathetic nervous system, to keep my physiological responses to the anxious stimulus of standing in front of you—the sweating, trembling, nausea, burping, stomach cramps, and constriction in my throat and chest—from overwhelming me.) I likely washed those pills down with a shot of scotch or, more likely, vodka, the odor of which is less detectable on my breath. Even two Xanax and an Inderal are not enough to calm my racing thoughts and to keep my chest and throat from constricting to the point where I cannot speak; I need the alcohol to slow things down and to subdue the residual physiological eruptions that the drugs are inadequate to contain. In fact, I probably drank my second shot—yes, even though I might be speaking to you at, say, 9 in the morning—between 15 and 30 minutes ago, assuming the pre-talk proceedings allowed me a moment to sneak away for a quaff.

If the usual pattern has held, as I stand up here talking to you now, I’ve got some Xanax in one pocket (in case I felt the need to pop another one before being introduced) and a minibar-size bottle or two of vodka in the other. I have been known to take a discreet last-second swig while walking onstage—because even as I’m still experiencing the anxiety that makes me want to drink more, my inhibition has been lowered, and my judgment impaired, by the liquor and benzodiazepines I’ve already consumed. If I’ve managed to hit the sweet spot—that perfect combination of timing and dosage whereby the cognitive and psychomotor sedating effect of the drugs and alcohol balances out the physiological hyperarousal of the anxiety—then I’m probably doing okay up here: nervous but not miserable; a little fuzzy but still able to speak clearly; the anxiogenic effects of the situation (me, speaking in front of people) counteracted by the anxiolytic effects of what I’ve consumed. But if I’ve overshot on the medication—too much Xanax or liquor—I may seem to be loopy or slurring or otherwise impaired. And if I didn’t self-medicate enough? Well, then, either I’m sweating profusely, with my voice quavering weakly and my attention folding in upon itself, or, more likely, I ran offstage before I got this far. I mean that literally: I’ve frozen, mortifyingly, onstage at public lectures and presentations before, and on several occasions I have been compelled to bolt from the stage.

Yes, I know. My method of dealing with my public-speaking anxiety is not healthy. It’s dangerous. But it works. Only when I am sedated to near-stupefaction by a combination of benzodiazepines and alcohol do I feel (relatively) confident in my ability to speak in public effectively and without torment. As long as I know that I’ll have access to my Xanax and liquor, I’ll suffer only moderate anxiety for days before a speech, rather than sleepless dread for months. (...)

My assortment of neuroses may be idiosyncratic, but my general condition is hardly unique. Anxiety and its associated disorders represent the most common form of officially classified mental illness in the United States today, more common even than depression and other mood disorders. According to the National Institute of Mental Health, some 40 million American adults, about one in six, are suffering from some kind of anxiety disorder at any given time; based on the most recent data from the Department of Health and Human Services, their treatment accounts for more than a quarter of all spending on mental-health care. Recent epidemiological data suggest that one in four of us can expect to be stricken by debilitating anxiety at some point in our lifetime. And it is debilitating: studies have compared the psychic and physical impairment tied to living with an anxiety disorder with the impairment tied to living with diabetes—both conditions are usually manageable, sometimes fatal, and always a pain to deal with. In 2012, Americans filled nearly 50 million prescriptions for just one antianxiety drug: alprazolam, the generic name for Xanax. (...)

Stigma still attaches to mental illness. Anxiety is seen as weakness. In presenting my anxiety to the world by writing publicly about it, I’ve been told, I will be, in effect, “coming out.” The implication is that this will be liberating. We’ll see about that. But my hope is that readers who share this affliction, to whatever extent, will find some value in this account—not a cure for their anxiety, but perhaps some sense of the redemptive value of an often wretched condition, as well as evidence that they can cope and even thrive in spite of it. Most of all, I hope they—and by “they” I mean “many of you”—will find some solace in learning that they are not alone.

by Scott Stossel, The Atlantic |  Read more:
Image: Jamie Chung

The Weight of the Past

This column is about Edward Snowden. And the National Football League. And, I suspect, most of the rest of us.

Although it’s about Snowden, it’s not about the National Security Agency, surveillance or privacy. And although it’s about the N.F.L., it’s not about concussions. Instead, it’s about the unbalanced trajectory of human life.

Snowden’s actions, whether one supports them or not, have had a prodigious impact on the debate about privacy in the United States and will likely continue to do so. They have had roughly the impact that Snowden wanted them to have. That is, they have altered how many of us think about our relation to the government and to our own technology, and because of this, they infuse this period of his life with a luminescence that will always be with him. He will not forget it, nor will others.

There is an assumption I would like to make here, one that I can’t verify but I think is uncontroversial. It is very unlikely that Edward Snowden will ever do anything nearly as significant again. Nothing he does for the remainder of his life will have the resonance that his recent actions have had. The powers that be will ensure it. And undoubtedly he knows this. His life will go on, and it may not be as tortured as some people think. But in an important sense his life will have peaked at age 29 or 30.

This is not to say that Snowden’s days will not have their pleasures or their meaningfulness. Rather, those pleasures and that meaningfulness will likely always have lurking in the background the momentous period in the spring of 2013.

Players in the N.F.L. have an average career of six years. For many of them, those years — typically along with their college years — are the most exciting of their lives. They represent the cities they play for, enjoy the adulation of fans and receive higher salaries than they are ever likely to again. Many develop deep bonds with their teammates. They get to experience in reality what many male children dream about. And then it is over. If concussions don’t take their toll, they can expect to live another 45 or 50 years while not playing football.

For many people — not just activists like Snowden or professional athletes — life crests early. But it doesn’t end there. It goes on, burdened by a summit that can never be reached again, which one can gaze upon only by turning back. This is not to say that good and worthwhile things will not happen to them, and for a fortunate few there will be other, higher summits. For many, however, those earlier moments will be a quiet haunting, a reminder of what has been and cannot be again.

We might think of these kinds of lives, lives whose trajectories have early peaks and then fall off, as exceptional. In a way they are. But in another way they are not. There is something precisely in the extremity of these lives that brings out a phenomenon that appears more subtly for the rest of us. It appears in different times and different places and under different guises, but it is there for us nevertheless. (...)

Many will balk, reasonably so, at the characterization I have just given. After all, who is to say that a life has crested? How do we know, except perhaps in unique cases like Snowden or many N.F.L. players, that there aren’t higher peaks in our future? Who is able to confidently say they have already lived the best year of their life?

That consideration, I think, only adds to the difficulty. We don’t know. We cannot know whether the future will bring new experience that will light the fire again or will instead be a slowly dying ember. And the puzzle then becomes, how to respond to this ignorance? Do we seek to introduce more peaks, watching in distress if they do not arise? And how would we introduce those peaks? After all, the arc of our lives is determined not simply by us but also by our circumstances. Alternatively, do we go on about our days hoping for the best? Or do we instead, as many people do, lead what Thoreau called lives of quiet desperation?

This is not to say that nostalgia is our inescapable fate. The lesson I am trying to draw from reflecting on the examples of Snowden and the N.F.L. is not that the thrill ends early. Rather, in their extremity these examples bring out something else. For most of us, as our lives unfold we simply do not, we cannot, know whether we have peaked in an area of our lives — or in our lives themselves — in ways that are most important to us. The past weighs upon us, not because it must cancel the future, but because it is of uncertain heft.

by Todd May, NY Times |  Read more:

Sunday, December 22, 2013


Itō Jakuchū (Japanese, 1716 - 1800)
via:

Paul Cézanne, Seven Bathers (c. 1900).
via:

Voice Hero: The Inventor of Karaoke Speaks

It’s one a.m. The bar is closing but the night isn’t over yet. While milling about on the sidewalk, a friend suggests, ‘Karaoke?’ And suddenly the night gets a lot brighter—and a little more embarrassing.

It’s safe to say that at no point in human history have there been as many people singing the songs of themselves, uncaring that their song was first sung by Gloria Gaynor, Frank Sinatra, or Bruce Springsteen. Karaoke has become inescapable, taking over bars from Manila to Manchester. Passions run high. In the Philippines, anger over off-key renditions of ‘My Way’ has left at least six dead. That statistic hides, however, the countless renditions of the Sinatra anthem that leave people smiling—or at least just wincing. The sing-along music machine terrifies the truly introverted, but it is a hero to countless closet extroverts, letting them reveal their private musical joy. Literally, karaoke is the combination of two Japanese words, ‘empty’ and ‘orchestra’—but we might also lovingly translate it as ‘awkward delight.’

Yet for all karaoke’s fame, the name of its Dr. Frankenstein is less known, perhaps because he never took a patent out on the device and only copyrighted its name in the U.S. in 2009. His name is Daisuke Inoue, a Japanese businessman and inventor born in Osaka in 1940. In 2004 he was honored with an Ig Nobel Prize, given for unusual inventions or research.

In 2005, he shared the story of his life leading up to the Ig Nobel in an interview with Robert Scott Field for Topic Magazine. No longer in print, Topic was one of The Appendix’s inspirations (along with StoryCorps) for its celebration of the everyday and undersung heroes of our world. As a history of another sort of invention, Mr. Inoue’s interview was particularly memorable and deserves to be more widely available. With the permission of both Topic and Mr. Inoue, we are pleased to re-present his delightfully inspiring account of his life and work.

We hope you sing along.

***

Last year I received a fax from Harvard University. I don’t really speak English, but lucky for me, my wife does. She figured out the letter was about the Ig Nobel Prizes, awards that Harvard presents for inventions that make people laugh—and then make them think. I was nominated for an Ig Nobel Peace Prize as the inventor of karaoke, which teaches people to bear the awful singing of ordinary citizens, and enjoy it anyway. That is “genuine peace,” they told me.

Before I tell you about my hilarious adventures at the prize ceremony, though, you need to know how I came to invent the first karaoke machine. I was born in May 1940, in a small town called Juso, in Osaka, Japan. My father owned a small pool hall. When I was three and a half years old, I fell from the second floor and hit my head. I was unconscious for two weeks. The doctors told my parents that if I lived, I would probably have brain damage. A Buddhist priest visited me, blessed me and replaced my birth name, Yusuke, with a new name: Daisuke, which means, in the written characters of kanji, “Big Help.” I needed it. Later I learned that the same Buddhist priest had commented that the name would also lead me to help others.

by Daisuke Inoue and Robert Scott Field, The Appendix | Read more:
Image: courtesy Daisuke Inoue