Thursday, April 4, 2013

How the Maker of TurboTax Fought Free, Simple Tax Filing

This story was co-produced with NPR.

Imagine filing your income taxes in five minutes — and for free. You'd open up a pre-filled return, see what the government thinks you owe, make any needed changes and be done. The miserable annual IRS shuffle, gone.

It's already a reality in Denmark, Sweden and Spain. The government-prepared return would estimate your taxes using information your employer and bank already send it. Advocates say tens of millions of taxpayers could use such a system each year, saving them a collective $2 billion and 225 million hours in prep costs and time, according to one estimate.

The idea, known as "return-free filing," would be a voluntary alternative to hiring a tax preparer or using commercial tax software. The concept has been around for decades and has been endorsed by both President Ronald Reagan and a campaigning President Obama.

"This is not some pie-in-the-sky that's never been done before," said William Gale, co-director of the Urban-Brookings Tax Policy Center. "It's doable, feasible, implementable, and at a relatively low cost."

So why hasn't it become a reality?

Well, for one thing, it doesn't help that it's been opposed for years by the company behind the most popular consumer tax software — Intuit, maker of TurboTax. Conservative tax activist Grover Norquist and an influential computer industry group also have fought return-free filing.

Intuit has spent about $11.5 million on federal lobbying in the past five years — more than Apple or Amazon. Although the lobbying spans a range of issues, Intuit's disclosures pointedly note that the company "opposes IRS government tax preparation."

The disclosures show that Intuit as recently as 2011 lobbied on two bills, both of which died, that would have allowed many taxpayers to file pre-filled returns for free. The company also lobbied on bills in 2007 and 2011 that would have barred the Treasury Department, which includes the IRS, from initiating return-free filing.  (...)

Proponents of return-free filing say Intuit and other critics are exaggerating the risks of government involvement. No one would be forced to accept the IRS accounting of their taxes, they say, so there's little to fear.

"It's voluntary," Austan Goolsbee, who served as the chief economist for the President's Economic Recovery Advisory Board, told ProPublica. "If you don't trust the government, you don't have to do it."

Goolsbee has written in favor of the idea and published the estimate of $2 billion in saved preparation costs in a 2006 paper that also said return-free "could significantly reduce the time lag in resolving disputes and accelerate the time to receive a refund."

Other advocates point out that the IRS would be doing essentially the same work it does now. The agency would simply share its tax calculation before a taxpayer files rather than afterward when it checks a return.

"When you make an appointment for a car to get serviced, the service history is all there. Since the IRS already has all that info anyway, it's not a big challenge to put it in a format where we could see it," said Paul Caron, a tax professor at University of Cincinnati College of Law. "For a big slice of the population, that's 100 percent of what's on their tax return."

by Liz Day, ProPublica |  Read more
istock photo by lvsigns

Bubba's Hover


Golf carts haven't changed much over the years. They look and feel the same. What if there was a way to improve the traditional golf cart concept and take away some of the limitations? That is what Bubba Watson and Oakley set out to do. They created the world's first hovercraft golf cart. Using hovercraft technology, the BW1 is able to glide over any terrain, including grass, sand, and water.

[ed...pretty sure the ducks won't like it.]

Nature’s Drone, Pretty and Deadly

African lions roar and strut and act the apex carnivore, but they’re lucky to catch 25 percent of the prey they pursue. Great white sharks have 300 slashing teeth and that ominous soundtrack, and still nearly half their hunts fail.

Dragonflies, by contrast, look dainty, glittery and fun, like a bubble bath or costume jewelry, and they’re often grouped with butterflies and ladybugs on the very short list of Insects People Like. Yet they are also voracious aerial predators, and new research suggests they may well be the most brutally effective hunters in the animal kingdom.

When setting off to feed on other flying insects, dragonflies manage to snatch their targets in midair more than 95 percent of the time, often wolfishly consuming the fresh meat on the spot without bothering to alight. “They’ll tear up the prey and mash it into a glob, munch, munch, munch,” said Michael L. May, an emeritus professor of entomology at Rutgers. “It almost looks like a wad of snuff in the mouth before they swallow it.”

Next step: grab more food. Dragonflies may be bantam, but their appetite is bottomless. Stacey Combes, who studies the biomechanics of dragonfly flight at Harvard, once watched a laboratory dragonfly eat 30 flies in a row. “It would have happily kept eating,” she said, “if there had been more food available.”

In a string of recent papers, scientists have pinpointed key features of the dragonfly’s brain, eyes and wings that allow it to hunt so unerringly. One research team has determined that the nervous system of a dragonfly displays an almost human capacity for selective attention, able to focus on a single prey as it flies amid a cloud of similarly fluttering insects, just as a guest at a party can attend to a friend’s words while ignoring the background chatter.

Other researchers have identified a kind of master circuit of 16 neurons that connect the dragonfly’s brain to its flight motor center in the thorax. With the aid of that neuronal package, a dragonfly can track a moving target, calculate a trajectory to intercept that target and subtly adjust its path as needed. (...)

Perhaps not surprisingly, much dragonfly research both here and abroad is supported by the United States military, which sees the insect as the archetypal precision drone.

by Natalie Angier, NY Times |  Read more: 
Image: via The Guardian

The Power of the Brushstroke


In 1949 Life magazine published a short feature on the artist Jackson Pollock in which the editors famously asked: “Is this the greatest living American painter?” The headline was both genuine and rhetorical. The article was sparked by one of Pollock’s consummate supporters, the art critic Clement Greenberg, who by the late 1940s was the vocal arbiter of modernism and, more acutely, the promoter of Abstract Expressionism. In the profile photograph, the 37-year-old Pollock stands in front of one of his long horizontal paintings, the chaos of colors and splatters stretching the length of the article. He is dressed in his distinctive overalls, his face expressionless as he crosses his arms and leans slightly back, his posture holding a mixture of private emotions and manly reserve. While the article never prescribes an answer to the question (the editors did receive over five hundred letters from readers with their own answers, mostly affirming their alarm and disdain for his canvases), it does declare that Pollock “has burst forth as the shining new phenomenon of American art.”

Fifteen years later, the magazine would ask that question again about Roy Lichtenstein, only in a slightly different way. In its profile of the artist it asked: “Is this the worst artist in America?” This playful echo of the Pollock profile set up the contrast between the two artists, but also christened the increasing interest in Lichtenstein’s work. The profile described Lichtenstein’s painting process, showing readers how he transformed cartoon images into paintings. It demonstrated his particular methods for achieving his distinctive benday dots, that repetitive surface that gives his canvases a mechanical sense of texture and depth. In contrast to Pollock’s full-body portrait in front of his canvas, Lichtenstein presents a more reserved image. He sits in a high-back wicker chair, one of his romance paintings propped in front of him, shielding his body from us. His head, slightly tilted back, rests above the canvas, a shy smile on his face as he gazes down at the camera, looking almost regal.

This difference in the artists’ images reflected a deeper difference in the styles of art as well. Lichtenstein’s Pop Art was, in many respects, a much more controlled and quiet form compared to the loud canvases of the Abstract Expressionists, their works filled with emotional forces, undefined and unlimited. Pop Art offered the hum of the machine. Think of Andy Warhol’s famous mantra, “I want to be a machine.” Abstract Expressionism rested on the power of the brushstroke, the texture of paint, and the serendipitous surface of the canvas. Pop artists instead turned the brushstroke into lines and dots, creating a constant repetition of surfaces, questioning the authentic power of any one image. If Abstract Expressionism was about the artist’s emotions, Pop Art was about the cool distance of the artist. In defining this contrast, French theorist Roland Barthes wrote in the late 1970s that the Pop artist “has no depth: he is merely the surface of his pictures, no signified, no intention, anywhere.”

by James Polchin, The Smart Set |  Read more: 
Video via: Tate Modern, Image via Roy Lichtenstein, Masterpiece

How to Break Into Science Writing Using Your Blog and Social Media


[ed. This is excellent advice for a writing career in any discipline, not just science.]

There are two basic trajectories: one more traditional, which I like to call “vertical”, and the other one I call “horizontal” which, though individual writers have followed it for a long time, seems to be a much more frequent, if not dominant, trajectory these days.

The vertical trajectory is the one taken by people who, perhaps from a very early age, knew they wanted to become writers or journalists, perhaps specifically science journalists. They major in journalism in college (perhaps double-major in a science as well), work on their school paper, start internships early in their local papers (or radio or TV stations), then go to a Master’s program in science journalism. By the time they graduate from that, they already have lots of experience, several internships, many clips, perhaps some local awards, and are ready to start making a living as staff writers or freelancers.

The horizontal trajectory describes people who start out in science, with every intention of making a career in research. But, as the tenure track is now an alternative career in science, most science students need to find other options. Some of them – those who always liked to write, wrote diaries as kids, etc. – will explore the option of becoming science writers. The most direct horizontal trajectory involves starting a science blog while still doing research, becoming known for good writing there, then pitching stories to online (and later print) magazines, and gradually leaving the lab bench to make a living by writing alone. Brian Switek, John Timmer and Ed Yong are probably the best-known examples of people who took this path. Heck, I am one of those examples, too. Many more are somewhere along that trajectory right now.

Of course, those are extremes, too neatly cut apart. Many people will do something in the middle, combining the two approaches in some way. For example, they may pursue a career in research while also taking summer internships at science magazines, or editing the science section of the college newspaper. Some may major in science, then go to j-school for a master’s. Also, not all of the new entries into science writing are young. Sure, some make the switch after college or a master’s in science, but others make the switch later, after getting a PhD, or finishing a postdoc, or after years of teaching as adjunct faculty with no hope of ever getting a tenure-track position, or even after many years as full faculty, once grant money dries up and there are no more resources to keep running the lab.

Either way, there comes a time when one becomes a professional science writer/journalist and has to make a living that way. What does one need to do to succeed? (...)

First you have to write

People who want to become professional writers are, I assume, people who always liked to write. Childhood diaries. LiveJournals filled with teenage angst. Long Facebook updates. It’s time to take this seriously and do your writing in a more serious, organized, professional manner. Start a blog. This is your writing laboratory. Start blogging about science. Nobody will know about your blog until you start promoting it, so don’t worry that your early posts are clumsy (you can even delete the first few embarrassing posts later, once you are happy with your blog and start promoting it).

Practice the usual journalistic forms – the feature, the interview, the brief news story with inverted pyramid. You will need to demonstrate that you are capable of writing in such forms and styles. (...)

Try to figure out your beat (or obsession) – what is it that excites you the most? Write about that. Try to find your own niche. Become a “go to” person on a particular topic, become an expert (or at least a temporary expert) on that topic.

Ignore the “professional” advice about having to blog daily. It was a necessity a decade ago, not any more. In the days of RSS feeds and social media, it does not matter to your readers any more – they will find your posts no matter how infrequently you post. It only matters for you and your own writing habit that you blog with some regularity.

Also ignore the “professional” advice about writing relatively short blog posts. Leave that for brief news articles. Blog posts are longform, at least most of the time. And longform works much better online than short articles do – the traffic keeps on giving for years, as people rediscover long posts, see them as resources, and share them with their friends.

Also important to remember: You’re A Human, So Write Like One. How do I write? First I read and study the topic. Then, I compose text in my head (usually during dog walks, often over a number of days, sometimes even months), imagining I am explaining something to a good non-scientist friend. Then I sit down and quickly transcribe that. Quick proofread. Click “Publish”.

by Bora Zivkovic, Scientific American | Read more:
Image: uncredited

Wednesday, April 3, 2013

We Regret to Inform You That Your Paper Has Not Been Accepted

As a PhD student:

As a post-doc:

As a professor:

by Nikolaj

Sardine Life

New York didn’t invent the apartment. Shopkeepers in ancient Rome lived above the store, Chinese clans crowded into multistory circular tulou, and sixteenth-century Yemenites lived in the mud-brick skyscrapers of Shibam. But New York re-invented the apartment many times over, developing the airborne slice of real estate into a symbol of exquisite urbanity. Sure, we still have our brownstones and our townhouses, but in the popular imagination today’s New Yorker occupies a glassed-in aerie, a shared walk-up, a rambling prewar with walls thickened by layers of paint, or a pristine white loft.

The story of the New York apartment is a tale of need alchemized into virtue. Over and over, the desire for better, cheaper housing has become an instrument of urban destiny. When we were running out of land, developers built up. When we couldn’t climb any more stairs, inventors refined the elevator. When we needed much more room, planners raised herds of towers. And when tall buildings obscured our views, engineers took us higher still.

This architectural evolution has roughly tracked the city’s financial fortunes and economic priorities. The turn-of-the-century Park Avenue duplex represented the apotheosis of the plutocrat; massive postwar projects like Stuyvesant Town embodied the national mid-century drive to consolidate the middle class; and the thin-air penthouses of Trump World Tower capture the millennial resurgence of buccaneering capitalism. You can almost chart income inequality over the years by measuring the height of New York’s ceilings. (...)

The charms of standardization eventually wore thin, and the New York apartment soon experienced a transformation almost as fundamental as it had at the turn of the century. It began when the heirs to the cold-water bohemian culture of Greenwich Village drifted south across Houston Street and discovered a zone of gorgeous dereliction. In the sixties and seventies, the industries that had fueled the city’s growth a century earlier were withering, leaving acres of fallow real estate. At first, nobody was permitted to live in those abandoned factories, but the rents were low and the spaces vast, and artists were no more deterred by legal niceties than they were by graffiti, rodents, and flaking paint. They arrived with their drafting tables, their welding torches, movie cameras, and amplifiers. They scavenged furniture, blasted fumes and music into the night, and gloried in the absence of fussy neighbors. They would demarcate a bedroom by hanging an old sheet.

At a time when urban populations everywhere were leaching to the suburbs, this artists’ colonization had a profound and invigorating effect not just on Soho but on the entire city. The traditional remedy for decay was demolition, but artists demanded the right to stay, their presence attracted art galleries, and a treasury of cast-iron buildings acquired a new purpose. Artists didn’t think of themselves as creating real-estate value, but they did. Few events illustrate the maxim “Be careful what you wish for” better than the Loft Law of 1982, which forced owners to make Soho’s industrial buildings fully habitable without charging the tenants for improvements.

It was a triumph and a defeat. Legal clarity brought another wave of tenants, with more money and higher standards of comfort. As working artists drifted on to cheaper pastures in Long Island City, Williamsburg, and Bushwick, Soho’s post-pioneers renovated their lofts, hiring architects to reinterpret the neighborhood’s industrial rawness, or merge it with cool pop minimalism, or carve the ballroom-size spaces into simulacra of uptown apartments.

Once everyone wanted to be a tycoon, then everyone wanted to be middle-class. Now everyone wanted to be an artist, or live like one. Soho filled up quickly, and the idea of the loft spread, reinterpreted as a marketable token of the unconventional life, promising to lift the curse of the bourgeoisie through the powers of renovation. Realtors began pointing out partition walls that could easily be torn out. Lawyers, dentists, and academics eliminated hallways and dining rooms, folding them into unified, flowing spaces. Happily for those with mixed feelings about the counterculture, loftlike expansiveness overlapped with the open-plan aesthetic of new suburban houses. Whether in imitation of Soho or Scarsdale, the apartment kitchen migrated from the servants’ area to the center of the household, shed its confining walls, and put on display its arsenal of appliances and the rituals of food preparation (not to mention the pileup of dirty crockery). Cooking became a social performance, one that in practice many apartment dwellers routinely skipped in favor of ordering in, going out, or defrosting a package—but at least the theater stood ready.

by Justin Davidson, New York Magazine |  Read more:
Photo: Adrian Gaut

Textile Length from a Cover (yutan) for Daimyo Woman’s Trousseau Travel Box (nagamochi) with Design of Family Crests (mon) and Pattern of Flower Diamonds (hanabishi). Japan, Edo period, 17th century

via: LACMA

Is This a Pandemic Being Born?


Here's how it would happen. Children playing along an urban river bank would spot hundreds of grotesque, bloated pig carcasses bobbing downstream. Hundreds of miles away, angry citizens would protest the rising stench from piles of dead ducks and swans, their rotting bodies collecting by the thousands along river banks. And three unrelated individuals would stagger into three different hospitals, gasping for air. Two would quickly die of severe pneumonia and the third would lie in critical condition in an intensive care unit for many days. Government officials would announce that a previously unknown virus had sickened three people, at least, and killed two of them. And while the world was left to wonder how the pigs, ducks, swans, and people might be connected, the World Health Organization would release deliberately terse statements, offering little insight.

It reads like a movie plot -- I should know, as I was a consultant for Steven Soderbergh's Contagion. But the facts delineated are all true, and have transpired over the last six weeks in China. The events could, indeed, be unrelated, and the new virus, a form of influenza denoted as H7N9, may have already run its course, infecting just three people and killing two.

Or this could be how pandemics begin.

On March 10, residents of China's powerhouse metropolis, Shanghai, noticed some dead pigs floating among garbage flotsam in the city's Huangpu River. The vile carcasses appeared in Shanghai's most important tributary of the mighty Yangtze, a 71-mile river that is edged by the Bund, the city's main tourist area, and serves as the primary source of drinking water and ferry travel for the 23 million residents of the metropolis and its millions of visitors. The vision of a few dead pigs on the surface of the Huangpu was every bit as jarring for local Chinese as porcine carcasses would be for French strolling the Seine, Londoners along the Thames, or New Yorkers looking from the Brooklyn Bridge down on the East River.

And the nightmarish sight soon worsened, with more than 900 animal bodies found by sunset on that Sunday evening. The first few pig carcass numbers soon swelled into the thousands, turning Shanghai's spring into a horror show that by March 20 would total more than 15,000 dead animals. The river zigzags its way from Zhejiang province, just to the south of Shanghai, a farming region inhabited by some 54 million people and a major pork-raising district of China. Due to scandals over recent years in the pork industry – including the substitution of a toxic chemical for rendered pig intestines in heparin blood thinner, which proved lethal to American cardiac patients – Chinese authorities had put identity tags on pigs' ears. The pig carcasses were swiftly traced back to key farms in Zhejiang, and terrified farmers admitted that they had dumped the dead animals into the Huangpu.

by Laurie Garrett, Foreign Policy |  Read more: 
China Photos/Getty Images

Mutiny on the Bounty: Alaska Sea Otters in the Crosshairs


Bounties are often proposed as a way to reduce competition between people and animal populations for a limited resource. Alaska is no stranger to bounties. During the 40 years prior to statehood, the Territory of Alaska paid nearly $3 million in bounties for eagles, seals, wolves, coyotes, even Dolly Varden char. Bounties on some species continued after statehood.

Many people have no problem spending public money to “incentivize” their business or occupation. However, placating the demands of a special interest group can have ludicrous results. In “Big Game in Alaska: A History of Wildlife and People” Morgan Sherwood cited a cost-benefit analysis by C. Hart Merriam, who found that Pennsylvania “had spent $90,000 over a period (in the late 1800s) to destroy hawks and owls that killed rodents and other pests and were therefore worth $3.9 million to farmers, all in order to save $1,875 worth of poultry.”

Most professional wildlife managers believe bounties are ineffective. Wolves were eliminated throughout much of the American West, but dedicated government trappers and widespread use of poison accounted for most of the carnage. Coyotes have expanded their range and are more numerous than ever despite more than a century of bounties and other forms of lethal control.

Most bounty schemes fail because they neglect to consider more than one side of the issue or account for human nature. Bounties become a source of income, and fraud is often an issue. For example, when the Territory paid bounties on hair seals, agents required seal flippers for evidence – and they cared little if the flippers were from a target species. The Territory paid $1.2 million in bounties for 358,023 “hair seals” from 1927-1958.

Similarly, the bounty on Dolly Varden was quickly discontinued after the Territory doled out $96,344. Fisheries biologists examined 500 fish tails turned in for the bounty and found only 10 percent were Dolly Varden tails. Most were salmon tails. Bounties can also create an economic disincentive to eradicate or reduce a target species. A prudent bounty collector will leave the breeding population intact so more animals are produced next year.

Nevertheless, bounties seem to be effective on marine mammals. After all, they breathe air, and there aren’t many places to hide on the surface of the ocean. In the November 1915 edition of the Zoological Society Bulletin, C. H. Townsend reported that during the previous two years British Columbia paid bounties of $14,329 on Steller sea lions and hair seals. The province’s bounty fund was exhausted after 2,875 sea lions and 2,987 seals were claimed. Townsend and others believed the harbor seal population along the North Atlantic coast was destroyed through bounties instigated by fishermen.

But who’s to say a sea otter supposedly shot in Southeast Alaska wasn’t taken from Prince William Sound, Kachemak Bay, or elsewhere in Alaska? Fish and Game will be paying bounties on sea otters shot from Ketchikan to Attu Island, including individual animals taken from Southwest Alaska, where the U.S. Fish and Wildlife Service has designated some populations as threatened under the Endangered Species Act. When the funding for bounties is depleted, the program will have had less effect in Southeast Alaska than anticipated.

Townsend, a former chief of the fisheries division of the U.S. Fish Commission, was not amused by the use of bounties. He wrote, “This is the usual procedure with fishermen who may be depended upon to attribute the depletion of fisheries to other causes than the wasteful fishing methods practiced by themselves.”

by Rick Sinnott, Alaska Dispatch |  Read more:
Aaron Jansen illustration

William Ryan Fritch


Diagnosis: Human

The news that 11 percent of school-age children now receive a diagnosis of attention deficit hyperactivity disorder — some 6.4 million — gave me a chill. My son David was one of those who received that diagnosis.

In his case, he was in the first grade. Indeed, there were psychiatrists who prescribed medication for him even before they met him. One psychiatrist said he would not even see him until he was medicated. For a year I refused to fill the prescription at the pharmacy. Finally, I relented. And so David went on Ritalin, then Adderall, and other drugs that were said to be helpful in combating the condition.

In another age, David might have been called “rambunctious.” His battery was a little too large for his body. And so he would leap over the couch, spring to reach the ceiling and show an exuberance for life that came in brilliant microbursts.

As a 21-year-old college senior, he was found on the floor of his room, dead from a mix of alcohol and drugs. The date was Oct. 18, 2011.

No one made him take the heroin and alcohol, and yet I cannot help but hold myself and others to account. I had unknowingly colluded with a system that devalues talking therapy and rushes to medicate, inadvertently sending a message that self-medication, too, is perfectly acceptable.

My son was no angel (though he was to us) and he was known to trade in Adderall, to create a submarket in the drug among his classmates who were themselves all too eager to get their hands on it. What he did cannot be excused, but it should be understood. What he did was to create a market that perfectly mirrored the society in which he grew up, a culture where Big Pharma itself prospers from the off-label uses of drugs, often not tested in children and not approved for the many uses to which they are put.

And so a generation of students, raised in an environment that encourages medication, are emulating the professionals by using drugs in the classroom as performance enhancers.

And we wonder why it is that they use drugs with such abandon. As all parents learn — at times to their chagrin — our children go to school not only in the classroom but also at home, and the culture they construct for themselves as teenagers and young adults is but a tiny village imitating that to which they were introduced as children.

The issue of permissive drug use and over-diagnosis goes well beyond hyperactivity. In May, the American Psychiatric Association will publish its D.S.M.-5, the Diagnostic and Statistical Manual of Mental Disorders. It is called the bible of the profession. Its latest iteration, like those before, is not merely a window on the profession but on the culture it serves, both reflecting and shaping societal norms. (For instance, until the 1970s, it categorized homosexuality as a mental illness.)

One of the new, more controversial provisions expands depression to include some forms of grief. On its face it makes sense. The grieving often display all the common indicators of depression — loss of interest in life, loss of appetite, irregular sleep patterns, low functionality, etc. But as others have observed, those same symptoms are the very hallmarks of grief itself.

by Ted Gup, NY Times |  Read more: 
Image:Keith Negley

The Marvels in Your Mouth

[ed. See also: This NPR Fresh Air interview with Mary Roach (h/t Scott)].

When I told people I was traveling to Food Valley, I described it as the Silicon Valley of eating. At this cluster of universities and research facilities, nearly 15,000 scientists are dedicated to improving — or, depending on your sentiments about processed food, compromising — the quality of our meals.

At the time I made the Silicon Valley comparison, I did not expect to be served actual silicone.

But here I am, in the Restaurant of the Future, a cafeteria at Wageningen University where hidden cameras record diners as they make decisions about what to eat. And here it is, a bowl of rubbery white cubes the size of salad croutons. Andries van der Bilt has brought them from his lab in the brusquely named Department of Head and Neck, at the nearby University Medical Center Utrecht.

“You chew them,” he said.

The cubes are made of a trademarked product called Comfort Putty, more typically used in its unhardened form for taking dental impressions. Dr. Van der Bilt isn’t a dentist, however. He is an oral physiologist, and he likely knows more about chewing than anyone else in the world. He uses the cubes to quantify “masticatory performance” — how effectively a person chews.

I take a cube from the bowl. If you ever, as a child, chewed on a whimsical pencil eraser in the shape of, say, an animal or a piece of fruit, then you have tasted this dish.

“I’m sorry.” Dr. Van der Bilt winces. “It’s quite old.” As though fresh silicone might be better. (...)

Most of the time, while you’re just breathing and not swallowing, the larynx (voice box) blocks the entrance to the esophagus. When a mouthful of food or drink is ready to be swallowed, the larynx has to rise out of the way, both to allow access to the esophagus and to close off the windpipe and prevent the food from “going down the wrong way.”

To allow this to happen, the bolus is held momentarily at the back of the tongue, a sort of anatomical metering light. If, as a result of dysphagia, the larynx doesn’t move quickly enough, the food can head down the windpipe instead. This is, obviously, a choking hazard. More sinisterly, inhaled food and drink can deliver a troublesome load of bacteria. Infection can set in and progress to pneumonia.

A less lethal and more entertaining swallowing misstep is nasal regurgitation. Here the soft palate — home turf of the uvula, that queer little oral stalactite — fails to seal the opening to the nasal cavity. This leaves milk, say, or chewed peas in peril of being horked out the nostrils. Nasal regurgitation is more common with children, because they are often laughing while eating and because their swallowing mechanism isn’t fully developed.

“Immature swallowing coordination” is the reason 90 percent of food-related choking deaths befall children under 5. Also contributing: immature dentition. Children grow incisors before they have molars; for a brief span of time they can bite off pieces of food but cannot chew them.

Round foods are particularly treacherous because they match the shape of the trachea. If a grape goes down the wrong way, it blocks the tube so completely that no breath can be drawn around it. Hot dogs, grapes and round candies take the top three slots in a list of killer foods published in the July 2008 issue of The International Journal of Pediatric Otorhinolaryngology (itself a calamitous mouthful). A candy called Lychee Mini Fruity Gels has killed enough times for the Food and Drug Administration to have banned its import.

by Mary Roach, NY Times |  Read more:
Image: David Plunkert