Sunday, April 29, 2012


Alfred Wertheimer: Hold Me Tight (1956) 
via:

Saturday, April 28, 2012

Most Likely to Succeed

Predicting success in football and teaching.

One of the most important tools in contemporary educational research is “value added” analysis. It uses standardized test scores to look at how much the academic performance of students in a given teacher’s classroom changes between the beginning and the end of the school year. Suppose that Mrs. Brown and Mr. Smith both teach a classroom of third graders who score at the fiftieth percentile on math and reading tests on the first day of school, in September. When the students are retested, in June, Mrs. Brown’s class scores at the seventieth percentile, while Mr. Smith’s students have fallen to the fortieth percentile. That change in the students’ rankings, value-added theory says, is a meaningful indicator of how much more effective Mrs. Brown is as a teacher than Mr. Smith.
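[ed. The arithmetic behind that example is simple enough to sketch. Below is a toy illustration in Python, not the statistical model researchers actually use (real value-added models adjust for student background, measurement error, and classroom composition); the rosters and scores are hypothetical.]

```python
def value_added_score(september_percentiles, june_percentiles):
    """Crude value-added measure: the average change in students' percentile
    rank between the start and the end of the school year."""
    changes = [june - sept for sept, june in zip(september_percentiles, june_percentiles)]
    return sum(changes) / len(changes)

# Hypothetical classes mirroring the Brown/Smith example: both start at the
# fiftieth percentile in September.
mrs_brown = value_added_score([50, 50, 50, 50], [71, 68, 72, 69])
mr_smith = value_added_score([50, 50, 50, 50], [41, 38, 42, 39])

print(f"Mrs. Brown: {mrs_brown:+.1f} percentile points")  # roughly +20
print(f"Mr. Smith:  {mr_smith:+.1f} percentile points")   # roughly -10
```

As the article stresses, one year of such numbers is a crude and noisy measure; the signal emerges only after following the same teachers across several cohorts.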

It’s only a crude measure, of course. A teacher is not solely responsible for how much is learned in a classroom, and not everything of value that a teacher imparts to his or her students can be captured on a standardized test. Nonetheless, if you follow Brown and Smith for three or four years, their effect on their students’ test scores starts to become predictable: with enough data, it is possible to identify who the very good teachers are and who the very poor teachers are. What’s more—and this is the finding that has galvanized the educational world—the difference between good teachers and poor teachers turns out to be vast.

Eric Hanushek, an economist at Stanford, estimates that the students of a very bad teacher will learn, on average, half a year’s worth of material in one school year. The students in the class of a very good teacher will learn a year and a half’s worth of material. That difference amounts to a year’s worth of learning in a single year. Teacher effects dwarf school effects: your child is actually better off in a “bad” school with an excellent teacher than in an excellent school with a bad teacher. Teacher effects are also much stronger than class-size effects. You’d have to cut the average class almost in half to get the same boost that you’d get if you switched from an average teacher to a teacher in the eighty-fifth percentile. And remember that a good teacher costs as much as an average one, whereas halving class size would require that you build twice as many classrooms and hire twice as many teachers.

Hanushek recently did a back-of-the-envelope calculation about what even a rudimentary focus on teacher quality could mean for the United States. If you rank the countries of the world in terms of the academic performance of their schoolchildren, the U.S. is just below average, half a standard deviation below a clump of relatively high-performing countries like Canada and Belgium. According to Hanushek, the U.S. could close that gap simply by replacing the bottom six per cent to ten per cent of public-school teachers with teachers of average quality. After years of worrying about issues like school funding levels, class size, and curriculum design, many reformers have come to the conclusion that nothing matters more than finding people with the potential to be great teachers. But there’s a hitch: no one knows what a person with the potential to be a great teacher looks like. The school system has a quarterback problem.

by Malcolm Gladwell, The New Yorker |  Read more:
Illustration: Joost Swarte

Leonard Cohen


[ed. An abbreviated, but still very nice version of Hallelujah (for reasons known only to Mr. Cohen). Also, listen to Jeff Buckley's cover of the full song.]

Lyrics

The Limits to Environmentalism

[ed. If you find this article interesting be sure to read the comments section. I'm a firm believer that economic growth and good environmental stewardship are quite compatible, if you start with good design. Many degraded environments can be restored simply by correcting elements of bad design already in place. In other words, want new solutions? Stop creating (debating and accepting) old problems. As one commenter notes: "One of the places that environmentalism would be unrecognizable vis-a-vis the 70s is in architecture, landscape architecture, and urban planning. Perhaps landscape architecture especially – all fields concerned with modernity and technology. Today, there is a dynamic and creative embrace of technology in these fields." (see previous post on Communities for People)]

If you were cryogenically frozen in the early 1970s, like Woody Allen was in Sleeper, and brought back to life today, you would obviously find much changed about the world.

Except environmentalism and its underlying precepts. That would be a familiar and quaint relic. You would wake up from your Rip Van Winkle period and everything around you would be different, except the green movement. It’s still anti-nuclear, anti-technology, anti-industrial civilization. It still talks in mushy metaphors from the Aquarius age, cooing over Mother Earth and the Balance of Nature. And most of all, environmentalists are still acting like Old Testament prophets, warning of a plague of environmental ills about to rain down on humanity.

For example, you may have heard that a bunch of scientists produced a landmark report that concludes the earth is destined for ecological collapse, unless global population and consumption rates are restrained. No, I’m not talking about the UK’s just-published Royal Society report, which, among other things, recommends that developed countries put a brake on economic growth. I’m talking about that other landmark report from 1972, the one that became a totem of the environmental movement.

I mention the 40-year-old Limits to Growth book in connection with the new Royal Society report not just to point up their Malthusian similarities (which Mark Lynas flags here), but also to demonstrate what a time warp the collective environmental mindset is stuck in. Even some British greens have recoiled in disgust at the outdated assumptions underlying the Royal Society’s report. Chris Goodall, author of Ten Technologies to Save the Planet, told the Guardian: “What an astonishingly weak, cliché ridden report this is…’Consumption’ to blame for all our problems? Growth is evil? A rich economy with technological advances is needed for radical decarbonisation. I do wish scientists would stop using their hatred of capitalism as an argument for cutting consumption.”

Goodall, it turns out, is exactly the kind of greenie (along with Lynas) I had in mind when I argued last week that only forward-thinking modernists could save environmentalism from being consigned to junkshop irrelevance. I juxtaposed today’s green modernist with the backward-thinking “green traditionalist,” who I said remained wedded to environmentalism’s doom and gloom narrative and resistant to the notion that economic growth was good for the planet. Modernists, I wrote, offered the more viable blueprint for sustainability:

“Pro-technology, pro-city, pro-growth, the green modernist has emerged in recent years to advance an alternative vision for the future. His mission is to remake environmentalism: Strip it of outdated mythologies and dogmas, make it less apocalyptic and more optimistic, broaden its constituency. In this vision, the Anthropocene is not something to rail against, but to embrace. It is about welcoming that world, not dreading it. It is about creating a future that environmentalists will help shape for the better.”

by Keith Kloor, Discover Magazine |  Read more:

Earth to Ben Bernanke

When the financial crisis struck in 2008, many economists took comfort in at least one aspect of the situation: the best possible person, Ben Bernanke, was in place as chairman of the Federal Reserve.

Bernanke was and is a fine economist. More than that, before joining the Fed, he wrote extensively, in academic studies of both the Great Depression and modern Japan, about the exact problems he would confront at the end of 2008. He argued forcefully for an aggressive response, castigating the Bank of Japan, the Fed’s counterpart, for its passivity. Presumably, the Fed under his leadership would be different.

Instead, while the Fed went to great lengths to rescue the financial system, it has done far less to rescue workers. The U.S. economy remains deeply depressed, with long-term unemployment in particular still disastrously high, a point Bernanke himself has recently emphasized. Yet the Fed isn’t taking strong action to rectify the situation.

The Bernanke Conundrum — the divergence between what Professor Bernanke advocated and what Chairman Bernanke has actually done — can be reconciled in a few possible ways. Maybe Professor Bernanke was wrong, and there’s nothing more a policy maker in this situation can do. Maybe politics are the impediment, and Chairman Bernanke has been forced to hide his inner professor. Or maybe the onetime academic has been assimilated by the Fed Borg and turned into a conventional central banker. Whichever account you prefer, however, the fact is that the Fed isn’t doing the job many economists expected it to do, and a result is mass suffering for American workers.

What the Fed Can Do

The Federal Reserve has a dual mandate: price stability and maximum employment. It normally tries to meet these goals by moving short-term interest rates, which it can do by adding to or subtracting from bank reserves. If the economy is weak and inflation is low, the Fed cuts rates; this makes borrowing attractive, stimulates private spending and, if all goes well, leads to economic recovery. If the economy is strong and inflation is a threat, the Fed raises rates; this discourages borrowing and spending, and the economy cools off.

Right now, the Fed believes that it’s facing a weak economy and subdued inflation, a situation in which it would ordinarily cut interest rates. The problem is that rates can’t be cut further. When the recession began in 2007, the Fed started slashing short-term interest rates and kept cutting until November 2008, when they bottomed out near zero, where they remain to this day. And that was as far as the Fed could go, because (some narrow technical exceptions aside) interest rates can’t go lower. Investors won’t buy bonds if they can get a better return simply by putting a bunch of $100 bills in a safe. In other words, the Fed hit what’s known in economic jargon as the zero lower bound (or, alternatively, became stuck in a liquidity trap). The tool the Fed usually fights recessions with had reached the limits of its usefulness.

by Paul Krugman, NY Times |  Read more:
Illustration by Kelsey Dake
Industrial Farming. Almería Province, Spain

On the arid plains of southern Spain, produce is grown under the world's largest array of greenhouses and trucked north. Greenhouses use water and nutrients efficiently and produce all year—tomatoes in winter, for instance. But globally the challenge is grain and meat, not tomatoes. It takes 38 percent of Earth's ice-free surface to feed seven billion people today, and two billion more are expected by 2050.

From the essay: Enter the Anthropocene - Age of Man

Photo: Edward Burtynsky
via: National Geographic

Friday, April 27, 2012

Talking Heads


Lost on the Gene Map

A tiny dot of DNA, thousands of times smaller than a pinhead, exists in almost every cell of our bodies. Stored in its tightly wound double helix is the wisdom of nearly four billion years of evolution — the hereditary information that decides our hair colour, whether we might stutter, or if we have the potential to win an Olympic gold medal. Human DNA is typically divided into forty-six chromosomes, twenty-three inherited from each parent; the DNA on one chromosome includes hundreds, sometimes thousands, of genes. These gene segments of DNA (deoxyribonucleic acid) encode data that the cell expresses as proteins to build and operate the various parts of the body. The seven billion faces in the world, all different, reveal individual differences in our genetic makeup. But so much of our collective DNA is the same that we share a common genetic heritage: the human genome.

To comprehend genomes is to begin to unlock the mysteries of life. One of the aims of the Human Genome Project, an international research program launched in 1990, was to map and then sequence every bit of DNA in a composite human genome. The project was heralded as the first step toward personalized medicine, a new age in health care when prevention and treatment of illnesses would be guided by examining a person’s genome and genetic predispositions. Understandably, expectations for the Human Genome Project ran high, and in 1996 President Bill Clinton glowingly foretold a not-too-distant future in which parents, armed with a map of their newborn’s genetic structure, could identify the risks for illness. In his vision, the fruits of the project would help “organize the diet plan, the exercise plan, the medical treatment that would enable untold numbers of people to have far more full lives.”

When the HGP was completed in 2003, that vision was still out of reach. Thanks to technological advances, it’s now on the horizon. The expense of genomic sequencing is falling fast; in Canada today it costs $10,000 to sequence an individual genome. “Once a whole genome costs $1,000 or less, entire families will get their genomes sequenced,” says Michael Hayden, director of the Centre for Molecular Medicine and Therapeutics at the University of British Columbia. “But what will they do with that information?” Whole-genome sequencing generates enormous amounts of raw data that must be analyzed by highly qualified medical geneticists and genetic counsellors, both in short supply (Canada has about eighty medical geneticists and 230 genetic counsellors). “DNA Sequencing Caught in Deluge of Data,” ran one recent headline in the New York Times, reflecting a common view that modern medicine doesn’t yet have the expertise to tell us what this data means, much less how to act on it.  (...)

As the demand for whole-genome sequencing grows, so will profits, but the big money in personalized medicine will come from the development of treatments. Progress to date has been slow and confined to monogenic diseases such as Huntington’s, whose origin lies in a mutation on a single gene inherited from one parent. Because monogenic diseases are relatively rare, sequencing the genomes of those affected generates a manageable amount of data. Yet only 10 percent of monogenic diseases have yielded to treatment. On the other hand, multigenic disorders, such as cancer, diabetes, or Alzheimer’s, result from a complex interplay of genetic mutations and environmental factors. A given mutation on a person’s genome may not necessarily express as a malignant disease, so identifying the probability of a multigenic disease is extremely challenging. Traditional indicators such as family history, diet, and lifestyle may still be far more predictive than genetic testing for individual risk.

Compounding the problem, the bodily pathway of a multigenic disorder is complex and difficult to trace, and each person’s metabolism responds in a highly idiosyncratic way to the conditions that cause disease. To discover how individuals’ systems respond to the genetic risk for a multigenic disease requires comparing data gathered from the genomes of thousands of test subjects, ideally involving research findings and tissue samples from bio-banks worldwide. And once potential treatments for these disorders are identified, they require long-term clinical trials.

Convincing governments and other funders to support these kinds of initiatives rather than searching for a magic bullet to cure a disease such as cancer presents a challenge. “Getting population cohort studies launched in Canada is very difficult,” says Tom Hudson, president and scientific director of the Ontario Institute for Cancer Research. “It’s less sexy than funding basic human genome research.” Hudson has made consulting with clinicians and assessing their requirements a high priority. “We need to turn the question around,” he says. “We have to identify the medical need and make sure our research programs create paths to address those clinical questions. It’s like starting a puzzle from the end.”

More problematic is the reality that the human genome is still a vast catalogue of the unknown and scarcely known. The Human Genome Project’s most startling finding was that human genes, as currently defined, make up less than 2 percent of all the DNA on the genome, and that the total number of genes is relatively small. Scientists had predicted there might be 80,000 to 140,000 human genes, but the current tally is fewer than 25,000 — as one scientific paper put it, somewhere between that of a chicken and a grape. The remaining 98 percent of our DNA, once dismissed as “junk DNA,” is now taken more seriously. Researchers have focused on introns, which lie in the gaps between the coding segments of genes and may play a crucial role in regulating gene expression by switching genes on and off in response to environmental stimuli.

by Mark Czarnecki, The Walrus |  Read more:
Illustration by Alain Pilon

How (and Why) Athletes Go Broke

What the hell happened here? Seven floors above the iced-over Dallas North Tollway, Raghib (Rocket) Ismail is revisiting the question. It's December, and Ismail is sitting in the boardroom of Chapwood Investments, a wealth management firm, his white Notre Dame snow hat pulled down to his furrowed brow.

In 1991 Ismail, a junior wide receiver for the Fighting Irish, was the presumptive No. 1 pick in the NFL draft. Instead he signed with the CFL's Toronto Argonauts for a guaranteed $18.2 million over four years, then the richest contract in football history. But today, at a private session on financial planning attended by eight other current or onetime pro athletes, Ismail, 39, indulges in a luxury he didn't enjoy as a young VIP: hindsight.

"I once had a meeting with J.P. Morgan," he tells the group, "and it was literally like listening to Charlie Brown's teacher." The men surrounding Ismail at the conference table include Angels outfielder Torii Hunter, Cowboys wideout Isaiah Stanback and six former pros: NFL cornerback Ray Mickens and fullback Jerald Sowell (both of whom retired in 2006), major league outfielder Ben Grieve and NBA guard Erick Strickland ('05), and linebackers Winfred Tubbs ('00) and Eugene Lockhart ('92). Ismail ('02) cackles ruefully. "I was so busy focusing on football that the first year was suddenly over," he says. "I'd started with this $4 million base salary, but then I looked at my bank statement, and I just went, What the...?"

Before Ismail can elaborate on his bewilderment—over the complexity of that statement and the amount of money he had already lost—eight heads are nodding, eight faces smiling in sympathy. Hunter chimes in, "Once you get into the financial stuff, and it sounds like Japanese, guys are just like, 'I ain't going back.' They're lost."

At the front of the room Ed Butowsky also does a bobblehead nod. Stout, besuited and silver-haired, Butowsky, 47, is a managing partner at Chapwood and a former senior vice president at Morgan Stanley. His bailiwick as a money manager has long been billionaires, hundred-millionaires and CEOs—a club that, the Steinbrenners' pen be damned, still doesn't include many athletes. But one afternoon six years ago Butowsky was chatting with Tubbs, his neighbor in the Dallas suburb of Plano, and the onetime Pro Bowl player casually described how money spills through athletes' fingers. Tubbs explained how and when they begin earning income (often in school, through illicit payments from agents); how their pro salaries are invested (blindly); and when the millions evaporate (before they know it).

"The details were mind-boggling," recalls Butowsky, who would later hire Tubbs to work in business development at Chapwood. "I couldn't believe what I was hearing."

What happens to many athletes and their money is indeed hard to believe. In this month alone Saints alltime leading rusher Deuce McAllister filed for bankruptcy protection for the Jackson, Miss., car dealership he owns; Panthers receiver Muhsin Muhammad put his mansion in Charlotte up for sale on eBay a month after news broke that his entertainment company was being sued by Wachovia Bank for overdue credit-card payments; and penniless former NFL running back Travis Henry was jailed for nonpayment of child support.

In a less public way, other athletes from the nation's three biggest and most profitable leagues—the NBA, NFL and Major League Baseball—are suffering from a financial pandemic. Although salaries have risen steadily during the last three decades, reports from a host of sources (athletes, players' associations, agents and financial advisers) indicate that:

• By the time they have been retired for two years, 78% of former NFL players have gone bankrupt or are under financial stress because of joblessness or divorce.

• Within five years of retirement, an estimated 60% of former NBA players are broke.

by Pablo S. Torre, Sports Illustrated |  Read more:
Image via: Balls and Seeds

The Bravest Woman in Seattle


[ed. Winner of this year's Pulitzer Prize for feature writing.]

The prosecutor wanted to know about window coverings. He asked: Which windows in the house on South Rose Street, the house where you woke up to him standing over you with a knife that night—which windows had curtains that blocked out the rest of the world and which did not?

She answered the prosecutor's questions, pointing to a map of the small South Park home she used to share with her partner, Teresa Butz, a downtown Seattle property manager. When the two of them lived in this house, it was red, a bit run-down, much loved, filled with their lives together, typical of the neighborhood. Now it was a two-dimensional schematic, State's Exhibit 2, set on an easel next to the witness stand. She narrated with a red laser pointer for the prosecutor and the jury: These windows had curtains that couldn't be seen through. These windows had just a sheer fabric.

Would your silhouettes have been visible through that sheer fabric at night?

Probably. She didn't know for sure. When she and her partner lived in the house, she noted, "I didn't spend a lot of time staring in my own windows."

Everyone in the courtroom laughed a small laugh—a laugh of nervous relief, because here was a woman testifying about her own rape, and the rape and murder of her partner, and yet she was smiling at the current line of questioning, at the weird perceptual cul-de-sac to which it led. She appeared to understand why people might need to hear these answers, though. What happened to her and Butz in that house in the early morning hours of July 19, 2009, is hard to comprehend. A juror, in order to ease into the reality of what occurred, might first need to imagine how the man picked these two women. At least, then, there'd be some sort of arc to the story.

Maybe he stalked them, looked in their windows, decided they would be his victims. A young South Park girl named Diana Ramirez had already told the court that the man looked familiar. "His eyes," Ramirez said. The prosecutor had also pointed out that the women only had a partial fence in their backyard, the yard where they liked to sit on warm evenings, staring at the sky above the South Park Community Center and the trees in the large surrounding park. It would have been easy for the man to approach their home, unseen, through this park at night.

Maybe he'd noticed the women around the neighborhood during the day, both attractive, both shorter than him, working in their front yard, or attending a local festival, or heading to and from their favorite bar, Loretta's. That July it was unusually hot. Butz, a brown-haired dynamo raised in much hotter St. Louis summers, thought it ridiculous to install air conditioning in Seattle, the court was told. Maybe the man saw that these women were keeping some windows open at night.

Maybe he also saw their love for each other, noticed it in silhouette or on a sidewalk, a love that was exploding that summer, making them inseparable, a love that had grown into plans for a commitment ceremony that fall. Maybe he realized he could turn that love against them, mercilessly, use it to control them in their own home, each subdued by the threat that he would kill the other.

They were two and he was one. But maybe he saw that, in a sense, they were one. He was six feet tall, 200 pounds, muscled. He would have two knives with him. Maybe, looking through one of their windows, he thought that if it did become a fight, the numbers would be on his side.

by Eli Sanders, The Stranger |  Read more:
Illustration: Aaron Bagley

The Never-to-Be Bride


Ours was a love affair that knew its finest hours on a screen. Dan and I could plan the next 50 years in a two-hour online conversation.

Maybe we were able to sketch our future so easily because we didn’t think we’d ever see it. In television dramas, I can tell when a wedding won’t go as planned; the clue is when a character rehearses his or her vows before the ceremony. That’s the sign that we, the audience, won’t be hearing them later; what’s worse is the dramatic irony of knowing what one real-life never-to-be bride or groom will never get to say.

Years ago, during one of my “off” periods with Dan, when I was feeling devastated by our being “off,” I went to a Buddhist-type therapist in San Francisco who tried an experimental therapy. He said he used this therapy on 9/11 survivors — guiding them through what would happen if the worst happened — to get them to the other side of their greatest fear. And I thought: how dramatic, how creepy, to use this therapy on me, just another heartbroken girl.

I sat on his big soft couch and stared at a painting of mountains and coyotes. He asked me to hold vibrating paddles, one in each hand, and close my eyes. He controlled the intensity of the vibration, and all I did was squeeze while he asked me to imagine Dan’s future wedding to a woman who wasn’t me.

With the cream-colored paddles surging, the therapist asked questions about the ceremony.

“What does Dan look like walking down the aisle?”

“He looks happy.”

“What are you doing while Dan is getting married?”

“I’m writing a novel.”

“What is the novel about?”

“It is about a lost man,” I said, because I didn’t know what else to say.

by Elissa Bassist, NY Times |  Read more:
Illustration: Brian Rea

Thursday, April 26, 2012

Communities for People

[ed. Well worth watching, just for the before and after pictures of what livable communities should look like, and how easily they can be achieved with careful planning.]



Dan Burden has spent more than 35 years helping the world get “back on its feet” and his efforts have not only earned him the first-ever lifetime-achievement awards issued by the New Partners for Smart Growth and the Association of Pedestrian and Bicycle Professionals, but in 2001, Dan was named by TIME magazine as “one of the six most important civic innovators in the world.”  Also that year, the Transportation Research Board of the National Academy of Sciences honored Dan by making him their Distinguished Lecturer.  In 2009, a user’s poll by Planetizen named Dan as one of the Top 100 Urban Thinkers of all time.  Early in his career, starting in 1980, Dan served for 16 years as the country’s first statewide Bicycle and Pedestrian Coordinator for the Florida Department of Transportation and that program became a model for other statewide programs in the United States.  In 1996, Dan sought to expand his reach and ability to really change the world, so he and his wife Lys co-founded a non-profit organization called Walkable Communities.  Since then, Dan has personally helped 3,500 communities throughout the world become more livable and walkable.

Walkable and Liveable Communities Institute

The A/B Test: Inside the Technology That’s Changing the Rules of Business


Dan Siroker helps companies discover tiny truths, but his story begins with a lie. It was November 2007 and Barack Obama, then a Democratic candidate for president, was at Google’s headquarters in Mountain View, California, to speak. Siroker—who today is CEO of the web-testing firm Optimizely, but then was a product manager on Google’s browser team—tried to cut the enormous line by sneaking in a back entrance. “I walked up to the security guard and said, ‘I have to get to a meeting in there,’” Siroker recalls. There was no meeting, but his bluff got him in.

At the talk, Obama fielded a facetious question from then-CEO Eric Schmidt: “What is the most efficient way to sort a million 32-bit integers?” Schmidt was having a bit of fun, but before he could move on to a real question, Obama stopped him. “Well, I think the bubble sort would be the wrong way to go,” he said—correctly. Schmidt put his hand to his forehead in disbelief, and the room erupted in raucous applause. Siroker was instantly smitten. “He had me at ‘bubble sort,’” he says. Two weeks later he had taken a leave of absence from Google, moved to Chicago, and joined up with Obama’s campaign as a digital adviser.
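[ed. For readers wondering what the right answer looks like: bubble sort makes roughly n² comparisons, on the order of a trillion for a million items, while a least-significant-digit radix sort handles fixed-width integers in a handful of linear passes. The Python sketch below is only an illustration of that point, not anything Obama or Google actually specified.]

```python
def radix_sort_u32(values):
    """LSD radix sort for 32-bit unsigned integers: four passes, one byte at a time.
    Runs in linear time, versus the ~10^12 comparisons bubble sort would need
    for a million items."""
    for shift in range(0, 32, 8):
        buckets = [[] for _ in range(256)]
        for v in values:
            buckets[(v >> shift) & 0xFF].append(v)
        values = [v for bucket in buckets for v in bucket]
    return values

if __name__ == "__main__":
    import random
    data = [random.getrandbits(32) for _ in range(1_000_000)]
    assert radix_sort_u32(data) == sorted(data)
```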

At first he wasn’t sure how he could help. But he recalled something else Obama had said to the Googlers: “I am a big believer in reason and facts and evidence and science and feedback—everything that allows you to do what you do. That’s what we should be doing in our government.” And so Siroker decided he would introduce Obama’s campaign to a crucial technique—almost a governing ethos—that Google relies on in developing and refining its products. He showed them how to A/B test.

Over the past decade, the power of A/B testing has become an open secret of high-stakes web development. It’s now the standard (but seldom advertised) means through which Silicon Valley improves its online products. Using A/B, new ideas can be essentially focus-group tested in real time: Without being told, a fraction of users are diverted to a slightly different version of a given web page and their behavior compared against the mass of users on the standard site. If the new version proves superior—gaining more clicks, longer visits, more purchases—it will displace the original; if the new version is inferior, it’s quietly phased out without most users ever seeing it. A/B allows seemingly subjective questions of design—color, layout, image selection, text—to become incontrovertible matters of data-driven social science.
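[ed. The mechanics are easy to sketch. The toy Python below is only an illustration of the idea, not Optimizely's or Google's actual machinery: divert a deterministic fraction of users to variant B, then run a two-proportion z-test to decide whether B's conversion rate is genuinely better. All names and numbers here are hypothetical.]

```python
import random
from statistics import NormalDist

def assign_variant(user_id, experiment="donate_button", treatment_fraction=0.10):
    """Deterministically send a fraction of users to variant B; the rest see A."""
    random.seed(f"{experiment}:{user_id}")   # same user always gets the same page
    return "B" if random.random() < treatment_fraction else "A"

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Is B's conversion rate significantly different from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical results: 90,000 visitors saw version A, 10,000 saw version B.
z, p = two_proportion_z_test(conv_a=4_500, n_a=90_000, conv_b=560, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # small p: keep B; otherwise phase it out quietly
```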

by Brian Christian, Wired |  Read more:
Photo: Spencer Higgins; Illustration: Si Scott

1960: Marilyn Monroe in Reno, captured by Eve Arnold. This picture was taken during the filming of The Misfits, directed by John Huston.
via:

Brevity and the Soul

[ed. One of the commenters to this essay mentions Raymond Carver, a master of brevity, and provides a link to one of his stories Popular Mechanics.]

There is the apocryphal story in which Hemingway, sitting in a bar somewhere in Key West, is asked by an antagonistic admirer to follow his minimalism to its logical outcome and to tell a story in six words.  As the story goes, Hemingway picks up a napkin and writes out the following words:

For sale: baby shoes, never worn.

This is a pretty good story. The reader has to kind of inhabit it and fill in all that is unsaid (which is pretty much everything), but there’s an inexhaustible sadness there in the spaces between the words.  Everything pared away until there’s almost nothing left. The iceberg theory of fiction.

The genre of short short fiction (or microfiction, or whatever one might want to call it) is itself kinda small, and little of it is worth reading. But there are exceptions.

There’s this: “Sticks,” by George Saunders, perhaps the greatest super short story I’ve ever read:


Sticks


Originally published in Story, Winter 1995.

Every year Thanksgiving night we flocked out behind Dad as he dragged the Santa suit to the road and draped it over a kind of crucifix he'd built out of a metal pole in the yard. Super Bowl week the pole was dressed in a jersey and Rod's helmet and Rod had to clear it with Dad if he wanted to take the helmet off. On the Fourth of July the pole was Uncle Sam, on Veteran’s Day a soldier, on Halloween a ghost. The pole was Dad's only concession to glee. We were allowed a single Crayola from the box at a time. One Christmas Eve he shrieked at Kimmie for wasting an apple slice. He hovered over us as we poured ketchup saying: good enough good enough good enough. Birthday parties consisted of cupcakes, no ice cream. The first time I brought a date over she said: what's with your dad and that pole? and I sat there blinking.

We left home, married, had children of our own, found the seeds of meanness blooming also within us. Dad began dressing the pole with more complexity and less discernible logic. He draped some kind of fur over it on Groundhog Day and lugged out a floodlight to ensure a shadow. When an earthquake struck Chile he lay the pole on its side and spray painted a rift in the earth. Mom died and he dressed the pole as Death and hung from the crossbar photos of Mom as a baby. We'd stop by and find odd talismans from his youth arranged around the base: army medals, theater tickets, old sweatshirts, tubes of Mom's makeup. One autumn he painted the pole bright yellow. He covered it with cotton swabs that winter for warmth and provided offspring by hammering in six crossed sticks around the yard. He ran lengths of string between the pole and the sticks, and taped to the string letters of apology, admissions of error, pleas for understanding, all written in a frantic hand on index cards. He painted a sign saying LOVE and hung it from the pole and another that said FORGIVE? and then he died in the hall with the radio on and we sold the house to a young couple who yanked out the pole and the sticks and left them by the road on garbage day.

Here there is an entire novel’s worth of intrigue and emotional complexity and backstory and difficult familial relationships and unhappinesses and losses and redemptions.  One can’t help but think of all those homes run by inexpressive and angry fathers who know something of love’s austere offices, these homes that suddenly erupt in holiday decorations that go waaay beyond the normal or expected.  Rudolphs and Santas and baby Jesuses and lights and holly all over the place.  This phenomenon…the phenomenon of the middle-to-lower-class father who has no creative outlet but finds an avenue in his front yard…this is an important aspect of contemporary life in the U.S., and one that needs more examination.  There are dissertations here.  And Saunders’ story is a most excellent jumping-off point.

Then there is David Foster Wallace’s remarkable “A Radically Condensed History of Postindustrial Life.”

When they were introduced, he made a witticism, hoping to be liked. She laughed extremely hard, hoping to be liked. Then each drove home alone, staring straight ahead, with the very same twist to their faces.

The man who’d introduced them didn’t much like either of them, though he acted as if he did, anxious as he was to preserve good relations at all times. One never knew, after all, now did one now did one now did one.

I don’t think I’ve ever fully fathomed this one, but the final repetition of “now did one now did one now did one” is wildly suggestive.  It seems to suggest something of the radical uncertainty of what it means to live in a world where everyone is wearing a face to meet the faces on the street.

by Tom Jacobs, 3 Quarks Daily |  Read more: