Thursday, May 10, 2012

HP print cartridges through the ages, a slow-motion ripoff


[ed. Well knock me over with a feather...who would have thought?]

A little work with a handsaw reveals that over the years, the amount of ink in a new, official HP refill cartridge has been in decline. Prices, of course, have not been in decline.
I then removed the top of the cartridge with a handsaw, and as you can see from the picture below, the hydrophobic sponge fills the cartridge completely, just as I would expect for the best part of fifteen quid. Next I took another HP 350, the same cartridge but this time with a 2012 manufacturer’s date, and removed the top in the same way. To be totally honest, I could not believe what I was looking at: the hydrophobic sponge inside the 2012 cartridge is only half the size!
Mmm, I was beginning to smell a rat, as the saying goes. This got me thinking even more, and I started to wonder whether all the newer cartridges are like this. So this time I chopped the top off a new HP 301 cartridge to have a look at the sponge. Surely it can’t be any smaller... or can it? Guess what: the sponge inside the HP 301 is almost 40 percent smaller than in the 2012 HP 350, which means we are actually getting less ink for our money now than ever before. The price isn’t shrinking, though, that’s for sure!
HP Introduces Nano Sponge (via Reddit)

by Cory Doctorow, Boing Boing

Qin Tianzhu
via:

Keith Jarrett & Charlie Haden


[ed. Gorgeous song from a 2007 collaboration that produced the wonderful Jasmine album. This is the only video I could find and there's something odd going on for the first 45 seconds or so (art?) but it smooths out after that. Well worth a listen, especially late at night.]

The Lost Steve Jobs Tapes

If Steve Jobs's life were staged as an opera, it would be a tragedy in three acts. And the titles would go something like this: Act I--The Founding of Apple Computer and the Invention of the PC Industry; Act II--The Wilderness Years; and Act III--A Triumphant Return and Tragic Demise.

The first act would be a piquant comedy about the brashness of genius and the audacity of youth, abruptly turning ominous when our young hero is cast out of his own kingdom. The closing act would plumb the profound irony of a balding and domesticated high-tech rock star coming back to transform Apple far beyond even his own lofty expectations, only to fall mortally ill and then slowly, excruciatingly wither away, even as his original creation miraculously bulks up into the biggest digital dynamo of them all. Both acts are picaresque tales that end with a surge of deep pathos worthy of Shakespeare.

But that second act--The Wilderness Years--would be altogether different in tone and spirit. In fact, the soul of this act would undermine its title, a convenient phrase journalists and biographers use to describe his 1985 to 1996 hiatus from Apple, as if the only meaningful times in Jobs's life were those spent in Cupertino. In fact, this middle period was the most pivotal of his life. And perhaps the happiest. He finally settled down, married, and had a family. He learned the value of patience and the ability to feign it when he lost it. Most important, his work with the two companies he led during that time, NeXT and Pixar, turned him into the kind of man, and leader, who would spur Apple to unimaginable heights upon his return.

Indeed, what at first glance seems like more wandering for the barefoot hippie who dropped out of Reed College to hitchhike around India is in truth the equivalent of Steve Jobs attending business school. In other words, he grew. By leaps and bounds. In every aspect of his being. With a little massaging, this middle act could even be the plotline for a Pixar movie. It certainly fits the simple mantra John Lasseter ascribes to all the studio's successes, from Toy Story to Up: "It's gotta be about how the main character changes for the better."

I had covered Jobs for Fortune and The Wall Street Journal since 1985, but I didn't come to fully appreciate the importance of these "lost" years until after his death last fall. Rummaging through my storage shed, I discovered some three dozen tapes holding recordings of extended interviews--some lasting as long as three hours--that I'd conducted with him periodically over the past 25 years. (Snippets are scattered throughout this story.) Many I had never replayed--a couple hadn't even been transcribed before now. Some were interrupted by his kids bolting into the kitchen as we talked. During others, he would hit the pause button himself before saying something he feared might come back to bite him. Listening to them again with the benefit of hindsight, the ones that took place during that interregnum jump out as especially enlightening.

The lessons are powerful: Jobs matured as a manager and a boss; learned how to make the most of partnerships; found a way to turn his native stubbornness into a productive perseverance. He became a corporate architect, coming to appreciate the scaffolding of a business just as much as the skeletons of real buildings, which always fascinated him. He mastered the art of negotiation by immersing himself in Hollywood, and learned how to successfully manage creative talent, namely the artists at Pixar. Perhaps most important, he developed an astonishing adaptability that was critical to the hit-after-hit-after-hit climb of Apple's last decade. All this, during a time many remember as his most disappointing.

Eleven years is a big chunk of a lifetime. Especially when one's time on earth is cut short. Moreover, many people--particularly creative types--are often at their most prolific during their thirties and early forties. With all the heady success of Apple during Jobs's last 14 years, it's all too easy to dismiss these "lost" years. But in truth, they transformed everything. As I listened again to those hours and hours of tapes, I realized they were, in fact, his most productive.

by Brent Schlender, Fast Company |  Read more:
Illustrations drawn on iPad by Jorge Colombo

The Cooler Me


Statistically speaking, there's probably a cooler you out there. The guy who's actually living that life you'd imagined for yourself before you got married, had a couple of kids, and strapped into that desk job. Maybe he plays in a band, lives in California, wakes up at ten, and surfs before noon. Wherever he is, he's definitely having more fun than you are. What if you could track that guy down? Hang out with his friends? Eat, drink, and sleep according to his responsibility-free schedule? Then you could decide, once and for all, if the cooler life is the better life.

Not too long ago, I was sitting backstage at the Great American Music Hall in San Francisco and drinking beer with my doppelgänger, a 39-year-old singer-songwriter named Kyle Field. Or rather, I was drinking beer by myself while he entertained his fans, most of whom seemed to be half his age. Despite his best efforts, he'd failed to conceal his grizzly good looks. He was very tan, had a big amber beard, and was wearing a sea captain's hat that somehow added to his charm. There are not many grown men out there who can wear a captain's hat and not look like a member of the Village People, but my doppelgänger is one of them. He spotted me through the haze of pot smoke and lifted his beer, and I lifted mine back. We were the oldest people in the room—perhaps the whole club. And yet we'd entered some alternate universe: a Neverland where no one aged or had children or worried about pesky bourgeois things, like brain cells or health insurance.

"When's your bedtime?" he asked later, clapping me on the back.

"Three hours ago," I said.

He laughed. "We're headed to Edinburgh Castle, if you want to come. Last call isn't for thirty minutes."

"I can't remember the last time I shut down a bar," I said.

"I wish I could say the same."

He went off to talk to a blonde woman in the corner, the sexiest of his three backup singers. He was not hitting on her: Rather, they seemed like old friends, goofing around and singing a song I didn't recognize. I'd always dreamed of being a musician in another life, yet a lot of the ones I'd met were lethargic hipsters so cocooned in their coolness they could hardly bring themselves to smile. Even their sneezes seemed like "sneezes." Kyle, on the other hand, was sincere and friendly and strangely innocent, someone who talked about "making out" with women and threw his head back when he laughed. I glanced at the person sitting next to me, a very stoned-looking girl who'd maybe spent a summer or two working with Tibetan orphans.

"How do you know Kyle?" she asked enviously.

"I don't really," I said. "I just met him a month ago. He's my doppelgänger."

"Your what?"

"My double.... You know, the person I might have been."

by Eric Puchner, GQ |  Read more:
Photo: Chris Brooks

Afghanistan: A Gathering Menace


Since 2006 I have written off and on about the wars in Afghanistan and Iraq. Nearly all of my work in those countries has been done embedded with NATO forces, mostly American military units. Many times I have watched soldiers or Marines, driven by boredom or fear, behave selfishly and meanly, even illegally, in minor ways. In a few searing moments I have wondered what would come next, what the men would do to prisoners or civilians or suspected insurgents. And I have wondered how to describe these moments without reporting melodramatic minutiae or betraying the men who allowed me in.

Most soldierly stupidity does not amount to crime; most soldiers never commit atrocities. U.S. soldiers shooting at goats, for example, or pilots getting drunk on base, or guards threatening the lives of prisoners, all things I have seen, defy military rules and erode efforts to win hearts and minds. But how bad is it, really? Do we care? What is my responsibility when I see it? I have never found good ways to write about the subhuman wash of aggression and the small episodes of violence military men and women cycle through daily, or the choices they make in the midst of this.

We tend to ignore such problems unless they are connected to a crime. An editor at a major magazine once dismissed such unsteady subjects by saying, “Yes, but bad things happen everywhere.” Perhaps she was telling me to lighten up. She was also summarizing a national attitude toward the wars. I write about it now because what I witnessed with Destroyer, and other units, routinely and unquietly returns to me.

I joined the platoon last summer at the end of a weeklong mission designed to clear insurgents from a series of towns and valleys in central Afghanistan. In 10 years of war, I was told, NATO troops had never visited the region. Intelligence reports called it a Taliban stronghold, and commanders expected heavy fighting. Going in, many soldiers told me they believed they would die.

Destroyer and several other units had dropped into the valleys by helicopter at night. During the day, they pushed through a sun-killed landscape of rock and withered grasses, where it was Destroyer’s job to search for weapons caches and battle insurgents alongside a wobbly unit of Afghan National Army (ANA) troops.

Each night, the men slept in abandoned qalats (fortified residential compounds), or they moved into occupied ones, handed the residents some cash, and kicked them out. I met the soldiers at a qalat they had temporarily confiscated, a large, newly painted house. Tall walls enclosed a courtyard containing a small orchard, a garden, and a well. Several rooms ran along one wall, and the soldiers had moved into them, sleeping head to foot on floors littered with cigarette packs, candy wrappers, and food scraps. The place was heavy with a scent I would later follow through the night.

by Neil Shea, The American Scholar |  Read more:
Photo: Neil Shea

Skateboarding Past a Midlife Crisis


A grinding sound, somewhere between a rattle and a rumble, erupted over the suburban New Jersey hill. Then figures clad in motorcycle leathers and helmets started to appear, one, two, three, until there were almost 20, crouched on skateboards like a squadron of roller-villains.

One by one, the skaters rounded a turn, dropping a gloved hand to the asphalt as they scraped their wheels in a slide, taking care that nobody crossed into oncoming traffic and found themselves splattered across the grille of a Buick.

At the bottom of the hill, the leader of the Bergen County Bombers, as this gang of skateboarders is known, wrestled the black motorcycle helmet from his head, revealing a mortgage broker with gray flecks in his beard and the crow’s feet that come from decades in the white-collar trenches.

“It’s a total midlife crisis,” said the group’s leader, Tom Barnhart, 47, of Cresskill, N.J., who started skateboarding two years ago for the first time since the Carter administration. His life, he said, had been in a rut. “My kids grew old, so I got a dog. My dog grew old, so I got a skateboard.”

“That was what knocked the cobwebs out of my head,” he added.

Forget the little red sports car: the new symbol for midlife crisis is the skateboard. Graying members of Generation X, and even their older brothers, are reclaiming their youth and rebellious streak by hopping back on a skate deck. Some are even showing off old tricks in the skate park.

It’s the latest gasp for a generation of perma-dudes who listen to Black Flag in their BMWs and trade high-fives in client meetings. It’s a bid to escape the corporate grind, beat back their flagging vigor and even make good on a generational cliché: to extend their adolescence until their federal prescription-drug benefit kicks in.

by Alex Williams, NY Times |  Read more:
Illustration: Tony Cenicola

Beach Boys

Wednesday, May 9, 2012


Tom Wesselmann, Bedroom Painting #76 (1984-93). Oil on canvas on board with working TV.
via:

Too Much Information


We all know there’s a whole lot more information in our worlds than there used to be. As to how much more, well, most of us are pretty clueless.

Here’s a priceless nugget about all that info, compliments of Dave Turek, the guy in charge of supercomputer development at IBM: from the beginning of human history up to the year 2003, we generated, according to IBM’s calculations, five exabytes (that’s five billion gigabytes) of information. By last year, we were cranking out that much data every two days. By next year, predicts Turek, we’ll be doing it every 10 minutes.
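
[ed. For scale, here's a quick back-of-the-envelope sketch of what those figures imply. The arithmetic below is mine, not IBM's; it takes Turek's numbers at face value and assumes, as the article does, that one exabyte is a billion gigabytes.]

```python
# Rough arithmetic on the IBM figures quoted above; a sketch, not IBM's math.
EXABYTE_IN_GB = 1_000_000_000   # 1 exabyte = one billion gigabytes

corpus_gb = 5 * EXABYTE_IN_GB   # "all of human history through 2003"

two_days_s = 2 * 24 * 60 * 60   # last year's pace: 5 EB every two days
ten_minutes_s = 10 * 60         # next year's predicted pace: 5 EB every 10 minutes

print(f"Every two days:   {corpus_gb / two_days_s:,.0f} GB per second")    # ~28,935 GB/s
print(f"Every 10 minutes: {corpus_gb / ten_minutes_s:,.0f} GB per second") # ~8,333,333 GB/s
print(f"Jump between the two paces: {two_days_s // ten_minutes_s}x")       # 288x
```

[ed. In other words, moving from "every two days" to "every 10 minutes" is a 288-fold jump in the rate of data generation, in roughly two years.]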

But how is this possible? How did data become such digital kudzu? Put simply, every time your cell phone sends out its GPS location, every time you buy something online, every time you click the Like button on Facebook, you’re putting another digital message in a bottle. And now the oceans are pretty much covered with them.

And that’s only part of the story. Text messages, customer records, ATM transactions, security camera images…the list goes on and on. The buzzword to describe this is “Big Data,” though that hardly does justice to the scale of the monster we’ve created.

It’s the latest example of technology outracing our capacity to use it. In this case, we haven’t begun to catch up with our ability to capture information, which is why a favorite trope of management pundits these days is that the future belongs to companies and governments that can make sense of all the data they’re collecting, preferably in real time.

by Randy Rieland, Smithsonian |  Read more:
Photo courtesy of Flickr user mrflip

Papercraft Seasons by Lizzie Thomas
via:

Bathers in the Sea
Henri Lebasque
via:

Holding Transcripts Hostage


Students traditionally have a soft spot for their alma maters. But as growing numbers of students run up debt in the high five and even six figures to pay for college, that may change. Especially when they discover their old school is actively blocking them from getting a job or going on to a higher degree.

That's what increasing numbers of students are finding when they try to obtain an official transcript to send to potential employers or graduate admissions offices.  (...)

It turns out many colleges and universities refuse to issue these critical documents if students are in default on student loans, or in many cases, even if they just fall one or two months behind.

It's no accident that colleges are using the withholding of official transcripts to punish students behind in their loan payments. It turns out the federal government encourages the practice. Schools are not required by law to withhold transcripts, but a spokeswoman at the Department of Education confirmed that the department "encourages" them to use the draconian tactic, saying that the policy "has resulted in numerous loan repayments."

It is a strange position for colleges to take, however, since the schools themselves are not owed any money. Student loan funds come from private banks or the federal government. For federal Perkins loans, schools get a pool of federal money to apply to students' financial aid, and if students don't pay, that pool gets smaller. But the creditor is still the government, not the college. And in the case of so-called Stafford loans, schools are not on the hook in any way; they are simply acting as collection agencies, and in fact may get paid for their efforts at collection.  (...)

Andrew Ross, an NYU professor who helped spark the Occupy Student Debt movement in November, says of the no-transcript tactic: "It's worse than indentured servitude. With indentured servitude, you had to pay in order to work, but then at least you got to work. When universities withhold these transcripts, students who have been indentured by loans are being denied even the ability to work or to finish their education so they can repay their indenture."

by Dave Lindorff, LA Times |  Read more:
Photo: LA Times

To the Class of 2012

[ed. Must be that time of year again - the annual lecture.]

Dear Class of 2012:

Allow me to be the first one not to congratulate you. Through exertions that—let's be honest—were probably less than heroic, most of you have spent the last few years getting inflated grades in useless subjects in order to obtain a debased degree. Now you're entering a lousy economy, courtesy of the very president whom you, as freshmen, voted for with such enthusiasm. Please spare us the self-pity about how tough it is to look for a job while living with your parents. They're the ones who spent a fortune on your education only to get you back—return-to-sender, forwarding address unknown.

No doubt some of you have overcome real hardships or taken real degrees. A couple of years ago I hired a summer intern from West Point. She came to the office directly from weeks of field exercises in which she kept a bulletproof vest on at all times, even while sleeping. She writes brilliantly and is as self-effacing as she is accomplished. Now she's in Afghanistan fighting the Taliban.

If you're like that intern, please feel free to feel sorry for yourself. Just remember she doesn't.

Unfortunately, dear graduates, chances are you're nothing like her. And since you're no longer children, at least officially, it's time someone tells you the facts of life. The other facts.

Fact One is that, in our "knowledge-based" economy, knowledge counts. Yet here you are, probably the least knowledgeable graduating class in history.

A few months ago, I interviewed a young man with an astonishingly high GPA from an Ivy League university and aspirations to write about Middle East politics. We got on the subject of the Suez Crisis of 1956. He was vaguely familiar with it. But he didn't know who was president of the United States in 1956. And he didn't know who succeeded that president.

Pop quiz, Class of '12: Do you?

Many of you have been reared on the cliché that the purpose of education isn't to stuff your head with facts but to teach you how to think. Wrong. I routinely interview college students, mostly from top schools, and I notice that their brains are like old maps, with lots of blank spaces for the uncharted terrain. It's not that they lack for motivation or IQ. It's that they can't connect the dots when they don't know where the dots are in the first place.

by Bret Stephens, WSJ |  Read more:

Tuesday, May 8, 2012


It was only

the thin thread of a cloud,
almost transparent,
leading me along the way
like an ancient sacred song

Yosano Akiko

From River of Stars - tr. Sam Hamill & Keiko Matsui Gibson
via:

Current Events: US Attack Kills 5 Afghan Kids

Yesterday, I noted several reports from Afghanistan that as many as 20 civilians were killed by two NATO airstrikes, including a mother and her five children. Today, the U.S. confirmed at least some of those claims, acknowledging and apologizing for its responsibility for the death of that family:
The American military claimed responsibility and expressed regret for an airstrike that mistakenly killed six members of a family in southwestern Afghanistan, Afghan and American military officials confirmed Monday.

The attack, which took place Friday night, was first revealed by the governor of Helmand Province, Muhammad Gulab Mangal, on Monday. His spokesman, Dawoud Ahmadi, said that after an investigation they had determined that a family home in the Sangin district had been attacked by mistake in the American airstrike, which was called in to respond to a Taliban attack. . . . The victims were the family’s mother and five of her children, three girls and two boys, according to Afghan officials.
This happens over and over and over again, and there are several points worth making here beyond the obvious horror:

(1) To the extent these types of incidents are discussed at all — and in American establishment media venues, they are most typically ignored — there are certain unbending rules that must be observed in order to retain Seriousness credentials. No matter how many times the U.S. kills innocent people in the world, it never reflects on our national character or that of our leaders. Indeed, none of these incidents convey any meaning at all. They are mere accidents, quasi-acts of nature which contain no moral information (in fact, the NYT article on these civilian deaths, out of nowhere, weirdly mentioned that “in northern Afghanistan, 23 members of a wedding celebration drowned in severe flash flooding” — as though that’s comparable to the U.S.’s dropping bombs on innocent people). We’ve all been trained, like good little soldiers, that the phrase “collateral damage” cleanses and justifies this and washes it all away: yes, it’s quite terrible, but innocent people die in wars; that’s just how it is. It’s all grounded in America’s central religious belief that the country has the right to commit violence anywhere in the world, at any time, for any cause.

At some point — and more than a decade would certainly qualify — the act of continuously killing innocent people, countless children, in the Muslim world most certainly does reflect upon, and even alters, the moral character of a country, especially its leaders. You can’t just spend year after year piling up the corpses of children and credibly insist that it has no bearing on who you are. That’s particularly true when, as is the case in Afghanistan, the cause of the war is so vague as to be virtually unknowable. It’s woefully inadequate to reflexively dismiss every one of these incidents as the regrettable but meaningless by-product of our national prerogative. But to maintain mainstream credibility, that is exactly how one must speak of our national actions even in these most egregious cases. To suggest any moral culpability, or to argue that continuously killing children in a country we’re occupying is morally indefensible, is a self-marginalizing act, whereby one reveals oneself to be a shrill and unSerious critic, probably even a pacifist. Serious commentators, by definition, recognize and accept that this is merely the inevitable outcome of America’s supreme imperial right, note (at most) some passing regret, and then move on.

(2) Yesterday — a week after it leaked that it was escalating its drone strikes in Yemen — the Obama administration claimed that the CIA last month disrupted a scary plot originating in Yemen to explode an American civilian jet “using a more sophisticated version of the underwear bomb deployed unsuccessfully in 2009.” American media outlets — especially its cable news networks — erupted with their predictable mix of obsessive hysteria, excitement and moral outrage. (...)

Needless to say, the fact that the U.S. has spent years and years killing innocent adults and children in that part of the world — including repeatedly in Yemen — was never once mentioned, even though it obviously is a major factor for why at least some people in that country support these kinds of plots. Those facts are not permitted to be heard. Discussion of causation — why would someone want to attack a U.S. airliner? — is an absolute taboo, beyond noting that the people responsible are primitive and hateful religious fanatics. Instead, it is a simple morality play reinforced over and over: Americans are innocently minding their own business — trying to enjoy our Freedoms — and are being disgustingly targeted with horrific violence by these heinous Muslim Terrorists whom we must crush (naturally, the solution to the problem that there is significant anti-American animosity in Yemen is to drop even more bombs on them, which will certainly fix this problem).

by Glenn Greenwald, Salon |  Read more:

Géza Faragó, Slim Woman with a Cat, 1913.
via:

Too Much Information

[ed. If you've read this blog for a while you know I'm an unabashed DFW fan, so please excuse another review of his posthumous book The Pale King, which I somehow managed to miss the first time around.]

One of the few detectable lies in David Foster Wallace's books occurs in his essay on the obscure '90s-era American tennis prodigy Michael Joyce, included in Wallace's first nonfiction anthology, A Supposedly Fun Thing I'll Never Do Again. Apart from some pages in his fiction, it's the best thing he wrote about tennis—better even than his justly praised but disproportionately famous piece on Roger Federer—precisely because Joyce was a journeyman, an unknown, and so offered Wallace's mind a white canvas. Wallace had almost nothing to work with on that assignment: ambiguous access to the qualifying rounds of a Canadian tournament, a handful of hours staring through chain link at a subject who was both too nice to be entertaining and not especially articulate. Faced with what for most writers would be a disastrous lack of material, Wallace looses his uncanny observational powers on the tennis complex, drawing partly on his knowledge of the game but mainly on his sheer ability to consider a situation, to revolve it in his mental fingers like a jewel whose integrity he doubts. In the mostly empty stadium he studies the players between matches. "They all have the unhappy self-enclosed look of people who spend huge amounts of time on planes and waiting around in hotel lobbies," he writes, "the look of people who have to create an envelope of privacy around them with just their expressions." He hears the "authoritative pang" of tour-tight racket strings and sees ball boys "reconfigure complexly." He hits the practice courts and watches players warm up, their bodies "moving with the compact nonchalance I've since come to recognize in pros when they're working out: the suggestion is one of a very powerful engine in low gear."

The lie comes at the start of the piece, when Wallace points out a potential irony of what he's getting ready to do, namely write about people we've never heard of, who are culturally marginal, yet are among the best in the world at a chosen pursuit. "You are invited to try to imagine what it would be like to be among the hundred best in the world at something," Wallace says. "At anything. I have tried to imagine; it's hard."

What's strange is that this was written in 1996—by then, Wallace had completed his genre-impacting second novel, Infinite Jest, as well as the stories, a couple already considered classic, in the collection Girl with Curious Hair. It's hard to believe he didn't know that he was indeed among the hundred best at a particular thing, namely imaginative prose, and that there were serious people ready to put him among an even smaller number. Perhaps we should assume that, being human, he knew it sometimes and at other times feared it wasn't true. Either way, the false modesty—asking us to accept the idea that he'd never thought of himself as so good and had proposed the experiment naively—can't help reading as odd. Which may itself be deliberate. Not much happens by accident in Wallace's stuff; his profound obsessive streak precluded it. So could it be there's something multilayered going on with sport as a metaphor for writing—even more layers than we expect? It does seem curious that Wallace chose, of all the players, one named Joyce, whose "ethnic" Irishness Wallace goes out of his way to emphasize, thereby alluding to an artist whose own fixation on technical mastery made him a kind of grotesque, dazzling but isolated from healthful, human narrative concerns. Certainly Wallace played textual games on that level.

Here's a thing that is hard to imagine: being so inventive a writer that when you die, the language is impoverished. That's what Wallace's suicide did, two and a half years ago. It wasn't just a sad thing, it was a blow.

It's hard to do the traditional bio-style paragraph about Wallace for readers who, in this oversaturated mediascape, don't know who he was or why he mattered, because you keep flashing on his story "Death Is Not the End," in which he parodies the practice of writing the traditional bio-style paragraph about writers, listing all their honors and whatnot, his list becoming inexplicably ridiculous as he keeps naming the prizes, and you get that he's digging into the frequent self-congratulating silliness of the American literary world, "a Lannan Foundation Fellowship, [...] a Mildred and Harold Strauss Living Award from the American Academy and Institute of Arts and Letters...a poet two separate American generations have hailed as the voice of their generation." Wallace himself had many of the awards on the list, including "a 'Genius Grant' from the prestigious MacArthur Foundation." Three novels, three story collections, two books of essays, the Roy E. Disney Professorship of Creative Writing at Pomona College...

When they say that he was a generational writer, that he "spoke for a generation," there's a sense in which it's almost scientifically true. Everything we know about the way literature gets made suggests there's some connection between the individual talent and the society that produces it, the social organism. Cultures extrude geniuses the way a beehive will make a new queen when its old one dies, and it's possible now to see Wallace as one of those. I remember well enough to know it's not a trick of hindsight, hearing about and reading Infinite Jest for the first time, as a 20-year-old, and the immediate sense of: This is it. One of us is going to try it. The "it" being all of it, to capture the sensation of being alive in a fractured superpower at the end of the twentieth century. Someone had come along with an intellect potentially strong enough to mirror the spectacle and a moral seriousness deep enough to want to in the first place. About none of his contemporaries—even those who in terms of ability could compete with him—can one say that they risked as great a failure as Wallace did.

by John Jeremiah Sullivan, GQ |  Read more:

The Aquarium

There’s a psychological mechanism, I’ve come to believe, that prevents most of us from imagining the moment of our own death. For if it were possible to imagine fully that instant of passing from consciousness to nonexistence, with all the attendant fear and humiliation of absolute helplessness, it would be very hard to live. It would be unbearably obvious that death is inscribed in everything that constitutes life, that any moment of your existence may be only a breath away from being the last. We would be continuously devastated by the magnitude of that inescapable fact. Still, as we mature into our mortality, we begin to gingerly dip our horror-tingling toes into the void, hoping that our mind will somehow ease itself into dying, that God or some other soothing opiate will remain available as we venture into the darkness of non-being.

But how can you possibly ease yourself into the death of your child? For one thing, it is supposed to happen well after your own dissolution into nothingness. Your children are supposed to outlive you by several decades, during the course of which they live their lives, happily devoid of the burden of your presence, and eventually complete the same mortal trajectory as their parents: oblivion, denial, fear, the end. They’re supposed to handle their own mortality, and no help in that regard (other than forcing them to confront death by dying) can come from you—death ain’t a science project. And, even if you could imagine your child’s death, why would you?

But I’d been cursed with a compulsively catastrophic imagination, and had often involuntarily imagined the worst. I used to picture being run over by a car whenever I crossed the street; I could actually see the layers of dirt on the car’s axle as its wheel crushed my skull. When I was stuck on a subway with all the lights out, I’d envision a deluge of fire advancing through the tunnel toward the train. Only after I met Teri did I manage to get my tormentful imagination somewhat under control. And, after our children were born, I learned to quickly delete any vision I had of something horrible happening to them. A few weeks before Isabel’s cancer was diagnosed, I’d noticed that her head seemed large and somewhat asymmetrical, and a question had popped into my mind: What if she has a brain tumor? But I banished the thought almost immediately. Even if you could imagine your child’s grave illness, why would you?

by Aleksandar Hemon, New Yorker |  Read more:
Illustration: Guy Billout