Friday, April 15, 2011

Bunker Mentality

by Arnie Cooper, Wired Magazine


Larry Hall believes in preparing for scenarios that the Man would have you believe are fictional—Mayan disaster prophecies, pole shifts, alien invasions, that sort of thing. So the 54-year-old software engineer shelled out $250,000 for a decommissioned Atlas F Missile Base in Kansas. “I thought, wow, I can transform it into an ultrasafe, energy-efficient fortress,” Hall says. Then he figured that other people might also sleep better 200 feet underground within epoxy-hardened concrete walls. And with a custom retrofit featuring GE Monogram stainless-steel appliances and Kohler fixtures, they could also eat (and flush) in style. So Hall announced a “condo suite package”—starting at $900,000—that includes a five-year food supply (think hydroponics and aquaculture) and “simulated view windows” with light levels calibrated to the time of day to keep you from going crazy. Hall says his silo will have a military-grade security system and electricity powered by geothermal energy and wind turbines, as well as a theater, workout area, and pool with a waterfall. Not a bad place to wait out the apocalypse. Hall is still building this dream silo, but he’s already getting applicants. “When they call me up,” he says, “they’re like, you had me at MISSILE BASE!” With three out of seven floors already spoken for, you’d better get your bid in. You’d hate to be stuck in a moving van when the aliens touch down.

via: 
More bunker culture here and here.

Taxman


Written by George Harrison. The music was inspired by the theme song for the popular 1960s TV series Batman, which was written and originally recorded by the conductor and trumpeter Neal Hefti, and covered by the surf-rock group the Marketts early in 1966 in a version that hit #17 in the US. Harrison was a big fan of the show.

Friday Book Club - Continental Drift

By Michiko Kakutani
NY Times

''THIS is an American story of the late 20th century,'' writes Russell Banks in the Faulknerian invocation that opens ''Continental Drift,'' and this remarkable novel goes on to fulfill that ambitious introduction - in the largest sense. Sweeping in narrative and vivid in its depiction of fragmented, fragmenting lives, ''Continental Drift'' accelerates like a fast, sleek railroad train to its swift conclusion, but Mr. Banks's sure command of plot proves to be only one of many novelistic tools employed in the service of a larger vision.

Like Graham Greene and Robert Stone, Mr. Banks is concerned with moral ambiguities and their consequences on ordinary lives, and his tale of how one man named Bob Dubois went in search of a better life and got in over his head becomes, at once, a visionary epic about innocence and evil and a shattering dissection of contemporary American life.

At 30, Bob Dubois has a wife whom he loves, two daughters and another child on the way. All his life, he's lived in Catamount, N. H., and since high school he's worked as a repairman for the Abenaki Oil Company. ''He stays honest, he doesn't sneak copper tubing or tools into his car at night, he doesn't put in for time he didn't work, he doesn't drink on the job.'' He owns a run-down duplex in a working-class neighborhood, a 13-foot Boston whaler he built from a kit, and a battered Chevrolet station wagon, and he owes the local savings and loan - for the house, the boat and the car - a little over $22,000. ''We have a good life. We do,'' his wife, Elaine, keeps telling him.

Although Catamount may, at first, recall Bedford Falls, the setting of ''It's a Wonderful Life,'' that surface image soon dissolves into another - an image more reminiscent of an Edward Hopper painting. There's something somber, depressed and even vaguely menacing about this community ''closed in by weather and geography, where the men work at jobs and the women work at home and raise children and there's never enough money,'' where ''the men and the women tend to feel angry toward one another much of the time, especially in the evenings when the work is done and the children are sleeping and nothing seems improved over yesterday.''

Bob is no exception. Never having really grown up, he finds it hard to know right from wrong; instead, ''he relies on taboos and circumstances to control his behavior, to make him a 'good man' '' - and lately, he's begun to feel even more confused and disconnected. He hates his humdrum life, feels trapped and angry that none of the dreams he grew up with are likely to come true. He feels there are two Bob Dubois's: the version he's invented for the real world - a man ''who's dutiful, prudent, custodial, faithful and even-tempered'' - and another, secret self - a man who's ''feckless, reckless, irresponsible, faithless and irrational.''

So far, there's not much to distinguish Bob Dubois from the host of disaffected characters who people the fiction of Raymond Carver, Bobbie Ann Mason and Mary Robison: not-so-young survivors of the dislocations of the 60's, afflicted with vague existential doubts and given to drifting, absentmindedly, from day to day. Bob, however, determines to try to make a new life for himself - to start again; and one fine day, he abruptly picks up his family and moves to Florida, where he's soon drawn into partnership with his fast-talking brother, Eddie, and with Ave, a childhood pal who's making a bundle running drugs.

For Mr. Banks, Florida is what California used to be for Raymond Chandler and Nathanael West - a seedy, dangerous place, a magnet for dreamers, entrepreneurs and people with no place else to go. It's the final frontier, where all that's left of the old pioneer spirit is a sort of lawlessness and ''me-first'' individualism, where those willing to play fast and dirty can get rich quick but where other, more tentative folk, like Bob Dubois, see their dreams disintegrating in damp, pastel-colored trailer parks. Bob, in fact, discovers that his life has skidded out of control in Florida. By moving there, he hasn't lassoed the bright future he fantasized about; he's only succeeded in losing his past - the job, the house, everything that once gave his life a modicum of coherence and meaning.

To refugees from the Caribbean, however, Florida still represents the promised land, the tip of the American dream, its palm trees whispering ''luxury and power.'' And in a series of alternating takes that counterpoint the story of Bob Dubois, Mr. Banks tells the tale of a young Haitian woman named Vanise, who literally risks everything to get to Miami. Because Vanise's inner life is never delineated with the care lavished on Bob's, the reader sometimes feels the author straining to use her as a metaphor for the yet unspoiled immigrant dream. All the same, the collision between her life and Bob's is so powerfully orchestrated that it takes on the terrible inevitability of real life, and it lingers in our mind long after we finish the novel.

One of the reasons ''Continental Drift'' possesses such emotional resonance is that Mr. Banks makes the tenuousness of contemporary life - our fears of not being able to hold onto our dreams and protect the people we love - seem entirely palpable, a by-product of our individual failings and our susceptibility to all the changes wrought by recent history's manic metabolic rate. While the scope of ''Continental Drift'' is huge - the author wants to do nothing less than capture American life as it exists today - it remains, somehow, acutely personal; in the story of Bob Dubois's sad, brief life, we catch a frightening glimpse of our own mortality.

via:

I Hate It When This Happens


[No, not Richard Dreyfuss's house]

The Management Myth

by Matthew Stewart

During the seven years that I worked as a management consultant, I spent a lot of time trying to look older than I was. I became pretty good at furrowing my brow and putting on somber expressions. Those who saw through my disguise assumed I made up for my youth with a fabulous education in management. They were wrong about that. I don’t have an M.B.A. I have a doctoral degree in philosophy—nineteenth-century German philosophy, to be precise. Before I took a job telling managers of large corporations things that they arguably should have known already, my work experience was limited to part-time gigs tutoring surly undergraduates in the ways of Hegel and Nietzsche and to a handful of summer jobs, mostly in the less appetizing ends of the fast-food industry.

The strange thing about my utter lack of education in management was that it didn’t seem to matter. As a principal and founding partner of a consulting firm that eventually grew to 600 employees, I interviewed, hired, and worked alongside hundreds of business-school graduates, and the impression I formed of the M.B.A. experience was that it involved taking two years out of your life and going deeply into debt, all for the sake of learning how to keep a straight face while using phrases like “out-of-the-box thinking,” “win-win situation,” and “core competencies.” When it came to picking teammates, I generally held out higher hopes for those individuals who had used their university years to learn about something other than business administration.

After I left the consulting business, in a reversal of the usual order of things, I decided to check out the management literature. Partly, I wanted to “process” my own experience and find out what I had missed in skipping business school. Partly, I had a lot of time on my hands. As I plowed through tomes on competitive strategy, business process re-engineering, and the like, not once did I catch myself thinking, Damn! If only I had known this sooner! Instead, I found myself thinking things I never thought I’d think, like, I’d rather be reading Heidegger! It was a disturbing experience. It thickened the mystery around the question that had nagged me from the start of my business career: Why does management education exist?

Management theory came to life in 1899 with a simple question: “How many tons of pig iron bars can a worker load onto a rail car in the course of a working day?” The man behind this question was Frederick Winslow Taylor, the author of The Principles of Scientific Management and, by most accounts, the founding father of the whole management business.

Read more:

This Tech Bubble Is Different

Tech bubbles happen, but we usually gain from the innovation left behind. This one—driven by social networking—could leave us empty-handed

by Ashlee Vance

As a 23-year-old math genius one year out of Harvard, Jeff Hammerbacher arrived at Facebook when the company was still in its infancy. This was in April 2006, and Mark Zuckerberg gave Hammerbacher—one of Facebook's first 100 employees—the lofty title of research scientist and put him to work analyzing how people used the social networking service. Specifically, he was given the assignment of uncovering why Facebook took off at some universities and flopped at others. The company also wanted to track differences in behavior between high-school-age kids and older, drunker college students. "I was there to answer these high-level questions, and they really didn't have any tools to do that yet," he says.

Over the next two years, Hammerbacher assembled a team to build a new class of analytical technology. His crew gathered huge volumes of data, pored over it, and learned much about people's relationships, tendencies, and desires. Facebook has since turned these insights into precision advertising, the foundation of its business. It offers companies access to a captive pool of people who have effectively volunteered to have their actions monitored like so many lab rats. The hope—as signified by Facebook's value, now at $65 billion according to research firm Nyppex—is that more data translate into better ads and higher sales.

After a couple years at Facebook, Hammerbacher grew restless. He figured that much of the groundbreaking computer science had been done. Something else gnawed at him. Hammerbacher looked around Silicon Valley at companies like his own, Google (GOOG), and Twitter, and saw his peers wasting their talents. "The best minds of my generation are thinking about how to make people click ads," he says. "That sucks."

You might say Hammerbacher is a conscientious objector to the ad-based business model and marketing-driven culture that now permeates tech. Online ads have been around since the dawn of the Web, but only in recent years have they become the rapturous life dream of Silicon Valley. Arriving on the heels of Facebook have been blockbusters such as the game maker Zynga and coupon peddler Groupon. These companies have engaged in a frenetic, costly war to hire the best executives and engineers they can find. Investors have joined in, throwing money at the Web stars and sending valuations into the stratosphere. Inevitably, copycats have arrived, and investors are pushing and shoving to get in early on that action, too. Once again, 11 years after the dot-com-era peak of the Nasdaq, Silicon Valley is reaching the saturation point with business plans that hinge on crossed fingers as much as anything else. "We are certainly in another bubble," says Matthew Cowan, co-founder of the tech investment firm Bridgescale Partners. "And it's being driven by social media and consumer-oriented applications."

Read more: 

image credit:

Thursday, April 14, 2011

America's Two-Tiered Justice System

By Glenn Greenwald, Salon Magazine

[The two-tiered justice system, an illustration: Attorney General Eric Holder speaks at a news conference at the Justice Department in Washington, Monday, April 4, 2011. AP Photo/Jacquelyn Martin]

Of all the topics on which I've focused, I've likely written most about America's two-tiered justice system -- the way in which political and financial elites now enjoy virtually full-scale legal immunity for even the most egregious lawbreaking, while ordinary Americans, especially the poor and racial and ethnic minorities, are subjected to exactly the opposite treatment: the world's largest prison state and most merciless justice system.

That full-scale destruction of the rule of law is also the topic of my forthcoming book. But The New York Times this morning has a long article so perfectly illustrating what I mean by "two-tiered justice system" -- and the way in which it obliterates the core covenant of the American Founding: equality before the law -- that it's impossible for me not to highlight it.

The article's headline tells most of the story: "In Financial Crisis, No Prosecutions of Top Figures." It asks: "why, in the aftermath of a financial mess that generated hundreds of billions in losses, have no high-profile participants in the disaster been prosecuted?" And it recounts that not only have no high-level culprits been indicted (or even subjected to meaningful criminal investigations), but few have suffered any financial repercussions in the form of civil enforcements or other lawsuits. The evidence of rampant criminality that led to the 2008 financial crisis is overwhelming, but perhaps the clearest and most compelling such evidence comes from longtime Wall Street servant Alan Greenspan; even he was forced to acknowledge that much of the precipitating conduct was "certainly illegal and clearly criminal" and that "a lot of that stuff was just plain fraud."

Despite that clarity and abundance of the evidence proving pervasive criminality, it's entirely unsurprising that there have been no real criminal investigations or prosecutions. That's because the overarching "principle" of our justice system is that criminal prosecutions are only for ordinary rabble, not for those who are most politically and financially empowered. We have thus created precisely the two-tiered justice system against which the Founders most stridently warned and which contemporary legal scholars all agree is the hallmark of a lawless political culture. Lest there be any doubt about that claim, just consider the following facts and events:

Read more:

Picasso's Muse

by John Richardson 

A major new exhibition at the Gagosian Gallery tracks the affair between Picasso and Marie-Thérèse Walter, who became his mistress at 17, bore him a child, and committed suicide after his death, 50 years after they met. John Richardson tells the love story behind Walter’s encoded appearances in some of the 20th century’s most important artworks, including Picasso’s anti-war masterpiece, Guernica.

Left, Marie-Thérèse Leaning on One Elbow, 1939 (oil on canvas, 25 1/2 in. by 18 1/8 in.). Right, a 1933 photograph by Cecil Beaton of Picasso with Nude, Green Leaves and Bust, which sold for $106.5 million in 2010. Photographs: left, © 2011 Estate of Pablo Picasso/Artists Rights Society (ARS), New York; right, from the Cecil Beaton Studio Archive, Sotheby’s; for details, go to vf.com/credits.

Marie-Thérèse Walter is the subject of “Picasso and Marie-Thérèse: L’Amour Fou,” a major exhibition opening at the Gagosian Gallery on West 21st Street, in New York, this month. Marie-Thérèse was Picasso’s love and principal muse from the time he came upon her—she was 17, he was 45—outside the Galeries Lafayette department store, in Paris, in January 1927, until 1941. Art historian Diana Widmaier-Picasso, Marie-Thérèse’s granddaughter, who is preparing a catalogue raisonné of Picasso’s sculptures, has made this retrospective possible. As the guest curator, she has been instrumental in obtaining rarely seen works as well as archival material from the Picasso family and loans from important collections and museums.

Marie-Thérèse was an easygoing but respectable bourgeois girl who lived in Maisons-Alfort, a suburb southeast of Paris, with her mother and two sisters. She was at the Galeries Lafayette that day to buy a col Claudine—a Peter Pan collar—and matching cuffs. “You have an interesting face,” Picasso told her. “I would like to do a portrait of you. I am Picasso.” The name meant nothing to Marie-Thérèse, but the fact that an artist found her beautiful thrilled her.

Although she always claimed to have resisted Picasso for six months, she was sleeping with him a week later. They needed to be very discreet, for she was six months under the conventional age of consent. The absence of a legitimate father facilitated Picasso’s seduction of the girl. At first, her mother made a show of parental propriety, but soon she was welcoming her daughter’s seducer as a friend. “Pic,” she and the girls called him, and she allowed him to use a shed in her garden to paint in and be alone with Marie-Thérèse.

The first time Marie-Thérèse went to the artist’s studio on Rue la Boétie (January 11, 1927), on the floor above the apartment he shared with his wife, Picasso did little more than observe her face and body very closely. As she left, he told her to come back the following day. “From then on it would always be tomorrow; and I had to tell my mother that I had got a job,” she later said. “He told me that I had saved his life, but I had no idea what he meant.” She had indeed saved him: from the psychic stress of his marriage.

Read more:

El Guincho | Bombay


EL GUINCHO | Bombay from MGdM | Marc Gómez del Moral on Vimeo.

[ed.  Love this video.  Great beats, hot babes, wildly inventive narrative and video editing.  What's not to like?]

900 Miles

Scientists at the National Oceanic and Atmospheric Administration Vents Program at Pacific Marine Environmental Laboratory and Oregon State University didn't feel the massive earthquake that struck off Japan on March 11. But they did hear it.

An underwater microphone located near the Aleutian Islands of Alaska, 900 miles from the quake epicenter, captured the sound of the disaster on tape, and a portion of the recording has now been put up on YouTube.  The recording has been sped up 16 times. First comes the roar of the earthquake sounds "propagating through the earth's crust," then you hear a second roar of the sounds "propagating through the ocean."

Listen:

via:

Trouble@Twitter

Boardroom power plays, disgruntled founders, and CEO switcheroos are clipping the wings of this tech high flier.

by Jessi Hempel

In March, shortly after Jack Dorsey went back to work for Twitter, the company he co-founded four years ago, he did a Q&A session with an entrepreneurship class at Columbia Business School. As students tapped away on their laptops (were they sending tweets?), Dorsey, 34, answered questions about his commitment to his new gig as Twitter's product chief. Dorsey, after all, is also CEO of Square, a hot payments business, and he returns to Twitter after a rocky run as its CEO -- the board demoted him in 2008. (Co-founder Evan Williams took over and held the job for almost two years; then operating chief Dick Costolo assumed the top job.) "Seems like a revolving door," mused the interviewer.

Dorsey laughed lightly and replied, "You know, we're just individuals. We're just humans running these companies." And he compared managing a startup to, of all things, supervising a theater company.

There's no shortage of drama at Twitter these days: Besides the CEO shuffles, there are secret board meetings, executive power struggles, a plethora of coaches and consultants, and disgruntled founders. (Like Williams. The day after Dorsey announced his return to the company -- via tweet, naturally -- Williams quit his day-to-day duties at the company, although he remains a board member and Twitter's largest shareholder, with an estimated 30% to 35% stake.) These theatrics, which go well beyond the usual angst at a new venture, have contributed to a growing perception that innovation has stalled and management is in turmoil at one of Silicon Valley's most promising startups, which some 20 million active users rely on each month for updates on everything from subway delays to election results -- and which a growing number of companies, big and small, seek to use to market themselves and track customers.

Just two years ago Twitter was the hottest thing on the web. But in the past year U.S. traffic at Twitter.com, the site users visit to read and broadcast 140-character messages, has leveled off. Nearly half the people who have Twitter accounts are no longer active on the network, according to an ExactTarget report from January 2011. It has been months -- an eternity in Silicon Valley -- since the company rolled out a new product that excited consumers. Facebook's Mark Zuckerberg used to watch developments at Twitter obsessively; now he pays much less attention to the rival service. Meanwhile companies are hungry to advertise, but Twitter hasn't been able to provide marketers with enough opportunities. Last year the company pulled in a mere $45 million in ad revenue, according to research firm eMarketer. Facebook brought in $1.86 billion.

Twitter doesn't lack talented engineers, potential paying customers, or loyal users -- and it certainly has plenty of money in the bank: It has raised more than $360 million from such heavyweights as Jeff Bezos and Kleiner Perkins. The problem is a board and top executive team that don't always appear to have control of its wide-ranging cast of characters, including founders who have attained near-celebrity status (another co-founder, Biz Stone, is a regular on NPR, and earlier this year Dorsey was profiled in Vanity Fair), headstrong and divisive managers, and investors used to getting their way. For some time Twitter's runaway growth -- in the first half of 2009, Twitter added more users more quickly than almost any web service in history -- masked its execution problems. But now, with growth of traffic to its site slowing and its rivals beefing up (new social-media darling Groupon has raised more than $1 billion, and Facebook has been on a hiring spree), Twitter needs to get its act together or risk losing buzz, potential ad revenue, and its bright future too.

To be fair, Twitter's founders didn't set out to build the next Facebook: Consumers turned it into a social phenomenon and kept signing on to see what it was about. Dorsey, Stone, and Williams started the service as an experimental side project; it was never designed to accommodate the 200 million–plus registered accounts worldwide it now hosts. Twitter crashed so frequently in its early days that its "fail whale" logo that signaled the service was down became a cultural icon emblazoned on ironic hipster T-shirts.

Read more:

Don't Wear Fur

Jean-Charles de Castelbajac

Mad Science

by Mark McClusky 

The perfect french fry—golden brown, surpassingly crispy on the outside, with a light and fluffy interior that tastes intensely of potato—is not easy to cook.

Here’s how most people do it at home: Cut some potatoes into fry shapes—classic 3/8-inch batons—and toss them into 375-degree oil until they’re golden brown. This is a mediocre fry. The center will be raw.

Here’s how most restaurants do it: Dunk the potatoes in oil twice, once at 325 degrees for about four minutes until they’re cooked through and then again at 375 degrees to brown them. This is a pretty great fry.

But let’s get serious. The chef Heston Blumenthal—owner of the Fat Duck restaurant in Bray, England, holder of three Michelin stars—created what he calls triple-cooked chips. (He’s English.) The raw batons are simmered in water until they almost fall apart and then placed on a wire rack inside a vacuum machine that pulls out the moisture. The batons then get the traditional double fry. You need an hour and a $2,000 vacuum chamber, but these are the best fries in the world. Or rather, they used to be.

The new contender was created by Nathan Myhrvold, the former CTO of Microsoft. Myhrvold cuts his potatoes into batons and rinses them to get rid of surface starch. Then he vacuum-seals them in a plastic bag, in one even layer, with water. He heats the bag to 212 degrees for 15 minutes, steaming the batons. Then he hits the bag with ultrasound to cavitate the water—45 minutes on each side. He reheats the bag in an oven to 212 degrees for five minutes, puts the hot fries on a rack in a vacuum chamber, and then blanches them in 338-degree oil for three minutes. When they’re cool, Myhrvold deep-fries the potatoes in oil at 375 degrees until they’re crisp, about three more minutes, and then drains them on paper towels. Total preparation time: two hours.

The result is amazing. The outside nearly shatters when you bite into it, yielding to a creamy center that’s perfectly smooth. The key is the cavitation caused by the ultrasonic bath—it creates thousands of tiny fissures on the potato’s surface, all of which become crunchy when it’s fried. When Plato saw the shadow of a french fry on the wall of his cave, the guy standing behind him was snacking on these.

The recipe is one of 1,600 in Myhrvold's new cookbook, Modernist Cuisine. It's a big book -- 2,400 pages big. Six volumes big. Big like the original slipcase failed Amazon.com's shipping tests and had to be replaced with acrylic. Big like it weighs nearly 50 pounds and costs $625.

This is the way Myhrvold operates. After leaving Microsoft with all the money in the world, he started a company called, immodestly, Intellectual Ventures and turned his attention to busting some of the biggest problems in science and technology. And he dove into a few hobbies. Now most of us, if we were to get interested in cooking, might start to putter around the kitchen at home or do a little reading. Maybe we’d take a class. Because cooking is primarily a craft, dominated by artisans—or artists, if that’s how you view what a chef does. Every once in a while, a chemist drops in to take a look or heads for the world of industrial-scale food.

But Myhrvold—a theoretical physicist and computer scientist—has the lifestyle flexibility of a multimillionaire and the mental discipline of a world-class researcher. To him, cooking is about fundamental interactions in the material world: How heat enters food. How you mix two separate materials most effectively. How water molecules interact in a solution. You see a pork chop and some mashed potatoes; he sees a mesh of proteins that coagulate at a specific temperature next to an emulsion of starch and fat. “Chefs think about what it’s like to make food,” Myhrvold says. “Being a scientist in the kitchen is about asking why something works, and how it works.” To him, a kitchen is really just a laboratory that everyone has in their house. And when you have that attitude with that brain and those resources, well, you might not be the best cook in the world, but you just might put together the best cookbook.

Read more:

Infinite Attention

by Matt Feeney 

It seems a telling sign of our technology-angst that we're getting nostalgic for, of all things, boredom. I have memories of youthful boredom that are as vivid and unpleasant as the memories I harbor of my more serious sports injuries, and yet, when I read of some new research saying the brain needs boredom, or kids today aren't bored enough, my first thought is: Ah, blessed boredom. (My second thought is: Check email.) And it's not just me. A trickle of pro-boredom research has inspired a flood of pro-boredom sentiment.

On one hand, defending boredom seems stern and unsympathetic, like a Depression-born mom impatient with her complaining children. (Hi, Mom.) But the Depression-era parent urged a kind of stoicism, bearing up against fake or minor suffering as a moral lesson of childhood. For today's middle-agers, relishing the image of a teenager thrown into fidgets by a dead cellphone, boredom is not merely fake suffering. It's important in its own right, a state of latent fertility. It leads to creativity. The contemporary defender of boredom is not a stoic. She's a graying humanist, the martinet as art teacher.

From this I would like to advance a claim that might come off as either loony or pedantic or just obvious: Our ready nostalgia for boredom shows how deeply our culture—both our actual cultural products and our default ideas about how they happen and what they're for—remains rooted in the Romantic movement that spanned the late 18th and mid-19th centuries. Today's technology-anxious and pro-boredom pathos grows from well-wrought Romantic conceptions of freedom, aesthetic experience, artistic creation, and, indeed, technology. The Romantics, seeing the encroaching haste of commerce and industrial production, and people living on a clock set by money and machines, envisioned modes of experience that might partake of a more humane slowness. From Kant's Critique of Judgment (sometimes called the founding text of German Romanticism), which describes aesthetic pleasure as a "purposeless play of the faculties," to Thoreau's solitary puttering around Walden Pond, Romanticism saw people finding moments of freedom through withdrawal and retreat. In this process, we slow ourselves down to experience beauty, and, through this beauty, we might experience a deeper part of ourselves. Or vice versa.

And perhaps, spurred by some natural beauty we encounter in our retreat, we might create artistic beauty. The vague image in the back of the mind of our reflexive defender of boredom, whether or not this person has read a word of Wordsworth, is a guy sitting by himself in a field, surrounded by a host of golden daffodils, letting his mind wander lonely as a cloud, and then recollecting, in this moment of tranquility, the other host of golden daffodils he saw earlier that day, which he plans to write a poem about, or maybe paint a picture of. That, anyway, is the vague image in the back of my mind when I read about the neurological virtues of boredom. I'm something of a Romantic, by inclination and academic training. When I think of human flourishing, the freedom called "positive" by Isaiah Berlin, I tend to think of aesthetic experience and culture. I imagine people slowing down to enjoy high-quality television, turning inward to think, and maybe, depending on how noisy and hasty things have gotten in the real world, dropping out altogether, picking up and moving to, like, a pond.

I own up to my Romantic leanings, and I'm prepared to defend the decadence and blasé politics they suggest. But if there's anything that makes me regret or question this position, it's the mournful late work of David Foster Wallace, especially the posthumous fictional writing compiled as The Pale King, which is basically a 538-page monument against Romanticism. Dropping out and turning inward and dawdling in lovely otherness do not arise as alternatives in The Pale King. What Wallace offers instead is a humbling challenge for us to give the fallen world, and the fallen people who live in it, a heroic measure of simple attention.

The Pale King feels heroic and humbling because (besides the light cast upon it by the author's own life and death) Wallace actually shares the Romantics' pessimism about the fate of humans stuck within inhuman systems. Indeed, he paints an even grimmer picture of this predicament than they do. Technology and commerce are more soul-killing in his fictional universe. They were an advancing threat for the Romantics. In The Pale King they have simply won, on every level. Their predominance has rendered itself banal. The book's main setting is an IRS outpost in Peoria, Illinois, where humans process tax returns in the stunned and passive attitudes of feed-lot cattle, and where what counts as public art is a huge photorealist mosaic of a 1978 IRS Form 1040.

Read more:

Gold Fish

In one corner of Alaska's Bristol Bay, the sockeye salmon, a $300 million resource that's sustained fishermen like 29-year-old captain Lindsey Bloom for more than 100 years. In the other, the Pebble Mine, with its projected hundreds of billions in copper and gold. Get ready for the fiercest wilderness rumble since ANWR.

by Tim Sohn

"We got 30 minutes!" It was 1:30 P.M., half an hour before the start of the commercial salmon-fishing season in Bristol Bay, Alaska, and I was sitting on the flying bridge of the F/V Erika Leigh with Lindsey Bloom, 29, one of perhaps a dozen female captains out of the 1,500 or so who convene here every June to chase one of the largest sockeye salmon runs in the world. Bloom, five foot two, her curly red hair pulled back into a ponytail, was hustling her three-man crew through final preparations. "How you guys doing down there?" she shouted to the back deck. It seemed less like a question than an order to keep moving.

Around us bobbed hundreds of similar 32-foot boats, each guarding its own small patch of gray-brown water, with two or three crewmen busily readying themselves to do battle with the fish. The captains eyed one another and jockeyed for position, a coiled-spring pregame tension hanging over it all. "Barely managed chaos is the only way to describe it," one of Bloom's crewmen said.

At the center of this, Bloom was focused, her mouth set in a slight scowl, but obviously enjoying herself. It was her sixth season at the helm of the Erika Leigh, but she'd been fishing the bay for 13 years, learning the ropes from her father, Art, who has worked these waters for nearly two decades. "I've gained some confidence knowing I can hang with the big boys," she said. But the start of the season is always a nervy time. She was scanning the nearby swells for signs of the 40 million salmon that manage to find their way from the Pacific back to this corner of southwest Alaska every year, where they return to their natal streams to spawn and die.

"Check the net and the reel!" Bloom yelled, and two of her crewmen spun the large wheel amidships, smoothly spooling a 900-foot net to ensure it would pay out cleanly when the time came.

Fluttering above us on a radio antenna was a white pennant printed with the words PEBBLE MINE crossed through by a bright-red X, a reference to the Pebble prospect, a massive lode of copper, gold, and molybdenum ore worth hundreds of billions of dollars. In a potentially unfortunate coincidence for the salmon of Bristol Bay and the small constellation of people who rely on them for their livelihood, the proposed mine sits at the headwaters of the two main river systems—the Nushagak and the Kvichak—that feed into the bay and provide the spawning habitat and fish nursery. The hundred miles of open tundra between where we floated and where the copper and gold sit in the ground is perhaps the largest intact salmon habitat left in the world, supporting five species of the fish, as well as a remarkable ecosystem full of moose and caribou, brown bears and grizzlies, eagles and wolverines, and trophy rainbow trout. The 40,000-square-mile watershed encompasses two national parks—Lake Clark and Katmai—and is home to two dozen native communities, many of which still rely on subsistence fishing and hunting for a large part of their diet.

"Southwest Alaska," as a fisherman told me, "is one of the last undeveloped, pristine places left in our country."

Read more:

The Destruction of Money


That cash in your wallet won't last forever, so what happens to it when it needs to be replaced?

by Daniel Indiviglio

Think about money being created. A furiously spinning printing press might come to mind. Now imagine money being destroyed. Do you think of a three-story shredder, a bonfire, a wide blue recycling bin?

You might have noticed that it's pretty hard to find cash in circulation that was printed much earlier than the 1990s. Just as more money is constantly being created, it's also constantly being destroyed. Who are the destroyers of money, and how do they do it?

To explain money destruction, we first have to define it. Are we talking about money being eliminated entirely, its very presence disappearing from the economy? Or about currency being physically destroyed but replaced with newer, crisper bills? Let's consider both questions.

When Money Disappears

You probably know that the Federal Reserve controls the money supply, the technical term for the amount of money in the economy. When the money supply expands, money flows into the financial system. When the money supply contracts, money drains out of the financial system. But how does the money actually disappear?

In 2010, 2.6 billion $1 bills were destroyed.

Read more:

The Pen Gets Mightier

by James Fallows, The Atlantic

Every few years, an invention appears that makes all previous life seem backward. The digital camera is an obvious example. I might be nostalgic for old albums full of glossy prints. But the idea that it could take days before you saw how a photo had “turned out,” that you could snap only so many pictures before the roll of film was full, that the only way to share pictures was through the mail—these assumptions are hard even to imagine now.

For my own workaday purposes, the most useful recent invention has been the Livescribe Pulse pen, which I bought just after its introduction early last year and now can hardly be without. It looks like a somewhat bulky, cigar-shaped metallic writing instrument. Inside, it contains a high-end audio recording system and assorted computer circuitry.

When you turn it on, it starts recording what you are hearing—and also matches what is being said, instant by instant (in fact, using photos it takes 72 times per second), with notes or drawings that you’re making in a special Livescribe notebook. The result is a kind of indexing system for an audio stream. If a professor is explaining a complex equation during a lecture, you write “equation,” or anything else—and later when you click on that term, either in the original notebook or on images of the pages transferred to your computer screen, it plays back that exact part of the discussion. (Works on both Macs and PCs.) For me this means instant access to the three interesting sentences—I just write “interesting!” in the notebook or put a star—in the typical hour-long journalistic interview. The battery lasts for several full days’ use between recharges, and the pen can hold dozens of hours of recordings.
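Livescribe's actual data formats and software are proprietary, but the core trick described above, pairing each pen stroke with the audio offset at which it was made, amounts to a simple timestamp index. A rough sketch (all class and method names here are hypothetical, purely for illustration):

```python
import bisect

class NotebookIndex:
    """Toy sketch: each pen stroke is stored alongside the audio
    offset (seconds into the recording) at which it was written."""

    def __init__(self):
        self.offsets = []   # audio offsets, kept in sorted order
        self.strokes = []   # the text written at each offset

    def add_stroke(self, audio_offset, text):
        # While recording, strokes arrive in time order; bisect keeps
        # the index sorted even if offsets are inserted out of order.
        i = bisect.bisect(self.offsets, audio_offset)
        self.offsets.insert(i, audio_offset)
        self.strokes.insert(i, text)

    def playback_position(self, text):
        """Return the audio offset of the first stroke matching `text`,
        i.e. where playback should jump when that note is tapped."""
        for offset, stroke in zip(self.offsets, self.strokes):
            if stroke == text:
                return offset
        return None

# Usage: index an hour-long interview, then jump straight to the
# moment the reporter scribbled "interesting!" in the margin.
idx = NotebookIndex()
idx.add_stroke(120.5, "equation")
idx.add_stroke(1834.2, "interesting!")
print(idx.playback_position("interesting!"))  # 1834.2
```

Tapping a written word in the notebook (or clicking it on screen) then reduces to a lookup in this index followed by a seek in the audio file.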

Pens cost $129 and up, depending on capacity, and some 400,000 have been sold—two of them to me; I lost my first one in a taxi in Beijing. This summer Livescribe introduced a new model, the Echo, with a slightly thinner, futuristic-looking design and some new features, like the ability to transmit marked-up notebook pages over a computer network, so they serve as a shared “whiteboard” for remote meetings.

Through the years, the real reason I have liked writing about technological innovations has been having an excuse to talk with the people behind the devices or programs—to hear about what problems they were trying to solve, what nonobvious challenges they faced in making their gadget work, how they became inventors and entrepreneurs. In Livescribe’s case the man behind the technology is Jim Marggraff, an MIT-trained engineer in his early 50s who, like nearly all such people I have interviewed, radiates uncontrollable parental joy at his creation. Some famous tech titans are forever associated with one company—Jobs with Apple, Gates and Ballmer with Microsoft. Marggraff is of the serial-entrepreneur camp. He was one of several founders of the very successful networking company StrataCom, acquired by Cisco for $4.5 billion. More recently he was an executive with LeapFrog, whose LeapPad interactive books, which respond with voices when children touch words or pictures, are (according to Marggraff) in three-quarters of all U.S. households with small children. Many voices featured in the books are those of Marggraff and his son.

Marggraff’s ongoing fascination has been with technology’s effect on how we think, learn, and communicate. He designed a talking-globe device because, he says, he was appalled at how little Americans knew about world geography. He believes that future improvements in melding visual and audio information will help teachers teach, students learn, and groups collaborate. In creating the Livescribe pen, he delved into cognitive psychology—including a study showing that people are disproportionately influenced by gee-whiz features they can show off to their friends. The pen has several. For instance: a translator that lets you write out “One coffee, please” and have the results read out in Mandarin, Spanish, etc. A calculator that lets you write out a math problem and click on it for the answer. And what I think of as a notebook orchestra: you sketch a crude grid representing eight keys on a piano and it becomes a music synthesizer, letting you tap out tunes and hear them “played” by piano, steel drums, or other instruments.

“People like to be amazed,” Marggraff said as he played a tune. Yes we do.

via:

Bruno Catalano

via:
source: