Thursday, April 26, 2012

Communities for People

[ed. Well worth watching, just for the before and after pictures of what livable communities should look like, and how easily they can be achieved with careful planning.]



Dan Burden has spent more than 35 years helping the world get “back on its feet” and his efforts have not only earned him the first-ever lifetime-achievement awards issued by the New Partners for Smart Growth and the Association of Pedestrian and Bicycle Professionals, but in 2001, Dan was named by TIME magazine as “one of the six most important civic innovators in the world.”  Also that year, the Transportation Research Board of the National Academy of Sciences honored Dan by making him their Distinguished Lecturer.  In 2009, a user’s poll by Planetizen named Dan as one of the Top 100 Urban Thinkers of all time.  Early in his career, starting in 1980, Dan served for 16 years as the country’s first statewide Bicycle and Pedestrian Coordinator for the Florida Department of Transportation and that program became a model for other statewide programs in the United States.  In 1996, Dan sought to expand his reach and ability to really change the world, so he and his wife Lys co-founded a non-profit organization called Walkable Communities.  Since then, Dan has personally helped 3,500 communities throughout the world become more livable and walkable.

Walkable and Liveable Communities Institute

The A/B Test: Inside the Technology That’s Changing the Rules of Business


Dan Siroker helps companies discover tiny truths, but his story begins with a lie. It was November 2007 and Barack Obama, then a Democratic candidate for president, was at Google’s headquarters in Mountain View, California, to speak. Siroker—who today is CEO of the web-testing firm Optimizely, but then was a product manager on Google’s browser team—tried to cut the enormous line by sneaking in a back entrance. “I walked up to the security guard and said, ‘I have to get to a meeting in there,’” Siroker recalls. There was no meeting, but his bluff got him in.

At the talk, Obama fielded a facetious question from then-CEO Eric Schmidt: “What is the most efficient way to sort a million 32-bit integers?” Schmidt was having a bit of fun, but before he could move on to a real question, Obama stopped him. “Well, I think the bubble sort would be the wrong way to go,” he said—correctly. Schmidt put his hand to his forehead in disbelief, and the room erupted in raucous applause. Siroker was instantly smitten. “He had me at ‘bubble sort,’” he says. Two weeks later he had taken a leave of absence from Google, moved to Chicago, and joined up with Obama’s campaign as a digital adviser.
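
[ed. For the record, Schmidt's question has a stock answer: bubble sort makes on the order of n² comparisons, hopeless for a million items, while a non-comparison sort such as radix sort exploits the fixed 32-bit width and finishes in a handful of linear passes. A minimal Python sketch, offered only to illustrate the point behind the joke:]

    import random

    # LSD radix sort for unsigned 32-bit integers: four stable passes,
    # one per byte, from least to most significant.
    def radix_sort_u32(values):
        for shift in (0, 8, 16, 24):
            buckets = [[] for _ in range(256)]
            for v in values:
                buckets[(v >> shift) & 0xFF].append(v)
            values = [v for bucket in buckets for v in bucket]
        return values

    # A million random 32-bit values: four passes here, versus roughly
    # 10^12 comparisons for bubble sort.
    data = [random.getrandbits(32) for _ in range(1_000_000)]
    assert radix_sort_u32(data) == sorted(data)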

At first he wasn’t sure how he could help. But he recalled something else Obama had said to the Googlers: “I am a big believer in reason and facts and evidence and science and feedback—everything that allows you to do what you do. That’s what we should be doing in our government.” And so Siroker decided he would introduce Obama’s campaign to a crucial technique—almost a governing ethos—that Google relies on in developing and refining its products. He showed them how to A/B test.

Over the past decade, the power of A/B testing has become an open secret of high-stakes web development. It’s now the standard (but seldom advertised) means through which Silicon Valley improves its online products. Using A/B, new ideas can be essentially focus-group tested in real time: Without being told, a fraction of users are diverted to a slightly different version of a given web page and their behavior compared against the mass of users on the standard site. If the new version proves superior—gaining more clicks, longer visits, more purchases—it will displace the original; if the new version is inferior, it’s quietly phased out without most users ever seeing it. A/B allows seemingly subjective questions of design—color, layout, image selection, text—to become incontrovertible matters of data-driven social science.
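
[ed. A toy sketch of the mechanics described above; the hashing rule, the click-through metric, and the significance test are my own illustrative assumptions, not the actual machinery used at Google or Optimizely:]

    import hashlib
    from math import sqrt

    def assign_variant(user_id, experiment="homepage-test", split=0.5):
        """Silently divert a fixed fraction of users to version B."""
        h = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
        return "B" if int(h, 16) / 16**32 < split else "A"

    def z_score(clicks_a, visitors_a, clicks_b, visitors_b):
        """Two-proportion z-test: is B's click-through rate genuinely higher?"""
        p_a, p_b = clicks_a / visitors_a, clicks_b / visitors_b
        p = (clicks_a + clicks_b) / (visitors_a + visitors_b)
        se = sqrt(p * (1 - p) * (1 / visitors_a + 1 / visitors_b))
        return (p_b - p_a) / se

    # 1,000 visitors per arm, 120 clicks on A versus 150 on B gives z of about 1.96,
    # roughly the conventional bar for declaring B the winner.
    print(assign_variant("user-42"), round(z_score(120, 1000, 150, 1000), 2))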

by Brian Christian, Wired |  Read more:
Photo: Spencer Higgins; Illustration: Si Scott

1960: Marilyn Monroe in Reno, captured by Eve Arnold. This picture was taken during the filming of The Misfits, directed by John Huston.
via:

Brevity and the Soul

[ed. One of the commenters to this essay mentions Raymond Carver, a master of brevity, and provides a link to one of his stories, Popular Mechanics.]

There is the apocryphal story in which Hemingway, sitting in a bar somewhere in Key West, is asked by an antagonistic admirer to follow his minimalism to its logical outcome and to tell a story in six words.  As the story goes, Hemingway picks up a napkin and writes out the following words:

For sale: baby shoes, never worn.

This is a pretty good story. The reader has to kind of inhabit it and fill in all that is unsaid (which is pretty much everything), but there’s an inexhaustible sadness there in the spaces between the words.  Everything pared away until there’s almost nothing left. The iceberg theory of fiction.

The genre of short short fiction (or microfiction, or whatever one might want to call it) is itself kinda small, and little of it is worth reading. But there are exceptions.

There’s this: “Sticks,” by George Saunders, perhaps the greatest super short story I’ve ever read:


Sticks


Originally published in Story, Winter 1995.

Every year Thanksgiving night we flocked out behind Dad as he dragged the Santa suit to the road and draped it over a kind of crucifix he'd built out of a metal pole in the yard. Super Bowl week the pole was dressed in a jersey and Rod's helmet and Rod had to clear it with Dad if he wanted to take the helmet off. On the Fourth of July the pole was Uncle Sam, on Veteran’s Day a soldier, on Halloween a ghost. The pole was Dad's only concession to glee. We were allowed a single Crayola from the box at a time. One Christmas Eve he shrieked at Kimmie for wasting an apple slice. He hovered over us as we poured ketchup saying: good enough good enough good enough. Birthday parties consisted of cupcakes, no ice cream. The first time I brought a date over she said: what's with your dad and that pole? and I sat there blinking.

We left home, married, had children of our own, found the seeds of meanness blooming also within us. Dad began dressing the pole with more complexity and less discernible logic. He draped some kind of fur over it on Groundhog Day and lugged out a floodlight to ensure a shadow. When an earthquake struck Chile he laid the pole on its side and spray painted a rift in the earth. Mom died and he dressed the pole as Death and hung from the crossbar photos of Mom as a baby. We'd stop by and find odd talismans from his youth arranged around the base: army medals, theater tickets, old sweatshirts, tubes of Mom's makeup. One autumn he painted the pole bright yellow. He covered it with cotton swabs that winter for warmth and provided offspring by hammering in six crossed sticks around the yard. He ran lengths of string between the pole and the sticks, and taped to the string letters of apology, admissions of error, pleas for understanding, all written in a frantic hand on index cards. He painted a sign saying LOVE and hung it from the pole and another that said FORGIVE? and then he died in the hall with the radio on and we sold the house to a young couple who yanked out the pole and the sticks and left them by the road on garbage day.

Here there is an entire novel’s worth of intrigue and emotional complexity and backstory and difficult familial relationships and unhappinesses and losses and redemptions.  One can’t help but think of all those homes run by inexpressive and angry fathers who know something of love’s austere offices, these homes that suddenly erupt in holiday decorations that go waaay beyond the normal or expected.  Rudolphs and Santas and baby Jesuses and lights and holly all over the place.  This phenomenon…the phenomenon of the middle-to-lower-class father who has no creative outlet but finds an avenue in his front yard…this is an important aspect of contemporary life in the U.S., and one that needs more examination.  There are dissertations here.  And Saunders’ story is a most excellent jumping off point.

Then there is David Foster Wallace’s remarkable “A Radically Condensed History of Postindustrial Life.”

When they were introduced, he made a witticism, hoping to be liked. She laughed extremely hard, hoping to be liked. Then each drove home alone, staring straight ahead, with the very same twist to their faces.

The man who’d introduced them didn’t much like either of them, though he acted as if he did, anxious as he was to preserve good relations at all times. One never knew, after all, now did one now did one now did one.

I don’t think I’ve ever fully fathomed this one, but the final repetition of “now did one now did one now did one” is wildly suggestive.  It seems to suggest something of the radical uncertainty of what it means to live in a world where everyone is wearing a face to meet the faces on the street.

by Tom Jacobs, 3 Quarks Daily |  Read more:

Bic FlameDisk

[ed. I don't usually do product endorsements but these little instant barbeque pans are the bee's knees. Just pull from the package, place in a small grill or sand pit, light with a match and you're ready to go. When you're done just extinguish and let cool before crumpling and disposing. About the size of a stove-top popcorn pan. Extremely handy in a pinch.]




Photos: markk

Clemens Behr
via:

What Your Klout Score Really Means


Last spring Sam Fiorella was recruited for a VP position at a large Toronto marketing agency. With 15 years of experience consulting for major brands like AOL, Ford, and Kraft, Fiorella felt confident in his qualifications. But midway through the interview, he was caught off guard when his interviewer asked him for his Klout score. Fiorella hesitated awkwardly before confessing that he had no idea what a Klout score was.

The interviewer pulled up the web page for Klout.com—a service that purports to measure users’ online influence on a scale from 1 to 100—and angled the monitor so that Fiorella could see the humbling result for himself: His score was 34. “He cut the interview short pretty soon after that,” Fiorella says. Later he learned that he’d been eliminated as a candidate specifically because his Klout score was too low. “They hired a guy whose score was 67.”

Partly intrigued, partly scared, Fiorella spent the next six months working feverishly to boost his Klout score, eventually hitting 72. As his score rose, so did the number of job offers and speaking invitations he received. “Fifteen years of accomplishments weren’t as important as that score,” he says.

Much as Google’s search engine attempts to rank the relevance of every web page, Klout—a three-year-old startup based in San Francisco—is on a mission to rank the influence of every person online. Its algorithms comb through social media data: If you have a public account with Twitter, which makes updates available for anyone to read, you have a Klout score, whether you know it or not (unless you actively opt out on Klout’s website). You can supplement that score by letting Klout link to harder-to-access accounts, like those on Google+, Facebook, or LinkedIn. The scores are calculated using variables that can include number of followers, frequency of updates, the Klout scores of your friends and followers, and the number of likes, retweets, and shares that your updates receive. High-scoring Klout users can qualify for Klout Perks, free goodies from companies hoping to garner some influential praise.  (...)

Matt Thomson, Klout’s VP of platform, says that a number of major companies—airlines, big-box retailers, hospitality brands—are discussing how best to use Klout scores. Soon, he predicts, people with formidable Klout will board planes earlier, get free access to VIP airport lounges, stay in better hotel rooms, and receive deep discounts from retail stores and flash-sale outlets. “We say to brands that these are the people they should pay attention to most,” Thomson says. “How they want to do it is up to them.”

Not everyone is thrilled by the thought of a startup using a mysterious, proprietary algorithm to determine what kind of service, shopping discounts, or even job offers we might receive. The web teems with resentful blog posts about Klout, with titles like “Klout Has Gone Too Far,” “Why Your Klout Score Is Meaningless,” and “Delete Your Klout Profile Now!” Jaron Lanier, the social media skeptic and author of You Are Not a Gadget, hates the idea of Klout. “People’s lives are being run by stupid algorithms more and more,” Lanier says. “The only ones who escape it are the ones who avoid playing the game at all.” Peak outrage was achieved on October 26, when the company tweaked its algorithm and many people’s scores suddenly plummeted. To some, the jarring change made the whole concept of Klout seem capricious and meaningless, and they expressed their outrage in tweets, blog posts, and comments on the Klout website. “Not exactly fun having the Internet want to punch me in the face,” tweeted Klout CEO Joe Fernandez amid the uproar.

by Seth Stevenson, Wired |  Read more:
Photo: Garry McLeod

Erin Cone
via:

Wednesday, April 25, 2012

On the Origins of the Arts


Since the fading of the original Enlightenment during the late eighteenth and early nineteenth centuries, stubborn impasse has existed in the consilience of the humanities and natural sciences. One way to break it is to collate the creative process and writing styles of literature and scientific research. This might not prove so difficult as it first seems. Innovators in both of two domains are basically dreamers and storytellers. In the early stages of creation of both art and science, everything in the mind is a story. There is an imagined denouement, and perhaps a start, and a selection of bits and pieces that might fit in between. In works of literature and science alike, any part can be changed, causing a ripple among the other parts, some of which are discarded and new ones added. The surviving fragments are variously joined and separated, and moved about as the story forms. One scenario emerges, then another. The scenarios, whether literary or scientific in nature, compete. Words and sentences (or equations or experiments) are tried. Early on an end to all the imagining is conceived. It seems a wondrous denouement (or scientific breakthrough). But is it the best, is it true? To bring the end safely home is the goal of the creative mind. Whatever that might be, wherever located, however expressed, it begins as a phantom that might up until the last moment fade and be replaced. Inexpressible thoughts flit along the edges. As the best fragments solidify, they are put in place and moved about, and the story grows and reaches its inspired end. Flannery O’Connor asked, correctly, for all of us, literary authors and scientists, “How can I know what I mean until I see what I say?” The novelist says, “Does that work?,” and the scientist says, “Could that possibly be true?”

The successful scientist thinks like a poet but works like a bookkeeper. He writes for peer review in hopes that “statured” scientists, those with achievements and reputations of their own, will accept his discoveries. Science grows in a manner not well appreciated by nonscientists: it is guided as much by peer approval as by the truth of its technical claims. Reputation is the silver and gold of scientific careers. Scientists could say, as did James Cagney upon receiving an Academy Award for lifetime achievement, “In this business you’re only as good as the other fellow thinks you are.”

But in the long term, a scientific reputation will endure or fall upon credit for authentic discoveries. The conclusions will be tested repeatedly, and they must hold true. Data must not be questionable, or theories crumble. Mistakes uncovered by others can cause a reputation to wither. The punishment for fraud is nothing less than death—to the reputation, and to the possibility of further career advancement. The equivalent capital crime in literature is plagiarism. But not fraud! In fiction, as in the other creative arts, a free play of imagination is expected. And to the extent it proves aesthetically pleasing, or otherwise evocative, it is celebrated.

The essential difference between literary and scientific style is the use of metaphor. In scientific reports, metaphor is permissible—provided it is chaste, perhaps with just a touch of irony and self-deprecation. For example, the following would be permitted in the introduction or discussion of a technical report: “This result, if confirmed, will, we believe, open the door to a range of further fruitful investigations.” Not permitted is: “We envision this result, which we found extraordinarily hard to obtain, to be a potential watershed from which many streams of new research will surely flow.”

What counts in science is the importance of the discovery. What matters in literature is the originality and power of the metaphor. Scientific reports add a tested fragment to our knowledge of the material world. Lyrical expression in literature, on the other hand, is a device to communicate emotional feeling directly from the mind of the writer to the mind of the reader. There is no such goal in scientific reporting, where the purpose of the author is to persuade the reader by evidence and reasoning of the validity and importance of the discovery. In fiction the stronger the desire to share emotion, the more lyrical the language must be. At the extreme, the statement may be obviously false, because author and reader want it that way.

by E.O. Wilson, Harvard Magazine |  Read more:
Photograph courtesy of the French Ministry of Culture and Communication, Regional Direction for Cultural Affairs, Rhône-Alpes region/Regional Department of Archaeology

urban forestation (von pannaphotos)
via:

Paul Theroux’s Quest to Define Hawaii


Hawaii seems a robust archipelago, a paradise pinned like a bouquet to the middle of the Pacific, fragrant, sniffable and easy of access. But in 50 years of traveling the world, I have found the inner life of these islands to be difficult to penetrate, partly because this is not one place but many, but most of all because of the fragile and floral way in which it is structured. Yet it is my home, and home is always the impossible subject, multilayered and maddening.

Two thousand miles from any great landmass, Hawaii was once utterly unpeopled. Its insularity was its salvation; and then, in installments, the world washed ashore and its Edenic uniqueness was lost in a process of disenchantment. There was first the discovery of Hawaii by Polynesian voyagers, who brought with them their dogs, their plants, their fables, their cosmology, their hierarchies, their rivalries and their predilection for plucking the feathers of birds; the much later barging in of Europeans and their rats and diseases and junk food; the introduction of the mosquito, which brought avian malaria and devastated the native birds; the paving over of Honolulu; the bombing of Pearl Harbor; and many hurricanes and tsunamis. Anything but robust, Hawaii is a stark illustration of Proust’s melancholy observation: “The true paradises are the paradises we have lost.”  (...)

I have lived in Hawaii for 22 years, and in this time have also traveled the world, writing books and articles about Africa, Asia, South America, the Mediterranean, India and elsewhere. Though I have written a number of fictional pieces, including a novel, Hotel Honolulu, set in Hawaii, I have struggled as though against monster surf to write nonfiction about the islands. I seldom read anything that accurately portrayed in an analytical way the place in which I have chosen to live. I have been in Hawaii longer than anywhere else in my life. I’d hate to die here, I murmured to myself in Africa, Asia and Britain. But I wouldn’t mind dying in Hawaii, which means I like living here.

Some years ago, I spent six months attempting to write an in-depth piece for a magazine describing how Hawaiian culture is passed from one generation to the other. I wrote the story, after a fashion, but the real tale was how difficult it was to get anyone to talk to me. I went to a charter school on the Big Island, in which the Hawaiian language was used exclusively, though everyone at the place was bilingual. Aware of the protocol, I gained an introduction from the headmaster of the adjoining school. After witnessing the morning assembly where a chant was offered, and a prayer, and a stirring song, I approached a teacher and asked if she would share with me a translation of the Hawaiian words I had just heard. She said she’d have to ask a higher authority. Never mind the translation, I said; couldn’t she just write down the Hawaiian versions?

“We have to go through the proper channels,” she said.

That was fine with me, but in the end permission to know the words was refused. I appealed to a Hawaiian language specialist, Hawaiian himself, who had been instrumental in the establishment of such Hawaiian language immersion schools. He did not answer my calls or messages, and in the end, when I pressed him, he left me with a testy, not to say xenophobic, reply.  (...)

I was not dismayed: I was fascinated. I had never in my traveling or writing life come across people so unwilling to share their experiences. Here I was living in a place most people thought of as Happyland, when in fact it was an archipelago with a social structure that was more complex than any I had ever encountered—beyond Asiatic. One conclusion I reached was that in Hawaii, unlike any other place I had written about, people believed that their personal stories were their own, not to be shared, certainly not to be retold by someone else. Virtually everywhere else people were eager to share their stories, and their candor and hospitality had made it possible for me to live my life as a travel writer.

by Paul Theroux, The Smithsonian |  Read more:
Photo: Jacques Descloitres / Modis Land Rapid Response Team / NASA GSFC

Music Is My Bag

The image I want to get across is that of the fifteen-year-old boy with the beginning traces of a mustache who hangs out in the band room after school playing the opening bars of a Billy Joel song on the piano. This is the kid who, in the interests of adopting some semblance of personal style, wears a fedora hat and a scarf with a black-and-white design of a piano keyboard. This is the kid who, in addition to having taught himself some tunes from the Songs in the Attic sheet music he bought at the local Sam Ash, probably also plays the trombone in the marching band, and experienced a seminal moment one afternoon as he vaguely flirted with a not-yet-kissed, clarinet-playing girl, a girl who is none too popular but whose propensity for leaning on the piano as the boy plays the opening chords of "Captain Jack" gives him a clue as to the social possibilities that might be afforded him via the marching band.

If the clarinet-playing girl is an average student musician, she carries her plastic Selmer in the standard-issue black plastic case. If she has demonstrated any kind of proficiency, she carries her Selmer in a tote bag that reads "Music Is My Bag." The boy in the piano-key scarf definitely has music as his bag. He may not yet have the tote bag, but the hat, the Billy Joel, the tacit euphoria brought on by a sexual awakening that, for him, centers entirely around band, is all he needs to be delivered into the unmistakable realm that is Music Is My Bagdom.

I grew up in Music Is My Bag culture. The walls of my parents' house were covered with framed art posters from musical events: The San Francisco Symphony's 1982 production of St. Matthew's Passion, The Metropolitan Opera's 1976 production of Aida, the original Broadway production of Sweeney Todd. Ninety percent of the books on the shelves were about music, if not actual musical scores. Childhood ceramics projects made by my brother and me were painted with eighth notes and treble clef signs. We owned a deck of cards with portraits of the great composers on the back. A baby grand piano overtook the room that would have been the dining room if my parents hadn't forgone a table and renamed it "the music room." This room also contained an imposing hi-fi system and a $300 wooden music stand. Music played at all times: Brahms, Mendelssohn, cast recordings of Sondheim musicals, a cappella Christmas albums. When my father sat down with a book, he read musical scores, humming quietly and tapping his foot. When I was ten, my mother decided we needed to implement a before-dinner ritual akin to saying grace, so she composed a short song, asking us all to contribute a lyric, and we held hands and sang it before eating. My lyric was, "There's a smile on our face and it seems to say all the wonderful things we've all done today." My mother insisted on harmonizing at the end. She also did this when singing "Happy Birthday."

Harmonizing on songs like "Happy Birthday" is a clear indication of the Music Is My Bag personality. If one does not have an actual bag that reads "Music Is My Bag"—as did the violist in the chamber music trio my mother set up with some women from the Unitarian Church—a $300 music stand and musical-note coasters will more than suffice. To avoid confusion, let me also say that there are many different Bags in life. Some friends of my parents have a $300 dictionary stand, a collection of silver bookmarks, and once threw a dinner party wherein the guests had to dress up as members of the Bloomsbury Group. These people are Literature Is My Bag. I know people who are Movies Are My Bag (detectable by key chains shaped like projectors, outdated copies of Halliwell's Film Guide, and one too many T-shirts from things like the San Jose Film Festival), people who are Cats Are My Bag (self-explanatory), and, perhaps most annoyingly, Where I Went To College Is My Bag (Yale running shorts, plastic Yale tumblers, Yale Platinum Plus MasterCard, and, yes, even Yale screensavers—all this in someone aged forty or more, the perennial contributor to the class notes).

Having a Bag connotes the state of being overly interested in something, and yet, in a certain way, not interested enough. It has a hobbyish quality to it, a sense that the enthusiasm developed at a time when the enthusiast was lacking in some significant area of social or intellectual life. Music Is My Bag is the mother of all Bags, not just because in the early 1980s some consumer force of the public radio fund-drive variety distributed a line of tote bags that displayed that slogan, but because its adherents, or, as they tend to call themselves, "music lovers," give off an aura that distinguishes them from the rest of the population. It's an aura that has to do with a sort of benign cluelessness, a condition that, even in middle age, smacks of that phase between prepubescence and real adolescence. Music Is My Bag people have a sexlessness to them. There is a pastiness to them. They can never seem to find a good pair of jeans. You can spot them on the street, the female French horn player in concert dress hailing a cab to Lincoln Center around seven o'clock in the evening, her earrings too big, her hairstyle unchanged since 1986. The fifty-something recording engineer with the running shoes and the shoulder bag. The Indiana marching band kids in town for the Macy's Thanksgiving Day Parade, snapping photos of each other in front of the Hard Rock Cafe, having sung their parts from the band arrangement of Hello Dolly the whole way on the bus, thinking, knowing, that it won't get better than this. Like all Music Is My Bag people, they are a little too in love with the trappings. They know what their boundaries are and load up their allotted space with memorabilia, saving the certificates of participation from regional festivals, the composer-a-month calendars, the Mostly Mozart posters. Their sincerity trumps attempts at snideness. The boys' sarcasm only goes a fraction of the way there, the girls will never be great seducers. They grow up to look like high school band directors even if they're not. They give their pets names like Wolfgang and Gershwin. Their hemlines are never quite right.

by Meghan Daum, Harper's Magazine |  Read more:

Mount Fuji from in the bamboo, Hokusai.
via:

David Hockney: Piscinas
A Bigger Splash (1967)
via:

Machine-Made News

I speak Spanish to God, Italian to women, French to men, and German to my horse.
-- Emperor Charles V

But in which language does one speak to a machine, and what can be expected by way of response? The questions arise from the accelerating data-streams out of which we’ve learned to draw the breath of life, posed in consultation with the equipment that scans the flesh and tracks the spirit, cues the ATM, the GPS, and the EKG, arranges the assignations on Match.com and the high-frequency trades at Goldman Sachs, catalogs the pornography and drives the car, tells us how and when and where to connect the dots and thus recognize ourselves as human beings.

Why then does it come to pass that the more data we collect -- from Google, YouTube, and Facebook -- the less likely we are to know what it means?


The conundrum is in line with the late Marshall McLuhan’s noticing 50 years ago the presence of “an acoustic world,” one with “no continuity, no homogeneity, no connections, no stasis,” a new “information environment of which humanity has no experience whatever.” He published Understanding Media in 1964, proceeding from the premise that “we become what we behold,” that “we shape our tools, and thereafter our tools shape us.”

Media were to be understood as “make-happen agents” rather than as “make-aware agents,” not as art or philosophy but as systems comparable to roads and waterfalls and sewers. Content follows form; new means of communication give rise to new structures of feeling and thought.

To account for the transference of the idioms of print to those of the electronic media, McLuhan examined two technological revolutions that overturned the epistemological status quo. First, in the mid-fifteenth century, Johannes Gutenberg’s invention of moveable type, which deconstructed the illuminated wisdom preserved on manuscript in monasteries, encouraged people to organize their perceptions of the world along the straight lines of the printed page. Second, in the nineteenth and twentieth centuries, the applications of electricity (telegraph, telephone, radio, movie camera, television screen, eventually the computer), favored a sensibility that runs in circles, compressing or eliminating the dimensions of space and time, narrative dissolving into montage, the word replaced with the icon and the rebus.

Within a year of its publication, Understanding Media acquired the standing of Holy Scripture and made of its author the foremost oracle of the age. The New York Herald Tribune proclaimed him “the most important thinker since Newton, Darwin, Freud, Einstein, and Pavlov.” Although never at a loss for Delphic aphorism -- “The electric light is pure information”; “In the electric age, we wear all mankind as our skin” -- McLuhan assumed that he had done nothing more than look into the window of the future at what was both obvious and certain.

Floating the Fiction of Democracy

In 1964 I was slow to take the point, possibly because I was working at the time in a medium that McLuhan had listed as endangered -- writing, for The Saturday Evening Post, inclined to think in sentences, accustomed to associating a cause with an effect, a beginning with a middle and an end. Television news I construed as an attempt to tell a story with an alphabet of brightly colored children’s blocks, and when offered the chance to become a correspondent for NBC, I declined the referral to what I regarded as a course in remedial reading.

The judgment was poorly timed. Within five years The Saturday Evening Post had gone the way of the great auk; news had become entertainment, entertainment news, the distinctions between a fiction and a fact as irrelevant as they were increasingly difficult to parse. Another 20 years and I understood what McLuhan meant by the phrase, “The medium is the message,” when in the writing of a television history of America’s foreign policy in the twentieth century, I was allotted roughly 73 seconds in which to account for the origins of World War II, while at the same time providing a voiceover transition between newsreel footage of Jesse Owens running the hundred-yard dash at the Berlin Olympics in the summer of 1936, and Adolf Hitler marching the Wehrmacht into Vienna in the spring of 1938.

McLuhan regarded the medium of television as better suited to the sale of a product than to the expression of a thought. The voice of the first person singular becomes incorporated into the collective surges of emotion housed within an artificial kingdom of wish and dream; the viewer’s participation in the insistent and ever-present promise of paradise regained greatly strengthens what McLuhan identified as “the huge educational enterprise that we call advertising.” By which he didn’t mean the education of a competently democratic citizenry -- “Mosaic news is neither narrative, nor point of view, nor explanation, nor comment” -- but rather as “the gathering and processing of exploitable social data” by “Madison Avenue frogmen of the mind” intent on retrieving the sunken subconscious treasure of human credulity and desire.

by Lewis Lapham, TomDispatch.com |  Read more:

The Crisis of Big Science

Last year physicists commemorated the centennial of the discovery of the atomic nucleus. In experiments carried out in Ernest Rutherford’s laboratory at Manchester in 1911, a beam of electrically charged particles from the radioactive decay of radium was directed at a thin gold foil. It was generally believed at the time that the mass of an atom was spread out evenly, like a pudding. In that case, the heavy charged particles from radium should have passed through the gold foil, with very little deflection. To Rutherford’s surprise, some of these particles bounced nearly straight back from the foil, showing that they were being repelled by something small and heavy within gold atoms. Rutherford identified this as the nucleus of the atom, around which electrons revolve like planets around the sun.
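
[ed. The surprise can be put in one line of elementary mechanics; this is a worked illustration of my own, not anything in Weinberg's text. A projectile of mass m striking a stationary target of mass M head-on in an elastic collision rebounds only if the target is heavier, since its final velocity is

    v' = \frac{m - M}{m + M}\,v \qquad\Rightarrow\qquad v'_{\alpha\rightarrow\mathrm{Au}} \approx \frac{4 - 197}{4 + 197}\,v \approx -0.96\,v

so a diffuse "pudding" of light electrons can never turn an alpha particle around, while a compact gold nucleus, nearly fifty times heavier, sends it almost straight back.]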

This was great science, but not what one would call big science. Rutherford’s experimental team consisted of one postdoc and one undergraduate. Their work was supported by a grant of just £70 from the Royal Society of London. The most expensive thing used in the experiment was the sample of radium, but Rutherford did not have to pay for it—the radium was on loan from the Austrian Academy of Sciences.

Nuclear physics soon got bigger. The electrically charged particles from radium in Rutherford’s experiment did not have enough energy to penetrate the electrical repulsion of the gold nucleus and get into the nucleus itself. To break into nuclei and learn what they are, physicists in the 1930s invented cyclotrons and other machines that would accelerate charged particles to higher energies. The late Maurice Goldhaber, former director of Brookhaven Laboratory, once reminisced:
The first to disintegrate a nucleus was Rutherford, and there is a picture of him holding the apparatus in his lap. I then always remember the later picture when one of the famous cyclotrons was built at Berkeley, and all of the people were sitting in the lap of the cyclotron.

After World War II, new accelerators were built, but now with a different purpose. In observations of cosmic rays, physicists had found a few varieties of elementary particles different from any that exist in ordinary atoms. To study this new kind of matter, it was necessary to create these particles artificially in large numbers. For this physicists had to accelerate beams of ordinary particles like protons—the nuclei of hydrogen atoms—to higher energy, so that when the protons hit atoms in a stationary target their energy could be transmuted into the masses of particles of new types. It was not a matter of setting records for the highest-energy accelerators, or even of collecting more and more exotic species of particles, like orchids. The point of building these accelerators was, by creating new kinds of matter, to learn the laws of nature that govern all forms of matter. Though many physicists preferred small-scale experiments in the style of Rutherford, the logic of discovery forced physics to become big.

by Steven Weinberg, NY Review of Books |  Read more:
Photo: Superconducting Super Collider Laboratory/Photo Researchers

Narcissism in Pink and Blue

I’m typically a year or two behind any cultural trend, so you probably already know about gender-reveal parties. I first heard of them over the weekend: a couple, strangers to me, had invited friends and relatives over to bite into cupcakes at the same instant and share the moment when the blue or pink custard inside would inform them all of the sex of the baby. (The sonogram result had gone from lab to baker without being seen by anyone else, including the parents-to-be.) Other couples choose different methods of revelation: grip the knife together and cut into a cake with blue or pink filling. Open a sealed box that releases pink or blue helium balloons. Then put the scene on the Web so that everyone not invited can participate vicariously.

These events are becoming more and more popular. The first video of a gender-reveal party was posted on YouTube in 2008, but in just the last six months almost two thousand have been uploaded. You can watch one from last month. (Spoiler alert: it’s a girl.)

Maybe it was the context—I happened to hear about the gender-reveal party in a rundown inner-city café full of ex-felons who were having a very hard time finding jobs—but my initial take was incredulity trending negative. These parties seem to marry the oversharing of Facebook and Instagram with the contrived ceremonies that modern people in search of meaning impose on normal life events: food journaling, birthday parties for grownups, workout diaries, birth-experience planning. (One birth-planning center offers a “baby gender selection kit” involving three safe and natural steps that turn sex itself into a gender-reveal party.)

In the case of gender-reveal parties, couples take a private moment made possible by science and oblige others to join in, with the result—as in so many invented rituals of our day—that the focus turns from where it ought to be (in this case, the baby) to the self. At a bris or christening, the emotional emphasis falls on the arrival of a new life in the embrace of family and community. At a gender-reveal party, the camera is on the expectant father tearing up at the sight of pink cake.

by George Packer, The New Yorker |  Read more:

Tuesday, April 24, 2012

Has Physics Made Philosophy and Religion Obsolete?

It is hard to know how our future descendants will regard the little sliver of history that we live in. It is hard to know what events will seem important to them, what the narrative of now will look like to the twenty-fifth century mind. We tend to think of our time as one uniquely shaped by the advance of technology, but more and more I suspect that this will be remembered as an age of cosmology---as the moment when the human mind first internalized the cosmos that gave rise to it. Over the past century, since the discovery that our universe is expanding, science has quietly begun to sketch the structure of the entire cosmos, extending its explanatory powers across a hundred billion galaxies, to the dawn of space and time itself. It is breathtaking to consider how quickly we have come to understand the basics of everything from star formation to galaxy formation to universe formation. And now, equipped with the predictive power of quantum physics, theoretical physicists are beginning to push even further, into new universes and new physics, into controversies once thought to be squarely within the domain of theology or philosophy. 

In January, Lawrence Krauss, a theoretical physicist and Director of the Origins Institute at Arizona State University, published A Universe From Nothing: Why There Is Something Rather Than Nothing, a book that, as its title suggests, purports to explain how something---and not just any something, but the entire universe---could have emerged from nothing, the kind of nothing implicated by quantum field theory. But before attempting to do so, the book first tells the story of modern cosmology, whipping its way through the big bang to microwave background radiation and the discovery of dark energy. It's a story that Krauss is well positioned to tell; in recent years he has emerged as an unusually gifted explainer of astrophysics. One of his lectures has been viewed over a million times on YouTube and his cultural reach extends to some unlikely places---last year Miley Cyrus came under fire when she tweeted a quote from Krauss that some Christians found offensive. Krauss' book quickly became a bestseller, drawing raves from popular atheists like Sam Harris and Richard Dawkins, the latter of whom even compared it to The Origin of Species for the way its final chapters were supposed to finally upend the "last trump card of the theologian."

By early spring, media coverage of "A Universe From Nothing" seemed to have run its course, but then on March 23rd the New York Times ran a blistering review of the book, written by David Albert, a philosopher of physics from Columbia University. Albert, who has a PhD in theoretical physics, argued that Krauss' "nothing" was in fact a something and did so in uncompromising terms: 

"The particular, eternally persisting, elementary physical stuff of the world, according to the standard presentations of relativistic quantum field theories, consists (unsurprisingly) of relativistic quantum fields... they have nothing whatsoever to say on the subject of where those fields came from, or of why the world should have consisted of the particular kinds of fields it does, or of why it should have consisted of fields at all, or of why there should have been a world in the first place. Period. Case closed. End of story."

Because the story of modern cosmology has such deep implications for the way that we humans see ourselves and the universe, it must be told correctly and without exaggeration---in the classroom, in the press and in works of popular science. To see two academics, both versed in theoretical physics, disagreeing so intensely on such a fundamental point is troubling. Not because scientists shouldn't disagree with each other, but because here they're disagreeing about a claim being disseminated to the public as a legitimate scientific discovery. Readers of popular science often assume that what they're reading is backed by a strong consensus. Having recently interviewed Krauss for a different project, I reached out to him to see if he was interested in discussing Albert's criticisms with me. He said that he was, and mentioned that he would be traveling to New York on April 20th to speak at a memorial service for Christopher Hitchens. As it happened, I was also due to be in New York that weekend and so, last Friday, we were able to sit down for the extensive, and at times contentious, conversation that follows.

by Ross Andersen, The Atlantic |  Read more: