Saturday, September 29, 2012

Fender: A Guitar Maker Aims to Stay Plugged In

In 1948, a radio repairman named Leo Fender took a piece of ash, bolted on a length of maple and attached an electronic transducer.

You know the rest, even if you don’t know you know the rest.

You’ve heard it — in the guitar riffs of Buddy Holly, Jimi Hendrix, George Harrison, Keith Richards, Eric Clapton, Pete Townshend, Bruce Springsteen, Mark Knopfler, Kurt Cobain and on and on.

It’s the sound of a Fender electric guitar. Mr. Fender’s company, now known as the Fender Musical Instruments Corporation, is the world’s largest maker of guitars. Its Stratocaster, which made its debut in 1954, is still a top seller. For many, the Strat’s cutting tone and sexy, double-cutaway curves mean rock ’n’ roll.

But this heart of rock isn’t beating quite the way it once did. Like many other American manufacturers, Fender is struggling to hold on to what it’s got in a tight economy. Sales and profits are down this year. A Strat, after all, is what economists call a consumer discretionary item — a nonessential.

More than macroeconomics, however, is at work here. Fender, based in Scottsdale, Ariz., is also being buffeted by powerful forces on Wall Street.

A private investment firm, Weston Presidio, controls nearly half of the company and has been looking for an exit. It pushed to take Fender public in March, to howls in the guitar-o-sphere that Fender was selling out. But, to Fender’s embarrassment, investors balked. They were worried about the lofty price and, even more, about how Fender can keep growing.

And that, really, is the crux of the matter. Times have changed, and so has music. In the 1950s, ’60s and ’70s, electric guitars powered rock and pop. Today, turntable rigs, drum machines and sampler synthesizers drive music like hip-hop. Electric guitars, huge as they are, have lost some of their old magic in this era of Jay-Z, Kanye West and “The Voice.”

Games like Guitar Hero have helped underpin sales, but teenagers who once might have hankered after guitars now get by making music on laptops. It’s worth remembering that the accordion was once the most popular instrument in America.

Granted, Fender is such a powerful brand that it can ride out the lean times. But sales of all kinds of musical instruments plunged during the recession, and they still haven’t recovered fully. Sales of all instruments in the United States totaled $6.5 billion last year, down roughly 13 percent from their peak in 2005, according to Music Trades, which tracks the industry.

Many of the guitars that are selling these days are cheap ones made in places like China — ones that cost a small fraction of, say, a $1,599 Fender Artist “Eric Clapton” Strat. Fender has been making its own lines of inexpensive guitars overseas for years, but the question is how the company can keep growing and compete profitably in a fast-moving, global marketplace. Its margins are already under pressure.

“What possible niche is left unexploited by Fender?” asks Jeffrey Bronchick, founder of Cove Street Capital, an investment advisory firm in El Segundo, Calif., and the owner of some 40 guitars, including four Fenders.

by Janet Morrissey, NY Times |  Read more:
Photo: Monica Almeida/

How Wikipedia Works


Imagine a world without Wikipedia: a place where this encyclopaedic behemoth couldn’t be scanned by anxious school students keen to speed up their homework by rewriting the first four paragraphs of hard-working Wikipedia editors. Can you envision university undergraduates actually picking up a dead-tree copy of Encyclopaedia Britannica, or physically visiting a brick-and-mortar library to ferret out original sources? Neither can I – especially considering that Wikipedia is now one of the most established and well-patronised reference websites in the contemporary world.

Not only is Wikipedia one of the most used resources for data-gathering and seemingly instantaneous information retrieval – it’s also free to use, with no advertisements clogging up the interface (except those quirky requests to donate). So just how does such a key knowledge resource function, and who are the faces behind this indispensable modern take on the traditional encyclopaedia?

In 1994, Ward Cunningham created a website format then unknown to Internet users: one that would drive knowledge creation and collation to new heights. This website was called a Wiki.

This style of website propelled the user into the collaboration spotlight by encouraging anyone to update and edit internet-hosted content in real time. Although this early version of collaborative content creation now seems standard, remember that it predated both social media and the multitude of current platforms and applications that cater for users keen to create, edit and share information in an aggregation space.

On his personal web page, Ward says of the Wiki:
The idea of a “Wiki” may seem odd at first, but dive in, explore its links and it will soon seem familiar. “Wiki” is a composition system; it’s a discussion medium; it’s a repository; it’s a mail system; it’s a tool for collaboration. We don’t know quite what it is, but we do know it’s a fun way to communicate asynchronously across the network.
In 2000, this original concept of tying large amounts of content to an open, network-based collation system propelled Jimmy Wales and Larry Sanger to create Nupedia. Nupedia (unlike Wikipedia) was initially created as a for-profit online encyclopaedia funded by Bomis. The project had lofty ideals: it was to be free to access for all users, and all content was to be academically robust, with a mandatory peer-review policy and a seven-step review process.

From the outset, Nupedia had performance problems. In its first year, only 20 or so articles were approved for publication, with as many as 150 drafts stagnating in the yet-to-be-published vault. The founders had also assumed that scholars and experts would volunteer high-end content despite the absence of any incentive to do so. Then there was the infighting between Sanger and Wales, with Sanger determined to keep strict editorial control over all published material and demanding more reliable content (Sanger would go on to create a more academically robust alternative called Citizendium, which is still in operation).

What further undermined Nupedia was that both Sanger and Wales wanted to adopt the Wiki format, hoping to gain the elements they thought would enhance the existing Nupedia model: ease of editing, less restrictive review processes and a more inclusive, open approach to information organization. Thus Wikipedia was born as a side-project to enhance Nupedia – instead, it ended up eclipsing its parent and triggering Nupedia’s eventual demise.

When the non-profit Wikipedia project went live in January 2001, neither Sanger nor Wales had any idea that this side-project would become the force it is today. By 2002, the knowledge repository contained upwards of 20,000 entries; by the end of 2006, it had passed the 1 million article mark. Wikipedia itself provides up-to-date statistics on its contributor base and popularity:

As of September 2012, Wikipedia includes over 23 million freely usable articles in 285 languages, written by over 36 million registered users and numerous anonymous contributors worldwide. According to Alexa Internet, Wikipedia is the world’s sixth-most-popular website, visited monthly by around 12% of all internet users.

by Mez Breeze, The Next Web |  Read more:
Photo: Mandel Ngan/Getty Images

[ed. Golf ball impact at 150 mph. Today's PGA Tour pros normally swing at around 120 mph, so this is a bit of an exaggeration but still quite impressive.]

[ed. I've been experiencing this a lot lately and it pisses me off (sometimes even the ad won't load). My usual response is to just shut the whole thing down and move on. Hopefully video delivery systems will find a better way to monetize their content than by alienating viewers.]
via

Native Tongues


The scene is a mysterious one, beguiling, thrilling, and, if you didn’t know better, perhaps even a bit menacing. According to the time-enhanced version of the story, it opens on an afternoon in the late fall of 1965, when without warning, a number of identical dark-green vans suddenly appear and sweep out from a parking lot in downtown Madison, Wisconsin. One by one they drive swiftly out onto the city streets. At first they huddle together as a convoy. It takes them a scant few minutes to reach the outskirts—Madison in the sixties was not very big, a bureaucratic and academic omnium-gatherum of a Midwestern city about half the size it is today. There is then a brief halt, some cursory consultation of maps, and the cars begin to part ways.

All the cars in this first group head off to the south. As they part, the riders wave their farewells, whereupon each member of this curious small squadron officially commences his long outbound adventure—toward a clutch of carefully selected small towns, some of them hundreds and even thousands of miles away. These first few cars are bound for cities situated in the more obscure corners of Florida, Oklahoma, and Alabama. Other cars would follow later, heading off to yet more cities and towns scattered evenly across every corner of every mainland state in America. The scene as the cars leave Madison is dreamy and tinted with romance, especially seen at the remove of nearly fifty years. Certainly nothing about it would seem to have anything remotely to do with the thankless drudgery of lexicography.

But it had everything to do with the business, not of illicit love, interstate crime, or the secret movement of monies, but of dictionary making. For the cars, which would become briefly famous, at least in the somewhat fame-starved world of lexicography, were the University of Wisconsin Word Wagons. All were customized 1966 Dodge A100 Sportsman models, purchased en masse with government grant money. Equipped for long-haul journeying, they were powered by the legendarily indestructible Chrysler Slant-Six 170-horsepower engine and appointed with modest domestic fixings that included a camp bed, sink, and stove. Each also had two cumbersome reel-to-reel tape recorders and a large number of tape spools.

The drivers and passengers who manned the wagons were volunteers bent to one overarching task: that of collecting America’s other language. They were being sent to more than a thousand cities, towns, villages, and hamlets to discover and record, before it became too late and everyone started to speak like everybody else, the oral evidence of exactly what words and phrases Americans in those places spoke, heard, and read, out in the boondocks and across the prairies, down in the hollows and up on the ranges, clear across the great beyond and in the not very long ago.

These volunteers were charged with their duties by someone who might at first blush seem utterly unsuitable for the task of examining American speech: a Briton, born in Kingston, of a Canadian father and a Jamaican mother: Frederic Gomes Cassidy, a man whose reputation—he died twelve years ago, aged ninety-two—is now about to be consolidated as one of the greatest lexicographers this country has ever known. Cassidy’s standing—he is now widely regarded as this continent’s answer to James Murray, the first editor of the Oxford English Dictionary; Cassidy was a longtime English professor at the University of Wisconsin, while Murray’s chops were earned at Oxford—rests on one magnificent achievement: his creation of a monumental dictionary of American dialect speech, conceived roughly half a century ago, and over which he presided for most of his professional life.

The five-thousand-page, five-volume book, known formally as the Dictionary of American Regional English and colloquially just as DARE, is now at last fully complete. The first volume appeared in 1985: it listed tens of thousands of geographically specific dialect words, from tall flowering plants known in the South as “Aaron’s Rod,” to a kind of soup much favored in Wisconsin, made from duck’s blood, known as “czarnina.” The next two volumes appeared in the 1990s, the fourth after 2000, so assiduously planned and organized by Cassidy as to be uninterrupted by his passing. The fifth and final volume, the culminating triumph of this extraordinary project, is being published this March—it offers up regionalisms running alphabetically from “slab highway” (as concrete-covered roads are apparently still known in Indiana and Missouri) to “zydeco,” not the music itself, but a kind of raucous and high-energy musical party that is held in a long swathe of villages arcing from Galveston to Baton Rouge.

“Aaron’s rod” to “zydeco”—between these two verbal bookends lies an immense and largely hidden American vocabulary, one that surely, more than perhaps any other aspect of society, reveals the wonderfully chaotic pluribus out of which two centuries of commerce and convention have forged the duller reality of the unum. Which was precisely what Cassidy and his fellow editors sought to do—to capture, before it faded away, the linguistic coat of many colors of this immigrant-made country, and to preserve it in snapshot, in part for strictly academic purposes, in part for the good of history, and in part, maybe, on the off chance that the best of the lexicon might one day be revived.

by Simon Winchester, Lapham's Quarterly |  Read more:
Photo: UW-Madison Archives

Color of Blues
via:

Can Etsy Go Pro Without Losing Its Soul?

Two years after setting up her online shop, Terri Johnson had the kind of holiday season most business owners dream about. By Thanksgiving 2009, orders for her custom-embroidered goods started streaming in at a breakneck pace. And the volume only increased heading into December. Johnson was hardly feeling festive, though. To get the merchandise out the door, she worked nonstop, hunched over the embroidery machine in her basement, stitching robes, aprons, and shirts until just a few days before Christmas. “I was barely seeing my family,” she recalls. The problem was that Johnson’s main venue, shopmemento, is a storefront on Etsy.com. And she feared that if she hired help, invested in new equipment, or rented a commercial workspace, she might run afoul of Etsy policies and get kicked off the site.

After all, Etsy was designed as a marketplace for “the handmade.” The whole point is that the site offers a way for individual makers to connect with individual buyers. But trying to keep up with orders on her own was threatening to turn Johnson’s business into a one-woman sweatshop. Etsy rules allow “collectives,” but that’s a vague and unbusinesslike term. “No one knows what it means,” she says. After the holiday crush, Johnson was so spent that she shuttered her store for the entire month of January to recover. She knew that if she wanted to build a real business, she’d eventually have to scale up production. She wondered if she had outgrown Etsy.

This was a big problem for Johnson, but it was also troubling for Etsy. Today the site attracts 42 million unique visitors a month, who browse almost 15 million products. More than 800,000 sellers use the service. Most are producing handmade goods as a sideline. But losing motivated sellers like Johnson, who are making a full-time living on Etsy, means saying good-bye to a hugely profitable part of its community.

From its start in 2005, Etsy was a rhetoric-heavy enterprise that promised to do more than simply turn a profit. It promoted itself as an economy-shifter, making possible a parallel retail universe that countered the alienation of mass production with personal connections and unique, handcrafted items. There was no reason to outsource manufacturing, the thinking went, if a sea of individual sellers took the act of making into their own hands—literally.

The approach worked well enough to establish the startup. Etsy makes money from every listing (20 cents apiece) as well as every sale (a 3.5 percent cut). It has been profitable since 2009, and in July 2012 year-over-year sales were up more than 75 percent. Not bad for a retailer selling mostly nonessential products during one of the most sluggish chapters in the history of American consumer spending.
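[ed. A quick back-of-the-envelope sketch, in code, of the fee math quoted above. Purely illustrative: real Etsy billing has more moving parts (payment processing, renewals) than this toy model, and the $40 robe is a hypothetical.]

```python
# Toy model of Etsy's per-item revenue, using only the figures quoted
# above: a 20-cent listing fee plus a 3.5 percent cut of each sale.

LISTING_FEE = 0.20       # dollars per listed item
COMMISSION_RATE = 0.035  # 3.5 percent of the sale price

def etsy_take(sale_price: float) -> float:
    """Etsy's revenue on one item that is listed and then sold."""
    return LISTING_FEE + COMMISSION_RATE * sale_price

# On a hypothetical $40 embroidered robe:
# $0.20 + 0.035 * $40.00 = $1.60 to Etsy.
print(f"${etsy_take(40.00):.2f}")  # -> $1.60
```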

But now Etsy finds itself at a crossroads. Sellers like Johnson, reaching the limits of what the service allows (as well as what it can do for them), are being forced to consider moving on. Meanwhile, the hobbyists and artisans who make up the rest of the marketplace still value Etsy’s founding ethos—that handmade items have an intrinsic value that should be celebrated and given a forum outside of traditional retail.

How to reconcile these competing visions of what it means to be an Etsy seller isn’t clear. While the site wants to remain an accessible entry point for newbies, it doesn’t want the narrative arc for successful sellers to arrive at the inevitable plot point: “And then I started a real business.”

by Rob Walker, Wired |  Read more:
Photo: Zachary Zavislak

Arthur O. Sulzberger, Publisher Who Changed The Times, Dies at 86

[ed. One of the longest obituaries I think I've ever read. A history of the New York Times reflected in the life of Mr. Sulzberger.]

Arthur Ochs Sulzberger, who guided The New York Times and its parent company through a long, sometimes turbulent period of expansion and change on a scale not seen since the newspaper’s founding in 1851, died on Saturday at his home in Southampton, N.Y. He was 86.

His death, after a long illness, was announced by his family.

Mr. Sulzberger’s tenure, as publisher of the newspaper and as chairman and chief executive of The New York Times Company, reached across 34 years, from the heyday of postwar America to the twilight of the 20th century, from the era of hot lead and Linotype machines to the birth of the digital world.

The paper he took over as publisher in 1963 was the paper it had been for decades: respected and influential, often setting the national agenda. But it was also in precarious financial condition and somewhat insular, having been a tightly held family operation since 1896, when it was bought by his grandfather Adolph S. Ochs.

By the 1990s, when Mr. Sulzberger passed the reins to his son, first as publisher in 1992 and then as chairman in 1997, the enterprise had been transformed. The Times was now national in scope, distributed from coast to coast, and it had become the heart of a diversified, multibillion-dollar media operation that came to encompass newspapers, magazines, television and radio stations and online ventures.

The expansion reflected Mr. Sulzberger’s belief that a news organization, above all, had to be profitable if it hoped to maintain a vibrant, independent voice. As John F. Akers, a retired chairman of I.B.M. and for many years a Times company board member, put it, “Making money so that you could continue to do good journalism was always a fundamental part of the thinking.”

Mr. Sulzberger’s insistence on independence was shown in his decision in 1971 to publish a secret government history of the Vietnam War known as the Pentagon Papers. It was a defining moment for him and, in the view of many journalists and historians, his finest.

In thousands of pages, this highly classified archive detailed Washington’s legacy of deceit and evasion as it stumbled through an unpopular war. When the Pentagon Papers were divulged in a series of articles in June 1971, an embarrassed Nixon administration demanded that the series be stopped immediately, citing national security considerations. The Times refused, on First Amendment grounds, and won its case in the United States Supreme Court in a landmark ruling on press freedom.  (...)

A newspaper publisher may be a business executive, but the head of an institution like The Times is also inevitably cast as a leader in legal defenses of the First Amendment. It was a role Mr. Sulzberger embraced, and never with more enduring consequences than in his decision to publish the Pentagon Papers.

“This was not a breach of the national security,” Mr. Sulzberger said at the time. “We gave away no national secrets. We didn’t jeopardize any American soldiers or Marines overseas.” Of the government, he added, “It’s a wonderful way if you’ve got egg on your face to prevent anybody from knowing it, stamp it secret and put it away.”

The government obtained a temporary restraining order from a federal judge in Manhattan. It was the first time in United States history that a court, on national security grounds, had stopped a newspaper in advance from publishing a specific article. The Washington Post soon began running its own articles based on the same documents, and both papers took their case to the Supreme Court. In late June, the court issued its decision rejecting the administration’s national-security arguments and upholding a newspaper’s right to publish in the face of efforts to impose “prior restraint.”

The significance of that ruling for the future of government-press relations has been debated. But this much was certain: It established the primacy of a free press in the face of a government’s insistence on secrecy. In the 40 years since the court handed down its ruling, there has not been another instance of officially sanctioned prior restraint to keep an American newspaper from printing secret information on national security grounds.

In a 1996 speech to a group of journalists, Mr. Sulzberger said of the documents that he “had no doubt but that the American people had a right to read them and that we at The Times had an obligation to publish them.” But typically — he had an unpretentious manner and could not resist a good joke or, for that matter, a bad pun — he tried to keep even a matter this weighty from becoming too ponderous.

The fact is, Mr. Sulzberger said, the documents were tough sledding. “Until I read the Pentagon Papers,” he said, “I did not know that it was possible to read and sleep at the same time.”

Nor did he understand why President Richard M. Nixon had fought so hard “to squelch these papers,” he added.

“I would have thought that he would bemoan their publication, joyfully blame the mess on Lyndon Johnson and move on to Watergate,” Mr. Sulzberger said. “But then I never understood Washington.”

by Clyde Haberman, NY Times | Read more:
Photo: Barton Silverman

Friday, September 28, 2012

My Name is Joe Biden and I’ll Be Your Server


Hey, chief. There’s the guy. How you doin’? Got your friends here, party of six. Lady in the hat. Great to see you. My name is Joe Biden and I’ll be your server tonight. Lemme tell you a story. (He pulls up a chair and sits.)

Folks, when I was six years old my dad came to me one night. My dad was a car guy. Hard worker, decent guy. Hadn’t had an easy life. He climbed the stairs to my room one night and he sat on the edge of my bed and he said to me, he said, “Champ, your mom worked hard on that dinner tonight. She worked hard on it. She literally worked on it for hours. And when you and your brothers told her you didn’t like it, you know what, Joey? That hurt her. It hurt.” And I felt (lowers voice to a husky whisper) ashamed. Because lemme tell you something. He was right. My dad was right. My mom worked hard on that dinner, and it was delicious. Almost as delicious as our Chicken Fontina Quesadilla with Garlicky Guacamole. That’s our special appetizer tonight. It’s the special. It’s the special. (His voice rising) And the chef worked hard on it, just like my mom, God love her, and if you believe in the chef’s values of hard work and creative spicing you should order it, although if you don’t like chicken we can substitute shrimp for a small upcharge.

Thank you. Thank you. Now, hold on. There’s something else you need to know.

Our fish special is halibut with a mango-avocado salsa and Yukon Gold potatoes, and it’s market-priced at sixteen-ninety-five. Sounds like a lot of money, right? Sounds like “Hey, Joe, that’s a piece of fish and a little topping there, and some potatoes.” “Bidaydas,” my great-grandmother from County Louth would have called ’em. You know what I’m talking about. Just simple, basic, sitting-around-the-kitchen-table-on-a-Tuesday-night food. Nothin’ fancy, right? But, folks, that’s not the whole story. If you believe that, you’re not . . . getting . . . the whole . . . story. Because lemme tell you about these Yukon Gold potatoes. These Yukon Gold potatoes are brushed with extra-virgin olive oil and hand-sprinkled with pink Himalayan sea salt, and then José, our prep guy. . . . Well. Lemme tell you about José. (He pauses, looks down, clears his throat.)

I get . . . I get emotional talking about José. This is a guy who—José gets here at ten in the morning. Every morning, rain or shine. Takes the bus here. Has to transfer twice. Literally gets off one bus and onto another. Twice. Never complains. Rain, snow, it’s hailin’ out there. . . . The guy literally does not complain. Never. Never heard it. José walks in, hangs his coat on a hook, big smile on his face, says hello to everybody—Sal the dishwasher, Angie the sous-chef, Frank, Donna, Pat. . . . And then do you know what he does? Do you know what José does? I’ll tell you what he does, and folks, folks, this is the point I want to make. With his own hands, he sprinkles fresh house-grown rosemary on those potatoes (raises voice to a thundering crescendo), and they are golden brown on the outside and soft on the inside and they are delicious! They are delicious! They are delicious!

by Bill Barol, New Yorker |  Read more:
Illustration: Miguel Gallardo

Botticelli: The Birth of Venus (detail)
via:

Ume


Glass Works

[ed. The story of Corning and Gorilla Glass, the 'ultrathin, ultrastrong material of the future'.]

From above, Corning’s headquarters in upstate New York looks like a Space Invaders alien: Designed by architect Kevin Roche in the early ’90s, the structure fans out in staggered blocks. From the ground, though, the tinted windows and extended eaves make the building look more like a glossy, futuristic Japanese palace.

The office of Wendell Weeks, Corning’s CEO, is on the second floor, looking out onto the Chemung River. It was here that Steve Jobs gave the 53-year-old Weeks a seemingly impossible task: Make millions of square feet of ultrathin, ultrastrong glass that didn’t yet exist. Oh, and do it in six months. The story of their collaboration—including Jobs’ attempt to lecture Weeks on the principles of glass and his insistence that such a feat could be accomplished—is well known. How Corning actually pulled it off is not.

Weeks joined Corning in 1983; before assuming the top post in 2005, he oversaw both the company’s television and specialty glass businesses. Talk to him about glass and he describes it as something exotic and beautiful—a material whose potential is just starting to be unlocked by scientists. He’ll gush about its inherent touchability and authenticity, only to segue into a lecture about radio-frequency transparency. “There’s a sort of fundamental truth in the design value of glass,” Weeks says, holding up a clear pebble of the stuff. “It’s like a found object; it’s cool to the touch; it’s smooth but has surface to it. What you’d really want is for this to come alive. That’d be a perfect product.”

Weeks and Jobs shared an appreciation for design. Both men obsessed over details. And both gravitated toward big challenges and ideas. But while Jobs was dictatorial in his management style, Weeks (like many of his predecessors at Corning) tends to encourage a degree of insubordination. “The separation between myself and any of the bench scientists is nonexistent,” he says. “We can work in these small teams in a very relaxed way that’s still hyperintense.”

Indeed, even though it’s a big company—29,000 employees and revenue of $7.9 billion in 2011—Corning still thinks and acts like a small one, something made easier by its relatively remote location, an annual attrition rate that hovers around 1 percent, and a vast institutional memory. (S. Donald Stookey, the Corning scientist who invented glass-ceramics, now 97, and other legends still roam the halls and labs of Sullivan Park, Corning’s R&D facility.) “We’re all lifers here,” Weeks says, smiling. “We’ve known each other for a long time and succeeded and failed together a number of times.”

One of the first conversations between Weeks and Jobs actually had nothing to do with glass. Corning scientists were toying around with microprojection technologies—specifically, better ways of using synthetic green lasers. The thought was that people wouldn’t want to stare at tiny cell phone screens to watch movies and TV shows, and projection seemed like a natural solution. But when Weeks spoke to Jobs about it, Apple’s chief called the idea dumb. He did mention he was working on something better, though—a device whose entire surface was a display. It was called the iPhone.

by Bryan Gardiner, Wired |  Read more:
Photo: Max Aguilera-Hellweg

Meet Mira, the Supercomputer That Makes Universes


Cosmology is the most ambitious of sciences. Its goal, plainly stated, is to describe the origin, evolution, and structure of the entire universe, a universe that is as enormous as it is ancient. Surprisingly, figuring out what the universe used to look like is the easy part of cosmology. If you point a sensitive telescope at a dark corner of the sky, and run a long exposure, you can catch photons from the young universe, photons that first sprang out into intergalactic space more than ten billion years ago. Collect enough of these ancient glimmers and you get a snapshot of the primordial cosmos, a rough picture of the first galaxies that formed after the Big Bang. Thanks to sky-mapping projects like the Sloan Digital Sky Survey, we also know quite a bit about the structure of the current universe. We know that it has expanded into a vast web of galaxies, strung together in clumps and filaments, with gigantic voids in between.

The real challenge for cosmology is figuring out exactly what happened to those first nascent galaxies. Our telescopes don't let us watch them in time-lapse; we can't fast-forward our images of the young universe. Instead, cosmologists must craft mathematical narratives that explain why some of those galaxies flew apart from one another, while others merged and fell into the enormous clusters and filaments that we see around us today. Even when cosmologists manage to cobble together a plausible story, they find it difficult to check their work. If you can't see a galaxy at every stage of its evolution, how do you make sure your story about it matches up with reality? How do you follow a galaxy through nearly all of time? Thanks to the astonishing computational power of supercomputers, a solution to this problem is beginning to emerge: You build a new universe.

In October, the world's third-fastest supercomputer, Mira, is scheduled to run the largest, most complex universe simulation ever attempted. The simulation will cram more than 12 billion years' worth of cosmic evolution into just two weeks, tracking trillions of particles as they slowly coalesce into the web-like structure that defines our universe on a large scale. Cosmic simulations have been around for decades, but the technology needed to run a trillion-particle simulation only recently became available. Thanks to Moore's Law, that technology is getting better every year. If Moore's Law holds, the supercomputers of the late 2010s will be a thousand times more powerful than Mira and her peers. That means computational cosmologists will be able to run more simulations at faster speeds and higher resolutions. The virtual universes they create will become the testing ground for our most sophisticated ideas about the cosmos.

Salman Habib is a senior physicist at the Argonne National Laboratory and the leader of the research team working with Mira to create simulations of the universe. Last week, I talked to Habib about cosmology, supercomputing, and what Mira might tell us about the enormous cosmic web we find ourselves in.

Help me get a handle on how your project is going to work. As I understand it, you're going to create a computer simulation of the early universe just after the Big Bang, and in this simulation you will have trillions of virtual particles interacting with each other -- and with the laws of physics -- over a time period of more than 13 billion years. And once the simulation has run its course, you'll be looking to see if what comes out at the end resembles what we see with our telescopes. Is that right?

Habib: That's a good approximation of it. Our primary interest is large-scale structure formation throughout the universe and so we try to begin our simulations well after the Big Bang, and even well after the microwave background era. Let me explain why. We're not sure how to simulate the very beginning of the universe because the physics are very complicated and partially unknown, and even if we could, the early universe is structurally homogenous relative to the complexity that we see now, so you don't need a supercomputer to simulate it. Later on, at the time of the microwave background radiation, we have a much better idea about what's going on. WMAP and Planck have given us a really clear picture of what the universe looked like at that time, but even then the universe is still very homogenous -- its density perturbations are something like one part in a hundred thousand. With that kind of homogeneity, you can still do the calculations and modeling without a supercomputer. But if you fast-forward to the point where the universe is about a million times denser than it is now, that's when things get so complicated that you want to hand over the calculations to a supercomputer.

Now the trillions of particles we're talking about aren't supposed to be actual physical particles like protons or neutrons or whatever. Because these trillions of particles are meant to represent the entire universe, they are extremely massive, something in the range of a billion suns. We know the gravitational mechanics of how these particles interact, and so we evolve them forward to see what kind of densities and structure they produce, both as a result of gravity and the expansion of the universe. So, that's essentially what the simulation does: it takes an initial condition and moves it forward to the present to see if our ideas about structure formation in the universe are correct.
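[ed. For the programmatically inclined, here is a minimal sketch of the kind of gravitational N-body evolution Habib describes: plain Newtonian gravity with direct O(N^2) force summation, a hundred particles instead of trillions, and no cosmic expansion. Every name and number is illustrative, not drawn from the actual Mira code, which uses vastly more sophisticated solvers.]

```python
import numpy as np

# Illustrative Newtonian N-body sketch. Each simulation "particle"
# stands in for an enormous lump of matter (on the order of a billion
# suns), and the system is evolved forward from a near-homogeneous
# initial condition to watch structure (clumps, filaments) emerge.

G = 1.0          # gravitational constant in simulation units
N = 100          # particle count (Mira's run tracks trillions)
DT = 0.01        # integration time step
SOFTENING = 0.1  # keeps forces finite when particles pass close by

rng = np.random.default_rng(42)
pos = rng.standard_normal((N, 3))  # small random density perturbations
vel = np.zeros((N, 3))
mass = np.ones(N)

def accelerations(pos, mass):
    """Direct-sum gravitational acceleration on every particle."""
    diff = pos[None, :, :] - pos[:, None, :]       # separation vectors
    dist2 = (diff ** 2).sum(-1) + SOFTENING ** 2   # softened distances
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                  # no self-force
    return G * (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

# Leapfrog (kick-drift-kick) integration: take an initial condition and
# move it forward in time, as the cosmology codes do on a grander scale.
for step in range(500):
    vel += 0.5 * DT * accelerations(pos, mass)
    pos += DT * vel
    vel += 0.5 * DT * accelerations(pos, mass)
```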

by Ross Andersen, The Atlantic |  Read more:
Photo: Argonne National Laboratory

Thursday, September 27, 2012

Yes, Texas is Different


One of the attractions of the Bob Bullock Museum of Texas State History, an imposing institution just across the street from the campus of the University of Texas at Austin, is “The Star of Destiny,” a fifteen-minute “multimedia experience” purporting to tell “the stories of determination, perseverance, and triumph that have formed the Texas spirit.” It’s a World’s Fair-type presentation, narrated by an actor dressed up as Sam Houston and filmed to look like he’s standing onstage. What makes it “multimedia” is that, besides film, it uses slides projected on a three-screen setup and, in a segment about the gargantuan Galveston hurricane of 1900, employs strobe lights to mimic lightning and a hidden wind machine to blow great gusts of cold misty air into the startled faces of audience members.

At one point, the screens go black and we see projected in white letters:

TEXAS IS BIGGER THAN FRANCE AND ENGLAND

Black again. Then (you knew it was coming):

…COMBINED.

A bit later, another black screen/white letters sequence:

BEFORE TEXAS WAS A STATE…

(portentous pause)

TEXAS WAS A NATION

This, of course, is a reference to the Republic of Texas, as this spacious corner of the world styled itself from 1836 to 1846. In truth, the Republic of Texas was a transitional entity, the larval stage of the State of Texas. Nevertheless, “The Star of Destiny” has a point. Texas is different. It is big, for a start. Not as big as Alaska, which is bigger than France and England and Germany and Japan … combined, but big enough. And it was a nominally independent if ramshackle republic, with embassies and a Congress and everything. Vermont, Hawaii, and, arguably, California were once independent republics, too, but they don’t make a fetish of it. Texas does.

Texas is different. The qualities—the very existence—of the Bob Bullock Museum of Texas State History are evidence of that. Modesty is not the museum’s keynote. On the plaza out front is a huge sculpture of a five-pointed star. It must be twenty feet high. (“Mmm, subtle,” our ninth-grader murmured.) Inside, the exhibits are an uneasy combination of ethnic correctness and unrestrained boasting. One would think that Texas, besides being very, very great, has always been ruled by a kind of U.N. Security Council consisting of one white male, one white female, one black person, one American Indian, and one Mexican or Mexican-American, all of them exemplars of—the phrase is repeated ad nauseam—courage, determination, and hard work.

The stories the exhibits tell are mostly about the state’s economy, agricultural and industrial. Whether it’s oil extraction or cattle raising, rice farming or silicon chipmaking, quicksilver mining or sheepherding, the elements of each are usually the same. A few men become extraordinarily rich. These men are praised for their courage, determination, and hard work. The laborers whose labor produces their wealth are ruthlessly exploited. (The exhibits don’t put it this way, obviously, but the facts are there if you have eyes to see them.) These unfortunates may be poor white men; they may be Mexican immigrant women; they may be enslaved blacks or African-Americans held in sharecropper peonage. They, too, are praised for their courage, determination, and hard work. It all adds up to an unending progression of triumphs for the Texas spirit.

The boasting does not take long to taste a little sour. It begins to feel defensive and insecure. One begins to sense that the museum, on some level, knows that a lot of it is, well, bullockshit.

And yet, and yet. There are redeeming grace notes. The current temporary exhibit at the Bob Bullock Museum is one of them. It’s about Texas music: blues, rock, country, country rock, bluegrass, singer-songwriter, alt-whatever. In this exhibit, the boastfulness feels like simple accuracy and the nods to “diversity” are not a stretch. Respect is shown, properly, to Willie Nelson, Leadbelly, Stevie Ray Vaughan (whose battered Stratocaster occupies a place of honor), Janis Joplin, Big Mama Thornton, and many equally deserving others. And, as befits Austin, there’s live music. During our visit, a fine, fringed six-piece cowboy-country band played and sang a tribute to mid-century radio. All was forgiven.

Does the name Bob Bullock ring a bell? As lieutenant governor “under” George W. Bush (in Texas the post is independently elected and has powers that rival those of the governorship itself), Bob Bullock (1929-1999), a Democrat, was responsible for Dubya’s pre-Presidential reputation for bipartisanship and moderation. In his long career in state government, Bullock was, as far as I can tell, a net plus for Texas, even if his late-in-life Bush-enabling made him a net minus for the nation and the world. But you have to hand it to Texas. How many states would name their enormous marble-clad museum of state history not after a big donor but after a backroom career politician who, by the way, was also a five-times-married alcoholic?

by Hendrik Hertzberg, New Yorker |  Read more:
Photo: Paul Morse