Wednesday, January 11, 2012

With Enough Bandwidth, Many Join the Band


When Dr. John McClure, a pathologist in Edina, Minn., was pondering his wish list several years ago, he added something a little out of the ordinary: learn to play the bagpipes. But his goal seemed like a long shot after a friend who had been teaching him moved away.

Now he is getting lessons from a top-tier teacher — Jori Chisholm, whose résumé includes a first-place award at the 2010 Cowal Highland Gathering in Dunoon, Scotland. Mr. Chisholm lives in Seattle, but distance is no longer a problem — Dr. McClure now takes lessons over Skype.

They even squeeze in a lesson sometimes when Dr. McClure, 50, is at work, though he keeps the noise down by using a practice chanter, essentially a pipe without a bag. “I’ve been on call, waiting for a specimen from the O.R., and I’ll do a lesson with Jori,” Dr. McClure said.

Skype and other videochat programs have transformed the simple phone call, but the technology is venturing into a new frontier: it is upending and democratizing the world of music lessons.

Students who used to limit the pool of potential teachers to those within a 20-mile radius of their homes now take lessons from teachers — some with world-class credentials — on other coasts or continents. The list of benefits is long: Players of niche instruments now have more access to teachers. Parents can simply send their child down the hall for lessons rather than driving them. And teachers now have a new way to build their businesses.

“I’ve seen videos of individuals teaching students all over the world,” said Gary Ingle, chief executive of the Music Teachers National Association. “There will be people who would never take a music lesson unless it’s done online. As music teachers, we should be willing to meet students where they are.”

by Catherine Saint Louis, NY Times |  Read more:
Photo: T.C. Worley for The New York Times

Sue Corr, untitled 10

Research Bought, Then Paid For

Through the National Institutes of Health, American taxpayers have long supported research directed at understanding and treating human disease. Since 2009, the results of that research have been available free of charge on the National Library of Medicine’s Web site, allowing the public (patients and physicians, students and teachers) to read about the discoveries their tax dollars paid for.

But a bill introduced in the House of Representatives last month threatens to cripple this site. The Research Works Act would forbid the N.I.H. to require, as it now does, that its grantees provide copies of the papers they publish in peer-reviewed journals to the library. If the bill passes, to read the results of federally funded research, most Americans would have to buy access to individual articles at a cost of $15 or $30 apiece. In other words, taxpayers who already paid for the research would have to pay again to read the results.

This is the latest salvo in a continuing battle between the publishers of biomedical research journals like Cell, Science and The New England Journal of Medicine, which are seeking to protect a valuable franchise, and researchers, librarians and patient advocacy groups seeking to provide open access to publicly funded research.

The bill is backed by the powerful Association of American Publishers and sponsored by Representatives Carolyn B. Maloney, Democrat of New York, and Darrell Issa, Republican of California. The publishers argue that they add value to the finished product, and that requiring them to provide free access to journal articles within a year of publication denies them their fair compensation. After all, they claim, while the research may be publicly funded, the journals are not.

by Michael B. Eisen, NY Times |  Read more:

Lockdown: The Coming War on General-Purpose Computing


General-purpose computers are astounding. They're so astounding that our society still struggles to come to grips with them, what they're for, how to accommodate them, and how to cope with them. This brings us back to something you might be sick of reading about: copyright.

But bear with me, because this is about something more important. The shape of the copyright wars clues us into an upcoming fight over the destiny of the general-purpose computer itself.

In the beginning, we had packaged software and we had sneakernet. We had floppy disks in ziplock bags, in cardboard boxes, hung on pegs in shops, and sold like candy bars and magazines. They were eminently susceptible to duplication, and they were duplicated quickly and widely, to the great chagrin of the people who made and sold software.

Enter Digital Rights Management in its most primitive forms: let's call it DRM 0.96. They introduced physical indicia which the software checked for—deliberate damage, dongles, hidden sectors—and challenge-response protocols that required possession of large, unwieldy manuals that were difficult to copy.

These failed for two reasons. First, they were commercially unpopular, because they reduced the usefulness of the software to the legitimate purchasers. Honest buyers resented the non-functionality of their backups, they hated the loss of scarce ports to the authentication dongles, and they chafed at the inconvenience of having to lug around large manuals when they wanted to run their software. Second, these didn't stop pirates, who found it trivial to patch the software and bypass authentication. People who took the software without paying for it were untouched.

Typically, a programmer with technology and expertise as sophisticated as the software vendor's own would reverse-engineer the software and circulate cracked versions. While this sounds highly specialized, it really wasn't. Figuring out what recalcitrant programs were doing and routing around media defects were core skills for computer programmers, especially in the era of fragile floppy disks and the rough-and-ready early days of software development. Anti-copying strategies only became more fraught as networks spread; once we had bulletin boards, online services, USENET newsgroups and mailing lists, the expertise of people who figured out how to defeat these authentication systems could be packaged up in software as little crack files. As network capacity increased, the cracked disk images or executables themselves could be spread on their own.

This gave us DRM 1.0. By 1996, it became clear to everyone in the halls of power that there was something important about to happen. We were about to have an information economy, whatever the Hell that was. They assumed it meant an economy where we bought and sold information. Information technology improves efficiency, so imagine the markets that an information economy would have! You could buy a book for a day, you could sell the right to watch the movie for a Euro, and then you could rent out the pause button for a penny per second. You could sell movies for one price in one country, at another price in another, and so on. The fantasies of those days were like a boring science fiction adaptation of the Old Testament Book of Numbers, a tedious enumeration of every permutation of things people do with information—and what might be charged for each.

Unfortunately for them, none of this would be possible unless they could control how people use their computers and the files we transfer to them. After all, it was easy to talk about selling someone a tune to download to their MP3 player, but not so easy to talk about the right to move music from the player to another device. But how the Hell could you stop that once you'd given them the file? In order to do so, you needed to figure out how to stop computers from running certain programs and inspecting certain files and processes. For example, you could encrypt the file, and then require the user to run a program that only unlocked the file under certain circumstances.
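
The mechanism described here — encrypt the content and let a player program decrypt it only when a policy check passes — can be made concrete in a few lines. The sketch below is purely illustrative, not Doctorow's code or any real DRM scheme; the function names, the licensing rule, and the use of Python's `cryptography` library are my assumptions. It also shows where the scheme strains: the key and the check have to ship to the very machine the vendor is trying to constrain.

```python
# Minimal sketch of DRM-style conditional decryption (illustrative only;
# all names and the licensing rule are hypothetical).
from cryptography.fernet import Fernet  # pip install cryptography

def package_track(plaintext: bytes) -> tuple[bytes, bytes]:
    """Vendor side: encrypt the track, returning (key, ciphertext)."""
    key = Fernet.generate_key()
    return key, Fernet(key).encrypt(plaintext)

def play_track(key: bytes, ciphertext: bytes, device_id: str, licensed_device: str) -> bytes:
    """Player side: unlock the file only 'under certain circumstances'."""
    if device_id != licensed_device:
        raise PermissionError("this device is not licensed to play this track")
    return Fernet(key).decrypt(ciphertext)

key, blob = package_track(b"...audio bytes...")
print(play_track(key, blob, device_id="player-123", licensed_device="player-123"))

# The catch: for playback to work at all, both the key and the policy check
# must live on the user's machine, where a determined user can find and remove them.
```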

But, as they say on the Internet, now you have two problems.

by Cory Doctorow, Boing Boing |  Read more:

Tuesday, January 10, 2012

How Many Stephen Colberts Are There?


There used to be just two Stephen Colberts, and they were hard enough to distinguish. The main difference was that one thought the other was an idiot. The idiot Colbert was the one who made a nice paycheck by appearing four times a week on “The Colbert Report” (pronounced in the French fashion, with both t’s silent), the extremely popular fake news show on Comedy Central. The other Colbert, the non-idiot, was the 47-year-old South Carolinian, a practicing Catholic, who lives with his wife and three children in suburban Montclair, N.J., where, according to one of his neighbors, he is “extremely normal.” One of the pleasures of attending a live taping of “The Colbert Report” is watching this Colbert transform himself into a Republican superhero.

Suburban Colbert comes out dressed in the other Colbert’s guise — dark two-button suit, tasteful Brooks Brothersy tie, rimless Rumsfeldian glasses — and answers questions from the audience for a few minutes. (The questions are usually about things like Colbert’s favorite sport or favorite character from “The Lord of the Rings,” but on one memorable occasion a young black boy asked him, “Are you my father?” Colbert hesitated a moment and then said, “Kareem?”) Then he steps onstage, gets a last dab of makeup while someone sprays his hair into an unmussable Romney-like helmet, and turns himself into his alter ego. His body straightens, as if jolted by a shock. A self-satisfied smile creeps across his mouth, and a manically fatuous gleam steals into his eyes.

Lately, though, there has emerged a third Colbert. This one is a version of the TV-show Colbert, except he doesn’t exist just on screen anymore. He exists in the real world and has begun to meddle in it. In 2008, the old Colbert briefly ran for president, entering the Democratic primary in his native state of South Carolina. (He hadn’t really switched parties, but the filing fee for the Republican primary was too expensive.) In 2010, invited by Representative Zoe Lofgren, he testified before Congress about the problem of illegal-immigrant farmworkers and remarked that “the obvious answer is for all of us to stop eating fruits and vegetables.”

But those forays into public life were spoofs, more or less. The new Colbert has crossed the line that separates a TV stunt from reality and a parody from what is being parodied. In June, after petitioning the Federal Election Commission, he started his own super PAC — a real one, with real money. He has run TV ads, endorsed (sort of) the presidential candidacy of Buddy Roemer, the former governor of Louisiana, and almost succeeded in hijacking and renaming the Republican primary in South Carolina. “Basically, the F.E.C. gave me the license to create a killer robot,” Colbert said to me in October, and there are times now when the robot seems to be running the television show instead of the other way around.

“It’s bizarre,” remarked an admiring Jon Stewart, whose own program, “The Daily Show,” immediately precedes “The Colbert Report” on Comedy Central and is where the Colbert character got his start. “Here is this fictional character who is now suddenly interacting in the real world. It’s so far up its own rear end,” he said, or words to that effect, “that you don’t know what to do except get high and sit in a room with a black light and a poster.”

by Charles McGrath, NY Times |  Read more:
Photo: Todd Heisler/The New York Times

The Fragile Teenage Brain


If the sport of football ever dies, it will die from the outside in. It won't be undone by a labor lockout or a broken business model — football owners know how to make money. Instead, the death will start with those furthest from the paychecks, the unpaid high school athletes playing on Friday nights. It will begin with nervous parents reading about brain trauma, with doctors warning about the physics of soft tissue smashing into hard bone, with coaches forced to bench stars for an entire season because of a single concussion. The stadiums will still be full on Sunday, the professionals will still play, the profits will continue. But the sport will be sick.

The sickness will be rooted in football's tragic flaw, which is that it inflicts concussions on its players with devastating frequency. Although estimates vary, several studies suggest that up to 15 percent of football players suffer a mild traumatic brain injury during the season. (The odds are significantly worse for student athletes — the Centers for Disease Control and Prevention estimates that nearly 2 million brain injuries are suffered by teenage players every year.) In fact, the chances of getting a concussion while playing high school football are approximately three times higher than in the second most dangerous sport, which is girls' soccer. While such head injuries have long been ignored — until recently, players were resuscitated with smelling salts so they could re-enter the game — it's now clear that these blows have lasting consequences.

The consequences appear to be particularly severe for the adolescent brain. According to a study published last year in Neurosurgery, high school football players who suffered two or more concussions reported mental problems at much higher rates, including headaches, dizziness, and sleeping issues. The scientists describe these symptoms as "neural precursors," warning signs that something in the head has gone seriously wrong.

This research builds on previous work documenting the hazards of football for the teenage brain. In 2002, a team of neurologists surveying several hundred high school football players concluded that athletes who had suffered three or more concussions were nearly ten times more likely to exhibit multiple "abnormal" responses to head injury, including loss of consciousness and persistent amnesia. A 2004 study, meanwhile, revealed that football players with multiple concussions were 7.7 times more likely to experience a "major drop in memory performance" and that three months after a concussion they continued to experience "persistent deficits in processing complex visual stimuli." What's most disturbing, perhaps, is that these cognitive deficits have a real-world impact: When compared with similar students without a history of concussions, athletes with two or more brain injuries have significantly lower grade-point averages.

by Jonah Lehrer, Grantland |  Read more:
Photo: Charles LeClaire/US Presswire

Michael Kiwanuka


Advice From Life’s Graying Edge on Finishing With No Regrets

At 17, I wrote a speech titled, “When You Come to the End of Your Days, Will You Be Able to Write Your Own Epitaph?” It reflected the approach to life I adopted after my mother’s untimely death from cancer at age 49. I chose to live each day as if it could be my last — but with a watchful eye on the future in case it wasn’t.

My goal was, and still is, to die without regrets.

For more than 50 years, this course has served me well, including my decision to become a science journalist instead of pursuing what had promised to be a more lucrative and prestigious, but probably less enjoyable, career as a biochemist. I find joy each day in mundane things too often overlooked: sunrises and sunsets, an insect on a flower, crows chasing a hawk, a majestic tree, a child at play, an act of kindness toward a stranger.

Eventually, most of us learn valuable lessons about how to conduct a successful and satisfying life. But for far too many people, the learning comes too late to help them avoid painful mistakes and decades of wasted time and effort.

In recent years, for example, many talented young people have denied their true passions, choosing instead to pursue careers that promise fast and big monetary gains. High rates of divorce speak to an impulsiveness to marry and a tenuous commitment to vows of “till death do us part.”

Parents undermine children’s self-confidence and self-esteem by punishing them physically or pushing them down paths, both academic and athletic, that they are ill equipped to follow. And myriad prescriptions for antidepressants and anti-anxiety drugs reflect a widespread tendency to sweat the small stuff, a failure to recognize time-honored sources of happiness, and a reliance on material acquisitions that provide only temporary pleasure.

Enter an invaluable source of help, if anyone is willing to listen while there is still time to take corrective action. It is a new book called “30 Lessons for Living” (Hudson Street Press) that offers practical advice from more than 1,000 older Americans from different economic, educational and occupational strata who were interviewed as part of the ongoing Cornell Legacy Project.

Its author, Karl Pillemer, a professor of human development at the College of Human Ecology at Cornell and a gerontologist at the Weill Cornell Medical College, calls his subjects “the experts,” and their advice is based on what they did right and wrong in their long lives. Many of the interviews can be viewed at legacyproject.human.cornell.edu.

Here is a summary of their most salient thoughts.

by Jane Brody, NY Times |  Read more:

    All They That Labored

    Scholars piece together the monumental job of creating the King James Bible—and reinterpret its legacy

    Generations of Protestant Christians have heard God speaking through the language of the King James Bible. Four hundred years after it was first published, in 1611, it still has an unrivalled reputation as a shaper of English prose, its phrases a lasting contribution to how we use the language. It's given us such expressions as "out of the mouth of babes," "suffer fools gladly," "seek, and ye shall find," and "Am I my brother's keeper?"

    Yet the 50 or so learned men who labored in teams to create the King James Bible did not set out to create a literary masterpiece. They wanted to establish as direct a connection as they could to the original languages of the Old and New Testaments. And it's not a miracle that this monumental exercise in translation-by-committee turned out as well as it did. By the time they set to work, in 1604, the King James translators had a hundred years of pioneering work on which to draw. They leaned heavily on texts and translations put together by theologians and linguists such as Erasmus and William Tyndale.

    In recent decades, scholarship on the making of the King James Bible has made it plain just how much cumulative human labor and debate went into its creation. "The King James Bible didn't drop from the sky in 1611," says Helen Moore, a fellow and tutor in English at Corpus Christi College at the University of Oxford. Moore led the curatorial committee that put together "Manifold Greatness," an anniversary exhibit at Oxford's Bodleian Library devoted to the making of the King James Bible. The most famous Bible in English, she says, was "made by many different people in many different places using many different people's words and many reference texts."

    by Jennifer Howard, Chronicle of Higher Education |  Read more:
    Photo: Annotated text, Bodleian Library, University of Oxford, 2011

    The Incredibly, Insanely, Undeniably Awesome Return of Van Halen


    Van Halen performed at Café Wha? last night. It’s possible you’ve already heard reports of this, since Café Wha? only holds 250 people and just about every single person inside the venue was a journalist, an industry bozo, or a former Wimbledon champion (John McEnroe was there). This event was partially the result of Café Wha? being previously owned by David Lee Roth’s 92-year-old uncle, but it mostly happened because Van Halen assumed unfathomable intimacy would be an easy way to remind the media that they’re still awesome. The stage was about 15 feet long and eight feet deep; in 1981, it’s possible Roth could have touched the ceiling with his foot, or at least with his samurai sword. It was a little like watching Darryl Dawkins dunk over Kareem Abdul-Jabbar on a Nerf hoop in your grandparents' basement.

    So, just to be clear: Van Halen is still awesome.

    They were really, truly, absolutely incredible. Their 45-minute performance exceeded my expectations, which were unrealistically high to begin with. The musicianship was muscular and impeccable. After Dimebag Darrell’s funeral and Sammy Hagar’s autobiography, I had a real fear that Eddie Van Halen was going to come across as a stumbling, vomiting, toothless hobbit; in actuality, he was flawless and (seemingly) quite happy. Alex Van Halen was a little restrained owing to the size of the room, but his drumming remained precise and propulsive. Eddie’s son Wolfgang was equally competent on bass and did a remarkable job simulating Michael Anthony’s soaring background vocals, even on songs like “Dance the Night Away.” As a pure power trio, Van Halen has virtually no peers. Robert Christgau once wrote that “this music belongs on an aircraft carrier,” which he meant as a criticism — but for anyone who loves Van Halen, that reality defines the magnitude of their merit. These guys are hydro-electric destroyers. Watching Eddie Van Halen play guitar is like watching the detonation of a nuclear bomb from inside the warhead.

    by Chuck Klosterman, Grantland |  Read more:
    Photo Courtesy of Chuck Klosterman

    The Willpower Trick


    January is the month of broken resolutions. The gyms are packed for a week, Jenny Craig is full of new recruits and houses are cleaned for the first time in ages. We pledge to finally become the person we want to be: svelte, neat and punctual.

    Alas, it doesn’t take long before the stairmasters are once again sitting empty and those same dirty T-shirts are piling up at the back of the closet. We start binging on pizza and beer — sorry, Jenny — and forget about that pledge to become a kinder, gentler person. Human habits, in other words, are stubborn things, which helps explain why 88 percent of all resolutions end in failure, according to a 2007 survey of over 3,000 people conducted by the British psychologist Richard Wiseman. (...)

    Is there a way out of this willpower trap? Are there secret exercises that can make it easier to stick with our New Year’s resolutions? Not really. Baumeister has found that getting people to focus on incremental improvements, such as the posture of the back, can build up levels of self-control, just as doing bicep curls can strengthen the upper arm. Nevertheless, it’s not clear that most people even have the discipline to focus on their posture for an extended period, or that these willpower gains will last over the long term.

    But there is a neat way to circumvent the intrinsic weakness of the will, which helps explain why some people have a much easier time sticking to their diet and getting to the gym. A fascinating new paper, led by an all-star team of willpower researchers including Wilhelm Hofmann, Baumeister and Kathleen Vohs, gave 205 participants in Würzburg, Germany, a specially designed smartphone. For seven days, the subjects were pinged seven times a day and asked to report whether they were experiencing a strong desire. The participants were asked to describe the nature of their desire, how strongly it was felt, and whether it caused an “internal conflict,” suggesting that this was a desire they were attempting to resist. If a conflict existed, the subjects were asked to describe their ensuing success: Did they manage to not eat the ice cream? The researchers suggest that this is the first time experience-sampling methods have been used to “map the course of desire and self-control in everyday life.”

    Christian Jarrett, at the excellent BPS Research Digest, summarizes the results:
    The participants were experiencing a desire on about half the times they were beeped. Most often (28 per cent) this was hunger. Other common urges were related to: sleep (10 per cent), thirst (9 per cent), media use (8 per cent), social contact (7 per cent), sex (5 per cent), and coffee (3 per cent). About half of these desires were described as causing internal conflict, and an attempt was made to actively resist about 40 per cent of them. Desires that caused conflict were more likely to prompt an attempt at active self-constraint. Such resistance was often effective. In the absence of resistance, 70 per cent of desires were consummated; with resistance this fell to 17 per cent.
    But not everyone was equally successful at resisting the psychological conflict triggered by unwanted wants. According to the survey data, people with higher levels of self-control had just as many desires, but they were less likely to feel that their desires were dangerous. Their desires also tended to be less intense, and thus required less inner strength to resist.

    These findings are incredibly revealing, as they document the banal secret of willpower. It’s not that these people have immaculate wills, able to stare down tempting calories. Instead, they are able to intelligently steer clear of situations that trigger problematic desires. They don’t resist temptation — they avoid it entirely. While unsuccessful dieters try to not eat the ice cream in their freezer, thus quickly exhausting their limited willpower resources, those high in self-control refuse to even walk down the ice cream aisle in the supermarket.

    by Jonah Lehrer, Wired |  Read more:
    Image: lucidtech/Flickr/CC-licensed

    Monday, January 9, 2012


    Gold fireflies in Japan

    How (not) to Communicate New Scientific Information

    In 1983, at the Urodynamics Society meeting in Las Vegas, Professor G.S. Brindley first announced to the world his experiments on self-injection with papaverine to induce a penile erection. This was the first time that an effective medical therapy for erectile dysfunction (ED) was described, and it was a historic development in the management of ED. The way in which this information was first reported was unique and memorable, and provides an interesting context for the development of therapies for ED. I was present at this extraordinary lecture, and the details are worth sharing. Although this lecture was given more than 20 years ago, the details have remained fresh in my mind, for reasons which will become obvious.

    The lecture, which had an innocuous title along the lines of ‘Vaso-active therapy for erectile dysfunction’ was scheduled as an evening lecture of the Urodynamics Society in the hotel in which I was staying. I was a senior resident, hungry for knowledge, and at the AUA I went to every lecture that I could. About 15 min before the lecture I took the elevator to go to the lecture hall, and on the next floor a slight, elderly looking and bespectacled man, wearing a blue track suit and carrying a small cigar box, entered the elevator. He appeared quite nervous, and shuffled back and forth. He opened the box in the elevator, which became crowded, and started examining and ruffling through the 35 mm slides of micrographs inside. I was standing next to him, and could vaguely make out the content of the slides, which appeared to be a series of pictures of penile erection. I concluded that this was, indeed, Professor Brindley on his way to the lecture, although his dress seemed inappropriately casual.

    The lecture was given in a large auditorium, with a raised lectern separated by some stairs from the seats. This was an evening programme, between the daytime sessions and an evening reception. It was relatively poorly attended, perhaps 80 people in all. Most attendees came with their partners, clearly on the way to the reception. I was sitting in the third row, and in front of me were about seven middle-aged male urologists, and their partners in ‘full evening regalia’.

    Professor Brindley, still in his blue track suit, was introduced as a psychiatrist with broad research interests. He began his lecture without aplomb. He had, he indicated, hypothesized that injection with vasoactive agents into the corporal bodies of the penis might induce an erection. Lacking ready access to an appropriate animal model, and cognisant of the long medical tradition of using oneself as a research subject, he began a series of experiments on self-injection of his penis with various vasoactive agents, including papaverine, phentolamine, and several others. (While this is now commonplace, at the time it was unheard of). His slide-based talk consisted of a large series of photographs of his penis in various states of tumescence after injection with a variety of doses of phentolamine and papaverine. After viewing about 30 of these slides, there was no doubt in my mind that, at least in Professor Brindley's case, the therapy was effective. Of course, one could not exclude the possibility that erotic stimulation had played a role in acquiring these erections, and Professor Brindley acknowledged this.

    The Professor wanted to make his case in the most convincing style possible. He indicated that, in his view, no normal person would find the experience of giving a lecture to a large audience to be erotically stimulating or erection-inducing. He had, he said, therefore injected himself with papaverine in his hotel room before coming to give the lecture, and deliberately wore loose clothes (hence the track-suit) to make it possible to exhibit the results. He stepped around the podium, and pulled his loose pants tight up around his genitalia in an attempt to demonstrate his erection.

    At this point, I, and I believe everyone else in the room, was agog. I could scarcely believe what was occurring on stage. But Prof. Brindley was not satisfied. He looked down sceptically at his pants and shook his head with dismay. ‘Unfortunately, this doesn’t display the results clearly enough’. He then summarily dropped his trousers and shorts, revealing a long, thin, clearly erect penis. There was not a sound in the room. Everyone had stopped breathing.

    by Laurence Klotz, Wiley Online Library |  Read more:

    perfect storm

    Sex, Bombs and Burgers


    Our lives today are more defined by technology than ever before. Thanks to Skype and Google, we can video chat with our family from across the planet. We have robots to clean our floors and satellite TV that allows us to watch anything we want, whenever we want it. We can reheat food at the touch of a button. But without our basest instincts — our most violent and libidinous tendencies — none of this would be possible. Indeed, if Canadian tech journalist Peter Nowak is to be believed, the key drivers of 20th-century progress were bloodlust, gluttony and our desire to get laid.

    In his new book, “Sex, Bombs and Burgers,” Nowak argues that porn, fast food and the military have completely reshaped modern technology and our relationship to it. He points to inventions like powderized food, which emerged out of the Second World War effort and made restaurant chains like McDonald’s and Dairy Queen possible. He shows how outsourced phone sex lines have helped bring wealth to poor countries, like Guyana. And he explains how pornography helped drive both the home entertainment industry and modern Web technology, like video chat. An entertaining and well-researched read, filled with surprising facts, “Sex, Bombs and Burgers” offers a provocative alternate history of 20th-century progress.

    Salon spoke with Nowak over the phone from Toronto about the importance of the Second World War, the military roots of the Barbie Doll and why the Roomba is our future.

    How would you summarize the broader argument behind the book?

    It’s a look at some of the darker instincts that we as a race have: the need to fight, the need to engorge ourselves and the need to reproduce. Despite thousands of years of conscious evolution, we haven’t been able to escape those things. It’s the story of how our negative side has resulted in some of our most positive accomplishments.

    So much of the technology you talk about came out of the Second World War. Why was that period so important for innovation?

    It was when the military really started spending a lot of money on research. At one point during the war, the U.S. was devoting something like 85 percent of its entire income to military spending. So when you take that kind of effort and those resources and that brainpower and you devote them to one particular thing, the effects are going to be huge and long-lasting, which is why World War II was probably the most important technological event in human history. And the sequel, at least technologically speaking, to that period was the Space Race. I’m of the belief that cancer could be cured if somebody in the United States would dedicate the same kinds of resources in the same amount of time as it did to developing the atom bomb and putting someone on the moon.

    What kinds of things came out of the war?

    The food innovations that happened during the war paved the way for the rest of the 20th century. The U.S. military had to move large numbers of troops over to other parts of the world and then feed them, so a lot of techniques were created and perfected, from packaging to dehydrating and powderizing foods. Powdered coffee and powdered milk came of age during World War II. These advancements in food processing techniques created the foundation of the food plentifulness in the U.S. and created the opportunity for countries to become global food exporting powers.

    Plastics are interesting because — 60 years later it’s hard for us to think about this — they really revolutionized the way everything was done, because materials were running short in every sense during the war. During the war, there was a lot of emphasis put on creating synthetic materials and chemicals. These plastics were used during the war for things like insulating cables or lining drums or coating bullets. Then, after the war, chemical-makers like Dow started to come up with new uses for these things, which translated into everything from Tupperware to Saran wrap to Teflon to Silly Putty to Barbie dolls.

    by Thomas Rogers, Salon |  Read more:
    Photo: (Credit: Olinchuck and Anetlanda via Shutterstock/Wikipedia)

    Get a Midlife

    You may be surprised to learn that when researchers asked people over 65 to pick the age they would most like to return to, the majority bypassed the wild and wrinkle-less pastures of their teens, 20s and 30s, and chose their 40s.

    We are more accustomed to seeing the entry into middle age treated as a punch line or a cause for condolences. Despite admonishments that “50 is the new 30,” middle age continues to be used as a metaphor for decline or stasis. Having just completed a book about the history and culture of middle age, I found that the first question people asked me was, “When does it begin?” anxiously hoping to hear a number they hadn’t yet reached.

    Elderly people who find middle age to be the most desirable period of life, however, are voicing what was a common sentiment in the 19th century, when the idea of a separate stage of development called “middle age” began to emerge. Although middle age may seem like a universal truth, it is actually as much of a manufactured creation as polyester or the rules of chess. And like all the other so-called stages into which we have divvied up the uninterrupted flow of life, middle age, too, is a cultural fiction, a story we tell about ourselves.

    The story our great-great-great-grandparents told was that midlife was the prime of life. “Our powers are at the highest point of development,” The New York Times declared in 1881, “and our power of disciplining these powers should be at their best.”

    Yes, yes, you think, bully for higher powers and all, but what about thickening waistlines, sagging skin, aching knees and multiplying responsibilities for aging, ailing parents? Is there anyone past 40 who, at one point or other, hasn’t pushed aside qualms and pushed back the skin above their cheekbones to smooth out those deepening nasolabial folds? Gym addicts aside, when it comes to face and physique, middle age doesn’t have a chance.

    The problem with the physical inventory of middle age, though, is that it inevitably emphasizes loss — the end of fertility, decreased stamina, the absence of youth. Middle age begins, one cultural critic declared, the moment you think of yourself as “not young.” The approach is the same as that taken by physicians and psychologists, who have defined wellness and happiness in terms of what was missing: health was an absence of illness; a well-adjusted psyche meant an absence of depression and dysfunction.

    The most recent research on middle age, by contrast, has looked at gains as well as deficits. To identify the things that contribute to feeling fulfilled and purposeful, Carol Ryff, the director of the Institute on Aging at the University of Wisconsin, Madison, developed a list of questions to measure well-being and divided them into six broad categories: personal growth (having new experiences that challenge how you think about yourself); autonomy (having confidence in your opinions even if they are contrary to the general consensus); supportive social relationships; self-regard (liking most aspects of your personality); control of your life; and a sense of purpose.

    by Patricia Cohen, NY Times |  Read more:
    Illustration: Gemma Correl

    Sunday, January 8, 2012

    How Scientists Came to Love the Whale


    “Whale Carpaccio — 130 Kroner.”

    Thus read an appetizer on a menu at a restaurant in Bergen, Norway, when I dined there a few years back. I wanted to sample this odd dish. What would the experience be like? Would the meat be chewy like pork, or flaky like fish?

    These were my thoughts when the waitress approached and asked (maybe a little sadistically?) if I’d like to “try the whale.” But before I could signal my assent, somewhere in the back of my mind a fuzzy ’70s-era television memory arose — the image of a Greenpeace Zodiac bobbing on the high seas defensively poised between a breaching whale and a Soviet harpoon cannon. “No,” I said, “I’ll have the mussels.”

    I reprise this anecdote here not to show how evolved I am, but rather to juxtapose my hazy whale-belief structure with the much more nuanced understanding of a man who has immersed himself in the subtleties, trickeries, scandals and science of cetaceans. D. Graham Burnett, the author of “The Sounding of the Whale,” a sweeping, important study of cetacean science and policy, has quite literally “tried the whale” and could probably describe for you whale meat’s precise consistency. But he has also been tried by the whale in the deepest sense, because he spent a decade poring over thousands upon thousands of pages scattered in far-flung archives. If the whale swallowed Jonah whole, then Burnett has made a considerable effort to get as much of the whale as possible down his voluminous intellectual gullet.

    A reviewer pressed for time could, in lieu of an essay, put together a very respectable (or at least very weird) collage of all the “you’re kidding me, right?” facts about whales and whaling that appear on almost every one of Burnett’s information-soaked pages. That the waxy plug in a whale’s ear might work as a sound lens focusing song from miles away. That the Japanese World War II pilots who spotted submarines were retrained, postwar, to find whales. That whale scientists were seriously considering using tropical atolls as corrals for whale farms. But what makes Burnett’s book notable is the big-picture arc he traces, from the early “hip-booted” cetologist who earned his stripes “the old-fashioned way, by cutting his way into the innards of hundreds of whales while standing in the icy slurry of an Antarctic whaling station,” right up to the scientist-turned-Age-of-Aquarius-psychedelic-guru who studied the behavior of living cetaceans, drawing such conclusions as “Whales and dolphins quite naturally go in the directions we call spiritual, in that they get into meditative states quite simply and easily.” While tracking the evolution of something he probably wouldn’t quite call interspecies empathy (but that I might), Burnett keeps a cool head and gives what should become the definitive account of whalekind’s transformation from cipher to signifier.

    by Paul Greenberg, NY Times |  Read more:
    Photo: Associated Press via NY Times


    photo: markk