Friday, February 3, 2012
Atomic Bread Baking at Home
When Hana enters the small bakery I have borrowed for a day, I am dividing a loaf into 1.5-centimeter slices. The loaf's tranches articulate a white fanned deck, each one the exact counterpart of its fellows. The bread is smooth and uniform, like a Bauhaus office block. There are no unneeded flourishes or swags. Each symmetrical slice shines so white it is almost blue. This is a work of modern art. My ten-year-old daughter does not pause to say hello. She rushes to the cutting board, aghast, and blurts, "It's fake!" Then she devours a piece in three bites, and asks for more.
I have just spent a day re-creating the iconic loaf of 1950s-era soft white industrial bread, using easily acquired ingredients and home kitchen equipment. With the help of a 1956 government report detailing a massive, multiyear attempt to formulate the perfect loaf of white bread, achieving that re-creation proved relatively easy. Until Hana's arrival, however, I did not fully understand why I was doing it. I had sensed that extracting this industrial miracle food of yesteryear from the dustbin of kitsch might have something to teach about present-day efforts to change the food system; that it might offer perspective on our own confident belief that artisanal eating can restore health, rebuild community, and generally save the world. But, really, it was reactions like Hana's that I wanted to understand. How can a food be so fake and yet so eagerly eaten, so abhorred and so loved?
Sliced white bread as we know it today is the product of early twentieth-century streamlined design. It is the Zephyr train of food. But, in the American imagination, industrial loaves are more typically associated with the late '50s and early '60s—the Beaver Cleaver days of Baby Boomer nostalgia, the Golden Age of Wonder Bread. This is not without justification: during the late '50s and early '60s, Americans ate a lot of it. Across race, class, and generational divides, Americans consumed an average of a pound and a half of white bread per person, every week. Indeed, until the late '60s, Americans got from 25 to 30 percent of their daily calories from the stuff, more than from any other single item in their diet (and far more than any single item contributes to the American diet today—even high-fructose corn syrup).
Only a few years earlier, however, as world war morphed into cold war, the future of industrial bread looked uncertain. On the cusp of the Wonder years, Americans still ate enormous quantities of bread, but, even so, government officials and baking-industry experts worried that bread would lose its central place on the American table. In a world of rising prosperity and exciting new processed foods, the Zephyr train of food looked a bit tarnished. And so, in 1952, hoping to offset possible declines in bread consumption, the U.S. Department of Agriculture teamed up with baking-industry scientists to launch the Manhattan Project of bread.
Conceived as an intensive panoramic investigation of the country's bread-eating habits, the project had ambitious goals: First, gain a precise, scientific understanding of exactly how much and what kind of bread Americans ate, when and why they ate it, and what they thought about it. Second, use that information to engineer the perfect loaf of white bread—a model for all industrial white bread to come.
by Aaron Bobrow-Strain, The Believer | Read more:

Colbert v. the Court
The Supreme Court has always had its critics. Chief Justice John Marshall had to contend with the temper of President Andrew Jackson (“John Marshall has made his decision, now let him enforce it!”). And Chief Justice Charles Evans Hughes went toe-to-toe with FDR, who wouldn’t let up with the court-packing. But in the history of the Supreme Court, nothing has ever prepared the justices for the public opinion wrecking ball that is Stephen Colbert. The comedian/presidential candidate/super PAC founder has probably done more to undermine public confidence in the court’s 2010 Citizens United opinion than anyone, including the dissenters. In this contest, the high court is supremely outmatched.
Citizens United, with an assist from the 1976 decision Buckley v. Valeo, has led to the farce of unlimited corporate election spending, “uncoordinated” super PACs that coordinate with candidates, and a noxious round of attack ads, all of which are protected in the name of free speech. Colbert has been educating Americans about the resulting insanity for months now. His broadside against the court raises important questions about satire and the court, about protecting the dignity of the institution, and about the role of modern media in public discourse. Also: The fight between Colbert and the court is so full of ironies, it can make your molars hurt.
When President Obama criticized Citizens United two years ago in his State of the Union address, at least three justices came back at him with pitchforks and shovels. In the end, most court watchers scored it a draw. But when a comedian with a huge national platform started ridiculing the court last summer, the stakes changed completely. This is no pointy-headed deconstruction unspooling on the legal blogs. Colbert has spent the past few months making every part of Justice Anthony Kennedy’s majority opinion in Citizens United look utterly ridiculous. And the court, which has no access to cameras (by its own choosing), no press arm, and no discernible comedic powers, has had to stand by and take it on the chin.
It all started when Colbert announced that, as permitted by Citizens United, he planned to form a super PAC (“Making a better tomorrow, tomorrow”). As he explained to his viewers, his hope was that “Colbert Nation could have a voice, in the form of my voice, shouted through a megaphone made of cash ... the American dream. And that dream is simple. That anyone, no matter who they are, if they are determined, if they are willing to work hard enough, someday they could grow up to create a legal entity which could then receive unlimited corporate funds, which could be used to influence our elections."
by Dahlia Lithwick, Slate | Read more:
Photograph by Richard Ellis/Getty Images.
The Ballad of Mark Zuckerberg
Mark Zuckerberg, these days, isn't just known as one of the world's youngest billionaires, or as the CEO of a company that just filed Silicon Valley's biggest-ever IPO. He has also become, through his leadership of Facebook, a kind of PR person for publicity itself, working to connect the world of the web one friend at a time.
It wasn't always that way, though. For Zuckerberg, it's been a slow, and sometimes painful, evolution from an entrepreneur of the social web to an evangelist for it.
Phase 1: The Fickle Founder
Business Insider reported this week on a series of IM chats between the 19-year-old Zuckerberg and Adam D'Angelo, his best friend from high school (and the guy who would later become Facebook's CTO). In the chat below, Zuckerberg and D'Angelo discuss a dating site Zuck is developing for fellow Harvard students -- and consider how it'll fit in with "the Facebook thing."
Zuckerberg: I ... hate the fact that I'm doing it for other people haha. Like I hate working under other people. I feel like the right thing to do is finish the facebook and wait until the last day before I'm supposed to have their thing ready and then be like "look yours isn't as good as this so if you want to join mine you can...otherwise I can help you with yours later." Or do you think that's too dick?
D'Angelo: I think you should just ditch them
Zuckerberg: The thing is they have a programmer who could finish their thing and they have money to pour into advertising and stuff. Oh wait I have money too. My friend who wants to sponsor this is head of the investment society. Apparently insider trading isn't illegal in Brazil so he's rich lol.
D'Angelo: lol
Later, Zuckerberg would go on to suggest that he was interested in Facebook in part for its ability to be sold and in part for its ability to drive another product he'd been building: the file-sharing service Wirehog. In an IM chat with an unnamed confidant about the lawsuit heard round the world, Zuckerberg declared: "I won't pay the legal fees. The company that buys us will haha."
"Cool hopefully that'll be soon so you can move on and just work on what you want to," the confidant replied.
To which Zuckerberg responded, "Well it just needs to propel Wirehog."
by Megan Garber, The Atlantic | Read more:
Image: Reuters
Feet In Smoke
On the morning of April 21, 1995, my elder brother, Worth (short for Ellsworth), put his mouth to a microphone in a garage in Lexington, Kentucky, and in the strict sense of having been "shocked to death," was electrocuted. He and his band, the Moviegoers, had stopped for a day to rehearse on their way from Chicago to a concert in Tennessee, where I was in school. Just a couple of days earlier, he had called to ask if there were any songs I wanted to hear at the show. I asked for something new, a song he'd written and played for me the last time I'd seen him, on Christmas Day. Our holidays always end the same way, with the two of us up late drinking and trying out our new "tunes" on each other. There's something biologically satisfying about harmonizing with a sibling. We've gotten to where we communicate through music, using guitars the way fathers and sons use baseball, as a kind of emotional code. Worth is seven years older than I am, an age difference that can make brothers strangers. I'm fairly sure the first time he ever felt we had anything to talk about was the day he caught me in his basement bedroom at our old house in Indiana, trying to teach myself how to play "Radio Free Europe" on a black Telecaster he'd forbidden me to touch.
The song I had asked for, "Is It All Over," was not a typical Moviegoers song. It was simpler and more earnest than the infectious power-pop they made their specialty. The changes were still unfamiliar to the rest of the band, and Worth had been about to lead them through the first verse, had just leaned forward to sing the opening lines—"Is it all over? I'm scanning the paper / For someone to replace her"—when a surge of electricity arced through his body, magnetizing the mike to his chest like a tiny but obstinate missile, searing the first string and fret into his palm, and stopping his heart. He fell backward and crashed, already dying.
Possibly you know most of this already. I got many of my details from a common source, an episode of Rescue 911 (the show hosted by William Shatner) that aired about six months after the accident. My brother played himself in the dramatization, which was amusing for him, since he has no memory whatsoever of the real event. For the rest of us, his family and friends, the segment is hard to watch.
The story Shatner tells, which ends at the moment we learned that my brother would live, is different from the story I know. But his version offers a useful reminder of the danger, where medical emergencies are involved, of talking too much about "miracles." Not to knock the word—the staff at Humana Hospital in Lexington called my brother's case "miraculous," and they've seen any number of horrifying accidents and inexplicable recoveries—but it tends to obscure the human skill and coolheadedness that go into saving somebody's life. I think of Liam, my brother's best friend and bandmate, who managed not to fall apart while he cradled Worth in his arms until help arrived, and who'd warned him when the band first started practicing to put on his Chuck Taylors, the rubber soles of which were the only thing that kept him from being zapped into a more permanent fate than the one he did endure. I think of Captain Clarence Jones, the fireman and paramedic who brought Worth back to life, strangely with two hundred joules of pure electric shock (and who later responded to my grandmother's effusive thanks by giving all the credit to the Lord). Without people like these and doubtless others whom I never met and Shatner didn't mention, there would have been no miracle.
by John Jeremiah Sullivan, Deadspin | Read more:
Thursday, February 2, 2012
Ani DiFranco
Ani DiFranco performs "Life Boat" off her new album, Which Side Are You On?
Dog-Gone Genetics: A Few Genes Control Fido's Looks
Humans are complicated genetic jigsaw puzzles. Hundreds of genes are involved in determining something as basic as height.
But man's best friend is a different story. New research shows that almost every physical trait in dogs — from a dachshund's stumpy legs to a shar-pei's wrinkles — is controlled by just a few genes.
Writer Evan Ratliff has been looking into dog genetics for National Geographic Magazine. He tells Guy Raz, host of weekends on All Things Considered, that this quirk makes it extremely easy for breeders to develop new, custom-designed dogs — like the German hunters who bred the original dachshunds a few hundred years ago.
"These German hunters wanted some sort of dog to hunt badgers and other sort of small rodents that live in holes." So they crossed long, low basset hounds with tenacious terriers, to produce a dog that could chase badgers into their dens and then be yanked out again by the tail if necessary. The breeders also built in loose fur, so any bites wouldn't do much damage.
Other breeds, like the shar-pei, developed after breeders pursued a particularly favored look, Ratliff says.
For years, scientists thought that dogs were just as genetically complicated as humans, Ratliff says. But that turned out not to be the case. Scientists at Cornell, UCLA, Stanford and the National Institutes of Health have been comparing dog DNA as part of a project called CanMap.
"They took a whole large collection of dogs, 900 dogs from, I think, 80 breeds," Ratliff says. "And what they learned was that in these dogs, if you look at their physical traits, everything from their body size to their coat color to whether they have floppy ears, it's determined by a very small number of genes."
It's actually human interference that's the cause of what Ratliff calls "Tinker-Toy genetics" in dogs. "The way that natural selection works, it usually works on very small changes," he says. Sudden large changes can actually be harmful.
But breeders can introduce large changes in a dog relatively rapidly, by selecting the genes that have the strongest effects.
"If I want a tall dog, a large dog, then I end up selecting for this gene called IGF1, which has a very very strong effect on the size of a dog. And when you do that over a couple of hundred years, what happens is ... it becomes the gene that controls body size."
by NPR Staff, All Things Considered | Listen to more:
Photo: Istockphoto.com
Kill The Caps Lock
Perhaps it should have occurred to me years ago, but it wasn’t until recently that I fully realized that everybody hates something about their computer keyboard. I was in the company of several family members and friends, and had just mistyped my Gmail password for the 458th time in calendar 2011. I knew straightaway what had gone wrong—caps lock was depressed by accident—but instead of simply taking my lumps and re-entering my password, I vented: “Is there anything on the computer keyboard more annoying than the caps lock key?”
Yes, my companions told me matter-of-factly, there is. Thirty minutes of conversation ensued, with each participant attempting to outdo the others with tales of keyboard frustration and fiery screeds relegating various keys to eternal damnation. The conversation was painfully nerdy, yet cathartic—and eye-opening.
Since that initial conversation, I’ve spoken with dozens of folks about computer keyboard annoyances, and I’ve compiled a list of five small-scale adjustments that would greatly improve the typing experience. My goal in compiling this list is narrowly tailored. I don’t want to fundamentally change the way we type—I don’t have time to learn the Dvorak keyboard, and I suspect you don’t either. These are small, one-key fixes that could make typing easier, faster, and less prone to error.
1. For starters, please allow me to reiterate the following: CAPS LOCK IS THE WORST! It is of very little use to the average citizen. Nearly everything that results from depressing this key is annoying.
While it’s important to consider the interests of groups that rely on the key (those with disabilities that make it difficult to press more than one key at a time, for instance, and people engaged in professions that frequently use all-uppercase text), caps lock also inherently favors yell-y Internet commenters, people who design terrible flyers, and others who deserve little consideration. For the rest of us, the key is a nuisance, its prime real estate leading us to depress it unintentionally and often unwittingly. The next thing you know, you’re submitting to a security-question inquisition from your banking institution, trying desperately to prove your identity having thrice entered your case-sensitive password incorrectly.
The utility derived from not having to hold down “shift” when drafting venomous complaint emails to Time Warner Cable does not justify all those needlessly mistyped words in other contexts. So, as a first-step move aimed at improving the keyboard, let’s scrap the caps lock key altogether. (Disabling it by using the Keyboard tab in System Preferences on a Mac, or specialized anti–caps lock software for PCs, doesn’t result in any freed up space on the board for new keys.) For the serially furious or enthusiastic, there would of course still be a caps lock function: Upper-casers could use a new key-combo or, for instance, access the function as iPhone users already do, by quickly tapping the shift button twice. Google eliminated the caps lock key from its laptops, and though the company replaced it with a branded search key that can still be annoying when pressed by mistake, it’s high time for other computer makers to open up that space for new, less-infuriating keys.
by Matthew J.X. Malady, Slate | Read more:
Illustration by Robert Neubecker

Path Is Found for the Spread of Alzheimer’s
Alzheimer’s disease seems to spread like an infection from brain cell to brain cell, two new studies in mice have found. But instead of viruses or bacteria, what is being spread is a distorted protein known as tau.
The surprising finding answers a longstanding question and has immediate implications for developing treatments, researchers said. And they suspect that other degenerative brain diseases like Parkinson’s may spread in a similar way.
Alzheimer’s researchers have long known that dying, tau-filled cells first emerge in a small area of the brain where memories are made and stored. The disease then slowly moves outward to larger areas that involve remembering and reasoning.
But for more than a quarter-century, researchers have been unable to decide between two explanations. One is that the spread may mean that the disease is transmitted from neuron to neuron, perhaps along the paths that nerve cells use to communicate with one another. Or it could simply mean that some brain areas are more resilient than others and resist the disease longer.
The new studies provide an answer. And they indicate it may be possible to bring Alzheimer’s disease to an abrupt halt early on by preventing cell-to-cell transmission, perhaps with an antibody that blocks tau.
Photo: Chang W. Lee
Wednesday, February 1, 2012
Are Women Better at Living Alone?
Earlier this month, divorcee Dominique Browning published an essay in the New York Times positing a gender gap in the talent for living alone. She and her single female neighbors, she wrote, relish the freedom to eat at odd hours and monopolize the bed, while men are indifferent to these perks. Nesting at home, she went on to assert, women feel safe. “Men,” though, “are hard-wired to feel danger all the time … Being alone feels dangerous to a man.”
These generalizations incensed commenters and bloggers, one of whom offered this summary: “Binary gender norms are alive and thriving, except the roles have reversed (sort of).” But according to sociological research, Browning wasn’t entirely off the mark. On average, women may be better suited to solitary habitation than men, at least past a certain age. It’s not, however, because men don’t love to eat Cheerios for dinner and hog the bed. Nor is it that women are more self-sufficient or inclined to solitude. On the contrary: Women are more likely to have strong social networks, which enable them to live alone without being alone. Men are more at risk of withdrawing into isolation that, at the extremes, can be miserable and indeed dangerous.
The contrast emerges clearly in Eric Klinenberg’s new book, Going Solo: The Extraordinary Rise and Surprising Appeal of Living Alone. And it matters because the people in question are hardly a negligible demographic. Though they may not realize it, they’re part of a major societal shift. In 1950, Klinenberg reports, 4 million American adults lived alone, which accounted for 9 percent of households. Today, that number is 31 million, a whopping 28 percent of all households.
by Rebecca Tuhus-Dubrow, Slate | Read more:
Illustration by Rob Donnelly
Making Circumcisions Easier
The day of the assembly-line circumcision is drawing closer.
Now that three studies have shown that circumcising adult heterosexual men is one of the most effective “vaccines” against AIDS — reducing the chances of infection by 60 percent or more — public health experts are struggling to find ways to make the process faster, cheaper and safer.
The goal is to circumcise 20 million African men by 2015, but only about 600,000 have had the operation thus far. Even a skilled surgeon takes about 15 minutes, most African countries are desperately short of surgeons, and there is no Mohels Without Borders.
So donors are pinning their hopes on several devices now being tested to speed things up.
Dr. Stefano Bertozzi, director of H.I.V. for the Bill and Melinda Gates Foundation, said it had its eyes on two, named PrePex and the Shang Ring, and was supporting efforts by the World Health Organization to evaluate them.
Circumcision is believed to protect heterosexual men because the foreskin has many Langerhans cells, which pick up viruses and “present” them to the immune system — which H.I.V. attacks.
PrePex, invented in 2009 by four Israelis after one of them, a urologist, heard an appeal for doctors to do circumcisions in Africa, was approved by the Food and Drug Administration three weeks ago. The W.H.O. will make a decision on it soon, said Mitchell Warren, an AIDS-prevention expert who closely follows the process.
From the initial safety studies done so far, PrePex is clearly faster, less painful and more bloodless than any of its current rivals. And it relies on the simplest and least-threatening technology — a rubber band.
by Donald G. McNeil, Jr., NY Times | Read more:
What's Wrong With the Teenage Mind?
"What was he thinking?" It's the familiar cry of bewildered parents trying to understand why their teenagers act the way they do.
How does the boy who can thoughtfully explain the reasons never to drink and drive end up in a drunken crash? Why does the girl who knows all about birth control find herself pregnant by a boy she doesn't even like? What happened to the gifted, imaginative child who excelled through high school but then dropped out of college, drifted from job to job and now lives in his parents' basement?
Adolescence has always been troubled, but for reasons that are somewhat mysterious, puberty is now kicking in at an earlier and earlier age. A leading theory points to changes in energy balance as children eat more and move less.
At the same time, first with the industrial revolution and then even more dramatically with the information revolution, children have come to take on adult roles later and later. Five hundred years ago, Shakespeare knew that the emotionally intense combination of teenage sexuality and peer-induced risk could be tragic—witness "Romeo and Juliet." But, on the other hand, if not for fate, 13-year-old Juliet would have become a wife and mother within a year or two.
Our Juliets (as parents longing for grandchildren will recognize with a sigh) may experience the tumult of love for 20 years before they settle down into motherhood. And our Romeos may be poetic lunatics under the influence of Queen Mab until they are well into graduate school.
What happens when children reach puberty earlier and adulthood later? The answer is: a good deal of teenage weirdness. Fortunately, developmental psychologists and neuroscientists are starting to explain the foundations of that weirdness.
The crucial new idea is that there are two different neural and psychological systems that interact to turn children into adults. Over the past two centuries, and even more over the past generation, the developmental timing of these two systems has changed. That, in turn, has profoundly changed adolescence and produced new kinds of adolescent woe. The big question for anyone who deals with young people today is how we can go about bringing these cogs of the teenage mind into sync once again.
If you think of the teenage brain as a car, today's adolescents acquire an accelerator a long time before they can steer and brake.
by Alison Gopnik, WSJ | Read more:
Illustration: Harry Campbell
The Wandering Gene and the Indian Princess
The Wandering Gene and the Indian Princess: Race, Religion and DNA spans continents and millennia but takes place largely in Colorado's barren and impoverished San Luis Valley, which, author Jeff Wheelwright notes drily, is "not a place you would expect to find a flare-up of Jewish consciousness." But the San Luis Valley is home to the Medinas, a large Hispano family of Spanish and Native American descent, and many of them have tested positive for BRCA1.185delAG, the breast cancer mutation considered to be unambiguous evidence of Jewish ancestry.
The heart of Wheelwright's alternately fascinating and painful book is Shonnie Medina, who was diagnosed with aggressive breast cancer at age twenty-six and dead by twenty-eight. What fascinates is the author's account of how the Jewish marker first came to be and how it eventually showed up among the Catholics of the American Southwest. Scientists believe that the mutation, discovered in the mid-1990s, is 2,500 years old and that it entered the Israelite gene pool via a single founder. (Unlike recessive genes like those that cause the deadly Tay-Sachs, a rare genetic disease affecting Jews, this mutation acts alone, requiring only one parent to pass it down.) Wheelwright, a science journalist whose previous books were about the Exxon Valdez oil spill and illnesses afflicting Gulf War veterans, explains that in a bitter twist, some of the early Israelite strategies to survive in the face of oppression, including preserving "sacred separateness" and "blood purity," led to genetic isolation and the concentration of the mutation. While 1 in 100 Ashkenazi Jews are thought to be carriers of 185delAG, it likely came to the San Luis Valley by way of Sephardic Jews who colonized what was then New Spain after being forced to convert to Catholicism during the Spanish Inquisition, beginning in the late fifteenth century.
The painful part of the book, of course, concerns Shonnie, whose DNA brought her toward her terrible fate but whose culture and temperament finished her off. "Marginal medicine" and "marginal religion," Wheelwright writes, "swirl about the story of Shonnie Medina like two furies." She and many of her relatives abandoned Catholicism and became Jehovah's Witnesses in the 1980s. Shonnie was passionate about door-to-door evangelizing, and the booklet she carried with her on her home visits equated original sin to "a terrible inherited disease from which no one can escape." Wheelwright suggests that the Witnesses' apocalyptic beliefs and distrust of secular society (Shonnie's father, Joseph, eschewed banks, instead burying cash in jars spread all over his property) contributed to the young woman's decision to refuse surgery and chemotherapy. Vanity also played a role -- she couldn't accept the prospect of a mastectomy. Instead, she traveled five times to Tijuana for specious herbal therapies before her death in 1999.
As part of his research, Wheelwright spent considerable time with the extended Medina clan. He sat in on a Sunday afternoon session with a genetic counselor held at Shonnie's parents' restaurant in 2007; the counselor drove in from Colorado Springs to explain the science behind the mutation and to urge the gathered family members to undergo DNA testing. Two years later, the restaurant again played host, this time to the Hispano DNA Project, which, led by the head of the Human Genetics Program at New York University, collected blood samples from locals in an effort to amass more information about their genetic ancestry. By then the possibility of a "crypto-Jewish" community in the Valley had aroused interest from academics and the press. While some Hispanos in the area were skeptical, others, including some distant relatives of Shonnie's, began to plumb their pasts, recalling grandparents quietly lighting candles on Friday nights or avoiding pork.
by Barbara Spindel, Barnes and Noble Review | Read more: