Sunday, June 10, 2012
Guitar Zero
Can science turn a psychologist into Jimi Hendrix?
Are musicians born or made? All my life, I wanted to become musical but I always assumed that I never had a chance. My ears are dodgy, my fingers too clumsy. I have no sense of rhythm and a lousy sense of pitch. I have always loved music, but could never sing, let alone play an instrument; in school, I came to believe that I was destined to be a spectator rather than a participant, no matter how hard I tried.
As I grew older, I figured my chances only diminished. Our lives, once we finish school, tend to focus on execution rather than enrichment. Whether we are breadwinners or caretakers, our success is measured by outcomes. The work it takes to achieve those outcomes, we are meant to understand, is something that should happen quickly and behind closed doors. If the conventional wisdom is right, by the time we are adults it's too late to learn anything new. Children may be able to learn anything, but if you wanted to learn French you should have started when you were six.
Until recently, science supported this theory. Virtually everybody in developmental psychology was a firm believer in "critical periods" of learning. The idea is that there are particular time windows in which complex skills, such as languages, can be learned; if you don't learn them by the time the window shuts, you never will. Case closed. But the more people have actually studied critical periods, the shakier the data has become. Although adults rarely achieve the same level of fluency that children do, the scientific research suggests that differences typically pertain more to accent than grammar.
There is also no magical window that slams shut the moment puberty begins. In fact, in recent years, scientists have identified people who have managed to learn languages with near-native fluency, even though they only started as adults.
If critical periods aren't quite so firm as once believed, a world of possibility emerges for the many adults who harbour secret dreams – whether to learn a language, become a pastry chef or pilot a plane. And quests such as these, no matter how quixotic they may seem, and whether they succeed, could bring unanticipated benefits, not just for their ultimate goals but for the journey itself.
Exercising our brains helps maintain them, by preserving plasticity (the capacity of the nervous system to learn new things), warding off degeneration and literally keeping the blood flowing. Beyond the potential benefits for our brains, there are benefits for our emotional wellbeing, too. There may be no better way to achieve lasting happiness – as opposed to mere fleeting pleasure – than pursuing a goal that helps broaden our horizons.
From primary school, every musical attempt I made ended in failure. The first time I tried to play guitar, a few years ago, my friend Dan Levitin (who had not yet finished his book This Is Your Brain on Music) kindly offered to give me a few lessons. When I came back to him after a week or two of practice, he quickly realised what my primary school teachers had realised long ago: that I had no sense of rhythm whatsoever. Dan offered me a metronome, and when that didn't help, he gave me something my teachers couldn't – a diagnosis: congenital arrhythmia.
And yet I never lost the desire to play. Music hasn't been studied as systematically as language in terms of critical periods, but there are certainly artists who started late and still became first-rate musicians. Tom Morello, the guitarist of Rage Against the Machine and among Rolling Stone magazine's greatest guitarists of all time, didn't start until he was 17. Patti Smith scarcely considered becoming a professional singer until she was in her mid-20s. Then there is the jazz guitar star Pat Martino, who relearned how to play after a brain aneurysm at the age of 35, and Dr John, who switched his primary allegiance from guitar to piano at 21 (after his left ring finger was badly injured in a bar-room fight) and won the first of his five Grammy awards in his late 40s.
Given my arrhythmia, I had no aspiration of reaching such heights, but at 38, long after I had completed my PhD and become a professor of cognitive psychology, I realised that my desire to become musical wasn't going away. I wanted to know whether I could overcome my intrinsic limits, my age and my lack of talent. Perhaps few people had less talent for music than I did, but few people wanted to be able to play more acutely.
I began to read the scientific literature. How did children learn music? Were there any lessons for adults? To my surprise, although children had been well studied, there was hardly any systematic research on people my age. Nobody seemed to know much about whether adults could learn to play late in life, and it wasn't just music that we knew little about; the literature on the capacity of adults to learn new skills in general was far sparser than I had imagined.
We know something about gradual declines in memory, but the only truly firm result I could find concerned perfect pitch (the ability to identify a single note in isolation). For that, one must indeed start early, but luckily for me and anyone else starting late, it is also clear that perfect pitch is more luxury than necessity. Duke Ellington didn't have it and neither did Igor Stravinsky (nor, for that matter, did Joey Ramone).
Other studies show some advantages for music learners who began earlier in life, but most of those don't take into consideration the total amount of practice. When it came to other aspects of music, such as the ability to improvise or compose, or even to learn a simple melody, there was almost no compelling literature. Although any number of studies have shown that the more you practise the better you get, startlingly few have compared what happens when people of different ages get the same amount of practice.
How could such a basic scientific question remain so unanswered? I wondered about this for months, until Caroline Palmer, a professor of psychology at McGill University in Montreal, explained the answer to me. The problem wasn't a lack of scientific interest – it was a lack of subjects. To learn a musical instrument, you need to put in a lot of work – 10,000 hours is an oft-mentioned (if somewhat oversold) number – and to do a proper study, you'd need a reasonably large sample of participants, which is to say a big group of adult novices with sufficient commitment. Nobody has studied the outcomes of adults who put in 10,000 hours of practice starting at 42 because most people of that age have lives and responsibilities – few adult learners are prepared to invest the kind of time that a teenager has. No subjects, no science. At that point, I decided to become a guinea pig.
by Gary Marcus, The Guardian | Read more:
Photograph: Jan Persson/Redferns
How to Say Goodbye
We are taught to start our stories at the beginning. We open with “once upon a time,” hoping to capture the nascent moment when everything came to be. But there are few lessons — in our culture, in our schooling, in our socialization — in how to exit well. Our culture applauds the spirit, gumption and promise of beginnings. We admire the entry — the moment when people launch themselves into something new, plan and execute a new project, embark on important work, get married, take an adventure. Our habit is to tilt toward the future, perpetually poised for the next move, the strategic opportunity.
By contrast, our exits are often ignored or invisible. They seem to represent the negative spaces in our life narratives. There is little appreciation or applause when we decide (or it is decided for us) that it is time to move on. We often slink away in the night hoping that no one will notice; that the darkness will make the departure disappear. If the entry recalls a straight and erect posture, a person who is strong and determined; then we imagine a person stooped, weakened, and despairing as he makes his exit. (...)
Why is all of this so important? Why do we need to wrest our exits from the shadows of inattention and guilt? Why must we readjust our cultural lens in order to see and compose the exits in our lives? Because, I believe, that our preoccupation with beginnings reveals only half the story; offering a partial and distorted view of the layers and trajectories of our growth and development; exaggerating the power and potential of our launches while neglecting the undertows of over-reaching. We might chart and judge our journeys very differently if we looked through the prism of our exits; a prism that would reveal the interplay of reflection and propulsion, hindsight and generativity that come with navigating our endings well.
The wisdom and insights I gathered from listening to dozens of people tell their stories of exit — some in the midst of composing them, others anticipating their departures, still others looking back over long years; revisiting the ancient narratives that had changed their lives — point to a radical reframing of the meaning and worthiness of exits, moving exits from the shadows to the light, from the invisible to the visible. In order for exits to be productive and expansive, we must give them our full attention, and grapple with the range of emotions they stir up in us; the often paradoxical sensations of loss and liberation, grief and jubilation, and pain and beauty that accompany our departures from our relationships, families, institutions, and communities; from our former identities. And the daily practice of navigating the small exits that punctuate our days — a hug at the door, a lullaby at bedtime, a thank you as you leave the office — helps us design and enact the grander send-offs with intentionality and care. The micro and the macro seem to be inextricably linked, the former informing and heralding the latter.
Another paradox: The exit signs are bold and blurred; clear and confusing. On the one hand, people can recall the exact moment —in bold relief, the blood red exit sign in a darkened movie theater — when they decided to leave, when they felt that they no longer had a choice, when all the forces and sensations came together in a perfect storm and they said to themselves, “I’m out of here.” On the other hand, those who take leave, see the messiness and ambivalence of their departures through their rear view mirrors; the long process of retreat that came well before the marked moment of announced leaving and the many aftershocks of exiting that followed. Exits feel both abrupt and final — a leap of faith, a moment of reckless abandon — and gnawingly cautious and iterative. Those who exit must be ready to ride out these paradoxical sensations.
by Sara Lawrence-Lightfoot, Salon | Read more:
Credit: Rose-Marie Henriksson via Shutterstock
Gustav Klimt, Portrait of Adele Bloch Bauer
via:
Looking for the umpteenth time at Gustav Klimt’s “Portrait of Adele Bloch-Bauer I” (1907) at the estimable Neue Galerie, on the occasion of a show celebrating Klimt’s hundred and fiftieth birthday, I’ve changed my mind. The gold- and silver-encrusted picture, bought by the museum’s co-founder Ronald Lauder for a headline-grabbing hundred and thirty-five million dollars, in 2006, isn’t a peculiarly incoherent painting, as I had once thought. It’s not a painting at all, but a largish, flattish bauble: a thing. It is classic less of its time than of ours, by sole dint of the money sunk in it.
“Adele” belongs to a special class of iffy art works whose price is their object. One example is a dispirited version in pastels of Edvard Munch’s “The Scream,” which fetched a hundred and nineteen million last month. Another is the sadly discolored van Gogh “Sunflowers,” which set a market record—forty million—in 1987, when sold to a Japanese insurance company. (The purchase amounted to a cherry on top of Japan’s then ballooning, doomed real-estate bubble.) And I remember asking the director of Australia’s National Gallery why, in 1973, he had plunked down an unheard-of two million for Jackson Pollock’s amazing but, to my eye, overworked “Blue Poles.” He mused, “Well, I’ve always liked blue.”
by Peter Schjeldahl, The New Yorker | Changing My Mind About Gustav Klimt's "Adele" | Read more:
In 1839, a year after the first photo containing a human being was made, photography pioneer Robert Cornelius made the first ever portrait of a human being.
On a sunny day in October, Robert Cornelius set up his camera in the back of his father’s gas lamp-importing business on Chestnut Street in Center City, Philadelphia. After removing the lens cap, he sprinted into the frame, where he sat for more than a minute before covering up the lens. The picture he produced that day was the first photographic self-portrait. It is also widely considered the first successful photographic portrait of a human being.
[…] the words written on the back of the self-portrait, in Cornelius’ own hand, said it all: “The first light Picture ever taken. 1839.”
via:
See also: History of Photography (Wikipedia)
The Ultralightlife
Lightweight trail shoes that zip completely into themselves to minimize the space they take up in your pack. Ultralightlife.
via: YMFY
A Roe, by Any Other Name
The swampy Atchafalaya Basin is a far cry from the cold waters of the Caspian Sea. And its lowly native bowfin, often derided as a throwaway fish, is no prized sturgeon. Yet it is laying golden eggs.
Bowfin caviar, from the single-employee Louisiana Caviar Company (motto: “Laissez-les manger beaucoup Cajun caviar!”) is earning a place on the menus at such top-notch establishments here as Commander’s Palace and Restaurant Stella. The executive chef of Galatoire’s Restaurant, Michael Sichel, served it up at the New Orleans Wine and Food Experience last month, an annual bacchanal.
And now, even the Russians are coming.
“There’s pretty good demand from lots of clients,” said Igor Taksir, a Russian-born exporter who ships the glistening roe, which is actually black but turns yellow-gold when cooked, to Moscow and Ukraine. Mr. Taksir said he was “skeptical in the beginning,” when he discovered bowfin caviar at a seafood show in Boston three years ago. “But when we started tasting,” he said, “we realized the quality was surprisingly good.”
Still, this is not the caviar of gilded dreams. If beluga sturgeon from the Caspian Sea, the king of them all, is paired best with Champagne, then bowfin from the bayou, some of it infused with hot pepper and served deep-fried, might go better with a beer. It represents a populist twist and an accommodation by chefs to the environmental and ethical realities that come with serving Russian and Iranian caviar.
Global efforts to all but ban the international trade of caviar from the Caspian Sea, where overfishing and pollution have depleted sturgeon populations, have opened enormous opportunities for affordable substitutes from unlikely places in America. Even landlocked Montana, North Dakota and Oklahoma have thriving markets based on wild river fish.
“I think any chef or any food person with a conscience is only eating domestic or farmed caviar,” said Mitchell Davis, executive vice president of the James Beard Foundation.
The world has come to have a taste for the growing American market of caviar and fish roe. Between 2001 and 2010, annual exports of white sturgeon, shovelnose sturgeon (also called hackleback) and paddlefish roe increased to about 37,712 pounds from roughly 5,214 pounds, with a majority of wild origin, according to the American branch of the Convention on International Trade in Endangered Species of Wild Fauna and Flora and the federal Fish and Wildlife Service.
Seventy percent of the total caviar and roe exported from the United States in 2010 went to countries in the European Union, Ukraine and Japan. (...)
Some caviar enthusiasts will never agree.
“I haven’t sampled bowfin myself, and quite frankly wouldn’t want to,” said Ryan Sutton, the food critic at Bloomberg News, who has lived and studied in Russia. Mr. Sutton was also critical of American paddlefish caviar, which he described as lacking both texture and flavor.
In caviar, a taster wants firmness and pop, “with a clean flavor of the sea,” Mr. Sutton said. (...)
Even the fact that the Food and Drug Administration allows the roe from fish other than the sturgeon to be called caviar — as long as it is qualified by the fish’s name, as in “bowfin caviar” — rubs some people the wrong way.
“The F.D.A. looks at the word ‘caviar’ as synonymous with roe, but that is not true,” said Douglas Peterson, associate professor of fisheries and aquaculture research at the Warnell School of Forestry and Natural Resources at the University of Georgia. “Caviar only comes from sturgeon,” he said. “Everything else is fish eggs.”
Susan Saulny, NY Times | Read more:
Photo: William Widmer for The New York Times
Saturday, June 9, 2012
Herbie Hancock & Leonard Cohen
[ed. Sorry about the ad at the beginning. You can skip it after a few seconds...]
A Universe of Self-Replicating Code
...What's, in a way, missing in today's world is more biology of the Internet. More people like Nils Barricelli to go out and look at what's going on, not from a business or what's legal point of view, but just to observe what's going on.
Many of these things we read about in the front page of the newspaper every day, about what's proper or improper, or ethical or unethical, really concern this issue of autonomous self-replicating codes. What happens if you subscribe to a service and then as part of that service, unbeknownst to you, a piece of self-replicating code inhabits your machine, and it goes out and does something else? Who is responsible for that? And we're in an increasingly gray zone as to where that's going.
The most virulent codes, of course, are parasitic, just as viruses are. They're codes that go out and do things, particularly codes that go out and gather money. Which is essentially what these things like cookies do. They are small strings of code that go out and gather valuable bits of information, and they come back and sell it to somebody. It's a very interesting situation. You would have thought this was inconceivable 20 or 30 years ago. Yet, you probably wouldn't have to go … well, we're in New York, not San Francisco, but in San Francisco, you wouldn't have to go five blocks to find five or 10 companies whose income is based on exactly that premise. And doing very well at it.
Walking over here today, just three blocks from my hotel, the street right out front is blocked off. There are 20 police cars out there and seven satellite news vans, because Apple is releasing a new code. They're couching it as releasing a new piece of hardware, but it's really a new gateway into the closed world of Apple's code. And that's enough to block human traffic.
Why is Apple one of the world's most valuable companies? It's not only because their machines are so beautifully designed, which is great and wonderful, but because those machines represent a closed numerical system. And they're making great strides in expanding that system. It's no longer at all odd to have a Mac laptop. It's almost the normal thing.
But I'd like to take this to a different level, if I can change the subject... Ten or 20 years ago I was preaching that we should look at digital code as biologists: the Darwin Among the Machines stuff. People thought that was crazy, and now it's firmly the accepted metaphor for what's going on. And Kevin Kelly quoted me in Wired, he asked me for my last word on what companies should do about this. And I said, "Well, they should hire more biologists."
But what we're missing now, on another level, is not just biology, but cosmology. People treat the digital universe as some sort of metaphor, just a cute word for all these products. The universe of Apple, the universe of Google, the universe of Facebook, that these collectively constitute the digital universe, and we can only see it in human terms and what does this do for us?
We're missing a tremendous opportunity. We're asleep at the switch because it's not a metaphor. In 1945 we actually did create a new universe. This is a universe of numbers with a life of their own, that we only see in terms of what those numbers can do for us. Can they record this interview? Can they play our music? Can they order our books on Amazon? If you cross the mirror in the other direction, there really is a universe of self-reproducing digital code. When I last checked, it was growing by five trillion bits per second. And that's not just a metaphor for something else. It actually is. It's a physical reality.
by George Dyson, Edge | Read more:
What is Cool?
[ed. Is there really a Journal of Individual Differences?]
Do rebelliousness, emotional control, toughness and thrill-seeking still make up the essence of coolness?
Can performers James Dean and Miles Davis still be considered the models of cool?
Research led by a University of Rochester Medical Center psychologist and published by the Journal of Individual Differences has found the characteristics associated with coolness today are markedly different than those that generated the concept of cool.
“When I set out to find what people mean by coolness, I wanted to find corroboration of what I thought coolness was,” said Ilan Dar-Nimrod, Ph.D., lead author of “Coolness: An Empirical Investigation.” “I was not prepared to find that coolness has lost so much of its historical origins and meaning—the very heavy countercultural, somewhat individualistic pose I associated with cool.
“James Dean is no longer the epitome of cool,” Dar-Nimrod said. “The much darker version of what coolness is still there, but it is not the main focus. The main thing is: Do I like this person? Is this person nice to people, attractive, confident and successful? That’s cool today, at least among young mainstream individuals.” (...)
“We have a kind of a schizophrenic coolness concept in our mind,” Dar-Nimrod said. “Almost any one of us will be cool in some people’s eyes, which suggests the idiosyncratic way coolness is evaluated. But some will be judged as cool in many people’s eyes, which suggests there is a core valuation to coolness, and today that does not seem to be the historical nature of cool. We suggest there is some transition from the countercultural cool to a generic version of it’s good and I like it. But this transition is by no way completed.”
by Michael Wentzel, University of Rochester | Read more:
Friday, June 8, 2012
The Lonely Polygamist
Meet Bill. He has four wives and thirty-one kids. And something's missing.
Polygamy is not something you try on a whim. You don't come home from work one day, pop open a beer, settle down for your nightly dose of Seinfeld reruns, and think, "Boy, my marriage is a bore. Maybe I should give polygamy a whirl." It's true that polygamy, as a concept, sounds downright inviting. Yes, there are lots of women involved, women of all shapes and sizes and personalities, a wonderful variety of women, and yes, they'll fulfill your every need, cook your dinner, do your laundry, sew the buttons on your shirts. And yes, you're allowed to sleep with these women, each of them, one for every night of the week if you want, and what's more, when you wake up in the morning, you won't have to deal with even the tiniest twinge of guilt, because these women, all of them, are your sweethearts, your soul mates, your wives.
Then what, you're asking yourself, could possibly be the problem?
The problem is this: Polygamy is not what you think it is. It has nothing to do with the little fantasy just spelled out for you. A life of polygamy is not a joyride, a guiltless sexual free-for-all. Being a polygamist is not for the easygoing or the weak of heart. It's like marine boot camp or working for the mob; if you're not cut out for it, if you don't have that essential thing inside, it will eat you alive. And polygamy doesn't just require simple cojones, either. It requires the devotion of a monk, the diplomatic prowess of Winston Churchill, the doggedness of a field general, the patience of a pine tree.
Put simply: You'd have to be crazy to want to be a polygamist.
That's what's so strange about Bill. Bill has four wives and thirty-one children. Bill is an ex-Mormon, and he doesn't seem crazy at all. If anything, he seems exceptionally sane, painfully regular, as normal as soup. He's certainly not the wild-eyed, woolly-bearded zealot you might expect. Approaching middle age, Bill has the unassuming air of an accountant. He wears white shirts, blue ties, and black wing tips. He is Joe Blow incarnate. The only thing exceptional about Bill is his height: He is six foot eight and prone to hitting his head on hanging lamps and potted plants.
Bill's wives are not who you'd expect, either. They're not ruddy-faced women with high collars buttoned up to their chins. These are the women you see every day of your life. They wear jeans and T-shirts; they drive minivans; they have jobs. Julia is a legal secretary; Emily manages part of Bill's business; Susan owns a couple of health-food stores; and Stacy stays at home with the younger children. They are also tall, all of them around six feet; if you didn't know better, you'd think Bill and his wives had a secret plan to create a race of giants.
Each of Bill's wives lives in a different house in the suburbs around Salt Lake City. They've lived in different configurations over the years--all in one place, two in one and two in another--but this is the way that seems best nowadays, since there are teenagers in the mix, and one thing everybody seems to agree on is how much teenagers need their space. Bill himself is homeless. He wanders from house to house like a nomad or a beggar, sometimes surprising a certain wife with the suddenness of his presence. In the past, he has used a rigid rotation schedule but now opts for a looser approach. He believes that intuition and nothing else should guide where he stays for the night.
Okay, now: Put yourself in Bill's size-14 wing tips for a minute. You've just finished an exhausting day at work. It's that time of the evening when you think to yourself, "Hmmm. Which house am I going to tonight?" You get in your car and head off toward Emily's house; you haven't seen Emily for several days, and besides, she's having trouble with one of your teenage daughters--she's not sticking to her curfew. But you remember that your son Walt has a soccer game on the other side of town at 5:30. You start to turn around, but then you think of Susan, wife number two, who has come down with the flu and is in need of some comfort and company. Then it hits you that not only did you promise to look at the bad alternator in Stacy's Volvo tonight, not only did you tell Emily that you'd be home in time to meet with the insurance man to go over all your policies, but that Annie, your six-year-old daughter, is having a birthday tomorrow and you've yet to get her a present.
Sitting there at the intersection--cars honking, people flipping you the bird--do you feel paralyzed? Do you feel like merging with the rest of the traffic onto I-15 and heading for Las Vegas, leaving it all behind?
This is Bill's life.
by Brady Udall, Standard-Examiner (1998) | Read more:
Team of Mascots
Just four years ago, when it was clear that he would be the Democratic presidential nominee, Barack Obama famously declared that, if elected, he would want “a team of rivals” in his Cabinet, telling Joe Klein, of Time magazine, “I don’t want to have people who just agree with me. I want people who are continually pushing me out of my comfort zone.” His inspiration was Doris Kearns Goodwin’s best-selling book about Abraham Lincoln, who appointed three men who had been his chief competitors for the presidency in 1860—and who held him, at that point, in varying degrees of contempt—to help him keep the Union together during the Civil War. To say that things haven’t worked out that way for Obama is the mildest understatement. “No! God, no!” one former senior Obama adviser told me when I asked if the president had lived up to this goal.
There’s nothing sacred about the team-of-rivals idea—for one thing, it depends on who the rivals were. Obama does have one former rival, Hillary Clinton, in his Cabinet, and another, Joe Biden, is vice president. Mitt Romney would have fewer options. Can anyone really imagine Romney making Rick Santorum his secretary of health and human services, or Herman Cain his commerce secretary, or Newt Gingrich the administrator of NASA? Well, maybe the last, if only so Romney could have the satisfaction of sending the former Speaker—bang! zoom!—to the moon!
For the record, Gingrich has said he’d be unlikely to accept any position in a Romney administration, and Romney himself has given almost no real hints about whom he might appoint. In light of his propensity to bow to prevailing political pressures, his Cabinet might well be, as he described himself, “severely conservative.” But the way presidents use their Cabinets says a lot about their style of governing.
Richard Nixon created a deliberately weak Cabinet (he ignored his secretary of state William Rogers to the point of humiliation, in favor of his national-security adviser, Henry Kissinger), and he rewarded their loyalty by demanding all their resignations on the morning after his landslide re-election, in 1972. John F. Kennedy, having won a whisker-close election against Nixon, in 1960, wanted Republicans such as Douglas Dillon at Treasury and Robert McNamara at Defense to lend an air of bipartisan authority and competence. George W. Bush had a very powerful Cabinet, especially in the persons of Donald Rumsfeld, Robert Gates, and Condoleezza Rice, if only to compensate for his pronounced lack of experience in foreign policy and military affairs. (...)
The days when presidential Cabinets contained the likes of Thomas Jefferson as secretary of state, or Alexander Hamilton as secretary of the Treasury, are long since gone (and those early Cabinets displayed a fractiousness that no modern president would be likely to tolerate), though Cabinet officers retain symbols of office—from flags to drivers to, in some cases, chefs—befitting grander figures. The lingering public image of Cabinet meetings as the scene of important action is largely a myth. “They are not meetings where policy is determined or decisions are made,” the late Nicholas Katzenbach, who served Lyndon Johnson as attorney general, recalled in his memoirs. Nevertheless, Katzenbach attended them faithfully, “not because they were particularly interesting or important, but simply because”—remembering L.B.J.’s awful relationship with the previous attorney general, Bobby Kennedy—“I did not want the president to feel I was not on his team.” Even as recently as the 1930s, Cabinet figures such as Labor Secretary Frances Perkins, Interior Secretary Harold Ickes, and Postmaster General James A. Farley were important advisers to Franklin D. Roosevelt (and, in the cases of Perkins and Ickes, priceless diarists and chroniclers) in areas beyond their lanes of departmental responsibility, just as Robert F. Kennedy was his brother’s all-purpose sounding board and McNamara provided J.F.K. with advice on business and economics well outside his purview at the Pentagon. “Cabinet posts are great posts,” says Dan Glickman, who was Bill Clinton’s agriculture secretary. “But you realize that the days of Harry Hopkins and others who were in the Cabinet and were key advisers to the president—that really isn’t true anymore.” “In the case of Clinton,” Glickman went on, “it was a joy to work for him, because, in large part, he gave each of us lots of discretion. He said, ‘If it’s bad news, don’t call me. If it’s good news, call me. 
If it’s exceptionally good news, call me quicker.’ ” The way Cabinet officers relate personally to the president is—no surprise—often the crucial factor in their success or failure. Colin Powell had a worldwide profile and a higher approval rating than George W. Bush, and partly for those very reasons had trouble building a close rapport with a president who had lots to be modest about. Obama’s energy secretary, Steven Chu, may have a Nobel Prize in physics, but that counted for little when he once tried to make a too elaborate visual presentation to the president. Obama said to him after the third slide, as one witness recalls, “O.K., I got it. I’m done, Steve. Turn it off.” Attorney General Eric Holder has been particularly long-suffering, although he and his wife, Dr. Sharon Malone, are socially close to the Obamas. Set aside the controversy that surrounded his failure, as deputy attorney general at the end of the Clinton administration, to oppose a pardon for Marc Rich, the fugitive financier whose ex-wife was a Clinton donor. Holder, the first black attorney general, has taken a political beating more recently for musing that the country is a “nation of cowards” when it comes to talking about race, and for following through on what seemed to be the president’s own wishes on such matters as proposing to try the 9/11 mastermind Khalid Sheikh Mohammed in an American courtroom (in the middle of Manhattan, no less). The sharp growth in the White House staff in the years since World War II has also meant that policy functions once reserved for Cabinet officers are now performed by top aides inside the White House itself. Obama meets regularly and privately with Tim Geithner and Hillary Clinton, but almost certainly sees his national-security adviser, Tom Donilon, and his economic adviser, Gene Sperling, even more often. 
The relentless media cycle now moves so swiftly that any president, even one less inclined toward centralized discipline than Obama, might naturally rely on the White House’s quick-on-the-draw internal-messaging machine instead of bucking things through the bureaucratic channels of the executive departments. In dealing with a Cabinet, as with life itself, there is no substitute for experience. Clinton-administration veterans told me that their boss made better, fuller use of the Cabinet in his second term than he did in his first, when officials such as Les Aspin at the Pentagon and Warren Christopher at the State Department sometimes struggled to build a cohesive team. Lincoln’s choice of William H. Seward at State, Salmon P. Chase at Treasury, and Edward Bates as attorney general were far from universally applauded. “The construction of a Cabinet,” one editorial admonished at the time, “like the courting of a shrewd girl, belongs to a branch of the fine arts with which the new Executive is not acquainted.” Lincoln’s Cabinet did solve one political problem but it created others—Lincoln had to fight not one but two civil wars.
by Todd S. Purdum, Vanity Fair | Read more:
Darrow
The Library of Utopia
In his 1938 book World Brain, H.G. Wells imagined a time—not very distant, he believed—when every person on the planet would have easy access to "all that is thought or known."
The 1930s were a decade of rapid advances in microphotography, and Wells assumed that microfilm would be the technology to make the corpus of human knowledge universally available. "The time is close at hand," he wrote, "when any student, in any part of the world, will be able to sit with his projector in his own study at his or her convenience to examine any book, any document, in an exact replica."
Wells's optimism was misplaced. The Second World War put idealistic ventures on hold, and after peace was restored, technical constraints made his plan unworkable. Though microfilm would remain an important medium for storing and preserving documents, it proved too unwieldy, too fragile, and too expensive to serve as the basis for a broad system of knowledge transmission. But Wells's idea is still alive. Today, 75 years later, the prospect of creating a public repository of every book ever published—what the Princeton philosopher Peter Singer calls "the library of utopia"—seems well within our grasp. With the Internet, we have an information system that can store and transmit documents efficiently and cheaply, delivering them on demand to anyone with a computer or a smart phone. All that remains to be done is to digitize the more than 100 million books that have appeared since Gutenberg invented movable type, index their contents, add some descriptive metadata, and put them online with tools for viewing and searching.
It sounds straightforward. And if it were just a matter of moving bits and bytes around, a universal online library might already exist. Google, after all, has been working on the challenge for 10 years. But the search giant's book program has foundered; it is mired in a legal swamp. Now another momentous project to build a universal library is taking shape. It springs not from Silicon Valley but from Harvard University. The Digital Public Library of America—the DPLA—has big goals, big names, and big contributors. And yet for all the project's strengths, its success is far from assured. Like Google before it, the DPLA is learning that the major problem with constructing a universal library nowadays has little to do with technology. It's the thorny tangle of legal, commercial, and political issues that surrounds the publishing business. Internet or not, the world may still not be ready for the library of utopia.
by Nicholas Carr, MIT Technology Review | Read more:
Illustration: Stuart Bradford